The WLCG Storage and Data Management Jamboree summary outlines critical needs for improving data management in high-energy physics analysis. The event emphasized that any proposed demonstrator must integrate with existing experiments, and captured discussions on favouring networks over tape storage, since network capacity scales roughly with Moore's law. It advocates pre-distributing data and fetching it on demand, addresses the need for standardized transfer protocols, and explores appropriate metrics for measuring outcomes, with practical demonstrations planned at WLCG.
WLCG Storage and Data Management Jamboree Summary • Jens Jensen • OGF29, Chicago • June 2010
Executive Summary • WLCG: Need for improving data management for analysis • Inviting demonstrators • MUST work with existing experiments • Demonstrate by WLCG @ IC
Input • Strawman • Use networks rather than tape • Networking better than MONARC • Also scales with Moore’s law • “More money for networking” • Tape as dark archive • Streaming video “solved” • But HEP != streaming video
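The networks-over-tape argument is easy to make concrete with a back-of-envelope calculation. The sketch below is illustrative only: the dataset size, link speed, and sustained-efficiency figure are assumptions, not WLCG numbers.

```python
def transfer_hours(dataset_tb: float, link_gbps: float,
                   efficiency: float = 0.7) -> float:
    """Hours needed to move a dataset over a network link.

    dataset_tb: dataset size in terabytes (decimal, 1 TB = 1e12 bytes)
    link_gbps:  nominal link speed in gigabits per second
    efficiency: fraction of nominal bandwidth sustained in practice
                (0.7 is an assumed, illustrative value)
    """
    bits = dataset_tb * 1e12 * 8          # total bits to move
    bps = link_gbps * 1e9 * efficiency    # sustained bits per second
    return bits / bps / 3600


# e.g. a 100 TB dataset over a 10 Gb/s link at 70% efficiency:
print(f"{transfer_hours(100, 10):.1f} h")
```

At full efficiency a 10 TB dataset crosses a 10 Gb/s link in about 2.2 hours, which is the kind of arithmetic behind "more money for networking": link speeds, unlike tape-robot latency, keep improving with the hardware curve.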
Data management • Make data available for analysis • Pre-distribute • Fetch when required • Transfers • Restart failed transfers
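Restarting a failed transfer rather than repeating it from byte zero is the key point in the bullets above. A minimal sketch, assuming an HTTP-style transfer where the server honours Range requests (the function name and header-building are illustrative, not any specific WLCG tool's API):

```python
import os


def resume_header(local_path: str) -> dict:
    """Build an HTTP Range header to restart a failed download
    from wherever the partial local file left off.

    Returns an empty dict when there is nothing to resume, so the
    caller simply issues a normal full-file request in that case.
    """
    offset = os.path.getsize(local_path) if os.path.exists(local_path) else 0
    return {"Range": f"bytes={offset}-"} if offset else {}
```

The client then opens the local file in append mode and requests only the remaining bytes; the same idea applies whether data was pre-distributed ahead of time or is being fetched on demand at analysis time.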
xroot • Not standardised • “Changes backward compatible” • 1.5 implementations
Metrics • Normally measure CPU • … what are the appropriate metrics?
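If CPU time is the wrong yardstick for data-centric work, candidate replacements are transfer success rate and delivered throughput. A hedged sketch of such metrics, with an assumed record shape (the `Transfer` fields are invented for illustration):

```python
from dataclasses import dataclass


@dataclass
class Transfer:
    bytes_moved: int      # payload actually delivered
    wall_seconds: float   # wall-clock duration of the attempt
    succeeded: bool


def data_metrics(transfers: list) -> dict:
    """Summarise transfers by success rate and aggregate throughput,
    rather than by CPU consumed."""
    done = [t for t in transfers if t.succeeded]
    total_bytes = sum(t.bytes_moved for t in done)
    total_time = sum(t.wall_seconds for t in done)
    return {
        "success_rate": len(done) / len(transfers) if transfers else 0.0,
        "throughput_MBps": (total_bytes / 1e6) / total_time if total_time else 0.0,
    }
```

Whether these are the "appropriate metrics" the slide asks about is exactly the open question; they are one plausible starting point for data-management workloads.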
“Outcome” • Demonstrate at WLCG at IC • … detailed look at notes