
IU/GA/ORNL Summary



  1. IU/GA/ORNL Summary

  2. Most of Monday
  • Wael explaining his script and configuration file design
  • Intent is to follow Don's design as closely as possible
  • Single configuration file with subsections, instead of a separate file for each component plus a global one for the overall simulation (a sketch follows below)
  • The driver is just another component in the design
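As a rough illustration of the single-file idea, here is a minimal sketch using Python's configparser, assuming a hypothetical INI-style layout; all section and key names (simulation, component.*, script) are invented for illustration, not the project's actual format.

    import configparser

    # Hypothetical single config file: a global [simulation] section plus
    # one subsection per component; the driver is just another component.
    SAMPLE = """
    [simulation]
    run_id = swim_test_001
    components = epa, rf, driver

    [component.epa]
    script = epa_component.py

    [component.rf]
    script = rf_component.py

    [component.driver]
    script = generic_driver.py
    """

    config = configparser.ConfigParser()
    config.read_string(SAMPLE)

    # The framework reads the global section, then iterates over the
    # per-component subsections instead of opening separate files.
    for name in config["simulation"]["components"].split(","):
        print(name.strip(), "->", config[f"component.{name.strip()}"]["script"])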

  3. Monday
  • Framework (a sketch of this flow follows below):
    • parses the config file
    • instantiates components
    • initializes them
    • calls driver.init( ) [the overall driver]
    • calls driver.step( )
    • calls driver.finalize( )
  • Framework also provides services that instantiated components can use ...
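The control flow above can be condensed into a short sketch. Everything here (Framework, StubDriver, done()) is a hypothetical stand-in; the slide only names init/step/finalize, so the loop around step() is an assumption.

    class StubDriver:
        """Placeholder driver used only to make the sketch runnable."""
        def __init__(self):
            self.steps_left = 3
        def init(self):
            print("driver.init()")
        def done(self):
            return self.steps_left == 0
        def step(self):
            self.steps_left -= 1
            print("driver.step()")
        def finalize(self):
            print("driver.finalize()")

    class Framework:
        def __init__(self, components):
            # components: name -> instance, built from the parsed config file
            self.components = components
        def run(self):
            driver = self.components["driver"]   # the overall driver
            driver.init()
            while not driver.done():             # assumed loop; the slide only
                driver.step()                    #   names init/step/finalize
            driver.finalize()

    Framework({"driver": StubDriver()}).run()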

  4. Monday
  • Framework services are extensive. Some are intended to be used by physics components, some only by the driver component, but there is no enforcement of this (see the sketch below)
  • Widespread agreement (except one ORNL guy)
  • Need a design document describing this
  • Documentation in/with the code
  • Complete neophytes (Samantha Foley and Bramley) will help with nagging questions and incorrect comments in the scripts
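A sketch of the services point: one object handed to every component, with some methods intended only for the driver. The class and method names are invented; the point is that the intended split is a convention the code itself does not enforce.

    class FrameworkServices:
        # Intended for any physics component:
        def get_config_param(self, key):
            return {"timestep": 0.1}.get(key)      # stubbed config lookup
        def stage_data_files(self, path):
            print(f"staging {path}")
        # Intended for the driver component only -- by convention, not code:
        def advance_all_components(self):
            print("advancing all components")

    services = FrameworkServices()
    services.get_config_param("timestep")   # fine for a physics component
    services.advance_all_components()       # nothing stops this call either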

  5. Monday
  • Other discussions: timestepping, particularly when combined with non-timestepping components
    • Label with the start or the end time of the interval?
    • The issue is not going to be settled here; just assure that:
      • scripting mechanisms will allow whatever is chosen
      • data management systems can provide the data to physicists
    • Requires clearly stated conventions, whichever are chosen (e.g., always timestamp with the start of the interval)
  • A data schema is implicitly chosen by the directory/subdirectory naming convention in the current driver/set of components (a sketch follows below)
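To make the convention point concrete, here is a sketch of how a directory-naming scheme implicitly fixes a data schema; the layout and the TIMESTAMP_CONVENTION switch are hypothetical, standing in for whatever convention is eventually stated.

    from pathlib import Path

    TIMESTAMP_CONVENTION = "start"   # or "end"; must be stated once, used everywhere

    def output_dir(root, component, t_start, t_end):
        t = t_start if TIMESTAMP_CONVENTION == "start" else t_end
        # e.g. run_001/rf/t_0012.500 -- the schema lives in the path itself
        return Path(root) / component / f"t_{t:08.3f}"

    print(output_dir("run_001", "rf", 12.5, 13.0))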

  6. Monday
  • Portal:
    • initially for monitoring and data presentation
    • job launch capabilities needed, but first we need scripts to launch the jobs ...
    • data management schemata ...

  7. CS Research Goals
  • Running multiple codes as part of a single overarching simulation
    • Issues of data consistency across components that previously did not need to cooperate
    • How to provide mechanisms that allow signaling, and potentially interrupts, across components running as separate executables (one possible mechanism is sketched below)
  • Running multiple codes with drastically different resource requirements (short term: ignored, because the near-term science does not need it)
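One possible mechanism (among several) for signaling across separate executables is a well-known flag file that each component polls between work units. This is only a sketch under assumed conventions, not a chosen design; the file name is invented.

    import os

    FLAG = "INTERRUPT_REQUESTED"     # hypothetical well-known file name

    def request_interrupt(run_dir):
        # Any component (or the portal) can raise the signal.
        open(os.path.join(run_dir, FLAG), "w").close()

    def interrupt_requested(run_dir):
        # Separate executables share no address space, so each one
        # polls the run directory between work units.
        return os.path.exists(os.path.join(run_dir, FLAG))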

  8. CS Research Goals
  • The slow-MHD case will require a single executable
    • Move toward integration between components by linking them together directly (not necessarily a single executable, but fewer executables)
  • Some need for this is likely even now in fast MHD; e.g., TSC is being hobbled by the run mode chosen in SWIM

  9. CS Research Goals
  • Overall data management system
    • Overwhelming even with a single code
    • Needs to be automated:
      • file handling
      • archiving of results
      • metadata creation
    • Needs the ability to flexibly let components identify new/different files and have them added to data management seamlessly (a sketch follows below)
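A sketch of the seamless-registration idea: a component hands the data manager a file plus metadata, and archiving and metadata creation happen behind the call. The function name, archive layout, and metadata format are all assumptions.

    import hashlib, json, shutil
    from pathlib import Path

    def register_file(archive_root, src, metadata):
        src = Path(src)
        digest = hashlib.sha1(src.read_bytes()).hexdigest()
        dest = Path(archive_root) / digest[:2] / src.name   # assumed layout
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dest)                             # archiving
        meta = {**metadata, "sha1": digest, "source": str(src)}
        Path(str(dest) + ".meta.json").write_text(json.dumps(meta))
        return dest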

  10. CS Research Goals
  • User/developer interaction
    • Need job monitoring, event notification, accounting
    • Easy interface for monitoring computational results
      • ElViz is the tool of choice here, but we need to define its interactions with the data management system
      • Allow users to identify the variables/entities to monitor (a sketch follows below)
    • Portal launch
    • When/how will the system be usable by non-SWIM developers?
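A sketch of letting users name the variables/entities to monitor; the watch-list format, the variable names, and the notify hook are invented for illustration.

    WATCH = {"Te_core", "beta_N"}        # user-selected entities (invented names)

    def publish_step(step, values, notify=print):
        # Push only the watched subset toward the monitoring/portal side.
        snapshot = {k: v for k, v in values.items() if k in WATCH}
        notify(f"step {step}: {snapshot}")

    publish_step(1, {"Te_core": 4.2, "beta_N": 1.8, "unwatched": 0.0})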

  11. CS Research Goals
  • Comprehensible build systems, automated regression testing
    • Stage in by addressing key components: component scripts, plasma state, driver component
    • Exhaustive regression testing is unlikely to be possible
      • but some limited regressions on a couple of simple cases might be addressable – and would be useful (a sketch follows below)
    • May still be able to provide some tools for limited use by vital components like the Plasma State
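A sketch of what a limited regression on a simple case might look like: compare a run's scalar outputs against stored reference values within a tolerance. The JSON file format and the default tolerance are assumptions.

    import json, math

    def regression_ok(result_file, reference_file, rtol=1e-6):
        with open(result_file) as f:
            got = json.load(f)
        with open(reference_file) as f:
            ref = json.load(f)
        # Every reference quantity must match within the tolerance.
        return all(math.isclose(got[k], ref[k], rel_tol=rtol) for k in ref)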

  12. CS Research Issues
  • Comprehensible build systems, automated regression testing, continued
    • Already facing this with the linking of extant components and Plasma State changes
    • If not a full solution, then at least some protocols to help notify component developers when changes are made that require their intervention (one possible protocol is sketched below)
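One possible notification protocol: stamp the Plasma State with a schema version and have each component check it at startup, failing loudly when developer intervention is needed. The version field and constant are hypothetical.

    EXPECTED_PS_SCHEMA = "2.1"   # hypothetical version this component was built against

    def check_plasma_state_version(ps_header):
        found = ps_header.get("schema_version")
        if found != EXPECTED_PS_SCHEMA:
            raise RuntimeError(
                f"Plasma State schema {found!r} != expected {EXPECTED_PS_SCHEMA!r}; "
                "component developer intervention required")

    check_plasma_state_version({"schema_version": "2.1"})   # passes silently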

  13. Collaborations to Consider
  • CS
    • CCA/TASCS: already have overlap
    • CEDPS (Ian Foster, Jenny Schopf): grid logging, MDS, data placement, rehashed GriPhyN stuff; working with LIGO and the Earth System Grid (Bramley)
    • CScADS (Ken Kennedy): multicore systems
    • PERI (Bob Lucas): Pat Worley will pass info along to SWIM CS folks as useful
    • Earth System Grid (Dean Williams): David Bernholdt available as a contact point

  14. Collaborations to Consider
  • CS
    • PDSI: Petascale Data Storage Institute
    • Scientific Data Management (Arie Shoshani): parallel I/O, some workflow via Kepler (see Scott Klasky)
    • OSG: Open Science Grid, Miron Livny (David Schissel)
    • Keith Jackson (PyGrid); contacts: Dipti and David. Wrap a science app as a web service, all in Python.

  15. Collaborations to Consider
  • CS
    • TOPS, ITAPS, APDEC (Phil Colella), and other refurbished round-1 SciDACs
    • VACET: visualization and analytics CET (Wes Bethel); looking for collaborators (Jeembo and Lee)
  • Fusion-centered projects (but mainly their CS parts)
    • FACETS (John Cary)
    • CPES (Scott Klasky): the portal people will keep him in sight, within hailing distance; need to identify which portal technology he wants to use/develop and look for overlap

  16. Collaborations to Consider
  • EU people (Alain Bécoulet, since replaced by Pär Strand): a similar project, but without significant CS/Math involvement (Don and Lee triangulate his old director). Topics mentioned include:
    • XML schema for machine descriptions – seems of interest to SWIM researchers as well
    • High-level run management
    • Using an XML schema to specify a Plasma State entity (specifies the data entries but not necessarily the actual data object), plus accessors (a sketch follows below)
    • More top-down-driven specification
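A sketch of the XML-schema idea: the schema names the Plasma State entries (not the data itself), and accessors are generated from it. The element and attribute names here are invented for illustration.

    import xml.etree.ElementTree as ET

    SCHEMA = """
    <plasma_state>
      <entry name="Te" units="keV"  rank="1"/>
      <entry name="ne" units="m^-3" rank="1"/>
    </plasma_state>
    """

    def make_accessors(store):
        # Generate one get_<name> accessor per entry declared in the schema.
        accessors = {}
        for entry in ET.fromstring(SCHEMA).findall("entry"):
            name = entry.get("name")
            accessors[f"get_{name}"] = lambda n=name: store[n]
        return accessors

    acc = make_accessors({"Te": [1.0, 2.0], "ne": [1e19, 2e19]})
    print(acc["get_Te"]())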
