

Day 1



Presentation Transcript


1. Day 1

• How many beamlines have equipment unable to support present software?
• Which experiments can't be done now but could be in the future?
• What does current software not do?
• Need readouts in meV and Å⁻¹, and a current display of data that is not esoteric: is a peak an artifact of the instrument?
• Need data simulations during the experiment: do the results correspond to expectations?
• Intelligent systems?
• What is the "2 minutes" based on? What are the time requirements of different instruments?
• Walk in, crawl under the hood, push a button, analyze data versus simulation
• Who will determine if the sample is good? Will a modeling program for structure aid this?
• Experiment pre-planning will allow sample structure determination to match what is expected: analysis software with rapid response, tied into structure databases across the planet

Baker/McGreevy

2. Day 1

• What leads to hypothesis-driven research?
• How to "data dip" in an integrated way?
• Need standard modules describing resolution models for each instrument
• Central repository of software
• Code throw-out: lower the bar, make the development process easy and straightforward
• Will the code be written as open source, so that future use cannot be denied?
• Code openness leads to a responsibility to assist others in using the code; this may lead to reluctance to share
• The neutron community should support sharing of the code!
• Quality assurance is a hurdle; vote with your feet

3. Day 1

• Users should have the option of leaving the facility without data reduction
• Gory details of the sample as metadata? Encourage but do not mandate
• Capture as much metadata as possible automatically: electronic notebook, smart card
• Should software identify data formats and track errors? YES!
• Implications of computer security
• Track the treatment of each data set
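The points above on automatic metadata capture and tracking the treatment of each data set can be sketched as a data structure in which every processing step records itself. This is a minimal illustration, not an SNS design; all names (`DataSet`, `apply`, the treatment steps) are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataSet:
    """A measured data set carrying metadata and its own treatment history."""
    counts: list
    metadata: dict = field(default_factory=dict)   # captured automatically where possible
    history: list = field(default_factory=list)    # one entry per treatment step

    def apply(self, name, func, **params):
        """Apply a treatment and record it, so the data set tracks how it was treated."""
        self.counts = func(self.counts, **params)
        self.history.append({
            "step": name,
            "params": params,
            "time": datetime.now(timezone.utc).isoformat(),
        })
        return self

# Example: normalize to monitor counts, then subtract a flat background.
ds = DataSet(counts=[10.0, 20.0, 30.0],
             metadata={"instrument": "SNS beamline", "operator": "auto-captured"})
ds.apply("normalize", lambda c, monitor: [x / monitor for x in c], monitor=10.0)
ds.apply("subtract_background", lambda c, bg: [x - bg for x in c], bg=0.5)
print(ds.counts)                              # [0.5, 1.5, 2.5]
print([h["step"] for h in ds.history])
```

Because the history travels with the data, a user who leaves the facility without reduction can still reconstruct, or redo, every step later.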

4. Day -1

• Different levels of access for the software repository?
• Benchmarking software is important
• Prioritize software adapted to SNS; who does this?
• Resources will never be available to support all codes
• Need the ability to integrate "personal" code into the SNS framework
• How will the needs of a casual user differ from those of an experienced user?
• Need to know what codes are being written
• Two levels of codes: (1) facility-supported, with a high level of detail; (2) user-supplied, with a lower level of documentation
• Capture the intellectual prowess of the community

5. Day -1

• Going from special-purpose code to general-purpose code is difficult and time-consuming
• Common modules for instruments or analysis functions
• Data reduction will be provided by the SNS, common across instruments; treatment routines will differ, with help provided by the facility
• APIs (application program interfaces) are needed
• The facility will provide instrument-specific output after reduction, capable of being used with analysis programs
• Need assessments of individual code packages to see their potential for use at SNS
• Source code is needed if the facility manages the code: trust but verify
• Identify the classes of customers for software
• Inexperienced users need statistical packages (SAS..) to be integrated with other data treatments: plug and play
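The API and "personal code" bullets above suggest a plug-in pattern: the facility defines one small calling convention, and both facility-validated and user-supplied routines register against it. The sketch below is an assumption about how that could look, not an actual SNS interface; every name in it is invented.

```python
# Hypothetical registry: the common API through which all analysis code plugs in.
ANALYSIS_REGISTRY = {}

def register(name):
    """Decorator by which any routine, facility or personal, joins the framework."""
    def wrap(func):
        ANALYSIS_REGISTRY[name] = func
        return func
    return wrap

@register("facility.rebin")            # facility-supported, validated code
def rebin(values, width):
    """Sum adjacent channels into coarser bins."""
    return [sum(values[i:i + width]) for i in range(0, len(values), width)]

@register("user.scale")                # user-supplied "personal" code, same API
def scale(values, factor):
    return [v * factor for v in values]

def run(name, values, **params):
    """The framework invokes any registered routine uniformly."""
    return ANALYSIS_REGISTRY[name](values, **params)

reduced = run("facility.rebin", [1, 2, 3, 4], width=2)   # [3, 7]
final = run("user.scale", reduced, factor=10)            # [30, 70]
print(final)
```

The design point is that the two levels of code discussed on slide 4 (facility-supported versus user-supplied) differ only in documentation and validation, not in how they are called.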

6. Day -1

• Is a total problem-solving environment needed?
• An entropy module to be pulled in, along with a knowledge base
• Top level from the home institution, plugged in and ported: at home, at the airport, and on the plane
• Be able to cut junk and eliminate other parts of the data, but note the various subtractions
• The only facility product is what comes out of the detectors; all the rest is visualization
• Are there codes available that learn? Codes that are expert systems? Common component architecture (CCA) can be used to integrate pre-packaged data analysis
• Should there be an effort to translate the experience of instrument scientists into an expert system? YES
• An immediate incentive is needed for supplying metadata and ancillary code; convince a few super-users of the benefit
• The electronic notebook should include metadata
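The idea of translating instrument scientists' experience into an expert system can be made concrete as a set of rules that inspect a run and return advice, the way a scientist would flag problems at the beamline. This is a toy sketch under invented assumptions; the rules, thresholds, and field names below are all made up for illustration.

```python
# Hypothetical rule base: each rule encodes one piece of instrument-scientist lore.
RULES = []

def rule(func):
    """Register a rule; rules return an advice string or None."""
    RULES.append(func)
    return func

@rule
def low_statistics(run):
    if run["total_counts"] < 1e4:          # invented threshold
        return "Counting statistics look low; consider a longer run."

@rule
def spurious_peak(run):
    if run.get("peak_at_known_artifact", False):
        return "Peak coincides with a known instrument artifact; treat with caution."

def advise(run):
    """Collect the advice of every rule that fires for this run."""
    return [msg for r in RULES if (msg := r(run))]

print(advise({"total_counts": 5000, "peak_at_known_artifact": True}))
```

Each rule is small enough that an instrument scientist could contribute one directly, which is one way to capture the community's expertise incrementally.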

7. Day -1

• Identify the barriers to using personal codes, and then the added value of using facility programs
• Jennifer White and visualizing the lipids
• User-written programs typically start at the analysis stage; SNS will provide the data reduction programs
• SNS will enable people's science; this is a partnership, not getting in each other's way
• What is the minimum to get in the SNS door? Should SNS be another tool in a toolbox?
• SNS should provide the framework for the full process, yet allow individuals to use their favorite tools; it should be extensible

8. Biologists

• Biologists have high-level requirements but are not code diggers; they need a high-level interface and data formats that allow the use of existing software (CCP-4)

9. Chemistry

• Jennifer White and visualizing the lipids

10. Materials Science

11. Physics

12. Day 1+

• 3-D goggles with colored neutrons for viewing structure, with dispersion lit up on touch
• Flexibility to go down to pulse-level raw data, as well as up to the ultimate massaging
• The ability to simulate instrument resolution may come soon or in the longer term

13. Paul’s summary

Data format
• Standard formats
• Capture (automatically) as much metadata as possible
• Carry processing history
• Carry error propagation
• Carry the full history of analysis (MC, MD..)

Data visualization
• All types of formats
• Instantly at the instrument
• From everywhere (office, online or offline)

Analysis and reduction
• The software architecture should allow plug-ins for legacy code and individual code
• SNS to provide core validated code
• Provide a repository and knowledge base

Future
• Encourage the growth of software with professional programmers and contributions from the community

Other
• Reliability is key (every time it is needed, whatever it is)
• Users should have access to data from the raw-data level to the final image, as desired
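"Carry error propagation" in the summary means that each value should travel with its uncertainty, and that operations on the data should combine those uncertainties automatically. A minimal sketch, assuming independent errors combined in quadrature and Poisson counting statistics (both standard assumptions, not stated in the notes):

```python
import math

class Measured:
    """A value bundled with its standard error, propagated through operations."""
    def __init__(self, value, error):
        self.value, self.error = value, error

    def __sub__(self, other):
        # Independent errors add in quadrature under subtraction.
        return Measured(self.value - other.value,
                        math.hypot(self.error, other.error))

    def scale(self, k):
        # Scaling a value scales its error by |k|.
        return Measured(self.value * k, abs(k) * self.error)

signal = Measured(100.0, math.sqrt(100.0))      # Poisson error = sqrt(counts)
background = Measured(36.0, math.sqrt(36.0))
net = (signal - background).scale(0.5)          # background-subtracted, normalized
print(net.value, round(net.error, 3))           # 32.0 5.831
```

Carrying the error alongside the value from raw counts to final image is what lets a user judge, at any stage, whether a feature is significant or an artifact.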
