
Software Consolidation Tasks, B. Hegner (for the CERN PH/SFT group)


Presentation Transcript


  1. Software Consolidation Tasks
  B. Hegner (for the CERN PH/SFT group)
  Preparation for ECFA HL-LHC Workshop, 17-7-2013

  2. The Context
  Thinking about the future of SW over the next 10 to 20 years, we will be challenged by:
  • Increasing processing demand
  • Increasing parallelism in hardware
  • Uncertainty of production platforms
  Bigger problems mean our software must become simpler if we want to be able to solve them!

  3. Making software simpler
  • Rule #1 of simpler software: Only consider functionality that is really needed
  • Rule #2 of simpler software: Only provide functionality if it isn't already covered elsewhere
  • Rule #3 of simpler software: Don't try to be smarter than you actually are
  • Rule #4 of simpler software: Re-assess #1, #2 and #3 from time to time

  4. Evolution of Languages
  • A lot of high-energy physics software projects are still in their first iteration with OOP and C++
  • The standard library, Boost and other libraries have evolved since then
  • Following that evolution is a prerequisite to stay future-proof
  • Transition to C++11 has started
    • C++14, C++17, etc. to come
    • E.g. major investment in ROOT 6.0
  • Transition to Python 3.0 is rather slow
    • Will that ever fly in our community?
  • We should systematically review whether the approaches we chose are still state-of-the-art (a short modernization sketch follows below)
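As a concrete but hypothetical illustration of what the C++11 transition means in everyday code, the sketch below contrasts the pre-C++11 style of raw owning pointers and explicit iterator loops with smart pointers, auto and range-based for. The Hit type and sumX function are invented for this example and are not taken from any experiment framework.

```cpp
#include <memory>
#include <vector>

// Hypothetical detector-hit type, invented purely for this illustration.
struct Hit { double x, y, z; };

// Pre-C++11 code would typically pass std::vector<Hit*> with raw owning
// pointers and loop via explicit iterators; with C++11, unique_ptr makes
// ownership explicit, and auto plus range-based for remove boilerplate.
double sumX(const std::vector<std::unique_ptr<Hit>>& hits) {
  double sum = 0.0;
  for (const auto& hit : hits) sum += hit->x;
  return sum;
}

int main() {
  std::vector<std::unique_ptr<Hit>> hits;
  hits.push_back(std::unique_ptr<Hit>(new Hit{1.0, 2.0, 3.0}));
  hits.push_back(std::unique_ptr<Hit>(new Hit{4.0, 5.0, 6.0}));
  return sumX(hits) == 5.0 ? 0 : 1;
}
```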

  5. Review the existing code base
  • LHC software is now > 15 years old
    • Requirements have evolved since the initial design
    • Thread-safety wasn't on the radar for a long time
    • How many other requirement changes are there?
  • With present LHC experience the requirements are much clearer than before
    • This common understanding allows us to set up common projects
    • One common project emerging is the sharing of physics validation infrastructure
  • The SW designers will be gone by HL-LHC, so knowledge preservation is a concern
    • Fix everything now that can be or has to be fixed
    • And document everything else!
  • We should have a review of our libraries and should not limit ourselves when
    • Refactoring code for thread-safety
    • Changing interfaces to make them long-term maintainable
    • Deprecating functionality (see the sketch below)
  • Such reviews should not be a one-time effort, but should translate into baseline activities
    • E.g. Geant4 relies on regular review, update and removal of physics models
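A minimal sketch of the kind of refactoring such a review might trigger, using a hypothetical CalibrationService invented for illustration: the shared cache is made thread-safe with a mutex, and the legacy accessor is marked with the C++14 [[deprecated]] attribute so clients can migrate before it is removed.

```cpp
#include <map>
#include <mutex>

// Hypothetical calibration service, invented for illustration only.
class CalibrationService {
public:
  // Legacy accessor kept for now, but flagged so that clients migrate
  // before it is removed (the [[deprecated]] attribute is C++14).
  [[deprecated("use calibration(channel) instead")]]
  double getCalib(int channel) { return calibration(channel); }

  // Refactored, thread-safe accessor: the shared cache is protected by a
  // mutex so that concurrent event-processing threads can call it safely.
  double calibration(int channel) {
    std::lock_guard<std::mutex> lock(mutex_);
    auto it = cache_.find(channel);
    if (it == cache_.end())
      it = cache_.emplace(channel, load(channel)).first;
    return it->second;
  }

private:
  // Stand-in for the real (expensive) lookup, e.g. a conditions database.
  double load(int channel) const { return 1.0 + 0.001 * channel; }

  std::map<int, double> cache_;
  std::mutex mutex_;
};

int main() {
  CalibrationService service;
  return service.calibration(42) > 0.0 ? 0 : 1;
}
```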

  6. Focus on Core Competencies
  • HEP traditionally has a strong do-it-yourself attitude
    • … even within experiments!
  • Core competencies of our community are I/O, statistical analysis and simulation
    • We need to parallelize them from the ground up
  • Other important needs of our community have to be addressed as well:
    • mathematical operations (e.g. VDT)
    • physics data structures (e.g. SoA vs. AoS; see the sketch below)
    • tracking and clustering (e.g. fastjet)
    • experiment frameworks (e.g. GaudiHive)
  • We have to focus and invest in common projects in these fields
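To make the SoA-vs-AoS point concrete, here is a small sketch with an invented Track type (not from any real experiment's event model): the Array-of-Structures layout interleaves all track quantities, while the Structure-of-Arrays layout keeps each quantity contiguous, which is generally friendlier to caches and compiler auto-vectorization.

```cpp
#include <cstddef>
#include <vector>

// Array-of-Structures (AoS): natural to write, but summing only pt strides
// over the unused eta/phi fields and wastes cache bandwidth and vector lanes.
struct TrackAoS { double pt, eta, phi; };

double sumPtAoS(const std::vector<TrackAoS>& tracks) {
  double sum = 0.0;
  for (const auto& t : tracks) sum += t.pt;
  return sum;
}

// Structure-of-Arrays (SoA): each quantity is contiguous in memory.
struct TracksSoA { std::vector<double> pt, eta, phi; };

double sumPtSoA(const TracksSoA& tracks) {
  double sum = 0.0;
  for (std::size_t i = 0; i < tracks.pt.size(); ++i) sum += tracks.pt[i];
  return sum;
}

int main() {
  std::vector<TrackAoS> aos = {{1.0, 0.0, 0.0}, {2.0, 0.0, 0.0}, {3.0, 0.0, 0.0}};
  TracksSoA soa;
  soa.pt = {1.0, 2.0, 3.0};
  soa.eta.resize(3);
  soa.phi.resize(3);
  return (sumPtAoS(aos) == sumPtSoA(soa)) ? 0 : 1;
}
```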

  7. Software Validation
  • Testing and validation have always been a core part of HEP software development
    • Our workflow is different from that of other communities
  • The importance of validation increases
    • We may need to support another major production platform
    • Multithreaded reproducibility is much more complex (see the sketch below)
  • Profiling tools are essential
    • First rule of optimization: know where to optimize!
    • We should use more standards in the community! igProf could be such a standard
  • Validation tools
    • We ought to be able to have a common project among experiments!
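One hedged sketch of what tolerance-based validation can look like once bit-wise reproducibility is no longer guaranteed (e.g. because multithreaded floating-point reductions reorder operations); the validate function and the 1e-9 relative tolerance are illustrative choices, not a prescription from the slides.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <vector>

// Compare freshly produced values (e.g. histogram bin contents) against a
// stored reference within a small relative tolerance, since multithreaded
// floating-point reductions are not guaranteed to be bit-identical.
bool validate(const std::vector<double>& result,
              const std::vector<double>& reference,
              double relTolerance = 1e-9) {
  if (result.size() != reference.size()) return false;
  for (std::size_t i = 0; i < result.size(); ++i) {
    const double scale = std::max(std::abs(reference[i]), 1.0);
    if (std::abs(result[i] - reference[i]) > relTolerance * scale) {
      std::printf("bin %zu differs: %g vs %g\n", i, result[i], reference[i]);
      return false;
    }
  }
  return true;
}

int main() {
  const std::vector<double> reference = {10.0, 20.0, 30.0};
  const std::vector<double> result    = {10.0, 20.0 + 1e-12, 30.0};
  return validate(result, reference) ? 0 : 1;
}
```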

  8. Portability and Preservation
  • We need to facilitate the transition to new platforms and resources
  • Virtualization technology can help
    • It separates hardware and software update cycles
  • A versioned SW distribution is a viable solution for long-term environment preservation
    • E.g. CernVM for NA49/NA61
    • Provides an out-of-the-box solution for clouds
    • A validated environment can be cloned basically anywhere

  9. Dealing with Bugs
  • Don't let bugs enter the software in the first place
    • Better training of the community
    • Test-driven development
    • Important for platform migrations as well
  • Spot bugs early on
    • Run-time testing and validation
    • Static code analysis
  • Make debugging easier
    • What are good tools for multi-threaded debugging? (see the sketch below)
  • We should seriously consider investing more here!
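A minimal example of the kind of concurrency bug these tools target: with a plain counter the code below would contain a data race that run-time checkers such as ThreadSanitizer flag, and std::atomic is one straightforward fix. The worker/event names are invented for illustration.

```cpp
#include <atomic>
#include <thread>
#include <vector>

// Counter shared between worker threads. Declared as a plain 'long' this
// would be a data race; std::atomic makes the increment well defined.
std::atomic<long> nProcessed{0};

void worker(int nEvents) {
  for (int i = 0; i < nEvents; ++i) {
    // ... process one event ...
    ++nProcessed;  // atomic increment, safe across threads
  }
}

int main() {
  std::vector<std::thread> pool;
  for (int t = 0; t < 4; ++t) pool.emplace_back(worker, 1000);
  for (auto& t : pool) t.join();
  return nProcessed == 4000 ? 0 : 1;
}
```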

  10. Conclusions
  • Problems are getting bigger
    • Let's make software simpler!
    • Functional requirements are much clearer than for the LHC
  • We have to make debugging, validation and profiling easier
    • Invest in common tools and knowledge
  • We have to consolidate the existing code base
    • Languages and external libraries have evolved
    • Fix 'outdated' decisions now
    • Review software projects
    • We should dare to refactor things!
  • We have to consolidate on functionality
    • What are the core needs of the community?
    • We should dare to deprecate things!
  • We have to consolidate on knowledge
    • Use more standards
    • Extend training for the community
