
One Semi-Automated Force (OneSAF)


Presentation Transcript


  1. One Semi-Automated Force (OneSAF): User Engagement Points and Processes. Session 6.1. Doug Parsons, Chief Engineer, OneSAF Software Development.

  2. User Processes & Engagement Points: Agenda • OneSAF processes overview • Help desk • User Forum • User feedback • Change Request/PTR processes • Build process • External developer handover process. Briefing objective: discuss the OneSAF processes that engage with our users and inform users of entry points into the OneSAF system.

  3. User Processes & Engagement Points: Process Overview • The Architecture & Integration (A&I) team was tasked from program initiation to develop a robust set of processes to guide development of the OneSAF system. • Unlike typical software acquisition programs, PM OneSAF is essentially the 'prime'. • Developing a set of processes that could guide the efforts of a variety of different task orders became essential. • These processes have been certified at SEI CMMI level V (July 2004) and ISO 9001:2000 (June 2007).* * The CMMI audit was performed on SAIC's Integrated Mission and Simulation Systems Operation, which includes OneSAF A&I.

  4. User Processes & Engagement Points: Internal Processes • The intent is not to provide a review of our internal software development processes or practices; however, there is value in pointing you to where these processes can be found: https://dev.onesaf.net/Architecture_and_Integration/index.html • Pay particular attention to the Electronic Process Guide (EPG): knowledge engineering and conceptual modeling; system definition; product line definition; component development; Integration & Test; internal and external handover processes.

  5. User Processes & Engagement Points: Electronic Process Guide 1. From the OneSAF.net internal development site (https://dev.onesaf.net/ or through the OneSAF Community page), find the Architecture & Integration Task Order link. 2. Locate the EPG button as shown. Note that some of these processes are being reviewed and revised to more appropriately reflect post-OneSAF v1 development and release activities.

  6. User Processes & Engagement Points: Help Desk Process • Addresses a wide variety of issues and 'how do I' questions: installation, databases, training, operation, scenarios, exercises, model development, knowledge acquisition, knowledge engineering, system administration, and software development. • Most prevalent questions addressed to date: installation/media, 13%; OneSAF.net, 10%; terrain database, 8%; general info/data requests, 7%; tools operation (MCT, SCAMT, MSDE, etc.), 7%; DIS and HLA, 6%; behavior models, 5%; network/distributed operations, 4%.

  7. User Processes & Engagement Points: Getting Help Desk Support • Links to help desk support.

  8. User Processes & Engagement Points: User Forum • Purpose is to facilitate and foster cross-OneSAF community discussion and collaboration: tips, problem resolution, good ideas for future capabilities, and sharing plans for co-development and/or future use (e.g., an exercise or experiment). • Discussion boards (most of them): Technical Support (installation and configuration; general operation); Scenario Execution and Generation (scenario creation; terrain; C2; AAR); Multi-Node, Interop and Performance Analysis (repeatability and replication; data loading/collection; model verification); Development and Maintenance (compositions; ERC/SNE; infrastructure; KA/KE; tool, agent and modeling services; system services).

  9. User Processes & Engagement Points: User Forum

  10. User Processes & Engagement Points: User Feedback • Whereas the help desk addresses a wide variety of questions, the user feedback tool offers a mechanism to describe problems or request new capabilities. • Help desk personnel will forward appropriate help tickets to the user feedback tool. • User feedback tickets are regularly reviewed by a Government review board. The User Feedback Review Board (UFRB) will appropriately disposition each ticket: Problem Trouble Report → Engineering CCB; enhancement request → TPO → requirements process; return for more information; cancel.
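
The UFRB disposition flow above can be pictured as a simple routing step. Below is a minimal sketch; the Disposition names and route_ticket helper are purely illustrative and are not part of the actual user feedback tool:

    from enum import Enum

    # Illustrative disposition categories drawn from the UFRB flow described above.
    class Disposition(Enum):
        PROBLEM_TROUBLE_REPORT = "PTR"        # routed to the Engineering CCB
        ENHANCEMENT_REQUEST = "ENHANCEMENT"   # routed to the TPO / requirements process
        RETURN_FOR_INFO = "RETURN"            # sent back to the submitter for more detail
        CANCEL = "CANCEL"                     # closed without further action

    def route_ticket(disposition):
        """Return the next stop for a user feedback ticket (hypothetical helper)."""
        routes = {
            Disposition.PROBLEM_TROUBLE_REPORT: "Engineering CCB",
            Disposition.ENHANCEMENT_REQUEST: "TPO -> requirements process",
            Disposition.RETURN_FOR_INFO: "Submitter (more information requested)",
            Disposition.CANCEL: "Closed",
        }
        return routes[disposition]

    print(route_ticket(Disposition.PROBLEM_TROUBLE_REPORT))  # Engineering CCB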

  11. User Processes & Engagement Points: Entering User Feedback Tickets

  12. User Processes & Engagement Points: Good User Feedback (1/2) Purpose is to help us duplicate the problem and distinguish symptom from root cause. • Descriptive title of the user feedback. • Standalone or distributed? If distributed: compositions running in the exercise; number of machines, and how many are MCTs vs. Simcores; number of entities; have any nodes late-joined? • Interoperating with DIS? Which other applications are in use? • Interoperating with HLA? Which federates are involved? Which FOM and RTI are you using? • Interoperating with C2 systems? Which messages are you using? Which devices/versions are you using? • Terrain database. • Issues involving units/entities should include the composition name, including its resolution. • Order(s) issued to the composition. • Was a work-around used? • Attaching a log file or scenario file capturing the problem is extremely helpful.

  13. User Processes & Engagement Points: Good User Feedback (2/2) • Observation: a detailed description of the issue, such as: location of units/entities when the observation was made (e.g., a 10-digit grid coordinate); the OneSAF login used; the item selected in the editor, terrain features, etc.; what you expected vs. what happened; what units/entities were executing when the observation was made; what the problem was. A description of good user feedback, as well as this sample, can be found on the OneSAF.net community site by selecting Documentation → Processes → User Feedback and then finding User Feedback Tool on the page.
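
To illustrate what a well-formed ticket captures, here is a minimal sketch of a feedback record covering the items from the two slides above; the field names and example values are invented for illustration and do not reflect the actual user feedback tool's schema:

    from dataclasses import dataclass, field

    # Hypothetical ticket structure mirroring the checklist above; not the real tool's schema.
    @dataclass
    class FeedbackTicket:
        title: str                      # descriptive title of the problem
        distributed: bool               # standalone or distributed run?
        compositions: list              # compositions running in the exercise
        entity_count: int               # number of entities in the scenario
        terrain_database: str           # terrain database in use
        orders_issued: list             # order(s) issued to the composition
        observation: str                # what you expected vs. what actually happened
        workaround: str = ""            # work-around used, if any
        attachments: list = field(default_factory=list)  # log or scenario files

    ticket = FeedbackTicket(
        title="Entity halts after move order",            # example values only
        distributed=True,
        compositions=["Example mechanized platoon"],
        entity_count=40,
        terrain_database="Example terrain database",
        orders_issued=["Move Tactically"],
        observation="Expected the platoon to reach the waypoint; the lead entity halted.",
        attachments=["scenario_file", "log_file"],
    )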

  14. User Processes & Engagement Points: PTR Process (UFRB). https://www.onesaf.net/community/systemdocuments/Documentation/ProcessGuide/PTR/PTR_process.html

  15. User Processes & Engagement Points: PTR Tracking

  16. User Processes & Engagement Points: Change Request Process (UFRB)

  17. User Processes & Engagement Points: Build Process • Peer Reviews: integration of TPO/domain participation into the development cycle. Lesson learned: don't close key peer reviews without direct engagement (i.e., face-to-face or telecon) with the user. Being employed successfully with the Knowledge Engineering/Conceptual Modeling process. • Dependency meetings: coordination with internal task order activities; coordinate handovers from co-developers; peer review participation; discuss PTRs for the upcoming release; schedule ECCB and PRB meetings. • Engineering CCB: conduct technical assessment of integrations impacting the baseline (e.g., hardware/software baseline changes, external integration); make recommendations to the CCB regarding the content of the release baseline (i.e., v1.1, v1.5, v2.0, ...).

  18. User Processes & Engagement Points: Process Flow for Future Builds Ten-week build cycle timeline showing build kick-off, dependency meeting (for the next build), handover to I&T (for the next build), integration complete, and test complete. • During each build, three simultaneous activities are ongoing: knowledge engineering; code and unit test; Integration & Test. • Dependency Meetings: task order and co-developer schedule; handovers from co-developers; peer review participation; schedule UFRB meetings; ECCB meetings (agenda).
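
To make the ten-week rhythm concrete, here is a minimal sketch that lays out notional milestone dates from a build kick-off; the week offsets are assumptions for illustration, not the official schedule:

    from datetime import date, timedelta

    def build_milestones(kickoff, weeks=10):
        """Sketch of a ten-week build cycle; milestone offsets are assumed, not official."""
        return {
            "Build kick-off": kickoff,
            "Dependency meeting (for next build)": kickoff + timedelta(weeks=6),
            "Handover to I&T (for next build)": kickoff + timedelta(weeks=7),
            "Integration complete": kickoff + timedelta(weeks=9),
            "Test complete": kickoff + timedelta(weeks=weeks),
        }

    # Example using the Build 5 start date from the following slide.
    for name, when in build_milestones(date(2007, 7, 16)).items():
        print(f"{when:%d %b %Y}  {name}")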

  19. User Processes & Engagement Points: Build 5 Important Dates • Build 5 start: 16 July 2007 • Build 6 dependency meetings: internal, 28 August 2007 (0800–1500); external, 30 August 2007 (1400–1700) • Build 5 complete: 21 September 2007 • Build 6 start: 24 September 2007 • Build 6 complete: 30 November 2007

  20. User Processes & Engagement Points: Handover Process

  21. User Processes & Engagement Points: Software Handover Process. Architecture Compliance Process.

  22. User Processes & Engagement Points: Software Handover Forms • Signature and Feedback Form: documents completion at each phase of the handover process. • External Development Handover Form: this is the biggie! This form is where you begin putting together a handover package. It contains all the documentation required, and the criteria necessary, for a smooth and successful handover. • New/Modified Files Change List: this list should correspond with the files contained in the source build (see Software Included on the External Development Handover Form). Please include full paths for all files. • Known Handover Issues: pretty self-explanatory. Got "uh oh's" coming in? Let us know about them! • Analysis/Design Document: what did you add or change, and how did you accomplish the addition/modification? Feel free to add additional pages or make the cells bigger to accommodate your design descriptions. • List of Test Plans: list any test plans that were developed to test your new and/or modified functionality. • List of JUnit Tests: list all JUnit tests that are new or were modified with your handover. • Domain Documentation List: did you add new units? New models? New data? Excellent! This document allows OneSAF to maintain traceability on the source of the new "stuff": where did you get that data? What is the justification for those models and units?
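
For the New/Modified Files Change List, something as simple as the sketch below can collect full paths for every file in a handover source tree; the directory name is hypothetical, and your configuration management tooling may already produce an equivalent report:

    import os

    def list_handover_files(source_root):
        """Collect full paths for every file under the handover source tree (illustrative helper)."""
        paths = []
        for dirpath, _dirnames, filenames in os.walk(source_root):
            for name in filenames:
                paths.append(os.path.abspath(os.path.join(dirpath, name)))
        return sorted(paths)

    # Hypothetical usage: print the change list to accompany the handover forms.
    for path in list_handover_files("handover_src"):
        print(path)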

  23. User Processes & Engagement Points: Compliance Assessment Checklist • The external developer performs a self-assessment; A&I performs a code review. • Compliance areas: Infrastructure Compliance, Platform Compliance, User Interface Compliance, Process Compliance, Deployment Compliance, Verification Compliance, Performance Compliance.
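
A self-assessment against the checklist could be recorded in a structure like the sketch below; the helper and the pass/fail summary are invented for illustration and are not the actual A&I checklist format:

    # Compliance areas taken from the checklist above.
    COMPLIANCE_AREAS = [
        "Infrastructure Compliance",
        "Platform Compliance",
        "User Interface Compliance",
        "Process Compliance",
        "Deployment Compliance",
        "Verification Compliance",
        "Performance Compliance",
    ]

    def summarize(self_assessment):
        """Print pass/fail per area and flag anything left unassessed (illustrative only)."""
        for area in COMPLIANCE_AREAS:
            status = self_assessment.get(area)
            label = "PASS" if status else ("FAIL" if status is False else "NOT ASSESSED")
            print(f"{area:30s} {label}")

    summarize({"Infrastructure Compliance": True, "Platform Compliance": False})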

  24. User Processes & Engagement Points: Issues and Considerations • Lessons learned: OOSC, USMC CACCTUS, SMDC, CGA. • Two potentially opposing conditions: the handover process needs to stay relatively painless for the external developer, otherwise they may not be willing to provide code back for integration; and the external developer must provide the appropriate review and documentation, otherwise the quality of the baseline may suffer and contain support gaps for future users. • We are currently working with the A&I team to improve the handover process and identify automated tools.

  25. User Processes & Engagement Points: Questions and Discussion
