
GCSS-J Best Agile Practices


Presentation Transcript


  1. GCSS-J Best Agile Practices

  2. GCSS-J Program Overview
  Joint Logistics Environment: visibility of logistics processes, resources, and requirements
  • Where is it? How will it get here? When will it get here?
  • Web-based system (NIPR / SIPR) managed by DISA
  • Retrieves authoritative data on facilities, units, ships, aircraft, etc.
  • Provides decision support tools to analyze, manipulate, and display the data
  • Users collaborate to integrate combat support data
  • Users represent Maintenance, Supply & Services, Health Service & Support, Movement Services, Civil Military Engineering, and Personnel Management
  • Began as a Rapid Improvement Team (RIT) - Dec 2001
  • Working Release 7 - "agile" for approximately 2 years
  LTG Gainey, JS/J4 Ops Sponsor: "I want capability delivered in dog-years!"

  3. GCSS-J Scrum
  [Diagram: the Sprint cycle - Initial Planning, Discovery Planning, Release Planning, Sprint Planning, Daily Scrum, and Sprint Review; the Product Backlog feeds the Sprint Backlog, and each Sprint yields a production-ready feature]
  • Currently supporting 2 releases per year (goal is 4)
  • Multiple (3-5) Sprints per release
  • Development Sprints are one month in duration
  • Sprints are NOT fielded but are available for users on the First Look Site
  Scrum is an agile software development framework. (A backlog-flow sketch follows below.)
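As a loose illustration of the backlog flow in the Sprint-cycle diagram, the sketch below (Python, with entirely hypothetical stories, point values, and capacity) pulls product-backlog items into a one-month sprint backlog; completed items would be staged on the First Look Site rather than fielded.

```python
# Hypothetical product backlog; list order stands in for priority.
product_backlog = [
    {"story": "Improve asset-visibility query", "points": 5},
    {"story": "Add map overlay for unit locations", "points": 8},
    {"story": "Export report to spreadsheet", "points": 3},
]

SPRINT_CAPACITY = 10  # assumed team capacity for a one-month sprint

def plan_sprint(backlog, capacity=SPRINT_CAPACITY):
    """Pull the highest-priority stories that fit within sprint capacity."""
    sprint_backlog, used = [], 0
    for item in backlog:
        if used + item["points"] <= capacity:
            sprint_backlog.append(item)
            used += item["points"]
    return sprint_backlog

# The 5- and 3-point stories fit; the 8-point story waits for a later sprint.
first_look_candidates = plan_sprint(product_backlog)
print([item["story"] for item in first_look_candidates])
```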

  4. GCSS-J Schedule
  [Gantt chart: FY10-FY11 schedule for Release 7.2 - Requirements Analysis; the Agile Development Process (planning, development, review, user assessment) across Sprints a-e; Contractor Integration Test; Security Test & Evaluation; System Acceptance/Qualification Test; Risk Assessment; GCSS 7.2 Test Concept Brief; GCSS 7.3 Test Plan; OT Readiness Review; OT&E (includes site prep, SW installation, and pilot); Fielding Decision Review. Release 7.3: 3 Dec]
  • Currently supporting 2 releases per year (goal is 4)
  • Multiple (3-5) Sprints per release
  • Development Sprints are one month in duration
  • Sprints are NOT fielded but are available for users on the First Look Site

  5. Evaluation Structure
  • Critical Operational Issues (COIs) - Mission based: traced to Joint Capability Areas (JCAs), levels 1-3
  • Measures of Effectiveness / Measures of Suitability (MOEs/MOSs) - Mission based: JCAs levels 4-7, or the OV-5 system activity model
  • Measures of Performance (MOPs) - Task based: traced to the Universal Joint Task List (UJTL), or user input on the tasks they perform
  • Measures of System Attributes (MOSAs) - Technical measures: Key Performance Parameters (KPPs), Critical Technical Parameters (CTPs), Key System Attributes (KSAs)
  The majority of the evaluation structure can be "solution agnostic". (A sketch of the hierarchy follows below.)
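To make the four-level structure concrete, here is a minimal sketch that models the COI → MOE/MOS → MOP → MOSA hierarchy as nested records. The class names follow the briefing's terminology; the example identifiers and descriptions are hypothetical, not actual GCSS-J measures.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MOSA:            # Measure of System Attributes (KPP / CTP / KSA)
    ident: str
    description: str

@dataclass
class MOP:             # Measure of Performance (task based: UJTL or user input)
    ident: str
    task: str
    mosas: List[MOSA] = field(default_factory=list)

@dataclass
class MOE:             # Measure of Effectiveness / Suitability (JCA levels 4-7)
    ident: str
    description: str
    mops: List[MOP] = field(default_factory=list)

@dataclass
class COI:             # Critical Operational Issue (JCA levels 1-3)
    ident: str
    description: str
    moes: List[MOE] = field(default_factory=list)

# Hypothetical instance showing how a technical measure rolls up the hierarchy.
coi_1 = COI("COI 1", "Can users gain visibility of logistics resources?", [
    MOE("MOE 1.1", "Asset visibility queries are effective", [
        MOP("MOP 1.1.1", "Locate unit equipment", [
            MOSA("MOSA 1.1.1.a", "Query response time (CTP)"),
        ]),
    ]),
])
```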

  6. Evaluation Structure (cont.)
  [Table: Measures of Effectiveness (MOEs) traced to Joint Capability Areas (JCAs) and the Universal Joint Task List (UJTL)]

  7. Interoperability Integrated within the Evaluation Structure
  Task MOPs mapped to external interfaces:
  • Each interface is mapped to the mission tasks (MOPs) that require effective end-to-end data exchange to be successful
  • If a data exchange is NOT timely, accurate, complete, and useable, the mission tasks (MOPs) it supports may FAIL during testing or post-fielding
  • Effective in determining which interfaces are truly "critical" for mission accomplishment
  • Example: if GDSS does not meet the interoperability criteria, then four mission tasks (MOPs) could be affected
  • Information exchanges are correlated to the tasks they support
  • A single interface may provide different information exchanges for different tasks
  • Some tasks are mission critical, some are not (determined by users)
  (A mapping sketch follows below.)
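A minimal sketch of the interface-to-task correlation described above: each external interface is mapped to the MOPs it supports, and an interface that misses any of the four exchange criteria flags those tasks as at risk. The interface names, MOP identifiers, and results are illustrative placeholders (only GDSS is named in the slide).

```python
# Hypothetical correlation of external interfaces to the mission-task MOPs they support.
interface_to_mops = {
    "GDSS":   ["MOP 1.1.1", "MOP 1.1.2", "MOP 2.3.1", "MOP 2.3.2"],
    "DLA-IF": ["MOP 1.2.1"],
}

# Assumed test results for the four data-exchange criteria named on the slide.
exchange_results = {
    "GDSS":   {"timely": True, "accurate": False, "complete": True, "useable": True},
    "DLA-IF": {"timely": True, "accurate": True, "complete": True, "useable": True},
}

def tasks_at_risk(interface: str) -> list[str]:
    """Return the mission tasks (MOPs) that could fail if this interface
    does not meet all four data-exchange criteria."""
    if all(exchange_results[interface].values()):
        return []
    return interface_to_mops[interface]

# With this hypothetical data, a GDSS shortfall puts four mission tasks at risk.
print(tasks_at_risk("GDSS"))   # ['MOP 1.1.1', 'MOP 1.1.2', 'MOP 2.3.1', 'MOP 2.3.2']
```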

  8. Assessing Risk to Determine Level of Test
  • Functional Requirements (FRs) developed for each User Story
  • FRs associated with Functional Areas and categorized (New Capability, Enhancement, Minor Change/Fix, No Change)
  • Low-risk items get use-case testing (users not needed)
  • High-risk items get Operational Mission Thread (OMT) testing (users needed)
  • Design of Experiments (DOE) employed to determine key factors and the effects of varied environmental conditions
  [Table: example for the Maintenance and Supply/Services functional areas - FRs 125-127 scored by impact on mission and likelihood of failure; Maintenance (new capability plus minor changes/fixes): risk Moderate, requires OMT testing; Supply/Services (no change from last release): risk Minor, use-case test only / no testing required]
  OMT tests focused on higher risk areas. (A minimal risk-scoring sketch follows below.)
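The slide does not give a scoring formula, so the sketch below assumes a simple impact-times-likelihood product with arbitrary thresholds, just to show how a functional requirement's change type and likelihood of failure could drive the level of test (none, use-case only, or OMT with users).

```python
# Assumed impact weights by change category; not taken from the briefing.
IMPACT = {"no change": 0, "minor change/fix": 1, "enhancement": 2, "new capability": 3}

def test_level(change_type: str, likelihood_of_failure: int) -> str:
    """Map a functional requirement's change type and likelihood of failure
    (0 = none .. 3 = high) to a level of test, using assumed thresholds."""
    risk = IMPACT[change_type] * likelihood_of_failure
    if risk == 0:
        return "no testing required"
    if risk <= 2:
        return "use-case test only (no users needed)"
    return "Operational Mission Thread (OMT) test with users"

# Hypothetical calls illustrating the three outcomes.
print(test_level("new capability", 2))    # OMT test with users
print(test_level("minor change/fix", 1))  # use-case test only
print(test_level("no change", 0))         # no testing required
```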

  9. Best Practices
  • Early Involvement
  - Users and testers collaborate on issues, metrics, data elements, and conditions
  - Develop a mission-based evaluation structure that readily accommodates emerging user stories and functional requirements
  • Risk-based
  - Level of test determined by the risk associated with how the software change impacts the affected tasks/activities
  - "Right sizes" the test scope = less burden on the operational user test
  • Mission-focused
  - Founded in Joint doctrine (JCAs, JPs, UJTLs)
  - Links information exchanges to tasks/missions
  - Answers the "so what" for decision makers

  10. Best Practices
  • Continuous Monitoring
  - Continuous data collection on the operational system
  - Reliability, Availability, Maintainability (RAM)
  • Information Assurance
  - Integrated team (DIA & JITC) scans continuously throughout development
  - Annual assessment of the hosting facility
  • Remote Data Collection
  - Use of Defense Connect Online to observe and record user actions
  (An availability roll-up sketch follows below.)
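As an illustration of rolling continuously collected data into one of the RAM measures, the sketch below computes operational availability over a reporting period from an outage log. The log format, dates, and durations are assumptions, not the program's actual data collection scheme.

```python
from datetime import datetime, timedelta

# Hypothetical reporting period and outage log from continuous monitoring.
period_start = datetime(2011, 1, 1)
period_end   = datetime(2011, 1, 31)
outages = [
    (datetime(2011, 1, 5, 2, 0),  datetime(2011, 1, 5, 3, 30)),   # 1.5 h outage
    (datetime(2011, 1, 20, 23, 0), datetime(2011, 1, 21, 1, 0)),  # 2.0 h outage
]

total_time = period_end - period_start
downtime = sum((end - start for start, end in outages), timedelta())

# Operational availability = uptime / total time over the reporting period.
availability = (total_time - downtime) / total_time
print(f"Operational availability: {availability:.4%}")
```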

  11. Lessons Learned
  • Requirements & the JCIDS Process
  - Requirements evolve based on warfighter needs
  - Need the agility to modify the evaluation plan (COIs, MOEs/MOSs, MOPs) for every GCSS-J increment/release
  • TES / TEMP
  - Not a good fit for Agile programs
  - Cannot write detailed descriptions of test events, objectives, and scope as we currently do
  - Shift emphasis to the test plan (release) or test cases (sprint)
  - Does not include key stakeholder endorsements (JS/J6 and DAA)
  • Embedded Instrumentation
  - Automation is essential to Agile methods
  - Requires built-in instrumentation to support lifecycle testing
  • Test Integration at the Sprint Level
  - All test disciplines (DT, OT, IOP, security)
  - Test windows too small for a statistically relevant RAM evaluation
  - Information Assurance (IA) evaluations in hosting facilities

  12. JITC... Experts in testing & certification, accelerating the Nation's IT dominance
