
Space Network Access System (SNAS) Test Readiness Review



  1. Space Network Access System (SNAS) Test Readiness Review June 6, 2008 NASA Code 452 Space Network (SN) Project

  2. TRR Agenda • Purpose of Review Rose Pajerski • System Overview • Requirements Overview Chii-der Luo • Software Readiness • Hardware Readiness Merri Benjamin • Security Status Joe Clark • Acceptance Test Readiness Merri Benjamin • Transition and Training David Warren • Wrap Up and Next Steps Rose Pajerski

  3. Purpose of Review • The objectives of the review are to provide a comprehensive briefing to • Review SNAS requirements and testing activities to date • Present acceptance test plans and activities • Brief on the system, personnel, and documentation readiness for acceptance testing • Demonstrate the system’s readiness to support acceptance testing at WSC • Readiness criteria • All applicable functional, unit, subsystem, etc. testing has been successfully completed • Test objectives are clear; the system and environment are configured; all test documentation is complete • All RFAs / issues from previous reviews have been satisfied according to plan • All known significant system discrepancies have been identified and dispositioned • Interfaces are under configuration management or as specified in the test plan • Testers identified, trained, and ready • Upon completion of the review, the SNAS team seeks approval to proceed with acceptance testing

  4. Review Board • Lynn Myers, (chair), NASA Code 450 • Markland Benson, NASA Code 583 • David Campbell, NASA Code 584, HST • Gregory Coombs, HTSI Mission Integration & Operations • Bryan Batson, LM/JSC/MSOC Engineering

  5. Project Overview Rose Pajerski

  6. Organization • SN Project Manager, Code 452: Keiji Tasaki • SNAS Product Development Lead: Rose Pajerski • Resource/Business Manager: Paula Tidwell • IT Security: Curtis Emerson • Code 451 Network Integration Management Office: Scott Greatorex / NENS • Systems Engineering Contractor: ITT • Implementation Contractor: NENS • Product Management Contractor: PAAC-II

  7. Prior Milestones • Project Milestones • System Requirements Review 7/08/03 • NENS Task #20 SNAS PREP 4/04 - 10/04 • NENS Task #60 SNAS start 3/04/05 • Delta-System Requirements Design Review 4/28/05 • Preliminary Design Review 9/12/05 • Critical Design Review 5/04/06 • Implementation 6/5/06 - 7/20/07 • Integration/System Testing 8/07 - 5/08 • Beta Testing 8/07 - 8/08 • Server Shipment to WSC 5/06/08 - 5/12/08 • System Installation and Checkout 5/14/08 - 5/30/08

  8. Moving Forward • Milestone and Scheduled Completion • Test Readiness Review: today • Acceptance Testing: 7/25/08 • CIM #8: July • Beta Testing: 8/15/08 • Transition to WSC Operations: July - September • Training: August • User’s Guide: August • Server Operator’s Guide: August • ORR: ~8/28/08 (date TBA) • Operational Availability (NCCDS): 9/02/08

  9. Customer Community Milestones • Group Customer Interface Meetings • CIM #1, April 13, 2005 Pre-SRR • CIM #2, June 14, 2005 Pre-PDR • CIM #3, August 25, 2005 Post-PDR • CIM #4, January 26, 2006 Pre-CDR • CIM #5, November 11, 2006 Build 1 demo • CIM #6, March 28, 2007 Build 2 demo • CIM #7, July 26, 2007 Build 3/4 demo • Beta Releases & Participants • 1st beta: 4/07 - 7/16/07 (Build 2 – NOMs, WSC) • 2nd beta: 7/07 - 11/29/07 (Build 3/4 - NOMs, WSC, HST) • 3rd beta: 11/07 - 3/03/08 (NOMs, WSC, HST, JSC, TRMM) • 4th beta: 3/08 - 4/18/08 (NOMs, WSC, HST, JSC, TRMM, SP&M, SPTR) • 5th beta: 4/08 - 5/30/08 • Release 0.1 06/04/08

  10. Project Documentation • System Requirements Doc. (DCN 002) CCB approved (4/30/08) • Operations Concept Doc. (DCN 002) CCB approved (11/15/06) • ICD between DAS/SNAS CCB approved (5/03/06) • ICD between EPS/SNAS CCB approved (11/15/06) • ICD between SN/CSM (DCN 002) CCB approved (10/29/07) • Security Documentation 452 Approved (10/29/07) • System Test Document Final, 5/16/08 • Acceptance Test Plan Final, 6/04/08 • MOC & O&M Client Users Guides Review drafts • Server Operators Guide Review draft

  11. Requirements Overview Chii-der Luo

  12. Requirements • Current version: SRD DCN 002 (4/08) • 860 baseline requirements • Security requirements added for NPR 2810.1A compliance • Information System Security Officer and System Administrator roles • System Logging changes • Auditing • Updates: clarification of State Vectors required by JSC; clarification of requirements on orbital data processing and recurrent scheduling • SRD DCN 001 (4/06) • External interface requirements added • User Planning System processes • Recurrent Scheduling • Orbital Data Processing • External Processing Interface • Report Transfers • Additional Bulk File Type

  13. Requirements Verification & Test Cases • Requirement verification disposition • Verification methods • 689 Test (covered by 70 test cases) • 33 Demonstration • 4 Analysis (RMA) • 44 Inspection (including s/w standards – 6) • 28 Security • 62 Recommended for removal in DCN 003 • The six dispositions sum to the 860 baseline requirements • Test case development process • Development team generated drafts during Development Testing • Reviewed by ITT • Updated based on comments • Reviewed by ITT and WSC • Finalized by WSC

  14. Requirements Traceability Verification Matrix

  15. Software Readiness Chii-der Luo

  16. Servers and Components • Open IONet Server • Services Access Manager (SAM) provides MOC & O&M Client connection • Closed IONet Server • Services Enabler (SvE) controls all message processing • Client messaging to and from servers • Control and Monitoring of all internal connections • Data collector and transfer activator • Forwards NCC and DAS messages to the appropriate Clients • SNAS-NCC Interface (SNIF) & SNAS-DAS Interface (SDIF) • Monitors DB for outgoing Client messages ready for transmission • Transmits all ICD formatted messages to NCC/DAS • Receives all NCC/DAS responses and alerts, and updates DB • SAM for Client connections via Closed IONet • Data Server • Oracle RDBMS • Data Server Data Manager (DSDM) for heavy-duty DB interactions
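
The slide above describes a database-mediated message flow: SNIF/SDIF watch the database for outgoing Client messages, transmit ICD-formatted messages to NCC/DAS, and write responses back. The Java sketch below illustrates that polling pattern only; it is a hypothetical example, not SNAS source code, and the JDBC URL, credentials, table, and column names are invented for illustration.

```java
// Hypothetical sketch of the SNIF-style polling loop described on slide 16.
// Table/column names, the JDBC URL, and credentials are illustrative only.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class OutboundMessagePoller {
    private static final String PENDING_SQL =
        "SELECT msg_id, payload FROM outbound_msgs WHERE status = 'READY'";
    private static final String MARK_SENT_SQL =
        "UPDATE outbound_msgs SET status = 'SENT' WHERE msg_id = ?";

    public static void main(String[] args) throws Exception {
        Connection db = DriverManager.getConnection(
            "jdbc:oracle:thin:@dataserver:1521:snas", "snif", "changeit");
        while (true) {
            try (PreparedStatement query = db.prepareStatement(PENDING_SQL);
                 ResultSet rs = query.executeQuery()) {
                while (rs.next()) {
                    long id = rs.getLong("msg_id");
                    byte[] payload = rs.getBytes("payload");
                    transmit(payload);           // send ICD-formatted message to NCC/DAS
                    try (PreparedStatement mark = db.prepareStatement(MARK_SENT_SQL)) {
                        mark.setLong(1, id);     // record delivery in the DB
                        mark.executeUpdate();
                    }
                }
            }
            Thread.sleep(1000);                  // poll interval; value is illustrative
        }
    }

    private static void transmit(byte[] payload) {
        // Placeholder: a real interface process would write the message to its
        // NCCDS or DAS socket connection per the applicable ICD.
    }
}
```

One consequence of this pattern, consistent with the slide, is that Clients never connect to NCC/DAS directly; only the Closed IONet server processes exchange traffic with those systems.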

  17. Configuration Control • Development Environment • Changes controlled using a CM tool • Concurrent Versions System (CVS) set up on the development server • Eclipse IDE environment established on developer workstations connected to CVS • Code checked into CVS after being successfully compiled/linked and unit tested by developers • Versions can be tagged at specific Build milestones • Build scripts were developed and executed on the development server to create new server processes and client executables • Clients are copied to remote computers and tested

  18. Development Testing • Peer code reviews • Recorded issues in the Jupiter plug-in to the Eclipse development tool • Used the freeware FindBugs tool to streamline code reviews • Unit testing • Used JTest and JProbe • Integration tests • Tested updated functionality • Used to develop and validate test cases • Exercised all subsystems using MOC and O&M Clients • System tests • Used developed test cases • Regression tests • Recorded and tracked problems in Bugzilla • Open discrepancies to be transferred to the Comprehensive Discrepancy System (CDS) as Internal Discrepancy Reports (IDRs)
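
To make the unit-testing step above concrete, here is a minimal JUnit 4 sketch of the kind of test a developer might run before a CVS check-in. Both the ScheduleRequestValidator class and its validation rule are hypothetical stand-ins, not actual SNAS code.

```java
import static org.junit.Assert.assertFalse;
import static org.junit.Assert.assertTrue;
import org.junit.Test;

// Hypothetical class under test: validates a schedule request type and SIC.
class ScheduleRequestValidator {
    boolean isValid(String requestType, String sic) {
        return requestType != null && !requestType.isEmpty()
            && sic != null && !sic.isEmpty();
    }
}

public class ScheduleRequestValidatorTest {

    @Test
    public void acceptsRequestWithTypeAndSic() {
        ScheduleRequestValidator v = new ScheduleRequestValidator();
        assertTrue(v.isValid("SAR", "1234"));  // a populated request passes
    }

    @Test
    public void rejectsRequestWithEmptySic() {
        ScheduleRequestValidator v = new ScheduleRequestValidator();
        assertFalse(v.isValid("SAR", ""));     // the SIC field is required
    }
}
```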

  19. Development Environment

  20. Beta Testing • User Planning System (UPS) and Space Web Services Interface (SWSI) engineers applying operational experience • Beta testers • SWSI users: WSC NCCDS & DAS Ops, SWIFT/GLAST/AIM NOMs, SP&M • UPS users: HST, TRMM, JSC, SP&M • ITT • Major functionality tested • Request generation capability • DAS / NCC file transfers • EPS message transfers • Orbital data processing • Recurrent scheduling • Four Beta versions since 8/07 • Continued use of Bugzilla for problem reporting and tracking

  21. Beta Testing User Community • GSFC Network Operations Managers • Gamma-ray Large Area Space Telescope (GLAST) – DAS/NCCDS scheduling • Aeronomy of Ice in the Mesosphere (AIM) – NCCDS scheduling • Wide-field Infrared Survey Explorer (WISE) – DAS/NCCDS scheduling • Hubble Space Telescope (HST) • STScI (Baltimore) – forecast period scheduling • STOCC (GSFC) – active period scheduling • Johnson Space Center (JSC) • User Planning System (UPS) Engineering – forecast/active scheduling and recurrent scheduling • Tropical Rainfall Measuring Mission (TRMM) • Recurrent scheduling & orbital data processing • White Sands Complex • Demand Access System (DAS) engineering • Network Control Center Data System (NCCDS) engineering

  22. Beta Testing Configuration • Remote Client communication to servers via • GSFC’s Central Network Environment (CNE) • NASA’s Open IONet • Connected to WSC’s ANCC and DAS HMD • Data populated from O&M • Official mission data • TDRS IDs/Groups, SICs, SSCs, PEs (from ANCC) • Service and UPD definitions (from SWSI) • O&M-approved data (from Mission Manager) • User accounts and roles, SSC changes, PE changes • User generated (realistic Beta testing) • All mission defaults and characteristics, and EPS node data • DAS schedule & playback requests • NCC schedule requests • Orbital and vector data via file imports

  23. Beta (I&T) Environment

  24. Development/Beta Testing Results • Discrepancies by source: 81 I&T, 36 Beta (HST), 6 Beta (ITT), 8 Beta (JSC), 23 Beta (NOM), 22 Beta (WSC), 223 System Testing = 399 total • Discrepancies by disposition: 373 Resolved, 21 Wishlist/Enhancement, 5 Open = 399 total • No open priority 1 or 2 discrepancies • 5 open discrepancies of priority 3, 4, or 5

  25. Continued Use of Beta System • Beta version will be kept in step with AT version • Continued UPS user familiarization • MOC Operational capabilities • NCCDS request processing • Bulk schedule request ingest • Orbital data processing • TSW transmission • Recurrent scheduling • Auto-forwarding • Report generation • Graphics timeline • Comments and bug reports will be accepted • Continued SWSI user familiarization • MOC Operational capabilities • DAS and NCCDS request processing • Bulk schedule request ingest • UPD and GCMR processing • Vector transfer • TUT access and retrieval • Request summaries • O&M Client functionality • Comments and bug reports will be accepted

  26. Hardware Readiness Merri Benjamin

  27. Operational (AT) Environment

  28. Operational String - Status • Systems Configuration • SNAS Servers • Installed under EC TO60-1 • Two racks in WSC CDCN (Open Servers and Closed Servers) • Connected to NISN Open and Closed IONet equipment racks in the WSC GCE • Software and database configured on the system • Remote System Admin access from GSFC via IONet verified • End-to-end connections from Client to Servers to ANCC and DAS HMD verified • NISN IONet modifications • Bandwidth increased on Closed IONet routers to accommodate SNAS data • Modifications to the WSC Open IONet interface to accommodate the SNAS system and increased bandwidth allocations

  29. Security Overview Joe Clark

  30. SNAS Security Boundary

  31. SNAS Security Categorization (1 of 2) • Guidance taken from • NIST 800-60 • NPR 2810.1A • Discussions with representatives from Code 731 and Code 453 • SNAS has been determined to be a low-impact system under FIPS 199 guidelines because the loss of confidentiality, integrity, or availability might: • cause a degradation in mission capability to an extent and duration that the organization is able to perform its primary functions, but the effectiveness of the functions is noticeably reduced; • result in minor damage to organizational assets; • result in minor financial loss; or • result in minor harm to individuals

  32. SNAS Security Categorization (2 of 2)

  33. SNAS Security Documentation Status • Risk Assessment Report • Approved 29 October 2007 • SNAS Security Plan • Approved 29 October 2007 • Contingency Plan has been incorporated in the Security Plan • Self-Assessment Checklist • Approved 29 October 2007 • SNAS will become a subsystem of WSC • The SNAS Security Plan will be incorporated into the WSC Security Plan

  34. Acceptance Test Readiness Merri Benjamin

  35. Test Environment – Systems Resources for AT • SNAS Servers – Operational String • Connections to ANCC and DAS HMD verified • Connection to NCCDS and DAS Ops blocked • Software Release 0.1 loaded • SNAS Database configured - populated with required pre-test configurations (i.e., test SICs and SSCs)

  36. Test Environment – Systems Resources for AT (cont'd) • ANCC configuration – shared SN resource • Pre-test database configurations completed • No interactive UPD updates – not a concern for AT • DAS HMD – shared SN resource • TO-83 (DASE Task Order) activities concurrent with SNAS AT – not a concern for test integrity; modifications to the DAS HMD are not related to DAS functionality • Client Platforms – multiple platforms available to AT testers • MOC Client on test platforms • O&M Client on test platforms

  37. Test Plan • AT Objectives • Verification of MOC Client functionality • Verification of O&M Client functionality • Verification of data flows between Client, Open Server system, and Closed Server system • Verification of data flows between Closed Server system and ANCC and DAS HMD • Verification of Data Server functions for data storage and retrieval • ATP Completed - testing to requirements – RTVM in ATP • Functional Requirements • User interface • Interactions with NCCDS • Interactions with DAS • Operational Requirements • Operational procedures • Maintenance • Security

  38. Test Plan (con’t) • Test Cases - Five Test Groups • System Configuration for Mission and Users • Logins, Mission Setups, etc. • MOC Client Functionality • Orbital data processing, NCCDS and DAS scheduling, Real-time monitoring, Reports and Queries, etc. • O&M Client Functionality • System monitoring, customer configuration update requests, etc. • Security tests • System Performance tests

  39. ATP Test Case Groups (1 of 6) • System Configuration for Mission and Users • User Configuration Test Group • SNAS-MOC-USER-LOGIN • SNAS-MOC-USER-PREF • SNAS-MOC-VIEW-STAT • SNAS-MOC-VIEW-SYS • SNAS-MOC-USER-WORKSPACE • Mission Configuration Test Group • SNAS-MOC-MSNSETUP-USERACCT • SNAS-MOC-MSNSETUP-SPCCHRS • SNAS-MOC-MSNSETUP-EPSCFG • SNAS-MOC-MSNSETUP-NCCSSC • SNAS-MOC-MSNSETUP-DSCHPARM • SNAS-MOC-MSNSETUP-PE • SNAS-MOC-MSNSETUP-EDITSUPER • SNAS-MOC-MSNSETUP-ORBITAL

  40. ATP Test Case Groups (2 of 6) • MOC Client Functions • Import Data Test Group • SNAS-MOC-OVV-GEOC • SNAS-MOC-OVV-GEOD • SNAS-MOC-OVV-IMPVEC • SNAS-MOC-OVV-TCWTSW • SNAS-MOC-OVV-TRANSTSW • SNAS-MOC-OVV-UAVPSAT • NCC Scheduling Test Group • SNAS-MOC-SCHED-TUT • SNAS-MOC-SCHEDNCC-SAR • SNAS-MOC-SCHEDNCC-SDR • SNAS-MOC-SCHEDNCC-RR • SNAS-MOC-SCHEDNCC-ASAR • SNAS-MOC-SCHEDNCC-WLR • SNAS-MOC-SCHED-IMPORT-NCC • SNAS-MOC-EPS-AUTONCC • SNAS-MOC-SCHEDNCC-RECURRENT

  41. ATP Test Case Groups (3 of 6) • MOC Client Functions (cont'd) • DAS Scheduling Test Group • SNAS-MOC-SCHEDDAS-TVR • SNAS-MOC-SCHEDDAS-RAR • SNAS-MOC-SCHEDDAS-RAMR • SNAS-MOC-SCHEDDAS-RADR • SNAS-MOC-SCHEDDAS-PBKS • SNAS-MOC-SCHEDDAS-PBKMR • SNAS-MOC-SCHEDDAS-PBKDR • SNAS-MOC-SCHED-IMPORT-DAS • SNAS-MOC-EPS-AUTODAS • Scheduling Tool Test Cases (NCC & DAS) • SNAS-MOC-SCHEDTOOLS-ACTIVE • SNAS-MOC-SCHEDTOOLS-BULKMOD • SNAS-MOC-SCHEDTOOLS-REQUEST • SNAS-MOC-SCHEDTOOLS-TIMELINE • SNAS-MOC-SCHEDTOOLS-TIMESHIFT • SNAS-MOC-SCHED-TSWSUM

  42. ATP Test Case Groups (4 of 6) • MOC Client Functions (cont'd) • Real-time Monitoring Test Group (NCC & DAS) • SNAS-MOC-CONTMON-REALTIME • SNAS-MOC-USER-SWITCHUSER • Report and Query Test Group • SNAS-MOC-RPTS&QRYS-RPTS • SNAS-MOC-RPTS&QRYS-USERENV • SNAS-MOC-RPTS&QRYS-QUERIES

  43. ATP Test Case Groups (5 of 6) • O&M Client Functions • User Configuration Test Group • SNAS-OM-USER-LOGIN • SNAS-OM-SYSMON-STAT • SNAS-OM-SYSMON-SYS • SNAS and Mission Configuration Test Group • SNAS-OM-MSNMAINT-TDRS • SNAS-OM-MSNMAINT-NCCCONN • SNAS-OM-MSNADMIN-SIC • SNAS-OM-MSNADMIN-DASSSC • SNAS-OM-MSNADMIN-NCCSSC • SNAS-OM-MSNADMIN-MISC • SNAS-OM-MSNADMIN-PURGE • SNAS-OM-USERADM-ACCT • SNAS-OM-SYSMSG-BCAST • SNAS Monitoring Test Group • SNAS-OM-SYSMON-MOC • SNAS-OM-SYSMON-NCC

  44. ATP Test Case Groups (6 of 6) • Security Test Group • Roles and Responsibilities • NIST Controls – General • NASA Wide Common Security Controls • Computer Support and Operations • Network Security Operations • Routine Monitoring • Periodic Testing and Security Controls Requirements • Incident Handling and Reporting • Awareness and Training • Account Management • Logical Access Requirements • Audit Trails and Accountability • Goddard Procedures and Guidelines • Server System and System Performance Test • SNAS-SYSTEM-PERFORM • SNAS-SYSTEM-ETE • SNAS-SYSTEM-HA

  45. RTVM Test Completion • RTVM fields to be populated during AT • Verification Status • Passed • Passed Conditionally – test passed with some reservations • Failed • Blocked – the test could not be performed • Requirement Status • Fully Met • Partially Met • Not Met • Approval Status • Approved • Disapproved
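
For illustration only, the three RTVM fields above map naturally onto simple enumerations; this Java sketch is an assumption about how a matrix entry could be modeled, not project code.

```java
// Hypothetical model of one RTVM row as populated during AT (slide 45).
public class RtvmEntry {
    enum VerificationStatus { PASSED, PASSED_CONDITIONALLY, FAILED, BLOCKED }
    enum RequirementStatus  { FULLY_MET, PARTIALLY_MET, NOT_MET }
    enum ApprovalStatus     { APPROVED, DISAPPROVED }

    String requirementId;              // the SRD requirement being verified
    VerificationStatus verification;   // outcome of the test run
    RequirementStatus requirement;     // how fully the requirement was met
    ApprovalStatus approval;           // review disposition
}
```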

  46. Test Reporting • AT Discrepancy Reporting • SN Comprehensive Discrepancy System (CDS) • Internal Discrepancy Report (IDR) Database • Test Status Meetings • Testers, System Developers, and Task Monitor • Review all open discrepancies and determine a plan for resolution • Identify shared-resource contentions and coordinate resource availability with WSC management (DAS HMD) • Review valid discrepancies against concurrent on-going Beta testing • Determine plan for resolution • No patches delivered unless required to continue AT or of minimal impact to testing (i.e., minimal regression testing, no impact to the integrity of test results) • Plan to meet twice per week • Final Test Report and PVM will be delivered following completion of AT

  47. Test Resources • Testers Identified • WSC engineers with NCC and DAS experience • NCC & DAS engineers trained in July 2007 at GSFC • Participated in Beta testing • Supporting resources • Customer MOC • Input of project unique configuration and data • Orbital Data Processing • Recurrent Scheduling • EPS-Client message exchanges with legacy systems • WSC DBA and SNAS SA • Maintain system configuration • Update database configurations

  48. AT Risks and Mitigation • If the test system connects to either NCC or DAS Ops during testing, there could be impacts to on-going SN mission operations. • Rated: likelihood is LOW; consequence is HIGH • Mitigation: • Test SICs will be AT-unique • Property files will be adjusted to limit access • Reduced risk to LOW • If on-going Beta testers and AT testers share the same test systems (i.e., ANCC and DAS HMD), there could be impacts to testing. • Rated: likelihood is MEDIUM; consequence is HIGH • Mitigation: • Test SICs and/or SUPIDENs will be AT-unique • Reduced risk to LOW
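
The first mitigation above limits access through property files. The sketch below shows one way such a guard might look in Java; the snas.properties file name, the allowed.hosts key, and all host names are hypothetical assumptions, not the actual SNAS configuration.

```java
import java.io.FileInputStream;
import java.util.Arrays;
import java.util.List;
import java.util.Properties;

// Hypothetical connection guard: refuse any host not listed in the property
// file, so a test string cannot reach operational NCC/DAS systems.
public class ConnectionGuard {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        try (FileInputStream in = new FileInputStream("snas.properties")) {
            props.load(in);   // e.g., allowed.hosts=ancc-test.example,das-hmd.example
        }
        List<String> allowed =
            Arrays.asList(props.getProperty("allowed.hosts", "").split(","));

        String target = "ncc-ops.example";   // hypothetical operational host
        if (!allowed.contains(target)) {
            throw new IllegalStateException(
                "Refusing connection to non-test host: " + target);
        }
    }
}
```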

  49. Open AT Issues • Version of DAS HMD to be used for SNAS AT: discussions continuing with the DASE project. • Contention for DAS HMD resources during the time planned for SNAS testing: DAS Enhancement Project (TO-83) activities will be on-going during SNAS AT; work with WSC scheduling to optimize use of resources.

  50. Transition and Training David Warren
