Metrics Based Approach for Evaluating Air Traffic Control Automation of the Future
Purpose • Provide overview of air traffic control automation system metrics definition activity • Motivation • Process • Present comparison of Host Computer System (HCS) radar tracks to GPS-derived aircraft positions
Definitions • Software Testing • Process used to help identify the correctness, completeness, security, and quality of developed computer software • "Testing is the process of comparing the invisible to the ambiguous, so as to avoid the unthinkable happening to the anonymous." James Bach (contemporary author and founder of Satisfice, a test training and consulting company) • "Testing can show the presence of errors, but never their absence." Edsger W. Dijkstra (renowned Dutch computer scientist) • Two Fundamental Processes • Verification • Building the product right (e.g., determining that equations are implemented correctly) • Validation • Building the right product (e.g., solving the right equations)
Why is this important to the FAA? • En Route Automation Modernization (ERAM) • Replaces the En Route Host Computer System (HCS) and its backup • ERAM provides all of today’s functionality and: • Capabilities that enable National Airspace System evolution • Improved information security and streamlined traffic flow at our international borders • Additional flight radar data processing, communications support, and controller display data • A fully functional backup system, precluding the need to restrict operations as a result of a primary system failure • Improved surveillance processing performance using a greater number/variety of surveillance sources (e.g., ADS-B) • Stand-alone Testing and Training capability
ERAM Test Challenges • Limited funding • Installed and operational at 20 sites in 2008-9 • System Requirements • 1,298 in FAA System Level Specification • 4,156+ in contractor System Segment Specifications • 21,906 B-Level “shalls” • Software: 1.2 million SLOC • COTS/NDI/Developmental mixture • Numerous potential impacts, significant changes • ATC Safety, ATC Functions, System Performance, RMA, ATC Efficiency • Replacement of 1970s legacy software that has evolved to meet today’s mission
Metrics Based Approach • Formation of Cross Functional Team • Members from ERAM Test, Simulation, Human Factors, System Engineering, Air Traffic Controllers, and others… • Charter • “To support the developmental and operational testing of ERAM by developing a set of metrics which quantify the effectiveness of key system functions in ERAM” • Focus extends beyond requirement-based testing, with a validation emphasis linked directly to services • Targeted system functions – Surveillance Data Processing (SDP), Flight Data Processing (FDP), Conflict Probe Tool (CPT), Display System (DS)
Background • Metrics may be absolute or comparative in nature • Comparative metrics will be applied to current air traffic control automation systems (and later to ERAM) • Measure the performance of the legacy En Route automation systems in operation today to establish a benchmark • Allow direct comparison of similar functionality in ERAM • Absolute metrics would be applied to FAA standards • Provide quantifiable guidance on a particular function in ERAM • Could be used to validate a requirement • Task phases • Metrics Identification • Implementation Planning • Data Collection/Analysis
Background (cont.) • Identification Phase – A list of approximately 100 metrics was mapped to the Air Traffic services and capabilities found in the Blueprint for NAS Modernization 2002 Update • Implementation Planning Phase – Metrics have been prioritized to generate initial reports on a subset of these metrics • Data Collection/Analysis Phase – Iterative process
Iterative Process • A series of data collection/analysis reports (“drops”) is generated in the targeted system areas • Generates timely reports for the test group • Documentation is amended as the process iterates
Example Metrics • High Priority Metric – false alert rate of the Surveillance Data Processing (SDP) Safety Alert Function • Direct link to ATC Separation Assurance from the NAS Blueprint • Affects several controller decisions: aircraft conflict potential, resolution, and monitoring • Directly observable by the controller and impacts workload • Several ERAM requirements – e.g., “ERAM shall ensure that no more than 6 percent of the declared alerts are nuisance alerts…” (see the sketch below) • Lockheed Martin is using it in their TPM/TPI program • Low Priority Metric – wind direction accuracy for Flight Data Processing (FDP) Aircraft Trajectory • Trajectory accuracy is already a high priority metric • Potentially affects controller decisions, but only indirectly by increasing trajectory prediction accuracy • Not directly observable by the controller
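As a minimal illustration of how a rate-based requirement like the 6 percent nuisance alert ceiling could be checked, the Python sketch below counts nuisance alerts among declared alerts. The record format and the "nuisance" flag are hypothetical stand-ins for whatever the test tooling actually records.

```python
# Minimal sketch: false (nuisance) alert rate over a set of declared
# safety alerts. The alert records and "nuisance" flag are hypothetical.

def nuisance_alert_rate(alerts):
    """Fraction of declared alerts that were judged nuisance alerts."""
    if not alerts:
        return 0.0
    return sum(1 for a in alerts if a["nuisance"]) / len(alerts)

declared_alerts = [
    {"id": 1, "nuisance": False},
    {"id": 2, "nuisance": True},
    {"id": 3, "nuisance": False},
    {"id": 4, "nuisance": False},
]
rate = nuisance_alert_rate(declared_alerts)
print(f"nuisance alert rate: {rate:.1%}")        # 25.0% in this toy example
print("meets 6 percent requirement:", rate <= 0.06)
```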
High Priority Metrics FY05/06 • Surveillance Data Processing (SDP) • Positional accuracy of surveillance tracker • Conflict prediction accuracy of Safety Alert Functions • Flight Data Processing (FDP) • User Request Evaluation Tool (URET) trajectory accuracy metrics • Comparison of route processing (HCS/URET & ERAM) • Forecast performance of auto-hand-off initiate function • Conflict Probe Tool (CPT) • URET conflict prediction accuracy metrics for strategic alerts (missed and false alert rates), working closely with development contractor (scenarios, tools, etc.)
High Priority Metrics FY05/06 (cont.) • Display System (DS) • By En Route Automation Group • DS Air Traffic Function Mapping to ATC Capabilities • By NAS Human Factors Group • Usage Characteristics Assessment • Tightly controlled environment, not dynamic simulation • Focused on most frequent and critical controller commands (e.g., time required to complete a flight plan amendment) • Baseline Simulation • High-fidelity ATC simulation, dynamic tasks • Focused on overall performance, efficiency, safety (e.g., number of aircraft controlled per hour)
Completed Studies • “Comparison of Host Radar Tracks to Aircraft Positions from the Global Positioning Satellite System,” Dr. Hollis F. Ryan, Mike M. Paglione, August 2005, DOT/FAA/CT-TN05/30.* • “Host Radar Tracking Simulation and Performance Analysis,” Mike M. Paglione, W. Clifton Baldwin, Seth Putney, August 2005, DOT/FAA/CT-TN05/31.* • “Comparison of Converted Route Processing by Existing Versus Future En Route Automation,” W. Clifton Baldwin, August 2005, DOT/FAA/CT-TN05/29.* • “Display System Air Traffic Function Mapping to Air Traffic Control Capabilities,” Version 1, Christopher Reilly, Lawrence Rovani, Wayne Young, August 2005. • “Frequency of Use of Current En Route Air Traffic Control Automation Functions,” Kenneth Allendoerfer, Carolina Zingale, Shantanu Pai, Ben Willems, September 2005. • “An Analysis of En Route Air Traffic Control System Usage During Special Situations,” Kenneth Allendoerfer, Carolina Zingale, Shantanu Pai, November 2005. *Available at http://acy.tc.faa.gov/cpat/docs/
Current Activities • Continue the baseline of system metrics • Begin comparison of ERAM performance to current system metrics
Immediate Benefits to Initial Tests • Establishes legacy system performance benchmarks • Determines whether ERAM supports air traffic control with at least the same “effectiveness” as the current system • Provides data-driven scenarios, methods, and tools for comparison of the current HCS to ERAM • Leverages a broad array of SMEs to develop metrics and address ERAM testing questions
Longer Term Benefits • Apply experience to future ERAM releases • Provide valid baseline, methods and measurements for future test programs • Support Next Generation Air Transportation System (www.jpdo.aero) initiatives • Contribute to the development of future requirements by defining system capabilities based on measurable performance data
Study 1: Comparison of Host Computer System (HCS) Radar Tracks to Aircraft GPS-Derived Positions
Background Task: Determine the accuracy of the HCS radar tracker • Supports the test and evaluation of the FAA’s En Route Automation Modernization (ERAM) System • Provides ERAM tracking performance baseline metric • Recorded HCS radar track data available from Host Air Traffic Management Data Distribution System • GPS-derived position data available from the FAA’s Reduced Vertical Separation Minimum (RVSM) certification program • GPS data assumed to be the true aircraft positions
GPS-Derived Data • RVSM certification flights • Differential GPS • Horizontal position (latitude & longitude) • Aircraft positions identified by date/call-sign/time • 265 flights, 20 Air Route Traffic Control Centers (ARTCCs), January through February 2005 • Continuous flight segments – level cruise, climbs, descents, turns
HCS Radar Track Data • Recorded primarily as track positions in the Common Message Set format, archived at the Technical Center • Extracted “Flight Plan” and “Track” messages from RVSM flights • Track positions identified by date, call sign, ARTCC, and time tag (UTC)
Methodology • Point-by-point comparison – HCS track position to GPS position – for the same flight at the same time • Accuracy performance metrics in nautical miles: • horizontal error – the unsigned horizontal distance between the time-coincident radar track report and the GPS position • along track error – the longitudinal orthogonal component (ahead or behind) of the horizontal error • cross track error – the lateral orthogonal component (side-to-side) of the horizontal error • Distances defined in a Cartesian coordinate system • Latitude/longitude converted into Cartesian (stereographic) coordinates (a sketch of this conversion follows) • Stereographic coordinate system unique to each ARTCC
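A minimal sketch of the latitude/longitude to stereographic conversion, assuming a spherical Earth and a tangent plane centered at a notional ARTCC reference point; the operational conversion may use different constants and datum.

```python
import math

# Oblique stereographic projection onto a plane tangent at a reference
# point (e.g., an ARTCC center). Spherical-Earth model is an assumption.

EARTH_RADIUS_NM = 3440.065  # mean Earth radius in nautical miles

def to_stereographic(lat_deg, lon_deg, lat0_deg, lon0_deg):
    """Project (lat, lon) onto a tangent plane at (lat0, lon0); x, y in nm."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)
    dlon = lon - lon0
    k = 2 * EARTH_RADIUS_NM / (
        1
        + math.sin(lat0) * math.sin(lat)
        + math.cos(lat0) * math.cos(lat) * math.cos(dlon)
    )
    x = k * math.cos(lat) * math.sin(dlon)
    y = k * (math.cos(lat0) * math.sin(lat)
             - math.sin(lat0) * math.cos(lat) * math.cos(dlon))
    return x, y

# Example: a point roughly 30 nm northeast of a notional center point.
print(to_stereographic(39.5, -74.5, 39.0, -75.0))
```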
Reduction of Radar Track Data • Split flights into ARTCC segments • Convert latitude/longitude to stereographic coordinates • Clean up track data • Discard data not matched to GPS data • Resample to 10 second interval & synchronize
Reduction of GPS Data • Discard non-contiguous data (15% discarded) • Identify ARTCC and convert lat/longs to stereographic coordinates • Reformat to legacy format • Resample to 10-second intervals and synchronize
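Both reductions end by resampling onto a common 10-second time grid so radar and GPS points can later be matched pairwise. A minimal sketch, assuming linear interpolation of the stereographic positions (the actual resampling method is not specified here):

```python
import numpy as np

# Resample an irregularly timed position series onto a fixed 10-second
# grid. Linear interpolation is an assumption for illustration; applying
# the same grid to both radar and GPS series synchronizes them.

def resample(times_s, xs, ys, interval_s=10.0):
    """Interpolate (x, y) positions onto a fixed time grid (times in s)."""
    t0 = np.ceil(times_s[0] / interval_s) * interval_s    # first grid point
    t1 = np.floor(times_s[-1] / interval_s) * interval_s  # last grid point
    grid = np.arange(t0, t1 + interval_s / 2, interval_s)
    return grid, np.interp(grid, times_s, xs), np.interp(grid, times_s, ys)

# Example: irregular samples resampled onto a shared 10-second grid.
t = np.array([0.0, 4.7, 12.1, 19.8, 31.0])
x = np.array([0.0, 0.5, 1.3, 2.1, 3.2])
y = np.array([0.0, 0.1, 0.2, 0.4, 0.6])
grid, xi, yi = resample(t, x, y)
print(grid)  # [ 0. 10. 20. 30.]
```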
Comparison Processing • Radar track point (x1, y1) matched to corresponding GPS point (x2, y2) • Pairs of points matched by date, call sign, time tag • Horizontal distance = sqrt[(x1 − x2)² + (y1 − y2)²] = sqrt[(along track distance)² + (cross track distance)²]
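A minimal sketch of this point-by-point computation, decomposing the horizontal error into its along-track and cross-track components per the definitions above. Estimating the direction of flight from consecutive GPS points is an assumption for illustration.

```python
import math

# Split the horizontal error vector between a time-coincident radar
# point and GPS point into along-track and cross-track components,
# using a direction of flight estimated from consecutive GPS points.

def track_errors(radar_xy, gps_xy, gps_next_xy):
    """Return (horizontal, along_track, cross_track) errors in nm."""
    dx, dy = radar_xy[0] - gps_xy[0], radar_xy[1] - gps_xy[1]
    horizontal = math.hypot(dx, dy)
    # Unit vector along the direction of flight.
    hx, hy = gps_next_xy[0] - gps_xy[0], gps_next_xy[1] - gps_xy[1]
    norm = math.hypot(hx, hy)
    ux, uy = hx / norm, hy / norm
    along = dx * ux + dy * uy      # signed: positive ahead, negative behind
    cross = -dx * uy + dy * ux     # signed: lateral (side-to-side) offset
    return horizontal, along, cross

# Example: radar point trailing the GPS point on a northeast heading.
print(track_errors((9.5, 9.6), (10.0, 10.0), (10.7, 10.7)))
```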
Descriptive Statistics – HCS Track Errors vs. GPS Positions (sample size 54,170)

Error (nm)      Mean (signed)   Mean (unsigned)   RMS
Horizontal           0.69             n/a         0.78
Cross Track          0.00             0.12        0.16
Along Track         -0.67             0.67        0.77

(Horizontal error is unsigned by definition.)
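A minimal sketch of how the statistics in this table relate: for a signed error series, the signed mean, unsigned (absolute) mean, and RMS. Note the RMS is the same whether or not the sign is kept, which is why only the means differ between the signed and unsigned rows.

```python
import numpy as np

# Signed mean, unsigned mean, and RMS for an error series; values
# below are toy data, not the study's measurements.

def error_stats(errors):
    e = np.asarray(errors, dtype=float)
    return {
        "signed mean": e.mean(),
        "unsigned mean": np.abs(e).mean(),
        "rms": np.sqrt(np.mean(e ** 2)),   # unchanged by abs()
    }

print(error_stats([-0.8, -0.7, -0.6, 0.1]))
```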
[Figure: Radar horizontal track, Flight #1 – Falcon Mystere business jet; Springfield – Kansas City – Wichita – Fayetteville radial – St. Louis; climb, cruise (FL350 & FL370), descend. Axes: X and Y coordinates in nautical miles.]
[Figure: Radar (left) and GPS (right) tracks, Flight #1, during a turn onto a “south” heading. Axes: X and Y coordinates in nautical miles.]
[Figure: Radar (right) and GPS (left) tracks, Flight #1, on a straight northeast heading. Axes: X and Y coordinates in nautical miles.]
Track Errors – Flight #1 (sample size 374)

Error (nm)      Mean (signed)   Mean (unsigned)   RMS
Horizontal           0.80             n/a         0.89
Cross Track         -0.04             0.10        0.12
Along Track         -0.79             0.79        0.88

(Horizontal error is unsigned by definition.)
Contact the Author: mike.paglione@faa.gov 609-485-7926 Available Publications: http://acy.tc.faa.gov/cpat/docs/index.shtml