
Ground Damage Database

Presentation Transcript


  1. Ground Damage Database David Anderson Nancy Rockbrune

  2. Ground Damage Database Introduction

  3. Introductions • David Anderson • GDDB Chair • Head of Operational Safety ~ British Airways • Nancy Rockbrune • GDDB Secretary • Assistant Director SMS and Operational Data Management ~ IATA

  4. Ground Damage Database Overview

  5. History • Ground damage has long been reported to cost airlines $4 billion annually • This figure is an estimate • These are largely unplanned costs • Significant changes need to be made • Margins are thin • Industry focus has turned to ground operations • Collaboration is needed to make a difference

  6. History • Launched 2011, with limited membership • Shift in IATA’s data management and analysis approach • During 2011 and 2012 new reporting protocols and requirements were developed • Updated contract • Launched Q1 2012 • First usable data received from 10 members (with some manipulation) • Consistently receiving usable data from 14 participants • 2013 focus on expanding participation • Any airline, ground service provider, and / or airport that provides ground services is eligible to participate in the program

  7. GDDB Coverage ~ As of December 2012

  8. GDDB Coverage ~ As of May 1, 2013

  9. GDDB Coverage ~ As of May 1, 2013

  10. Efforts to Support Growth • Introductory letter to all Stakeholders • Updated contract with reporting criteria • IT development • Automated quality check • Web Form • Discussions with strategic partners to develop GDDB submission extract from existing reporting systems • Common dimension tables • Participant query tool

  11. Efforts to Support Growth • Introduction of IOSA provision ~ ISM Ed. 7 • GRH 1.11.6 The Operator should have a process to ensure aircraft ground damages are reported to IATA for inclusion in the Ground Damage Database (GDDB). Such reports should be submitted in accordance with the formal IATA ground damage reporting structure. (GM)

  12. Ground Damage Database Purpose

  13. Purpose • Facilitate data-driven improvements to performance • Gather and analyze global data in partnership with Industry • Provide information not otherwise available • Identify trends and contributing factors, allowing for the development and assessment of effective mitigation actions • Establish a baseline of ground damage performance against which future comparisons can be made

  14. Use of Data • Conduct statistical analysis on clean, defensible data • Statistical analysis produces more tangible information • Measures process performance • Identifies and prioritizes contributing factors to process performance • Measures and predicts process performance improvements • Provides confidence intervals • Measures the quality of the data • Communicate findings to applicable WGs and TFs

  15. What Data? • Data (even seemingly benign data) is critical for: • Identifying issues • Determining the effectiveness of any mitigation actions • Demonstrating the effectiveness of the program as a whole • Data-driven decisions not only improve safety / performance but also make for a more efficient organization as a whole

  16. Types of Data? • Reactive ~ wait for incidents to happen and try to understand why • Proactive ~ analyze identified risks to mitigate them before they turn into an accident / incident • Predictive ~ a mature system which conducts predictive analytics (statistical modeling) to identify and mitigate unknown risks

  17. Confidential Reporting • Confidential reporting can be used for the following safety concerns: • Unsafe behaviors • Inadvertent errors and mistakes • Near-miss occurrences (events that did not result in harm but could easily have resulted in a serious event) • Inadvertent errors or violations involving aircraft handling or servicing systems • Procedures or processes that could be improved

  18. Data Collection Process • Identify hazards • Report on hazards and occurrences • Collect and risk-assess reported hazards • Trend and analyze the information • Identify mitigation actions • Monitor for effectiveness • Review and monitor for continuous improvement

  19. Challenge ~ Data Quality • Any airline, ground service provider, and / or airport that provides ground services is eligible to participate in the program • This leads to variance in the data received • Data integrity is of the utmost importance • Confidence in the analysis, and in the decisions derived from it, is only equal to the confidence in the data itself

  20. Data Management Principles • Information producers and knowledge workers alike must know the meaning of information; otherwise they cannot perform their work properly • Information producers must also know the business rules, valid values, and formats to create information correctly • Information definition is to data (content) what manufacturing product specifications are to the manufactured product • Quality “Information Product Specifications” are necessary for the consistent production of quality information

  21. Data Management Principles • Managing by averages leads to flawed decision making as you are not accounting for process variation • If measurement system variation is too large there is an increased risk of: • Rejecting good data • Accepting bad data • Important to know how much of the observed variation of a process is due to the actual process itself (normal) and how much is due to the measurement system
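
To illustrate the last point, here is a minimal sketch of the underlying arithmetic, assuming the observed variance splits additively into a process component and a measurement-system component (the numbers are invented, not GDDB figures):

```python
# Illustrative only: split observed variation into process and measurement-system
# components, assuming sigma^2_observed = sigma^2_process + sigma^2_measurement.
observed_var = 25.0    # hypothetical variance seen in the reported data
measurement_var = 9.0  # hypothetical variance contributed by the measurement / reporting system

process_var = observed_var - measurement_var
measurement_share = measurement_var / observed_var

print(f"Process variance: {process_var:.1f}")
print(f"Share of observed variation due to the measurement system: {measurement_share:.0%}")
```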

  22. Data Management Principles • Operational definitions (including taxonomies) help reduce subjectivity and variance in your measurement system (data) • Operational definitions can be: • A written statement • Templates • A display of comparisons (e.g. a colour chart) • Operational definitions should be: • Something people can really use • Something that enables different people to reach the same conclusion (reproducibility) • Something that enables the same person to reach the same correct conclusion at different times (repeatability)

  23. Taxonomy / Operational Definitions • Control data inputs • Reduce subjectivity • Reduce variation • Provide a means for integration (internal and external)
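
As a rough illustration of how a taxonomy controls data inputs, the sketch below encodes the phase-of-operation terms defined later in this deck as a closed set of values, so anything outside the agreed vocabulary is rejected rather than stored as free text (the class and function names are assumptions for illustration, not the actual GDDB schema):

```python
from enum import Enum

class PhaseOfOperation(Enum):
    """Hypothetical operational definition: only these agreed values are valid."""
    ARRIVAL = "Arrival"
    TOWING = "Towing"
    SERVICING = "Servicing"
    DEPARTURE = "Departure"

def validate_phase(raw: str) -> PhaseOfOperation:
    """Map a reported value onto the taxonomy; undefined terms raise ValueError."""
    return PhaseOfOperation(raw)

print(validate_phase("Towing"))      # PhaseOfOperation.TOWING
# validate_phase("Pushback")         # would raise ValueError: not in the taxonomy
```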

  24. Solution ~ Defined Fields • The GDDB TF developed reporting criteria • Representatives from operators, GSPs, manufacturers and industry groups • Identified data to be consistently reported amongst ALL members • Includes definitions / assumptions • Minimizes data variance • Identifies the means by which data and analysis will feed ground operations working groups and vice versa

  25. Ground Damage Database Reporting Criteria

  26. Definition • “Any occurrence / event associated with ground operations that results in aircraft damage” • In Scope: While parked at Gate / Stand or other parked area; During Marshaling or using Stand Guidance; During Deicing; While being Towed; Near Miss; Slide Deployments • Out of Scope: FOD; Wildlife Damage; Lightning Strikes; Environmental

  27. Field Categories • Mandatory: Incident details; Location details; Aircraft details; Ramp conditions; Phase of operation ~ definitions included; Activities; Type of damage; Damage to aircraft; Ground equipment; Severity ~ definitions included • Optional: Causal factors; Corrective actions; Free text

  28. Field Categories • Mandatory: Incident details; Location details; Aircraft details; Ramp conditions; Phase of operation; Activities; Type of damage; Damage to aircraft; Ground equipment; Severity
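
A minimal sketch of how a single submission record could be structured if the mandatory and optional categories above were captured as typed fields; the field names and types here are assumptions for illustration, not the official GDDB reporting format:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GroundDamageReport:
    # Mandatory categories (illustrative names, simplified to strings)
    incident_details: str
    location_details: str
    aircraft_details: str
    ramp_conditions: str
    phase_of_operation: str
    activities: str
    type_of_damage: str
    damage_to_aircraft: str
    ground_equipment: str
    severity: str
    # Optional categories
    causal_factors: Optional[str] = None
    corrective_actions: Optional[str] = None
    free_text: Optional[str] = None
```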

  29. Field Categories

  30. Field Categories

  31. Field Categories

  32. Field Categories

  33. Field Categories • Arrival ~ Time period from when the aircraft nose wheel crosses onto the stand until the anti-collision light is off • Towing ~ Time period when an aircraft is being towed from one location to another • Servicing ~ Time period an aircraft is being serviced at a gate / stand • Departure ~ Time period when the anti-collision light is turned on and the brakes are off until control is handed over to Flight Operations

  34. Field Categories

  35. Field Categories

  36. Field Categories

  37. Field Categories

  38. Field Categories • Minor ~ No operational effect • Low ~ Aircraft inop < 60 minutes • Moderate ~ Aircraft inop ≥ 60 minutes and < 24 hours • High ~ Aircraft inop ≥ 24 hours • Catastrophic ~ Hull loss
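
A minimal sketch mapping the out-of-service bands above to severity labels; the function name, the hull-loss flag, and the treatment of zero downtime as “Minor” are assumptions for illustration:

```python
def severity_band(inop_hours: float, hull_loss: bool = False) -> str:
    """Classify an event against the severity bands listed on this slide."""
    if hull_loss:
        return "Catastrophic"  # hull loss
    if inop_hours <= 0:
        return "Minor"         # assumed: no operational effect
    if inop_hours < 1:
        return "Low"           # aircraft inop < 60 minutes
    if inop_hours < 24:
        return "Moderate"      # >= 60 minutes and < 24 hours
    return "High"              # >= 24 hours

print(severity_band(0.5))   # Low
print(severity_band(36))    # High
```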

  39. Field Categories • Hours out of service ~ rounded up to the nearest hour • Note: this field is for calculation purposes only, and should not be confused with severity
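
A small sketch of the rounding rule, assuming it applies to any fractional out-of-service duration:

```python
import math

def hours_out_of_service(duration_hours: float) -> int:
    """Round a fractional out-of-service duration up to the nearest whole hour."""
    return math.ceil(duration_hours)

print(hours_out_of_service(2.2))  # 3
print(hours_out_of_service(5.0))  # 5
```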

  40. Field Categories • Optional: Causal factors; Corrective actions; Free text

  41. Field Categories

  42. Ground Damage Database Statistical Analysis

  43. Statistical Analysis • Summarizes the data • Measures the quality of the data • Assesses the degree of variation, which is critical to understanding process performance • Measures process performance • Calculates separate descriptive statistics for each group, allowing you to see how the groups differ • Identifies “critical x’s” • Predicts process performance with improvements
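
A minimal sketch of per-group descriptive statistics of the kind described above, using pandas on invented figures (the column names and values are made up for illustration):

```python
import pandas as pd

# Invented example data: hours out of service by phase of operation
df = pd.DataFrame({
    "phase": ["Arrival", "Arrival", "Servicing", "Servicing", "Departure", "Departure"],
    "hours_out_of_service": [2, 5, 1, 30, 4, 3],
})

# Separate descriptive statistics for each group show how the groups differ
print(df.groupby("phase")["hours_out_of_service"].describe())
```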

  44. Sample ~ Box Plot • Quickly compares distributions • Highlights the variability of the data • Displays data from different categories • Can compare several groups of data at once
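
A minimal box-plot sketch with matplotlib, comparing invented samples for three phases of operation (not GDDB data):

```python
import matplotlib.pyplot as plt

# Invented hours-out-of-service samples for three phases of operation
groups = {
    "Arrival":   [2, 5, 3, 4, 6],
    "Servicing": [1, 30, 2, 4, 3],
    "Departure": [4, 3, 5, 2, 6],
}

plt.boxplot(list(groups.values()))
plt.xticks([1, 2, 3], list(groups.keys()))
plt.ylabel("Hours out of service")
plt.title("Ground damage impact by phase (illustrative data)")
plt.show()
```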

  45. Sample ~ Control Charts • Display whether a process is in control • An in-control process shows only random variation • An out-of-control process shows unusual variation due to special causes • Help to determine where to focus problem-solving efforts by distinguishing between common-cause and special-cause variation
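
A minimal individuals-chart sketch: plot invented monthly counts with the mean and simple ±3 standard-deviation limits, and flag points outside the limits as candidate special-cause variation (formal control charts usually derive the limits from the moving range, so treat this as a rough illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

# Invented monthly ground damage counts
counts = np.array([4, 5, 3, 6, 4, 5, 16, 4, 3, 5, 4, 6])

mean = counts.mean()
sigma = counts.std(ddof=1)
ucl = mean + 3 * sigma          # upper control limit
lcl = max(mean - 3 * sigma, 0)  # lower control limit (counts cannot be negative)

plt.plot(counts, marker="o")
plt.axhline(mean, linestyle="--", label="mean")
plt.axhline(ucl, color="red", label="UCL")
plt.axhline(lcl, color="red", label="LCL")

# Flag points outside the control limits as possible special causes
out = np.where((counts > ucl) | (counts < lcl))[0]
plt.plot(out, counts[out], "rs", label="special cause?")

plt.ylabel("Events per month")
plt.legend()
plt.show()
```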

  46. Sample ~ Pareto Charts • Display defect categories from largest to smallest • Prioritize issues and focus improvement efforts on areas where the largest gains can be made • Separate the “vital few” problems from the “trivial many”
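
A minimal Pareto-chart sketch: sort invented damage categories from largest to smallest and overlay the cumulative percentage (the category names and counts are made up):

```python
import matplotlib.pyplot as plt

# Invented counts of ground damage events by equipment type, largest first
categories = {"Belt loader": 40, "Passenger stairs": 25, "Tug / towbar": 15,
              "Catering truck": 10, "Other": 5}

names = list(categories.keys())
counts = list(categories.values())
total = sum(counts)
cumulative = [100 * sum(counts[: i + 1]) / total for i in range(len(counts))]

fig, ax = plt.subplots()
ax.bar(names, counts)
ax.set_ylabel("Event count")

ax2 = ax.twinx()  # second axis for the cumulative line
ax2.plot(names, cumulative, color="red", marker="o")
ax2.set_ylabel("Cumulative %")
ax2.set_ylim(0, 110)

plt.title("Ground damage by equipment type (illustrative data)")
plt.show()
```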

  47. Sample ~ Probability Charts • Measure process improvements • If distributions are normal, we can estimate the performance expected if new procedures are put in place
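
A minimal sketch of the kind of estimate a probability plot supports: if the data look approximately normal, fit a normal distribution and estimate how often a threshold would be exceeded before and after a hypothetical improvement (all figures invented):

```python
from statistics import NormalDist

# Invented per-event repair costs (in $000s) under the current process
costs = [12, 15, 9, 14, 11, 13, 10, 16, 12, 14]
current = NormalDist.from_samples(costs)

# Probability of a cost above 15 under the fitted normal model
p_now = 1 - current.cdf(15)

# Hypothetical improved process: same spread, mean reduced by 2
improved = NormalDist(mu=current.mean - 2, sigma=current.stdev)
p_after = 1 - improved.cdf(15)

print(f"P(cost > 15): now {p_now:.2f}, after improvement {p_after:.2f}")
```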

  48. Hypothesis Test ~ P-Value • Helps us determine whether observed differences are: • Statistically significant, or • Due to chance (random variation)

  49. Hypothesis Test ~ P-Value • Null Hypothesis = no statistical difference • Things are “on target” / “the same”; any difference is due to “random variation” • Not likely to be a critical x, or may require more data • Alternative Hypothesis = a statistical difference • Things are NOT “on target”, “the same”, or due to “random variation” • The data supports this x as a likely cause for further investigation

  50. Hypothesis Test ~ P-Value • The P-value represents the risk that we are wrong if we conclude that the null hypothesis is false, i.e. that we claim there is a difference when there isn’t one • Numerically, if the P-value is < 0.05 we can conclude that we have found a statistical difference • We may say we have a 5% chance of being wrong when we conclude that something is “off-target”
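
A minimal two-sample test sketch using scipy, applying the decision rule on this slide to invented figures (a Welch t-test is one common choice; the data and the 0.05 threshold are illustrative):

```python
from scipy import stats

# Invented damage rates per 1,000 turnarounds at two groups of stations
group_a = [1.2, 0.9, 1.4, 1.1, 1.3, 1.0, 1.2]
group_b = [0.7, 0.8, 0.6, 0.9, 0.7, 0.8, 0.6]

t_stat, p_value = stats.ttest_ind(group_a, group_b, equal_var=False)

if p_value < 0.05:
    print(f"p = {p_value:.4f}: reject the null hypothesis (statistical difference)")
else:
    print(f"p = {p_value:.4f}: the difference could be due to random variation")
```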
