
INSY 3020/7976/ENH 670 Human Error






Presentation Transcript


1. INSY 3020/7976/ENH 670 Human Error

2. Human Error Defined
• An inappropriate or undesirable human decision or behavior that reduces, or has the potential to reduce, effectiveness, safety, or system performance.
• A human action or decision that exceeds system tolerances.
• "An action is taken that was 'not intended by the actor; not desired by a set of rules or an external observer; or that led the task or system outside its acceptable limits'" (Senders & Moray, 1991, p. 25, as cited in Proctor & van Zandt, 1994, p. 43).

3. Human Error
• Operator error: due entirely to the human operator. You can't eliminate all of these, but good human factors design can make them virtually impossible.
• Design error: due to poor design.

4.–8. Examples (image slides showing poor designs, from www.baddesigns.com)

9. Human Error
• Human error probability (HEP): the ratio of errors made to the number of opportunities for error.
• P(error) = 1 - human reliability.
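
A minimal sketch of these two definitions in Python (the function names and the example numbers are illustrative, not from the slides):

```python
def human_error_probability(errors: int, opportunities: int) -> float:
    """HEP: the ratio of errors made to opportunities for error."""
    return errors / opportunities

def human_reliability(errors: int, opportunities: int) -> float:
    """Human reliability = 1 - P(error)."""
    return 1.0 - human_error_probability(errors, opportunities)

# Hypothetical example: 3 errors observed in 1,000 opportunities
print(human_error_probability(3, 1000))  # 0.003
print(human_reliability(3, 1000))        # 0.997
```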

10. Reliability Analysis
• Total system reliability is a function of the reliability of its components.
• Component reliability: r = 1 - p, where r is the component reliability and p is the probability of component failure.
• Two kinds of systems:
• Serial: a sequence of components; the system works only if every component works.
• Parallel: two or more components perform the same function (redundancy); the system works if any one of them works.

11. Reliability Analysis
• Serial system reliability: R = (r1)(r2)…(rn).
• Adding a component always decreases the reliability of a serial system (each ri < 1).
• Parallel system reliability: R = 1 - [(1 - r1)(1 - r2)…(1 - rn)].
• Adding a component always increases the reliability of a parallel system.
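
Both formulas in a short Python sketch (the function names are mine, and the component reliabilities are made up to show the effect of adding a component):

```python
from math import prod

def serial_reliability(rs: list[float]) -> float:
    """R = r1 * r2 * ... * rn: a serial system works only if every component works."""
    return prod(rs)

def parallel_reliability(rs: list[float]) -> float:
    """R = 1 - (1-r1)(1-r2)...(1-rn): a parallel system fails only if all components fail."""
    return 1.0 - prod(1.0 - r for r in rs)

# Adding a third component lowers serial reliability...
print(serial_reliability([0.95, 0.95]))          # 0.9025
print(serial_reliability([0.95, 0.95, 0.90]))    # 0.81225
# ...but raises parallel reliability.
print(parallel_reliability([0.95, 0.95]))        # 0.9975
print(parallel_reliability([0.95, 0.95, 0.90]))  # 0.99975
```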

12. Human Reliability
• Operator error probability = number of errors / number of opportunities for error.
• Human reliability = 1 - operator error probability.
• Estimating human reliability with Monte Carlo simulation: describe the task, set up a simulation of the operator, repeat it many times, and estimate human reliability from the results; see the sketch below.
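
A minimal Monte Carlo sketch of that procedure, assuming a task of independent steps with known per-step error probabilities (the three probabilities are invented for illustration):

```python
import random

# Hypothetical per-step error probabilities for a three-step task
STEP_ERROR_PROBS = [0.01, 0.005, 0.02]

def simulate_task() -> bool:
    """One simulated performance of the task; True if no step errs."""
    return all(random.random() >= p for p in STEP_ERROR_PROBS)

def estimate_human_reliability(trials: int = 100_000) -> float:
    """Repeat the simulated task many times and count successes."""
    successes = sum(simulate_task() for _ in range(trials))
    return successes / trials

# Should approach (0.99)(0.995)(0.98) ≈ 0.965
print(estimate_human_reliability())
```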

13. Human Reliability
• The goal of human reliability analysis is to apply to the human operator the same principles we apply to the machine or device, in order to prevent errors that lead to system failure.

14. Human Error: Theories of Accident Causation
• Accident-proneness theories
• Accident proneness: certain individuals are more likely to have accidents than others; supported by statistical data; the underlying assumption is that all workers are exposed to the same job and environmental hazards.
• Accident liability: accident-proneness is limited to specific factors (situational, age, etc.).
• Job demand vs. worker capability theories
• Accident liability increases when job demands exceed worker capabilities (similar to time-ratio estimates of mental workload).
• Adjustment-to-stress theory
• Arousal-alertness theory

15. Human Error: Stages of Human Decision-Making at which Error Can Occur
1. Activation/detection of system state signal
2. Observation and data collection
3. Identification of system state
4. Interpretation of situation
5. Definition of objectives
6. Evaluation of alternative strategies
7. Procedure selection
8. Procedure execution

16. Information Processing Model (Wickens et al., 2004)
[Diagram: three stages of processing: perceptual encoding (sensory registration, perception), central processing (decision making and response selection, supported by working memory and long-term memory), and responding (response execution), all drawing on shared attention resources.]

17. Human Error Taxonomy (Reason, 1992): Unsafe Acts
• Unintended action
• Slip: attentional failures
• Lapse: memory failures
• Intended action
• Mistake: rule-based or knowledge-based mistakes
• Violation: routine violations, exceptional violations, sabotage

18. Error Mechanism Categories
• Basic errors
• Skill-based: attention failures, memory failures, failures in execution
• Perceptual-based: visual, auditory, tactile
• Decision errors
• Rule-based: misapplication of a good rule; application of a bad rule
• Knowledge-based: inaccurate knowledge of the system; incomplete knowledge of the system
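
One way to keep these categories straight is to write them down as a small data structure; a Python sketch (the nesting is my reading of the slide, not an established schema):

```python
# Error mechanism categories as a nested dict (illustrative structure only)
ERROR_MECHANISMS = {
    "basic": {
        "skill-based": ["attention failure", "memory failure", "failure in execution"],
        "perceptual-based": ["visual", "auditory", "tactile"],
    },
    "decision": {
        "rule-based": ["misapplication of a good rule", "application of a bad rule"],
        "knowledge-based": ["inaccurate knowledge of the system",
                            "incomplete knowledge of the system"],
    },
}

# Look up the concrete error modes under one category/subtype pair
print(ERROR_MECHANISMS["decision"]["rule-based"])
```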

19. Attentional Failures
• Intrusion: entering a dangerous area or location
• Commission: performing an act incorrectly
• Omission: failing to do something
• Reversal: trying to stop or undo a task already initiated
• Misordering: a task or set of tasks performed in the wrong sequence
• Mistiming: failing to perform the action within the time allotted

20. Memory Failures and Rule-Based Mistakes
• Memory failures (lapses): losing one's place; forgetting intentions.
• Rule-based mistakes:
• Application of a bad rule: "I'm in a public space in view of many people, therefore I won't be robbed."
• Misapplication of a good rule: "A patient on chronic medication became concerned about addiction and therefore deliberately stopped taking the drug for a period each year, even though the drug in question was not addictive."

21. Contributing Factors in Accident Causation (CFAC) (Sanders and Shaw, 1988)
• Management (organization/policies)
• Environment (physical conditions)
• Equipment (design)
• Work (task characteristics)
• Social/psychological environment (culture)
• Worker/coworkers (personal attributes)

22. Typical Errors Associated with New Technologies or Systems
• Mode error: the user thought the system was in one mode when it was actually in another.
• Getting lost: users get lost in display architectures; difficulty finding the right screen or data set.
• Not coordinating data entries: poor coordination between multiple users inputting data into the same system.
• Overload: system use drains attention resources from other, equally important tasks.
• Data overload: users are forced to sort through a large amount of data produced by the system in order to determine the true nature of the situation.
• Not noticing changes: users miss system changes or trends communicated through digital displays.
• Automation surprises: system automation did something the user did not expect or anticipate.

23. Techniques & Methods for Human Error Identification
• Technique for Human Error Rate Prediction (THERP)
• Hazard and Operability Study (HAZOP)
• Skill, Rule and Knowledge model (SRK)
• Systematic Human Error Reduction and Prediction Approach (SHERPA)
• Generic Error Modeling System (GEMS)
• Potential Human Error Cause Analysis (PHECA)
• Murphy diagrams
• Critical Action and Decision Approach (CADA)
• Human Reliability Management System (HRMS)
• Influence Modeling and Assessment System (IMAS)
• Confusion matrices
• Cognitive Environment Simulation (CES)

24. Murphy Diagrams
• Diagrammatic representations of error modes that illustrate the underlying causes associated with cognitive decision-making tasks.
• Each diagram starts from one decision-making activity (activation/detection of system state signal, observation and data collection, identification of system state, interpretation of situation, definition of objectives, evaluation of alternative strategies, procedure selection, or procedure execution), records its outcome, and traces that outcome back to its proximal sources and then its distal sources; see the data-structure sketch below.
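
A toy sketch of that structure as Python data (the activity, outcome, and sources shown are invented for illustration):

```python
# One branch of a hypothetical Murphy diagram, traced outcome -> causes
murphy_branch = {
    "activity": "procedure selection",
    "outcome": "wrong procedure selected",
    "proximal_sources": ["two procedures with similar names confused"],
    "distal_sources": ["poor procedure labeling", "inadequate training"],
}

# Reading the diagram means walking from the observed outcome back to its sources
for key in ("activity", "outcome", "proximal_sources", "distal_sources"):
    print(f"{key}: {murphy_branch[key]}")
```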

  25. Murphy Diagram Example

26. Typical Investigation Errors Due to "Hindsight Bias"
• Counterfactual reasoning: stating only what users should have done to avoid the mishap; does not explain why users did what they did.
• Data availability/observability: pointing out data that could have revealed the true nature of the situation; does not explain which data observers actually used, how they used it, and why they used it.
• Micro-matching: matching fragments of people's overall performance against rules and procedures taken from documentation; does not explain why the user did what they did.
• Cherry-picking: identifying an over-arching condition in hindsight ("users were in a hurry") based on the outcome, then tracing back through the sequence of events to confirm that conclusion.

27. Human Error Investigations: Suggested Procedures
• Do not use the outcome of a sequence of events to assess the quality of the decisions that led up to it (avoid hindsight bias).
• Don't mix elements from your own knowledge into those of the users at the time of the mishap.
• Don't present your knowledge to the users you investigate; determine what knowledge the users actually had at the time of the mishap.
• Recognize that the consistencies and certainties of the system are products of your hindsight, not of the users' mindset at the time of the mishap.

28. Human Error Investigations: Suggested Procedures
• To understand and evaluate human performance, you must understand how the situation unfolded around users at the time of the mishap. You must adopt a view from inside the situation as it occurred.
• Remember that the point of a human error investigation is to understand why users did what they did, not to judge them for what they did not do.

29. Human Error Investigations: Sources of Data / Information
• Third-party and historical sources
• Recordings of people's performance and process performance
• Debriefings of the system users involved in the error mishap
• The purpose is to help reconstruct the situation surrounding the users at the time of the error mishap and to get their point of view on the event.

30. Human Error Investigations: Debriefing & Interviewing Approaches & Techniques
• Have users tell the story from their point of view. Do not present them with replays or summaries to "refresh their memory."
• Tell the story back to them as the investigator (this checks your understanding).
• Have users identify critical junctures in the sequence of events: places or short stretches of time where either people or processes contributed critically to the direction of subsequent events or to the outcomes that resulted.

31. Human Error Investigations: Debriefing & Interviewing
• Have them describe how the world looked to them at each critical juncture:
• What cues were observed?
• What knowledge was used to deal with the situation?
• What expectations did users have about how things were going to develop?
• What options did they think they had to influence events?
• What other influences helped determine how they interpreted the situation and how they would act?

32. Human Error Investigations: Debriefing & Interviewing

33. Questions for Reason's "Swiss Cheese" Model of Defenses
• Where are the holes? What do they consist of?
• Why are the holes there in the first place?
• Why do the holes' sizes and locations change over time?
• How and why can the holes line up to produce a mishap?

34. The Swiss Cheese Model: Layers of Defense
• Organizational influences (latent conditions)
• Unsafe supervision (latent conditions)
• Preconditions for unsafe acts (latent conditions)
• Unsafe acts (active conditions)
• Failed or absent defenses, ending in accident & injury
• A mishap results when the holes in every layer line up at once; see the sketch below.
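
A toy Python sketch of the holes lining up: if each layer of defense independently fails with some small probability, a mishap requires every layer to fail at once (the layer names come from the slide; the probabilities are invented):

```python
from math import prod

# Hypothetical probability that each layer's "hole" is open at a given moment
LAYER_HOLE_PROBS = {
    "organizational influences": 0.05,
    "unsafe supervision": 0.04,
    "preconditions for unsafe acts": 0.10,
    "unsafe acts": 0.02,
}

# Assuming independence, a mishap needs all four holes to line up
p_mishap = prod(LAYER_HOLE_PROBS.values())
print(f"P(holes line up) = {p_mishap:.1e}")  # 4.0e-06
```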

35. Generic Approaches to Minimizing Human Error
• Personnel selection: choose people with the appropriate skills and capabilities to perform the required tasks.
• Training: helps ensure appropriate skills; can be expensive and time consuming, and people may revert to original behaviors under stress.
• Design: the preferred method; addresses maintainability, displays & controls, feedback (error detection), and user expectations; design categories are exclusionary, preventative, and fail-safe.

36. Get Help From System Users
• What would have helped you get the right picture of the situation?
• Would any specific training, experience, knowledge, procedures, or cooperation from others have helped?
• If a key feature of the situation had been different, what would you have done differently?
• Could clearer guidance from your organization have helped you make better trade-offs between conflicting goals?
