
Presentation Transcript


  1. You: The most important piece of the bulk power system Human factors in supporting reliability and recovery from physical and cyber events Mike Legatt, Ph.D. Principal Human Factors Engineer Electric Reliability Council of Texas, Inc. Michael.Legatt@ercot.com

  2. Introduction This exercise is intended to prepare you for things that don’t seem “quite right” – how you can communicate and collaborate to identify events and reduce their impacts. Furthermore, it’s intended to serve as a brief primer on maintaining human performance by tracking stress and accuracy, and on keeping cognitive biases in check

  3. Objectives You will: • Identify the systematic strengths and weaknesses of people, technology, and their interactions • Recognize the role of your “gut feelings” in operations • Identify when and how to share these feelings within and outside your organization, and how to guard against biases

  4. Definitions • Running estimate • Common operational picture (COP) • Cognitive bias • Situation awareness • Selective attention • Ego depletion • Hyperstress / hypostress

  5. PATTERN RECOGNITION: The core human activity
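The reference list (slide 35) cites Macmillan and Creelman’s detection theory, which treats pattern recognition as a problem of separating real signals from noise. As an illustrative aside (the slide itself shows no formula), the sketch below computes the standard sensitivity index d′ from an operator’s hit and false-alarm rates; the rates used are hypothetical.

```python
# Illustrative sketch only: the deck cites Macmillan & Creelman (1991),
# whose d' (sensitivity) statistic quantifies how well an observer
# separates real signals from noise. The rates below are hypothetical.
from statistics import NormalDist

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """Sensitivity index: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    return z(hit_rate) - z(false_alarm_rate)

# Hypothetical operator: catches 90% of real events, 10% false alarms.
print(round(d_prime(0.90, 0.10), 2))  # 2.56 -> good discrimination
```

A higher d′ means the observer discriminates signal from noise more reliably; a d′ near zero means events are being flagged essentially at chance.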

  6. Running Estimate Process

  7. Attention, Memory and Mistakes

  8. Selective Attention

  9. In which state of stress are you most likely to make a mistake in an emergency? • Hypostress • Hyperstress • Hypostress before the emergency, then hyperstress when it happens • Being in the “zone of maximum adaptation”

  10. Human Performance Under Stress • Figure: stress and performance, from Hancock (2008)
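The Hancock (2008) figure referenced here depicts the familiar inverted-U: performance peaks in a “zone of maximum adaptation” at moderate stress and falls off under both hypostress and hyperstress. Below is a minimal sketch of that shape; the Gaussian curve and its parameters are illustrative assumptions, not Hancock’s actual model.

```python
# Minimal sketch of the inverted-U relationship the Hancock (2008)
# figure depicts: performance peaks at moderate stress and degrades
# under hypostress (too little) and hyperstress (too much).
# The Gaussian shape and parameters are illustrative assumptions.
import math

def performance(stress: float, optimum: float = 0.5, width: float = 0.2) -> float:
    """Relative performance (0..1) as a function of normalized stress (0..1)."""
    return math.exp(-((stress - optimum) ** 2) / (2 * width ** 2))

for s in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"stress={s:.1f}  performance={performance(s):.2f}")
```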

  11. In which state of stress are you most likely to make a mistake in an emergency? • Hypostress • Hyperstress • Hypostress before the emergency, then hyperstress when it happens • Being in the “zone of maximum adaptation”

  12. If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause? • Inattention • Misinterpretation of a rule • Inaccurate mental model • Organizational bias

  13. How we make mistakes • Figure: from NERC Cause Analysis Methods

  14. If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause? • Inattention • Misinterpretation of a rule • Inaccurate mental model • Organizational bias

  15. In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? • Ego depletion • Semmelweis reflex • Outgroup homogeneity • Hindsight bias

  16. Ego Depletion • Self-control is a limited resource, and like a muscle, it tires out.

  17. Situation Awareness

  18. COGNITIVE BIASES

  19. Cognitive Biases (a sampling) • “Apparently, when you publish your social security number prominently on your website and billboards, people take it as an invitation to steal your identity.” – Zetter, K. “LifeLock CEO’s Identity Stolen 13 Times.” Wired.com, April 2010.

  20. There’s an emergency, and you have an idea of how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? • Zero-risk bias • IKEA effect • Organizational bias • Confirmation bias

  21. You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base-case violation at that moment? • Cognitive dissonance avoidance • Google effect • IKEA effect • Attentional bias

  22. Cognitive Biases (a sampling) • Anchoring – something you’ve seen before seems like the benchmark (e.g., the first price you ever paid for gas) • Attentional bias – you’re more likely to see something if you’re thinking about it • Cognitive dissonance – it’s uncomfortable to hold two conflicting thoughts • Confirmation bias – you pay attention to things that support your belief

  23. Cognitive Biases (a sampling) • Diffusion of responsibility – “someone else will take care of it” • Google effect – it’s easy to forget things that are easily available electronically • Groupthink – people are less likely to contradict ideas in a large group • Hindsight bias – the past seems perfectly obvious

  24. Cognitive Biases (a sampling) • IKEA effect – things you’ve built seem more valuable to you than things others have built • Illusion of transparency – you expect others to understand your thoughts/feelings more than they actually can • Loss aversion – you’re more likely to try to avoid a loss than to pursue an equivalent gain

  25. Cognitive Biases (a sampling) • Organizational bias – you’re likely to think ideas from within your organization are better • Outgroup homogeneity – you’re likely to think that people in another group all think the same • Semmelweis reflex – rejecting new ideas that conflict with older, established ones • Zero-risk bias – you’re likely to choose a worse overall solution because it seems less risky

  26. There’s an emergency, and you have an idea of how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? • Zero-risk bias • IKEA effect • Organizational bias • Confirmation bias

  27. You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base-case violation at that moment? • Cognitive dissonance avoidance • Google effect • IKEA effect • Attentional bias

  28. In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? • Ego depletion • Semmelweis reflex • Outgroup homogeneity • Hindsight bias

  29. Scenarios

  30. Group exercise: Scenario 1 • Substation X • Camera malfunction • Low oil level alarm on a transformer • Dispatch troubleshooter • Bullet holes in camera and transformer • Random act of vandalism, a ploy, or a directed threat?

  31. Group exercise: Scenario 2 • Substation Y • Communications vaults for 2 providers damaged (AT&T and Level3). • > 100 shots fired at transformers, oil leakages in several transformers (> 51k gallons spilled). • Only energized transformers shot. • Attackers never entered substation • Initial assumption: vandalism? • Dress rehearsal for future attacks? • It happened: April 16, 2013, Metcalf Substation

  32. Group exercise: Scenario 3 • Utility control room • Telemetry doesn’t look quite right – not sure why • Operator sees significant flow into a substation with no load, which then goes away • RTU failure, manipulated data, or cyberattack? (see the sketch below)
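As a hedged illustration of Scenario 3, the sketch below shows the kind of automated power-balance sanity check that could corroborate an operator’s “not quite right” feeling: flow into a substation that serves no load should net to roughly zero. The data structure, field names, and tolerance are hypothetical, not any real EMS or RTU interface.

```python
# Hypothetical sketch of a telemetry sanity check for Scenario 3:
# significant net flow into a substation that serves no load should
# roughly balance to zero (losses aside). Field names and the
# threshold are illustrative assumptions, not a real EMS API.
from dataclasses import dataclass

@dataclass
class SubstationTelemetry:
    name: str
    mw_in: float    # total measured flow into the substation (MW)
    mw_out: float   # total measured flow out (MW)
    load_mw: float  # metered load served from this substation (MW)

def flag_imbalance(t: SubstationTelemetry, tol_mw: float = 5.0) -> bool:
    """True if flows don't balance against load beyond a loss tolerance."""
    residual = t.mw_in - t.mw_out - t.load_mw
    return abs(residual) > tol_mw

# Scenario 3: flow appears into a no-load substation, then vanishes.
sample = SubstationTelemetry("Substation_Z", mw_in=120.0, mw_out=0.0, load_mw=0.0)
if flag_imbalance(sample):
    print(f"{sample.name}: telemetry imbalance - RTU failure, bad data, or worse?")
```

Such a check only flags the anomaly; deciding among RTU failure, manipulated data, and cyberattack still requires the human judgment this exercise is about.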

  33. Group exercise: Scenario 4 • ISO Control Room • News report of civil unrest in an area • Call from utility: substation transformer • Call from utility: telemetry issues • Several other “below the line” calls • With whom do you share this information?

  34. Summary • Value of communication and collaboration when “things are not quite right.” • Reporting structure for handling incidents • Remember – your data may just be part of something larger

  35. References • NERC CAP Annex D, Phase 0 (draft) • NERC CIPC Report to Texas RE MRC • NERC Cause Analysis Methods • Macmillan, N. A., & Creelman, C. D. (1991). Detection theory: A user’s guide. New York: Cambridge University Press. • Hancock, P. A., & Szalma, J. L. (Eds.). (2008). Performance under stress. Chichester, England: Ashgate.

  36. Questions?

  37. There’s an emergency, and you have an idea of how to solve it, while a co-worker has a different idea. What might make you think yours is better than theirs? • Zero-risk bias • IKEA effect • Organizational bias • Confirmation bias

  38. In which state of stress are you most likely to make a mistake in an emergency? • Hypostress • Hyperstress • Hypostress before the emergency, then hyperstress when it happens • Being in the “zone of maximum adaptation”

  39. If you’re operating a substation remotely and flip the wrong breaker because you were on “autopilot” (very familiar with this substation), what was the likely cause? • Inattention • Misinterpretation of a rule • Inaccurate mental model • Organizational bias

  40. You’re looking at a one-line, and thinking about keeping flows under thermal limits. Why are you more likely to notice a base-case violation at that moment? • Cognitive dissonance avoidance • Google effect • IKEA effect • Attentional bias

  41. In a prolonged emergency situation (or even after a long shift), what do you need to be most careful about watching for? • Ego depletion • Semmelweis reflex • Outgroup homogeneity • Hindsight bias
