
Trust, Reliability, and Safety


Presentation Transcript


  1. Trust, Reliability, and Safety Madeline Tolge Shon Trinh Jackson Hollowell

  2. Errors • A deviation from accuracy or correctness; or simply put, a mistake • What are some errors in computing? • Categories of errors: • Problems for individuals • Systems failures that affect large numbers of people and/or cost large amounts of money • Problems in safety-critical applications that may injure or kill people

  3. Errors (continued) • New voting systems used in recent elections Electronic Voting Video • How would you describe an error like this? • Medical equipment for treating cancer: the Therac-25 • How would you describe an error like this?
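One published account of the Therac-25 incidents describes an 8-bit counter that was incremented on every scheduling pass; a nonzero value triggered a safety check, so each time the counter rolled over to zero the check was silently skipped. The sketch below is an illustrative simplification of that failure mode, not actual Therac code; all names and values are hypothetical.

```python
# Illustrative sketch of an 8-bit counter-rollover bug similar to one
# reported in the Therac-25 incidents (simplified; not real Therac code).

checks_run = 0

def check_safety():
    """Stand-in for a hardware-position safety check."""
    global checks_run
    checks_run += 1

def schedule_pass(counter: int) -> int:
    counter = (counter + 1) % 256   # 8-bit rollover
    if counter != 0:                # zero means "skip the check" -- the bug
        check_safety()
    return counter

c = 0
for _ in range(256):
    c = schedule_pass(c)

print(checks_run)  # 255 -- one pass in every 256 skips the safety check
```

The fix reported for the real system was equally small (set the flag to a fixed nonzero value instead of incrementing it), which is part of why the case is a standard example of how subtle safety-critical bugs can be.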

  4. Killer Robot Case (Part 1) • A programming error results in the accidental death of Bart Matthews, a robot operator at Cybernetics, Inc. • Programmer Randy Samuels of Silicon Techtronics, Inc. is charged with manslaughter • Robotics Division Chief Ray Johnson had told Samuels to finish the robot by January 1 or "heads would roll" • After reading this case, who do you believe was more at fault, the programmer or the manager? How could the accident have been avoided?

  5. Reliability • In general, reliability (systemic def.) is the ability of a person or system to perform and maintain its functions in routine circumstances, as well as in hostile or unexpected circumstances • Does reliability lead to safety? • How can we ensure that a system is reliable?

  6. Testing • Bullwhip Effect • Reuse of software • The Therac-25 used software from earlier models, the Therac-6 and Therac-20 • People only see the last link of the whole chain • When a cell phone breaks, the company that produced it is blamed, not the one that made the defective part • Why is testing important? • Computer professionals have entered into a contract with society
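The reuse point above can be made concrete: code that passed its original tests can still fail in a new context, so reuse does not remove the need to re-test against the new system's inputs (the Therac-6/20 software ran behind hardware interlocks that the Therac-25 removed). The function and values below are hypothetical stand-ins, not taken from any real system.

```python
# Minimal sketch (hypothetical function and values): why reused code
# must be re-tested in its new context.

def dose_for_mode(mode: str) -> float:
    """Stand-in for logic carried over from an earlier product line."""
    table = {"low": 1.0, "high": 10.0}
    if mode not in table:
        raise ValueError(f"unknown mode: {mode!r}")
    return table[mode]

# The old tests, still green -- necessary but not sufficient.
assert dose_for_mode("low") == 1.0
assert dose_for_mode("high") == 10.0

# New-context test: the new machine's console can send modes the old
# one never could. The failure must be loud, not a silent default.
try:
    dose_for_mode("turbo")
    raise AssertionError("bad mode was silently accepted")
except ValueError:
    pass

print("all tests passed")
```

The design point is the explicit `raise` on unknown input: a lookup that silently returned a default would pass the old tests and fail dangerously in the new context.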

  7. Killer Robot Case: Testing Overview • "…correct code was written, successfully tested, but the wrong code was inserted into the delivered product" – Wesley Silber, Professor of Software Engineering • Cindy Yardley faked software tests • Ray Johnson pressured her to do so
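Silber's "tested code A, shipped code B" failure has a cheap modern safeguard: record a cryptographic hash of the artifact that passed QA and verify the delivered file against it before release. The sketch below is illustrative (the build names are invented), not part of the case itself.

```python
import hashlib

# Illustrative sketch: guarding against "the wrong code was inserted
# into the delivered product" by hashing the QA-approved artifact.
# Build contents here are hypothetical placeholders.

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

tested_build = b"robot-control v1.2 (passed QA)"
shipped_build = b"robot-control v1.2-hotfix (never tested)"

tested_hash = sha256_of(tested_build)      # recorded at QA sign-off
release_ok = sha256_of(shipped_build) == tested_hash

print(release_ok)  # False -- the shipped bytes are not what QA approved
```

In practice this is what signed release artifacts and checksum files automate: any byte-level difference between the tested and delivered builds is detected before deployment rather than after an accident.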

  8. Yardley’s Choice • "On that basis I decided to fake the test results - I was trying to protect my job and the jobs of my co-workers" • Do you agree with Yardley's rationale? What ethical perspective would support your argument? • "I think I was misled by Ray Johnson. He told me that the robot was safe." • What insight does this give about the corporate culture and the value placed on employees?

  9. ACM/IEEE Code of Ethics (Social Contract) • Meeting deadlines: duty to clients • Honest testing: duty to all stakeholders • Transparency: duty to stakeholders, co-workers, and fellow computer professionals

  10. Human Factor • What contributes more to the failure of computing systems: human factors or technological error? • What cases can you think of? • Wrong input into tax systems • How can we eliminate the human factor?

  11. Trusted Computing • What is trust? • Human – Human interaction • Human – Computer interaction • Should computing devices be trusted? • Should we be trusted?

  12. Ethical Issues • What issues might arise as we look at this phenomenon from the perspectives of different philosophical ethics? • Can we still apply philosophical ethics to Human – Computer interactions?

  13. Killer Robot Case: Quality of Training Overview • Ruth Witherspoon, "Justice for Randy Samuels Committee" spokesperson, said: "Randy is being made a scapegoat for a company which had lax quality control standards!" • "The robot will be safe to operate even under exceptional conditions" • The robot operator has sworn that neither she nor any other robot operator was ever told that the robot arm could oscillate violently • "The written test developed by Silicon Techtronics to certify robot operators was considered a joke"

  14. User Training • Users rarely understand the system's internals and therefore must blindly follow the guidelines provided by developers • How intensive should the training be? • Is a manual enough? • How many hours of training are enough? 10? 20? 40?

  15. Killer Robot Case: User Interface Overview • The interface designer, and not the programmer, should be charged with negligence, if not manslaughter • Shneiderman’s "Eight Golden Rules" • The Robbie CX30 operator interface violated each and every one of Shneiderman’s rules. Several of these violations were directly responsible for the accident that ended in the death of the robot operator

  16. User Interface • The user only sees the system’s shell; everything inside has undergone a certain level of abstraction (it is isolated from the user) • The user interface is the user's means of communicating with the machine • What are the potential problems? • The UI might be misleading • What guarantee does the user have that the computer is not lying?
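Two of Shneiderman's rules, offering informative feedback and permitting simple error handling, can be shown on a single operator input. The function below is a hypothetical sketch (not the Robbie CX30 interface): it rejects bad input with a specific message instead of failing silently or cryptically.

```python
# Hypothetical sketch: Shneiderman's "informative feedback" and "simple
# error handling" rules applied to one operator input. The parameter
# names and the 2.0 m/s limit are invented for illustration.

def parse_arm_speed(raw: str, max_speed: float = 2.0):
    """Return (speed, message); speed is None when the input is rejected."""
    try:
        speed = float(raw)
    except ValueError:
        return None, f"'{raw}' is not a number; enter a speed in m/s."
    if not 0.0 <= speed <= max_speed:
        return None, f"Speed {speed} m/s is outside the safe range 0 to {max_speed} m/s."
    return speed, f"Arm speed set to {speed} m/s."

print(parse_arm_speed("1.5"))   # accepted, with confirmation message
print(parse_arm_speed("fast"))  # rejected: not a number
print(parse_arm_speed("9.0"))   # rejected: outside the safe range
```

The point is that every path returns a message the operator can act on; a UI that accepted "9.0" silently, or rejected "fast" with a bare error code, would violate the rules the case overview invokes.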

  17. Who to blame? • Cindy Yardley? • For faking tests while trying to save her co-workers' jobs • Ray Johnson? • For misleading the whole company and the client in order to meet deadlines • The operators? • For not putting additional time and effort into training, because they trusted Silicon Techtronics • Randy Samuels? • For being a scapegoat • Silicon Techtronics? • No one/everyone is guilty

  18. Killer Robot Case-Part 3 • Who is to blame for the death of Bart Matthews? • Who should have taken responsibility? • Individual vs. Corporate Responsibility

  19. Killer Robot Case Part 3 • What are some of the factors that contributed to the accident?

  20. Killer Robot Case-Part 3 • Harry Yoder: • "If innocent blood is shed in a town, then the leaders of that town must go to the edge of the town and perform an act of penance." • Systems in which the part is related to the whole and the whole to the part

  21. Diffusion of Responsibility • Individuals: • The leaders at Silicon Techtronics: Waterson and Johnson • Special burdens of responsibility: Randy Samuels, Cindy Yardley • Organization: Silicon Techtronics • The sickness that created the accident stems from management • Employees contributed • Feedback loop • Society: profit-driven and competitive

  22. Ethical Questions • Are you developing as a professional and ethical person? • Is it ethical for companies to hire "the hacker type"? • Do you find Kallman and Grillo’s method for ethical decision making useful and easier to implement? • What ethical perspective does it mirror?

  23. How could we avoid and prevent future accidents? • Management structure: multiple communication channels • Hire for and value ethical responsibility and background • Good software engineering techniques at every stage of development • Good training and a healthy corporate environment • An appreciation for risks

  24. Why study failures and errors in computer systems?

  25. To understand their causes and help prevent future failures

  26. Why do we use computers even with the potential of detrimental consequences? • They are tools - we are better with them than without • The breakdown of a tool does not condemn it

  27. Risk and Progress • Most new technologies were not very safe when first developed • We discover and solve problems, and learn how to prevent failures and recover from them - we learn
