
Ethics in a Computing Culture


Presentation Transcript


  1. Ethics in a Computing Culture, Chapter 9: Autonomous and Pervasive Technology

  2. Objectives What makes a technology pervasive? What does it mean to be autonomous? How are pervasiveness and autonomy related? How do they contribute to change in societies? How can the existence of such phenomena cause problems for members of society?

  3. Autonomous and Pervasive Technology Pervasive: technology that has spread widely throughout society. Autonomy: the freedom to make decisions without outside constraints or interference.

  4. Case: Pervasiveness of Camera Phones What are some of the other effects of the pervasiveness of cameras, and especially camera phones? List some of the ways people use cameras differently now than they did 20 years ago. Are these changes good, bad, or neutral?

  5. Case: Pervasiveness of Camera Phones • Lifelogging: a lifelogger wears a computer that records every moment of his or her life, usually in both audio and video. The camera and audio recording devices are never turned off, no matter what, but the recorded data is not necessarily publicly posted. • What are the potential dangers, and benefits, of lifelogging? • Is lifelogging morally permissible?

  6. Case: Injured By GPS • Consider the three main parties in the case: • Ms. Rosenberg • Google • the driver of the car • What percentage of the blame does each deserve for what happened?

  7. Case: Injured By GPS (continued)

  8. Case: Injured By GPS (continued) • Jacobsson’s article about the lawsuit only briefly mentions the driver of the car. • Why do you think that the article focuses on the Rosenberg vs. Google part of the case, instead of Rosenberg vs. the driver?

  9. Case: More GPS, More Risks Overall, do you think that emergency locator beacon technology is a good thing, or a bad thing? How should authorities handle false alarms? Should people who send false alarms be fined, charged for the cost of their rescue, or otherwise penalized? Is it morally permissible to charge people to be rescued when they really are in danger?

  10. More on the Definition of “Autonomous”

  11. More on the Definition of “Autonomous” (continued) • Autonomous, intelligent, or robotic? • Google’s Web search • Security cameras that can “recognize” wanted criminals and alert authorities if a match is seen • The control mechanism of a traffic light, which causes it to cycle through its signals appropriately

  12. Automated Security Cameras • One way to minimize the possibility for abuse of surveillance cameras is to remove the robotic tilt and pan, so that the cameras cannot be redirected to spy on people. • What are the possible negative consequences of this policy? • Should the policy be adopted?

  13. Case: Grading Essays by Computer • Could a computer do as good a job of grading essays as an average teacher? • Could it do a better job than the worst teacher you have had? • Than the best teacher you have had? • Should students have the right to challenge their grades and demand that a human grade their papers if they disagree with a computer-generated grade? • Should students have the right to challenge their grades and demand that a computer grade their papers if they disagree with a human-generated grade?

  14. Case: Grading Essays by Computer (continued) Is it morally permissible for a college instructor to use e-rater and assign grades to student essays based on its output without actually looking at the essays? Assuming that college instructors continued to grade papers in the way that they always have, would it be beneficial for them also to use e-rater, to get a second opinion?

  15. Case: Grading Essays by Computer (continued) • Imagine a situation in which a student applies to an MBA program, but is rejected due to a low GMAT score. Suppose also that this low GMAT score was an error: The student’s essays were actually quite good, but e-rater scored them incorrectly because the student had an unusual writing style. • Who, if anyone, is morally responsible for the student’s unfortunate situation?
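The scenario above hinges on automated scorers rewarding surface features of "typical" writing. The following deliberately crude sketch (a toy formula invented here for illustration, not ETS's actual e-rater algorithm, whose feature set is proprietary and far richer) shows how a scorer of this kind can under-rate a competent essay written in an unusual style:

```python
# Toy surface-feature essay scorer: rewards essays whose average sentence
# length is close to a "typical" 20 words. The formula and constants are
# hypothetical, chosen only to illustrate the failure mode in the case above.

def score(essay):
    """Return a 0-6 score based solely on average sentence length."""
    sentences = [s for s in essay.replace("!", ".").replace("?", ".").split(".")
                 if s.strip()]
    words = essay.split()
    avg_len = len(words) / max(len(sentences), 1)
    # Penalize distance from the assumed "typical" average of 20 words.
    return max(0.0, 6.0 - abs(avg_len - 20) * 0.3)

# Conventional style: two sentences of about twenty words each.
typical = ("word " * 20).strip() + ". " + ("word " * 19).strip() + "."
# Unusual, terse style: short punchy sentences, possibly excellent prose.
terse = "Short. Very short. Style. Unusual. But maybe brilliant."

print(score(typical))  # 5.85
print(score(terse))    # 0.48 -- heavily penalized regardless of quality
```

The scorer never reads the terse essay's content; it only measures its shape, which is exactly how a legitimate but unusual style could produce the erroneous low score described in the case.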

  16. Case: Remote Parking • A driver pulls up next to a parking space, checks to make sure the space is clear, presses the button to start the automatic parking, and then walks away. After the driver’s back is turned, a small child runs into the space and is seriously injured. • Who is primarily morally responsible for the child’s injury? • the driver • the car company • the child • the adult in charge of the child • no one

  17. Case: Remote Parking (continued) • Imagine that, instead of using a computerized parking assistant, the driver had used valet parking (that is, a human parking assistant), and a child was injured. • What if the valet was noticeably dizzy and smelled strongly of alcohol, and the driver still chose to give his keys to the valet?

  18. Software with Emergent Behaviors • Machine learning: type of artificial intelligence; algorithms that allow computers to take in data and automatically learn to recognize patterns and make predictions • Unmanned Aerial Vehicles (UAVs or “drones”) • What are some possible negative effects of an over-sensitive UAV that mistakes something harmless for a threat? • Consider this statement: “An erroneous decision made by the UAV is a fault in the UAV, and not directly attributable to any person or group of persons.”
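A minimal sketch of "learning patterns from data" is a 1-nearest-neighbor classifier, one of the simplest machine-learning algorithms. The (speed, size) readings and threat/harmless labels below are invented toy data, not a real UAV sensor model:

```python
# 1-nearest-neighbor: label a new observation with the label of the
# closest training example. Toy data; all numbers are hypothetical.

def predict(train, point):
    """Return the label of the training example nearest to `point`."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda ex: dist2(ex[0], point))[1]

# Training data: (speed, size) readings labeled by a human.
train = [((1.0, 0.2), "harmless"), ((0.9, 0.3), "harmless"),
         ((5.0, 2.0), "threat"),   ((4.5, 1.8), "threat")]

print(predict(train, (4.8, 1.9)))   # threat
print(predict(train, (1.1, 0.25)))  # harmless
# An input unlike anything in the training data is still forced into one
# of the two labels -- one source of the erroneous decisions discussed above.
print(predict(train, (0.1, 5.0)))
```

The classifier has no notion of "I don't know": every input receives a confident label, however unfamiliar, which is one reason an over-sensitive system can mistake something harmless for a threat.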

  19. I, Roommate • Companion robots: not capable of “loving you back” • How is this problematic? • Is it ethical to encourage senior citizens to form emotional bonds with robots, simply because it makes them feel happier?

  20. I, Roommate (continued) • In I, Robot, Isaac Asimov proposed three rules that would provide the basic moral guidelines for a robot: • A robot may not injure a human being or, through inaction, allow a human being to come to harm. • A robot must obey orders given it by human beings except where such orders would conflict with the First Law. • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
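The Three Laws form a strict priority ordering, which can be sketched as a lexicographic comparison over candidate actions. This is a toy model only: the boolean flags assume perfect knowledge of consequences, which real robots lack.

```python
# Asimov's Three Laws as a lexicographic priority over candidate actions.
# Each flag is a hypothetical "perfect knowledge" judgment; no real system
# can evaluate "harms_human" reliably, which is the crux of the discussion.

def choose(actions):
    """Pick the action that best satisfies the Laws, in priority order:
    avoid harming humans > obey orders > preserve self."""
    return min(actions, key=lambda a: (a["harms_human"],
                                       a["disobeys_order"],
                                       a["risks_self"]))

candidates = [
    {"name": "obey",   "harms_human": True,  "disobeys_order": False, "risks_self": False},
    {"name": "refuse", "harms_human": False, "disobeys_order": True,  "risks_self": False},
]
# A robot ordered to harm a human must refuse: the First Law outranks the Second.
print(choose(candidates)["name"])  # refuse
```

Because `min` compares the tuples left to right, any action that harms a human loses to any action that does not, regardless of obedience or self-preservation, which mirrors the stated precedence of the Laws.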

  21. Touch Screens and Visual Disabilities

  22. Touch Screens and Visual Disabilities (continued) • Recall the ACM/IEEE Software Engineering Code of Ethics and Professional Practice from Chapter 2, which describes how software engineers ought to behave. • Should software and hardware developers be required, under this Code, to take vision impairments into account when designing new technologies? • Was it morally praiseworthy for Google to hire T.V. Raman to work on accessibility features for their products?

  23. Case: The Flash Crash • Most experts believe that high-frequency trading has significantly increased the efficiency of the stock market. This results in everyone, not just the high-frequency traders, being better off. • Is the risk of periodic flash crashes worth the benefits of high-frequency trading? • What additional information would you need to answer this question more fully?

  24. Augmented Reality Marketing • Augmented reality: a computer graphics technology that draws virtual objects overlaid on the real world • If augmented reality is widely adopted, the government could make regulations about what types of ads can be associated with public billboards (for example, banning ads for cigarettes or ads with obscene sexual content). • Should the government regulate these ads? If so, how?

  25. Augmented Reality Marketing (continued) • Should the public posting of QR codes that lead the viewer to racy ads be banned? • Shock images: incredibly graphic images posted on the Internet; designed to shock or sicken ordinary people • Examples include pictures of real deaths and violence, and obscene or illegal pornography • Should the public posting of QR codes leading to shock images be banned?
