
IS MORALITY HARDWIRED INTO OUR BRAINS BY EVOLUTION?



  1. IS MORALITY HARDWIRED INTO OUR BRAINS BY EVOLUTION?

  2. The following thoughts are based on an article in Discover Magazine, April 2004, called “Whose Life Would You Save?” by Carl Zimmer

  3. To ponder… • Let’s say you’re walking by a pond and there’s a drowning baby. If you said, “I’ve just paid $200 for these shoes and the water would ruin them, so I won’t save the baby,” you’d be an awful, horrible person. • But there are millions of children around the world in the same situation, where just a little money for medicine or food could save their lives. And yet we don’t consider ourselves monsters for not giving all of our extra money to Oxfam.

  4. A mini-philosophy lesson… Immanuel Kant, John Stuart Mill, and David Hume

  5. Immanuel Kant, Germany, 1724-1804: “Pure reason alone can lead us to moral truths.” • Based on his own pure reasoning, he declared that it was wrong to use someone for your own ends and right to act only according to principles that everyone could follow.

  6. John Stuart Mill, England, 1806-1873: “The rules of right and wrong should above all else achieve the greatest good for the greatest number of people, even though particular individuals might be worse off as a result.” • This approach became known as “utilitarianism,” based on the “utility” of a moral rule.

  7. Quick summary: • Kant put what’s right before what’s good • Mill put what’s good before what’s right

  8. Problem: • But neither of these philosophies explains how moral judgments work in the real world

  9. Thought experiment: • Imagine that you’re at the wheel of a trolley and the brakes have failed. • You’re approaching a fork in the track at top speed. • On the left side, five rail workers are fixing the track. • On the right side, there is a single worker. • If you do nothing, the trolley will bear left and kill the five workers. • The only way to save those five lives is to take responsibility for changing the trolley’s path by hitting a switch, which kills the single worker instead. • What would you do?

  10. Another dilemma… • Now imagine that you are watching the runaway trolley from a footbridge. This time there is no fork in the track. • Instead, five workers are on it facing certain death. But you happen to be standing next to a big man. If you sneak up on him and push him off the footbridge, he will fall to his death. • Because he is so big, he will stop the trolley. • Do you willfully kill one man, or do you allow five people to die?

  11. Logically, the questions have similar answers • But most people find that they’re more willing to throw a switch than to push someone off a bridge. • Why should what seems right in one case seem wrong in another? (Kant or Mill?)

  12. Maybe the answer lies not in the logic of moral judgments, but in the role our emotions play in forming them. Enter: David Hume

  13. David Hume, 18th-century Scottish philosopher • People call an act good not because they rationally determine it to be so but because it makes them feel good. • They call an act bad because it fills them with disgust. • Moral knowledge comes partly from an ‘immediate feeling and finer internal sense.’

  14. Is morality an INSTINCT? • Monkeys have a sense of fairness: • Capuchin monkeys were trained to take a pebble from the trainer; if they gave the pebble back, they got a cucumber in exchange • When two monkeys sat in adjacent cages so that each could see the other, and one monkey still got a cucumber while the other got a grape (a tastier reward), more than half the monkeys who got cucumbers balked at the exchange • Sometimes they threw the cucumber back at the researchers • Sometimes they refused to give the pebble back • Apparently they realized that they weren’t being treated fairly (Sarah Brosnan and Frans de Waal, Emory University)

  15. Another study: • A colony of chimpanzees was fed by their zookeeper only after all of them had gathered in an enclosure • One day a few young chimps dallied outside for hours, leaving the rest to go hungry • The next day, the other chimps attacked the stragglers, apparently to punish them for their selfishness

  16. Is this an evolutionary sense of morality? • A sense of fairness would have helped early primates cooperate • A sense of disgust and anger at cheaters would have helped them avoid descending into squabbling • As our ancestors became more self-aware and acquired language, they would have transformed those feelings into moral codes that they then taught their children

  17. Some thoughts: • We make moral judgments so automatically that we don’t really understand how they’re formed • Is this “sense of fairness” a potential solution to the “trolley problem”? • Although the two scenarios have similar outcomes, they trigger different circuits in the brain • Killing someone with your bare hands would most likely have been recognized as immoral millions of years ago • It summons up ancient and overwhelmingly negative emotions – despite any good that may come of the killing. • It just feels wrong

  18. Comparing apples to oranges? • Throwing a switch for a trolley isn’t the sort of thing our ancestors confronted • Cause and effect, in this case, are separated by a chain of machines and electrons, so they don’t trigger a snap moral judgment • Instead we rely more on abstract reasoning

  19. fMRI scanning of brains making decisions • Researchers asked subjects moral and nonmoral questions and scanned their brains while they were deciding on their answers (Jonathan Cohen and Joshua Greene, Princeton University)

  20. Dorsolateral prefrontal cortex • Vital for logical thinking • It helps keep track of several pieces of information at once so they can be compared • We can use our brain to make decisions about things that evolution hasn’t wired us up for

  21. New fMRI research shows: • Impersonal moral decisions triggered many of the same parts of the brain as nonmoral questions do (like whether you should take the train or the bus to work)

  22. Other parts of the brain are used for personal moral questions: • Personal moral decisions triggered different parts of the brain than impersonal decisions did (pushing a man off the bridge vs. throwing a switch) • One is at the cleft of the brain behind the center of the forehead • Another is the superior temporal sulcus, just above the ear; it gathers information about people from the way they move their lips, eyes, and hands • A third, the posterior cingulate, becomes active when people feel strong emotions

  23. Another way to study moral intuition: • Look at brains that lack it: Psychopaths • They can put themselves inside the heads of other people, but have a hard time recognizing fear or sadness in people’s faces or in their voices

  24. Psychopaths • The roots of criminal psychopathy can first be seen in childhood • An abnormal level of neurotransmitters might make these children less empathetic • When most children see others get sad or angry, it disturbs them and makes them want to avoid acting in ways that provoke such reactions • Budding psychopaths don’t perceive other people’s pain, so they don’t learn to rein in their violent outbursts

  25. Sometimes… • Two parts of the brain produce opposite responses of equal strength, and the brain has difficulty choosing between them • Because of this, it sometimes takes the brain a while to choose • When people decide that personally hurting or killing someone is appropriate, it takes them a long time to say yes – twice as long as saying no to these kinds of questions

  26. But people vary, too… • Some people aren’t willing to push a man off the bridge (“Kantians”), but others are (“Utilitarians”)

  27. How do we reconcile these findings? • Are right and wrong nothing more than the instinctive firing of neurons? • Perhaps if you look at someone’s behavior on a mechanical level, it’s hard to see them as evil • You can see them as dangerous; you can pity them; but evil doesn’t exist on a neuronal level

  28. Cultural differences? • Different cultures produce different kinds of moral intuition and different kinds of brains • Indian morality focuses on collective decisions • American morality focuses on individual autonomy • This suggests that these differences shape a child’s brain at a relatively early age

  29. The World’s Great Conflicts • Are they rooted in neuronal differences? • Genes, culture, and personal experience have wired people’s moral circuitry in different patterns • Perhaps research on the brain’s moral circuitry will ultimately help resolve some of these seemingly irresolvable disputes
