
Improved auditor skepticism through an examination of cognitive dissonance






Presentation Transcript


  1. Improved auditor skepticism through an examination of cognitive dissonance • AGA Winter Conference • Nashville, Tennessee • January, 2012 • Art “Bubba” Hayes • Director of State Audit • Arthur.hayes@cot.gov • 615-747-5397

  2. Is objectivity a myth? • How hard do we cling to our assumptions/biases/beliefs even while professing to have an open mind? • As professionals, we are expected to employ critical thinking in analyzing information/evidence • This includes weighing conflicting information from various sources • But are we to be totally objective? • Is the scientific method designed to prove that a hypothesis is true? • What is the main role of attorneys?

  3. Two main perspectives: • What we tell ourselves to justify what we do—staying off the slippery slopes • Our possible predispositions toward whether we think a person or an organization is trustworthy • And how those notions may affect our evaluation of what they say or do • What others tell us to justify what they have done or not done • And whether we buy into it • If this sounds familiar, it is what we tell friends/family when they have been hurt • It wasn’t your fault/they were jerks/you are better off without him/her/that job • And the basis of cognitive reframing therapy

  4. What are the two primary types of mistakes we can make in evaluating information? • False positives • False negatives • Which is worse?
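The two error types above can be sketched with a small illustrative example (hypothetical data, not from the presentation): a false positive is flagging a clean transaction as fraudulent, a false negative is passing over actual fraud.

```python
# Hypothetical (actual_fraud, flagged_by_auditor) pairs -- illustrative only.
evaluations = [
    (True, True),    # fraud correctly flagged (true positive)
    (True, False),   # fraud missed (false negative)
    (False, True),   # clean transaction wrongly flagged (false positive)
    (False, False),  # clean transaction correctly passed (true negative)
    (True, False),   # another missed fraud
]

# A false positive: we flagged it, but no fraud was actually present.
false_positives = sum(1 for actual, flagged in evaluations if flagged and not actual)
# A false negative: fraud was present, but we did not flag it.
false_negatives = sum(1 for actual, flagged in evaluations if actual and not flagged)

print(false_positives)  # 1
print(false_negatives)  # 2
```

Which is worse depends on the relative costs: a false positive wastes audit effort and strains the relationship with the auditee, while a false negative lets fraud go undetected.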

  5. Purpose of this session • To assist you in recognizing the traps we all can fall into when we are evaluating information and evidence

  6. When our brains are made up, it is very hard to change them • Cognitive dissonance—a state of tension created whenever a person holds two cognitions (ideas, attitudes, beliefs, opinions) that are psychologically inconsistent (Leon Festinger) • Smoking is not a good thing—it can kill me; I smoke two packs a day

  7. It produces mental discomfort • From minor pangs to deep anguish • We don’t rest easy until we find a way to reduce it • Quit smoking • Convince yourself smoking isn’t so bad • Or it is worth the risk because it helps me relax, or prevents me from gaining weight (another health risk)

  8. Three primary applications to auditing and accountability • Auditors and the need to remain objective in skeptically analyzing audit evidence • Management and those charged with governance who need to remain objective and vigilant to indicators of possible fraud, waste or abuse through designing, establishing and monitoring effective internal controls • All of us as human beings who can trip down that ol’ slippery slope

  9. Auditor responsibilities per SAS 99 • Paragraph 14: when responses to inquiries of management, those charged with governance, or others are inconsistent or otherwise unsatisfactory (for example, vague or implausible), the auditor should further investigate the inconsistencies or unsatisfactory responses.

  10. Paragraph 14: maintain the proper questioning mind throughout the audit • Paragraph 15: the questioning mind should include setting aside any prior belief that management is honest and has integrity and consider the risk of management override of controls

  11. Paragraph 15: • Consider known external and internal factors that might: 1. create incentives/pressures to commit fraud, 2. provide opportunities for fraud to be perpetrated and 3. indicate a culture or environment that enables rationalization for committing fraud

  12. Paragraph 16: professional skepticism should lead auditors to continually be alert for information or other conditions that could indicate that a material misstatement due to fraud (MMDF) may have occurred

  13. Paragraph 16: professional skepticism should lead auditors to thoroughly probe the issues, require additional evidence as necessary, consult with other team members and, if appropriate, experts in the firm, rather than rationalize or dismiss the information or other conditions indicating that an MMDF may have occurred.

  14. Requirements of SAS 109 • Paragraph 19: the auditor should plan and perform the audit with an attitude of professional skepticism, which should be exercised throughout the audit engagement • Auditors should be rigorous in following up on indications of MMDF or error • Auditors should be alert for information or other conditions indicating a MMDF/E may have occurred.


  16. We are ingenious in developing ways to reduce the dissonance • But they are usually self-deluding • Albert Camus: we humans are creatures who spend our lives trying to convince ourselves that our existence is not absurd • But to hold onto two ideas that contradict each other is to flirt with the absurd

  17. Basically we lie to ourselves • We develop “Good Excuses” • How the people erroneously predicting the end of the world justify their error to themselves and their flocks

  18. So, we strive to lead lives that are, at least in our own minds, meaningful and consistent • Cognitive dissonance is the engine of self-justification

  19. Confirmation bias (we are not logical beings) • If we are forced to look at disconfirming information, we will find a way to criticize it, to distort or dismiss it so that we can maintain or even strengthen our original belief: • If we obtain new information that is consistent with our beliefs, we consider the information useful and well founded • However, if the new information is inconsistent with our beliefs, we will consider it biased, inaccurate or foolish.

  20. A neurological basis • MRIs have revealed that when people face dissonance—information that conflicts with their beliefs—the “reasoning parts” of their brains shut down • And when consonance is restored, the emotional circuits in their brains light up

  21. Reactions to opposing scholarly articles • Subjects who held strong opinions on one side of an issue with two distinct sides were given two scholarly articles which took the two different sides of that particular issue • After reading them, rather than seeing the merits of the other side, they discredited the other side’s article—finding and magnifying minor flaws—and actually became even more committed to their original opinion

  22. Absence of any evidence can become evidence for one’s beliefs • Confirmation bias can even convince us that the lack of evidence to support our position or belief is evidence that our belief/position is correct • Mere rumors can be enough • WWII internment of Japanese Americans • The “fact” that there was no evidence to support sabotage by this group was seen as evidence of just how sinister, clever, organized and dangerous they were

  23. Sticking with our decisions • Especially irrevocable decisions • And decisions that involved a lot of cost or emotional commitment • The greater the dissonance, the greater the need to reduce it by over-emphasizing the good aspects of the decision • As a result, realize that testimonial advertising is the least reliable • If I spent a lot of money and time for a particular therapy, I am going to say it has made a great difference in my life (not “sure I wasted ten years of my life and $50,000 on terrible therapy!!”)

  24. Dissonance can even make us hang onto practices and notions that are not based on conscious decisions, just “feelings” • We may not have a clue why we are holding onto this view or opinion, but • We are too proud to admit that • We are smart and clever people, aren’t we? • And again, especially if developing/holding onto it involved a lot of pain or cost • Initiation into a fraternity

  25. A common belief: it is good to vent your anger • Based on good old Freud—catharsis • That expressing your anger or acting aggressively gets rid of your anger • Throw something • Yell at someone • Hit that punching bag • You’ll feel much better…or will you?

  26. Justifying our anger—a vicious circle • If we get angry at someone and do something directly negative towards them, then we have a problem—we have to justify what we did to them • And later we might even feel remorse for over-reacting • So, what do we do with this dissonance? • We may tend to convince ourselves that we were justified in being angry and even getting the person in trouble • As a result, rather than finding relief by “venting” our anger, we are committed to continuing our anger to confirm that our actions were right and the other person is a &%$^

  27. When we are generous we reinforce that act as well—a virtuous circle • If we give someone something, even if it is not very expensive or didn’t take much effort, maybe even on a whim • If we had had negative feelings about them before we gave, there is dissonance • Why would I give something to someone so bad? • So we may tend to start thinking of them in a more positive way • He’s not so bad; actually, she is pretty nice

  28. The greater the gift or the effort of giving, the greater the dissonance and the more we tend to view the person favorably

  29. Our view of ourselves • Most reasonably adjusted people have a positive view of themselves as moral, competent and smart • Hence when we do something that is inconsistent with those positive self-images, the dissonance is particularly painful to us.

  30. So if we do something really stupid • (buy into an end-of-the-world hoax) • We must either admit we are stupid…ouch • Or • “Realize” how really heroic we were, and smart too • Through our faith, we saved the entire world…too bad everyone isn’t that smart, huh?

  31. The opinions of so-called experts • Empirical studies show their rates of success are around 50-50 • Our guesses are as good as theirs—the best estimates are based on data, not on experts’ analysis of them, their “experiences” or “opinions” • AND • When they are wrong, they struggle hard to convince themselves and others that they would have been right “if only”…the timing had been different, the “unforeseen mishap” hadn’t happened, yada yada

  32. Dissonance reduction is not limited to those with high self-esteem • Self-justification protects how we feel about ourselves, whether our esteem is high or low • For people with low self-esteem, the dissonance occurs when they seem to do something right, rather than when they do something wrong • Someone who considers themselves unattractive will take the tack that as soon as the person who now seems to find them attractive really gets to know them, that person will go away, rather than thinking “how nice, I must be more appealing than I thought”.

  33. Crooks • Similarly, someone who recognizes that he/she is a scam artist or thief will not feel bad when they cheat a retiree out of their life savings • It is consistent with their self-image.

  34. A Cheating Scenario • Two students are both facing a critical exam for admission to a graduate program • They are both reasonably honest and have the same attitude towards cheating—it shouldn’t be done • That is, they are very close together in their views of cheating • But they each find themselves stumped by a question on the exam • And they each find themselves with an easy opportunity to cheat by copying an answer from another student which appears to be a very good answer • One cheats and the other does not

  35. Each gains something and gives up something important: • One gives up integrity for a passing grade • The other gives up a passing grade to preserve their integrity • So how do they feel a week later? • Each has had ample time to consider their choice

  36. The cheater will decide that cheating is not such a terrible crime • Hey, everyone cheats at least some of the time • It’s no big deal • And I had to do it for my future and the future of my family • The other will determine that cheating is even more immoral than he originally believed • People who cheat should be expelled • We need to make an example out of them

  37. The outcome: • The two students are now very far apart in their belief systems regarding cheating • The cheater thinks the other is a ridiculous Pollyanna • The other thinks the cheater is totally immoral • They have internalized these divergent beliefs to the point that they believe they have always held these viewpoints

  38. Back to the crossroads moment • At that time, as they stood at the top of the slippery slope, the choices seemed a lot more ambiguous than after they had made their choice and traveled down the slope • The early, almost inconsequential steps didn’t seem that big • But it starts a process of entrapment • We justify each step to reduce our uneasiness and the ambiguity of the dilemma • Each justification increases the level and intensity of our commitment and takes us further and further away from our original principles and intentions

  39. People who have been sorely tempted, who battled the temptation, and almost gave in to it but resisted it at the eleventh hour, come to dislike and even despise those who were not able to resist the temptation • This dynamic applies to most important decisions we face in life

  40. Gray areas • At the top of the pyramid, the choices seem to involve gray areas, ambivalent choices, with consequences which are shrouded in uncertainty • And the first steps along the process are likewise morally ambiguous—the correct choice isn’t so clear.

  41. Gray gives way to certainty • So we make early, apparently inconsequential decisions and we justify each small step to reduce the ambiguity • But each step requires additional justifications and we find it hard to turn back against our earlier justifications • So the intensity of our commitment to the course of action increases

  42. Watergate • Jeb Magruder—should he have known or drawn the line the first time they asked him to do something illegal? Not so easy… • Courted by Haldeman • Not told that perjury, cheating, and breaking the law were required for the job; instead was • Flattered and told he was going to make a difference for the world, not just make money for a company and himself—part of history being made • Ehrlichman • John Dean—Blind Ambition • Liddy’s wild and crazy plans and then the “toned down” one about the break-in

  43. Stanley Milgram • “The experiment requires that you continue” • Created a “slippery slope” • Not high-voltage charges that could seriously harm someone, but • $20 to participate in a scientific study to determine whether a mild shock with a minuscule amount of voltage (10 volts) can increase people’s ability to learn • He even tries it on you and you can barely feel it • So it’s harmless and interesting (hmm, will spanking the kids make them study?) and you get some money ($20 in 1963)

  44. Then when you start, you are told that if the student gets the answer wrong, you have to nudge the voltage up to 20 volts • No big increase • Then up to 30, 40, 50…to a point on the switch that reads “450 volts—danger” • Those who resisted early were more likely to stop before they got to the higher voltages • But 2/3 of the subjects went all the way—they found it hard to suddenly justify drawing the line and stopping • Each new increment had been just a small increase, but each increase had committed them a little further

  45. How do we lose our ethical compass? • Get someone to take one small step at a time • Self-justification will do the rest • To preserve our belief that we are smart, we will all, on occasion, do some things that are dumb • We can’t help it, we are wired that way…the human condition

  46. Fighting hypocrisy

  47. Psychological blind spots • The biggest one: that we don’t have any • Justify our own perceptions and beliefs as being accurate, realistic and unbiased • Naïve realism: we believe that we perceive objects and events clearly, as they really are • We assume that other people see things the way we do • If they disagree with us, obviously they aren’t seeing things correctly

  48. 1. People who are open-minded and fair ought to agree with a reasonable opinion • 2. Any opinion I hold must be reasonable; if it weren’t, I wouldn’t hold it • 3. Therefore, if I can just get my opponent to sit down and listen to me, so I can tell him how things really are, he will agree with me • 4. And if he doesn’t, it must be because he is biased • Labeling of the source of positions influences our acceptance of the substance of them

  49. Perceptions • Ideas and opinions that we have held dear through years of introspection, we view as reasoned—a source of accuracy and enlightenment—partly because of the need to avoid dissonance • But people with other views developed over years of their introspection we view as biased and inflexible (she can’t possibly be impartial on the topic of xxxxx, because she has held that position for years)
