
Politics and Propaganda in Film




  1. Course Overview Politics and Propaganda in Film

  2. Out-of-Class Workload (Average = 1:40 out of class for each 1:00 in class)

  3. Focus of the Course: The combined area is HUGE, so we focus on this part.

  4. Tools of the Trade Techniques of Propaganda and Persuasion

  5. Definitions I. Politics • One definition: “The authoritative allocation of values” • 45 ½ more in Donague • Key elements: power, struggle, values • Politics is broader than government • Government is broader than you may think • Politics is a process rather than an end-state. B. When is a film “political”? When it is about politics or part of the process of politics: contested values, socioeconomic conditions, power, policy, or law.

  6. Information: Attempts to inform the audience about an issue without changing their attitudes. Is this possible? Consider: • Census “ads” and public service announcements • IRS tax instructions • Booklets on how to vote II. Forms of Political Communication

  7. Common View: B. Argument, Persuasion, Propaganda

  8. Relies on the intent of the speaker – but this is unknown to the listener. Assumes that “good reasoning” makes for good argument – but good reasoning can be used to sell the wrong thing, e.g. giving someone the means to commit unspeakable ends. The difference between “evidence” and “emotions” is slippery; emotions are often triggered by evidence. “Ignores the consequences” is unduly restrictive: fear is often triggered by worst-case scenarios, and excitement by best-case scenarios. 2. Problems with the Common View

  9. 3. Defining Propaganda as Persuasion

  10. Propaganda is any argument with which I disagree, or persuasion by anyone with whom I disagree. “We” inform and educate, “they” use demagoguery and propaganda. Can any line at all be drawn? 4. A Cynical View

  11. 5. The Continuum View • One end of the continuum: multiple perspectives; uses evidence in context; uses “quality” evidence – reliable and valid; describes how to perform a task; relies on field-recognized experts; experts explain their findings; all relevant evidence. • The other end: single perspective; takes evidence out of context; uses invalid or unreliable measures; urges performance of a political act; relies on testimonials or out-of-field experts; expert conclusory statements are used as authority; partial truths (omissions) or false statements. • Persuasion lies along the continuum between these two ends.

  12. Is it always wrong to persuade people using spurious logic, appeals to emotion, or even outright lies? Can the ends justify the means? • Abolitionists and Anti-Apartheid Demonstrators? • Claims of “genocide” in Kosovo and Bosnia when civilian killings were “only” in the tens of thousands? • 51/49 Principle – I may be 51% sure I’m right but I have an incentive to claim I’m 100% sure. Is this contemptible? • Are salespeople and advertisements immoral? III. Ethics of Propaganda: Questions

  13. Persuasion is possible – many studies show that framing the same question in different words causes respondents to answer it differently. Evidence also exists for most other techniques of persuasion and propaganda. IV. Does Propaganda Work?

  14. The share of U.S. high school seniors who believe that there is a “great risk” in regular marijuana use doubled from 35% in 1978 to 73% in 1993. • Reported marijuana use also dropped from 37% to 16% • The smoking rate declined from 50% to 25% in 30 years. • A government campaign to encourage use of seatbelts flopped: • 7 cable TV messages broadcast 943 times a day during prime time to 6400 households. • Studies showed that the campaign had no effect. • What separates successful campaigns from failed ones? Examples

  15. Assumptions • People want to have “correct” attitudes and beliefs, but cannot give every issue the attention it deserves (limited time, many issues). • People must compromise by paying more attention to some things than others • This leaves two routes to persuasion – the central route of information provision (requires time and focus) and the peripheral route of incidental cues like the attractiveness of the speaker, graphics, music, etc. B. Elaboration Likelihood Model (ELM)

  16. The central route involves relating the information to what the receiver already knows and assessing the information’s likely effect on the receiver – the theory calls this “elaboration.” • When do people elaborate? • When they have motivation: immediate, personal impact • When they have the ability: critical thinking skills • Predictions: • People are more likely to elaborate when stakes are high – large purchases, major life decisions, etc. Persuasion must target the central route by providing lots of evidence. • Attitudes resulting from central-route persuasion will be relatively stable predictors of future behavior 3. The Central Route

  17. Used when evidence is weak (the central route will fail) and/or elaboration likelihood is low (no elaboration is expected) • Note that the peripheral route isn’t “subconscious” – it’s just something to which we don’t pay much attention • Peripheral cues have many sources: the communicator, the context (music, food – things that cause the message to be perceived as positive), the message (fear can work if you provide the solution), personal appeals, forewarning (used to reinforce support), and distraction (used to undermine opposition) • Predictions: • Attitudes can be changed by peripheral cues, BUT • Such changes tend to be short-lived and unstable 4. The Peripheral Route

  18. Jury study on witness credibility – speech style influences persuasion. • Question: Approximately how long did you stay before the ambulance arrived? • (Confident) Twenty minutes. Long enough to help get Mrs. Davis straightened out. • (Hesitating) Oh, it seems like it was about uh, twenty minutes. Just long enough to help my friend Mrs. Davis, you know, get straightened out. • Straightforward witnesses were rated as more competent and credible even when saying the same thing. • Study in which doctors sent letters to their patients who smoked: • 8% of patients who received positively framed messages (if you quit now you will live longer) attempted to quit. • 30% of patients who received negatively framed messages (if you continue to smoke you will die sooner) attempted to quit. Examples of Peripheral Cues

  19. Aristotle: Ethos, Pathos, Logos are all necessary/important. Chomsky and Herman: the Propaganda Model (more on this later…). Basic agreement exists on many techniques, even though the theories that explain the effectiveness of these techniques are all flawed. C. Other Models of Persuasion

  20. Speaker credibility and perceived views: • We prefer to hear from people with our tastes on matters of taste, but we prefer to hear from “independent” observers on matters of fact. • A celebrity or attractive model is most effective when the audience has low involvement, the theme is simple, and broadcast channels are used. An exciting spokesperson can attract attention to a message that may otherwise be ignored. • Affect: Positive generally works better than negative (surprising to many communication scholars). Strong emotional appeals and fear arousal are most effective when the audience has minimal concern about or interest in the topic. • Medium: Radio and TV messages tend to be more persuasive than print, but if the message is complex, better comprehension is achieved through print media. • Message: Repeated appeals to self-interest work best D. Tools of Persuasion

  21. Camera angles enhance perspective, such as low angles that give the subject power. Mise-en-scène (set and setting inside the camera frame) creates cultural and ideological context: is the film shot at the Capitol, a suburb, a poor neighborhood? Sound effects animate products or even ideas, giving them emotion. Lighting is used to draw your eye to certain details. Editing is used to set pace and generate excitement. Notice how military and video game ads have very fast cuts, usually a scene change every second. E. Technical Effects

  22. Examples of Film Techniques

  23. A. Logical fallacies: Use faulty logic (the claims may be true or false, but their logic can’t tell us). Common ones include • 1. Appeal to authority: Only a problem if the authority is no more expert than we are. Examples: the ongoing squabble over Susan B. Anthony’s views on abortion, “Log Cabin Republicans,” the Dixie Chicks on American foreign policy, etc. • 2. Circular Reasoning (Begging the Question): Uses the claim itself as the support for the claim V. Techniques of Propaganda

  24. Criticizes the person making the claim rather than the claim itself. Frequently attacks “hypocrisy” (a character flaw) rather than the evidence presented. • Examples: Attacking the auto bailout because the CEOs were arrogant, attacking Michael Moore for being fat, attacking climate change researchers for flying to conferences, attacking free-trade advocates for seeking protection for their firms. • Note that attacking a relevant characteristic (expertise in the case of someone rendering an expert judgment or being nominated for office) is not necessarily fallacious. 3. Ad Hominem Attack

  25. Reducing an issue to only two sides, where other opinions may exist, and/or presenting any counter-argument as an argument in favor of “the other side” Very common in material labeled “propaganda” (reduces number of views being presented) Example: “Either you’re with us, or you’re with the terrorists.” (Omits options of being against both or for both – the first being more plausible than the second) 4. Either-Or (Or Black and White) Fallacy

  26. Comparison with something dissimilar. Long-time favorites in foreign policy discussions: Pearl Harbor, Bay of Pigs, Vietnam (now joined by 9/11). Problem: Reasoning by analogy is almost always fallacious because no two events/processes are the same. But analogies are one of the most powerful tools of persuasion and one of the most common tools of analysis. The entire subfield of Comparative Government was founded on analogies between pairs of countries. 5. False Analogy

  27. Also known as “post hoc ergo propter hoc” – Assuming that whatever happened just before some event was the cause of the event. • Superstitions are great examples of this – “lucky caps” for baseball players • Common in propaganda – show whatever your opponents were doing before a catastrophe, then show (or merely trigger memories of) the catastrophe • Similar to “affirming the consequent”: • p → q • q • Therefore p 6. Post Hoc Fallacy
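
A minimal sketch (not part of the original slides; Python is assumed here purely for illustration) of why “affirming the consequent” is invalid: brute-force every truth assignment and look for a case where both premises hold but the conclusion fails.

from itertools import product

# Affirming the consequent: from (p -> q) and q, infer p.
# The inference form is valid only if every truth assignment that makes
# both premises true also makes the conclusion true.
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if ((not p) or q) and q      # both premises hold ...
    and not p                    # ... but the conclusion fails
]

print(counterexamples)  # [(False, True)] -> the form is invalid

The single counterexample (p false, q true) shows the premises can be true while the conclusion is false, which is exactly the gap the post hoc pattern exploits.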

  28. Substituting a weaker for a stronger argument, then defeating the weak argument and ignoring the strong one. • Traditional politician’s trick: “Answer the question you wish had been asked rather than the one that was asked.” • Examples: • Pro-lifers refute the “pro-abortion” argument (ignoring the stronger pro-choice position that promotion of birth control will reduce abortions more than a ban) • Pro-choicers refute the “anti-choice” argument (portraying opponents as anti-woman or pro-government control, even though the stronger pro-life argument is based on rights arguments applied to human life and includes support for poor mothers) • Almost everyone tries this one if they can get away with it 7. “Straw Man” Fallacy

  29. What was true in some (unusual) case is presumed to be the norm. Similar to the Fallacy of Composition: drawing an inference about the whole based on its parts (Example: a prominent charity or political organization has multiple tax evaders on its board, so it must be corrupt). People are highly vulnerable to this problem because we naturally seek patterns, so if we have two dots, we’ll draw a line between them and call it a trend. Listening to the cases of ten people wronged by the other side’s policies, we naturally presume the problem is widespread. In a country of 300 million, this may not be true. Even rare events will affect hundreds or thousands of people. 8. Hasty Generalization

  30. Presumes that a policy represents movement in a “direction” which cannot be stopped short of the extremes. • Examples: • “An ID card system will lead to a slippery slope of surveillance and monitoring of citizens…it would create a system of internal passports that would significantly diminish the freedom and privacy of law-abiding citizens.” (ACLU.org) • “In what some call a denial of a basic civil right, a Missouri man has been told he may not marry his long-term companion…. a 22-year-old mare named Pixel. The Missouri man and homosexual "marriage" proponents categorically reject the definition of marriage as the union of a man and a woman…But once marriage is no longer confined to a man and a woman, it is impossible to exclude virtually any relationship between two or more partners of either sex--even non-human ‘partners.’” (Family Research Council) • Note that evidence that no intermediate policy can exist transforms the slippery slope from a fallacy into a genuine argument. (Example: Lincoln’s “House Divided” speech) 9. The Slippery Slope

  31. Tautologies: Stating the obvious increases credibility, so speakers often use statements that are true by definition (but therefore devoid of any real content). The “Glittering Generality”: Vague words that sound nice but contain little informational content. Common in politicians’ public addresses (all politicians’ speeches start to sound the same after a while). See Gingrich memo… Renaming or using euphemisms: Orwell’s “Ministry of Truth” and “collateral damage.” See also: pro-abortion vs. pro-choice and anti-choice (or anti-abortion) vs. pro-life. Deleting the agent of a sentence obscures responsibility: instead of “the US declared war,” say “war was declared.” Use of the third person has a similar effect. Deleting the experiencer imputes a harder fact: instead of “journalists estimated 10,000 at the demonstration,” say “10,000 hit the streets.” B. Rhetorical Weapons

  32. Deception about the self: “Plain Folks Strategy” and the creation of “front groups.” Deception about the opponent: guilt by association, quoting out of context. Deception about both: “Black Propaganda” that purports to come from the other side (faking correspondence is common). Stacking the Cards: Omitting contrary evidence. Lying: Generally a last resort, since it can backfire. C. Misrepresentation

  33. Milking the story: maximizing media coverage of a particular issue by the careful use of briefings, leaking pieces of a jigsaw to different outlets, allowing journalists to piece the story together and drive the story up the news agenda, etc. Red Herrings / Changing the Agenda: Changing the subject or diverting the argument. Debater’s rule: Stay on your own ground and don’t get caught up debating on their ground. Testimonials: Whether true or false. Even if disproven, the other side gains little (you said they were monsters and they have shown that your proof they were monsters was too weak to support your claim – big deal) Co-option (e.g. embedding journalists) and the “cultivation of sources” by reporters D. Media Manipulation

  34. In the 1991 Gulf War, a U.S. public relations firm coached the Kuwaiti Ambassador’s daughter to pose as a nurse who claimed she had seen Iraqi troops killing babies in hospitals. The purpose was to demonize Iraq so that war would be more acceptable. (The Senate vote authorizing the use of force was 52-47.) The “dead baby” story


  36. Indirect works best: the Nazis discovered that direct pro-Nazi films did poorly at the box office, so they worked on instilling Nazi “values” in other films without mentioning the Party or Hitler. “Show, Don’t Tell”: Don’t say that true Germans are Aryans. Just show “Aryan-looking” people whenever you mention ordinary Germans and “non-Aryan-looking” people when you need villains. General Rule: The doctrine to be instilled in the target audience should not be articulated. The proper procedure is to drill it home by constantly presupposing it (use of repetition). E. Indoctrination

  37. The strategy of telling huge lies instead of little ones – even honest people tell small lies, but not big ones, so the big lie is perversely more credible. Little evidence exists on the effectiveness of this strategy. Hitler coined the phrase, but he accused the Jews of doing it. Goebbels also used the phrase – to refer to British propaganda. F. The “Big Lie”
