
Human Social Interaction Research proposal


Presentation Transcript


  1. Human Social Interaction Research proposal. Dr. Roger Newport, Room B47. Drop-in times: Tuesdays 12-2. www.psychology.nottingham.ac.uk/staff/rwn. Understanding Emotion: visual recognition.

  2. Introduction to facial emotions; the neuroscience of Fear and Disgust (the simple story); other emotions (the complicated story); current research.

  3. Lecture Overview. What are facial expressions of emotion and what are they for? Are there specific centres in the brain dedicated to emotion perception? Are different emotions processed in different ways?

  4. Why are we interested in emotion perception? Evolutionary survival; social survival.

  5. Facial expressions of emotion - what are the emotions? Motivational (thirst, hunger, pain, mood): do not feature prominently in social communication. Basic (happiness, fear, anger, surprise, disgust, sadness): feature prominently in social communication. Self-conscious/social (shame, embarrassment, pride, guilt): regulate social behaviour.

  6. Faces are special. Face perception may be the most developed visual perceptual skill in humans. Infants prefer to look at faces from shortly after birth (Morton and Johnson 1991). Most people spend more time looking at faces than at any other type of object. We seem to have the capacity to perceive the unique identity of a virtually unlimited number of different faces.

  7. Understanding Emotion: from facial expressions. Facial expressions as a communicative tool: we laugh more if in a group, and show distress more if in a group. Babies (10 months) almost only smile in the presence of a caregiver. Babies look to the caregiver and behave according to the caregiver's response when encountering a novel object, e.g. a barking dog or a snake. This is known as social referencing and is also seen in chimpanzee societies. A similar process, observational fear, is seen in other monkeys: infant monkeys show a fearful unconditioned response to the mother's expression of fear when the mother could see a snake but the infants could not. That is, infants showed a fear response to the mother's fear response.

  8. Facial expressions as communication. From Erickson and Schulkin (2003): percentage of facial responses to an unpleasant odour classified as unpleasant, neutral, or pleasant in a spontaneous condition, a posed-to-real-person condition, and a posed-to-imaginary-audience condition.

  9. Facial expressions as communication. Facial expressions allow for rapid communication. They are produced when there is an emotional stimulus and an audience present. Our interpretation of another's emotion modulates our behaviour and vice versa. The ability to recognise emotion expressions appears very early.

  10. Recognition as an automatic process - fear and threat. Angry faces are detected much more rapidly than faces depicting non-threatening expressions; attention is driven by fear (Ohman et al., 2001).

  11. Automatic processes = dedicated network. Fear and the amygdala: evidence from animal, neuropsychological and imaging studies suggests that the amygdala is of primary importance in the recognition of fear.

  12. Fear and the amygdala - evidence from animal studies. Bilateral amygdala removal: reduces levels of aggression and fear in rats and monkeys; facial expressions and vocalisations become less expressive; impairs fear conditioning.

  13. Fear and the amygdala - evidence from human neuropsychology. Bilateral amygdala damage: reduces recognition of fear-inducing stimuli; reduces recognition of fear in others; reduces ability to express fear. Does NOT affect ability to recognise faces or to know what fear is. See patients SM, DR and SE (Adolphs et al. and Calder et al.). Alzheimer's disease impairs fear conditioning.

  14. Fear and the amygdala - evidence from human imaging. Results from several studies of fear face recognition and fear conditioning: increased amygdala activity for facial expressions of fear vs. happiness, disgust, anger and neutral; non-conscious processing of fear expressions; subliminal activation of the amygdala to fear; neuromodulatory role of the left amygdala (less fear = less activity).

  15. A typical study: Amygdala Response to Facial Expressions in Children and Adults (Thomas et al., 2001). Blocks of fixation and of fear/neutral faces; no task, just watch. Left amygdala activation for fear vs. fixation in male children and adults. Overall, adults showed greater amygdala activation for fear vs. neutral whereas children did not (neutral faces may be ambiguous).

  16. Methods and Materials. Subjects. Six male adults (mean = 24 years, SD = 6.6 years) and 12 children (mean = 11 years, SD = 2.4 years) recruited in the Pittsburgh area were scanned in a 1.5-T scanner during passive viewing of fearful and neutral faces. The children, six female and six male, ranged in pubertal development from Tanner stages I/I to V/IV. Male and female subjects did not differ in mean age or Tanner stage. Data from an additional three adults (three female) and four children (two female) were not included due to excessive motion artifact (>0.5 voxels; n = 5) or claustrophobia (n = 1) or because the subject fell asleep during the task (n = 1). Subjects were screened for any personal or family history of psychiatric or medical illness, and for any contraindications for an MRI. Written child assent and parental consent were acquired before the study.

  17. Behavioral Paradigm The task consisted of the rapid and successive presentation of faces in blocks of neutral and emotional expressions. The face stimuli consisted of digitized fearful and neutral faces taken from the Ekman and Friesen (1976) study (Figure 1). A total of eight different actors (four male and four female) demonstrating both fearful and neutral expressions were used. The hair was stripped from the images to remove any nonfacial features, and both fear and exaggerated fear poses were used for each actor (Calder et al 1997), resulting in a total of 16 fear stimuli and eight neutral stimuli. Stimuli were presented for 200 msec with an interstimulus interval of 800 msec (flashing fixation point). Each block of trials consisted of the presentation of a flashing fixation point for 45 sec followed by alternating 42-sec blocks of either neutral or fearful expressions and a final 45-sec epoch of fixation (Figure 1). This procedure was repeated in three runs of trials with the presentation order counterbalanced across runs and across subjects (i.e., F-N-F-N-F or N-F-N-F-N). Following Breiter and colleagues' (Breiter et al 1996) design, no overt response was required. Instead, subjects were instructed to fixate centrally to try to get an overall sense of the faces.
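The block timing above (45 s fixation, alternating 42 s face blocks, 200 ms stimuli with an 800 ms inter-stimulus interval, three counterbalanced runs) can be laid out programmatically. The Python sketch below is not the authors' code, just an illustration of the schedule using only the values quoted in the paragraph above; the function and variable names are my own.

```python
# Minimal sketch (not the authors' code) of the block timing described above:
# 45 s fixation, five alternating 42 s face blocks, a final 45 s fixation,
# with 200 ms stimuli and an 800 ms ISI inside each face block.

def build_run(first_block="F"):
    """Return (events, run_length_sec) for one run as (onset_sec, event) tuples.

    first_block: "F" (fear) or "N" (neutral); order is counterbalanced across
    runs and subjects as F-N-F-N-F or N-F-N-F-N.
    """
    other = "N" if first_block == "F" else "F"
    block_order = [first_block, other] * 2 + [first_block]  # five alternating blocks

    events, t = [], 0.0
    events.append((t, "fixation"))          # initial 45 s fixation
    t += 45.0
    for block in block_order:
        n_trials = int(42.0 / 1.0)          # 200 ms face + 800 ms ISI = 1 s per trial
        for _ in range(n_trials):
            events.append((t, f"{block}_face"))
            t += 0.2                        # face on screen for 200 ms
            events.append((t, "isi_fixation"))
            t += 0.8                        # flashing fixation point for 800 ms
    events.append((t, "fixation"))          # final 45 s fixation
    t += 45.0
    return events, t

run_events, run_length = build_run("F")
print(f"run length: {run_length:.0f} s, {len(run_events)} events")
```

The 300 s run length this produces is consistent with the 100 functional images per slice at TR 3000 reported in the next slide.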

  18. Image Acquisition, Processing, and Analysis Scans were acquired on a 1.5-T GE Signa scanner (General Electric Systems, Milwaukee) modified for echo planar imaging (Advanced NMR, Wilmington, MA) using a quadrature head coil. A T1-weighted sagittal localizer image was used to prescribe the functional slice locations. T1-weighted structural images were acquired in 4-mm contiguous coronal slices through the whole brain (echo time [TE] min, repetition time [TR] 500, matrix 256 × 256, field of view [FOV] 20) for purposes of localizing the functional activity and aligning images in Talairach space (Talairach and Tournoux 1988). Functional images (T2*) were acquired at 12 of these slice locations spanning the entire amygdala (approximately A20 to P24 in Talairach coordinates) using an EPI BOLD sequence (TE 40, TR 3000, flip angle 90°, matrix 128 × 64, FOV 20, 4-mm skip 0, voxel size 3.125 × 3.125 × 4.0 mm). There were three runs of 100 images totaling 300 images per slice. Images were motion corrected and normalized. All 18 subjects had less than 0.5 voxels of in-plane motion. All images were registered to a representative reference brain using Automated Image Registration software (Woods et al 1992), and voxelwise analyses of variance (ANOVAs) were conducted on these pooled data using normalized signal intensity as the dependent variable (Braver et al 1997; Casey et al 2000). Separate analyses were conducted comparing male adults and male children and comparing male and female children to examine interactions of stimulus type (fearful faces, neutral faces, fixation) with age or gender, respectively. Significant activations were defined by at least three contiguous voxels and alpha = .05 (Forman et al 1995). Amygdala activation was defined on the reference brain using Talairach coordinates and consensus among three raters (BJC, KMT, PJW). Significant regions that extended outside of the brain or had large SDs were excluded.
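The analysis paragraph above defines significance as at least three contiguous voxels at alpha = .05. Below is a hedged Python sketch of that cluster criterion applied to a toy p-value volume; it is not the authors' pipeline (they used Automated Image Registration and voxelwise ANOVAs), and the function name and toy data are my own.

```python
# Hedged sketch of the cluster criterion described above: keep only clusters of
# >= 3 contiguous suprathreshold voxels at alpha = .05. Not the authors' pipeline.
import numpy as np
from scipy import ndimage

def significant_clusters(p_map, alpha=0.05, min_voxels=3):
    """Boolean mask keeping clusters of >= min_voxels contiguous voxels with p < alpha."""
    suprathreshold = p_map < alpha
    labels, n_clusters = ndimage.label(suprathreshold)   # 3-D connected components
    keep = np.zeros_like(suprathreshold)
    for cluster_id in range(1, n_clusters + 1):
        cluster = labels == cluster_id
        if cluster.sum() >= min_voxels:
            keep |= cluster
    return keep

# Example on a toy p-value volume (12 slices of 128 x 64 voxels, as acquired above)
p_map = np.random.uniform(size=(12, 128, 64))
mask = significant_clusters(p_map)
print("suprathreshold voxels surviving the cluster criterion:", int(mask.sum()))
```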

  19. Results Adults and Children A 2 × 2 (Group × Condition) ANOVA comparing male adults (n = 6) and male children (n = 6) revealed significant activity in the left amygdala and substantia innominata for fearful faces relative to fixation (Figure 2) and a decrease in signal with repeated presentations of the fearful faces (Table 1). Neutral faces showed a similar pattern of activation relative to fixation trials (F = 23.71, p < .001). A significant interaction was observed in the left amygdala between stimulus type and age for the comparison of fearful and neutral expressions (Table 1) (Group × Condition, Fear vs. Neutral). Post hoc t tests indicate that adults demonstrated significantly greater activity for fearful faces relative to neutral faces (p < .001). However, the children demonstrated greater amygdala activity for neutral faces than for fearful expressions (p < .0001) (Figure 3). Neither age nor Tanner stage predicted the magnitude of the percent change in signal in this sample.
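As a rough illustration of the analysis logic in the results above, the sketch below runs a 2 × 2 Group × Condition ANOVA on made-up signal-intensity values followed by fear-vs-neutral post hoc t tests. It is only a schematic: it treats condition as between-subjects for simplicity, whereas the study's design was within-subject, and none of the numbers are the study's data.

```python
# Illustrative sketch (made-up data) of the Group x Condition analysis logic:
# a 2 x 2 ANOVA on normalized signal intensity, then post hoc t tests per group.
import numpy as np
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(0)
rows = []
for group in ["adult", "child"]:
    for condition in ["fear", "neutral"]:
        for subject in range(6):                      # n = 6 males per group
            rows.append({"group": group, "condition": condition,
                         "signal": rng.normal(loc=1.0, scale=0.2)})
df = pd.DataFrame(rows)

model = smf.ols("signal ~ C(group) * C(condition)", data=df).fit()
print(anova_lm(model, typ=2))                         # Group x Condition interaction

for group, sub in df.groupby("group"):                # post hoc fear vs. neutral tests
    fear = sub.loc[sub.condition == "fear", "signal"]
    neutral = sub.loc[sub.condition == "neutral", "signal"]
    print(group, stats.ttest_ind(fear, neutral))
```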

  20. Warning about other brain regions. A variety of brain regions are involved in the processing of facial expressions of emotion. They are active at different times, and some structures are active at more than one time. The amygdala is particularly implicated in the processing of fear stimuli, receiving early (<120 ms) subcortical as well as late (~170 ms) cortical input from the temporal lobes.

  21. Amygdala response to fear - special for faces? The Amygdala Response to Emotional Stimuli: A Comparison of Faces and Scenes (Hariri et al., 2002). Blocked design; matching task (getting rid of unwanted activations). Preferential right amygdala response to faces (faces > IAPS scenes).

  22. Emotions - not just an ugly face. Hadjikhani and de Gelder, 2003; Adolphs and Tranel, 2003. Controls are better when faces are present; bilateral amygdala patients are worse when faces are present and often better at negative stimuli without faces.

  23. Emotions - the importance of the eyes 19

  24. break 20

  25. 21

  26. The contribution of the eyes to facial expressions of fear 22 What emotion do these eyes depict?

  27. Emotions - the importance of the eyes. Whalen et al., 2004. The amygdala is responsive to large eye whites in fear (and surprise) expressions: amygdala activation (% signal change) above fixation baseline for non-inverted (eye-white) fearful eyes.

  28. Emotions - the importance of the eyes 24 The amygdala, fear and the eyes Adolphs et al., 2005. SM bubble analysis

  29. Emotions - the importance of the eyes SM’s eye fixation (or lack of it) 25

  30. Emotions - the importance of the eyes. When told to look specifically at the eyes, SM improves, but only while given this instruction.

  31. Emotions - amygdalae are not simply eye detectors. The amygdalae are not just eye detectors - they may direct attention to relevant stimuli: a biological relevance detector. Some emotions are easy to tell from the eyes, others from the mouth. We need more than just the eyes to determine emotional and social relevance. Hybrid faces from Vuilleumier, 2005.

  32. Yuck! Disgust and the insula (and basal ganglia). Animal studies: insula = gustatory cortex; impaired taste aversion in rats. Human neuropsychology: patient NK; Huntington's disease; Tourette's and OCD; electrical stimulation = nausea; repeated exposure leads to habituation. Human imaging: Phillips et al.; Wicker et al.

  33. Disgust - evidence from imaging. Both of Us Disgusted in My Insula: The Common Neural Basis of Seeing and Feeling Disgust (Wicker et al., 2003). 1. Observed actors smelling and reacting to bad, nice and neutral odours. 2. Smelt bad and nice odours (+ rest). Separate visual and olfactory runs; overlay analysis.
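The "overlay analysis" above looks for regions active both when seeing disgust and when feeling (smelling) it. The sketch below shows the simplest possible version of that idea, intersecting two thresholded t-maps; it is an illustration under my own assumptions, not Wicker et al.'s actual method, and the threshold and toy data are invented.

```python
# Hedged sketch of the overlay logic described above: find voxels activated both
# when observing disgust and when smelling a bad odour, by intersecting two
# thresholded statistical maps. Not the authors' actual analysis.
import numpy as np

def overlap_map(visual_t, olfactory_t, threshold=3.0):
    """Boolean map of voxels exceeding the t threshold in both contrasts."""
    return (visual_t > threshold) & (olfactory_t > threshold)

# Toy t-maps standing in for "observed disgust > observed neutral" and
# "bad odour > rest"
visual_t = np.random.normal(size=(64, 64, 32))
olfactory_t = np.random.normal(size=(64, 64, 32))
shared = overlap_map(visual_t, olfactory_t)
print("voxels active in both contrasts:", int(shared.sum()))
```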

  34. Disgust vs. Fear summary. Fear: amygdala; activated by fear-inducing stimuli; habituates to fear; removal or damage disproportionately impairs fear recognition and feelings of fear. Disgust: insula and basal ganglia; activated by facial expressions of disgust; the insula habituates to disgust; removal, damage or degeneration of either structure disproportionately impairs disgust recognition and feelings of disgust. A double dissociation: conclude that the neural mechanisms for fear and disgust are anatomically and functionally distinct.

  35. Other basic emotions - implicated brain regions. Green = neutral; red = anger; purple = fear; yellow = happy; blue = sad. Duvernoy 1991, but see Kessler/West et al., 2001.

  36. Other basic emotions - implicated brain regions 33 Tissue loss associated with specific emotion recognition impairment Rosen et al., 2006 50 patients with neurodegenerative dementia: Negative emotions (red, rlITG/rMTG) and in particular sadness (green, rSTG) correlated with tissue loss in right lateral inferior temporal and right middle temporal regions. Reflects this area’s role in visual processing of negative emotions

  37. 34 How does knowledge about brain activation help social psychologists?

  38. Recent Research, June 2006: Is emotion processing affected by advancing age? An event-related brain potential study. Rationale: age is related to decreasing cognitive function, especially frontal functions; emotional intensity is a frontal function; are old folk impaired at emotion intensity recognition? Methods: investigated using ERP (EEG) and analysed by ANOVA.

  39. Results: delay in early discrimination processing, but no difference in emotion discrimination (young vs. old).

  40. Recent Research, January 2007: Perceiving fear in dynamic body expressions. Rationale: emotional body language is important when the face cannot be seen; it is important for survival, so it should be fast, automatic and employ a dedicated brain network. We know which parts of the brain are active for static emotion, and that other parts of the brain are active for body motion. How do these interact for emotive whole-body dynamic stimuli?

  41. Methods: used event-related fMRI and video clips; fear and neutral body movements, with scrambled static and dynamic video images as controls. The task was to press a button when an inverted image was seen (therefore incidental processing is being measured).

  42. Analyses: 1. Main effect of bodies vs. scrambled stimuli: (Fs+Fd+Ns+Nd) − 2(Sd+Ss). 2. Main effect of fear vs. neutral bodies: (Fs+Fd) − (Ns+Nd). 3. Main effect of dynamic vs. static bodies: (Fd+Nd) − (Fs+Ns). Results: the amygdala is active for social stimuli; it is not bothered whether the stimulus is static or dynamic, but is more bothered when it is fearful.
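The three contrasts listed above can be written as weight vectors over the six condition regressors. The sketch below does exactly that; the regressor ordering, the toy beta values and the variable names are my own assumptions for illustration.

```python
# Minimal sketch of the three contrasts listed above, written as weight vectors
# over the condition regressors [Fs, Fd, Ns, Nd, Ss, Sd]
# (F = fear body, N = neutral body, S = scrambled; s = static, d = dynamic).
import numpy as np

conditions = ["Fs", "Fd", "Ns", "Nd", "Ss", "Sd"]

contrasts = {
    # 1. bodies vs. scrambled: (Fs+Fd+Ns+Nd) - 2(Ss+Sd); weights sum to zero
    "bodies_vs_scrambled": np.array([1, 1, 1, 1, -2, -2]),
    # 2. fear vs. neutral bodies: (Fs+Fd) - (Ns+Nd)
    "fear_vs_neutral":     np.array([1, 1, -1, -1, 0, 0]),
    # 3. dynamic vs. static bodies: (Fd+Nd) - (Fs+Ns)
    "dynamic_vs_static":   np.array([-1, 1, -1, 1, 0, 0]),
}

# Applying a contrast to per-condition parameter estimates (betas) gives the
# tested effect; toy betas shown here.
betas = np.array([0.8, 0.9, 0.5, 0.6, 0.1, 0.1])
for name, c in contrasts.items():
    print(name, float(c @ betas))
```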

  43. Results Other brain regions are bothered that it is dynamic (and fearful) These regions will be covered in later lectures

  44. Recent Research, June 2007: Attention to the person or the emotion: underlying activations in MEG. Rationale: facial emotion processing is fast (100 ms) and automatic, and occurs regardless of whether you attend to the face or not. Facial identity processing is also fast (but slower) and occurs in parallel according to most models. But there is some evidence from schizophrenia suggesting that the parallel (and therefore separate) brain regions interact. What happens to this interaction when you attend to either emotion or identity?

  45. Methods and Results: used MEG and happy/fear/neutral faces. Identity task - press a button when 2 identities are the same; emotion task - press a button when 2 emotions are the same. Results: 90 ms orbito-frontal response to emotion regardless of attention; 170 ms right insula response when attending to emotion; also a 220 ms activation increase for areas associated with identity processing. Conclusions: so there you go.

  46. Recent Research, Oct 2007: Impaired facial emotion recognition and reduced amygdalar volume in schizophrenia. Rationale: amygdala volume is known to be reduced in schizophrenia; emotion recognition is known to be impaired in schizophrenia; a direct link between the two has not been studied (properly) before. Methods: used 20 Sz + 20 Cs, 3T MRI, and a facial emotion intensity recognition task.

  47. Results (1) The schizophrenia patients had smaller amygdalar volumes than the healthy controls; (2) the patients showed impairment in recognizing facial emotions, specifically anger, surprise, disgust, and sadness; (3) the left amygdala volume reduction in these patients was associated with impaired recognition of sadness in facial expressions.
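A minimal sketch of the analysis steps implied by results (1) and (3) above, on made-up numbers: compare amygdala volumes between the 20 patients and 20 controls, then correlate left amygdala volume with sadness-recognition accuracy within the patient group. The specific distributions and variable names are invented for illustration.

```python
# Illustrative sketch (made-up data) of the two analysis steps implied above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
vol_patients = rng.normal(1.6, 0.2, 20)   # left amygdala volume, arbitrary units
vol_controls = rng.normal(1.8, 0.2, 20)
sadness_acc = rng.uniform(0.4, 0.9, 20)   # proportion of sad faces recognised by patients

print(stats.ttest_ind(vol_patients, vol_controls))   # (1) group difference in volume
print(stats.pearsonr(vol_patients, sadness_acc))     # (3) volume vs. sadness recognition
```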

  48. Summary. Distinct neural pathways underlie the processing of signals of fear (amygdala) and disgust (insula/basal ganglia) in humans. This dissociation can be related to the adaptive significance of these emotions as responses to critical forms of threat that are associated with external (fear) and internal (disgust) defence systems. According to LeDoux, social neuroscience has been able to make progress in the field of emotion by focusing on a psychologically well-defined aspect of emotion, using an experimental approach to emotion that simplifies the problem in such a way as to make it tractable, circumventing vague and poorly defined aspects of emotion, and removing subjective experience as a roadblock to experimentation.

  49. Next week Emotion recognition from auditory cues and theories of emotion
