
Preprocessing for EEG & MEG


Presentation Transcript


  1. Preprocessing for EEG & MEG Tom Schofield & Ed Roberts

  2. Data acquisition

  3. Data acquisition Using Cogent to generate a marker pulse: drawpict(2); outportb(888,2); tport=time; waituntil(tport+100); outportb(888,0); logstring(['displayed ''O'' at time ' num2str(time)]);
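
For reference, the same snippet laid out with comments (a sketch only: drawpict, outportb, time, waituntil and logstring are the Cogent 2000 calls used on the slide, address 888 (0x378) is the data register of a legacy parallel port, and the marker value 2 and 100 ms hold time simply follow the example):

    drawpict(2);                 % present stimulus buffer 2
    outportb(888, 2);            % raise marker value 2 on the parallel port data register
    tport = time;                % Cogent clock (ms) at the moment the marker went up
    waituntil(tport + 100);      % hold the pulse for 100 ms so the acquisition PC registers it
    outportb(888, 0);            % reset the port to zero
    logstring(['displayed ''O'' at time ' num2str(time)]);   % record the event in the log file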

  4. Two crucial steps • Activity caused by your stimulus (the ERP) is ‘hidden’ within the continuous EEG stream • The ERP is your ‘signal’; everything else in the EEG is ‘noise’ • Event-related activity should not be random; we assume everything else is • Epoching – cutting the data into chunks referenced to stimulus presentation • Averaging – calculating the mean value for each time-point across all epochs
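
As a rough illustration of these two steps outside any toolbox, the sketch below cuts fixed windows around each event and averages them. It is a minimal sketch: the variables eeg (one continuous channel), fs (sampling rate) and event_samples (marker sample indices) are assumed, and the -100 to +600 ms window is an arbitrary choice.

    pre  = round(0.1 * fs);                         % samples before each event (100 ms)
    post = round(0.6 * fs);                         % samples after each event (600 ms)
    epochs = zeros(numel(event_samples), pre + post + 1);
    for k = 1:numel(event_samples)
        idx = event_samples(k);
        epochs(k, :) = eeg(idx - pre : idx + post); % epoching: one row per trial
    end
    erp = mean(epochs, 1);                          % averaging: the ERP estimate
    t   = (-pre:post) / fs * 1000;                  % peristimulus time axis in ms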

  5. Extracting ERP from EEG ERPs emerge from EEG as you average trials together

  6. Overview • Preprocessing steps • Preprocessing with SPM • What to be careful about • What you need to know about filtering

  7. mydata.mat

  8. Epoching

  9. Epoching - SPM Creates: e_mydata.mat
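
Scripted rather than through the GUI, the epoching step might look like the sketch below. The field names follow the SPM8/12 style of spm_eeg_epochs from memory and may differ between versions; the condition label, event type and marker value are placeholders.

    S = [];
    S.D = 'mydata.mat';                        % converted continuous data
    S.timewin = [-100 600];                    % epoch window in ms around each event
    S.trialdef(1).conditionlabel = 'stim';     % placeholder condition name
    S.trialdef(1).eventtype  = 'trigger';      % assumed event type for the port marker
    S.trialdef(1).eventvalue = 2;              % marker value sent with outportb above
    S.bc = 1;                                  % baseline-correct over the prestimulus window
    D = spm_eeg_epochs(S);                     % writes e_mydata.mat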

  10. Downsampling • Nyquist theorem – the sampling frequency must be more than twice the highest frequency present in the analogue signal • Select ‘Downsample’ from the ‘Other’ menu

  11. Downsample Creates: de_mydata.mat
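
A matching script call might look like this (again SPM8/12-style field names, to be checked against your version); the constraint from the Nyquist theorem is that the new rate must stay above twice the highest frequency you intend to analyse.

    S = [];
    S.D = 'e_mydata.mat';          % epoched file from the previous step
    S.fsample_new = 200;           % new rate in Hz; keep it > 2 x the highest frequency of interest
    D = spm_eeg_downsample(S);     % writes de_mydata.mat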

  12. Artefact rejection Blinks Eye-movements Muscle activity EKG Skin potentials Alpha waves

  13. Artefact rejection Blinks Eye-movements Muscle activity EKG Skin potentials Alpha waves

  14. Artefact rejection - SPM Creates: ade_mydata.mat
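
One simple criterion of the kind used here is a peak-to-peak amplitude threshold per epoch. The sketch below is a generic illustration in plain MATLAB, not SPM's own artefact routine; the epochs matrix (trials x timepoints, one channel) and the 100 µV threshold are assumptions.

    threshold = 100;                                  % assumed rejection threshold in microvolts
    p2p  = max(epochs, [], 2) - min(epochs, [], 2);   % peak-to-peak amplitude of each trial
    bad  = p2p > threshold;                           % flag trials that exceed the threshold
    clean_epochs = epochs(~bad, :);                   % keep only the surviving trials
    fprintf('Rejected %d of %d trials\n', sum(bad), numel(bad));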

  15. Artefact correction • Rejecting ‘artefact’ epochs costs you data • Using a simple artefact detection method will lead to a high level of false-positive artefact detection • Rejecting only trials in which artefact occurs might bias your data • High levels of artefact associated with some populations • Alternative methods of ‘Artefact Correction’ exist

  16. Artefact correction - SPM • SPM uses a robust averaging procedure that weights each value according to how far it lies from the median value for that timepoint • Outliers are given less weight; points close to the median are weighted ‘1’
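
The weighting idea can be illustrated with a bisquare-style function of distance from the median; this is a simplified stand-in, not SPM's exact algorithm. x is assumed to hold the value at one timepoint for every trial.

    res = x - median(x);                     % distance of each trial from the median
    s   = median(abs(res)) / 0.6745;         % robust (MAD-based) estimate of spread
    u   = res ./ (4.685 * s);                % scaled residuals (4.685 is the usual bisquare constant)
    w   = (1 - u.^2).^2 .* (abs(u) < 1);     % weights: close to 1 near the median, 0 for far outliers
    robust_mean = sum(w .* x) / sum(w);      % weighted average for this timepoint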

  17. Artefact correction - SPM • Normal average • Robust Weighted Average

  18. Robust averaging - SPM Creates: ade_mydata.mat

  19. Artefact Correction • ICA • Linear trend detection • Electro-oculogram • ‘No-stim’ trials to correct for overlapping waveforms

  20. Artefact avoidance • Blinking • Avoid contact lenses • Build ‘blink breaks’ into your paradigm • If subject is blinking too much – tell them • EMG • Ask subjects to relax, shift position, open mouth slightly • Alpha waves • Ask subject to get a decent night’s sleep beforehand • Have more runs of shorter length – talk to subject in between • Jitter ISI – alpha waves can become entrained to stimulus

  21. Averaging • R = noise on a single trial • N = number of trials • Noise in the average of N trials = (1/√N) × R • More trials = less noise: to double the S/N you need 4× as many trials, to quadruple it 16×
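
The 1/√N relationship is easy to check numerically: average increasing numbers of noise-only trials and watch the residual noise shrink. All numbers here are arbitrary.

    R = 10;                                   % single-trial noise level (arbitrary units)
    for N = [1 4 16 64]
        trials = R * randn(N, 1000);          % N noise-only trials, 1000 timepoints each
        avg    = mean(trials, 1);
        fprintf('N = %2d: residual noise %.2f (predicted %.2f)\n', N, std(avg), R / sqrt(N));
    end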

  22. Averaging Creates: made_mydata.mat
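
Scripted, the averaging step might look like the sketch below (SPM8/12-style, field names are assumptions); supplying a robust option here is what requests the weighted average described on the earlier slides.

    S = [];
    S.D = 'ade_mydata.mat';      % artefact-processed, downsampled, epoched file
    S.robust = false;            % false = ordinary mean; a settings struct here requests robust averaging
    D = spm_eeg_average(S);      % writes made_mydata.mat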

  23. Averaging • Assumes that only the EEG noise varies from trial to trial • But – amplitude will vary • But – latency will vary • Variable latency is usually a bigger problem than variable amplitude

  24. Averaging: effects of variance Latency variation can be a significant problem
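
A quick way to see the problem: simulate trials that all contain the same component but with jittered latency, and compare the average peak with the single-trial peak. All numbers below are arbitrary.

    t = -100:700;                                     % peristimulus time in ms
    nTrials = 100;
    avg = zeros(size(t));
    for k = 1:nTrials
        lat = 300 + 50 * randn;                       % peak latency jittered around 300 ms (SD 50 ms)
        avg = avg + exp(-((t - lat).^2) / (2*30^2));  % identical Gaussian-shaped component on every trial
    end
    avg = avg / nTrials;
    fprintf('Single-trial peak = 1.00, average peak = %.2f\n', max(avg));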

  25. Latency variation solutions • Don’t use a peak amplitude measure

  26. Time Locked Spectral Averaging

  27. Other stuff you can do – all under ‘Other’ in GUI • Merge data sessions together • Calculate a ‘grand mean’ across subjects • Rereference to a different electrode • FILTER

  28. Filtering Why would you want to filter?

  29. Potential Artefacts • Before Averaging… • Remove non-neural voltages • Sweating, fidgeting • Patients, children • Avoid saturating the amplifier • Filter at 0.01 Hz

  30. Potential Artefacts • After Averaging… • Filter Specific frequency bands • Remove persistent artefacts • Smooth data

  31. Types of Filter • Low-pass – attenuate high frequencies • High-pass – attenuate low frequencies • Band-pass – attenuate both • Notch – attenuate a narrow band
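
In SPM these map onto the band setting of spm_eeg_filter. The sketch shows a low-pass call; the comments note how the other types would be requested. Field names follow the SPM8/12 interface from memory and should be checked against your version.

    S = [];
    S.D    = 'made_mydata.mat';    % file to filter
    S.band = 'low';                % 'low' = low-pass; 'high', 'bandpass' and 'stop' select the other types
    S.freq = 40;                   % cutoff in Hz (use two values, e.g. [0.1 40], for 'bandpass'/'stop')
    D = spm_eeg_filter(S);         % writes an f-prefixed file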

  32. Properties of Filters • “Transfer function” • Effect on amplitude at each frequency • Effect on phase at each frequency • “Half Amp. Cutoff” • Frequency at which amp is reduced by 50%
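
Both properties can be read off numerically for any FIR kernel. The sketch below takes the three-point moving average used later in the talk, computes its amplitude response with an FFT, and finds the half-amplitude cutoff; the 250 Hz sampling rate is an arbitrary choice for the example.

    fs   = 250;                                   % assumed sampling rate in Hz
    b    = [1 1 1] / 3;                           % three-point moving-average kernel
    nfft = 4096;
    H    = abs(fft(b, nfft));                     % amplitude part of the transfer function
    f    = (0:nfft-1) * fs / nfft;                % frequency axis
    halfamp = f(find(H(1:nfft/2) <= 0.5, 1));     % first frequency where amplitude drops to 50%
    fprintf('Half-amplitude cutoff ~ %.0f Hz\n', halfamp);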

  33. High-pass

  34. Low-pass

  35. Band-pass and Notch

  36. Problems with Filters • Original waveform, band-pass of 0.01–80 Hz • Low-pass filtered, half-amp cutoff ≈ 40 Hz • Low-pass filtered, half-amp cutoff ≈ 20 Hz • Low-pass filtered, half-amp cutoff ≈ 10 Hz

  37. Filtering Artefacts • “Precision in the time domain is inversely related to precision in the frequency domain.”

  38. Filtering in the Frequency Domain

  39. Filtering in the Time Domain • Filtering in the time domain is analogous to smoothing • At a given point, an average is calculated over that point and its two (or more) nearest neighbours (x-1, x, x+1)

  40. Filtering in the Time Domain • Waveform progressively filtered by averaging the surrounding time points • Here x(t) is replaced by ( x(t-1) + x(t) + x(t+1) ) / 3
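
The same three-point average is a one-liner with conv; x is assumed to be one waveform (e.g. a single ERP channel), and the two edge samples are computed against zero padding just to keep the sketch short.

    kernel   = [1 1 1] / 3;               % each point becomes the mean of itself and its two neighbours
    x_smooth = conv(x, kernel, 'same');   % 'same' keeps the original length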

  41. Recipe for Preprocessing • Band-pass filter, e.g. 0.1–40 Hz • Epoch • Check/View • Merge • Downsample? • Artefacts: correction/rejection • Filter • Average
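
Put together as a script, the recipe might look like the skeleton below. It simply chains the hedged SPM calls sketched earlier (same caveats about version-specific field names), and leaves the check/view, merge and artefact steps as comments since their options depend on your data and chosen approach.

    S = []; S.D = 'mydata.mat'; S.band = 'bandpass'; S.freq = [0.1 40];
    D = spm_eeg_filter(S);                        % band-pass filter, e.g. 0.1-40 Hz

    S = []; S.D = D; S.timewin = [-100 600];
    S.trialdef(1).conditionlabel = 'stim';        % placeholder condition, as before
    S.trialdef(1).eventtype = 'trigger'; S.trialdef(1).eventvalue = 2;
    D = spm_eeg_epochs(S);                        % epoch

    % ... check/view the data, merge sessions if needed ...

    S = []; S.D = D; S.fsample_new = 200;
    D = spm_eeg_downsample(S);                    % downsample (optional)

    % ... artefact rejection/correction here, or rely on robust averaging below ...

    S = []; S.D = D;
    D = spm_eeg_average(S);                       % average (add a robust option if desired)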

  42. Recommendations • Prevention is better than the cure • During amplification and digitization minimize filtering • Keep offline filtering minimal, use a low-pass • Avoid high-pass filtering

  43. Summary • No substitute for good data • The recipe is only a guideline • Calibrate • Filter sparingly • Be prepared to get your hands dirty

  44. References • An Introduction to the Event-related Potential Technique, S. J. Luck • SPM Manual
