City, University of London
The deposit contains 243 files in BrainVision format: for each recording, a raw data file (.eeg), a header file (.vhdr), and a marker file (.vmrk), named by participant number (e.g. 01Emotion.eeg, 01Emotion.vhdr, 01Emotion.vmrk).

EEG of facial emotion judgment task including tactile probes

Version 2 2023-05-24, 11:42
Version 1 2022-10-10, 09:58
dataset
posted on 2023-05-24, 11:42 authored by Bettina Forster, Irena Arslanova

EEG recordings of 35 participants who gave informed written consent before taking part in a facial emotion judgement task. The study was approved by the Psychology Research Ethics Committee at City, University of London.

Participants were seated in an electromagnetically shielded, sound-attenuated, dimly lit room, viewing a 60 Hz computer monitor at a distance of 80 cm. EEG was recorded from 64 Ag/AgCl active electrodes, of which 60 were mounted equidistantly on an elastic cap (M10 montage; EasyCap GmbH, Herrsching, Germany); standard EEG preparation procedures (i.e. degreasing of the skin and use of electrolyte) were used to ensure good signal quality. Electrodes were referenced to the right earlobe. The horizontal electrooculogram (HEOG) was recorded by placing electrodes 62 and 63 about 1 cm lateral to the external canthi of each eye, and the ECG was recorded by placing electrode 64 about 2 cm under the left collarbone. Continuous EEG was recorded with a BrainAmp amplifier (Brain Products; amplifier bandpass 0.01–100 Hz) at a 500 Hz sampling rate.
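Stimulus events are stored in the .vmrk marker files that accompany each .eeg recording. Below is a minimal sketch of how those markers could be read, assuming the standard BrainVision marker-line layout (`Mk<n>=<type>,<description>,<position>,<points>,<channel>`, with positions given as sample indices at the 500 Hz rate noted above); the sample lines in the usage note are illustrative, not taken from the actual files:

```python
def parse_vmrk_markers(lines):
    """Parse marker entries from a BrainVision .vmrk marker file.

    Assumes the standard layout for marker lines:
        Mk<n>=<type>,<description>,<position>,<points>,<channel>
    where <position> is a sample index; this recording used 500 Hz.
    """
    sampling_rate = 500.0  # Hz, as stated in the recording description
    markers = []
    for line in lines:
        line = line.strip()
        if not line.startswith("Mk") or "=" not in line:
            continue  # skip section headers, comments, blank lines
        _, value = line.split("=", 1)
        fields = value.split(",")
        if len(fields) < 5:
            continue  # not a well-formed marker line
        mtype, description, position = fields[0], fields[1], int(fields[2])
        markers.append({
            "type": mtype,
            "description": description.strip(),
            "sample": position,
            "time_s": position / sampling_rate,
        })
    return markers
```

For example, an illustrative line `Mk2=Stimulus,S 11,2500,1,0` would decode to a stimulus event 5.0 s into the recording. In practice, a dedicated reader such as MNE-Python's `mne.io.read_raw_brainvision` handles the header, data scaling, and markers in one step.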

A set of 80 face pictures (20 per emotion) depicting angry, sad, happy, and neutral expressions from the Karolinska Directed Emotional Faces set (Lundqvist et al., 1998) was grey-scaled and enclosed in a rectangular frame (1.40 × 1.57 inches), excluding most of the hair and non-facial contours. Face stimuli were presented centrally on a black background using E-Prime 2 software (Psychology Software Tools, Pittsburgh, PA), which also controlled delivery of the tactile stimuli. The tactile stimuli were completely task-irrelevant; their purpose was to probe somatosensory activity. They were delivered by 12 V solenoids (5 mm in diameter) attached with microporous tape to the tip of the left index finger. When a current passed through a solenoid, tactile stimulation was delivered by driving a metal rod with a blunt conical tip that contacted the participant's fingertip. To mask the sounds made by the tactile stimulators, white noise (65 dB, measured at the participant's head) was presented through a loudspeaker placed 70 cm in front of the participant.

Each trial of the emotion judgement task started with the presentation of a fixation cross (500 ms), followed by a neutral, sad, happy, or angry face (600 ms; trigger codes ending in 1, 2, 3, or 4, respectively). On half of the trials, in addition to the face picture, participants received brief (5 ms), task-irrelevant tactile stimulation. In these visual–tactile conditions, tactile stimuli were delivered 105 ms after face onset (Pitcher et al., 2008; Sel et al., 2014; trigger codes S11–S14 and S111–S114). The other half of trials were visual-only trials, on which the same facial stimuli were presented an equal number of times as in the visual–tactile condition but without tactile stimulation; a trigger code (S21–S24 or S121–S124) was nevertheless recorded with the same timing as on visual–tactile trials (i.e. 105 ms after face onset). We used 20 practice trials that did not contain any of the experimental material (5 trials per condition, with 8 trials followed by a question about the emotional expression). The overall experiment consisted of 800 randomized trials, presented in four blocks, comprising 200 neutral, 200 angry, 200 sad, and 200 happy faces. In 10% of the trials of each block (trigger codes S111–S114 or S121–S124), participants were asked whether the face stimulus was happy, sad, or angry. Participants were told to observe the faces on the screen closely, to ignore all tactile stimuli, and to respond vocally (yes/no) as quickly as possible whenever a question was presented (maximum response time 3000 ms). The questions were included to ensure that participants directed attention to the task and judged each facial expression. Participants were offered a break between blocks.
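The trigger scheme above can be decoded digit by digit: the last digit gives the emotion (1–4 = neutral, sad, happy, angry), the tens digit distinguishes visual–tactile (1) from visual-only (2) trials, and a leading hundreds digit marks trials followed by an emotion question. A minimal sketch of such a decoder (the exact marker string, e.g. 'S 11' vs. 'S11', may vary with export settings, so both spellings are accepted):

```python
def decode_trigger(code):
    """Decode a stimulus trigger code (e.g. 'S 11', 'S124') into condition labels.

    Coding scheme from the task description:
      last digit 1-4      -> neutral, sad, happy, angry
      tens digit 1 / 2    -> visual-tactile / visual-only
      hundreds digit '1'  -> trial followed by an emotion question
    """
    emotions = {1: "neutral", 2: "sad", 3: "happy", 4: "angry"}
    n = int(code.lstrip("S").strip())  # tolerate 'S 11' and 'S11'
    return {
        "emotion": emotions[n % 10],
        "modality": "visual-tactile" if (n // 10) % 10 == 1 else "visual-only",
        "question": n >= 100,
    }
```

For instance, 'S 11' decodes to a neutral visual–tactile trial without a question, and 'S124' to an angry visual-only trial followed by a question.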


  

References

Lindquist, K. A., Wager, T. D., Kober, H., Bliss-Moreau, E., & Barrett, L. F. (2012). The brain basis of emotion: a meta-analytic review. The Behavioral and Brain Sciences, 35(3), 121–143. https://doi.org/10.1017/S0140525X11000446

Pitcher, D., Garrido, L., Walsh, V., & Duchaine, B. C. (2008). Transcranial magnetic stimulation disrupts the perception and embodiment of facial expressions. The Journal of Neuroscience, 28(36), 8929–8933. https://doi.org/10.1523/JNEUROSCI.1450-08.2008

Sel, A., Forster, B., & Calvo-Merino, B. (2014). The emotional homunculus: ERP evidence for independent somatosensory responses during facial emotional processing. The Journal of Neuroscience, 34(9), 3263–3267.

Funding

Experimental Psychology Society

School of Health & Psychological Sciences
