Processing of facial emotional expression: spatio-temporal data as assessed by scalp event-related potentials


European Journal of Neuroscience, Vol. 13, pp. 987–994, 2001

© Federation of European Neuroscience Societies

P. Krolak-Salmon,1,2 C. Fischer,1 A. Vighetto2 and F. Mauguière1

1Functional Neurology Unit and 2Neuro-Ophthalmology Unit, P. Wertheimer Neurological Hospital, 59 Boulevard Pinel, 69394 Lyon cedex 03, France

Keywords: late-latency potentials, retrograde neuromodulation, topographic maps

Correspondence: Dr P. Krolak-Salmon, as above. E-mail: [email protected]

Received 30 June 2000, revised 27 November 2000, accepted 3 January 2001

Abstract

Event-related potentials (ERPs) were recorded in 10 adult volunteers, who were asked to view pictures of faces with different emotional expressions, i.e. fear, happiness, disgust, surprise and neutral expression [Ekman, P. & Friesen, W.V. (1975). Pictures of Facial Affect. Consulting Psychologist Press, Palo Alto, CA]. ERPs were recorded during two different tasks with the same stimuli. Firstly, subjects were instructed to pay attention to the gender of the faces by counting males or females. Secondly, they had to focus on facial expressions by counting faces that looked surprised. The classical scalp `face-related potentials', i.e. a vertex-positive potential and a bilateral temporal negativity, were recorded 150 ms after stimulus onset. Significant differences were found, firstly, between late-latency ERPs to emotional faces and to neutral faces, between 250 and 550 ms of latency, and, secondly, among the ERPs to the different facial expressions, between 550 and 750 ms of latency. These differences appeared only during the expression discrimination task, not during the gender discrimination task. Topographic maps of these differences showed a specific right temporal activity related to each emotional expression, with some particularities observed for each expression. This study provides new data concerning the spatio-temporal features of facial expression processing, particularly a late-latency activity related to specific attention to facial expressions.

Introduction

Since Darwin (1872), facial emotional expressions have been known to play a crucial role in communication between individuals of the same species. Humans appear to be experts at analysing facial characteristics rapidly and precisely, especially emotional expressions, which carry the same social significance across civilizations (Ekman et al., 1969; Ekman & Friesen, 1975). The processing of this powerful vector of communication seems to develop from early childhood (Challamel, 1992), and may involve large neuronal populations in the occipito-temporal visual pathways and in limbic, paralimbic and frontal structures (Breiter et al., 1996; Morris et al., 1998; Phillips et al., 1998; Blair et al., 1999). Selective impairments in recognizing facial expressions, sparing the ability to recognize identity, suggest that the processing of facial expressions can be dissociated from the processing of other facial features (Tranel et al., 1988; Bowers et al., 1985; Young et al., 1993; Adolphs et al., 1994, 1996; Calder et al., 1996; Sprengelmeyer et al., 1996). Neurophysiological studies have demonstrated the existence of different populations of neurons responding to faces in the temporal lobe of monkeys, some sensitive to identity and others to facial expression (Hasselmo et al., 1989). Moreover, the two cerebral hemispheres do not play the same role in the processing of facial expressions. Most clinical and behavioural studies have underlined the greater involvement of the minor hemisphere (Etcoff, 1984; Borod et al., 1988; Ahern et al., 1991; Adolphs et al., 1996). However, electrophysiological studies (Vanderploeg et al., 1987; Lang et al., 1990; Laurian et al., 1991; Carretié & Iglesias, 1995; Streit et al., 1999) and imaging studies (Morris et al., 1996, 1998; Phillips et al., 1998; Blair et al., 1999), as well as some behavioural studies (Stone et al., 1996), have shown a specific involvement of the left hemisphere in right-handed people for the processing of positive emotions such as happiness. Imaging studies have also suggested that different neuronal populations could be selectively involved in the processing of the main types of facial expressions in humans, i.e. fear, happiness, disgust, sadness and anger (Breiter et al., 1996; Morris et al., 1996; Phillips et al., 1997, 1998; Scott et al., 1997; Blair et al., 1999). For example, the amygdala appears to participate in the processing of fear (Morris et al., 1996; Scott et al., 1997) and happiness (Breiter et al., 1996), and the anterior insula in the processing of disgust (Phillips et al., 1997, 1998). Because of their limited time resolution, imaging studies do not provide information on the temporal sequence of the processing of facial expressions, and only electrophysiological techniques can be helpful in understanding these temporal features. Event-related potential (ERP) studies have identified some emotion-modulated components but failed to demonstrate any differential processing among facial expressions, even though some showed differential electrical activities between emotional and non-emotional stimuli (Vanderploeg et al., 1987; Lang et al., 1990; Laurian et al., 1991; Carretié & Iglesias, 1995; Streit et al., 1999). A magnetoencephalography (MEG) single-trial study recently provided an interesting combination of spatial and temporal data concerning the processing of facial emotional expressions (Liu et al., 1999). Different activities related to emotional expressions, and depending on the neuropsychological task, were found in the fusiform gyrus and the amygdala bilaterally. Interestingly, these activities were extended in time, i.e. between 20 and 545 ms. The purpose of the present study is to contribute to the understanding of the relationship between temporal and spatial aspects of facial expression processing by using ERPs and their scalp distribution, without any explicit categorization task that could modify the spatial and temporal features of the brain processing. This study is based on the assumption that neutral and emotional faces, as well as different emotional expressions (fear, happiness and disgust), are processed in different neuronal networks, and aims at specifying the temporal features of their activation.

FIG. 1. Examples of faces depicting the different emotional expressions used in this study (Ekman & Friesen, 1975).

Materials and methods

Subjects

Ten healthy right-handed adult volunteers (six females and four males) aged between 20 and 34 years (mean, 27.5; SD, 4.6 years) were recruited among the undergraduate and graduate students of the Neurological Hospital of Lyon. Their vision was normal or corrected to normal. They reported no history of neurological, ophthalmological or systemic disease, and no current medication.

Stimuli and task procedure

Stimuli were 40 static grey-scale images of emotionally expressive faces (four females and four males, each depicting five different expressions, i.e. fear, happiness, disgust, surprise and neutral, without any emotion), taken from a standard set of pictures of facial affect (Ekman & Friesen, 1975) (Fig. 1). The digitized size-, brightness- and contrast-adjusted images were presented on a computer screen 110 cm in front of the subject, with a visual angle of 4 × 5°. They were exposed for 400 ms with an interval of 2000 ms between the onsets of successive images. A central white cross was used to help subjects with fixation of the pictures. Six series of 40 stimuli were delivered for each task. The order of the stimuli and the order of the series were randomized for each subject and for each task.

Subjects were engaged in two different consecutive target detection tasks. They were required to keep a mental count of the number of targets presented in each series and to report that count at the end of each series. During the first task, called `attention to gender', the subject made a gender discrimination by counting either males or females in each series, alternating gender between series; thus, during this condition, they counted three sets each of male and female faces over the course of six series. During the second task, called `attention to expression', recognition of the facial expression was required: subjects counted faces depicting surprise. ERPs to surprised faces (targets) were not included in the topographic maps or the statistical analysis, because the purpose of the study was not to describe target-related potentials; targets were used only to keep subjects' attention on the expression component of the faces during the `attention to expression' task. Each series was composed of almost the same stimuli, except for one or two male or female faces and one or two surprised faces. One or two targets were moved between runs, so that the number of targets differed among the runs; the total number of targets over the six runs was the same for each task. No motor task was given, to minimize possible sensorimotor interference. During the first condition, `attention to gender', subjects were not informed about the emotional expressions of the faces. The tasks were not randomized, in order to detect a possible implicit processing of facial expressions during the `attention to gender' task.
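For illustration, a minimal sketch of this presentation timing (400 ms exposure, 2000 ms stimulus-onset asynchrony, central fixation cross, randomized order) is given below using PsychoPy. The paper does not state the presentation software, and the image file names, window settings and monitor calibration here are assumptions, not the authors' setup.

```python
# Sketch of the stimulus timing described above (assumed PsychoPy implementation).
# Image file names are hypothetical placeholders for the Ekman & Friesen pictures.
import random
from psychopy import visual, core

win = visual.Window(size=(1024, 768), color="grey", units="deg",
                    monitor="testMonitor", fullscr=False)
fixation = visual.TextStim(win, text="+", color="white", height=0.5)

# 40 pictures: 8 identities x 5 expressions (fear, happiness, disgust, surprise, neutral).
stimuli = [f"face_{identity:02d}_{expr}.png"
           for identity in range(8)
           for expr in ("fear", "happiness", "disgust", "surprise", "neutral")]

for series in range(6):                      # six series per task
    random.shuffle(stimuli)                  # stimulus order randomized per series
    for path in stimuli:
        image = visual.ImageStim(win, image=path, size=(4, 5))  # ~4 x 5 deg
        image.draw()
        fixation.draw()                      # central white cross over the picture
        win.flip()
        core.wait(0.4)                       # 400 ms exposure
        fixation.draw()                      # fixation alone during the gap
        win.flip()
        core.wait(1.6)                       # completes the 2000 ms onset-to-onset interval

win.close()
```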

Recordings

The subjects were seated in a comfortable chair in a sound-, light- and electrically shielded recording room. They were instructed to look at the cross at the centre of each picture and trained to maintain fixation. The continuous electroencephalogram (EEG) was recorded with a 32-channel system and amplified using SynAmps amplifiers (Neuro Scan Labs®). Thirty-one scalp tin electrodes were attached to a cap (Electrocap®). Eighteen electrode sites belonged to the International 10–20 System (Jasper, 1958). Thirteen sites were added: three prefrontal electrodes (FP1/2/z), five fronto-central electrodes (FC3/4/7/8/z), three centro-parietal electrodes (CP3/4/z) and two temporo-parietal electrodes (TP7/8). A bipolar electrooculogram was recorded from the supraorbital ridge and outer canthus of the right eye; the nose tip was used as the reference site and the ground was located in the medio-frontal area. Impedances were kept below 5 kΩ. Data were recorded continuously at a 500-Hz sampling rate through a 0.1–70 Hz bandpass. EEG epochs were acquired beginning 100 ms prior to stimulus onset and continuing for 1500 ms. Codes synchronized to stimulus delivery were used to selectively average off-line the epochs associated with the different stimulus types. A baseline correction was performed automatically by subtracting the average of the 100 ms of prestimulus recording. After visual inspection, epochs with eye- or muscle-related artefacts exceeding 30 µV were rejected.

FIG. 2. Grand averages recorded by electrodes Fz, Cz, Pz, T5, T6, O1 and O2 across all subjects, elicited by neutral faces during the gender discrimination task, and topographic maps of the different ERPs. Of particular interest is the topographic representation of the presumed face-related potential N1, which is recorded in both occipito-temporal areas. P1, N1, P2, P3 and VPP: ERPs described in the text (see Results). Latency units are ms and amplitude units are µV.

FIG. 3. Grand averages recorded by electrodes Cz, T5 and T6 across all subjects, elicited by neutral and emotional faces (expressing fear, happiness, disgust and surprise) during (a) the gender discrimination task and (b) the expression discrimination task. P1, N1, P2, P3 and VPP: ERPs described in the text (see Results). Latency units are ms and amplitude units are µV. Significant differences (Student's t-test) are indicated for each period of 100 ms as N/F (neutral compared to fear), N/H (neutral compared to happiness), N/D (neutral compared to disgust), D/H (disgust compared to happiness) and D/F (disgust compared to fear).

ERP analysis

Mean ERPs

Across-subjects (grand-averaged) mean ERPs to all face expressions were computed for both tasks. Mean ERPs to neutral faces in the gender discrimination task were used to describe the ERPs to faces in general, which were observed with all stimuli, i.e. faces with different emotional expressions.

Topographic maps

Across-subjects mean ERPs were used to make topographic maps. The sequential topographic maps represented the ERP scalp voltage distribution flattened onto a two-dimensional space, with colour coding of voltage values. Voltages between electrode locations were interpolated using a spherical spline technique. Firstly, we made maps of grand-averaged ERPs to neutral faces during the `attention to gender' task, to study responses to the presentation of faces without any emotional expression and without any task related to facial expression analysis. Secondly, we made topographic maps of the differences between the ERPs elicited by each of the nontarget emotional faces and the ERPs elicited by neutral faces during the `attention to expression' task, to study emotional expression processing. We used these latter topographic maps to localize the areas of interest for statistical analysis.

Statistics

The amplitudes and latencies of early ERPs, and the mean amplitudes of long-latency ERPs during eight periods of 100 ms from 250 to 1050 ms, were obtained by computer program for each stimulus condition and each subject, and were used to combine a study of early- and late-latency neural activity related to facial expressions. Prior to statistical analysis, the data were screened for gaussian distribution and homogeneity of variance. Because the data met the assumptions required for analysis of variance, differences among these measures were tested using repeated-measures analysis of variance (ANOVA). For both tasks, peak amplitudes, peak latencies and mean amplitudes over the eight latency periods defined above were entered into separate repeated-measures ANOVAs with nontarget face expression (neutral, fear, happiness and disgust) and electrode site (O1 vs. O2, T5 vs. T6, Fz vs. Cz and Pz) as repeated-measures factors. The Greenhouse–Geisser correction (Greenhouse & Geisser, 1959) was applied to adjust the degrees of freedom of the F-ratios. Post hoc comparisons using Student's t-test were made to determine the significance of differences.

Subjective rating

After each series of stimuli, subjects were asked to report the total number of targets, i.e. males or females during the `attention to gender' task and surprised faces during the `attention to expression' task. After the ERP recording sessions, the subjects rated each face image (printed on individual copies) for its emotional expression (neutral, fear, happiness, disgust and surprise). Each image was printed on a separate 16 × 23 cm sheet. The sheets were presented in a random sequence. Only the first answer was accepted, without any time limitation.
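By way of illustration, the epoching, baseline correction, artefact rejection and selective per-condition averaging described under Recordings and ERP analysis could be expressed as follows with MNE-Python. This is a sketch under assumptions, not the authors' pipeline (the original study used Neuro Scan software); the file name and event codes are hypothetical.

```python
# Sketch of the ERP pipeline described above, using MNE-Python (an assumption;
# the original study used Neuro Scan acquisition/analysis software).
import mne

# Hypothetical continuous recording: 31 scalp channels, 500 Hz, 0.1-70 Hz bandpass.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)  # file name is a placeholder

# Codes synchronized to stimulus delivery identify the stimulus types.
events = mne.find_events(raw)
event_id = {"neutral": 1, "fear": 2, "happiness": 3, "disgust": 4, "surprise": 5}

# Epochs from -100 ms to 1500 ms; baseline = mean of the 100 ms prestimulus window.
# MNE's reject criterion is peak-to-peak, used here as a stand-in for the
# paper's 30 microvolt artefact threshold (which followed visual inspection).
epochs = mne.Epochs(raw, events, event_id,
                    tmin=-0.1, tmax=1.5,
                    baseline=(None, 0),
                    reject=dict(eeg=30e-6),
                    preload=True)

# Selective off-line averaging per stimulus type; grand averages would then
# combine these evoked responses across the ten subjects (mne.grand_average).
evokeds = {name: epochs[name].average() for name in event_id}
print(evokeds["neutral"])
```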

Results

Subjective rating

The ratio between the number of reported targets and the real number of targets was 99.2% in the gender classification task and 94.2% in the expression classification task. There was no association between gender classification errors and facial expression. In the postrecording explicit rating, 96% of the facial expressions were correctly classified.

Analysis of ERPs to neutral faces

This analysis is presented to describe the common peaks recorded with all the faces, whatever the expression, during the two tasks. The ERPs to all the faces were maximal in the occipito-temporal areas. The grand-averaged ERPs elicited by neutral faces during the `attention to gender' task in the temporo-occipital areas, i.e. recorded by T5, T6, O1 and O2, and their cartographic representations, are presented in Fig. 2. ERPs recorded in the midline area, i.e. under Fz, Cz and Pz, were added to show the polarity reversal of the early peaks along the caudo-rostral direction. In the occipito-temporal areas, the first positive peak (P1) was recorded with a mean latency of 102 ± 9.1 ms. The corresponding voltage distribution showed an occipital midline positive area associated with a mid-frontal negativity. The first negative peak (N1) in the occipito-temporal regions was associated with a vertex-positive potential (VPP) recorded by the midline electrodes, especially at the Cz site. The mean latencies of N1 and the VPP were, respectively, 151 ± 7.5 and 148 ± 7.2 ms. The voltage mapping showed a bilateral and symmetrical distribution of the N1 in both posterior temporal regions, associated with a smaller midline positivity distributed over the vertex and the parietal areas. The second positivity (P2), recorded in the occipital areas, peaked at a latency of 221 ± 11.5 ms after stimulus onset. It showed a symmetrical occipital distribution associated with a mid-frontal negativity. The third and latest positivity (P3) was recorded by all the midline, temporal and occipital electrodes with a mean latency of 411 ± 18 ms. It showed a large distribution, maximal over the vertex and the mid-parietal regions. As the neutral faces during the first task (`attention to gender') included targets in each series of stimulation (neutral male and female faces), we assume that this positivity might be a target-evoked P300 potential.

ERPs to facial expressions

Figure 3a presents the grand-averaged ERPs elicited by faces expressing fear, happiness, disgust, surprise or no emotion (neutral faces) during the first task, `attention to gender'. The potentials described above were evoked by all types of stimuli. The latencies and amplitudes of the grand-averaged ERPs for the different expressions look very similar on visual inspection of the traces. The ANOVA showed no significant difference among the latencies and amplitudes of the peaks, or among the mean amplitudes of the late-latency ERPs during the eight periods described above. Figure 3b presents the grand-averaged ERPs elicited by the different emotional faces during the second task, involving attention to facial expressions. The peaks described above were elicited by all types of stimuli, but the mean amplitudes of the late-latency ERPs appeared to be different. Inspection of the grand-averaged ERPs revealed the greatest differences among the ERPs recorded under T5 and T6.
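To make this windowed analysis concrete, the sketch below computes mean amplitudes over the eight 100-ms periods (250–1050 ms), the Greenhouse–Geisser epsilon used to adjust the ANOVA degrees of freedom, and a post hoc paired t-test, using NumPy/SciPy. The arrays and their shapes are hypothetical placeholders standing in for the study's data.

```python
# Sketch of the windowed amplitude analysis and Greenhouse-Geisser epsilon
# described under Statistics; the data here are hypothetical (10 subjects).
import numpy as np
from scipy import stats

fs = 500                      # sampling rate (Hz)
t0 = -0.1                     # epoch start relative to stimulus onset (s)

def window_mean(evoked, start_ms, stop_ms):
    """Mean amplitude of a subjects x samples array in a latency window."""
    i0 = int((start_ms / 1000.0 - t0) * fs)
    i1 = int((stop_ms / 1000.0 - t0) * fs)
    return evoked[:, i0:i1].mean(axis=1)

def greenhouse_geisser_epsilon(data):
    """GG epsilon for a subjects x conditions matrix (1/(k-1) <= eps <= 1)."""
    k = data.shape[1]
    S = np.cov(data, rowvar=False)            # covariance across conditions
    H = np.eye(k) - np.ones((k, k)) / k       # centering matrix
    Sc = H @ S @ H                            # double-centered covariance
    return np.trace(Sc) ** 2 / ((k - 1) * np.trace(Sc @ Sc))

# Hypothetical per-condition data at T6: subjects x samples (800 = 1.6 s at 500 Hz).
rng = np.random.default_rng(0)
conds = {c: rng.normal(size=(10, 800))
         for c in ("neutral", "fear", "happiness", "disgust")}

windows = [(250 + 100 * i, 350 + 100 * i) for i in range(8)]   # 250-1050 ms
for lo, hi in windows:
    means = np.column_stack([window_mean(conds[c], lo, hi) for c in conds])
    eps = greenhouse_geisser_epsilon(means)   # scales the ANOVA dfs
    t, p = stats.ttest_rel(means[:, 0], means[:, 1])  # post hoc: neutral vs. fear
    print(f"{lo}-{hi} ms: GG epsilon = {eps:.2f}, "
          f"neutral vs. fear t = {t:.2f}, p = {p:.3f}")
```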


FIG. 4. Topographic maps of the absolute values of the differences between ERPs to nontarget emotional faces expressing fear, happiness and disgust and ERPs to neutral faces, between stimulus onset and 900 ms of latency. Each map shows the topographic representation of the mean response during each time period of 50 ms. An occipital activity related to fear, happiness and disgust appears during the first period, i.e. between 250 and 550 ms of latency, and a right occipito-temporal activity related to fear and happiness appears during the second period, i.e. between 550 and 750 ms of latency. Absolute values were used to represent the topographic maps of these negative differences with a clear grey-level scale. Latency units are ms and amplitude units are µV.


The statistical analysis confirmed the observations from visual analysis of the grand-averaged ERPs. The ANOVA showed that the mean amplitudes of the late-latency ERPs elicited by the different facial expressions were significantly different during the time periods 250–350 ms (F(1,9) = 6.31, P = 0.024), 550–650 ms (F(1,9) = 6.59, P = 0.005) and 650–750 ms (F(1,9) = 4.31, P = 0.032). It also showed an interaction between the factor `expression' and the factor `electrode' during the time periods 350–450 ms (F(1,9) = 4.33, P = 0.028), 450–550 ms (F(1,9) = 4.49, P = 0.029) and 550–650 ms (F(1,9) = 4.06, P = 0.045). No significant difference was shown between the ERPs recorded by O1, O2, Fz, Cz and Pz. The t-tests showing significant differences among the mean amplitudes of the late-latency ERPs recorded by T5 and T6 are presented in Fig. 3b. The mean amplitudes of ERPs elicited by neutral faces were significantly different from all others between 250 and 550 ms. Between 550 and 750 ms of latency, the ERPs elicited by happy and fearful faces were significantly different from those elicited by disgusted faces. The ERPs related to neutral faces remained different from those elicited by emotional faces, especially fearful faces. The grand-averaged ERPs elicited by target faces, i.e. the surprised faces, showed a large positivity (mean latency 592 ± 81 ms) over the vertex and parietal areas, similar in shape and scalp distribution to the P300 evoked by target stimuli in classical odd-ball paradigms; this is indirect evidence that subjects recognized the targets well. Figure 4 presents the topographic maps of the differences between the ERPs elicited by the emotional faces, i.e. expressing, respectively, fear, happiness and disgust, and the ERPs elicited by the neutral faces expressing no emotion. These differences are all represented by a negative activity; absolute values are presented to allow a clearer colour scale. Consistent with the visual inspection of the grand-averaged ERPs, two different periods appeared on these topographic representations. During the first period, i.e. between 250 and 550 ms of latency, the differential activity was occipital, firstly symmetrical, then tending to lateralize to the right, around the right occipito-temporal area. During the second period, i.e. between 550 and 750 ms of latency, a negative activity related to fear and happiness occurred mainly in the right occipito-temporal area. The activity related to disgust occurred later, i.e. between 700 and 950 ms, predominating in frontal and right temporal areas. Of particular interest is the more diffuse and almost symmetrical activity related to happiness.
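The Fig. 4 maps amount to subtracting the evoked response to neutral faces from each emotional condition and mapping the absolute difference over successive 50-ms windows. A sketch with MNE-Python follows, reusing the hypothetical `evokeds` dictionary from the earlier Methods sketch; MNE's topographic plots likewise interpolate between electrodes with spherical splines.

```python
# Sketch of the Fig. 4 difference maps: (emotional - neutral) ERPs rendered as
# sequential scalp topographies. Assumes the hypothetical `evokeds` dict from
# the earlier MNE sketch.
import numpy as np
import mne

for emotion in ("fear", "happiness", "disgust"):
    # Difference wave: ERP to the emotional faces minus ERP to neutral faces.
    diff = mne.combine_evoked([evokeds[emotion], evokeds["neutral"]],
                              weights=[1, -1])
    # Absolute values were used in the paper to get a clear grey-level scale.
    diff.data = np.abs(diff.data)
    # One map per 50-ms step between stimulus onset and 900 ms.
    diff.plot_topomap(times=np.arange(0.0, 0.9, 0.05), ch_type="eeg")
```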

Discussion

This study demonstrates differences between the ERPs elicited by emotional faces and by neutral faces between 250 and 550 ms after the onset of the stimulus presentation, and among the ERPs elicited by different facial emotions between 550 and 750 ms of latency. These differences appeared only during the expression discrimination task. What does this differential activity reflect? We must consider whether these differences could be due to physical features of the emotional faces, i.e. the configurational organization of the face, reflecting some primary visual processing in the occipito-temporal cortex. The fact that a differential activity occurred only during the second task, whereas the same stimuli were used in the two experiments, demonstrates that this activity is not exclusively due to the physical features of the stimuli but depends on a cognitive processing. The late latency of this activity also argues against the hypothesis of an exclusive role of low-level processing-related activity: the cortical areas that process the configurational aspects of the pictures are supposed to be activated much earlier than 250 ms (Bentin et al., 1996). The activity related to the emotional expression aspect (shown by the difference between ERPs to faces expressing an emotion and to neutral faces) was recorded in the occipital regions between 250 and 550 ms, which suggests that attention can modulate the processing of physical features by the occipital cortex. It is very unlikely that this occipital activity was due to the discrimination of the emotional value of facial expressions itself; rather, it points to a top-down neuromodulation, i.e. a retrograde influence from downstream structures onto the occipital lobe. This retrograde modulation may modify the processing of configurational features in the occipital lobe, thereby providing backwards and forwards neuromodulations between occipital areas and downstream structures. It is also very unlikely that this differential activity was due to a modification of the emotional state of the subjects by emotional stimuli such as facial expressions, to an arousal-related activity, or to an activity related to any emotional stimulus, because it appeared only during the `attention to expression' task; such an effect would have been as important, or at least present, during the first task, rather than appearing exclusively during the second task, when the subjects were paying attention to facial expressions. Therefore, this differential activity, between emotional and neutral faces on the one hand and among emotional faces on the other, appears to reflect spatio-temporal aspects of the discrimination of human facial emotional expressions along the visual occipito-temporal pathways during a facial-expression attention task.

Spatio-temporal analysis

The visual analysis of the grand-averaged ERPs, the results of the statistical study and the topographic maps were concordant in demonstrating the existence of different ERPs to facial expressions during an expression-attention task. The activity related to emotional facial expressions appeared not to be circumscribed to a specific latency or one neuronal population, as suggested in previous studies (Vanderploeg et al., 1987; Carretié & Iglesias, 1995; Streit et al., 1999), but to occur during two main periods. During the first period, i.e. between 250 and 550 ms of latency, the ERPs related to neutral faces differentiated from the others, as if face processing was then implicated predominantly in the discrimination between emotional and non-emotional stimuli. This activity was symmetrically recorded in the occipital area, but it was somewhat lateralized to the right hemisphere at the end of the period. This is consistent with an implication of the visual occipital cortex and the right occipito-temporal junction (Perrett et al., 1982; Hasselmo et al., 1989; Tanaka, 1996) in the discrimination of configurational features related to facial expression analysis. This activity was recorded only when attention was paid to face expression. The second period occurred between 550 and 750 ms after the onset of the stimulus presentation. During this period, differences between the ERPs related to neutral faces and those related to emotional faces were still observed, as well as differences among the ERPs related to the different emotional expressions, i.e. fear, happiness and disgust. This activity was localized mostly in the right posterior temporal area.
The activity related to happiness spread more over the left hemisphere than did the activity related to fear. This is consistent with behavioural studies (Reuter-Lorenz & Davidson, 1981; Adolphs et al., 1996; Stone et al., 1996) and functional imaging studies (Breiter et al., 1996), which reported a predominant role of the left hemisphere in the processing of `positive' emotions, even if some studies reported the implication of left areas in the processing of fear (Morris et al., 1998). The processing of happiness seems to implicate diffuse and bilateral neuronal networks, which makes it more robust to focal brain lesions (Adolphs et al., 1996). In our study, the processing of fear appears to be performed predominantly in the right hemisphere, which is consistent with the presumed role of the right hemisphere in urgent and threatening situations (Van Strien & Morpugo, 1992; Adolphs et al., 1996; Morris et al., 1999). Of particular interest is the lack of a significant difference between the ERPs elicited under the temporal electrodes by happy faces and those elicited by fearful faces, although their topographic representations were somewhat different. We can assume that these two expressions, opposed in their emotional polarity, i.e. positive or negative, may be partly or mainly processed in structures whose activities are not directly reflected on the scalp, such as the amygdala (Morris et al., 1996, 1998, 1999; Adolphs et al., 1994; Breiter et al., 1996). The difference between the ERPs elicited by disgusted faces and those elicited by neutral faces was smaller than the differences between fear and neutral and between happiness and neutral. This differential activity was also recorded later (700–950 ms of latency, compared with 550–750 ms for fear and happiness), in the right fronto-temporal area (Fig. 4). This can be related to the hypothesis that a diffuse network, including the basal ganglia, insula and frontal cortex, is implicated in the processing of disgust. This was suggested by Sprengelmeyer et al. (1996), who found decreased recognition of disgust in patients with Huntington's disease, and confirmed by Phillips et al. (1998), who found activities related to the recognition of disgust in the anterior insula and in structures linked to a limbic cortico-striatal-thalamic circuit, using functional magnetic resonance imaging. The temporal characteristics of these differential activities are particularly interesting. Using MEG, Streit et al. (1999) compared records obtained with the same stimuli, i.e. faces depicting emotional expressions, but not the same task, i.e. a face/object recognition task and a facial expression recognition task. They thus demonstrated that neural activity related to attention to emotional facial expressions could occur between 140 and 530 ms after stimulus onset. They also showed that different neural structures, i.e. mainly different parts of the temporal cortex, the amygdala and the inferior frontal cortex, could be involved in the processing of facial expressions at different latencies. Using the same protocol in one subject with a whole-head system, these authors confirmed this widespread activation in time and space, especially in the fusiform gyrus and the amygdala (Liu et al., 1999). These findings are consistent with the late latencies of the significantly different activities related to facial expressions found in our study by comparing ERPs during the same task. Vanderploeg et al. (1987) had hypothesized earlier the existence of a widespread activation in time by emotional facial expressions during a target discrimination task. The target-evoked P300 potential was larger over the left hemisphere with neutral targets, and the `slow wave' occurring between 448 and 616 ms was larger over the left hemisphere with emotional targets.
Of particular interest is the role of the psychological task, which may be crucial in determining the extent to which each hemisphere is engaged. These authors used an explicit categorization task and observed differences exclusively in ERPs elicited over the left hemisphere. In our study, subjects did not have to categorize the nontarget emotional expressions explicitly, but just to detect one facial expression, i.e. surprise. This procedure made our subjects pay attention to facial expressions without performing an explicit categorization task, so the activity of the right occipito-temporal areas was not modified by an explicit categorization-related activity. A behavioural study of a callosotomized patient showed, in addition, that performance in the detection of emotional facial expressions depended on the task required and the hemisphere stimulated (Stone et al., 1996). When the left hemisphere was stimulated by faces in the right visual field, performance was better with a categorization task; when the right hemisphere was stimulated, performance was better with an association task.

A retrograde neuromodulation?

The differential activity related to emotional expressions appeared late, i.e. between 250 and 550 ms in the occipital areas bilaterally, and between 550 and 750 ms in the right occipito-temporal areas. Neuronal information needs an average of 10 ms to pass from one cortical area to the next (Halgren et al., 1994). It is therefore very unlikely that these occipito-temporal activities belong to an exclusively one-way sequential stream of processing in the ventral visual pathways. These very late latency activities provide striking support for the implication of a top-down influence, a retrograde neuromodulation coming from downstream structures such as limbic, paralimbic and/or frontal areas and influencing extrastriate visual areas. Functional imaging studies have already shown the role of temporal areas in the processing of facial emotional expressions, without providing any temporal data (Morris et al., 1998; Phillips et al., 1998). We demonstrate here that this implication appears late, after limbic structures have already been involved in processing visual inputs (Halgren et al., 1994). This extrastriate activity seems to be highly related to the psychological task. This kind of activity can exist without any attention paid to facial expressions, for example during a gender discrimination task (Morris et al., 1998); it appeared to be at least amplified by an expression discrimination task, and was then detected on the scalp in our study. The amygdala is one of the candidates for this neuromodulation. Its extensive anatomical connections allow it to integrate exteroceptive and interoceptive stimuli and to modulate sensory, motor and autonomic processing (McDonald, 1998). Moreover, the outputs of the amygdala include strong reciprocal projections to the temporal cortex and to earlier visual areas in the occipital lobe (Iwai & Yukie, 1987). Its role in processing emotional facial expressions in humans has been demonstrated by the deficits produced by restricted amygdala lesions (Adolphs et al., 1994; Calder et al., 1996) and by functional neuroimaging studies (Breiter et al., 1996; Morris et al., 1996, 1998). Using positron emission tomography, Morris et al. (1998) showed that amygdalar responses predict expression-specific neural activity in extrastriate cortex, suggesting a role for the amygdala in a retrograde neuromodulation of the processing of human facial expressions. The insula may also play a role in the neuromodulation of extrastriate cortex, especially for faces depicting disgust. Using functional MRI, Phillips et al. (1997) demonstrated a neural substrate for the perception of facial expressions of disgust involving primarily the anterior insula, which is also involved in processing pleasant and unpleasant tastes (Yaxley et al., 1988). Our results concerning faces depicting disgust, especially the topographic representations, are concordant with an involvement of the right insula and the right frontal lobe. Interestingly, the right frontal lobe seems to play a crucial role in the discrimination of facial expressions: for example, enhanced activities for emotional vs. neutral faces were observed in the right orbitofrontal cortex (Morris et al., 1998), and for disgusted vs. neutral faces in the right medial frontal cortex and the dorsolateral prefrontal cortex (Phillips et al., 1997).

Conclusion

To our knowledge, this study is the first to show different visual evoked potentials related to facial expressions. The original tasks used in this study aimed at reducing verbal, categorization-related, nonvisual left-hemisphere activities, which could have masked activity related to facial expression processing itself. Differences were observed firstly between neutral and emotional stimuli in the occipital area, and secondly among the different emotional facial expressions in the right occipito-temporal area. It would obviously be artificial to separate two strictly distinct periods in the visual processing, but we can suppose that a relatively predominant neuronal activity is related firstly to the detection of an emotional stimulus and secondly to the specification of the emotional expression. The late latencies of this processing support the idea that downstream structures can modulate the processing of facial expressions in extrastriate cortex. This retrograde neuromodulation appears to influence early stages of visual processing in the occipital lobe and right temporo-occipital areas, and depends on the neuropsychological context, i.e. the task and the attention required during the recording.

Abbreviations

EEG, electroencephalogram; ERP, event-related potential; MEG, magnetoencephalography; P1, first positive peak; N1, first negative peak; VPP, vertex-positive potential.

References

Adolphs, R., Damasio, H., Tranel, D. & Damasio, A.R. (1996) Cortical systems for the recognition of emotion in facial expressions. J. Neurosci., 16, 7678–7687.

Adolphs, R., Tranel, D., Damasio, H. & Damasio, A. (1994) Impaired recognition of emotion in facial expressions following bilateral damage to the human amygdala. Nature, 372, 669–672.

Ahern, G.L., Schomer, D.L., Kleefield, J., Blume, H., Rees Cosgrove, G., Weintraub, S. & Mesulam, M.M. (1991) Right hemisphere advantage for evaluating emotional facial expressions. Cortex, 27, 193–202.

Bentin, S., Allison, T., Puce, A., Perez, E. & McCarthy, G. (1996) Electrophysiological studies of face perception in humans. J. Cogn. Neurosci., 8, 551–565.

Blair, R.J.R., Morris, J.S., Frith, C.D., Perrett, D.I. & Dolan, R.J. (1999) Dissociable neural response to facial expressions of sadness and anger. Brain, 122, 883–893.

Borod, J.C., Koff, E., Perlman, D., Lorch, M., Nicholas, M. & Welkowitz, J. (1988) Emotional and non-emotional facial behaviour in patients with unilateral brain damage. J. Neurol. Neurosurg. Psychiatry, 51, 826–832.

Bowers, D., Bauer, R.M., Coslett, H. & Heilman, K.M. (1985) Processing of faces by patients with unilateral hemisphere lesions. Brain Cognit., 4, 258–272.

Breiter, H.C., Etcoff, N.L., Whalen, P.J., Kennedy, W.A., Rauch, S.L., Buckner, R.L., Strauss, M.L., Hyman, S.E. & Rosen, B.R. (1996) Response and habituation of the human amygdala during visual processing of facial expression. Neuron, 17, 875–887.

Calder, A.J., Young, A.W., Rowland, D., Perrett, D.I., Hodges, J.R. & Etcoff, N.L. (1996) Facial emotion recognition after bilateral amygdala damage: differentially severe impairment of fear. Cogn. Neuropsychol., 13, 699–745.

Carretié, L. & Iglesias, J. (1995) An ERP study on the specificity of facial expression processing. Internat. J. Psychophysiol., 19, 183–192.

Challamel, M.J. (1992) Fonctions du sommeil paradoxal et ontogenèse. Neurophysiol. Clin., 22, 117–132.

Darwin, C. (1872) The Expression of Emotions in Man and Animals. John Murray, London.

Ekman, P. & Friesen, W.V. (1975) Pictures of Facial Affect. Consulting Psychologist Press, Palo Alto, CA.

Ekman, P., Sorenson, E.R. & Friesen, W.V. (1969) Pan-cultural elements in facial displays of emotion. Science, 164, 86–88.

Etcoff, N.L. (1984) Selective attention to facial identity and facial emotion. Neuropsychologia, 22, 281–295.

Greenhouse, S.W. & Geisser, S. (1959) On methods in the analysis of profile data. Psychometrika, 24, 95–112.

Halgren, E., Baudena, P., Heit, G., Clarke, M. & Marinkovic, K. (1994) Spatio-temporal stages in face and word processing. 1. Depth recorded potentials in the human occipital and parietal lobes. J. Physiol. (Paris), 88, 1–50.

Hasselmo, M.A., Rolls, E.T. & Baylis, G.C. (1989) The role of expression and identity in the face-selective responses of neurons in the temporal visual cortex of the monkey. Behav. Br. Res., 32, 203–218.

Iwai, E. & Yukie, M. (1987) Amygdalofugal and amygdalopetal connections with modality-specific visual cortical areas in macaques (Macaca fuscata, M. mulatta, and M. fascicularis). J. Comp. Neurol., 261, 362–387.

Jasper, H. (1958) Report of the committee on methods of clinical examination in electroencephalography. Electroencephalogr. Clin. Neurophysiol., 10, 370–375.

Lang, S.F., Nelson, C.A. & Collins, P.F. (1990) Event-related potentials to emotional and neutral stimuli. J. Clin. Exp. Neuropsychol., 12, 946–958.

Laurian, S., Bader, M., Lanares, J. & Oros, L. (1991) Topography of event-related potentials elicited by visual emotional stimuli. Internat. J. Psychophysiol., 10, 231–238.

Liu, L., Ioannides, A.A. & Streit, M. (1999) Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotions. Brain Topography, 11, 291–303.

McDonald, A.J. (1998) Cortical pathways to the mammalian amygdala. Prog. Neurobiol., 55, 257–332.

Morris, J.S., Friston, K.J., Büchel, C., Frith, C.D., Young, A.W., Calder, A.J. & Dolan, R.J. (1998) A neuromodulatory role for the human amygdala in processing emotional facial expressions. Brain, 121, 47–57.

Morris, J.S., Frith, C.D., Perrett, D.I., Rowland, D., Young, A.W., Calder, A.J. & Dolan, R.J. (1996) A differential neural response in the human amygdala to fearful and happy facial expressions. Nature, 383, 812–815.

Morris, J.S., Öhman, A. & Dolan, R.J. (1999) A subcortical pathway to the right amygdala mediating `unseen' fear. Proc. Natl. Acad. Sci. USA, 96, 1680–1685.

Perrett, D.I., Rolls, E.T. & Caan, W. (1982) Visual neurons responsive to faces in the monkey temporal cortex. Exp. Br. Res., 47, 329–342.

Phillips, M.L., Young, A.W., Scott, S.K., Brammer, M., Calder, A.J., Andrew, C., Giampietro, V., Williams, S.C.R., Bullmore, E.T. & Gray, J.A. (1998) Neural responses to facial and vocal expressions of fear and disgust. Proc. R. Soc. Lond., 265, 1809–1817.

Phillips, M.L., Young, A.W., Senior, C., Brammer, M., Andrew, C., Calder, A.J., Bullmore, E.T., Perrett, D.I., Rowland, D., Williams, S.C.R., Gray, J.A. & David, A.S. (1997) A specific neural substrate for perceiving facial expressions of disgust. Nature, 389, 495–498.

Reuter-Lorenz, P. & Davidson, R.J. (1981) Differential contributions of the two cerebral hemispheres to the perception of happy and sad faces. Neuropsychologia, 19, 609–613.

Scott, S.K., Young, A.W., Calder, A.J., Hellawell, D.J., Aggleton, J.P. & Johnson, M. (1997) Impaired auditory recognition of fear and anger following bilateral amygdala lesions. Nature, 385, 254–257.

Sprengelmeyer, R., Young, A.W., Calder, A.J., Karnat, A., Lange, H., Hömberg, V., Perrett, D.I. & Rowland, D. (1996) Loss of disgust. Perception of faces and emotions in Huntington's disease. Brain, 119, 1647–1665.

Stone, V., Nisenson, L., Eliassen, J.C. & Gazzaniga, M.S. (1996) Left hemisphere representations of emotional facial expressions. Neuropsychologia, 34, 23–29.

Streit, M., Ioannides, A.A., Liu, L., Wölwer, W., Dammers, J., Gross, J., Gaebel, W. & Müller-Gärtner, H.W. (1999) Neurophysiological correlates of the recognition of facial expressions of emotion as revealed by magnetoencephalography. Cogn. Br. Res., 7, 481–491.

Tanaka, K. (1996) Inferotemporal cortex and object vision. Annu. Rev. Neurosci., 19, 109–139.

Tranel, D., Damasio, A.R. & Damasio, H. (1988) Intact recognition of facial expression, gender and age in patients with impaired recognition of faces. Neurology, 38, 690–696.

Van Strien, J.W. & Morpugo, M. (1992) Opposite hemispheric activations as a result of emotionally threatening and non-threatening words. Neuropsychologia, 30, 845–848.

Vanderploeg, R.D., Brown, W.S. & Marsh, J.T. (1987) Judgments of emotion in words and faces: ERP correlates. Internat. J. Psychophysiol., 5, 193–205.

Yaxley, S., Rolls, E.T. & Sienkiewicz, Z.T. (1988) The responsiveness of neurons in the insular gustatory cortex of the macaque monkey is independent of hunger. Physiol. Behav., 42, 223–229.

Young, A.W., Newcombe, F., de Haan, E.H., Small, M. & Hay, D.C. (1993) Face perception after brain injury. Selective impairments affecting identity and expression. Brain, 116, 941–959.

