Research article | Recurring Topics

When is a Face a Face? Schematic Faces, Emotion, Attention and the N170

  • Emotional facial expressions provide important non-verbal cues as to the imminent behavioural intentions of a second party. Hence, within emotion science the processing of faces (emotional or otherwise) has been at the forefront of research. Notably, however, such research has led to a number of debates, including the ecological validity of utilising schematic faces in emotion research and the face-selectivity of the N170. To investigate these issues, we explored the extent to which the N170 is modulated by schematic faces, emotional expression and/or selective attention. Eighteen participants completed a three-stimulus oddball paradigm with two scrambled faces as the target and standard stimuli (counter-balanced across participants), and schematic angry, happy and neutral faces as the oddball stimuli. Results revealed that the magnitude of the N170 associated with the target stimulus was: (i) significantly greater than that elicited by the standard stimulus, (ii) comparable with the N170 elicited by the neutral and happy schematic face stimuli, and (iii) significantly reduced compared to the N170 elicited by the angry schematic face stimulus. These findings extend the current literature by demonstrating that the N170 can be modulated by events other than those associated with structural face encoding; here, the act of labelling a stimulus a ‘target’ to attend to modulated the N170 response. Additionally, the observations that schematic faces elicit N170 responses similar to those recorded for real faces and that, akin to real faces, angry schematic faces elicit heightened N170 responses suggest caution should be taken before disregarding schematic facial stimuli in emotion processing research per se.

    Citation: Frances A. Maratos, Matthew Garner, Alexandra M. Hogan, Anke Karl. When is a Face a Face? Schematic Faces, Emotion, Attention and the N170[J]. AIMS Neuroscience, 2015, 2(3): 172-182. doi: 10.3934/Neuroscience.2015.3.172



    1. Introduction

    The ability to recognise the facial expressions of others is a fundamental component of both survival [1] and social functioning [2]. For example, an emotional facial expression provides important non-verbal cues as to the imminent behavioural intentions of a second party and hence guides one's own behaviour (e.g. fight, flight, withdraw or approach). Consequently, it should be no surprise that within the field of emotion science the processing of faces (emotional or otherwise) has been at the forefront of behavioural, clinical and neuroscience research. Notably, however, a number of controversies have arisen from such research. Amongst others, these include the face-selectivity of the N170 and the ecological validity of utilising schematic faces in emotion research. In the present paper we report evidence that speaks to both.

    The N170, a negative-going deflection within the 120-220 ms range, is a robust event-related potential argued to be selectively elicited by face stimuli [3,4]. It shows a greater magnitude at occipito-temporal sites, particularly over the right hemisphere, to pictures of faces compared with other (non-face) stimuli [5]. The early occurrence of this component following stimulus presentation suggests the N170 reflects structural encoding of faces prior to their recognition [6,7]. Consistent with this, the N170 has been used to investigate several aspects of face-specific processing, such as feature sensitivity, orientation sensitivity, and familiarity; all of which correspond to structural encoding stages outlined in cognitive models of face processing [8,9].

    However, in recent years there has been debate as to whether this component is the earliest face-sensitive component [10,11,12,13] and, additionally, whether the N170 can be modulated by expertise and/or endogenous (e.g. higher-level) factors. For example, Dering, Hoshino & Thierry [14] argue that the N170 is not face-selective but reflects more general ‘expertise’: they found that the magnitude of the word-inversion N170 was comparable to the face-inversion N170 for expert readers, but not for novice (i.e. late) readers of English. This is similar to earlier research by Tanaka & Curran [15], who demonstrated that individuals who become skilled at identifying specific object categories also show an enhanced N170 response to that category [see also 16]. Fan et al. [17] have further observed that the N170 is modulated by competition (consistent with earlier research by Rugg et al. [18]), and Schinkel, Ivanova, Kurths & Sommer [19] that adaptation of the N170 is modulated by top-down factors. In the latter research, following the viewing of a range of adaptor stimuli (i.e. houses, Mooney faces, and specific single face features), Schinkel et al. had participants decide whether a test face stimulus had visible teeth (feature-specific) or was male/female (holistic). Results revealed that N170 adaptation effects were modulated by the specific task (i.e. feature-specific or holistic) the participant engaged in rather than by which adaptor stimulus they had previously viewed. Thus, Schinkel et al. argued that the N170 is not a fully automated process, given it is modulated by endogenous factors (in this case specific cognitive demands; but see also [20]). Taken together, the above studies provide robust evidence that factors other than configural/holistic face processing can modulate the N170 [21]. In consequence, it could be that something as simple as stimulus saliency and/or directed attention can also modulate N170 responding.

    Leading on from this is the debate as to whether the N170 is modulated by the emotional valence (or saliency) of the face viewed [22]. Whereas several early studies found no evidence for emotional modulation of the N170 [23,24], a number of more recent studies demonstrate increased N170 amplitudes for threatening emotional faces compared with neutral or positive emotional face stimuli [25,26,27]. Moreover, increased N170 amplitude effects hold for non-attended [28] and subliminally presented threatening faces [29]. Indeed, in a recent meta-analysis of 57 experiments reporting the N170 in response to facial expressions, the N170 was found to be differentially sensitive to different emotional facial expressions [30]. Specifically, Hinojosa et al. found that angry expressions, as compared to other emotional expressions, were associated with the greatest amplitude increase of the N170 response relative to neutral faces and, secondly, that task-related factors also modulated the N170. Effect sizes were greater for studies in which the task utilised was non-direct; that is, tasks in which the different facial expressions were task-irrelevant resulted in the greatest modulations of N170 responding to emotional face stimuli. Thus, findings from studies of emotional faces again call into question the N170 as an ERP marker for structural face encoding only, with the meta-analysis of Hinojosa et al. clearly demonstrating that the N170 is sensitive to task-relevancy and to the emotional expression of face stimuli, especially displays of anger.

    The finding that angry faces elicit the greatest modulation of the N170 is consistent with cognitive and neural models of fear and threat processing (i.e. threat superiority), which hold that early/rapid processing should be particularly apparent for biologically prepared stimuli with evolutionary significance [31,32]. This not only fits with a number of MEG studies examining the temporal dynamics of face processing [33,34,35], but is also consistent with a plethora of behavioural data suggesting that low-level and emotional factors can modulate face processing [36,37,38]. Of note, in a number of such studies schematic faces have been employed to control for perceptual differences between emotional expressions [39,40]. For example, utilising the Öhman, Lundqvist & Esteves [41] schematic face set, Maratos and colleagues have repeatedly demonstrated threat superiority in processes of attention and working memory [42,43,44]. Yet, despite this, few studies have investigated the N170 to schematic emotional faces and, to our knowledge, none have utilised the Öhman et al. set in analyses of the N170. Exploration of the N170 here would be useful, given that a major concern about schematic faces in emotion research is that they lack ecological validity [45,46].

    Thus the purpose of the present study was two-fold: i) to further investigate the extent to which endogenous processes, and specifically selective attention, modulate the N170; and ii) to investigate the extent to which schematic faces, and especially angry schematic faces, modulate the N170. To achieve this, we used a three-stimulus oddball paradigm in which neutral, angry and happy schematic face stimuli served as the task-irrelevant oddball stimuli, and two scrambled faces served as the task-relevant target and non-target stimuli. We hypothesised that: i) if the N170 is modulated by endogenous processes, then the N170 response should differ for a task-relevant target stimulus as compared to a non-target stimulus as a consequence of selective attention; ii) if schematic faces are representative of real faces, then the N170 response to non-target schematic faces should be greater than to a non-target scrambled face; and iii) given its biological saliency, the threatening schematic face should elicit the greatest N170 response.

    2. Materials and Methods

    2.1. Participants

    Eighteen right-handed staff and students (14 female) from the University of Southampton participated in the study, which received local ethics committee approval. All provided written, informed consent and had normal or corrected-to-normal visual acuity. Ages ranged from 20 to 36 years (mean = 25, SD = 4.34). Two participants reported previous treatment for depression, but both were treatment free at the time of data collection.

    2.2. Stimuli

    Three schematic emotional faces and two scrambled face stimuli were used in the visual oddball paradigm (Figure 1a). The schematic face stimuli were the same as those used by Öhman et al. [41] and served as the task-irrelevant oddball stimuli. The two scrambled faces served as the target and standard stimuli and comprised key features of the schematic face stimuli in random positions and orientations. All stimuli subtended a visual angle of 5.7° × 7.5° and were displayed on a black background at a viewing distance of 60 cm. Stimulus presentation was controlled by Presentation (www.neurobs.com); each stimulus was presented for 130 ms with a 900-1100 ms inter-stimulus interval on a display with a 100 Hz refresh rate.
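The physical stimulus dimensions implied by these viewing parameters follow from the standard visual-angle relation, size = 2·d·tan(θ/2). A minimal sketch (the function name is ours, not from the paper):

```python
import math

def stimulus_size_cm(angle_deg: float, distance_cm: float) -> float:
    # Physical extent of a stimulus subtending angle_deg at distance_cm,
    # from size = 2 * d * tan(theta / 2).
    return 2 * distance_cm * math.tan(math.radians(angle_deg) / 2)

# 5.7 x 7.5 degrees at a 60 cm viewing distance corresponds to a stimulus
# of roughly 6.0 x 7.9 cm on screen.
width = stimulus_size_cm(5.7, 60)
height = stimulus_size_cm(7.5, 60)
```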

    Figure 1. (A) The stimuli used in the three-stimulus oddball task. S1 and S2 were the standard and target stimuli (counter-balanced across participants), and the face stimuli were the task-irrelevant oddball stimuli. (B) Top row: effects of the oddball stimuli on the N170; for the angry face stimulus, planned comparisons revealed that the N170 amplitude was significantly greater than that for the happy or neutral stimuli. Bottom row: effects of the target and standard stimuli on the N170; for the target stimulus, analyses revealed the N170 amplitude to be significantly greater than that for the standard stimulus, but comparable to that observed for the happy and neutral face stimuli.

    2.3. Procedure

    The visual oddball task consisted of three experimental blocks in which 29, 31, or 30 target stimuli, respectively, were embedded in a stream of standard stimuli (n = 200). Participants were asked to mentally count the number of target stimuli presented in each block. For half of the participants the target stimulus was S1 and the standard stimulus S2, and for the remainder vice versa. In every block, an additional 30 oddball face stimuli were also presented (10 angry, 10 happy and 10 neutral), although participants were not informed of this at the beginning of the experiment. Thus each block comprised 259, 261 or 260 stimuli respectively: 29, 31 or 30 target stimuli (S1 or S2, counterbalanced), 200 standard stimuli (S1 or S2, counterbalanced) and 30 schematic faces (10 angry, 10 happy and 10 neutral). Stimulus order was randomised within each block, with the constraint that no two target stimuli and no two emotional stimuli appeared consecutively. Prior to the experiment proper, participants received a brief practice: a stimulus stream comprising four standard and two target stimuli randomly intermixed.
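The block composition and ordering constraint can be sketched as follows. This is a hypothetical reconstruction (the paper does not describe its randomisation algorithm), and placing every rare stimulus in its own gap between standards is slightly stricter than the stated constraint:

```python
import random

def build_block(n_targets, n_standards=200, n_per_face=10, seed=1):
    # Hypothetical reconstruction of one oddball block. Each rare stimulus
    # (target or schematic face) is placed in its own randomly chosen gap
    # between standards, which guarantees that no two targets and no two
    # face stimuli ever appear consecutively.
    rng = random.Random(seed)
    rare = (["target"] * n_targets
            + ["angry"] * n_per_face + ["happy"] * n_per_face
            + ["neutral"] * n_per_face)
    rng.shuffle(rare)
    # n_standards standards leave n_standards + 1 gaps (including both ends);
    # sampling gaps without replacement keeps rare items non-adjacent.
    gaps = dict(zip(sorted(rng.sample(range(n_standards + 1), len(rare))), rare))
    stream = []
    for g in range(n_standards + 1):
        if g in gaps:
            stream.append(gaps[g])
        if g < n_standards:
            stream.append("standard")
    return stream
```

For the first block, `build_block(29)` yields a 259-item stream with 29 targets, 200 standards and 30 faces, matching the composition described above.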

    2.4. EEG Data acquisition and analysis

    EEG was recorded continuously from 21 scalp positions (F7, F3, Fz, F4, F8, FC5, FC6, T7, C3, Cz, C4, T8, CP5, CP6, P7, P3, Pz, P4, P8, O1, O2) against CP2 using Ag/AgCl sintered electrodes mounted on a cap (Easycap GmbH, Germany). Off-line, the channels were re-referenced to the linked mastoids. The EEG was digitised at a sampling rate of 250 Hz with a 0.1 Hz high pass and 70 Hz low pass filter using SynAmps2 amplifiers (Neuroscan, Compumedics, U.S.A). Vertical (VEOG) and horizontal (HEOG) eye movements were recorded from above and below the right eye and the outer canthi respectively. Impedances were kept below 10 kΩ.

    Using Brain Vision Analyzer 1.05 (Brain Products GmbH, Germany), the raw EEG data were filtered (low pass = 25 Hz, 12 dB/oct), segmented (100 ms pre-stimulus to 800 ms post-stimulus), corrected for blinks [47], and screened for additional artefacts. Trials containing artefacts (amplitude deviations of ± 200 μV) were rejected (< 15% of trials in total). All epochs were baseline-corrected to the -100 to 0 ms pre-stimulus interval. Artefact-free EEG epochs were then averaged for each subject, condition and electrode.
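The baseline-correction, rejection and averaging steps can be sketched in NumPy (a minimal illustration under our own conventions, not the Brain Vision Analyzer implementation):

```python
import numpy as np

def average_erp(epochs, times, reject_uv=200.0, baseline=(-0.1, 0.0)):
    # epochs: trials x samples array (microvolts); times: sample times (s).
    # Subtract each trial's mean over the baseline interval...
    base = (times >= baseline[0]) & (times <= baseline[1])
    corrected = epochs - epochs[:, base].mean(axis=1, keepdims=True)
    # ...reject trials exceeding +/- reject_uv anywhere in the epoch...
    keep = np.abs(corrected).max(axis=1) <= reject_uv
    # ...and average the surviving trials into one waveform.
    return corrected[keep].mean(axis=0), keep
```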

    Time intervals and locations chosen for peak detection were based on previous N170 research [48]. Accordingly, the N170 amplitude was determined at P7 and P8 as the maximum negative amplitude within a 100-200 ms post-stimulus window.
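Peak picking of this kind reduces to locating the most negative sample of the averaged waveform within the window (a generic sketch; the function name is ours):

```python
import numpy as np

def n170_peak(avg, times, window=(0.100, 0.200)):
    # Amplitude and latency of the most negative point of an averaged
    # waveform within the given post-stimulus window (seconds).
    mask = (times >= window[0]) & (times <= window[1])
    idx = np.flatnonzero(mask)[np.argmin(avg[mask])]
    return avg[idx], times[idx]
```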

    3. Results

    3.1. Behavioural results

    Ten participants made no errors on the task of mentally counting the target stimulus, a further six demonstrated over 95% mean accuracy, and the remaining two demonstrated 87% and 89% accuracy respectively. For participants who erred, accuracy was calculated from the number of under-count and/or over-count errors. For example, an individual who mentally counted 30, 32 and 29 targets across the three blocks was recorded as having made 3 errors (thus 97% accuracy).
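This scoring rule can be made concrete; the actual per-block target counts (29, 31 and 30) come from the Procedure, and the helper name is ours:

```python
def count_accuracy(counted, actual):
    # Total absolute under/over-count errors, relative to targets presented.
    errors = sum(abs(c - a) for c, a in zip(counted, actual))
    return 100 * (1 - errors / sum(actual))

# The worked example from the text: counts of 30, 32 and 29 against the
# actual 29, 31 and 30 targets give 3 errors out of 90, i.e. ~97% accuracy.
acc = count_accuracy([30, 32, 29], [29, 31, 30])
```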

    3.2. Analysis of N170

    3.2.1. Amplitude analyses

    Figure 1b shows the grand averages of the ERPs to (a) the schematic face stimuli, and (b) the target and standard stimuli, at electrodes P7 and P8; accompanying descriptive statistics are reported in Table 1. At these sites, an analysis of variance (ANOVA) with stimulus type (standard, target, angry, happy, neutral) and hemisphere (left, right) as independent variables and amplitude as the dependent variable revealed main effects of both stimulus type (F(4, 68) = 17.044, p < 0.001; ηp² = 0.50) and hemisphere (F(1, 17) = 7.856, p = 0.012; ηp² = 0.32), but no interaction between these variables (F(4, 68) = 7.450, p > 0.15). For the main effect of hemisphere, the N170 was of higher amplitude in the right hemisphere. For the main effect of stimulus type, Bonferroni-corrected comparisons revealed that: i) the amplitude of the N170 for the standard stimulus was significantly attenuated in comparison to that for all other stimuli (p < 0.001 vs. the angry, happy and neutral face stimuli; p = 0.041 vs. the target stimulus); ii) the amplitude of the N170 for the target stimulus did not differ from that for the happy or neutral stimuli; although iii) it was significantly attenuated in comparison to that elicited by the angry face stimulus (p = 0.021). Considering the research of Hinojosa et al. [30], two Bonferroni-corrected planned comparisons further revealed the angry schematic face stimulus to elicit a greater N170 response than the neutral (t = −2.654, df = 17, p < 0.01, one-tailed) and happy (t = −2.368, df = 17, p = 0.015, one-tailed) schematic face stimuli.
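As a consistency check, partial eta-squared can be recovered from a reported F ratio and its degrees of freedom via the standard identity ηp² = F·df1 / (F·df1 + df2); the values below reproduce the reported 0.50 and 0.32 (the identity is textbook, not taken from the paper):

```python
def partial_eta_squared(f, df_effect, df_error):
    # Standard identity relating an F ratio to partial eta-squared.
    return (f * df_effect) / (f * df_effect + df_error)

stim = partial_eta_squared(17.044, 4, 68)  # stimulus type: ~0.50
hemi = partial_eta_squared(7.856, 1, 17)   # hemisphere: ~0.32
```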

    Table 1. Descriptive statistics (Mean ± SD, μV) for N170 amplitudes as a function of stimulus type and electrode.

    Stimulus Type    P7 Electrode    P8 Electrode    Stimuli per Participant
    Angry            -3.30 ± 3.28    -5.91 ± 5.20    Min–Max, Av: 23–30, 27
    Happy            -2.12 ± 3.01    -4.80 ± 5.13    Min–Max, Av: 22–30, 28
    Neutral          -2.65 ± 2.93    -4.13 ± 4.46    Min–Max, Av: 23–30, 28
    Standard         -0.27 ± 2.14    -1.48 ± 3.19    > 60
    Target           -1.47 ± 2.61    -2.78 ± 3.75    > 200

    Finally, in an exploratory analysis, to verify that the target/standard effect was not driven by one of the two scrambled face stimuli (these were counter-balanced across individuals), we performed an additional analysis with stimulus type (standard, target) and hemisphere (left, right) as within-subjects factors and group (Group 1: S1 = standard, S2 = target; Group 2: S1 = target, S2 = standard) as the between-subjects factor. This analysis again revealed main effects of stimulus type (F(1, 16) = 10.388, p < 0.01; ηp² = 0.40) and hemisphere (F(1, 16) = 9.199, p < 0.01; ηp² = 0.37) but, importantly, no main effect of group and no significant interactions (group × stimulus type, p = 0.729).

    3.2.2. Latency analyses

    An ANOVA with stimulus type (standard, target, angry, happy, neutral) and hemisphere (left, right) as independent variables and latency as the dependent variable revealed no significant effects.

    4. Discussion

    The main aims of this study were, firstly, to further investigate the extent to which endogenous processes, and specifically selective attention, modulate the N170 and, secondly, to investigate the extent to which schematic faces, and especially angry schematic faces, modulate the N170. To this end, we manipulated the valence of task-irrelevant schematic face stimuli and the attentional relevance of target/non-target scrambled face stimuli in a three-stimulus oddball paradigm. Results revealed main effects of both hemisphere (i.e. right-hemisphere dominance) and stimulus type. Importantly, regarding the latter, the N170 elicited by a target scrambled face stimulus was: i) of significantly greater magnitude than that of the (same) standard scrambled non-target face stimulus; ii) of comparable magnitude to that of the neutral and happy schematic face task-irrelevant oddball stimuli; and iii) of significantly lower magnitude than that elicited by the angry face task-irrelevant oddball stimulus. Planned comparisons further revealed that the magnitude of the N170 elicited by the angry schematic face stimulus was significantly greater than that elicited by the task-irrelevant neutral or happy schematic face stimuli. Thus our findings provide additional support for the argument that the N170 response can be modulated by endogenous events, and demonstrate that schematic faces are associated with an N170 neural signature similar to that observed in the literature for real faces.

    In accordance with a growing body of research, the first major finding of the present research was that an endogenous factor, in this case selective attention, can modulate the N170 response. Here, whether a scrambled face stimulus was designated the standard or the target stimulus significantly affected the magnitude of the N170 observed. This finding does not detract from arguments of N170 face-sensitivity per se, but it does suggest that this component can be modulated by higher-order factors, extending research demonstrating modulation of the N170 by expertise [14,15,16] and specific task demands [17,18,19]. Indeed, in our study the magnitude of the N170 component for the scrambled face stimuli was directly dependent upon the immediate task-relevance of the stimulus to the participant, i.e. selective attention, as established by the task instructions participants received. Thus a given scrambled face stimulus, by means of simply being labelled the ‘target’, elicited an N170 response comparable to that observed for the task-irrelevant neutral and happy schematic faces. Considering this, Hinojosa et al. [30] have recently argued that the N170 may be better understood as a ‘correlate of a perceptual representation stage’ reflecting a flexible integration process across several different information sources. Our results speak to this by suggesting that something as simple as ‘tagging’ a stimulus for visual recognition can lead to that percept demonstrating an N170 response. Of course, it could be argued that when attention was drawn to the scrambled face, this in itself caused its ‘face-like’ features to elicit a larger N170 [29,49]. Yet this would not detract from the important role played by endogenous selective attention in this process, because we only observed modulation of the N170 to the scrambled face designated the target, and not to that designated the standard stimulus (counter-balanced across participants).

    Building upon this, the second major finding of the present research was that the expression of the task-irrelevant (i.e. oddball) schematic faces also modulated the N170 response. That is, whilst all our schematic faces elicited a larger N170 response than the non-target scrambled face, the magnitude of the N170 for our schematic angry face was significantly greater than that associated with all other stimuli. This is also in keeping with the results of Hinojosa et al. [30]: in their meta-analysis of 57 facial expression experiments, angry expressions caused by far the greatest increase in N170 amplitude. In accounting for this result, they suggested that angry facial expressions are those most likely to require rapid responding, and as such demand more rapid facial decoding, as reflected by increased N170 magnitude. That we observed a larger N170 for our schematic angry faces than for the schematic neutral or happy faces accords well with their meta-analysis.

    The finding of a larger N170 to the schematic angry faces as compared to the neutral or happy schematic faces also fits well with previous behavioural research utilising the same schematic faces. In such research it has been demonstrated that schematic angry faces, as compared to neutral and happy faces, not only reduce the magnitude of the attentional blink [42,43], but are also more likely to be retained in short-term memory following competition for limited resources [44]. Indeed, Simione et al. [44] have argued that in cases of imminent threat, such as the facial display of anger, the amygdala (and/or related structures) rapidly signals to occipital and occipitotemporal cortices, allowing for the early redistribution of attentional weights to objects in the visual array. This would bias the processing of ‘tagged’ stimuli as reflected not only by rapid changes in amygdala activity [22,31,34] but, perhaps, also via changes in evoked activity markers such as the N170. This would therefore explain the greater modulation of the N170 by our angry schematic faces, given that previous source analysis research has suggested the N170 is generated within the fusiform gyrus [50,51].

    A final result of the present study was the observation of right-hemisphere dominance. This, however, is in keeping with previous research in which foveal stimulus presentation is typically associated with a more pronounced right-hemisphere N170 response [3,51].

    5. Limitations

    Of course, as with most research, our study is not without limitations. Most notably, we did not include a comparable, additional oddball experiment in which the same participants were shown photographs of real emotional faces and their scrambled counterparts. In future research this would be advisable (as well as enlightening), as it may be that schematic faces produce responses broadly comparable to real faces but with differences in amplitude and/or latency. In addition, because our paradigm incorporated a mental counting task, the target data for a number of participants may have included a small number of incorrectly identified targets. In future research this could easily be rectified by including a simple behavioural response (e.g. button-press) task. Finally, it would also be advisable to include more oddball stimuli per face type to minimise the impact of data loss (i.e. errors and artefact rejection).

    6. Conclusions

    To sum, the results of the present research are important in providing additional evidence that the N170 response can be modulated by both task demands and stimulus characteristics. By demonstrating that schematic faces, emotional expression and selective attention can all modulate the N170 response, our results suggest N170 likely reflects the activity of multiple neural sources critical in tagging a stimulus for pattern recognition. Of course, whether these processes are specific to the ‘faceness’ of specific stimuli, or indeed frequency of presentation [52] is still a matter for debate and investigation. However, the additional observation that schematic faces demonstrate similar N170 responses to those previously observed for real faces and, most notably, that angry schematic faces (akin to real angry faces) demonstrate heightened N170 responses, does suggest similarity in the processing of real and schematic faces. Hence, caution should be observed before disregarding schematic facial stimuli in emotion processing research.

    Acknowledgements

    The anonymous reviewers are thanked for their helpful comments, suggestions and advice concerning this manuscript.

    Conflict of Interest

    All authors declare no conflict of interest regarding this paper.

    [1] Bannerman RL, Milders M, de Gelder B, et al. (2009) Orienting to threat: faster localization of fearful facial expressions and body postures revealed by saccadic eye movements. Proc Biol Sci 276(1662): 1635-1641.
    [2] Simon EW, Rosen M, Ponpipom A (1996) Age and IQ as predictors of emotion identification in adults with mental retardation. Res Dev Disabil 17(5): 383-389.
    [3] Eimer M (2011) The face-sensitive N170 component of the event-related brain potential. In: The Oxford Handbook of Face Perception: 329-344.
    [4] Rossion B, Jacques C (2011) The N170: Understanding the time course. In: The Oxford Handbook of Event-Related Potential Components: 115.
    [5] Maurer U, Rossion B, McCandliss BD (2008) Category specificity in early perception: face and word N170 responses differ in both lateralization and habituation properties. Front Hum Neurosci.
    [6] Eimer M, Kiss M, Nicholas S (2010) Response profile of the face-sensitive N170 component: a rapid adaptation study. Cerebral Cortex 312.
    [7] Jacques C, Rossion B (2010) Misaligning face halves increases and delays the N170 specifically for upright faces: Implications for the nature of early face representations. Brain Res 1318: 96-109.
    [8] Itier RJ, Alain C, Sedore K, et al. (2007) Early face processing specificity: It's in the eyes! J Cog Neurosci 19: 1815-1826.
    [9] Itier RJ, Batty M (2009) Neural bases of eye and gaze processing: the core of social cognition. Neurosci Biobehav Rev 33(6): 843-863.
    [10] Dering B, Martin CD, Moro S, et al. (2011) Face-sensitive processes one hundred milliseconds after picture onset. Front Hum Neurosci 5.
    [11] Eimer M (2011) The face-sensitivity of the n170 component. Front Hum Neurosci 5.
    [12] Ganis G, Smith D, Schendan HE (2012) The N170, not the P1, indexes the earliest time for categorical perception of faces, regardless of interstimulus variance. Neuroimage 62(3): 1563-1574.
    [13] Rossion B, Caharel S (2011) ERP evidence for the speed of face categorization in the human brain: Disentangling the contribution of low-level visual cues from face perception. Vision Res 51(12): 1297-1311.
    [14] Dering B, Hoshino N, Theirry G (2013) N170 modulation is expertisedriven: evidence from word-inversion effects in speakers of different languages. Neuropsycholo Trend 13.
    [15] Tanaka JW, Curran T (2001) A Neural Basis for Expert Object Recognition. Psychol Sci 12: 43-47. doi: 10.1111/1467-9280.00308
    [16] Gauthier I, Curran T, Curby KM, et al. (2003) Perceptual interference supports a non-modular account of face processing. Nat Neurosci 6: 428-432. doi: 10.1038/nn1029
    [17] Fan C, Chen S, Zhang L, et al. (2015) N170 changes reflect competition between faces and identifiable characters during early visual processing. NeuroImage 110: 32-38. doi: 10.1016/j.neuroimage.2015.01.047
    [18] Rugg M D, Milner AD, Lines CR, et al. (1987) Modulation of visual event-related potentials by spatial and non-spatial visual selective attention. Neuropsychologia 25: 85-96. doi: 10.1016/0028-3932(87)90045-5
    [19] Schinkel S, Ivanova G, Kurths J, et al. (2014) Modulation of the N170 adaptation profile by higher level factors. Bio Psychol 97: 27-34. doi: 10.1016/j.biopsycho.2014.01.003
    [20] Gong J, Lv J, Liu X, et al. (2008) Different responses to same stimuli. Neuroreport 19.
    [21] Thierry G, Martin CD, Downing P, et al. (2007) Controlling for interstimulus perceptual variance abolishes N170 face selectivity. Nat Neurosci 10: 505-511.
    [22] Vuilleumier P, Pourtois G (2007) Distributed and interactive brain mechanisms during emotion face perception: Evidence from functional neuroimaging. Neuropsychologia 45: 174-194. doi: 10.1016/j.neuropsychologia.2006.06.003
    [23] Munte TF, Brack M, Grootheer O, et al. (1998) Brain potentials reveal the timing of face identity and expression judgments. Neurosci Res 30: 25-34. doi: 10.1016/S0168-0102(97)00118-1
    [24] Eimer M, Holmes A (2007) Event-related brain potential correlates of emotional face processing. Neuropsychologia 45: 15-31. doi: 10.1016/j.neuropsychologia.2006.04.022
    [25] Batty M, Taylor MJ (2003) Early processing of the six basic facial emotional expressions. Cog Brain Res 17: 613-620. doi: 10.1016/S0926-6410(03)00174-5
    [26] Krombholz A, Schaefer F, Boucsein W (2007) Modification of N170 by different emotional expression of schematic faces. Biol Psychol 76: 156-162. doi: 10.1016/j.biopsycho.2007.07.004
    [27] Jiang Y, Shannon RW, Vizueta N, et al. (2009) Dynamics of processing invisible faces in the brain: Automatic neural encoding of facial expression information. Neuroimage 44: 1171-1177. doi: 10.1016/j.neuroimage.2008.09.038
    [28] Hung Y, Smith ML, Bayle DJ, et al. (2010) Unattended emotional faces elicit early lateralized amygdala-frontal and fusiform activations. Neuroimage 50: 727-733. doi: 10.1016/j.neuroimage.2009.12.093
    [29] Pegna AJ, Landis T, Khateb A (2008) Electrophysiological evidence for early non-conscious processing of fearful facial expressions. Int J Psychophysiol 70: 127-136. doi: 10.1016/j.ijpsycho.2008.08.007
    [30] Hinojosa JA, Mercado F, Carretié L (2015) N170 sensitivity to facial expression: A meta-analysis. Neurosci Biobehav Rev.
    [31] Ledoux JE (1996) The emotional brain: The mysterious underpinnings of emotional life. New York: Simon & Schuster.
    [32] Öhman A, Flykt A, Esteves F (2001) Emotion drives attention: Detecting the snake in the grass. J Exper Psychology-General 130: 466-478. doi: 10.1037/0096-3445.130.3.466
    [33] Luo Q, Holroyd T, Jones M, et al. (2007) Neural dynamics for facial threat processing as revealed by gamma band synchronization using MEG. Neuroimage 34(2): 839-847.
    [34] Maratos FA, Mogg K, Bradley BP, et al. (2009) Coarse threat images reveal theta oscillations in the amygdala: a magnetoencephalography study. Cog Affect Behav Neurosci 9(2): 133-143.
    [35] Maratos FA, Senior C, Mogg K, et al. (2012) Early gamma-band activity as a function of threat processing in the extrastriate visual cortex. Cog Neurosci 3(1): 62-68.
    [36] Fox E, Russo R, Dutton K (2002) Attentional bias for threat: Evidence for delayed disengagement from emotional faces. Cog Emotion 16(3): 355-379.
    [37] Gray KLH, Adams WJ, Hedger N, et al. (2013) Faces and awareness: low-level, not emotional factors determine perceptual dominance. Emotion 13(3): 537-544.
    [38] Stein T, Seymour K, Hebart MN, et al. (2014) Rapid fear detection relies on high spatial frequencies. Psychol Sci 25(2): 566-574.
    [39] Öhman A, Soares SC, Juth P, et al. (2012) Evolutionary derived modulations of attention to two common fear stimuli: Serpents and hostile humans. J Cog Psychol 24(1): 17-32.
    [40] Dickins DS, Lipp OV (2014) Visual search for schematic emotional faces: angry faces are more than crosses. Cog Emotion 28(1): 98-114.
    [41] Öhman A, Lundqvist D, Esteves F (2001) The face in the crowd revisited: a threat advantage with schematic stimuli. J Personal Soc Psychol 80: 381-396.
    [42] Maratos FA, Mogg K, Bradley BP (2008) Identification of angry faces in the attentional blink. Cog Emotion 22(7): 1340-1352.
    [43] Maratos FA (2011) Temporal processing of emotional stimuli: the capture and release of attention by angry faces. Emotion 11(5): 1242.
    [44] Simione L, Calabrese L, Marucci FS, et al. (2014) Emotion based attentional priority for storage in visual short-term memory. PloS one 9(5): e95261.
    [45] Pinkham AE, Griffin M, Baron R, et al. (2010) The face in the crowd effect: anger superiority when using real faces and multiple identities. Emotion 10(1): 141.
    [46] Stein T, Sterzer P (2012) Not just another face in the crowd: detecting emotional schematic faces during continuous flash suppression. Emotion 12(5): 988.
    [47] Gratton G, Coles MGH, Donchin E (1983) A new method for off-line removal of ocular artifact. Electroencephalogr Clin Neurophysiol 55:468-474 doi: 10.1016/0013-4694(83)90135-9
    [48] Kolassa IT, Musial F, Kolassa S, et al. (2006) Event-related potentials when identifying or color-naming threatening schematic stimuli in spider phobic and non-phobic individuals. BMC Psychiatry 6(38).
    [49] Babiloni C, Vecchio F, Buffo P, et al. (2010). Cortical responses to consciousness of schematic emotional facial expressions: A high‐resolution EEG study. Hum Brain Map 31(10): 1556-1569. doi: 10.1002/hbm.20958
    [50] Deffke I, Sander T, Heidenreich J, et al. (2007) MEG/EEG sources of the 170 ms response to faces are co-localized in the fusiform gyrus. Neuroimage 35(4): 1495-1501.
    [51] Luo S, Luo W, He W, et al. (2013) P1 and N170 components distinguish human-like and animal-like makeup stimuli. Neuroreport 24(9): 482-486.
    [52] Mercure E, Cohen Kadosh K, Johnson M (2011) The N170 shows differential repetition effects for faces, objects, and orthographic stimuli. Front Hum Neurosci 5(6).
  • © 2015 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)