Studying the Functional Equivalence of Mechanisms for Emotion Perception and Emotion Generation by Means of Evoked Potentials
Introduction. The hypothesis of functional equivalence has been confirmed for many functional systems of the brain. However, whether the mechanisms involved in emotion perception and emotion reproduction are equivalent remains an open question. This study attempts to define and compare the dynamics of brain structure activity during the perception and generation of emotions. The novelty of the research lies in comparing the dynamics of evoked activity when the same respondents both perceive and generate emotions within a single experiment.
Methods. The respondents were shown photographs of various facial expressions, as well as objects evoking various emotions. They had to identify the emotions expressed in the photographs of faces and to recognize the emotions evoked by the objects. Their answers made it possible to group and average the EEG fragments and extract evoked potentials (for neutral, positive, and negative faces and objects). The EEG was recorded from 128 derivations, which allowed the trajectories of the foci of maximal activity to be determined using sLORETA. Responses to faces and objects of the same emotional valence were compared.
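The grouping-and-averaging step described above is the standard way evoked potentials are extracted from single-trial EEG. The following is a minimal illustrative sketch, not the authors' actual pipeline: simulated epochs are grouped by a hypothetical response label and averaged per condition, with array dimensions chosen to match the 128 derivations mentioned in the study.

```python
import numpy as np

# Simulated data standing in for segmented EEG fragments (epochs).
# Shapes and labels are illustrative assumptions, not the study's real data.
rng = np.random.default_rng(0)
n_epochs, n_channels, n_samples = 60, 128, 256  # 128 derivations, as in the study
epochs = rng.normal(size=(n_epochs, n_channels, n_samples))
labels = rng.choice(["neutral", "positive", "negative"], size=n_epochs)

# Average all epochs sharing a label to obtain one evoked potential (ERP)
# per condition; averaging cancels activity not time-locked to the stimulus.
erps = {cond: epochs[labels == cond].mean(axis=0)
        for cond in ("neutral", "positive", "negative")}

for cond, erp in erps.items():
    print(cond, erp.shape)  # each ERP is channels x samples
```

In practice such averaging is usually done with a dedicated toolbox (the authors cite EEGLAB), but the underlying operation is exactly this per-condition mean across trials.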
Results. For the first time, responses to visual stimuli expressing emotion and to stimuli evoking emotion were recorded and compared within a single experiment. The N170 wave reflected the differences between evoked responses to different facial expressions and to objects of different emotional valence. Analysis of the trajectories of the foci of maximal activity during the development of reactions to faces and objects revealed no crossing.
Discussion. The experimental design made it possible to separate the dynamics of the compared responses. If a mirror mechanism exists, the two processes could coincide only at a certain stage. The analysis of the results, however, demonstrated no crossing of the compared processes.
Conclusion. The findings showed no evidence of functional equivalence between the mechanisms for emotion recognition and emotion generation.
This work is licensed under a Creative Commons Attribution 4.0 International License.