Significance

Emotions coordinate our behavior and physiological states during survival-salient events and pleasurable interactions. Even though we are often consciously aware of our current emotional state, such as anger or happiness, the mechanisms giving rise to these subjective sensations have remained unresolved. Here we used a topographical self-report tool to reveal that different emotional states are associated with topographically distinct and culturally universal bodily sensations; these sensations could underlie our conscious emotional experiences. Monitoring the topography of emotion-triggered bodily sensations brings forth a unique tool for emotion research and could even provide a biomarker for emotional disorders.

Abstract

Emotions are often felt in the body, and somatosensory feedback has been proposed to trigger conscious emotional experiences. Here we reveal maps of bodily sensations associated with different emotions using a unique topographical self-report method. In five experiments, participants (n = 701) were shown two silhouettes of bodies alongside emotional words, stories, movies, or facial expressions. They were asked to color the bodily regions whose activity they felt increasing or decreasing while viewing each stimulus. Different emotions were consistently associated with statistically separable bodily sensation maps across experiments. These maps were concordant across West European and East Asian samples. Statistical classifiers distinguished emotion-specific activation maps accurately, confirming independence of topographies across emotions. We propose that emotions are represented in the somatosensory system as culturally universal categorical somatotopic maps. Perception of these emotion-triggered bodily changes may play a key role in generating consciously felt emotions.
We often experience emotions directly in the body. When strolling through the park to meet with our sweetheart we walk lightly with our hearts pounding with excitement, whereas anxiety might tighten our muscles and make our hands sweat and tremble before an important job interview. Numerous studies have established that emotion systems prepare us to meet challenges encountered in the environment by adjusting the activation of the cardiovascular, skeletomuscular, neuroendocrine, and autonomic nervous system (ANS) (1). This link between emotions and bodily states is also reflected in the way we speak of emotions (2): a young bride getting married next week may suddenly have “cold feet,” severely disappointed lovers may be “heartbroken,” and our favorite song may send “a shiver down our spine.”
Both classic (3) and more recent (4, 5) models of emotional processing assume that subjective emotional feelings are triggered by the perception of emotion-related bodily states that reflect changes in the skeletomuscular, neuroendocrine, and autonomic nervous systems (1). These conscious feelings help individuals voluntarily fine-tune their behavior to better match the challenges of the environment (6). Although emotions are associated with a broad range of physiological changes (1, 7), it is still hotly debated whether the bodily changes associated with different emotions are specific enough to serve as the basis for discrete emotional feelings, such as anger, fear, or happiness (8, 9), and the topographical distribution of emotion-related bodily sensations has remained unknown.
Here we reveal maps of bodily sensations associated with different emotions using a unique computer-based, topographical self-report method (emBODY, Fig. 1). Participants (n = 701) were shown two silhouettes of bodies alongside emotional words, stories, movies, or facial expressions, and they were asked to color the bodily regions whose activity they felt to be increased or decreased during viewing of each stimulus. Different emotions were associated with statistically clearly separable bodily sensation maps (BSMs) that were consistent across West European (Finnish and Swedish) and East Asian (Taiwanese) samples, all speaking their respective languages. Statistical classifiers discriminated emotion-specific activation maps accurately, confirming independence of bodily topographies across emotions. We propose that consciously felt emotions are associated with culturally universal, topographically distinct bodily sensations that may support the categorical experience of different emotions.
Fig. 1.
The emBODY tool. Participants colored the initially blank body regions (A) whose activity they felt increasing (left body) and decreasing (right body) during emotions. Subjectwise activation–deactivation data (B) were stored as integers, with the whole body being represented by 50,364 data points. Activation and deactivation maps were subsequently combined (C) for statistical analysis.

Results

We ran five experiments, with 36–302 participants in each. In experiment 1, participants reported bodily sensations associated with six “basic” and seven nonbasic (“complex”) emotions, as well as a neutral state, all described by the corresponding emotion words. Fig. 2 shows the bodily sensation maps associated with each emotion. Linear discriminant analysis (LDA) classified each of the basic emotions and the neutral state against all of the other emotions (one-out classification) with a mean accuracy of 72% (chance level 50%), whereas complete classification (discriminating all emotions from each other) was accomplished with a mean accuracy of 38% (chance level 14%) (Fig. 3 and Table S1). For nonbasic emotions, the corresponding accuracies were 72% and 36%. When classifying all 13 emotions and a neutral emotional state, the accuracies were 72% and 24% against 50% and 7% chance levels, respectively. In cluster analysis (Fig. 4, Upper), the positive emotions (happiness, love, and pride) formed one cluster, whereas negative emotions diverged into four clusters (anger and fear; anxiety and shame; sadness and depression; and disgust, contempt, and envy). Surprise, which is neither a negative nor a positive emotion, belonged to the last cluster, whereas the neutral emotional state remained distinct from all other categories.
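For reference, the chance levels quoted above follow from the number of response options in each scheme; a minimal arithmetic sketch (assuming the binary one-out comparison is balanced, as the 50% chance level implies):

```python
# Chance level for one-out classification (target emotion vs. all others pooled)
# is 1/2, assuming the two classes are balanced; for complete classification it
# is 1/k, where k is the number of emotion categories including the neutral state.
def chance_levels(n_categories):
    return 0.5, 1.0 / n_categories

print(chance_levels(7))    # 6 basic emotions + neutral  -> (0.5, ~0.14)
print(chance_levels(14))   # 13 emotions + neutral       -> (0.5, ~0.07)
```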
Fig. 2.
Bodily topography of basic (Upper) and nonbasic (Lower) emotions associated with words. The body maps show regions whose activation increased (warm colors) or decreased (cool colors) when feeling each emotion (P < 0.05, FDR corrected; t > 1.94). The color bar indicates the range of the t statistic.
Fig. 3.
Confusion matrices for the complete classification scheme across experiments.
Fig. 4.
Hierarchical structure of the similarity between bodily topographies associated with emotion words in experiment 1 (Upper) and basic emotions across experiments with word (W), story (S), movie (M), and face (F) stimuli (Lower).
We controlled for linguistic confounds of figurative language associated with emotions (e.g., “heartache”) in a control experiment with native speakers of Swedish, which, as a Germanic language, belongs to a different language family than Finnish (a Uralic language). BSMs associated with each basic emotion word were similar across the Swedish- and Finnish-speaking samples (mean rs = 0.75), and correlations between mismatched emotions across the two experiments (e.g., anger-Finnish vs. happiness-Swedish) were significantly lower (mean rs = 0.36) than those for matching emotions.
To test whether the emotional bodily sensations reflect culturally universal sensation patterns rather than conceptual associations between emotions and corresponding bodily changes specific to West European cultures, we conducted another control experiment with Taiwanese individuals, who have a different cultural background (Finnish: West European; Taiwanese: East Asian) and speak a language belonging to a family distant from Finnish (Taiwanese Hokkien, a Chinese language). Supporting the cultural universality hypothesis, BSMs associated with each basic emotion were similar across the West European and East Asian samples (mean rs = 0.70), and correlations between mismatched emotions across the two experiments (e.g., anger-Finnish vs. happiness-Taiwanese) were significantly lower (mean rs = 0.40) than those for matching emotions.
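For illustration, such matched versus mismatched concordance can be computed as sketched below; the array names, shapes, and data are invented, and the sketch does not reproduce the original analysis code:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical group-level BSMs: one vector of pixel values per basic emotion,
# for two samples (e.g., Finnish and Taiwanese). Shapes and data are simulated.
rng = np.random.default_rng(0)
emotions = ["anger", "fear", "disgust", "happiness", "sadness", "surprise"]
bsm_finnish = rng.normal(size=(len(emotions), 50364))
bsm_taiwanese = bsm_finnish + rng.normal(scale=2.0, size=bsm_finnish.shape)

# Spearman correlation between every pair of emotion maps across the two samples.
n = len(emotions)
corr = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        corr[i, j], _ = spearmanr(bsm_finnish[i], bsm_taiwanese[j])

matched = corr.diagonal().mean()                  # same emotion in both samples
mismatched = corr[~np.eye(n, dtype=bool)].mean()  # different emotions
print(f"matched r_s = {matched:.2f}, mismatched r_s = {mismatched:.2f}")
```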
When people recall bodily sensations associated with emotion categories described by words, they could simply be reporting stereotypes of bodily responses associated with emotions. To control for this possibility, we directly induced emotions in participants using two of the most powerful emotion induction techniques (10, 11), guided mental imagery based on reading short stories (experiment 2) and viewing of movies (experiment 3), and asked them to report their bodily sensations online during the emotion induction. We ensured that neither emotion categories nor specific bodily sensations were directly mentioned in the stories or movies, and the actual emotional content of the stories (Fig. S1) was evaluated by another group of 72 subjects (see ref. 12 for corresponding data on movies). BSMs were similar to those obtained in experiment 1 with emotion words (Figs. S2 and S3). The LDA accuracy was high (stories: 79% and 48% against 50% and 14% chance levels for one-out and complete classification, respectively; movies: 76% and 50% against 50% and 20% chance levels). The BSMs were also highly concordant across emotion-induction conditions (stories vs. movies; mean rs = 0.79; Table S2).
Models of embodied emotion posit that we understand others’ emotions by simulating them in our own bodies (13, 14), meaning that we should be able to construct bodily representations of others’ somatovisceral states when observing them expressing specific emotions. We tested this hypothesis in experiment 4 by presenting participants with pictures of six basic facial expressions without telling them what emotions (if any) the faces reflected and asking them to color BSMs for the persons shown in the pictures, rather than the sensations that viewing the expressions caused in themselves. Again, statistically separable BSMs were observed for the emotions (Fig. S4), and the classifier accuracy was high (70% and 31% against 50% and 14% chance levels for one-out and complete classification schemes, respectively; Fig. 3 and Table S1). Critically, the obtained BSMs were highly consistent (Table S2) with those elicited by emotional words (mean rs = 0.82), stories (mean rs = 0.71), and movies (mean rs = 0.78).
If discrete emotional states were associated with distinct patterns of experienced bodily sensations, then one would expect that observers could also recognize emotions from the BSMs of others. In experiment 5, we presented 87 independent participants with the BSMs of each basic emotion from experiment 1 in a paper-and-pencil forced-choice recognition test. The participants performed at a level similar to the LDA, with a mean accuracy of 46% (vs. 14% chance level). Anger (58%), disgust (43%), happiness (22%), sadness (38%), surprise (54%), and the neutral state (99%) were recognized above chance level (P < 0.05 against chance level, χ2 test), whereas performance did not exceed the chance level for fear (8%, NS).
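A per-emotion recognition rate can be tested against the 1/7 chance level with a χ2 test, as in the following sketch; the counts used here are made up, since the per-emotion response counts are not listed above:

```python
from scipy.stats import chisquare

# Hypothetical example: suppose 50 of n = 87 respondents chose the correct label
# for a given BSM; under chance, 1/7 of the responses would be correct.
n, n_correct = 87, 50
observed = [n_correct, n - n_correct]
expected = [n / 7, 6 * n / 7]
stat, p = chisquare(observed, f_exp=expected)
print(f"chi2 = {stat:.1f}, p = {p:.2g}")
```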
Finally, we constructed a similarity matrix spanning the BSMs of experiments 1–4 for the six basic emotions plus the neutral emotional state (Fig. S5). BSMs were consistent across the experiments (mean rs = 0.83) for each basic emotion. Even though there were significant correlations between mismatching emotions across the experiments (e.g., anger in experiment 1 and fear in experiment 2), these were significantly lower (mean rs = 0.52) than those for the matching emotions. Clustering of the similarity matrix revealed a clear hierarchical structure in the data (Fig. 4, Lower). Sadness, disgust, fear, and the neutral emotional state each separated early on as their own clusters. Anger topographies in the word and face experiments clustered together, whereas those in the story experiment were initially combined with disgust. Two of the surprise maps clustered together, whereas the surprise map from the word experiment was linked with disgust. Only happiness did not cluster clearly across the experiments.
When LDA was applied to the dataset combined across experiments, the mean accuracy for complete classification was similar to that in the individual experiments (40% against 14% chance level). LDA using all possible pairs of experiments as training and test datasets resulted in cross-experiment classification rates exceeding 50% for all tested experiment pairs (Table S3), confirming the high concordance of the BSMs across experiments.
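The cross-experiment test amounts to fitting the dimensionality reduction and classifier on one experiment and scoring them on another; a minimal sketch with simulated data and assumed shapes (scikit-learn is used here as one possible implementation):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Simulated subjectwise BSMs from two experiments (e.g., word and story stimuli):
# each row is one subject's map for one emotion; labels y_* give the emotion.
rng = np.random.default_rng(1)
n_pixels, n_emotions, n_subj = 5000, 7, 20   # reduced pixel count for the demo
y_words = np.repeat(np.arange(n_emotions), n_subj)
y_stories = np.repeat(np.arange(n_emotions), n_subj)
X_words = rng.normal(size=(y_words.size, n_pixels)) + y_words[:, None]
X_stories = rng.normal(size=(y_stories.size, n_pixels)) + y_stories[:, None]

# Fit PCA and LDA on the training experiment, then test on the other experiment.
pca = PCA(n_components=30).fit(X_words)
clf = LinearDiscriminantAnalysis().fit(pca.transform(X_words), y_words)
print("cross-experiment accuracy:", clf.score(pca.transform(X_stories), y_stories))
```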

Discussion

Altogether our results reveal distinct BSMs associated with both basic and complex emotions. These maps constitute the most accurate description available to date of subjective emotion-related bodily sensations. Our data highlight that consistent patterns of bodily sensations are associated with each of the six basic emotions and that these sensations are represented in a categorical manner in the body. The distinct BSMs are in line with evidence from brain imaging and behavioral studies highlighting the categorical structure of emotion systems and of the neural circuits supporting emotional processing (15, 16), and they suggest that information regarding different emotions is also represented in an embodied somatotopic format.
The discernible sensation patterns associated with each emotion correspond well with the major changes in physiological functions associated with different emotions (17). Most basic emotions were associated with sensations of elevated activity in the upper chest area, likely corresponding to changes in breathing and heart rate (1). Similarly, sensations in the head area were shared across all emotions, probably reflecting both physiological changes in the facial area (i.e., facial musculature activation, skin temperature, lacrimation) and felt changes in the contents of mind triggered by the emotional events. Sensations in the upper limbs were most prominent in the approach-oriented emotions, anger and happiness, whereas sensations of decreased limb activity were a defining feature of sadness. Sensations in the digestive system and around the throat region were found mainly in disgust. In contrast with all of the other emotions, happiness was associated with enhanced sensations all over the body. The nonbasic emotions were associated with markedly weaker and less spatially distinct sensations; the notable exception was the high degree of similarity between the maps for fear and sadness and those for their respective prolonged, clinical variants, anxiety and depression.
All cultures have body-related expressions for describing emotional states. Many of these (e.g., having “butterflies in the stomach”) are metaphorical and do not describe actual physiological changes associated with the emotional response (18). It is thus possible that our findings reflect purely conceptual associations based on language-related stereotypes that link emotions with bodily sensations (19). When activated, such a conceptual link, rather than actual underlying physiological changes, could guide the individual in constructing a mental representation of the associated bodily sensations (9). However, we do not subscribe to this argument. First, all four types of verbal and nonverbal stimuli brought about concordant BSMs, suggesting that emotion semantics and stereotypes played a minor role. Second, consistent BSMs were obtained when participants were asked to report their actual online bodily sensations during emotions induced by viewing movies or reading stories (whose emotional categories were not indicated), thus ruling out high-level cognitive inferences and stereotypes. Third, a validation study with participants speaking Swedish, a language distant from Finnish, replicated the original findings, suggesting that linguistic confounds such as figurative language associated with the emotions cannot explain the findings. Fourth, bodily sensation maps were also concordant across West European (Finland) and East Asian (Taiwan) cultures (mean rs = 0.70), clearly exceeding the canonical limit for “strong” concordance. Thus, BSMs likely reflect universal sensation patterns triggered by activation of the emotion systems, rather than culturally specific conceptual predictions and associations between emotional semantics and bodily sensation patterns. Despite these considerations, the present study cannot completely rule out the possibility that the BSMs nevertheless reflect conceptual associations between emotions and bodily sensations that are independent of culture. However, where would such conceptual associations originate, and why would they be so similar across people with very different cultural and linguistic backgrounds? A plausible answer again points to a biological basis for these associations.
Prior work suggests that voluntary reproduction of physiological states associated with emotions, such as breathing patterns (20) or facial expressions (21), induces subjective feelings of the corresponding emotion. Similarly, voluntary production of facial expressions of emotions produces differential changes in physiological parameters such as heart rate, skin conductance, finger temperature, and muscle tension, depending on the generated expression (22). However, individuals are poor at detecting specific physiological states beyond, perhaps, heartbeat and palm sweating. Moreover, emotional feelings are only modestly associated with specific changes in heart rate or skin conductance (23), and physiological data have not revealed consistent emotion-specific patterns of bodily activation, with some recent reviews pointing to high unspecificity (9) and others to high specificity (8). Our data reconcile these opposing views by revealing that even though changes in specific physiological systems may be difficult to access consciously, the net sensations arising from multiple physiological systems during different emotions are topographically distinct. The obtained BSMs thus likely reflect a compound measure of skeletomuscular and visceral sensations, as well as effects of the autonomic nervous system, which individuals cannot separate from one another. As several subareas of the human cortical somatosensory network contain somatotopic representations of the body (24), specific combinations of somatosensory and visceral afferent inputs could play a central role in building up emotional feelings. It must nevertheless be emphasized that we do not argue that the BSMs highlighted in this series of experiments are the only components underlying emotional experience. Rather, they could reflect the most reliable and systematic consciously accessible bodily states during emotional processing, even though they may not relate directly to specific physiological changes.
These topographically distinct bodily sensations of emotions may also support recognition of others’ emotional states: the BSMs associated with others’ facial expressions were significantly correlated with the corresponding BSMs elicited by emotional words, text passages, and movies in independent participants. Participants also recognized emotions from the mean BSMs of other subjects. Functional brain imaging has established that the primary somatosensory cortices are engaged during emotional perception and emotional contagion (25, 26), and that their damage (27) or inactivation by transcranial magnetic stimulation (28) impairs recognition of others’ emotions. Consequently, emotional perception could involve automatic activation of sensorimotor representations of the observed emotions, which would subsequently be used for affective evaluation of the actual sensory input (13, 29). The present study nevertheless cannot establish a direct link between the BSMs and an underlying physiological activation pattern. Even though whole-body physiological responses cannot be mapped with conventional psychophysiological techniques, whole-body perfusion during induced emotions could in the future be measured with whole-body 15O-H2O PET imaging. These maps could then be correlated with the BSMs to investigate the relationship between experienced regional bodily sensations and physiological activity during emotional episodes.

Conclusions

We conclude that emotional feelings are associated with discrete, yet partially overlapping maps of bodily sensations, which could be at the core of the emotional experience. These results thus support models assuming that somatosensation (25, 27) and embodiment (13, 14) play critical roles in emotional processing. Unraveling the subjective bodily sensations associated with human emotions may help us to better understand mood disorders such as depression and anxiety, which are accompanied by altered emotional processing (30), ANS activity (31, 32), and somatosensation (33). Topographical changes in emotion-triggered sensations in the body could thus provide a novel biomarker for emotional disorders.

Materials and Methods

Participants.

A total of 773 individuals took part in the study (experiment 1a: n = 302, Mage = 27 y, 261 females; experiment 1b: n = 52, Mage = 27 y, 44 females; experiment 1c: n = 36, Mage = 27 y, 21 females; experiment 2: n = 108, Mage = 25 y, 97 females; experiment 3: n = 94, Mage = 25 y, 80 females; experiment 4: n = 109, Mage = 28 y, 92 females; and experiment 5: n = 72, Mage = 39 y, 53 females). All participants were Finnish speaking except those participating in experiment 1b who spoke Swedish and those participating in experiment 1c who spoke Taiwanese Hokkien as their native languages.

Stimuli.

Experiment 1 a–c: emotion words.

Participants evaluated their bodily sensations (BSMs) associated with six basic (anger, fear, disgust, happiness, sadness, and surprise) and seven nonbasic emotions (anxiety, love, depression, contempt, pride, shame, and envy) as well as a neutral state. Each word was presented once in random order. The participants’ task was to evaluate which bodily regions they typically felt becoming activated or deactivated when feeling each emotion; thus the task did not involve inducing actual emotions in the participants. Experiment 1a was conducted using Finnish words and Finnish-speaking participants, experiment 1b with corresponding Swedish words and Swedish-speaking participants, and experiment 1c with Taiwanese words and Taiwanese-speaking participants. For the Swedish and Taiwanese variants, the Finnish emotion words and instructions were first translated to Swedish/Taiwanese by a native speaker and then backtranslated to Finnish to ensure semantic correspondence.

Experiment 2: guided emotional imagery.

Participants rated bodily sensations triggered by reading short stories (vignettes) describing brief emotional and nonemotional episodes. Each vignette elicited primarily one basic emotion (or a neutral emotional state), and five vignettes per emotion category were presented in random order. Such text-driven emotion induction triggers heightened responses in the somatosensory and autonomic nervous systems (34) as well as brain activation (35), consistent with affective engagement. The vignettes were generated in a separate pilot experiment. Following the approach of Matsumoto et al. (36), each emotional vignette described an antecedent event triggering prominently one emotional state. Importantly, none of the vignettes described the actual emotional feelings, behavior, or bodily actions of the protagonist, thus providing no direct clues about the emotion or bodily sensations associated with the story [e.g., “It’s a beautiful summer day. You drive to the beach with your friends in a convertible and the music is blasting from the stereo” (happy); “You sit by the kitchen table. The dishwasher is turned on” (neutral); “While visiting the hospital, you see a dying child who can barely keep her eyes open” (sad)]. Normative data were acquired from 72 individuals. In the vignette evaluation experiment, the vignettes were presented one at a time in random order on a computer screen. Participants were asked to read each vignette carefully and report on a scale ranging from 1 to 5 the experience of each basic emotion (and a neutral emotional state) triggered by the vignette. The data revealed that the vignettes were successful in eliciting the targeted, discrete emotional states. For each vignette, the rating of the target emotion category was higher than that of any other emotion category (P < 0.001; Fig. S1). K-means clustering also classified each vignette reliably into its a priori target category, Fs (6, 28) > 36.54, P < 0.001.
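For illustration, the norming check can be sketched as follows: each vignette yields a seven-dimensional rating profile (six basic emotions plus neutral), and k-means with k = 7 should recover the a priori categories. The data below are simulated, not the actual norming ratings:

```python
import numpy as np
from sklearn.cluster import KMeans

# Simulated norming data: 35 vignettes (5 per category), each rated on a 1-5 scale
# for six basic emotions plus a neutral state, giving a 7-dimensional profile.
rng = np.random.default_rng(2)
n_categories, n_per_cat = 7, 5
target = np.repeat(np.arange(n_categories), n_per_cat)
profiles = rng.uniform(1, 2, size=(target.size, n_categories))
profiles[np.arange(target.size), target] += 3     # target emotion rated highest

# K-means with k = 7 should assign the vignettes of each a priori category to a
# common cluster (cluster labels are arbitrary up to permutation).
labels = KMeans(n_clusters=n_categories, n_init=10, random_state=0).fit_predict(profiles)
for cat in range(n_categories):
    print(f"target category {cat}: clusters {labels[target == cat]}")
```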

Experiment 3: emotional movies.

The stimuli were short 10-s movies eliciting discrete emotional states. They were derived from an fMRI study assessing the brain basis of discrete emotions, in which they were shown to trigger a reliable pattern of discrete emotional responses (12). Given the inherent difficulty of eliciting anger and surprise with movie stimuli (37), these emotions were excluded from the study. Five stimuli were chosen for each emotion category (fear, disgust, happiness, sadness, and neutral). Each film depicted humans involved in either emotional or nonemotional activities. The films were shown one at a time in random order without sound. Participants could replay each movie and were encouraged to view it as many times as needed to decide what kind of responses it elicited in them.

Experiment 4: embodying emotions from facial expressions.

The stimuli were pictures of basic facial expressions (anger, fear, disgust, happiness, sadness, and surprise) and a neutral emotional state, each posed by two male and two female actors chosen from the Karolinska facial expressions set (38).

Experiment 5: recognizing emotions from emBODY BSMs.

The stimuli were unthresholded emBODY BSMs for each basic emotion averaged over the 302 participants in experiment 1a.

Data Acquisition.

Data were acquired online with the emBODY instrument (Fig. 1), developed for the purposes of this study. In this computerized tool, participants were shown two silhouettes of a human body with an emotional stimulus between them. The bodies were abstract and 2D to lower the cognitive load of the task and to encourage evaluating only the spatial pattern of sensations. The bodies did not contain pointers to internal organs, to avoid triggering purely conceptual associations between emotions and specific body parts (e.g., love–heart). Participants were asked to inspect the stimulus and use a mouse to paint the bodily regions they typically felt becoming activated (on the left body) or deactivated (on the right body) when viewing it. Painting was dynamic: successive strokes on a region increased the opacity of the paint, and the diameter of the painting tool was 12 pixels. Finished images were stored in matrices in which the paint intensity ranged from 0 to 100. Both bodies were represented by 50,364 pixels. When multiple stimuli from one category were used (experiments 2–4), subjectwise data were averaged across the stimuli eliciting each emotional state before the random effects analysis. In experiment 4, instead of evaluating the sensations that the faces triggered in themselves, participants were asked to rate what the persons shown in the pictures were feeling in their bodies.
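In terms of data handling, a subjectwise BSM for one emotion could be assembled roughly as sketched below; the combination rule (simple subtraction of the deactivation image from the activation image) and the mask are assumptions made for illustration:

```python
import numpy as np

# Hypothetical single-subject data for one stimulus: paint intensities (0-100) for
# the activation (left) and deactivation (right) bodies, plus a boolean mask of the
# pixels lying inside the body silhouette. How activations and deactivations were
# combined is not spelled out above; subtraction is assumed here for illustration.
n_pixels = 50364
rng = np.random.default_rng(3)
activation = rng.integers(0, 101, n_pixels).astype(float)
deactivation = rng.integers(0, 101, n_pixels).astype(float)
body_mask = rng.random(n_pixels) < 0.6     # placeholder for the real silhouette mask

# Signed BSM: positive = reported activation, negative = reported deactivation;
# anything painted outside the body outline is masked out.
bsm = np.where(body_mask, activation - deactivation, 0.0)

# With several stimuli per emotion category (experiments 2-4), the subjectwise maps
# are averaged across the stimuli of that category before the random effects analysis.
stimulus_bsms = np.stack([bsm, bsm, bsm])  # placeholder for the maps of one category
subject_bsm = stimulus_bsms.mean(axis=0)
print(subject_bsm.shape)
```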
In experiment 5, participants were asked to recognize the average heatmaps of the basic emotions and the neutral emotional state, computed over the 302 respondents of experiment 1a. The heatmaps were color printed on a questionnaire sheet alongside the instructions, six emotion words, and the word “neutral.” The participants were asked to associate each heatmap with the word that described it best. Two different randomized orders of the heatmaps and words were used to avoid order effects.

Statistical Analysis.

Data were screened manually for anomalous painting behavior (e.g., drawing symbols on bodies or scribbling randomly). Moreover, participants who left more bodies untouched than the group mean + 2.5 SDs were removed from the sample. Next, subjectwise activation and deactivation maps for each emotion were combined into single BSMs representing both activations and deactivations, and responses outside the body area were masked. In random effects analyses, mass univariate t tests were then used on the subjectwise BSMs to compare the pixelwise activations and deactivations for each emotional state against zero. This resulted in statistical t-maps in which pixel intensities reflect statistically significant experienced bodily changes associated with each emotional state. Finally, false discovery rate (FDR) correction with an alpha level of 0.05 was applied to the statistical maps to control for false positives due to multiple comparisons.
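A minimal sketch of this pixelwise random effects analysis, using a one-sample t test against zero followed by Benjamini–Hochberg FDR correction (scipy and statsmodels are used here as one possible implementation; the data are simulated):

```python
import numpy as np
from scipy.stats import ttest_1samp
from statsmodels.stats.multitest import multipletests

# Hypothetical subjectwise combined BSMs for one emotion: shape (n_subjects, n_pixels),
# positive values = felt activation, negative values = felt deactivation.
rng = np.random.default_rng(4)
bsms = rng.normal(loc=0.2, scale=1.0, size=(100, 50364))

# Mass univariate one-sample t tests: each pixel is tested against zero.
t_vals, p_vals = ttest_1samp(bsms, popmean=0.0, axis=0)

# FDR correction (Benjamini-Hochberg) at alpha = 0.05; keep t values only for
# pixels surviving correction, as in the thresholded statistical t-maps.
reject, _, _, _ = multipletests(p_vals, alpha=0.05, method="fdr_bh")
t_map = np.where(reject, t_vals, 0.0)
print(f"{reject.mean():.1%} of pixels survive FDR correction")
```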
To test whether different emotions are associated with statistically different bodily patterns, we used statistical pattern recognition with LDA after first reducing the dimensionality of the dataset to 30 principal components with principal component analysis. To estimate generalization accuracy, we used stratified 50-fold cross-validation in which the classifier was trained to recognize either one emotion against all of the others (one-out classification) or all emotions against each other (complete classification). To estimate SDs of classifier accuracy, the cross-validation scheme was run iteratively 100 times.
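A sketch of this classification pipeline (PCA to 30 components followed by LDA, evaluated with stratified 50-fold cross-validation) is given below; scikit-learn is used as one possible implementation, the data are simulated with a reduced pixel count, and the 100 repetitions used to estimate SDs are omitted:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import StratifiedKFold, cross_val_score

# Simulated subjectwise BSMs: (n_subjects * n_emotions, n_pixels) with labels y.
rng = np.random.default_rng(5)
n_emotions, n_subj, n_pixels = 7, 50, 5000   # reduced pixel count to keep the demo fast
y = np.repeat(np.arange(n_emotions), n_subj)
X = rng.normal(size=(y.size, n_pixels)) + 0.5 * y[:, None]

clf = make_pipeline(PCA(n_components=30), LinearDiscriminantAnalysis())
cv = StratifiedKFold(n_splits=50, shuffle=True, random_state=0)

# Complete classification: discriminate all emotions from each other (chance 1/7).
complete_acc = cross_val_score(clf, X, y, cv=cv).mean()

# One-out classification: one emotion against all others pooled (chance 50% here
# because the pooled class is balanced against the target class by subsampling).
target = 0                                   # e.g., anger vs. all other emotions
others = rng.choice(np.where(y != target)[0], size=n_subj, replace=False)
idx = np.concatenate([np.where(y == target)[0], others])
one_out_acc = cross_val_score(clf, X[idx], (y[idx] == target).astype(int), cv=cv).mean()

print(f"complete: {complete_acc:.2f}, one-out ({target} vs. rest): {one_out_acc:.2f}")
```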
To assess the similarity of the BSMs associated with different emotion categories, we performed hierarchical clustering. First, for each subject we created a similarity matrix: for each pair of emotion categories we computed the Spearman correlation between the corresponding heatmaps. To avoid inflated correlations, zero values in the heatmaps (i.e., regions without paint) were filled with Gaussian noise. The Spearman correlation was chosen as the similarity metric because of the high dimensionality of the data within each map: with high dimensionality, Euclidean metrics usually fail to assess similarity, as they are mainly based on the magnitude of the data. Furthermore, as a rank-based metric independent of the actual data values, it is also less sensitive to outliers than Pearson’s correlation. We also evaluated cosine distance as a possible metric, but the normalization involved in its computation lowered the sensitivity of our final results, as cosine distance uses only the angle between two vectors and not their magnitude. We averaged the individual similarity matrices to produce a group similarity matrix that was then used as the distance matrix between each pair of emotion categories for hierarchical clustering with complete linkage. The similarity data were also used for assessing the reliability of bodily topographies across languages and experiments.
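A sketch of this similarity and clustering analysis is given below: unpainted pixels are filled with Gaussian noise, Spearman correlations are computed between every pair of emotion maps per subject, the subjectwise matrices are averaged, and the group matrix is converted to distances for complete-linkage clustering (scipy is used as one possible implementation; the data are simulated):

```python
import numpy as np
from scipy.stats import spearmanr
from scipy.cluster.hierarchy import linkage, dendrogram
from scipy.spatial.distance import squareform

emotions = ["anger", "fear", "disgust", "happiness", "sadness", "surprise", "neutral"]
n_pixels = 5000                                  # reduced for the demo
rng = np.random.default_rng(6)

def subject_similarity(maps, rng):
    """Spearman similarity matrix between one subject's emotion maps, with
    unpainted (zero) pixels replaced by Gaussian noise to avoid inflated
    correlations."""
    noisy = np.where(maps == 0, rng.normal(size=maps.shape), maps)
    rho, _ = spearmanr(noisy, axis=1)            # correlations between rows
    return rho

# Simulate a few subjects and average their similarity matrices.
sims = []
for _ in range(10):
    maps = rng.normal(size=(len(emotions), n_pixels))
    maps[np.abs(maps) < 1.0] = 0                 # mimic unpainted regions
    sims.append(subject_similarity(maps, rng))
group_sim = np.mean(sims, axis=0)

# Convert similarities to distances and cluster with complete linkage.
dist = 1.0 - group_sim
np.fill_diagonal(dist, 0.0)
Z = linkage(squareform(dist, checks=False), method="complete")
print(dendrogram(Z, labels=emotions, no_plot=True)["ivl"])
```

Converting similarities to distances as 1 − rs is one convenient choice for the sketch; the text above states only that the group similarity matrix served as the distance matrix for clustering.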

Acknowledgments

We thank Drs. Kevin Wen-Kai Tsai and Wei-Tang Chang and Professor Fa-Hsuan Lin for their help with acquiring the Taiwanese dataset. This research was supported by the Academy of Finland grants 265917 (MIND program grant to L.N.), 131483 (to R.H.), and 131786 (to J.K.H.); European Research Council Starting Grant 313000 (to L.N.); Advanced Grant 232946 (to R.H.); and an aivoAALTO grant from Aalto University. All data are stored on Aalto University’s server and are available upon request.

Supporting Information

Supporting Information (PDF)

References

1
RW Levenson, Blood, sweat, and fears: The autonomic architecture of emotion. Emotions Inside Out, eds P Ekman, JJ Campos, RJ Davidson, FBM de Waal (Annals of the New York Academy of Sciences, New York), Vol 1000, pp. 348–366 (2003).
2
Z Kövecses, Metaphor and Emotion: Language, Culture, and Body in Human Feeling (Cambridge Univ Press, Cambridge, UK, 2000).
3
W James, What is an emotion? Mind 9, 188–205 (1884).
4
LF Barrett, B Mesquita, KN Ochsner, JJ Gross, The experience of emotion. Annu Rev Psychol 58, 373–403 (2007).
5
AR Damasio, GB Carvalho, The nature of feelings: Evolutionary and neurobiological origins. Nat Rev Neurosci 14, 143–152 (2013).
6
AR Damasio, The somatic marker hypothesis and the possible functions of the prefrontal cortex. Philos Trans R Soc Lond B Biol Sci 351, 1413–1420 (1996).
7
P Ekman, RW Levenson, WV Friesen, Autonomic nervous system activity distinguishes among emotions. Science 221, 1208–1210 (1983).
8
SD Kreibig, Autonomic nervous system activity in emotion: A review. Biol Psychol 84, 394–421 (2010).
9
LF Barrett, Are emotions natural kinds? Perspect Psychol Sci 1, 28–58 (2006).
10
A Gerrards-Hesse, K Spies, FW Hesse, Experimental inductions of emotional states and their effectiveness: A review. Br J Psychol 85, 55–78 (1994).
11
P Philippot, Inducing and assessing differentiated emotion-feeling states in the laboratory. Cogn Emotion 7, 171–193 (1993).
12
M Tettamanti, et al., Distinct pathways of neural coupling for different basic emotions. Neuroimage 59, 1804–1817 (2012).
13
PM Niedenthal, Embodying emotion. Science 316, 1002–1005 (2007).
14
C Keysers, JH Kaas, V Gazzola, Somatosensation in social perception. Nat Rev Neurosci 11, 417–428 (2010).
15
P Ekman, An argument for basic emotions. Cogn Emotion 6, 169–200 (1992).
16
FC Murphy, I Nimmo-Smith, AD Lawrence, Functional neuroanatomy of emotions: A meta-analysis. Cogn Affect Behav Neurosci 3, 207–233 (2003).
17
CE Izard, The Psychology of Emotions (Plenum, New York, 1991).
18
P Heelas, Emotion talk across cultures. The Social Construction of Emotions, ed R Harré (Blackwell, Oxford), pp. 234–266 (1986).
19
B Rimé, P Philippot, D Cisamolo, Social schemata of peripheral changes in emotion. J Pers Soc Psychol 59, 38–49 (1990).
20
P Philippot, G Chapelle, S Blairy, Respiratory feedback in the generation of emotion. Cogn Emotion 16, 605–627 (2002).
21
F Strack, LL Martin, S Stepper, Inhibiting and facilitating conditions of the human smile: A nonobtrusive test of the facial feedback hypothesis. J Pers Soc Psychol 54, 768–777 (1988).
22
RW Levenson, P Ekman, WV Friesen, Voluntary facial action generates emotion-specific autonomic nervous system activity. Psychophysiology 27, 363–384 (1990).
23
IB Mauss, RW Levenson, L McCarter, FH Wilhelm, JJ Gross, The tie that binds? Coherence among emotion experience, behavior, and physiology. Emotion 5, 175–190 (2005).
24
J Ruben, et al., Somatotopic organization of human secondary somatosensory cortex. Cereb Cortex 11, 463–473 (2001).
25
L Nummenmaa, et al., Emotions promote social interaction by synchronizing brain activity across individuals. Proc Natl Acad Sci USA 109, 9599–9604 (2012).
26
L Nummenmaa, J Hirvonen, R Parkkola, JK Hietanen, Is emotional contagion special? An fMRI study on neural systems for affective and cognitive empathy. Neuroimage 43, 571–580 (2008).
27
R Adolphs, H Damasio, D Tranel, G Cooper, AR Damasio, A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping. J Neurosci 20, 2683–2690 (2000).
28
G Pourtois, et al., Dissociable roles of the human somatosensory and superior temporal cortices for processing social face signals. Eur J Neurosci 20, 3507–3515 (2004).
29
J Decety, PL Jackson, The functional architecture of human empathy. Behav Cogn Neurosci Rev 3, 71–100 (2004).
30
JM Leppänen, Emotional information processing in mood disorders: A review of behavioral and neuroimaging findings. Curr Opin Psychiatry 19, 34–39 (2006).
31
RM Carney, KE Freedland, RC Veith, Depression, the autonomic nervous system, and coronary heart disease. Psychosom Med 67, S29–S33 (2005).
32
JF Thayer, BH Friedman, TD Borkovec, Autonomic characteristics of generalized anxiety disorder and worry. Biol Psychiatry 39, 255–266 (1996).
33
S Lautenbacher, et al., Pain perception in depression: Relationships to symptomatology and naloxone-sensitive mechanisms. Psychosom Med 56, 345–352 (1994).
34
SR Vrana, PJ Lang, Fear imagery and the startle-probe reflex. J Abnorm Psychol 99, 189–197 (1990).
35
VD Costa, PJ Lang, D Sabatinelli, F Versace, MM Bradley, Emotional imagery: Assessing pleasure and arousal in the brain’s reward circuitry. Hum Brain Mapp 31, 1446–1457 (2010).
36
D Matsumoto, T Kudoh, K Scherer, H Wallbott, Antecedents of and reactions to emotions in the United States and Japan. J Cross Cult Psychol 19, 267–286 (1988).
37
A Schaefer, F Nils, X Sanchez, P Philippot, Assessing the effectiveness of a large database of emotion-eliciting films: A new tool for emotion researchers. Cogn Emotion 24, 1153–1172 (2010).
38
D Lundqvist, A Flykt, A Öhman, The Karolinska Directed Emotional Faces: KDEF CD-ROM (Karolinska Institutet, Stockholm, 1998).

Information & Authors

Published in

Proceedings of the National Academy of Sciences
Vol. 111 | No. 2
January 14, 2014
PubMed: 24379370

Submission history

Published online: December 30, 2013
Published in issue: January 14, 2014

Keywords

  1. embodiment
  2. feelings
  3. somatosensation

Authors

Affiliations

Lauri Nummenmaa1 [email protected]
Department of Biomedical Engineering and Computational Science and
Brain Research Unit, O. V. Lounasmaa Laboratory, School of Science, Aalto University, FI-00076, Espoo, Finland;
Turku PET Centre, University of Turku, FI-20521, Turku, Finland; and
Enrico Glerean
Department of Biomedical Engineering and Computational Science and
Brain Research Unit, O. V. Lounasmaa Laboratory, School of Science, Aalto University, FI-00076, Espoo, Finland;
Jari K. Hietanen
Human Information Processing Laboratory, School of Social Sciences and Humanities, University of Tampere, FI-33014, Tampere, Finland

Notes

1
To whom correspondence may be addressed. E-mail: [email protected] or [email protected].
Author contributions: L.N., E.G., R.H., and J.K.H. designed research; L.N. and E.G. performed research; L.N. and E.G. contributed new reagents/analytic tools; L.N. and E.G. analyzed data; and L.N., E.G., R.H., and J.K.H. wrote the paper.

Competing Interests

The authors declare no conflict of interest.
