Brain States


The signature of happiness

How do we know what someone else is feeling? Clues about the emotions of others can come in various forms. Facial expression can be a dead giveaway, but we can also make inferences from body posture, or even from seeing or reading about the situation that caused the emotion.  An interesting problem in neuroscience is how these very different cues about the emotions of others can all lead to the same ultimate realization: She’s happy; he is sad.

In a paper published this week in the Journal of Neuroscience, Amy Skerry and Rebecca Saxe sought to find the region of the brain that is responsible for these empathetic realizations, regardless of the origin. They did this by showing people different types of media that relayed emotion: a short video clip from a movie, or an animated clip that showed a geometric figure experiencing prosocial or antisocial action from its fellow geometric shapes.  For instance, in the figure below, a woman makes a sad face, and then a red circle is excluded from a group of purple triangles, squares, and pentagons (so sad! poor circle).


The authors then trained a computer program to look at the fMRI brain scans of people during each emotional media presentation and guess which emotion was being conveyed. Importantly, the program was trained to discriminate the emotional states based on one type of media (facial expressions, say) and then was tested on data from the other type of media (animated situations). The scientists were looking for brain regions with such a distinct neural response to each emotional state that the computer program could recognize it no matter which media type the person had seen to make the inference. To qualify, the program had to perform significantly better than chance on data from that region.
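The cross-decoding logic can be sketched with simulated data. This is only a toy nearest-centroid classifier standing in for the study's actual analysis, and every number here is invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pretend each emotion evokes a characteristic "voxel pattern" plus noise.
n_voxels = 50
happy_code = rng.normal(0, 1, n_voxels)
sad_code = rng.normal(0, 1, n_voxels)

def simulate_trials(code, n_trials, noise=1.0):
    """Trials for one emotion in one media type: shared code + trial noise."""
    return code + rng.normal(0, noise, (n_trials, n_voxels))

# Train on one media type (e.g., facial expressions)...
train_X = np.vstack([simulate_trials(happy_code, 20),
                     simulate_trials(sad_code, 20)])
train_y = np.array([1] * 20 + [0] * 20)

# ...then test on the other media type (e.g., animated situations).
test_X = np.vstack([simulate_trials(happy_code, 20),
                    simulate_trials(sad_code, 20)])
test_y = np.array([1] * 20 + [0] * 20)

# Nearest-centroid classifier: label each test trial by the closer class mean.
centroids = {c: train_X[train_y == c].mean(axis=0) for c in (0, 1)}
pred = np.array([min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
                 for x in test_X])
accuracy = (pred == test_y).mean()
print(f"cross-media decoding accuracy: {accuracy:.2f}")  # well above 0.5 chance
```

If a brain region carries an emotion code that transfers across media types, accuracy on the held-out media type rises above the 50% chance level, which is exactly the test the authors applied region by region.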

Guided by data from previous studies, the authors homed in on the prefrontal cortex, or PFC. This is not surprising, as the prefrontal cortex is a particularly “thinky” part of the brain, responsible for, among other things, future planning and impulse control. But the PFC is large (it’s basically everything in your forehead region) and has many functions. Specifically, it seemed to be the medial part of this structure, or MPFC, that held the key to invariant recognition of emotional states, regardless of how they were communicated. Further subdividing, the authors found that data from both the dorsal (upper) MPFC and the middle MPFC reliably allowed the computer program to perform above chance.




Skerry and Saxe then asked another question: would these same brain regions represent emotions the same way when it was the self experiencing the emotion, rather than another person? To find out, the participants in the study were told that they were either winning money (happy 🙂 ) or losing it (sad 😦 ). The computer program then had to guess, based on the neural response, which emotion had been induced. Here, the middle MPFC still held reliable information, whereas the dorsal MPFC no longer did.

This study succeeded in identifying a region of the brain that has a particular response to particular emotions, regardless of whether the emotion was perceived visually or merely inferred from a situation, and regardless, even, of whether it was the self or someone else experiencing it. While the current study dealt only in a binary (good or bad, happy or sad), it remains an open question whether these findings hold for more complex emotions like greed, jealousy, or gratitude.

Reference: A Common Neural Code for Perceived and Inferred Emotion. Amy E. Skerry and Rebecca Saxe. (2014) Journal of Neuroscience, 34(48): 15997-16008

Intro image by Dietmar Temps, all other images adapted from above.


Fear Perception Depends on Heartbeat Cycle Timing


It always amazes me how much our reactions to the world depend on our internal states, even when those states are outside of our conscious awareness. Though we don’t typically consider it a sense like hearing or seeing, the visceral sense is real: there are tons of receptors inside us, monitoring our internal physiology at all times.

One example is the baroreceptors that monitor the pressure inside our arteries as the heart pumps blood through them. The baroreceptors have a major role in the regulation of blood pressure, as one might expect. However, recent research has revealed that information from these internal receptors is also being used when we process emotional stimuli.

Sarah Garfinkel and colleagues investigated this phenomenon recently by showing people images of human faces, either displaying fear, disgust, or happiness, or in a neutral state of repose. The facial expressions were embedded in a stream of scrambled images and flashed by so quickly that they were at the limit of conscious perception: each frame occupied the screen for a mere 70 milliseconds, less than a tenth of a second. The facial expressions were timed to coincide with the cardiac cycle, landing either at systole (between the “lub” and “dub” of a heartbeat, when arterial pressure is high) or at diastole (between heartbeats, when arterial pressure is low).
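To make the systole/diastole distinction concrete, here is a toy sketch of labeling a stimulus onset by cardiac phase. The 200–400 ms systole window after the R-peak is an illustrative assumption for a resting heart rate, not the timing used in the actual study:

```python
def cardiac_phase(stim_time, r_peaks, systole_window=(0.2, 0.4)):
    """Label a stimulus onset 'systole' or 'diastole' given R-peak times (in s).

    The systole window is a hypothetical delay range after the R-peak
    during which arterial pressure is high.
    """
    # Find the most recent heartbeat (R-peak) before the stimulus.
    previous = [t for t in r_peaks if t <= stim_time]
    if not previous:
        raise ValueError("stimulus precedes the first recorded R-peak")
    delay = stim_time - previous[-1]
    lo, hi = systole_window
    return "systole" if lo <= delay <= hi else "diastole"

# R-peaks of a steady 60 bpm heartbeat: one per second.
r_peaks = [0.0, 1.0, 2.0, 3.0]
print(cardiac_phase(1.3, r_peaks))  # 0.3 s after a beat -> "systole"
print(cardiac_phase(1.7, r_peaks))  # 0.7 s after a beat -> "diastole"
```

In the experiment the logic ran the other way, with stimulus presentation triggered to land in the desired phase, but the bookkeeping is the same: everything hinges on the delay between the last heartbeat and the stimulus.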

Afterward, participants were asked to identify which face they had seen from a lineup of three faces. In general, the participants were pretty bad at this test, getting it right only about half the time. But for fearful faces especially, participants were much better at picking out the face they had seen when it had appeared in the middle of a heartbeat rather than between heartbeats. Arriving at the “right” time with respect to the heartbeat pushed detection of the fearful face over the edge from subconscious into consciousness.

So people were better able to detect a fearful face when it came in the middle of a heartbeat, but what about how people thought of the face they saw? The scientists assessed this in a separate experiment, at the same time doing functional neuroimaging on the participants to see what their brains were doing. Again, they showed participants a fearful face, either timed to coincide with systole or diastole. This time, the face was on the screen for a bit longer, so the participants were sure to detect it. Afterward the experimenters asked the participants to rate the intensity of the emotion they saw.

Again, the results depended on the heartbeat cycle timing: if the face was presented in the middle of a heartbeat, participants rated it as more fearful than if it was presented between heartbeats. The difference was not just perceptual; it was also evident in the activity of a brain structure called the amygdala. The amygdala is known to be involved in the processing of emotion, and this area was more active when the participants saw the fearful face at systole than at diastole.

Interestingly, this effect was less marked in anxious individuals. People with high-anxiety conditions like PTSD are more responsive to fearful stimuli regardless of when the stimuli arrive with respect to the cardiac cycle.

This work is just one example of the dynamic feedback between our physiological and emotional states. For decades, neuroscientists have debated the nature of emotions: does nervousness cause butterflies in the stomach, or do butterflies in the stomach create the feeling of nervousness? Which comes first, the physiological or the psychological? We may never have a clear-cut answer to this question, because there may not be one. It seems that the road between the body and the mind is a two-way street.

Reference: Fear from the Heart: Sensitivity to Fear Stimuli Depends on Individual Heartbeats. (2014) Sarah N. Garfinkel, Ludovico Minati, Marcus A. Gray, Anil K. Seth, Raymond J. Dolan, and Hugo D. Critchley. Journal of Neuroscience 34(19): 6573-6582



When We’re In Sync, So Are Our Brains

We all know those moments. The electrifying seconds when the home team makes a goal, or the bride says her vows, or the presidential favorite wins the election. The moments when an entire room full of people is feeling exactly the same way, at exactly the same time, because they share a common perspective.

The emotion runs high because everyone is riding the same roller coaster of events, each new twist and turn causing fresh reactions.   Our emotions are jerked like rag dolls.  The result looks so synchronized, it could be choreographed.

When we are all cheering for the same goal, both our bodies and our minds become synchronized.  You throw your hands up at the same time as the rest of the stadium, and your brains are also doing the same thing. In each head, the visual cortex is processing the game, the motor cortex is holding up the arms, and the attention-controlling networks are riveting us all to the events as they unfold.

And when you are rooting for the same person, the Action-Observation Network in the frontoparietal region of the brain starts humming in synchrony with those around you. In fact, it is this neural synchrony that allows you to share a moment with others.

That’s the implication of a study released in the Journal of Neuroscience. In it, scientists measured the blood flow to the brains of people who were watching a boxing match. In some cases, the scientists told the subjects to watch the match as they normally would. But sometimes they told the subjects to watch the match while paying close attention to a particular boxer, trying hard to simulate in their own minds the actions and emotions of that boxer.

When different subjects focused on the same boxer, their brains began to oscillate in phase with one another in the somatosensory cortex, the part of the brain that is responsible for the sense of touch. The somatosensory cortex also plays a big role in allowing you to mentally “mirror” the actions of another person, so that you can monitor them and understand their motivations.

Importantly, the brain synchrony was greater when the subjects were paying attention to the same boxer than when they were just watching the video casually. It is the attention to the actions and feelings of another that drove these brain regions to synchronize: in large part, the brain uses the same areas to understand how someone else is feeling as it does to feel that way itself.
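The idea of measuring synchrony between viewers can be sketched with simulated time courses. This is a toy inter-subject correlation, not the study's actual analysis pipeline, and the attention and noise parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# A shared stimulus-driven signal: what the boxing match "injects" into
# every viewer's brain activity in a given region.
n_timepoints = 200
stimulus_signal = rng.normal(0, 1, n_timepoints)

def simulate_subject(attention=1.0, noise=1.0):
    """One viewer's regional time course: stimulus-locked component,
    scaled by a hypothetical attention factor, plus idiosyncratic noise."""
    return attention * stimulus_signal + rng.normal(0, noise, n_timepoints)

def isc(a, b):
    """Inter-subject correlation: Pearson r between two time courses."""
    return float(np.corrcoef(a, b)[0, 1])

# Viewers focused on the same boxer share more stimulus-locked signal
# than casual viewers, so their time courses correlate more strongly.
focused = isc(simulate_subject(attention=1.0), simulate_subject(attention=1.0))
casual = isc(simulate_subject(attention=0.3), simulate_subject(attention=0.3))
print(f"ISC focused: {focused:.2f}, casual: {casual:.2f}")
```

The more of each viewer's activity is locked to the shared events, the higher the pairwise correlation, which is the sense in which attending to the same boxer makes two brains "oscillate in phase."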

This report is one in a long line of evidence suggesting that time-locked brain activity shared by individuals is the basic process that supports interpersonal understanding.

Just think. All our moments of mutual understanding may depend on our brains being in sync with one another.


Reference: Nummenmaa L, Smirnov D, Lahnakoski JM, Glerean E, Jääskeläinen IP, Sams M, Hari R (2014) Mental action simulation synchronizes action-observation circuits across individuals. J Neurosci 34:748–757.