Previously, researchers had been unable to pinpoint neural signatures of emotion in the human brain, or even to establish whether emotions have distinct biological correlates. Now researchers from Carnegie Mellon University have identified, for the first time, where these neural activation sites are located. The findings illuminate how the brain categorizes feelings, giving researchers the first reliable method for analyzing emotions. Until now, research on emotions has been stifled by the lack of dependable ways to measure them.
To find out exactly where these neural pathways were activated, the researchers recruited 10 actors from the Carnegie Mellon drama community and had them self-induce 18 different emotions. Each participant first wrote a scenario for each of the 18 emotions, in as much or as little detail as they liked, either as fiction or as a recollection of a past experience. Immediately afterward, participants were scanned with fMRI: each emotion word was flashed on a screen for 9 seconds, during which participants drew on their written scenarios and worked as hard as they could to experience that emotion at maximal intensity.
Interestingly, after all of the emotion words had been displayed, the scientists showed 12 neutral photos followed by 12 disgusting photos. Before and after the roughly 10-minute session, participants viewed the word "calm" on screen for 40 seconds.
“Initial data processing was performed using SPM2 (Wellcome Department of Cognitive Neurology, London). The data were corrected for slice timing, motion, and linear trend, and were normalized into MNI space (3.125 mm×3.125 mm×3 mm voxels). Gray matter voxels were assigned to anatomical areas using Anatomical Automatic Labeling (AAL) masks. Where noted below, the images were partitioned into several bilateral brain areas using AAL masks: frontal, parietal, temporal, and occipital. In addition, masks corresponding to all AAL-defined subcortical areas (excluding cerebellum) and cingulate areas (consisting of anterior and posterior cingulate) were used.”
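The partitioning step described in the quote above can be sketched in code. The following is a minimal illustration using a toy integer label volume as a stand-in for an AAL atlas (the real analysis used SPM2 and actual AAL NIfTI masks); the region codes, lobe names, and array shape here are assumptions for demonstration only.

```python
import numpy as np

# Toy stand-in for an AAL label volume: each voxel holds an integer
# region code (0 = outside gray matter). Real AAL masks are NIfTI images.
rng = np.random.default_rng(0)
label_volume = rng.integers(0, 5, size=(8, 8, 8))  # codes 0-4

# Hypothetical mapping from region codes to the bilateral lobar masks
# named in the methods (this numbering is invented, not the real AAL codes).
lobe_of_code = {1: "frontal", 2: "parietal", 3: "temporal", 4: "occipital"}

# Partition gray-matter voxel coordinates by lobe.
voxels_by_lobe = {name: [] for name in lobe_of_code.values()}
for coord in zip(*np.nonzero(label_volume)):
    code = int(label_volume[coord])
    voxels_by_lobe[lobe_of_code[code]].append(coord)

# Every nonzero (gray matter) voxel lands in exactly one lobe.
n_assigned = sum(len(v) for v in voxels_by_lobe.values())
assert n_assigned == int(np.count_nonzero(label_volume))
```

In a real pipeline this lookup would be done against the atlas image in MNI space, so that every subject's normalized data shares the same voxel-to-region assignment.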
The data were then processed using multi-voxel pattern analysis (MVPA), a technique that treats activity distributed across many voxels as a single pattern rather than examining each brain region in isolation. These algorithms frequently yield greater predictive power, and recent research suggests they hold promise for classifying emotion from neurological and physiological data.
“In particular, multi-voxel pattern analysis (MVPA) has shown great promise in classifying the emotional content of facial, bodily, and vocal expressions. Patterns of activity in voice-sensitive cortices can be used to distinguish between angry, sad, relieved, joyful, and neutral vocal expressions (Ethofer et al. 2009). Similarly, distributed patterns of activity in areas implicated in the processing of facial expressions (i.e. anterior and posterior superior temporal sulcus (STS) and frontal operculum) can be used to distinguish between facial expressions of seven emotions” (Said et al. 2010).
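As a rough illustration of what an MVPA classifier does, the sketch below trains a linear model to distinguish emotion categories from distributed voxel patterns. The data are synthetic (random noise with a per-emotion signature pattern added), and the emotion labels, voxel count, and choice of logistic regression are assumptions for illustration; the study's actual classification pipeline differed.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_voxels, trials_per_emotion = 200, 30
emotions = ["anger", "sadness", "joy"]  # illustrative subset of the 18

# Synthetic data: each emotion gets its own distributed "signature"
# pattern across all voxels, buried in trial-by-trial noise.
signatures = rng.normal(0.0, 1.0, size=(len(emotions), n_voxels))
X_parts, y = [], []
for i, emotion in enumerate(emotions):
    trials = signatures[i] + rng.normal(0.0, 2.0, size=(trials_per_emotion, n_voxels))
    X_parts.append(trials)
    y += [emotion] * trials_per_emotion
X = np.vstack(X_parts)

# A linear classifier reading out all voxels jointly -- the essence of
# MVPA -- evaluated with 5-fold cross-validation.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

The key point is that no single voxel is informative on its own; the classifier succeeds only by pooling weak evidence across the whole distributed pattern, which is what lets MVPA separate states that univariate analyses cannot.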
Speculatively, these findings could aid the search for treatments for emotional disorders such as bipolar disorder and, hopefully, depression. The same techniques might reveal dysfunctions that are physiological in nature, offering an alternative to the chemical-imbalance theory commonly invoked for depression today.
We all know that the brain is a complex organ, but as noted here there is still a great deal left to discover about ourselves. For instance, emotional issues might be most treatable if action is taken before the age of 25: a recent study at the University of Oregon, presented in the educational video series "Changing Brains," found that the brain's period of rapid plasticity largely ends at 25, meaning the rapid growth of new neurological pathways slows after that point.