After serving as fodder for cheesy science-fiction movies for so long, mind reading may be less far-fetched than previously thought.
Led by Kevin LaBar, professor of psychology and neuroscience, researchers used functional magnetic resonance imaging (fMRI) technology to create a model of the brain based on how it experiences distinct emotional states. Philip Kragel, Pratt ’06 and a research assistant, helped develop a computer algorithm to identify unknown emotional states by comparing them to previously mapped emotions.
“The main motivation behind this work was really that there are no scientifically validated, objective measures of specific emotions,” LaBar said.
LaBar explained that although previous studies have employed fMRI machines to study emotions, these investigations have been limited to monitoring subjects’ brains as they watch movies or engage in certain activities.
The new research, however, aimed to go further and investigate “spontaneous elicitation of emotion.”
During the first part of the study, researchers presented subjects with film clips and instrumental music, then used the resulting fMRI data to map brain activity. In doing so, they obtained models of how the brain experiences seven distinct states—contentment, amusement, surprise, fear, anger, sadness and a neutral state.
“[Subjects would] get about seven 15-minute runs where they alternated between viewing clips and then making ratings,” Kragel said.
The researchers had to select clips that would elicit the same emotional reaction from all viewers, which Kragel said is an essential aspect of developing accurate brain models.
After the initial maps were made, participants underwent a resting-state fMRI scan and were asked to report their emotional state every 30 seconds. This resting state was conducive to spontaneous emotions, LaBar said.
“This is basically one of these times where you can have memories of previous events—maybe some pleasant things you’ve done over the weekend, or you might be worrying about an upcoming exam,” he said.
In the 10 seconds before subjects were asked about their emotions, their brain states were mapped and compared to the previously constructed models. LaBar noted that the models were successful at predicting what subjects were about to say.
“We were able to basically predict which emotions they were going to report just by doing this decoding of the resting state brain data,” he said.
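The decoding LaBar describes—matching a new brain state against previously constructed emotion models—can be sketched, very loosely, as template matching. Everything in the snippet below is a hypothetical stand-in, not the team’s actual algorithm: the random “templates,” the voxel count and the correlation metric are illustrative assumptions only.

```python
import numpy as np

# The seven states mapped in the first phase of the study.
EMOTIONS = ["contentment", "amusement", "surprise",
            "fear", "anger", "sadness", "neutral"]

rng = np.random.default_rng(0)
n_voxels = 500  # hypothetical number of voxels per activation pattern

# Illustrative templates: random stand-ins for the activation patterns
# that would actually be learned from the film/music viewing runs.
templates = {e: rng.normal(size=n_voxels) for e in EMOTIONS}

def decode(brain_state, templates):
    """Label a brain state with the emotion whose template it best matches,
    scored here by Pearson correlation (an assumed, simplistic metric)."""
    scores = {e: np.corrcoef(brain_state, t)[0, 1]
              for e, t in templates.items()}
    return max(scores, key=scores.get)

# A synthetic "resting-state" sample: the fear template plus noise.
sample = templates["fear"] + 0.5 * rng.normal(size=n_voxels)
print(decode(sample, templates))  # almost surely "fear" for this sample
```

The real study predicted self-reports from the 10 seconds of data preceding each rating; this toy version only conveys the general idea of scoring an unknown pattern against each candidate model and reporting the best match.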
Despite the initial success, Kragel said the project is far from complete, noting that he hopes to expand it to detect more than just seven emotions.
The ability to read emotions in the brain also has a wide range of clinical benefits, LaBar said, including the replacement of an antiquated system of judging patients’ emotions.
“Oftentimes in order to assess whether an intervention works for treating depression or anxiety, we simply ask the subject, ‘do you feel better?’ after the treatment, and people may not have great insights into how well they feel,” LaBar said.
He added that another useful function would be detecting the emotions of those who are unable to accurately report them, such as children or those with autism.
Although identifying emotions might not yet be mind reading, Kragel suggested the technology might be heading in that direction.
“A more conservative and very accurate description is that we’re reading brain states,” Kragel said. “It’s getting toward mind reading. That’s probably the ultimate goal of what this technology might do someday.”