Northeastern University

A radar for emotion

Northeastern graduate student Sarah Brown is building computational models of emotion using physiological signals like EKG. Photo via Thinkstock.


Engineers are good at tracking things. That’s according to Northeastern graduate student Sarah Brown. As a fellow of Draper Laboratory in Cambridge, Brown is collaborating with researchers at both Draper and Northeastern to track something that has never really been tracked before: emotion.

Well, let me rephrase that. Emotion has been tracked before, but not over long periods of time and only using big, expensive equipment like MRI machines. In conjunction with psychophysiologist Andrea Webb, Brown is attempting to build computational models of emotion using easy-to-collect data like pupil diameter, heart rate, and skin conductance.

Through her own research, Webb has collected an impressive data set, which, Brown says, “is half the battle” for a computationally inclined graduate student. Webb’s team asked participants to view collections of pictures and digital sound files while hooked up to heart rate monitors and other physiological recording devices. The pictures and sounds used for the tests come from standard psychology paradigms and correlate with particular emotional states. The original idea, Brown said, was to look for simple correlations between physiology and the stimuli, but it didn’t go so well. That’s when the team joined up with signal processing expert S.R. Prakash, Brown’s research advisor at Draper. Now, with Brown’s help, they’re attempting to build algorithms that can model emotional state based on physiological outputs.

The same way that radar can be used to scan through noisy data to home in on an object’s exact location, Brown hopes her computational models will allow clinicians to scan simple signals for information on emotional states. With radar you have a pretty easy way to check whether your model works: your eyeballs. But emotion isn’t nearly so concrete as some hidden physical object. If her model says a person is feeling sad, how can Brown check the results?

Well, I suppose she could just ask, “hey, were you feeling sad just then?” but self-reported data brings a slew of limitations with it. “This is a whole other aspect of the research, developing performance measures,” said Brown.

Now in her second year of grad school, she has completed an extensive literature review of machine learning and signal processing techniques. Next she plans to use existing algorithms on the data set as a jumping-off point. From there, she will tweak the current models and build new ones specifically designed for her purpose.
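To make that concrete, here is a rough sketch of what applying existing algorithms to such a data set might look like. The article names no specific method or features, so everything below is an illustrative assumption: a made-up feature matrix (one row per stimulus trial, with columns standing in for pupil diameter, heart rate, and skin conductance) fed to an off-the-shelf scikit-learn classifier.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical feature matrix: 120 trials, 3 physiological features
# (pupil diameter, mean heart rate, skin conductance) -- random stand-ins.
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 3))

# Hypothetical labels for the stimulus categories:
# 0 = neutral, 1 = pleasant, 2 = unpleasant.
y = rng.integers(0, 3, size=120)

# An off-the-shelf classifier, evaluated with 5-fold cross-validation.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())  # on random data this hovers near chance
```

On real physiological recordings, the interesting work starts where this sketch ends: tweaking the model and engineering features until the cross-validation scores climb meaningfully above chance.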

Perhaps the most important challenge will be choosing the right pieces of the data to focus on. For instance, an EKG signal, which tracks the electrical activity of the heart, contains lots of numbers on the backend. Some of those numbers will prove valuable for her needs; others will not. Brown will need to determine exactly the right numbers to focus on (the distance between two signal peaks, for example) rather than inefficiently processing all of the information in one hulking algorithm.
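The “distance between two signal peaks” idea can be sketched in a few lines. This is a toy example, not Brown’s actual pipeline: it builds a fake spike train standing in for an EKG, then uses SciPy’s peak finder to pull out just the beat-to-beat intervals, discarding everything else in the signal.

```python
import numpy as np
from scipy.signal import find_peaks

def rr_intervals(ekg, fs):
    """Return the intervals (in seconds) between successive peaks.

    A toy sketch: real EKG beat detection needs filtering and more
    robust thresholds than a simple height cutoff.
    """
    peaks, _ = find_peaks(ekg, height=0.5, distance=int(0.4 * fs))
    return np.diff(peaks) / fs

# Synthetic "EKG": a narrow spike once per second, sampled at 250 Hz.
fs = 250
t = np.arange(0, 5, 1 / fs)
signal = np.exp(-((t % 1.0 - 0.5) ** 2) / 0.0005)

intervals = rr_intervals(signal, fs)
print(intervals)  # about 1.0 s between beats, i.e. ~60 bpm
```

The point of the reduction is exactly the one in the paragraph above: a few interval numbers per beat replace thousands of raw samples, so whatever model comes next has far less to chew on.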

Now, you might ask why we’d want a computational model of emotion in the first place. “There are lots of reasons,” said Brown. “A specific motivator of this work is to be able to provide clinicians with a quantitative assessment of emotional state, which could assist with diagnosis of psychopathologies,” she explained.

