A radar for emotion

Northeastern graduate student Sarah Brown is building computational models of emotion using physiological signals like EKG. Photo via Thinkstock.

Engineers are good at tracking things. That's according to Northeastern graduate student Sarah Brown. As a fellow of Draper Laboratory in Cambridge, Brown is collaborating with researchers at both Draper and Northeastern to track something that has never really been tracked before: emotion.

Well, let me rephrase that. Emotion has been tracked before, but not over long periods of time, and only using big, expensive equipment like MRI machines. In conjunction with psychophysiologist Andrea Webb, Brown is attempting to build computational models of emotion using easy-to-collect data like pupil diameter, heart rate, and skin conductance.

Through her own research, Webb has collected an impressive data set, which, Brown says, "is half the battle" for a computationally inclined graduate student. Webb's team asked participants to view collections of pictures and digital sound files while hooked up to heart rate monitors and other physiological recording devices. The pictures and sounds used for the tests come from standard psychology paradigms and correlate to particular emotional states. The original idea, Brown said, was to look for simple correlations between physiology and the stimuli, but it didn't go so well. That's when the team joined up with signal processing expert S.R. Prakash, Brown's research advisor at Draper. Now, with Brown's help, they're attempting to build algorithms that can model emotional state based on physiological outputs.

The same way that radar can be used to scan through noisy data to home in on an object's exact location, Brown hopes her computational models will allow clinicians to scan simple signals for information on emotional states. With radar you have a pretty easy way to check whether your model works: your eyeballs. But emotion isn't nearly as concrete as some hidden physical object. If her model says a person is feeling sad, how can Brown check the results?

Well, I suppose she could just ask, "Hey, were you feeling sad just then?" But self-reported data brings a slew of limitations with it. "This is a whole other aspect of the research, developing performance measures," said Brown.

Now in her second year of grad school, she has completed an extensive literature review of machine learning and signal processing techniques. Next she plans to use existing algorithms on the data set as a jumping-off point. From there, she will tweak the current models and build new ones specifically designed for her purpose.
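To give a flavor of what "using existing algorithms as a jumping-off point" might look like, here is a minimal sketch: an off-the-shelf classifier fit to physiological features. The data below is entirely synthetic, randomly generated to stand in for the kinds of measurements described (pupil diameter, heart rate, skin conductance), and the labels are an invented two-state proxy for emotional state. None of it reflects the actual Draper data set or methods.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: 200 trials, labeled 0 (neutral) or 1 (aroused).
rng = np.random.default_rng(0)
n = 200
labels = rng.integers(0, 2, n)

# Pretend "aroused" trials show larger pupils, faster heart rate,
# and higher skin conductance, each with measurement noise.
features = np.column_stack([
    3.0 + 0.5 * labels + rng.normal(0, 0.3, n),  # pupil diameter (mm)
    70 + 10 * labels + rng.normal(0, 5, n),      # heart rate (bpm)
    2.0 + 0.8 * labels + rng.normal(0, 0.5, n),  # skin conductance (uS)
])

# An existing algorithm, applied as-is, gives a baseline to improve on.
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, features, labels, cv=5)
baseline_accuracy = scores.mean()
```

A baseline like this is the "from there" part of the plan: once a stock model sets a floor on performance, tweaks and purpose-built models have something concrete to beat.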

Perhaps the most important challenge will be choosing the right pieces of the data to focus on. For instance, an EKG signal, which tracks electrical activity of the heart, contains lots of numbers on the back end. Some of those numbers will prove valuable for her needs; others will not. Brown will need to determine exactly the right numbers to focus on (maybe the distance between two signal peaks, for example) rather than inefficiently processing all of the information in one hulking algorithm.
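The peak-distance idea can be sketched in a few lines: detect the peaks in a heart trace, then keep only the intervals between them. The signal here is synthetic (narrow bumps standing in for heartbeats at a chosen sampling rate), so the specific numbers are illustrative, not anything from the research described.

```python
import numpy as np
from scipy.signal import find_peaks

# Synthetic EKG-like trace: 10 seconds sampled at 250 Hz (assumed rate),
# with a narrow Gaussian bump every 0.8 s standing in for each heartbeat.
fs = 250
t = np.arange(0, 10, 1 / fs)
signal = sum(np.exp(-((t - p) ** 2) / (2 * 0.01 ** 2))
             for p in np.arange(0.4, 10, 0.8))

# Reduce the whole trace to a handful of numbers: the distances
# between successive peaks, rather than every raw sample.
peaks, _ = find_peaks(signal, height=0.5)
intervals = np.diff(peaks) / fs        # seconds between peaks
heart_rate = 60.0 / intervals.mean()   # beats per minute
```

The point of a feature like `intervals` is exactly the trade-off described above: a 2,500-sample trace collapses to a dozen numbers that a model can actually digest.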

Now, you might ask why we'd want a computational model of emotion in the first place. "There are lots of reasons," said Brown. "A specific motivator of this work is to be able to provide clinicians with a quantitative assessment of emotional state, which could assist with diagnosis of psychopathologies," she explained.