A clenched fist thumps the air to emphasize a point; a sweeping hand signals the array of possibilities; furrowed eyebrows question the veracity of the politician's remarks. These are all examples of the ways we express our emotions while we converse. They're strategies we may spend a lifetime learning, based on our particular cultures and backgrounds. But that doesn't mean they can't be programmed.

Newly appointed Northeastern professor Stacy Marsella is doing just that. His program, called Cerebella, gives virtual humans the same ability to convey emotion through facial expressions and hand gestures as they communicate with other virtual, or even real, humans.

“Normally these virtual human architectures have some sort of perception, seeing the world, forming some understanding of it, and then deciding how to behave,” said Marsella, who holds joint appointments in the College of Computer and Information Science and the College of Science. “The trouble is some of these things are very hard to model, so sometimes you cheat.”

One way to cheat, Marsella explained, is to infer connections between given utterances and appropriate responses. Once the program knows what words a virtual human will use to respond, it can form a library of associated facial expressions, gaze patterns, and gestures that make sense in conjunction with those words.
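The "cheat" described above can be pictured as a lookup: once the utterance is known, cue words in it select entries from a library of matching nonverbal behaviors. The following is a minimal illustrative sketch, not Cerebella's actual code; every name, cue word, and behavior entry here is a hypothetical stand-in.

```python
# Hypothetical sketch of mapping a known utterance to a library of
# associated nonverbal behaviors. Not Cerebella's real API or data.

# Library associating a communicative function with nonverbal behaviors.
GESTURE_LIBRARY = {
    "emphasis": {"gesture": "fist_thump", "gaze": "direct", "brows": "raised"},
    "options":  {"gesture": "sweeping_hand", "gaze": "averted", "brows": "neutral"},
    "doubt":    {"gesture": "none", "gaze": "direct", "brows": "furrowed"},
}

# Keyword cues (illustrative) that suggest each communicative function.
CUES = {
    "emphasis": {"must", "never", "absolutely"},
    "options":  {"or", "either", "range"},
    "doubt":    {"really", "supposedly", "claims"},
}

def behaviors_for(utterance: str) -> list[dict]:
    """Return library entries whose cue words appear in the utterance."""
    words = set(utterance.lower().split())
    return [GESTURE_LIBRARY[fn] for fn, cues in CUES.items() if words & cues]

print(behaviors_for("You must never do that"))
```

The point of the shortcut is visible here: no perception or world model is consulted, only the words the virtual human is already scripted to say.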

In one version of the program, Cerebella infers the deeper meaning behind the spoken words, interpreting that meaning and responding appropriately.

In addition to Cerebella, Marsella's work touches on a broad spectrum of applications at the intersection of emotion and technology. For instance, UrbanSim uses similar techniques to generate large-scale models of human populations. Here, virtual models of people aren't doing the same kind of "natural language processing," as Marsella called it, but they're still interacting with one another and determining follow-up behaviors based on a theory of mind, a model that allows them to reason about how others in the virtual world will act.

“They’re abstract social interactions, where agents are either assisting or blocking each other,” Marsella explained. The result gives his program the capacity to simulate whole cities for purposes ranging from city planning to military training.
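Those abstract assist-or-block interactions, driven by a one-step theory of mind, can be sketched roughly as follows. This is an illustrative toy, not UrbanSim's implementation; the class, goals, and the `friendly` disposition flag are all invented for the example.

```python
# Toy sketch of abstract social interactions: each agent uses a
# one-step "theory of mind" to predict another agent's goal, then
# chooses to assist or block it. Not UrbanSim's actual code.

from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    goal: str        # what this agent wants to accomplish
    friendly: bool   # disposition toward other agents (hypothetical)

    def predict(self, other: "Agent") -> str:
        """Theory of mind: reason about what the other agent will pursue."""
        return other.goal

    def act_toward(self, other: "Agent") -> str:
        """Assist the predicted goal if friendly, otherwise block it."""
        predicted = self.predict(other)
        return f"assist:{predicted}" if self.friendly else f"block:{predicted}"

a = Agent("A", goal="build_market", friendly=True)
b = Agent("B", goal="hold_protest", friendly=False)
print(a.act_toward(b))  # A assists the goal it predicts for B
print(b.act_toward(a))  # B blocks the goal it predicts for A
```

Scaled up to thousands of such agents, even this bare assist/block dynamic yields population-level behavior of the kind useful for city planning or training scenarios.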

At Northeastern, Marsella is eager to apply his methods to a range of multidisciplinary collaborative projects. In particular, he's interested in working with the personal health informatics team. “The interactive health interventions are the applications that really interest me,” he said.

For another project, he designed a training tool for medical students to develop their patient interaction skills, in which they must navigate difficult conversations with a virtual human embedded with the emotional personality of a real human. One task requires the students to inform the virtual human of his cancer diagnosis.

“We want these interactions to be natural,” Marsella said, summing up the underlying goal of almost all his programs.