Category Archives: Symposia

New Vistas In Emotion and Technology

The Northeastern University Affective Science Institute presents New Vistas in Emotion and Technology on Friday, January 31 at 9 a.m.

This one-day summit will bring together leading researchers and engineers at the forefront of emotion science and mobile/social technology to discuss emerging, cross-cutting initiatives aimed at enhancing individual and societal wellbeing.

In addition to speakers including Arturo Bejar, engineering director at Facebook, the summit will feature top-notch moderators: Meghna Chakrabarti of WBUR’s Radio Boston, Hiawatha Bray of the Boston Globe, and Andrew Zolli of Pop Tech.

Together with the audience, panels will explore issues including:

  • how endowing technological entities (e.g., robots, virtual agents and avatars) with emotional expression alters interactions with humans,
  • how new advances in mobile technology can be used to assess emotional and physiological changes relevant to physical and mental health, and
  • how emotion science can be leveraged to enhance harmony and user satisfaction in online social networks.

Speaker profiles and registration information are available online.

Also posted in In the News, Special Events | Comments closed

Reading the Face: Translating Science to Security

November 30th, 2012
2-4pm, Raytheon Amphitheater, Northeastern University

A symposium focused on applying current research in emotion perception, specifically as detectable (or not) from facial expressions, to concerns in global security.

Jon Freeman, Assistant Professor of Psychological and Brain Sciences at Dartmouth College. Dr. Freeman’s research focuses on the neural mechanisms of person perception — the processes by which the brain extracts information from facial, vocal, and bodily cues.

“Hidden” Emotion Categories: The Social‒Sensory Interface Underlying Person Perception

When we encounter another person, we are simultaneously exposed to multiple categories (e.g., emotion, sex, race), the sight of a face and body more broadly, and often the sound of a voice. To this encounter, we also bring our own social expectations and high-level cognitive states (e.g., motivation, prejudice). How, then, from all this input do we so rapidly perceive other people’s emotions? I will discuss research using neuroimaging, real-time hand movements, and computational simulations to understand this uniquely rich person perception process. Based on converging findings, a neural network model of person perception will be described. It treats person perceptions as the end-result of multiple bottom-up sensory cues and top-down social factors interacting and “compromising” over time. Although through this process perceivers eventually stabilize onto clear-cut emotion categories, both sensory factors (e.g., subtle facial cues) and social factors (e.g., stereotypes) can lead alternate emotion categories to partially activate in parallel. These “hidden” emotion-category activations may have important implications for issues in global security. More broadly, the implications of the social‒sensory interface underlying emotion perception will be discussed.

Philippe Schyns, Head of School and Professor of Psychology at University of Glasgow. Dr. Schyns’ research focuses on the information processing mechanisms of face, object, and scene categorization in the brain.

Transmitting and Decoding Facial Expressions of Emotion

The face expresses a number of signals that the brain can code within a few hundred milliseconds. Amongst these, facial expressions of emotion have been of particular biological importance for the survival of the species. Here, I will discuss the state of the art in understanding what information in the face represents each of the six basic facial expressions of emotion (i.e., happy, surprise, fear, disgust, anger and sadness). We will then review the dynamics of cortical coding of this information, both from event-related potentials and from oscillatory activity. Finally, I will discuss a new approach that generalises the extraction of information to dynamically rendered three-dimensional faces.

Lisa Feldman Barrett, Distinguished Professor of Psychology at Northeastern University, and Director of the Affective Science Institute. Dr. Barrett’s research focuses on neural and psychological processes underlying the construction and perception of emotion.

Reconsidering the Concept of “Emotion Recognition”

When you look at a person’s face, you automatically and effortlessly see someone who is angry, or happy, or afraid. Such experiences have led to a simple and intuitively appealing hypothesis that informs everything from security training programs to instructional segments on Sesame Street: a person’s intent can be read from facial expressions, regardless of the cultural background or life experiences of the perceiver or the target. This view is so deeply ingrained that it forms the basis of undergraduate psychology curricula. In this talk, I will review emerging evidence from laboratory experiments, field studies, lesion studies, and neuroimaging research demonstrating that “emotion recognition” is better understood as “emotion perception,” in which the knowledge, experience, and context of the perceiver strongly influence whether and which emotions are seen in faces. I will also consider the implications of these findings for security training programs in the US.

With discussion and commentary by:
Peter DiDomenica, the former Director of Security Policy at Boston Logan International Airport, where he developed innovative anti-terrorism programs, including the creation of the behavior-based screening program adopted by the Transportation Security Administration.

Posted in Symposia | Comments closed

Emerging Perspectives in Affective Science

A symposium featuring talks by four young investigators.

Monday, June 4, 2012
12:00 – 2:30pm
Alumni Center Pavilion, Northeastern University

Eliza Bliss-Moreau, California National Primate Research Center, University of California, Davis
Comparative Affective Science: What can we learn from nonhuman primates?

This talk will explore the utility and promise of studying affect in nonhuman primates (Rhesus macaques; Macaca mulatta). I will first discuss how two translational metrics can be used to explore individual differences in affective processing in both humans and nonhuman animals. I will then present data from two studies demonstrating that measures of cardiac physiology and behavioral reactivity can be used to assess macaque affective states. Finally, I will address the unique contributions of animal models to the study of affect by presenting data documenting individual differences in macaque affect following experimentally induced changes in brain structure. Together, these findings suggest that animal models of affect can help answer questions about the evolution and fundamental properties of the mind that would be untenable if studying only humans.

Kristen Lindquist, Harvard University Mind/Brain/Behavior Initiative
Emotions emerge from core affect and conceptualization

Emotions form the fabric of memories, social interactions, and culture. They affect our health and our ability to make decisions, and can make or break relationships with others. The existing evidence suggests that emotions are important mental events; yet amid broad agreement about the importance of emotions, there is much disagreement about what they actually are. In this talk, I will weigh in on this question by presenting evidence that emotions are mental states that emerge from the combination of more basic psychological parts that are not specific to emotion. I will present behavioral, psychophysiological, neuropsychological, and neuroimaging evidence demonstrating that emotion experiences and perceptions emerge in consciousness when people use representations of prior experiences to make meaning of body states in a given instance. I close by discussing how such a constructionist model of emotion changes how scientists might think of the mind more generally.

Leah Somerville, Sackler Institute for Developmental Psychobiology, Weill Medical College of Cornell University
Interactions between emotional processes across timescales: The case of fear and anxiety

Psychological accounts have long recognized the diversity of emotional experience in terms of intensity, timescale, and cognitive consequences. However, our understanding of the brain circuitries that support these processes is limited by the type of emotion assayed in the laboratory – which is typically a brief response to a valenced cue. In my talk, I will present approaches my colleagues and I have taken to target anxiety-relevant emotional processes across a broader range of timescales. I will present data demonstrating that anxiety maintenance draws on distinct neural circuitries relative to the detection of anxiety-relevant emotional cues. Further, these circuitries interact across timescales, providing insight into how emotional states can up- or down-regulate moment to moment emotional processes. Finally, I will feature ongoing research considering linkages between neurodevelopmental properties of these circuitries and the staggered emergence of key symptoms of anxiety disorders during the first two decades of life.

Jamil Zaki, Social Cognition & Affective Neuroscience Lab, Department of Psychology, Harvard University
A sensory integration approach to emotion perception

For as long as scientists have studied how people understand others’ minds, they have thought this task must be something like perceiving the physical world. Here I focus on extending this classic simile in a new direction: towards the study of “multi-modal” emotion perception. When encountering complex social cues—as they almost always do—perceivers use multiple processes for understanding others’ emotions. Like physical senses (e.g., vision or audition), emotion perception processes have often been studied as though they operate in relative isolation. In the domain of physical perception, this assumption has broken down, following evidence that perception instead involves pervasive interactions between the senses. Recent data—including those from two studies I will present here—demonstrate that emotion perception processes similarly interact in ways that shape judgments about others’ affective states. These parallels suggest that researchers can leverage insights about physical perception to move towards a more complete understanding of emotion perception. Such a sensory integration approach further offers hints about Bayesian models that could formally describe how people understand each other’s internal states based on complex, multifaceted cues.


Joseph LeDoux

Our first Institute-sponsored event was an interdisciplinary half-day “meet and greet” conference at Northeastern University.

The keynote, by Dr. Joseph LeDoux, was followed by a combined poster session and reception, providing an opportunity for those in the affective science community to meet and mingle.

Thursday, December 1, 2011, from 1:30pm-4:00pm.

  • 1:30-2:00pm, Welcoming Introduction, Lisa Feldman Barrett, Fenway Center
  • 2:00-3:00pm, Keynote, Dr. Joseph LeDoux, Fenway Center
  • 3:10-4:00pm, Poster Session and Reception, Ballroom of the Curry Student Center

  • Contact

    Affective Science Institute
    125 Nightingale Hall
    360 Huntington Ave.
    Boston MA 02115-5000

  • Mailing List

    Join our listserv mailing list to receive emails about ASI events. To subscribe, send an email to with the message "subscribe asi_discussion".