MORE ABOUT THE CONFERENCE

9:00 Welcome and Opening Remarks

9:15 “Hot-Blooded” Entities: How Embedding a Capacity for Emotion into Robots, Virtual Agents, and Avatars Changes the Game When It Comes to Social Interaction

20-minute talks by:
  • Stacy Marsella, “Designing Hot-Blooded Virtual Humans”
  • David DeSteno, “Detecting Trustworthiness: Can I Trust You (Even If You’re a Robot)?”
  • Magy Seif el-Nasr, “Engineering the Social Connection in Virtual Experiences”

Followed by discussion moderated by Andrew Zolli of PopTech

10:40 Coffee Break

11:00 Sensing Feelings, Physiology, and Behaviors: How Mobile and Online Technologies Can Provide New Tools in the Social and Health Arenas

20-minute talks by:
  • Matthew Goodwin, “Understanding Affect and Supporting Behavior Regulation in Individuals on the Autism Spectrum: A Computational Behavioral Science Approach”
  • Emre Demiralp, “Emotion, Context and Tech”
  • Stephen Intille, “Measuring Behavior Using Mobile Phones: New Opportunities”

Followed by discussion moderated by Hiawatha Bray of the Boston Globe

12:30 Lunch – boxed lunch is provided with registration

1:30 Keynote lecture by Arturo Bejar, “Life Happens: People, Emotion, and Facebook”

Followed by discussion moderated by Meghna Chakrabarti, co-host of WBUR’s Radio Boston

3:00-4:00 Reception and Demos

Demos

    • Emotion Recognition in Faces – Dr. Raymond Fu Yun’s lab will show videos of their machine-learning models that classify naturalistic facial images into different categories of emotional expression.
    • ASL Mobile XG Eye-Tracking – The Lifespan Emotional Development Lab will do a live demonstration of mobile eye-tracking and show fixation data from affective environment paradigms in the lab.
    • Ambulatory Brain Imaging – The Interdisciplinary Affective Science Lab will present the use of ambulatory brain imaging via functional Near-Infrared Spectroscopy to image brain areas important in social cognition.
    • Ambulatory Physiology – The Interdisciplinary Affective Science Lab will demonstrate the use of ambulatory peripheral physiological measures (e.g., heart rate, skin conductance, and impedance cardiography) that can be measured in daily life settings.
    • Emotion in Games – The Center for Computer Games Research will demonstrate how emotions are used in games and game design.
    • Social Lens: Google Glass for Autism – Drs. Stephen Intille and Rupal Patel conducted a class focused on the use of Google Glass for personal health informatics. Come see a demo on the use of Google Glass to help children on the autism spectrum improve their social interaction skills.
    • Cerebella: Virtual Reality and Emotion – Dr. Marsella’s Lab will demonstrate Cerebella, a program for generating virtual human performances from speech and text. Cerebella infers underlying communicative functions from the audio and text and generates a performance by selecting appropriate animations and procedural behaviors.
    • Emota: Intelligent Facial Expression Tracking – Dr. Pensyl’s Lab will demonstrate the use of Emota, a program for intelligent facial expression tracking for mobile apps, research, and retail.