The eyes may be windows into the soul, but can we really garner that much information from staring into someone's face? Research suggests maybe not.

A whole business has developed out of training computers to recognize our facial expressions, with companies like Emotient and Affectiva selling facial recognition software that supposedly reveals how focus groups respond to advertisements or how shoppers feel. Agencies like the CIA and the TSA have used the facial emotion research of psychologist Paul Ekman to try to examine the tiniest changes in expression for signs of potential deception or ill intent. Companies like Apple and Google are also working on facial recognition technology, although Google has tried to keep Glass apps facial-recognition free (for now at least).

Yet there's a major issue with training computers (or even people) to read facial expressions to evaluate behavior: Sometimes you just can't tell what's going on inside someone's head by looking at his or her face. Northeastern University professor Lisa Feldman Barrett, a psychologist who studies emotions, writes in The New York Times that her research indicates "that human facial expressions, viewed on their own, are not universally understood."

Read the article at Fast Company →