Research on spoken languages has shown that they rely on the human brain’s ability to unconsciously encode patterns in speech in the form of abstract rules. But do those same rules operate in American Sign Language?
Humans favor speech as the primary means of linguistic communication. Spoken languages are so common that many assume language and speech are one and the same. But the prevalence of sign languages suggests otherwise. Not only do Deaf communities generate language using manual gestures, but their languages share some of their design principles and neural mechanisms with spoken languages. New research by Northeastern University professor Iris Berent further underscores the flexibility of human language and its robustness across both spoken and signed channels of communication.