Opening new horizons for infant health through machine learning

May 5, 2022 | Innovation, Women in STEM

Most cases of developmental disorders, such as autism, are not diagnosed until around 4 years of age. This leaves caregivers little time to learn about and prepare for the challenges they and their children will face. However, new efforts being developed at Northeastern University will assist with earlier diagnosis of developmental disorders and have future applications in the realm of infant monitoring and safety.

Professor Sarah Ostadabbas of the Electrical and Computer Engineering Department was recently awarded a $600k National Science Foundation CAREER Award to develop machine learning algorithms that can detect signs of future developmental disorders in infants by processing their motor function as captured in everyday videos. The preliminary work on this topic is based on a fruitful collaboration between Ostadabbas’ lab and multidisciplinary researchers at Northeastern and the University of Maine.

Photo by Joshua Brown

As it stands, many developmental disorders, such as autism, are typically not diagnosed until around age four. Ostadabbas is currently working with video data of infants aged 5-10 months collected using an off-the-shelf baby monitor; machine learning algorithms, in particular computer vision models, analyze the infants’ facial and bodily movements and learn from the sampled data. The goal is for the data gathered to enable much earlier diagnosis of these developmental disorders. Right now, getting access to the right data is key. “We don’t have enough data to train these data-hungry algorithms, which are mainly deep learning models, from scratch,” Ostadabbas says. “A lot of my research day-to-day is addressing these data limitations by providing solutions for problems in the small data domains.”
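
The article does not spell out how these small-data limitations are handled in code, but one standard strategy is to start from a model pretrained on large, adult-dominated datasets and fine-tune only part of it on the limited infant footage. The sketch below assumes PyTorch and torchvision’s pretrained keypoint detector; it illustrates that general idea and is not the lab’s actual pipeline.

```python
# Minimal sketch of a common small-data strategy: fine-tune a pose model that was
# pretrained on large, adult-dominated data, updating only its heads on the
# limited infant footage. Illustrative only, not Ostadabbas' actual pipeline.
import torch
from torchvision.models.detection import keypointrcnn_resnet50_fpn

# Keypoint detector pretrained on COCO (mostly adult subjects).
model = keypointrcnn_resnet50_fpn(weights="DEFAULT")

# Freeze the backbone so the small infant dataset only adjusts the detection and
# keypoint heads, reducing the risk of overfitting.
for param in model.backbone.parameters():
    param.requires_grad = False

trainable_params = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable_params, lr=1e-4)

# Training would then loop over (hypothetical) annotated infant frames:
#   loss_dict = model(images, targets)   # targets hold boxes and keypoints
#   sum(loss_dict.values()).backward(); optimizer.step()
```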

While working on the project, Ostadabbas has come across some interesting hurdles. Despite support from study participants, there was still a considerable lack of usable data on infants. “It was surprising how hard it was to find infant data from publicly available sources,” she says. To get around this, Ostadabbas initially turned to existing research on adult facial and bodily movements. “At the moment, a lot of AI models – that are trained to do human behavior recognition, tracking, and pose estimation – are focused on adult data.” To her surprise, that data was mostly incompatible with infants. So, Ostadabbas had to get creative.

Although Ostadabbas has some data on infants, privacy concerns make it difficult to gather enough for the project. To combat this, Ostadabbas and her team have made use of open-source gaming software to create artificial infant avatars. These avatars are made to look and act like infants using footage publicly available online. As she notes, “this method allows us to expand the data we have on hand, and, not only that, it allows us to incorporate diversity in our video data, so we can have different shapes, sizes, races of infants in different environments.”

Of course, synthesizing digital infants for data cannot be a total replacement for human subjects. Instead, Ostadabbas is using domain adaptation methods on synthetic data to “bridge the gap” with existing data on real infants. “For example, if we are looking at a specific behavior, such as the pose of the infant over time (how their limbs are moving around), it doesn’t matter if it comes from synthetic data or real data, so the domain adaptation methods focus on these features rather than the visual differences between synthetic and real domains,” she states. Additionally, Ostadabbas can use “adjacent domain data” to continue to fill those gaps; while the adult data alone was not useful, parts of it were still applicable to the project. Together, these three sets of data (i.e., a small amount of real infant data, synthetic infant data, and real adult data) help paint a full picture for the machine learning algorithms.
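
The article does not name the specific domain adaptation technique, but one widely used family of approaches trains a feature extractor so that a “domain classifier” cannot tell synthetic frames from real ones, while a pose head still predicts keypoints. The sketch below is a minimal, hypothetical illustration of that gradient-reversal idea; the network sizes and keypoint count are placeholders.

```python
# Hedged sketch of one common domain adaptation idea (a gradient-reversal domain
# classifier, in the spirit of DANN). The article does not specify the lab's
# actual method; this only illustrates learning features that cannot distinguish
# synthetic infant frames from real ones, while still predicting pose.
import torch
import torch.nn as nn

class GradReverse(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        # Flip the gradient so the feature extractor learns to *confuse* the
        # domain classifier, pushing it toward domain-invariant features.
        return -ctx.lam * grad_output, None

# Placeholder sizes: 64x64 RGB frames, 17 body keypoints.
feature_net = nn.Sequential(nn.Flatten(), nn.Linear(3 * 64 * 64, 256), nn.ReLU())
pose_head = nn.Linear(256, 17 * 2)   # (x, y) per keypoint
domain_head = nn.Linear(256, 2)      # synthetic vs. real

def forward(images, lam=0.1):
    feats = feature_net(images)
    keypoints = pose_head(feats)                                 # supervised on labeled frames
    domain_logits = domain_head(GradReverse.apply(feats, lam))   # trained on both domains
    return keypoints, domain_logits
```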

Photo by Joshua Brown

Beyond developmental disorders, Ostadabbas’ work has even more potential down the road. “Another thing from studying infant behaviors that has broader applications is commercial AI. Can we actually bring AI to everyday use, like baby monitoring?” This goes beyond existing baby monitoring systems: this AI would be able to understand exactly what the infant is doing. “It would process the specific movements that the infant is doing: are they making risky movements? Are they standing? Climbing out of the crib? Lying on their stomach with their face covered?” Knowing exactly what’s going on with their children would put parents’ minds at ease. In the end, the research being done here has a wide range of potential applications, both medical and consumer-based.
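
The article describes this monitoring capability as a goal rather than an implementation, but a simple way to picture it is a rule running on top of per-frame pose estimates: once keypoints are extracted, a heuristic (or a learned classifier) can flag sustained risky configurations. The toy sketch below uses made-up keypoint names and thresholds purely for illustration.

```python
# Illustrative sketch only: the article describes the idea of flagging risky infant
# poses from a monitor feed, not a concrete implementation. Keypoint names,
# thresholds, and labels here are hypothetical.
from typing import Dict, List

RISKY_POSES = {"prone_face_covered", "climbing"}

def classify_frame(keypoints: Dict[str, tuple]) -> str:
    """Toy heuristic: label a frame from 2D keypoints (name -> (x, y, visibility))."""
    nose_visible = keypoints.get("nose", (0, 0, 0))[2] > 0.5
    # In image coordinates, a smaller y value means higher in the frame.
    hips_above_shoulders = keypoints["hip"][1] < keypoints["shoulder"][1]
    if not nose_visible and not hips_above_shoulders:
        return "prone_face_covered"
    if hips_above_shoulders:
        return "climbing"
    return "ok"

def should_alert(frame_labels: List[str], window: int = 30) -> bool:
    """Alert only if a risky label persists across a full window of recent frames."""
    recent = frame_labels[-window:]
    return len(recent) == window and all(label in RISKY_POSES for label in recent)
```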

For Ostadabbas, the future potential of the work comes from many different angles, which together drive her passion for the project. Not only are the applications incredibly useful and varied, but the algorithms themselves will advance thanks to their ability to work with limited data. “The interesting part that makes me excited is that the impact is both from the application side and the algorithmic side,” she says.

Article by Corey Ortiz