
Advanced AI able to identify autism speech patterns

Fri 24 Jun 2022

A team of researchers, led by academics at Northwestern University, has been able to uncover speech patterns linked to autism in both English and Cantonese.

It is hoped that this breakthrough, made possible by advances in machine learning, will be one of many that improve our understanding of autism.

There is also potential for this research to be used to make the process of diagnosing autism simpler and therefore more accessible to people, especially at a time when healthcare systems are overloaded in many countries.

“When you have languages that are so structurally different, any similarities in speech patterns seen in autism across both languages are likely to be traits that are strongly influenced by the genetic liability to autism,” said Molly Losh, the Jo Ann G. and Peter F. Dolle Professor of Learning Disabilities at Northwestern.

“But just as interesting is the variability we observed, which may point to features of speech that are more malleable, and potentially good targets for intervention,” she added.

As part of the study, researchers compared speech patterns and characteristics of people both with and without autism spectrum disorder (ASD) who were asked to narrate a picture book called ‘Frog, Where Are You?’.

Differences in rhythm were found in both the English and Cantonese groups. The researchers also see potential to build tools on this work that go beyond simply identifying autistic speech patterns to measuring changes in a speaker’s patterns over time.

The study found that children who have been diagnosed with autism speak more slowly than other children and show notable differences in pitch, intonation and rhythm. Such ‘prosodic differences’ had previously proved difficult for researchers to identify.
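To give a flavour of what a prosodic feature is in practice, the sketch below estimates pitch (fundamental frequency) from a single audio frame using simple autocorrelation. This is a generic illustration only, not the Northwestern team’s actual pipeline; the function name, parameters and the synthetic 220 Hz test tone are all invented for the example.

```python
import numpy as np

def estimate_pitch(frame, sr, fmin=50.0, fmax=400.0):
    """Estimate the fundamental frequency (Hz) of a mono audio frame.

    Uses the autocorrelation peak within a plausible voice-pitch range.
    """
    frame = frame - frame.mean()
    # Full autocorrelation; keep only non-negative lags.
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo = int(sr / fmax)  # shortest plausible pitch period, in samples
    hi = int(sr / fmin)  # longest plausible pitch period
    lag = lo + np.argmax(corr[lo:hi])
    return sr / lag

# Synthetic "voiced" frame: a 220 Hz tone stands in for real speech.
sr = 16000
t = np.arange(int(0.05 * sr)) / sr
tone = np.sin(2 * np.pi * 220 * t)
print(estimate_pitch(tone, sr))  # an estimate close to 220 Hz
```

A real system would compute such features per frame across a whole recording, yielding pitch and rhythm contours whose statistics can then be compared between groups of speakers.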


