New AI predicts inner feelings from radio signals
Written by James Orme Mon 15 Feb 2021

AI trained to accurately predict emotions from Wi-Fi-like radio waves
Scientists have developed a mind-reading AI that detects people’s private feelings and emotions from wireless signals alone.
Researchers from Queen Mary University of London used radio waves to measure heart rate and other physiological signals, then applied deep learning techniques to produce a model that accurately predicts how someone is feeling on the inside.
The technique, published last week in PLOS ONE, bypasses ordinary visual and semantic cues to ascertain a person’s state of mind and will raise fresh concerns about the power of algorithms to encroach on individual privacy.
In light of those concerns, the researchers intend to publish new research assessing the ethical implications of artificial intelligence that can breach people’s inner sanctums.
Deep learning
During the study researchers presented participants with videos selected to trigger the primary emotional responses of anger, sadness, joy and pleasure.
The scientists then blasted the volunteers with harmless radio waves and measured the signals that bounced back – mapping body movements to the data to reveal biological responses like heart and breathing rates.
The researchers then amassed enough data to train an AI to infer primary emotional responses from these signals alone.
While researchers have previously used classical machine learning approaches to detect emotions from wireless signals, the Queen Mary scientists are the first to use deep learning techniques.
In the classical approach, hand-crafted features are extracted from the data and passed to algorithms that identify and classify emotional states. Deep learning, by comparison, uses self-learning neural networks that feed on time-dependent raw data, which typically makes them more accurate and versatile predictors, working in a way loosely analogous to the human brain.
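The distinction above can be sketched in code. The snippet below is a minimal, hypothetical illustration — the signals, frequencies and class labels are invented stand-ins, not the study's actual data — of feeding raw time-dependent samples to a small neural network (here scikit-learn's `MLPClassifier`) instead of hand-crafting summary features first:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for radio-reflection data: each sample is a short
# time series whose dominant frequency loosely plays the role of a
# physiological rhythm (e.g. heart or breathing rate).
def make_signal(freq, n_steps=128):
    t = np.linspace(0, 1, n_steps)
    return np.sin(2 * np.pi * freq * t) + 0.1 * rng.standard_normal(n_steps)

# Two hypothetical emotion classes, distinguished here only by rhythm speed.
X = np.array([make_signal(f) for f in [3.0] * 50 + [7.0] * 50])
y = np.array([0] * 50 + [1] * 50)

# A small fully connected network is fed the raw time-dependent samples
# directly, rather than hand-crafted summary features.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

The real study worked with far messier reflected-signal data and a deeper network, but the pipeline shape — raw sequences in, emotion labels out — is the same idea.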
“Deep learning allows us to assess data in a similar way to how a human brain would work looking at different layers of information and making connections between them,” explained Achintha Avin Ihalage, a PhD student at Queen Mary.
“Most of the published literature that uses machine learning measures emotions in a subject-dependent way, recording a signal from a specific individual and using this to predict their emotion at a later stage,” he added.
“With deep learning we’ve shown we can accurately measure emotions in a subject-independent way, where we can look at a whole collection of signals from different individuals and learn from this data and use it to predict the emotions of people outside of our training database.”
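The subject-independent evaluation Ihalage describes can be sketched with a grouped split, where every recording from a given person lands entirely in the training set or entirely in the test set. The snippet below is a minimal illustration with invented data, using scikit-learn's `GroupShuffleSplit`; the subject counts and feature sizes are arbitrary assumptions:

```python
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(1)

# Hypothetical dataset: 10 signal windows per subject, 8 subjects.
n_subjects, windows_per_subject, n_features = 8, 10, 16
X = rng.standard_normal((n_subjects * windows_per_subject, n_features))
subjects = np.repeat(np.arange(n_subjects), windows_per_subject)

# Group-aware split: no subject contributes windows to both sides, so
# the model is always evaluated on people it has never seen.
splitter = GroupShuffleSplit(n_splits=1, test_size=0.25, random_state=0)
train_idx, test_idx = next(splitter.split(X, groups=subjects))

train_people = set(subjects[train_idx])
test_people = set(subjects[test_idx])
print("held-out subjects:", sorted(test_people))
print("no overlap:", train_people.isdisjoint(test_people))
```

A plain random split, by contrast, would scatter each person's windows across both sets, letting the model recognise individuals rather than emotions — the subject-dependent pitfall the quote describes.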
The research opens up opportunities for applications in robotic health assistance and emotional wellbeing, the researchers said.