The Conversation
Rachel Bouserhal, Associate Professor, École de technologie supérieure (ÉTS)

Using the body’s own sounds to diagnose Alzheimer’s and Parkinson’s before the first symptoms appear

Alzheimer's and Parkinson's disease are often diagnosed late, after the symptoms are well established. (Unsplash)

Long before the tell-tale signs of Parkinson’s disease appear, including tremors and muscle stiffness, there are other, more subtle signs of the disease.

These include changes in pronunciation and language, and difficulties with breathing or swallowing.

People with Alzheimer’s disease can also experience a reduction in their vocabulary and a tendency to repeat certain words. These also appear well before there is clear memory loss.

Unfortunately, diagnoses are usually made only when the changes observed in someone become significant enough to set them apart from the rest of the population.


This article is part of our ongoing series The Grey Revolution. The Conversation Canada and La Conversation are exploring the impact of the aging boomer generation on Canadian society, including housing, working, culture, nutrition, travelling and health care. Grey Revolution stories, written by experts in their fields, explore the upheavals already underway and those on the horizon.


The importance of early diagnosis

Early diagnosis is a key factor in slowing the progression of these diseases. Some treatments are more effective when administered in the very early stages. Yet unfortunately, diagnoses are often made too late, after the symptoms become obvious.

This is due in part to the subtle and variable nature of the early signs of these diseases, but also to clinical practices that still largely rely on observing visible, advanced symptoms.

As a professor at the École de technologie supérieure (ÉTS) and holder of the Marcelle-Gauvreau Research Chair in Multimodal Health Monitoring and Early Disease Detection, my research focuses on the use of wearable sensors, particularly in-ear devices, to capture and analyze physiological and behavioural health signals.

The aim is to develop tools that can make continuous, non-invasive and personalized health monitoring possible so certain conditions, particularly neurodegenerative diseases, can be detected earlier.

Wearable technologies play a key role in neuropsychology, since they enable us to gather much more information than we can through traditional clinical observation. Thanks to sensors and advanced algorithms, it’s now possible to objectively measure subtle markers related to cognitive and emotional functioning, in real time. This makes it possible to have more precise and personalized clinical practices, which will both complement clinicians’ expertise and promote better understanding of and support for patients.

Each of us is a unique individual, and our cognitive abilities are not all equal from the outset. Our vocabulary, memory, attention, reasoning and visuospatial abilities set us apart from others. Detecting changes against a person's own baseline, rather than against a population average, makes it possible to spot signs of deteriorating health sooner.

Furthermore, an approach that combines multiple signals, rather than relying on a single parameter such as heart rate, allows for a more comprehensive view of how the disease is progressing.

The ear: an unexpected window into our health

To detect these diseases earlier, we are trying an innovative approach: analyzing the acoustic signals the body emits.

When an ear is blocked by an in-ear device, certain internal sounds — heartbeats, breathing, swallowing, speech, and even blinking — are amplified in the lower frequencies. This phenomenon, known as the occlusion effect, can be harnessed with a miniaturized in-ear microphone that captures these otherwise faint signals.
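To illustrate the idea in code (a simplified sketch, not the actual processing used with the in-ear device), a basic low-pass filter can pull a slow, heartbeat-like component out of a mixture that also contains higher-frequency, speech-band content. The signal, sampling rate and window length below are all invented for the example:

```python
import math

def lowpass_moving_average(signal, window):
    """Naive moving-average low-pass filter (illustrative only)."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

# Synthetic "in-ear" mixture: a 1 Hz heartbeat-like component plus
# a 200 Hz speech-band component, sampled at 1 kHz for 2 seconds.
fs = 1000
t = [n / fs for n in range(2 * fs)]
heartbeat = [math.sin(2 * math.pi * 1.0 * x) for x in t]
speech = [0.5 * math.sin(2 * math.pi * 200.0 * x) for x in t]
mixed = [h + s for h, s in zip(heartbeat, speech)]

# A 50-sample (~50 ms) window averages out the 200 Hz component
# while leaving the 1 Hz component almost untouched.
filtered = lowpass_moving_average(mixed, 50)
```

In practice one would use a properly designed digital filter rather than a moving average, but the principle is the same: the low-frequency body sounds amplified by the occlusion effect live in a band that can be isolated from the rest.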

Several of these signals are particularly relevant, as they are affected from the earliest stages of neurodegenerative diseases, yet often remain too subtle to be detected clinically.

For example, the ratio of inhalation to exhalation in people with Parkinson’s disease, or the interactions between breathing and swallowing, are altered very early on, long before symptoms become severe enough to be noticed. Similarly, the eye movements of patients with Alzheimer’s disease, particularly saccades, could be captured using an in-ear microphone and provide valuable information about the progression of the disease.
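As a toy illustration of the first of these markers, the inhalation-to-exhalation ratio is straightforward to compute once breathing phases have been segmented from the recording. The function, labels and durations below are hypothetical, and no clinical threshold is implied:

```python
def ie_ratio(breath_phases):
    """Mean inhalation-to-exhalation duration ratio.

    breath_phases is a list of (label, duration_in_seconds) tuples,
    where label is "in" or "out". Segmentation itself is assumed done.
    """
    inhale = [d for label, d in breath_phases if label == "in"]
    exhale = [d for label, d in breath_phases if label == "out"]
    if not inhale or not exhale:
        raise ValueError("need at least one inhalation and one exhalation")
    return (sum(inhale) / len(inhale)) / (sum(exhale) / len(exhale))

# Hypothetical segmented breathing cycles (seconds).
typical = [("in", 1.5), ("out", 2.5), ("in", 1.6), ("out", 2.4)]
ratio = ie_ratio(typical)
```

The interesting clinical signal is not the ratio on any one breath but how it drifts for a given person over time.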

The device we use was developed as part of the ÉTS-EERS Industrial Research Chair in In-Ear Technologies. It consists of an earpiece fitted with two microphones and a miniaturized loudspeaker. The microphone placed inside the ear canal captures sounds generated by the body. The external microphone and the internal speaker relay outside sounds to the wearer, in order to reduce the discomfort caused by the occlusion effect.

Extracting the right signals

One of the main technical challenges we face is separating the different bodily signals that are captured simultaneously: a person may be speaking while their heart beats and they breathe, so several sound sources overlap in the same recording.

We are exploring several ways to disentangle these overlapping signals, including machine learning—a form of artificial intelligence—as well as audio source separation algorithms. We are also testing complementary technologies, such as photoplethysmography, which measures variations in blood flow to extract heartbeats. These signals, such as heart rate and its derivatives, are particularly interesting as they offer a better understanding of a person’s emotional state.
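To sketch how a heart rate might be read out of such a pulse signal (a deliberately naive version for illustration, not the algorithms used in our studies), one can pick peaks and average the intervals between them. The synthetic waveform and detection threshold below are invented for the example:

```python
import math

def detect_peaks(signal, threshold):
    """Indices of local maxima above threshold (naive peak picking)."""
    return [i for i in range(1, len(signal) - 1)
            if signal[i] > threshold
            and signal[i] > signal[i - 1]
            and signal[i] >= signal[i + 1]]

def heart_rate_bpm(signal, fs, threshold=0.5):
    """Estimate heart rate from the mean interval between pulse peaks."""
    peaks = detect_peaks(signal, threshold)
    if len(peaks) < 2:
        raise ValueError("not enough peaks to estimate a rate")
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic pulse train at 1.2 Hz (72 beats per minute), 100 Hz sampling.
fs = 100
signal = [max(0.0, math.sin(2 * math.pi * 1.2 * n / fs))
          for n in range(10 * fs)]
bpm = heart_rate_bpm(signal, fs)
```

Real in-ear or photoplethysmography recordings are far noisier than this, which is exactly why the source-separation and machine-learning work described above is needed before such a simple estimate becomes reliable.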

We have already demonstrated that it is possible to detect stress from heart signals picked up by an in-ear microphone, particularly during moments of silence and minimal movement. This approach is particularly relevant for people with cognitive impairment, who often struggle to understand speech in noisy environments, even in the absence of hearing loss.
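One widely used way to summarize heart signals for this kind of analysis is heart-rate variability. The sketch below computes RMSSD, a standard variability measure, on made-up interbeat-interval series; the article does not specify the exact features our team uses, so this is only an indicative example:

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between heartbeat
    (RR) intervals, in milliseconds -- a standard heart-rate-variability
    measure. Lower values are often associated with higher stress."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR series: a relaxed recording typically varies more
# from beat to beat than a stressed one.
relaxed = [820, 860, 810, 870, 800, 880]
stressed = [700, 705, 698, 702, 699, 701]
```

Comparing such a measure during quiet moments against a person's own baseline is one plausible way to flag elevated stress, which is the kind of signal the listening-effort studies described below aim to capture.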

We want to go beyond simply measuring their speech comprehension in noisy settings by assessing their stress levels in these situations, as well. This will help determine whether they experience higher levels of stress than their peers without cognitive impairment.

Our team is currently conducting two separate studies to compare the physiological signals of healthy individuals with those of patients with neurodegenerative disorders.

The first study, carried out in collaboration with Parkinson Québec and the Université de Montréal, focuses on patients with Parkinson’s disease and their caregivers. The second, conducted with Montréal’s Douglas Research Centre, aims to collect data from healthy individuals as well as those with Alzheimer’s disease or mild cognitive impairment.

In the medium term, we believe our algorithms will be able to accurately detect whether a person is already showing signs of neurodegenerative disease.

In the longer term, we aim to contribute to a genuine revolution in the field of early diagnosis of Alzheimer’s and Parkinson’s, thereby enabling faster, better targeted, and potentially more effective interventions.


Rachel Bouserhal received funding from the Natural Sciences and Engineering Research Council of Canada (NSERC).

This article was originally published on The Conversation.
