Would you like some AI or machine learning with your hearing aids?

Hearing aid manufacturers are increasingly talking about artificial intelligence and machine learning. But what are these two concepts? And how does this technology impact our industry – and your practice?

Artificial intelligence (AI) and machine learning are two concepts we’ve been hearing a lot about, and the hearing aid industry is no exception. A recent article published at hearinghealthmatters.org examines the differences between AI and machine learning in hearing aids and the implications this might have. It also looks at recent survey results for the machine learning-based SoundSense Learn feature, which is available in the WIDEX EVOKE hearing aids.

We’ve asked one of the authors – James W. Martin, Director of Audiological Communications at Widex – to give us his perspective on this ongoing discussion, and what the future holds for hearing aids.

First off, how did AI and machine learning come about?

The AI movement grew out of an attempt to see whether a computer could convince a human that they were talking to another human rather than a machine – that’s how the Turing Test came about in 1950. Since then, computer scientists have pursued the goal of building machines that can perform or mimic human cognitive processes, and that’s how AI and machine learning have become the important technologies they are today.

What’s the difference between AI and machine learning?

AI is an attempt to have a computer perform a task that humans normally do, like driving a car, speaking, or recognizing faces. These are examples of what we consider learned tasks. Machine learning, on the other hand, focuses on continued learning and ongoing problem-solving that can go beyond human capabilities – instead of just mimicking them.

What do AI and machine learning do for hearing aids?

With AI you can, for instance, adjust one hearing aid and have the other side automatically follow that adjustment. You could also use a sound classifier to trigger changes to features and gain in specific environments – that’s another example of AI in hearing aids. Machine learning takes it a step further by learning from a patient’s interactions, intentions, and preferences for how they like to hear in any specific environment. These intentions and preferences can only come from connecting machine learning to humans.

It’s my belief that hearing aids can only use what they can capture. The system’s ability to adjust to the environment and what’s happening in the acoustic landscape can be impaired if the hearing aid can only capture limited information.
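To make the sound-classifier example above concrete, here is a purely hypothetical sketch of how a detected environment might map to feature and gain changes. The environment labels and gain offsets are invented for illustration and are not any manufacturer’s actual presets.

```python
# Hypothetical sketch of the "sound classifier drives features and gain" idea.
# Environment labels and gain offsets (in dB) are invented for illustration
# and do not reflect any manufacturer's real presets.

GAIN_PRESETS = {
    "quiet":      {"low": +2, "mid":  0, "high": +1},
    "speech":     {"low":  0, "mid": +3, "high": +2},
    "noisy_cafe": {"low": -2, "mid": +2, "high": +3},
    "music":      {"low": +1, "mid":  0, "high":  0},
}

def apply_classified_environment(env: str) -> dict:
    """Look up the gain adjustments for whatever environment the classifier reports."""
    return GAIN_PRESETS.get(env, GAIN_PRESETS["quiet"])

print(apply_classified_environment("noisy_cafe"))   # {'low': -2, 'mid': 2, 'high': 3}
```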

Widex recently launched the SoundSense Learn technology. Can you briefly explain what it does?

SoundSense Learn takes a well-researched machine learning approach for individual adjustments of the hearing aid. It compares complex combinations of settings by collecting user input through a simplified interface, so that it’s easy for the user to manage.

Can you give us an example?

If we have three acoustic parameters: low, mid, and high frequencies, and each can be set to 13 different levels, that totals 2,197 combinations of the three settings. Sampling and comparing all of these combinations against each other to find the optimal listening settings would require over two million comparisons. No person could work through that many comparisons to find their optimal setting. But SoundSense Learn can reach the optimal outcome in 20 interactive steps with a human – or fewer.
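For readers who want to check the figures quoted above, a quick back-of-the-envelope calculation in Python confirms them: three bands with 13 levels each give 13³ = 2,197 settings, and exhaustively comparing every pair of settings would take roughly 2.4 million comparisons.

```python
from math import comb

levels_per_band = 13   # each of the three bands can be set to 13 levels
bands = 3              # low, mid, and high frequencies

combinations = levels_per_band ** bands        # 13^3 = 2,197 possible settings
pairwise_comparisons = comb(combinations, 2)   # every setting compared against every other

print(f"{combinations:,} settings")            # 2,197 settings
print(f"{pairwise_comparisons:,} comparisons") # 2,412,306 -> "over two million"
```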

So the listener gets immediate gratification without changing any of the programming that the hearing healthcare professional has done. But the listener has the power to refine and adjust their acoustic settings to meet their specific listening intention in real time – right here, right now. That’s an example of a symbiotic collaboration that goes beyond what a human can achieve alone.
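To give a feel for why so few comparisons can be enough, here is a deliberately simplified, hypothetical sketch of comparison-driven tuning: a toy greedy A/B loop over the three bands. It is not Widex’s actual SoundSense Learn algorithm, and the simulated “listener” with a hidden ideal setting is an assumption made purely for illustration.

```python
import random

# Toy illustration only: a greedy A/B comparison loop over three bands with
# 13 levels each (0-12). This is NOT the actual SoundSense Learn algorithm;
# the "listener" below is a simulated stand-in with a hidden ideal setting.

def listener_prefers(a, b, hidden_ideal=(4, 9, 6)):
    """Simulated A/B judgment: prefer whichever setting is closer to the hidden ideal."""
    dist = lambda s: sum((x - y) ** 2 for x, y in zip(s, hidden_ideal))
    return a if dist(a) <= dist(b) else b

def tune(max_comparisons=20, seed=1):
    """Nudge one band at a time; keep whichever of the two settings wins the A/B question."""
    rng = random.Random(seed)
    current = tuple(rng.randrange(13) for _ in range(3))          # (low, mid, high)
    for i in range(max_comparisons):
        band = i % 3                                              # cycle low -> mid -> high
        candidate = list(current)
        candidate[band] = min(12, max(0, candidate[band] + rng.choice((-2, -1, 1, 2))))
        current = listener_prefers(current, tuple(candidate))     # one A/B question
    return current

print(tune())   # a setting reached after only 20 A/B comparisons
```

The point is scale: a structured sequence of about 20 A/B questions can home in on a good region of the 2,197-setting space without anything close to the millions of exhaustive comparisons mentioned above.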

How does this kind of machine learning benefit the hearing aid user?

It allows a hearing aid user to customize and personalize a soundscape environment to their preference and intention. Previously, a hearing aid user would have to try to remember all of the details of a difficult listening situation so they could explain it to their hearing healthcare professional at a later point. That’s really hard to do.

You’re collecting anonymized data from the SoundSense Learn feature to understand how it’s used. What interesting findings have you made?

One area really worth highlighting is where users created a program so they could reuse their preferred settings – namely in the workplace. We saw 141 programs created and reused by users in their workplaces, and they were all very different.

A couple of years ago, a MarkeTrak study told us that 83% of hearing aid users reported being satisfied with their hearing aids in the workplace. That still leaves 17% who are either not satisfied with the sound in their workplace or would like it to be more customized. SoundSense Learn is a powerful and effective way for them to achieve that goal.

What are some examples of machine learning or AI from the hearing aid industry?

Currently, Widex is the only manufacturer that uses real-time machine learning. I know that other manufacturers are using things like proximity sensors for fall detection and the internet of things, but my concern is that none of these features focus on sound quality, patient preference, or the patient’s listening intentions.

What do these new technologies mean for hearing care professionals?

Hearing technology has evolved, and we must evolve with it. Hearing healthcare providers will not be replaced by technology. But just as in most professions, these providers will need to master upcoming technology to keep pace with the industry and with general technological development.

In your article, you write that in the future, “real-life applications of machine learning and AI go beyond what a human can achieve alone” – what might we be seeing as a result of this in the future of hearing aids?

We are on the cusp of integrating advanced technology into the hearing industry that has never been seen before. I can only imagine what the future holds… but I will leave that up to the incredible engineers in the hearing aid industry.
