Experimental earphones can detect ear infections with a chirp

A photo of Samsung Galaxy Buds Pro earbuds worn in the ear (Image: Gizmodo)

Remember how your parents tried to convince you to eat vegetables when you were a kid by promising that they were good for your health? This is the same tactic that a lot of wearable makers are using today, by adding health tracking features to devices like smartwatches. Now, researchers have developed a way for earbuds to track your ear health too.

Every time Apple holds an event, it spends a few minutes promoting the health benefits of wearing an Apple Watch, whose heart rate tracking features can spot heart problems before they become serious complications. Likewise, the long-awaited update to Apple's AirPods Pro wireless earbuds is rumored to include body temperature measurement, allowing the devices to detect fever: an early symptom of a myriad of other conditions.

It turns out that earbuds' inherent capability, shooting sound into your ears, also allows them to detect conditions affecting the inner ear and ear canal, as University at Buffalo researchers found with their experimental device, EarHealth.

EarHealth system diagram

What's most interesting about EarHealth is that it relies on more or less off-the-shelf earbud hardware, though with an upgraded microphone designed to pick up sounds inside the ear rather than around the wearer. Based on images of the prototype, EarHealth does not appear to use wireless earbuds, although the official statement about the research on the University at Buffalo website specifically refers to Bluetooth earbuds, and that's a good thing, because none of us want to go back to wires.

Where the Apple Watch uses visual detection tricks to monitor heart health, EarHealth uses audio instead. The earbuds emit a quick chirp that reverberates through the ear canal, producing unique sounds and echoes that are picked up by the microphone. The captured sounds are then processed by a dedicated app on a connected smartphone, which relies on a deep learning algorithm to build a profile of the user's inner ear architecture.
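The probe-and-listen step could be sketched roughly like this. The chirp parameters, sample rate, and the use of cross-correlation to estimate the ear canal's echo response are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def make_chirp(f0=100.0, f1=8000.0, duration=0.05, fs=48000):
    """Generate a short linear frequency sweep ("chirp") to play through the earbud."""
    t = np.arange(int(duration * fs)) / fs
    k = (f1 - f0) / duration  # sweep rate in Hz per second
    return np.sin(2 * np.pi * (f0 * t + 0.5 * k * t ** 2))

def echo_profile(recorded, chirp):
    """Cross-correlate the microphone recording with the emitted chirp.

    The correlation peaks approximate the ear canal's echo response:
    how strongly, and after what delay, the chirp bounced back.
    """
    corr = np.correlate(recorded, chirp, mode="valid")
    return corr / (np.linalg.norm(recorded) * np.linalg.norm(chirp) + 1e-12)

# Simulated recording: the direct chirp plus a delayed, attenuated echo.
fs = 48000
chirp = make_chirp(fs=fs)
recorded = np.zeros(len(chirp) + 960)
recorded[: len(chirp)] += chirp                   # direct path
recorded[480 : 480 + len(chirp)] += 0.3 * chirp   # echo arriving 10 ms later
profile = echo_profile(recorded, chirp)
print(profile.argmax())  # strongest correlation at the direct-path lag (0)
```

A profile computed this way is just a vector of correlation values; EarHealth's deep learning model would consume something along these lines to characterize the ear's geometry.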

The first chirp is played while the user is healthy to create a baseline profile of their inner ear; subsequent chirps, which can be scheduled regularly, create profiles that are compared against the baseline to identify differences. These differences can be used to diagnose one of three conditions: earwax blockage, a ruptured eardrum, and otitis media, a common infection or inflammation of the middle ear often triggered by a cold or sore throat.
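The compare-to-baseline idea can be illustrated with a toy similarity check. EarHealth actually feeds the profiles to a deep learning classifier that distinguishes the three conditions; the simple cosine-similarity threshold below is a hypothetical stand-in to show the concept, and the example vectors are made up:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two echo profiles (1.0 = identical shape)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def check_against_baseline(baseline, current, threshold=0.9):
    """Flag a scan whose echo profile has drifted from the healthy baseline.

    The real system classifies the specific condition (earwax blockage,
    ruptured eardrum, otitis media); this only flags that something changed.
    """
    score = cosine_similarity(baseline, current)
    return ("ok" if score >= threshold else "see a doctor", score)

baseline = np.array([1.0, 0.6, 0.3, 0.1])       # hypothetical healthy-ear profile
healthy_scan = baseline + 0.01                  # small day-to-day drift
blocked_scan = np.array([1.0, 0.1, 0.8, 0.5])   # hypothetical changed echo pattern

print(check_against_baseline(baseline, healthy_scan))  # flagged "ok"
print(check_against_baseline(baseline, blocked_scan))  # flagged "see a doctor"
```

The design choice worth noting is that each user is compared against their own baseline rather than a population average, which sidesteps the fact that every ear canal has a different shape.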

In tests on 92 users (27 healthy people, 22 with a ruptured eardrum, 25 with confirmed otitis media, and 18 with obstructive earwax), EarHealth achieved a diagnostic accuracy of 82.6%, a figure that could still improve. The researchers are optimizing both the hardware and the user sample base. The benefit of AI-powered algorithms is that they keep improving, becoming more accurate at making diagnoses as more sample data becomes available.
