Bringing real life into the lab to make better hearing aids

Authors
  • Dorothy Hardy, COG-MHEAR Research Programme Manager

Conversations are complicated! People move around, change how they speak, and the background noise keeps shifting, and each of these changes feeds back into how everyone else talks and listens. So how can this real life be brought into the laboratory? Prof Volker Hohmann of the University of Oldenburg explained how in a recent talk to the COG-MHEAR teams.

Volker explained that basic hearing tests using pure tones are useful, but they do not give much information about the way in which conversations and hearing aids work in real life. The trouble with measuring in real-life situations is that the conditions cannot be repeated, and repeatability is essential when assessing how hearing technology is working and then working out how to improve it. So Volker and colleagues at the University of Oldenburg have created a lab which can be transformed into any type of space, such as a living room, roadside or cafeteria. The environments are created in virtual reality, and the equipment can even reproduce the annoying way in which railway station announcements are emitted from multiple loudspeakers, bounce off hard surfaces and become garbled due to echoes and delays. Photorealism is not necessary in the lab, but the way in which people and sounds move, and the way they are recorded, is key. These can be captured using a range of sensors and audio-visual equipment, including cameras, and several projectors reproduce the scene. The active virtual reality that is used even recreates eye blinks, the kind of detail needed for a person to feel immersed in a scenario.
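
To see why those multi-loudspeaker announcements end up garbled, here is a minimal sketch in Python (not the Oldenburg lab's actual software) of one signal arriving via several loudspeakers and reflections, each copy with its own delay and attenuation. The sample rate, delays and gains are illustrative assumptions.

```python
# Minimal sketch: the same announcement reaches the listener via several loudspeakers
# and wall reflections, each path with its own delay and attenuation, and the delayed
# copies smear together. Values below are assumptions for illustration only.
import numpy as np

fs = 16000  # assumed sample rate in Hz

def delayed_copy(signal, delay_s, gain, fs):
    """Return the signal delayed by delay_s seconds and scaled by gain."""
    delay_samples = int(round(delay_s * fs))
    out = np.zeros(len(signal) + delay_samples)
    out[delay_samples:] = gain * signal
    return out

def mix_paths(signal, paths, fs):
    """Sum the direct sound and every delayed, attenuated copy (loudspeakers + echoes)."""
    copies = [delayed_copy(signal, d, g, fs) for d, g in paths]
    length = max(len(c) for c in copies)
    mixed = np.zeros(length)
    for c in copies:
        mixed[:len(c)] += c
    return mixed

# Hypothetical announcement: one second of noise standing in for speech.
announcement = np.random.randn(fs)

# (delay in seconds, gain) for the nearest loudspeaker, a distant one, and two echoes.
paths = [(0.000, 1.0), (0.120, 0.7), (0.250, 0.4), (0.400, 0.3)]

garbled = mix_paths(announcement, paths, fs)
print(f"Mixed {len(paths)} propagation paths into {len(garbled) / fs:.2f} s of audio")
```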

The lab is useful for working out the benefits of different types of hearing aid software in diverse noisy situations. One aspect of the work is about movement: the way in which you move when wearing a hearing aid has a significant effect on what you hear in different environments. (Maartje Hendrikse's 2020 paper gives more information: link) Keeping the environments controlled makes it possible to compare different hearing aids and software settings against control conditions. The aim is to get the maximum benefit from a hearing aid regardless of the way in which you move.
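
As a rough illustration of why movement matters, the sketch below assumes an idealised front-facing cardioid directional microphone and shows how turning the head away from a talker shrinks the level advantage of the talker over a noise source off to the side. The directivity pattern, angles and dB figures are assumptions for illustration, not results from the Hendrikse paper or the Oldenburg lab.

```python
# Minimal sketch: a hearing aid with a forward-pointing cardioid microphone favours
# whatever the wearer faces, so turning away from the talker reduces the talker's
# level advantage over a noise source. All values are illustrative assumptions.
import numpy as np

def cardioid_gain(angle_deg):
    """Amplitude gain of an ideal cardioid microphone for a source at angle_deg from straight ahead."""
    return 0.5 * (1.0 + np.cos(np.radians(angle_deg)))

talker_angle = 0.0   # talker straight ahead of the listener's starting orientation
noise_angle = 120.0  # noise source behind and to the side

for head_turn in (0, 30, 60, 90):  # degrees the listener turns away from the talker
    target_gain = cardioid_gain(talker_angle - head_turn)
    noise_gain = cardioid_gain(noise_angle - head_turn)
    # Level difference in dB, treating the values as amplitude gains
    advantage_db = 20 * np.log10(target_gain / noise_gain)
    print(f"head turned {head_turn:3d} deg: talker-vs-noise advantage {advantage_db:+.1f} dB")
```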

The work on hearing has expanded into other areas, such as the development of lip-synching techniques for animated characters. Hearables are also being developed: 'smart' in-ear devices whose functions include enhancing speech and cancelling noise. Volker's talk gave some details of the way in which new hearing technology is being developed and how anyone can get involved: the teams at Oldenburg have developed an open-access hearing aid software development platform. It can run on a laptop or a device similar to a Raspberry Pi to make it as accessible as possible: link.
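
To give a flavour of the kind of processing such devices and platforms let people experiment with, here is a minimal speech-enhancement sketch using basic spectral subtraction: a noise spectrum estimated from a speech-free stretch is subtracted from each short-time frame. The frame sizes, floor factor and test signal are assumptions for illustration in Python; this is not code from the Oldenburg platform.

```python
# Minimal sketch of one classic speech-enhancement step: spectral subtraction.
# A noise magnitude spectrum is estimated from the (assumed speech-free) start of the
# recording and subtracted from every short-time frame before resynthesis.
import numpy as np

def spectral_subtraction(noisy, noise_estimate_frames, frame_len=512, hop=256):
    """Very simple magnitude spectral subtraction with overlap-add resynthesis."""
    window = np.hanning(frame_len)

    # Average noise magnitude spectrum over the first few frames
    noise_mag = np.zeros(frame_len // 2 + 1)
    for i in range(noise_estimate_frames):
        frame = noisy[i * hop:i * hop + frame_len] * window
        noise_mag += np.abs(np.fft.rfft(frame))
    noise_mag /= noise_estimate_frames

    out = np.zeros(len(noisy))
    for start in range(0, len(noisy) - frame_len, hop):
        frame = noisy[start:start + frame_len] * window
        spec = np.fft.rfft(frame)
        # Subtract the noise magnitude, keeping a small floor to limit artefacts
        clean_mag = np.maximum(np.abs(spec) - noise_mag, 0.05 * np.abs(spec))
        clean = np.fft.irfft(clean_mag * np.exp(1j * np.angle(spec)), n=frame_len)
        out[start:start + frame_len] += clean * window  # overlap-add
    return out

# Hypothetical input: one second of noise followed by a noisy tone standing in for speech.
fs = 16000
t = np.arange(fs) / fs
noisy_signal = 0.3 * np.random.randn(2 * fs)
noisy_signal[fs:] += np.sin(2 * np.pi * 440 * t)

enhanced = spectral_subtraction(noisy_signal, noise_estimate_frames=20)
print("Enhanced signal length:", len(enhanced))
```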

The research in Oldenburg is funded by the grant DFG SFB 1330 Hearing Acoustics (HAPPAA).