Future of AI

Applying AI to Auditory Neuroscience: How Soundbites is Using RAG and Generative Models to Humanize Hearing Loss Prevention

By Joop Korrel Martin, Chief Technology Officer, Soundbites

When most companies consider artificial intelligence (AI) and how best to use it, the conversation typically revolves around automation, efficiency gains, and cost savings. Commercial players are pouring enormous amounts of capital into AI-powered initiatives to optimize their workflows, slash operational expenses, or generate marketing content faster. We recommend a fundamentally different route, one rooted in education, empathy, and accessibility.

Our mission is to make multiple decades of complex auditory neuroscience not just available but genuinely accessible to people confronting hearing loss and related challenges (e.g., tinnitus, hyperacusis, hidden hearing loss, and dementia). So our approach was not simply to deploy AI to do more with less. Instead, we're using it to communicate better, understand more deeply, and connect more authentically.

That mission led us to build OTIS (now in beta), a conversational AI (currently English-only) that uses a retrieval-augmented generation (RAG) architecture, pairing a domain-specific vector database with OpenAI's GPT models so that every answer is grounded in our curated auditory neuroscience corpus.

In doing so, we uncovered something unexpected. AI isn't just changing how we engage with data; it's transforming how people engage with themselves and their health.

Using RAG to bridge the AI-human gap

The technical foundation of this approach is RAG, which enhances the generative capabilities of large language models (LLMs) like ChatGPT by retrieving domain-specific data at query time and supplying it to the model as context. This lets the AI generate responses grounded in trusted, up-to-date information rather than relying solely on its pretraining knowledge.
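As a minimal illustration of that retrieval-then-generate flow (this is a sketch, not Soundbites' actual pipeline: the corpus snippets, the bag-of-words embedding, and the function names are all invented for the example; a real system would use a learned embedding model and a vector database):

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words vector; stands in for a learned embedding model.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse word-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Hypothetical curated snippets standing in for the embedded corpus.
CORPUS = [
    "Hidden hearing loss involves damage to cochlear synapses that standard audiograms can miss.",
    "Tinnitus is often linked to changes in central auditory gain after noise exposure.",
    "Hyperacusis is a reduced tolerance to everyday sound levels.",
]

def retrieve(query, corpus, k=2):
    # Rank corpus snippets by similarity to the query and keep the top k.
    return sorted(corpus, key=lambda doc: cosine(embed(query), embed(doc)), reverse=True)[:k]

def build_prompt(query, corpus):
    # Retrieved context is injected ahead of the question so the LLM
    # answers from curated evidence rather than pretraining alone.
    context = "\n".join(f"- {c}" for c in retrieve(query, corpus))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"
```

The prompt produced by `build_prompt` would then be sent to the LLM; grounding the generation step in retrieved evidence is what distinguishes RAG from a plain chatbot.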

In our case, that domain-specific data included more than 30 years of translational and basic auditory neuroscience. Data of this kind is largely inaccessible to the general public due to its density, technical language, and publication in academic silos. By carefully curating and embedding hundreds of relevant peer-reviewed studies and internal research documents, we were able to surface insights quickly and succinctly for site visitors.
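Curating a corpus like this typically means splitting each paper into overlapping chunks before embedding, so retrieval can return a focused passage rather than a whole document. A minimal sketch (the window size and overlap here are illustrative choices, not Soundbites' actual parameters):

```python
def chunk(text, max_words=50, overlap=10):
    # Split a document into overlapping word windows; the overlap keeps
    # sentences that straddle a chunk boundary recoverable from either side.
    words = text.split()
    step = max_words - overlap
    return [" ".join(words[i:i + max_words]) for i in range(0, len(words), step)]
```

Each chunk would then be embedded and stored alongside metadata (source study, section) so that retrieved passages can be traced back to the underlying research.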

Why do we recommend RAG? Because it ensures the answers OTIS provides aren't generic; they're rooted in evidence. Responses reflect the hard-won insights of decades of human expertise, reformatted and rephrased into language real people can understand and act on.

AI that meets people where they are

We assumed OTIS users would ask about mechanisms, molecules, and scientific papers, given that the base dataset was built on scientific literature. But once OTIS launched and real interactions began, a different (though not surprising) pattern emerged.

Users were asking deeply personal questions like “Will this help my tinnitus?” or “Can this make a difference for my dad, who’s showing early signs of dementia?”

Obvious as it may seem in hindsight, the shift in thinking was still meaningful for us because it highlighted something often overlooked in AI development. People don't just want information. They want clarity, reassurance, and personalized context. They want to understand how something relates specifically to them.

By combining LLMs with RAG and human-centered interaction design, we made it possible for OTIS not just to answer questions but to understand intent. And that's where AI is already heading: away from transactional interactions and toward real conversations.

Human-centered AI in the health industry

A longstanding challenge in translational medicine is bridging the gap between promising lab research and practical, real-world application. AI has huge potential to be that bridge.

Our chatbot acts as a translator between researchers and readers. Rather than reading a dense 15-page paper on synaptic magnesium buffering, a user can ask OTIS, “What causes hidden hearing loss, and can anything reverse it?” In seconds, they get an explanation that synthesizes rigorous research but is delivered at an accessible reading level.

This model of using AI to humanize complex medical information has big implications, not just for our field of auditory health but for every domain of healthcare where scientific literature far outpaces public understanding. In the U.S. and U.K. alone, where health literacy levels remain low, such AI applications can empower people to take informed action earlier and ultimately improve outcomes.

Toward responsible, domain-specific AI

AI certainly isn't suffering from a shortage of hype. But the next wave of transformation won't come from general-purpose AI tools with flashy interfaces. It will instead come from domain-specific AI systems trained to answer the questions real people are asking. And that's where RAG shines: by grounding generative models in specific, relevant data, it can boost accuracy and increase trust.

However, this approach requires discipline. It's not enough to dump documents into a vector store and call it a day. We stress-tested OTIS across edge cases and tuned the chatbot's tone to balance scientific accuracy with a neutral voice. Monitoring and iterative development are ongoing priorities.
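One concrete form that discipline can take is a grounding check: if nothing sufficiently relevant is retrieved for a query, the system defers rather than letting the model improvise. A hedged sketch (the word-overlap score and the 0.4 threshold are illustrative stand-ins for a real vector-similarity measure; the snippets are invented):

```python
import re

def relevance(query, doc):
    # Crude stand-in for vector similarity: fraction of query words in the doc.
    q = set(re.findall(r"[a-z]+", query.lower()))
    d = set(re.findall(r"[a-z]+", doc.lower()))
    return len(q & d) / len(q) if q else 0.0

def grounded_context(query, snippets, threshold=0.4):
    # Return the best-matching snippet only if it clears the threshold;
    # otherwise signal the caller to defer instead of generating an answer.
    best = max(snippets, key=lambda s: relevance(query, s))
    return best if relevance(query, best) >= threshold else None
```

When `grounded_context` returns `None`, the chatbot can respond with an honest "I don't have reliable information on that" instead of a plausible-sounding guess, which is exactly the failure mode edge-case testing is meant to catch.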

A new kind of competitive advantage

So, while it's tempting to chase flashy outputs and applications, our experience suggests a more enduring strategy: investing in understanding rather than output alone. OTIS improves continuously based on user interactions, but chats are anonymized, securely stored, and reviewed only to improve response accuracy.

By building systems that reflect how people actually think and feel, we can all unlock AI's promise, creating experiences that educate, support, and empower. That's the ultimate ROI, no matter your industry.

AI is already a lens through which we view and redesign the way information is accessed, processed, and understood. We chose to focus that lens on the decades-old communication gap in hearing science and hearing preservation.

We've seen firsthand how AI, when thoughtfully used, doesn't replace people. It better connects us.
