Hume AI launches world's first Empathic Voice Interface, enabling developers to integrate an emotionally intelligent AI voice into applications across health and wellness, AR/VR, customer service call centers, healthcare, and more, with just a few lines of code.
NEW YORK–(BUSINESS WIRE)–Hume AI ("Hume" or the "Company"), a startup and research lab building artificial intelligence optimized for human well-being, today announced it has raised a $50M Series B. The round, led by EQT Ventures, will support the debut and continued development of Hume's new flagship product: an emotionally intelligent voice interface that can be built into any application. Union Square Ventures, Nat Friedman & Daniel Gross, Metaplanet, Northwell Holdings, Comcast Ventures, and LG Technology Ventures also participated in the round.
Hume AI was founded by Dr. Alan Cowen, a former Google researcher and scientist best known for pioneering semantic space theory, a computational approach to understanding emotional experience and expression that has revealed nuances of the voice, face, and gesture now understood to be central to human communication globally. The Company, which operates at the intersection of artificial intelligence, human behavior, and health and well-being, has created an advanced API toolkit for measuring human emotional expression that is already used in industries spanning robotics, customer service, healthcare, health and wellness, user research, and more.
In connection with the fundraise, Hume AI has released a beta version of its flagship product, an Empathic Voice Interface (EVI). The emotionally intelligent conversational AI is the first to be trained on data from millions of human interactions to understand when users are finished speaking, predict their preferences, and generate vocal responses optimized for user satisfaction over time. These capabilities will be available to developers with just a few lines of code and can be built into any application.
AI voice products have the potential to revolutionize how we interact with technology; however, the stilted, mechanical nature of their responses remains a barrier to truly immersive conversational experiences. The goal of Hume's EVI is to provide the basis for engaging voice-first experiences that emulate the natural speech patterns of human conversation.
Hume's EVI is built on a new form of multimodal generative AI that integrates large language models (LLMs) with expression measures, which Hume refers to as an empathic large language model (eLLM). The company's eLLM enables EVI to adjust the words it uses and its tone of voice based on the context and the user's emotional expressions. EVI also accurately detects when a user is ending their conversational turn so it can begin speaking, stops speaking when the user interrupts, and generates rapid responses in real time with latency under 700 ms, allowing for fluid, near-human-level conversation. With a single API call, developers can integrate EVI into any application to create state-of-the-art voice AI experiences.
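To illustrate what a voice-interface integration of this kind might look like, here is a minimal Python sketch of the client side of a bidirectional audio conversation. The endpoint URL, message types, and field names below are illustrative assumptions for this example, not Hume's documented API; consult the actual EVI developer documentation for the real message schema.

```python
# Hypothetical sketch of a client for an empathic voice interface like EVI.
# The endpoint, message shapes, and field names are assumptions made for
# illustration only; they are not taken from Hume's API reference.
import base64
import json

EVI_ENDPOINT = "wss://api.example.com/evi/chat"  # placeholder URL


def build_audio_message(pcm_bytes: bytes) -> str:
    """Wrap a chunk of raw microphone audio as a JSON message to stream
    to the voice interface over a websocket."""
    return json.dumps({
        "type": "audio_input",
        "data": base64.b64encode(pcm_bytes).decode("ascii"),
    })


def handle_server_message(raw: str) -> str:
    """Extract the assistant's text from a server event, if present.

    Returns an empty string for events the client does not render
    (e.g. interim expression measures or end-of-turn signals)."""
    event = json.loads(raw)
    if event.get("type") == "assistant_message":
        return event.get("text", "")
    return ""
```

In a real integration, a websocket client would stream microphone chunks through `build_audio_message` and play back the audio the server returns; the helpers above only show the JSON framing such a loop would use.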
"Hume's empathic models are the crucial missing ingredient we've been looking for in the AI space," said Ted Persson, Partner at EQT Ventures, who led the investment. "We believe that Hume is building the foundational technology needed to create AI that truly understands our wants and needs, and are particularly excited by Hume's plan to deploy it as a universal interface."
"The main limitation of current AI systems is that they're guided by superficial human ratings and instructions, which are error-prone and fail to tap into AI's vast potential to come up with new ways to make people happy," Hume AI founder Alan Cowen explained. "By building AI that learns directly from proxies of human happiness, we're effectively teaching it to reconstruct human preferences from first principles and then update that knowledge with every new person it talks to and every new application it's embedded in."
"What sets Hume AI apart is the scientific rigor and unprecedented data quality underpinning their technologies," said Andy Weissman, managing partner at Union Square Ventures. "Hume AI's toolkit supports an exceptionally wide range of applications, from customer service to improving the accuracy of medical diagnoses and patient care, as Hume AI's collaborations with Softbank, Lawyer.com, and researchers at Harvard and Mt. Sinai have demonstrated."
The growing Hume AI team currently comprises 35 leading researchers, engineers, and scientists advancing Dr. Cowen's work on semantic space theory. His research, published in numerous leading journals including Nature, Nature Human Behaviour, and Trends in Cognitive Sciences, involves the widest range and most diverse samples of emotions ever studied and informs Hume's data-driven approach to creating more empathic AI tools. Hume's technology leverages these research advances to learn from the tune, rhythm, and timbre of human speech, "umms" and "ahhs" and laughs and sighs, and nonverbal signals to improve human-computer interactions.
"Alan Cowen's research has transformed our understanding of the rich languages of emotional expression in the voice, face, body, and gesture," said Dacher Keltner, a leading emotion scientist. "His work has opened up entire fields of inquiry into understanding the emotional richness of the voice and the subtleties of facial expression."
About Hume AI
Hume's goal is to enable a future in which technology draws on an understanding of human emotion to better serve human goals. As part of its mission, the company conducts groundbreaking scientific research, published in leading scientific journals, and supports a non-profit, The Hume Initiative, which has released the first concrete ethical guidelines for empathic AI.