Future of AI

“Virtual product placement is not just about slapping a logo on a billboard. It should fit an emotional tone,” says Yash Chaturvedi, Amazon product manager, on the VPP marketplace

Everybody keeps talking about the Virtual Product Placement (VPP) marketplace. If you aren’t yet, you probably will be once you know what it is. Basically, this platform connects content creators and media rights holders with advertisers for seamless, AI-driven in-content product placement. That includes adding branded items—like soda cans, posters, or billboards—into films, TV shows, or streaming content, either in post-production or even live.

Amazon was the first to embrace this technology, with Yash Chaturvedi as the product owner. Yash had previously built an impressive career in artificial intelligence and machine learning at Home Depot, where he worked on innovative visual search and recommendation systems.

For the last several years, he has worked on Amazon Prime Video Advertising, in the department responsible for ads in streaming content.

In this interview, Yash talks about the birth of the idea, how it evolved, and where this technology is headed.

Tell us how the idea of creating VPP came about. Were you guided by market demand, or was it an initiative of Amazon management?

YC: I think it came from both sides. It naturally started as a response to a major shift in viewer behavior: people were watching more streaming content yet tuning out traditional ads. In early 2020, based on my work experience and market research, we realized there was a growing need for less intrusive, contextually relevant advertising. That pushed us to dig deeper into the topic. My next step was to conduct a series of user studies examining how ad-free platforms presented a challenge to brands seeking visibility there. The insights validated my hunch that a new format was needed.

Meanwhile, Amazon was also looking for a solution to these emerging challenges, focusing primarily on innovations within Prime Video and Freevee. So VPP wasn’t simply top-down or bottom-up, but more a confluence of ideas: we had a compelling concept, and Amazon leadership recognized it was in line with viewer preferences.

How did you convince Amazon to invest in this product, and what key arguments helped you secure support?

YC: When we pitched VPP to senior leadership, we highlighted two pivotal factors that affected two main players in this game—viewers and advertisers. For the first group, the argument was that by integrating products seamlessly, we would maintain the core viewing experience. Early internal mock-ups showed how subtle in-scene ads were less disruptive than pre-roll or mid-roll commercials. As for the second group, we showed how dynamic placement could open up brand opportunities even after a show had been filmed.

In my opinion, success hinged on a visible proof of concept, so we ran a series of small-scale tests in Amazon Originals. For instance, we inserted a branded product into a scene from Bosch: Legacy, measured viewer reception and advertiser value, and then used that data to make the case. It demonstrated not only the viability of VPP but also how it could enhance brand visibility without traditional ad breaks.

What AI and machine learning technologies are used to enhance content personalization and improve the accuracy of product placements in VPP?

YC: Fortunately, I have wide-ranging experience managing AI products and implementing machine learning technologies, and I applied everything I knew to this project. From the outset, we deployed computer vision and deep learning algorithms to detect suitable on-screen “real estate” in every frame, be it a billboard in the background or a table in the foreground. This helps us identify where brand imagery can be inserted without disrupting the narrative flow. Once those opportunities are flagged, our context analysis models determine scene sentiment and thematic alignment so the placement feels organic. For instance, a branded energy drink might appear in a dynamic gym scene rather than at a calm family dinner.
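
To make that flow concrete, here is a minimal sketch of the matching step described above: placement slots found by the vision models are paired with brand assets whose tone constraints fit the scene. The class names, fields, and the “VoltCharge” brand are illustrative assumptions, not Amazon’s actual code.

```python
from dataclasses import dataclass

@dataclass
class PlacementSlot:
    frame_index: int
    region: tuple         # (x, y, width, height) of the candidate surface in the frame
    surface_type: str     # e.g. "billboard", "tabletop", "poster"
    scene_sentiment: str  # e.g. "energetic", "calm", "tense"

@dataclass
class BrandAsset:
    brand: str
    category: str              # e.g. "energy_drink"
    allowed_sentiments: set    # scene tones the brand wants to appear in

def match_assets(slots, assets):
    """Pair each detected slot with brand assets whose tone constraints fit the scene."""
    matches = []
    for slot in slots:
        for asset in assets:
            if slot.scene_sentiment in asset.allowed_sentiments:
                matches.append((slot, asset))
    return matches

if __name__ == "__main__":
    slots = [
        PlacementSlot(1200, (640, 120, 200, 150), "billboard", "energetic"),
        PlacementSlot(3400, (300, 500, 120, 80), "tabletop", "calm"),
    ]
    assets = [BrandAsset("VoltCharge", "energy_drink", {"energetic"})]
    # The energy drink matches only the energetic gym scene, never the calm dinner.
    for slot, asset in match_assets(slots, assets):
        print(f"Insert {asset.brand} on the {slot.surface_type} at frame {slot.frame_index}")
```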

We also use multimodal AI for script and dialogue parsing, ensuring the inserted product complements the character interactions or setting. This intelligence is combined with our advertiser-matching technology, which cross-references brand preferences, campaign objectives, and even real-time performance data to pick the most relevant sponsors for each moment. Finally, a brand creative insertion layer manages asset rendering and updates: if a logo changes or a new campaign launches, we can seamlessly refresh the in-frame elements via our global post-production pipeline—all powered by machine learning that continues to refine placements based on feedback loops and viewer engagement metrics. Since this technology is quite attractive, other companies have already begun using it.
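
The sponsor-selection step Yash describes can be read as a ranking problem. The sketch below shows how brand preferences, campaign objectives, and live performance data might be combined into a single score; the weights, field names, and brands are assumptions for illustration only, not Amazon’s pipeline.

```python
def score_sponsor(slot, campaign, performance):
    """Return a relevance score for inserting this campaign into this slot."""
    # Hard constraint: the brand must allow this scene tone at all.
    if slot["sentiment"] not in campaign["allowed_sentiments"]:
        return 0.0
    # Soft signal: genre affinity taken from the campaign brief.
    genre_fit = 1.0 if slot["genre"] in campaign["target_genres"] else 0.3
    # Feedback loop: engagement observed on this brand's earlier placements.
    recent_ctr = performance.get(campaign["brand"], 0.0)
    return 0.6 * genre_fit + 0.4 * recent_ctr

def pick_sponsor(slot, campaigns, performance):
    """Choose the highest-scoring campaign for a slot, or None if nothing fits."""
    ranked = sorted(campaigns, key=lambda c: score_sponsor(slot, c, performance), reverse=True)
    if ranked and score_sponsor(slot, ranked[0], performance) > 0:
        return ranked[0]
    return None

if __name__ == "__main__":
    slot = {"sentiment": "energetic", "genre": "crime_drama"}
    campaigns = [
        {"brand": "VoltCharge", "allowed_sentiments": {"energetic"}, "target_genres": {"crime_drama"}},
        {"brand": "CalmTea", "allowed_sentiments": {"calm"}, "target_genres": {"family"}},
    ]
    performance = {"VoltCharge": 0.12}  # click-through seen on earlier placements
    print(pick_sponsor(slot, campaigns, performance))
```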

From a product management perspective, how did you scale the technology to make it global and accessible for thousands of videos?

YC: With my experience, I have realized that scaling requires a modular approach, especially when solving a new AI problem. We designed scene detection, brand matching, and post-production rendering as separate components, each built around its own CVML models. Just as critical, we created robust internal tools so content and advertising teams could manage and approve placements themselves. By enabling them to upload asset files or configure brand guidelines on the back end, we kept the process smooth as we rolled out VPP to more shows, genres, and countries.
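
As a rough illustration of that modularity (an assumption about how such a pipeline might be organized, not Amazon’s internal design), each stage can share a narrow interface so it can be built, tested, and scaled independently:

```python
from abc import ABC, abstractmethod

class PipelineStage(ABC):
    @abstractmethod
    def run(self, payload: dict) -> dict:
        """Consume the upstream payload and return an enriched payload."""

class SceneDetection(PipelineStage):
    def run(self, payload: dict) -> dict:
        payload["slots"] = [{"frame": 1200, "surface": "billboard"}]  # stub output
        return payload

class BrandMatching(PipelineStage):
    def run(self, payload: dict) -> dict:
        payload["matches"] = [(s, "VoltCharge") for s in payload["slots"]]  # stub output
        return payload

class Rendering(PipelineStage):
    def run(self, payload: dict) -> dict:
        payload["rendered"] = [f"{brand} composited at frame {s['frame']}"
                               for s, brand in payload["matches"]]
        return payload

def run_pipeline(stages, title_id: str) -> dict:
    payload = {"title_id": title_id}
    for stage in stages:  # each stage can live behind its own service boundary
        payload = stage.run(payload)
    return payload

if __name__ == "__main__":
    result = run_pipeline([SceneDetection(), BrandMatching(), Rendering()], "example-title")
    print(result["rendered"])
```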

You mentioned that other companies have already started using your technology. How have they adapted or expanded upon your idea? Which of your patented solutions have they implemented?

YC: Shortly after we debuted VPP in May 2022, some major industry players and emerging ad-tech startups began experimenting with similar post-production insertion techniques. For example, by 2023, NBCUniversal’s Peacock had introduced its own “in-scene” ad format for select titles, which, much like VPP, uses technology to seamlessly embed branded visuals into content after filming. Meanwhile, a few VPP startups have secured substantial funding rounds to further develop their own virtual placement systems, often harnessing advanced machine learning and computer vision for scene detection and contextual matching. Many of these companies have independently implemented approaches that mirror key elements of our dynamic post-production engine, including real-time asset rendering.

Amazon was the first company to implement these technologies, but it appears to be a very competitive environment now. What, in your opinion, could still set VPP apart from the others?

YC: I think we focus more on image quality and dynamics, while keeping real-time post-production integration in mind. I am sure that virtual product placement is not just about slapping a logo on a billboard or cramming in as many brands as you can in 30 minutes. Our approach customizes placements to fit the emotional tone and theme of the scene.

Some of these platforms have also added interactive overlays or personalization layers, allowing viewers to engage with or purchase featured items immediately. It’s exciting to see how quickly the broader landscape is evolving—these parallel efforts underscore the growing demand for digital product placements, and they point to a future where on-screen advertising is more seamlessly integrated, data-driven, and responsive to audience preferences.

How do you see the future of VPP in the context of evolving streaming services and shifting viewer preferences? What new opportunities and directions do you consider promising for this technology?

YC: Viewers have high expectations for convenience and relevance in the streaming industry. I see VPP evolving toward real-time interaction in the near term, as 70% of people watch streaming on multiple devices every day. They will be able to pause a show, find out about a featured item, and purchase it.

Longer term, as augmented and virtual reality gain traction, there’s a possibility of immersive placements that adapt to the viewer’s perspective—something we’ve only begun exploring. Additionally, as streaming platforms continue to diversify by offering ad-free tiers, ad-supported tiers, and hybrid models, there will be a growing need for flexible, carefully integrated advertising solutions.

Overall, we’re still in the early days. However, I’m confident that as technology evolves, VPP will continue offering more intuitive and meaningful ways to connect brands, creators, and audiences—ultimately reshaping the way the industry thinks about on-screen advertising. Almost certainly, we will see it in the near future.
