As AI increasingly intermediates discovery, comparison and recommendation, the primary consumer of product data is no longer a human browsing a page, but a machine interpreting, filtering and deciding on their behalf. In this new environment, product data has moved from supporting the experience to being the experience. This shift demands a move from product data as stored truth to product data as foundational machine-ready truth.
In traditional eCommerce, brands and retailers could compensate for imperfect data. A strong UX, compelling imagery or persuasive copy could bridge gaps in product information. Discovery was as much about presentation as it was about accuracy.
This safety net no longer exists. AI agents do not browse in the human sense, nor do they infer intent from visual design or emotional cues; they ingest structured inputs, evaluate attributes, compare options and generate outputs based on what they can reliably interpret. If product data is incomplete, inconsistent or ambiguous, it is not misunderstood; it is ignored.
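To make the point concrete, here is a minimal sketch of that behaviour; the product records, attribute names and requirements are entirely hypothetical:

```python
# Illustrative sketch: an agent-style filter that matches products against
# structured requirements. Records and attribute names are hypothetical.

products = [
    {"sku": "A1", "capacity_l": 0.5, "material": "stainless steel", "price_gbp": 24.0},
    {"sku": "B2", "capacity_l": None, "material": "stainless steel", "price_gbp": 19.0},   # incomplete
    {"sku": "C3", "capacity": "500ml", "material": "Stainless-Steel", "price_gbp": 21.0},  # inconsistent keys and format
]

requirements = {"capacity_l": 0.5, "material": "stainless steel"}

def matches(product: dict, requirements: dict) -> bool:
    """A product qualifies only if every required attribute is present and equal."""
    return all(product.get(key) == value for key, value in requirements.items())

shortlist = [p["sku"] for p in products if matches(p, requirements)]
print(shortlist)  # ['A1'] -- B2 and C3 are not misread; they are silently dropped
```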
This is already visible in the rise of AI-mediated discovery. A growing proportion of consumers now begin their journey in large language models rather than search engines. According to Capital One research, 76% of consumers want AI-powered shopping assistants, 71% want to shop with help from generative AI, and 58% use generative AI instead of traditional search to find recommendations; nearly 60% have already used AI to shop. These statistics signal a fundamental shift in how purchase decisions are made.
As this continues, visibility will no longer be driven by paid media, SEO tactics or front-end optimisation, but by data quality.
Brands and retailers will be optimising for machines, and this requires a new approach to machine legibility. For a product to be selected by an AI agent, it must be structured, with attributes that are standardised, comparable and consistently formatted. It must be complete, because missing data is not a minor flaw but a disqualifier. It must be explainable in terms of features and benefits so models can reason about constraints, and accessible in real time through interoperable systems and protocols.
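A minimal sketch of what such a machine-legible record might look like; the fields, units and values are illustrative rather than a prescribed schema:

```python
from dataclasses import dataclass, asdict

# Illustrative sketch of a machine-legible product record: every attribute is
# typed, consistently named and unit-qualified, so an agent can compare it
# directly against another merchant's record. Field names are hypothetical.

@dataclass
class ProductRecord:
    sku: str
    name: str
    category: str
    price_gbp: float
    weight_kg: float
    dimensions_cm: tuple[float, float, float]
    in_stock: bool
    features: list[str]          # explainable benefits a model can reason over

kettle = ProductRecord(
    sku="KET-100",
    name="1.7L Rapid-Boil Kettle",
    category="kitchen/kettles",
    price_gbp=39.99,
    weight_kg=1.2,
    dimensions_cm=(22.0, 16.0, 25.0),
    in_stock=True,
    features=["boils one litre in under 60 seconds", "auto shut-off"],
)

print(asdict(kettle))  # a complete, standardised payload a machine can act on
```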
Emerging commerce protocols such as MCP, A2A, UCP and ACP are formalising how AI agents access, interpret and transact on product data. Whether through open discovery models or curated ecosystems, the principle is consistent: machines require clean, structured and trusted data to operate.
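As one illustration of this direction, the sketch below assumes the MCP Python SDK's FastMCP helper and exposes a single catalogue lookup as a tool an agent could call; the tool name and payload fields are invented for illustration and do not reflect any specific commerce standard:

```python
# A minimal sketch of exposing product data to agents, assuming the MCP Python
# SDK's FastMCP helper (pip install mcp). The tool name and payload fields are
# illustrative, not part of any defined commerce protocol.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("product-catalogue")

@mcp.tool()
def get_product(sku: str) -> dict:
    """Return a structured, machine-readable record for one product."""
    # In practice this would query the PIM or catalogue service.
    return {
        "sku": sku,
        "name": "1.7L Rapid-Boil Kettle",
        "price": {"amount": 39.99, "currency": "GBP"},
        "availability": "in_stock",
        "attributes": {"capacity_l": 1.7, "material": "stainless steel"},
    }

if __name__ == "__main__":
    mcp.run()  # serves the tool over the Model Context Protocol
```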
As a result, the traditional funnel is collapsing. Where commerce followed a linear path from search and browse to decision and purchase, agentic commerce compresses this into a single interaction. A consumer expresses intent in natural language. An AI agent translates that intent into requirements, identifies relevant products, evaluates options across multiple merchants and, increasingly, executes the transaction. This new decision engine runs entirely on product data.
If that data cannot support machine-to-machine interaction, and cannot be queried, compared, validated and acted upon in real time, the retailer is excluded from the decision before it is even visible to the customer.
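A simplified sketch of that decision engine, with hypothetical requirements and merchant feeds, and with the intent-to-requirements step shown as an already-parsed dictionary rather than a live language-model call:

```python
# Illustrative sketch of the compressed funnel: intent becomes structured
# requirements, candidates are evaluated across merchants, and one option is
# selected in a single pass. All data and field names are hypothetical.

requirements = {          # e.g. parsed from "a steel kettle under 45 pounds"
    "category": "kitchen/kettles",
    "material": "stainless steel",
    "max_price_gbp": 45.0,
}

catalogue = [  # aggregated from multiple merchants' structured feeds
    {"merchant": "ShopA", "sku": "KET-100", "category": "kitchen/kettles",
     "material": "stainless steel", "price_gbp": 39.99, "rating": 4.6},
    {"merchant": "ShopB", "sku": "K-77", "category": "kitchen/kettles",
     "material": "stainless steel", "price_gbp": 44.00, "rating": 4.8},
    {"merchant": "ShopC", "sku": "X9", "category": "kitchen/kettles",
     "material": None, "price_gbp": 29.00, "rating": 4.9},   # incomplete: excluded
]

def eligible(p: dict) -> bool:
    return (p["category"] == requirements["category"]
            and p["material"] == requirements["material"]
            and p["price_gbp"] <= requirements["max_price_gbp"])

candidates = [p for p in catalogue if eligible(p)]
best = max(candidates, key=lambda p: p["rating"])   # deliberately simple ranking rule
print(best["merchant"], best["sku"])                # the single selected offer
```

In production the ranking would be far richer, but the dependency is the same: every step consumes structured attributes, and a record that cannot be evaluated never reaches the ranking at all.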
The traditional definition of Product Information Management (PIM), a repository for storing and organising product data before it is published to channels, is therefore out of date. In an AI-driven commerce environment, PIM must evolve into an execution platform: the system that ensures product data is not only accurate and consistent, but also structured, enriched and delivered in a way that machines can immediately act upon.
This means centralising product data to eliminate fragmentation, enriching attributes to support decision-making, governing data quality continuously, exposing data through APIs for real-time access and ensuring interoperability across evolving standards.
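A minimal sketch of the governance step, with hypothetical rules and field names, blocking records from agent-facing channels until the required attributes are present and well-formed:

```python
# Illustrative sketch of continuous data-quality governance: a record is only
# published to agent-facing channels once required attributes exist and pass
# basic checks. The rules and field names are hypothetical.

REQUIRED_FIELDS = {"sku", "name", "category", "price_gbp", "availability"}

def quality_issues(record: dict) -> list[str]:
    """Return the list of problems that would block publication."""
    issues = [f"missing: {field}" for field in REQUIRED_FIELDS if not record.get(field)]
    price = record.get("price_gbp")
    if price is not None and (not isinstance(price, (int, float)) or price <= 0):
        issues.append("invalid: price_gbp must be a positive number")
    return issues

record = {"sku": "KET-100", "name": "1.7L Rapid-Boil Kettle",
          "category": "kitchen/kettles", "price_gbp": 39.99}

problems = quality_issues(record)
if problems:
    print("blocked from publication:", problems)   # e.g. ['missing: availability']
else:
    print("ready to expose via the catalogue API")
```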
The ecosystem required to support agentic commerce is still emerging. Competing standards, protocols and platforms are evolving rapidly, and it is too early to predict a single dominant model. However, regardless of whether transactions flow through open protocols or closed ecosystems, or whether discovery happens via one interface or many, the common denominator will remain the same: structured, trustworthy, machine-readable product data.
From here, AI will move from assisting decisions to making them. The gap between research and purchase will close, and discovery, evaluation and transaction will become a single, continuous process executed by machines. The challenge for brands and retailers is to build the infrastructure that enables their data to be understood, trusted and acted upon.


