To truly realise the power and importance of explainability, transparency has to be at the heart of all of our interactions online. Whilst search evolves through the use of responsible AI, consumers are eager to understand how and why content or answers are presented to them.
This is relevant not only to eCommerce search but also to many other corners of the digital world, from online advertising to suggested content on social media platforms. The mystery behind an algorithm and the black box of data these applications rely on leaves consumers to assume the worst. We’ve all experienced a targeted ad that provokes a negative emotional response, whether the unnerving feeling of ‘how did they know that?’ or the frustration of ‘why am I seeing this again?’. Such responses are grounded in the lack of trust consumers still feel online.
Consequently, as we see an explosion of innovation in AI, LLMs and search, explainability, the ability to tell users the what, why and how of each result, answer or piece of content they receive, has never been more important. We’re slowly seeing Big Tech players wake up to this, with the likes of TikTok and Meta now explaining why users are recommended content on their platforms. However, a significant shift in mentality and approach from the industry is required around transparency and data privacy if we are to realise the benefits of explainability.
Why the pitfalls of search undermine trust
The pursuit of improved search and customer experiences is nothing new, with personalisation a key cog in so many of our online experiences. So, whilst AI and the use of LLMs by the likes of Bing and Google outwardly offer something new to the end user, they are really an attempt to answer the same question businesses have been grappling with for some time: ‘how can we anticipate the needs of our customer?’
In a digital context, we’ve seen customer data used to drive personalisation and enhance customer experiences. However, this reliance on data has meant far too many businesses have neglected the importance of protecting and safeguarding customer data and privacy. The resulting lack of transparency around data privacy practices has ultimately undermined consumer trust and confidence, along with any attempts to improve the customer experience. What’s more, this data obsession has meant retailers have overlooked alternative ways to enhance the customer experience that put privacy and consent integrity at the centre.
The use of AI in search hasn’t yet shown that it offers much of an alternative to these practices. ChatGPT was temporarily banned in Italy over privacy concerns, and whilst OpenAI has since updated its privacy practices, there are still significant questions around the data used to train these AI algorithms and LLMs. Consequently, there is still a marked lack of transparency around the results that AI-powered search produces, which will lead users to question whether they can trust those results, even given the rich and varied answers an AI-powered search engine can provide.
This continued reliance on a black box of data also means businesses are unable to correct and combat the inherent biases and blind spots these data points may create. Even if the data used to train AI and LLMs is obtained ethically, without transparency there is significant potential for these systems to misinterpret and pigeon-hole users, leading to results that aren’t relevant.
How explainable AI can engage customers
Businesses that explain the what, why, and how of each recommendation involved in the search process can mitigate the risks of AI implementation.
One key consideration when safely implementing AI into the search process comes down to data anonymity. If consumers’ personal information is collected with consent and anonymised within separate data silos, businesses dramatically reduce the risk of running into GDPR issues.
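As a minimal sketch of this idea, the snippet below pseudonymises a user identifier with a keyed hash before it is stored, and collects nothing at all without consent. The function name, key handling and consent flag are illustrative assumptions, not a prescribed implementation; in practice the secret key would live in a separate, access-controlled silo.

```python
import hashlib
import hmac
from typing import Optional

# Illustrative only: in a real system this key would be held in a
# separate vault/silo, never alongside the pseudonymised records.
SECRET_KEY = b"replace-with-a-vaulted-secret"

def pseudonymise(user_id: str, consented: bool) -> Optional[str]:
    """Return a stable pseudonym for a consenting user, or None.

    Without consent, no identifier is collected at all. With consent,
    only a keyed SHA-256 digest is stored, so the raw identifier never
    leaves the consent boundary.
    """
    if not consented:
        return None  # no consent: collect nothing
    digest = hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()
```

Because the same input always maps to the same pseudonym, behaviour can still be aggregated per user for personalisation without the raw identifier ever being stored.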
There are countless ways to implement AI responsibly in the service of the customer, so it’s key to get this right. By shifting to AI tools that are “human-centred” or “responsible”, businesses can draw on human judgement to create experiences that eliminate biases and blind spots, ensuring the customer experience strikes a balance between convenience and safety.
Offering explainability not only reassures consumers that they can trust the content recommended to them, but also allows businesses to better understand their shoppers through behaviour learned via responsible recommendations.
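One way to picture the “what, why and how” travelling with each result is to attach explanation metadata to the result object itself. The structure and field names below are hypothetical assumptions for illustration, not a real API:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ExplainedResult:
    """A search result that carries its own explanation."""
    item_id: str        # what is being recommended
    reasons: List[str]  # why it was selected (shown to the shopper)
    signals: List[str]  # how: which anonymised signals were used

def explain(item_id: str, query: str, matched_terms: List[str],
            popular_in_category: bool) -> ExplainedResult:
    """Build a human-readable explanation alongside the result."""
    reasons = [f"matches your search for '{query}'"]
    if popular_in_category:
        reasons.append("popular with shoppers browsing this category")
    return ExplainedResult(
        item_id=item_id,
        reasons=reasons,
        signals=[f"query term: {t}" for t in matched_terms],
    )
```

The design point is simply that explanations are first-class data rather than an afterthought: the front end can render the `reasons` list next to each result, and the `signals` list documents which (anonymised) inputs drove the ranking.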
Tailored customer experiences will continue to evolve in the years to come, but the pursuit of personalisation and improved customer experiences shouldn’t come at the expense of data privacy and transparency. By taking a responsible stance and explaining to consumers the reasoning behind recommendations and search results, the mystery behind the algorithm evaporates and we can rebuild consumer trust.