Generative AI has picked up incredible momentum in the last few months. Hype is now becoming reality, with companies of all sizes, in all industries, embracing GPT and other large language models (LLMs) to enhance business operations and customer offerings. We are seeing first-hand how this technology can truly transform organisations, and the relationship between human and machine.
At the same time, we are in the defining ‘decade of data’ – a period marked by fundamental shifts in power created by the intersection of technology and data, and by the impact this has on every facet of our lives. In today’s landscape, a company’s data can be two or three times more valuable than the company itself. With the value of data increasingly apparent, the opportunity to maximise it has never been greater. Enter generative AI.
What we are seeing is a pivotal turning point for the industry. Thriving in this new era of AI, using GPT technologies to unlock the most value from data, will be critical to success.
New beginnings
GPT-3, the popular large language model from OpenAI, has demonstrated a level of intelligence never before seen in natural language processing (NLP) algorithms. Its ability to capture knowledge of the world from its training data, and to synthesise and perform some reasoning based on facts, is completely new. With OpenAI’s recent launch of GPT-4, it’s clear this technology will continue to advance.
Naturally, technology organisations have jumped on the opportunity presented by GPT and started to incorporate generative AI into their product offerings, from assistive writing and code generation to summarisation and more. Companies are already exploring its use in both B2B and B2C operations, and some have already deployed it, for example Snapchat’s “My AI” and BloombergGPT.
This is the beginning of a new relationship between humans and machines, and very specifically, between humans and data.
The decade of data
Today, every sector is being redefined by pervasive, undeniable and immediate data-driven power shifts.
From putting consumers back in the driver’s seat when it comes to their data, to the increasing importance of frontline decision makers having access to all possible information, data is at the heart of progress. Now, having a best-of-breed tech stack is non-negotiable, and with ongoing market shifts favouring flexibility and customer experience, these changes have already started to impact businesses significantly.
In conjunction with these power shifts, the data and analytics space is poised for a major transformation. Much as personal computers, the internet, public cloud and smartphones reshaped their eras, generative AI and the evolution of NLP- and ML-based analytics will result in a seminal change in data and analytics.
LLMs dramatically transform our ability to understand both data and the business questions in our minds, and this is the key to fostering data-fluent teams and unlocking the full potential of an organisation’s data.
The inflection point
With more data than ever, we need to be smart about how we leverage it to gain actionable insights. The sheer volume of data generated can make it overwhelming for businesses to extract meaningful insights and make informed decisions. However, the increasing availability of LLMs offers a way to bridge the data-fluency gap.
On its own, GPT is unable to handle the complexity of real-world business data. For example, it gets confused when large numbers of columns are involved, or when the schema is fairly complex and spans multiple fact tables. The accuracy of raw GPT-3 is not something to write home about.
Taking GPT technology, however, and building machinery around it to make it work with enterprise data and analytics will unlock significant – and transformative – benefits for any organisation.
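To make the idea concrete, here is a minimal sketch of what that machinery might look like: the model is grounded in schema metadata before it writes a query, and its output is validated against the known schema before anything is executed. The schema, the `call_llm` placeholder and the validation rules are illustrative assumptions, not any particular vendor’s implementation.

```python
# Sketch: grounding an LLM in an enterprise schema before it writes SQL.
# `call_llm` is a hypothetical placeholder for whichever LLM API is in use.
import re
import sqlite3
from typing import Callable

# Toy schema standing in for real enterprise metadata.
SCHEMA = {
    "orders": ["order_id", "customer_id", "order_date", "total_amount"],
    "customers": ["customer_id", "region", "segment"],
}

def schema_prompt(question: str) -> str:
    """Embed table and column names in the prompt so the model is not guessing."""
    tables = "\n".join(f"- {t}({', '.join(cols)})" for t, cols in SCHEMA.items())
    return (
        "You are a SQL assistant. Use only these tables and columns:\n"
        f"{tables}\n\n"
        f"Write a single SQLite SELECT statement answering: {question}\n"
        "Return only the SQL."
    )

def validate(sql: str) -> bool:
    """Accept only read-only queries that reference known tables."""
    if not sql.strip().lower().startswith("select"):
        return False
    referenced = re.findall(r"\b(?:from|join)\s+(\w+)", sql, flags=re.IGNORECASE)
    return all(table.lower() in SCHEMA for table in referenced)

def answer(question: str, call_llm: Callable[[str], str],
           conn: sqlite3.Connection) -> list:
    """Generate SQL from natural language, validate it, then run it."""
    sql = call_llm(schema_prompt(question)).strip().rstrip(";")
    if not validate(sql):
        raise ValueError(f"Generated SQL failed validation: {sql}")
    return conn.execute(sql).fetchall()
```

The point is not the specific checks, but that the prompt carries the schema context raw GPT lacks, and that nothing the model writes reaches the data without a governance layer in between.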
The future is here now
Search and conversation are the way forward, and organisations can now use systems that harness LLMs to intuitively search existing analytical content, or to automatically create new insights, charts and visualisations from natural-language queries.
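As a rough illustration of the search side, the sketch below matches a natural-language question against a small catalogue of existing analytical content. The catalogue entries are invented for the example, and TF-IDF similarity stands in for whatever retrieval a production platform would actually use.

```python
# Sketch: natural-language search over existing analytical content.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical catalogue of charts and reports that already exist.
CATALOGUE = [
    "Monthly revenue by region, bar chart",
    "Customer churn rate by segment, line chart",
    "Top products by units sold, table",
]

def search_existing_content(question: str, top_k: int = 1) -> list[str]:
    """Return the catalogue entries most similar to the user's question."""
    vectorizer = TfidfVectorizer()
    content_matrix = vectorizer.fit_transform(CATALOGUE)
    question_vector = vectorizer.transform([question])
    scores = cosine_similarity(question_vector, content_matrix).flatten()
    return [CATALOGUE[i] for i in scores.argsort()[::-1][:top_k]]

print(search_existing_content("How is revenue trending across regions?"))
# -> ['Monthly revenue by region, bar chart']
```

If nothing in the catalogue scores highly enough, the same natural-language question can be handed to an LLM-backed generation path, like the one sketched above, to create a new chart or insight.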
Frontline business decision makers can self-serve, expressing business questions in natural language. What would traditionally have taken data analyst teams hours, days or even weeks of examining vast amounts of data to share actionable answers will now take minutes.
Looking forward, personalisation of data for each user will become the norm, making data analysis even more streamlined. Analytics platforms will become invisible while serving more business users by meeting them where they are within their workflows.
Of course, there will be concerns around bias – which will become increasingly prominent in future operations and discussions around generative AI – but platforms built with a focus on accuracy and trust will mean companies won’t need to sacrifice reliability, governance or security.
As ChatGPT itself says: “AI is likely to have a transformative impact on the future of data, enabling businesses to make better decisions, improve customer experiences, and drive growth and profitability”, and with the right technology, that has now become a reality.