
Bringing ChatGPT to Learning and Development

The fastest-growing app of all time isn’t Candy Crush or Facebook; it’s ChatGPT, which amassed over 100 million users in just two months. That’s equivalent to around a third of everyone living in the United States. Being able to ‘converse’ with an AI took the world by storm and opened up a host of conversations about what AI can actually do for our work lives. It turns out, quite a lot, especially in learning and development (L&D), which can benefit from using AI to personalize learning, automate grading, and speed up feedback. However, like any use of AI, it comes with some caveats.

ChatGPT falls into a category of algorithms known as generative AI. DALL-E 2, Bing AI, Chatsonic, AIVA, and Jasper are other examples of generative AI being used to create images, text, sounds, and more. Therein lies the potential for L&D: creating a single piece of instructional material for e-learning takes anywhere from 18 to 155 hours on average. Having ChatGPT and other generative AI tools do the groundwork in content creation will surely shave off some of those hours.

Content is king… when it’s valuable

Of course, we always need to be careful to provide quality over quantity. That’s one of the risks of the world becoming too dependent on ChatGPT: it’s only as good as what you feed into it, and there are already concerns about it being used to spread misinformation, both deliberately and unwittingly.

For L&D practitioners, it is vital to be able to separate the wheat from the chaff — knowing what content is actually useful to the workforce versus just creating content for content’s sake. It can be tempting to jump on the ChatGPT bandwagon and just see what it can spew out, but that will ultimately bloat a learning pathway with content that isn’t needed. Starting with a clear definition of the skills you need to build and the knowledge you need to impart will ensure that ChatGPT benefits corporate learning in the long run. It’s also worth noting that ChatGPT can only do so much; it may not be able to provide highly personalized learning recommendations, for instance, or create highly specialized, niche content.

Power to the people (skills)

Speaking of misinformation, the skills the workforce will need in the future will be shaped significantly by the rise of AI. Yet they won’t just be hard technical skills, but power skills like critical thinking and strategic leadership. Not every organization or individual will be as savvy as others when using generative AI, and they won’t always have the best data infrastructure, governance, or responsible AI guidelines. Our future will be lived hand in hand with AI, so the ability to discern and address bias will keep your organization on the right track when implementing it.

Deepening L&D relationships

Having AI do a lot of the mundane, time-consuming work gives L&D practitioners more time to focus on uniquely human tasks: connecting and interacting with the workforce, and creating learning strategies and processes that develop skills aligned with business needs. Again, we return to power skills, as these are hard to computerize and will therefore always be in demand in the human workforce. Augmenting your AI with human oversight and creativity is a win-win for your organization: AI takes on the heavy lifting that humans often find the most unsatisfying part of their jobs, freeing up resources for strategizing, relationship building, and problem solving.

Professor AI

AI itself can also be a ‘good’ teacher, especially at a time when more students are expected to have the skills to work effectively alongside AI. Universities are already experimenting with it. For instance, Paul Fyfe, an English professor specializing in human-data interactions, recently set his students a task: take content from text-generating AI software and weave it into an essay. The goal is to help them reflect on the appropriate use of AI and evaluate its effectiveness and accuracy.

Ultimately, we need to remember that AI is a tool. As with any tool, it is important to understand its limitations, as well as its capabilities. This understanding can help ensure that AI is used effectively and appropriately, and that it is not relied upon to perform tasks beyond its current capabilities.

Avoiding bias

AI cannot be discussed without touching on the role of data, and more importantly, the quality of that data in terms of timeliness, accuracy, and bias. AI is a great growth engine for many processes, but that means any inaccuracies and biases could scale just as quickly and widely as the actions the AI is automating. Large language models like the GPT series are trained on huge datasets of human language taken from the Internet. Yet we know the Internet is inherently biased in several areas, because the people who use it, all of humanity, are biased in different ways. It’s a caveat worth raising whenever people are hyped about a tool as widespread as ChatGPT.
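How bias in training data surfaces in model behavior can be illustrated with a toy example. The sketch below (plain Python, with a deliberately skewed, entirely invented mini-corpus) counts which words co-occur with a target word. A model trained on such data would inherit the skew; web-scale training data bakes bias into large language models by the same mechanism, just at vastly larger scale.

```python
from collections import Counter

# A tiny, deliberately skewed "corpus" (invented purely for illustration).
corpus = [
    "the engineer fixed his code",
    "the engineer shipped his feature",
    "the nurse checked her charts",
    "the engineer reviewed his design",
]

def cooccurrence_counts(corpus, target):
    """Count words appearing in the same sentence as `target`."""
    counts = Counter()
    for sentence in corpus:
        words = sentence.split()
        if target in words:
            counts.update(w for w in words if w != target)
    return counts

# The skewed data associates "engineer" with "his" but never "her" --
# an artifact of the corpus, not a fact about the world.
print(cooccurrence_counts(corpus, "engineer")["his"])  # 3
print(cooccurrence_counts(corpus, "engineer")["her"])  # 0
```

Auditing association counts like these, per demographic term, is one of the simplest ways to surface skew in a dataset before a model ever trains on it.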

We always need to take AI’s insights with a pinch of salt, because we cannot fully remove bias from it. But we can try our hardest to minimize it in the data being used. At Degreed, skills data is the foundation of learning content recommendations and other features. We make significant efforts to collect skills data from a wide range of sources, including (but not limited to) Degreed learner profiles, learning content, user search activities, and compiled skills data from external sources such as the EU’s ESCO database and the O*NET database.

We also have human evaluators from different parts of the business, with diverse backgrounds and demographics, working together to quality-check the outputs of our algorithms. We take great care to ensure that our model outputs and recommendations can be explained, because AI interpretability is foundational to building trust and identifying bias.

A race to upskill

AI experts are already well-versed in these tools, but it is essential to educate a broader audience, including everyone who currently uses AI tools like ChatGPT, to prepare for a future in which AI is an integral part of our work environment. Upskilling everyone is crucial if AI is to become an effective co-worker.

By raising awareness of the limitations of ChatGPT and generative AI, we can ensure that people are better prepared to work alongside these technologies as they become more commonplace in our daily lives.

So as you scroll past yet another LinkedIn post created by ChatGPT, or look at an AI-generated artwork on a gallery wall, remember: the age of AI is upon us, and it’s everyone’s job to build the skills and knowledge needed to assess the reliability and ‘realness’ of what we’re seeing, and to make the best use of it.
