Lost in a sea of AI noise, looking for a signal
Two reports confirm that AI is a big, shiny object.
Over the past few weeks, we’ve made multiple references to the World Economic Forum’s ‘Future of Jobs’ report, both in this newsletter and on The Mindtools L&D Podcast.
And we were by no means alone in this.
Search ‘Future of Jobs’ on LinkedIn, and you’ll find countless posts about the implications of the report for business leaders, job-seekers, and, yes, even L&D professionals.
One of the key findings from the report was that, over the next five years, employers expect ‘AI and big data’ skills to grow in importance more rapidly than any other skill area.
For L&D, this creates a burning platform to upskill or reskill employees, equipping them to meet the challenges of tomorrow.
Arguably, that sense of urgency is reflected in the results of the latest ‘Global Sentiment Survey’, Donald Taylor’s annual pulse-check of the L&D industry.
Since 2014, the survey has asked ‘What will be hot in workplace L&D this year?’.
Participants are invited to respond to the survey via email and social media, and are therefore a self-selecting, tech-savvy subset of the wider L&D population.
Continuing last year’s trend, the top three results from the 2025 edition of the GSS were:
Artificial intelligence (22.6%)
Reskilling/upskilling (10%)
Skills-based talent management (8.9%)
At face value, the results of the ‘Future of Jobs’ and ‘Global Sentiment’ surveys tell a story of business leaders identifying needs, and L&D teams proactively responding to those needs.
But the real story is much more complicated than that.
One of the things I’ve always loved about the GSS is that it is designed to provoke questions, rather than provide answers.
And, while I’m still digesting the results of both surveys, I already have a lot of questions.
Here are just a few that spring to mind:
🤖 What does ‘AI and big data’ even mean when it comes to skills?
Although the ‘Future of Jobs’ report doesn’t include the survey questions participants were asked, I managed to find a copy of them after doing some digging.
To gauge the evolving skills needs of businesses, employers were asked:
‘For your organisation’s key roles, would you expect an increase or decrease in the use of the following skills by 2030?’
Respondents were then presented with a series of skill areas to choose from, rating each as expected to see decreasing, stable, or increasing use.
Notably, as far as I can tell, no definitions of the options are provided.
So, when the report claims that ‘AI and big data’ is expected to be the fastest-growing skill over the next five years, what does that actually mean?
Do ‘AI skills’ involve prompt-engineering? Leveraging deep learning to analyze large data sets? Using AI in a way that doesn’t break the law?
Depending on your organizational context, it might be all of the above. It could also be none of the above.
⚡ How valuable are five-year predictions given the current pace of change?
For the sake of argument, let’s say that what employers are thinking about when it comes to building ‘AI skills’ is prompt-engineering.
To realize the potential productivity benefits of AI, employees need to know how to engage effectively with tools like ChatGPT, Claude, and Gemini.
If you accept that premise, then upskilling workers in prompt-engineering might seem like a worthwhile investment.
But before you get out your wallet, how confident are you that prompt-engineering will still be a relevant skill five years from now?
When we started this newsletter just two years ago, I would spend what felt like hours trying to prompt DALL·E 2 into generating barely passable imagery. (See this article from Ross G as an example of its truly unhinged output.)
Now, I ask GPT-4o to complete the same task, and it takes my badly written prompt and rewrites it for me, based on what it ‘thinks’ I want. Usually, it gets it right first time.
In 2025, you might want to hire someone who is already a skilled prompt-engineer.
But should you train someone in a skill that might be redundant in twelve months, never mind five years?
🕳️ To what extent is our collective obsession with AI self-perpetuating?
Lastly, one thing that worries me is the extent to which AI’s dominance of L&D discourse obscures other, arguably more important, topics.
With a few notable exceptions (hey there, ‘Showing value’!), nearly every other option in this year’s Global Sentiment Survey lost vote share in 2025.
AI is sucking all of the oxygen out of the room, and it’s not lost on me that I’m part of the problem.
When I responded to the GSS, I selected ‘Artificial intelligence’ as my top choice because the question isn’t ‘What should be hot?’, it’s ‘What will be hot?’.
Such is AI’s gravitational force that even when you don’t want to talk about AI, you still somehow end up talking about it.
I present this newsletter as a case in point.
Want to share your thoughts on this week’s newsletter? Then get in touch by emailing custom@mindtools.com or reply to this newsletter from your inbox.
🎧 On the podcast
All sales training is sales enablement, but not all sales enablement is sales training.
In last week’s episode of The Mindtools L&D Podcast, Lara and I were joined by Darren Bezani, Chief Salecologist at Salecology, to discuss:
what ‘sales enablement’ means, and why it’s intentionally broader in scope than sales training;
what sales enablement is designed to achieve, beyond increased sales;
the role of managers in sales enablement.
Check out the episode below. 👇
You can subscribe to the podcast on iTunes, Spotify or the podcast page of our website. Want to share your thoughts? Get in touch @RossDickieMT, @RossGarnerMT or #MindToolsPodcast
📖 Deep dive
The phrase ‘a coin flip’ is synonymous with randomness and chance. But flipping a coin may not be the 50/50 proposition we take it for.
In a 2023 paper, a team of European researchers set out to test Persi Diaconis’ 2007 hypothesis that a flipped coin is statistically more likely to land on the same side it started on.
The experiment involved 350,757 coin flips, conducted by 48 different people using coins minted in 46 different countries to prevent design bias.
The results of the experiment appear to confirm Diaconis’ prediction that the chance of a naturally flipped coin landing on the same side it started is around 51%. The authors write:
‘The data confirm the prediction from the DHM model: the coins landed how they started more often than 50%. Specifically, the data feature 178,079 same-side landings from 350,757 tosses, Pr(same side) = 0.5077, 95% central credible interval […], which is remarkably close to DHM’s prediction of (approximately) 51%.’
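If you want to sanity-check the headline figure yourself, it only takes a few lines. The paper reports a Bayesian credible interval, which I haven’t reproduced here; the snippet below is my own rough stand-in using a simple normal approximation, just to show the reported proportion and that it sits clearly above 50%.

```python
# Sanity-check the headline numbers from Bartoš et al. (2023).
# Note: the paper uses a Bayesian credible interval; the normal
# approximation below is a rough stand-in, not the authors' method.
import math

same_side = 178_079   # same-side landings reported in the paper
tosses = 350_757      # total flips reported in the paper

p_hat = same_side / tosses
se = math.sqrt(p_hat * (1 - p_hat) / tosses)   # standard error of the proportion
lo, hi = p_hat - 1.96 * se, p_hat + 1.96 * se  # ~95% normal-approximation interval

print(f"Pr(same side) ≈ {p_hat:.4f}")           # ≈ 0.5077, matching the paper
print(f"~95% interval: ({lo:.4f}, {hi:.4f})")
```

The interval sits entirely above 0.5, which is why a ~0.8 percentage-point bias is detectable at all: with 350,757 flips, the standard error shrinks to well under a tenth of a percentage point.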
Clearly, this ‘deep dive’ has very little to do with L&D, but it blew my mind when I stumbled across the paper last week.
Bartoš, F. et al. (2023). ‘Fair coins tend to land on the same side they started: Evidence from 350,757 flips’. arXiv.
👹 Missing links
🔍 The End of Search, the Beginning of Research
For my money, one of the more convincing ‘this changes everything’ announcements to come out of the world of artificial intelligence is OpenAI’s release of Deep Research. As Ethan Mollick explains in this edition of his newsletter, Deep Research is a ‘narrow agent’, capable of autonomously performing research on a given topic. While I haven’t tested the tool myself (it’s currently only available to ChatGPT Pro subscribers at a cost of $200/month), Mollick’s reflections are intriguing, suggesting that such agents could transform the way AI is used in the workplace.
💾 Digital Learning for Behavior Change
A couple of weeks ago, our friends at Intellum invited Anna Barnett and me to host a webinar exploring the challenges of measuring behavior, and how we design digital learning with behavior change in mind. If that sounds like something that might be up your street, you can watch the on-demand version on Intellum’s website.
🦷 Do We Want Behavior Change or Information Transfer?
On the topic of behavior change, I loved this recent newsletter from Jess Almlie. When explaining the concept of ‘gaps’ to clients, and why knowledge alone is usually insufficient to change behavior, I typically tell a story similar to the one Jess recounts at the beginning of this article. The short version is that while I know I should floss more, and my dentist is always telling me to do so, that doesn’t mean I actually do it. Where I’ve failed, Jess has succeeded, and she explains the conditions that needed to be in place for her behavior to change. Learning designers, take note!
👋 And finally…
I’m currently reading Warren Zanes’ Deliver Me from Nowhere, exploring the making of Bruce Springsteen’s Nebraska album. In the book, Zanes references the development of Springsteen’s first music video, for the song ‘Atlantic City’, which perfectly captures the tone and feeling of Nebraska.
👍 Thanks!
Thanks for reading The L&D Dispatch from Mind Tools! If you’d like to speak to us, work with us, or make a suggestion, you can email custom@mindtools.com.
Or just hit reply to this email!
Hey, here’s a thing! If you’ve reached all the way to the end of this newsletter, then you must really love it!
Why not share that love by hitting the button below, or just forward it to a friend?