Compliance has long been the backbone of the learning and development industry, but assuming that your organization should do as little as possible to avoid fines and shift the blame for wrongdoing is starting to look risky.
In the past few months I’ve heard from a number of clients that courts are increasingly skeptical of superficial training, and now seek evidence that genuine behavior change has taken place.
For L&D professionals, that presents a problem: How do you measure something that might not have happened yet and hopefully never will?
🧑🏻‍⚖️ First, a quick history: Is compliance training really a defense?
Yes, it has been.
In 2012, when Morgan Stanley Managing Director Garth Peterson was charged with violating the Foreign Corrupt Practices Act (FCPA), the bank was able to avoid prosecution by pointing to training records. These showed that Peterson had been trained seven times on the FCPA, and reminded to comply at least 35 times.
Conversely, in the UK, Gordons LLP (who act as subject matter experts on our own compliance offering) point to a case in which a claim of harassment was made against Allay (UK) Ltd. The company argued that it had required all employees to take training on equality and diversity. But the Tribunal found that this training was ‘stale’: it had taken place more than two years earlier, and no ‘refresher’ training had been provided since.
A little closer to home, our colleagues at Kineo - who Mindtools acquired earlier this year - have had their LMS data pulled into a court case.
Speaking on a recent webinar, our Executive Director of Learning Platforms, James Ballard, told me:
‘We have had our LMS data, particularly completion data, pulled into litigation cases where people might have been sacked or disciplined for bullying, harassment… quite serious misconduct. And the argument might be, well, I didn't know that wasn't appropriate here.
‘And being able to show that people have completed training that clearly states that that isn't appropriate behavior helps with those litigation cases, and can also therefore make it safer for people to report or know what is reportable… within an organization.’
🔍 What’s changed?
The cynical ‘tick box’ approach is coming under increasing scrutiny.
In a report published in 2016, the U.S. Equal Employment Opportunity Commission wrote:
‘Much of the training done over the last 30 years has not worked as a prevention tool - it's been too focused on simply avoiding legal liability.’
The Department of Justice, in its guidance on Evaluating Corporate Compliance Programs, writes:
‘Prosecutors should also assess whether the training adequately covers prior compliance incidents and how the company measures the effectiveness of its training curriculum.’
In the UK, after fining Interserve Group Ltd £4.4 million for failing to keep employee data secure, the Information Commissioner’s Office pointed to a failure to ensure that phishing training had taken place for all employees.
Debevoise and Plimpton LLP wrote following the case:
‘Ideally, training should be responsive to businesses’ specific organisational risks identified through risk assessments… General training can be supplemented by practical exercises such as in-depth cybersecurity tabletop exercises and phishing tests.’
⚖️ How can you tell if your compliance training is working?
Compliance training often boils down to a risk management exercise. What is the risk of something going wrong? And if something does go wrong, what are the potential consequences?
During discovery calls with clients, we might ask how often a data breach has occurred. Often, the answer is ‘never’. Does this mean that the organization is lucky, has effective safeguards, or provides great training? It’s hard to tell.
And, if the answer is ‘never’, measuring the impact of your future training is difficult. After all, if ‘never’ continues to be the answer, should we pat ourselves on the back for a job well done? Maybe, but it’s hard to draw a straight line.
But here are a few ways you can start thinking about both measuring impact and being able to show that you’ve assessed your training’s efficacy:
🧑‍🔬 Identify the behaviors you want to encourage (or stop). Maybe you haven’t had a data breach or sexual harassment case, but what behaviors would you need to see to give you comfort that the risk is reduced? Automated phishing tests and whistleblowing hotlines are two areas where you can start to spot signals.
🎯 Balance the general and the specific. It’s impractical for smaller organizations to create fully custom compliance training. But you can supplement generic content with specific case studies or exercises that make compliance issues more relevant to your people.
⏱️ Use behavioral surveys. In one project, we created a behavioral survey for an information security team that measured attitudes to short- and long-term risk. Surveys like this help demonstrate that your people are considering the implications of the kind of risky shortcuts that can lead to compliance failures.
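To make the ‘spot signals’ idea concrete, here’s a minimal sketch of how you might track one such signal: the click rate on automated phishing tests before and after a training refresh. The figures and function names are hypothetical, purely for illustration; real phishing platforms export this data in their own formats.

```python
# Hypothetical sketch: phishing-test click rates as one signal of
# training efficacy. All numbers below are invented for illustration.

def click_rate(clicks: int, recipients: int) -> float:
    """Fraction of simulated phishing emails that were clicked."""
    if recipients <= 0:
        raise ValueError("recipients must be positive")
    return clicks / recipients

def relative_reduction(before: float, after: float) -> float:
    """Percentage drop in click rate after training (positive = improvement)."""
    if before == 0:
        raise ValueError("no baseline clicks to compare against")
    return (before - after) / before * 100

# Example: 48 of 400 staff clicked before training, 18 of 400 after.
before = click_rate(48, 400)                      # 0.12
after = click_rate(18, 400)                       # 0.045
improvement = relative_reduction(before, after)   # 62.5% reduction
```

A single before/after comparison like this won’t prove causation, but tracked across repeated campaigns it gives you the kind of trend line that ‘we’ve never had a breach’ can’t.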
The clients we work with are starting to do these things already, with regular refreshes, impact measurement and a tailored approach to risk.
What is your organization doing?
Need help building out your compliance suite? Get in touch by emailing custom@mindtools.com or reply to this newsletter from your inbox.
🎧 On the podcast
Objectives and Key Results have long been a staple of the working world. You set a stretch objective, define the key results that will tell you whether the objective has been met, and cascade them throughout the organization. Then what?
Often, nothing happens. The OKRs were too vague.
In extreme cases, the worst happens: people bend the rules to hit aggressive targets.
For author Radhika Dutt, there is another way: Objectives, Hypotheses, and Learnings (OHLs).
In this week's episode of The Mindtools L&D Podcast, Radhika joins Gemma and Ross G to discuss:
Why OKRs so often fail
How OHLs prioritize a puzzle-solving mindset
How to ensure OHLs don't lead to analysis paralysis
Check out the episode below. 👇
You can subscribe to the podcast on iTunes, Spotify or the podcast page of our website.
📖 Deep dive
Why is it so hard to learn something new?
We’ve covered cognitive load theory before in the Dispatch (way back at the start - with a completely unhinged image), but now a new paper from the original theorist John Sweller does a nice job of summarizing where we are with it.
In the paper, Sweller writes that learning is ultimately about the accumulation of information in our long-term memories.
When a novice encounters a problem, they don’t have any information in their long-term memory about the challenge they’re facing. They have to solve it with the limited resources of their working memory.
When an expert encounters a familiar problem, they transfer information from their long-term memory back to working memory and can solve it far more easily.
There’s a big overlap here with the idea of “schemas”. If you know nothing about cars (I don’t), an engine just looks like lots of unrelated parts, connected for no obvious reason.
As you become an expert, say by working on a car engine and discovering how it works, your brain stops seeing unrelated parts and comes to understand the “engine” as a single unified concept, or “schema”.
When we learn, we’re building these schemas in our long-term memory so that, next time we encounter an engine, we don’t need to recall lots of parts. We just recall the “engine” schema.
So next time you’re struggling to learn something new, cut yourself some slack. Your working memory is using all of its processing power to build a schema which, transferred to your long-term memory, will sit there in preparation for the next time you encounter the challenge.
Hat tip to our pal Connie Malamed for sharing this paper on LinkedIn.
Sweller, J. (2024). Cognitive load theory and individual differences. Learning and Individual Differences, 110, 102423.
👹 Missing links
Long-term readers will be aware of my enthusiasm for ‘nudge’ theory. Depending on who you ask, a nudge is either a low-cost change to how choices are presented or a bureaucratic tweak to avoid tackling major issues. This blog from Tim Harford though does a nice job of highlighting where one type of nudge is effective: the timely delivery of an email or text message that is well targeted and specific. Less ‘change a lifetime of habits’, more ‘take this one action right now’.
💾 “This whole industry is based on a 25-year-old technical standard?!”
Sometimes it’s useful to see yourself as others see you. In this podcast from our friends at Mindset.ai, Jack Houghton takes a Louis Theroux-style look at workplace learning and pitches the SCORM standard as a straw man to be destroyed. It made me chuckle, get defensive, and take a long hard look at myself.
🥊I’m living my Rocky III dream
In the first film, Rocky Balboa loses to Apollo Creed. In the second, he wins. In the third, they become partners. Such has been my experience of working with our new colleagues at Kineo over the last few months. Where once we were rivals, now we’re friends. And so it was nice to read this blog from Solutions Consultant Cammy Bean reflecting on her experience of our 2025 union. Ding-ding.
And finally…
The best jokes require an explanation. So, if you’ve no idea what I mean by ‘Ding-ding’ above, here’s the clip:
👍 Thanks!
Thanks for reading The L&D Dispatch from Mindtools and Kineo! If you’d like to speak to us, work with us, or make a suggestion, you can get in touch by emailing custom@mindtools.com.
Or hit reply to this email!
Hey here’s a thing! If you’ve reached all the way to the end of this newsletter, then you must really love it!
Why not share that love by hitting the button below, or just forward it to a friend?