What gets measured gets spun
How metrics and measurement change our behaviour - and the benefits of accepting this.
In the 1990s, the New York Police Department started taking a more data-driven approach to law enforcement. At first, the method was crude: every time a crime was committed, a pin was placed on a subway map to indicate where it had occurred. As high-crime areas emerged, police would concentrate their efforts on those locations.
From these humble beginnings, the CompStat program was born: a systemized way to track crime numbers, reported to all police officers on a monthly basis.
But did it work?
The purpose of CompStat was to leverage data to reduce crime. In many cases, what it actually did was incentivise police to manipulate the data they reported in order to improve the numbers.
I bring this up because the impact of metrics on human behaviour has been front of mind here at Mind Tools Towers over the past month. We’ve been setting goals for the year and asking ourselves: how would we know that these have been achieved?
The most common question I’ve heard is: what does that number even mean? Seemingly simple metrics like ‘% of active users’ on our platform are complicated by clients who buy more licenses than they initially need, or by users who have left their organisation without our knowledge.
And then there’s this provocative question, asked by my L&D Dispatch colleague Ross Dickie: Should we care how often users access our performance support resources? We are not, after all, BuzzFeed. Success, in the workplace learning context, is not based on how long a user spends on our platform or how many resources they access: it’s based on whether those resources actually helped them perform at work.
So it’s been exciting to trial our new ‘Guided Experience’ content type with around 450 of our consumer users. If you’re one of them, thanks for taking part!
This new content type begins with a survey, tested to ensure that it’s measuring what we claim to measure. And, at the end of the experience, participants will be invited to take the survey again. This won’t be a surprise to them: we tell them we’re doing this on the first page of the Experience.
But what really interests me is how this approach shifted the design conversations we had internally. No longer were we discussing ‘engagement’. Instead, the question we kept asking ourselves was: are the skills and techniques we’re suggesting likely to have a measurable impact on survey results? Or, in other words: will they actually help users improve their skills?
Learning designers, just like police officers, are incentivised to play the numbers. If we pick the right metrics, we can positively influence the behaviour of not just our users, but also ourselves.
🎧 On the podcast
Numbers are in the news a lot lately: inflation numbers, growth numbers, redundancy numbers. So, this week on The Mind Tools L&D Podcast, we invited 360 Learning’s David James onto the show to discuss his experiences heading up the learning team at The Walt Disney Company during the 2008 financial crisis.
David outlined his approach here:
‘I went to each of those business partners and said: Look, I realise that in these times our model, where people attend courses that are run by external vendors... probably isn't going to fly, because budgets are going to be cut. So I would like to speak... to your stakeholders... to address your pain points. To do team development sessions that are bespoke around your team's needs.’
To find out how else David approached this challenge, listen here:
📖 Deep dive
What’s the ideal size for a virtual classroom? According to new research from Jo Cook and Jane Daly, over 56% of survey respondents would prefer that fewer than 10 people attend.
When asked what enhances the learning experience, 67% said they wanted the opportunity to interact, 66% wanted a well-designed session, and 53% wanted a good balance of interactivity.
Unfortunately, what’s good for learners isn’t necessarily best for your L&D budget: smaller and more interactive sessions are more difficult to scale.
How will you find the right balance in 2023?
Read the full report for more insights: Cook, J. and Daly, J. (2022) Through the lens of research: Amplifying human focus in virtual and hybrid learning.
👹 Missing links
Do you know what ‘neurodivergence’ is? I confess, I know less than I thought. In this article for Learning Guild, Judy Katz discusses terminology, language, use of visuals and guidance for social interactions. For example, the emoji used for this link is based on Judy’s advice! It’s a fascinating read, exploring a topic that touches the lives of so many people but remains widely misunderstood and often ill-defined.
It’s a corn-on-the-cob, right? Maybe an ear of corn? Some might just say corn, but the current consensus is ‘maize’. The Wikipedia discussion on this issue now runs to around 65,000 words, and the dispute has become so bitter that the page had to be given ‘protected’ status. For more of this sort of thing, follow one of my favourite Twitter accounts: @DepthsofWiki
Apparently, there’s growing interest in using psychedelics to promote creativity at work. Now, one team of researchers has tested this theory in a trial that saw two groups come up with creative ideas. The first group were high on cannabis; the second group were not. The result? The stoners thought their ideas were amazing, but independent assessors found no difference in the quality of their ideas. Bummer, man.
It’s February! What does this month look like for you? According to this reporter, it resembles: ‘a place where people who are being punished are sent’.
I enjoyed the video, gloomy though it be. I hope you will too!
Or just hit reply to this email!