Some stories are too good to be true
Have you heard of Dunbar's Number?
It posits a correlation between a primate's brain size and the average size of its social group. When you analyse the size of the human brain, the ratio suggests our average 'group size' should be about 150 people. According to Dunbar, this is basically "the number of people you would not feel embarrassed about joining uninvited for a drink if you happened to bump into them in a bar."
I first came across it in Harari's highly acclaimed book 'Sapiens', but it has also been cited in a wide range of articles, from the BBC through to the FT. Its applications have ranged from how online platforms might optimise gaming, through to how the Swedish Tax Authority determined the use of its office space.
The problem is, it's probably wrong.
When I read the paper essentially rubbishing Dunbar's number, I thought: how did this number impact so many decisions whilst remaining relatively unchecked?
We are guided by the Elephant
The notion that our brain has two separate thought 'systems', one rational and conscious, the other emotional and subconscious, has been suggested through various analogies over the past millennia: from Plato's "Charioteer and Horses", to Kahneman's System 1 and System 2 in "Thinking, Fast and Slow", and Haidt's "Elephant and Rider" analogy in "The Happiness Hypothesis".
In the latter, Haidt explains that emotions are like an elephant: stronger and wilder than the 'logical' rider. The rider can try to train the elephant, but once the elephant decides which way to go, the rider will struggle to control it.
Anyone who works in the broad fields of Data / Insight / Analytics knows the importance of telling stories through data. We do this because whilst data and statistics appeal to the logical 'rider', we know that to really get people on board, we have to appeal to the emotional 'elephant'. Stories help to make the facts land. They turn numbers into action, and ultimately set the path of the 'elephant' without necessarily having to appeal to the 'rider'.
But what happens when a great story is told without the supporting data?
The story appeals to the elephant and gains traction. It's communicated rapidly and acted upon promptly. The rider desperately tries to maintain control. But by the time the facts behind the story are disproved, it's too late: the rider has lost control and the elephant is deciding which path to take. A myth has been created.
We are becoming increasingly aware that disinformation spreads faster than the truth across social media (TL;DR: the study suggests it takes the truth about six times as long as falsehood to reach 1,500 people). But the problem isn't restricted to Twitter.
A wider concern for me is how commonplace these myths appear to be in both academia and the workplace.
I came across a stat recently: 85% of Data and Analytics initiatives fail. Not true. The actual MIT quote is that 85% fail to achieve maximum benefit. But the story won out, and it's a stat I see appear time and time again.
It takes 21 days to form a habit? Myth.
We only use 10% of our brain? Myth.
And the list goes on.
To give you one final example: at university, I was 'taught' the Bystander Effect.
This phenomenon stipulates that individuals are less likely to help someone when other people are present. The story that stimulated research into this effect was gripping. The murder of Kitty Genovese was reported in the New York Times on March 27th, 1964. The opening line stated:
“For more than half an hour 38 respectable, law‐abiding citizens in Queens watched a killer stalk and stab a woman in three separate attacks in Kew Gardens”.
A compelling story. Only, it wasn't true, and in fact contained multiple flaws, as reported in American Psychologist in September 2007.
There is an ongoing debate as to whether the bystander effect still has merit (this video suggests it might), but the fact that we are still taught this story as fact speaks volumes about the power of a story over the truth.
Yes, we need to be compelling storytellers to land a key point; but we also need to be better myth-busters.
So what?
1) Question the source of the 'fact'. Where does '78% of our customers lapse after 1 year' come from? How recently was it measured? How confident are you in the assertion? I'm not suggesting we need to repeat every piece of analysis we've ever done; just go one step deeper and question what you hear, however strongly it may be asserted.
2) Don't make big decisions based on facts that you haven't had a chance to question and understand. If the decision has big consequences, take the time to verify the facts you are predicating the decision on.
3) Stop spreading the myths! If you hear an interesting insight you haven't verified and want to share it, frame it as unverified rather than as a 'fact'.