When Daniel Kahneman’s new book Noise came out, co-authored with Olivier Sibony and Cass Sunstein, I was naturally very eager to get my hands on it. After all, the Nobel-prize winner’s last book, Thinking, Fast and Slow, was a sensation. Almost as many people pretended to read it as pretended to read Yuval Noah Harari’s sweeping global history Sapiens.
Kahneman’s genius is to craft a thesis which, like most truly great ideas, seems like an almost banal observation. In Thinking, Fast and Slow, it was that most people act based on two distinct thought processes: intuition (“fast thinking”) and deliberate thought (“slow thinking”), and that intuition is largely a remembered form of slow thinking that our brains apply to particular scenarios in order to be more efficient.
In Noise, the observation is that most people, presented with the same set of questions and information, spit out widely varying and error-prone answers.
What is especially startling, however, is the scale at which this happens, especially in fields like medicine or insurance, whose practitioners are expected to be trained in a standard way to answer standard questions.
Take medicine, for example. We assume that most physicians confronting the same problem are following roughly the same guidelines.
Yet according to the authors, “Thirty-one per cent of the time, physicians evaluating angiograms disagreed on whether a major vessel was more than 70 per cent blocked.”
Read on and it only grows more alarming: “Doctors misdiagnosed melanomas (the most dangerous form of skin cancer) in one of every three lesions.”
In radiology, they found that false-negative rates for mammograms run as high as half, and false-positive rates as high as two thirds.
You might as well play a round of darts instead.
Indeed, physicians are no less likely to disagree with themselves than with others.
“When assessing the degree of blockage in angiograms, twenty-two physicians disagreed with themselves between 63 and 92 per cent of the time.”
By the way, if you want to be recommended for cancer screening, book an early appointment.
“In a large sample, the order rates of breast and colon cancer screening tests were highest at 8 am, at 63.7 per cent...and then decreased to 47.8 per cent at 5 pm.”
It’s not just medicine. Kahneman exposes many fields which lay people assume run like clockwork but which in fact run on guesswork.
In one large insurance company: “The median difference in underwriting was 55 per cent...This result means, for instance, that when one underwriter sets a premium at $9,500, the other does not set it at $10,500 – but instead quotes $16,700. For claims adjusters, the median ratio was 43 per cent.
“One senior executive estimated that the company’s annual cost of noise in underwriting – counting both the loss of business from excessive quotes and the losses incurred on underpriced contracts – was in the hundreds of millions of dollars”.
Human error and frailty are to blame for much of this. We are especially terrible at thinking probabilistically.
But the solutions to this cacophony of “noise” are straightforward: measure and standardise! Algorithms (which are really just standardised rules) and even the simplest form of data-gathering and artificial intelligence can make a huge difference.
Much of diagnostics and pathology in medicine is, for example, a simple decision tree that can easily be automated, without even heavy use of data. The authors present a beautiful case for this.
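The point that algorithms are really just standardised rules can be made concrete with a toy sketch. The function and thresholds below are invented purely for illustration – they come from neither the book nor any clinical guideline:

```python
# A minimal sketch of a diagnostic "decision tree" written as
# standardised rules. The thresholds are hypothetical, invented
# for illustration only -- not real clinical guidance.

def screening_recommendation(blockage_pct, symptomatic):
    """Return a standardised recommendation from two inputs.

    blockage_pct: estimated vessel blockage (0-100)
    symptomatic: whether the patient reports symptoms
    """
    if blockage_pct > 70:  # hypothetical cut-off
        return "refer to specialist"
    if blockage_pct > 50 and symptomatic:
        return "follow-up scan"
    return "routine monitoring"

# Two clinicians feeding in the same numbers always get the same
# answer: the rule removes the noise in applying the judgment,
# not the judgment about where the thresholds belong.
print(screening_recommendation(75, False))  # refer to specialist
print(screening_recommendation(55, True))   # follow-up scan
```

The noise-reduction comes simply from the fact that the same inputs always produce the same output, which no panel of unaided human judges can guarantee.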
But the book doesn’t offer answers to some of the real challenges of noise: people. It is the fields with the most noise – judicial sentencing, medicine, and insurance – which often give rise to the most vocal opposition to standardisation of any kind.
In the Caribbean in particular, many industries and professions are almost allergic to data and standardisation. Data is revealing – it reveals the bad indiscriminately alongside the good. Hence our national reliance in TT on intuition and anecdotes. Standardisation also removes individual discretion and power, which most people are primed to resist tooth and nail.
Where Kahneman and his co-authors stop short is in grappling with these fundamental problems, which are barriers to institutional change globally. How can we, for example, show people that relying more on algorithms won’t reduce their power – that it can in fact free up more time and resources to devote to bigger problems? Viewed through a wider lens, standards can actually increase individual freedom.
This is hardly something to put on a placard, but as the authors note, this could save millions across industries – and in fact save lives as well. It’s time we tune out the static.
Kiran Mathur Mohammed is an economist and co-founder of medl, an IDB lab-backed social impact health tech company