Common sense in an era of experts: Vikram Mansharamani on our over-reliance on data
Access to big data means that we are increasingly outsourcing our decision-making to protocols, experts and technology. That’s not always the best solution, warns author Vikram Mansharamani.
If you’ve ever let your GPS lead you, against your instincts, to an empty industrial estate rather than your actual destination, you will already have experienced what is at the core of Harvard lecturer Vikram Mansharamani’s new book ‘Think for Yourself’.
Put another way, “in a data-flooded world we’ve outsourced our decision-making” to technology. In passing responsibility for making correct decisions to specialists working in narrow fields, we’re creating skewed agendas and biased positions. Because we are losing our intellectual autonomy to machines and data, it’s hardly any wonder we’ve become bad at making decisions, says Mansharamani. His book’s subtitle – ‘Restoring common sense in an age of experts and artificial intelligence’ – is an appeal to people to stop relying on data as a substitute for hard-won experience.
‘Think for Yourself’ went to press before the extent of the coronavirus pandemic became apparent, but Mansharamani points out that there are many current instances of governmental decision-making related to Covid-19 that illustrate his central point. “We rely too much on the testimony of experts. We need to overcome our love affair with technology, stop being blinded by focus and keep experts on tap, not on top. As much as we want to outsource all of our decision-making problems, we need to take control and only tap into experts when their insight is needed.”
In some cases, he says, “experts make assumptions, and only by understanding their logic can we understand their recommendations”. In others, experts may simply be failing to give best advice due to subconscious bias, “especially as academia is organised in silos. So, a cardiology expert is either going to say something is or isn’t a heart problem. This kind of silo thinking is not always helpful.”
This is never more relevant than when you abdicate responsibility for a decision, should it turn out to be a bad one, by claiming to have done the right thing in seeking expert advice. “We see this all the time at the moment,” says Mansharamani, who explains that politicians routinely defend their decisions on issues such as virus containment measures on the basis that they’ve consulted academics or scientists and received best advice. “The problem with expert advice is that while it can be accurate with ‘all else equal’, it’s rarely the case that all else is equal.”
‘Think for Yourself’
One of the disadvantages of living in the information age is that we’ve lost our ability to make decisions based on common sense rather than mathematical models. In ‘Think for Yourself’ author Vikram Mansharamani looks at how today’s data explosion has left us in the bizarre situation in which we are routinely handing over managerial responsibility to algorithms, while automatically relying on experts whose range of expertise doesn’t extend to a whole view of a problem.
Not surprisingly, this is how mistakes are made, and ‘Think for Yourself’ examines how protocols can create an environment for bad thinking, while intellectual self-reliance can often provide solutions unavailable to data crunchers. While experts and computer-based systems inevitably remain part of the decision-making landscape, they’re not infallible.
There’s a parallel narrative to Covid-19 in ‘Think for Yourself’ that makes Mansharamani’s point. At the height of the 2014 Ebola epidemic, a man who had recently returned from West Africa with a fever and severe abdominal pain was admitted to a hospital in Dallas, USA. “Even after healthcare workers learned the patient had come from Liberia – ground zero of the Ebola hot zone – not one of the people treating him considered the deadly virus as a possible cause.” Thirty-five minutes after the initial reading, the patient’s temperature dropped; because the protocol said his latest reading cleared him for discharge, he was sent home, where he died.
Not long after this, one of the nurses who had treated him needed to take a commercial flight. She reported a fever, but because her temperature was lower than the protocol threshold for restricted travel, she was cleared. Somewhat predictably, she was later confirmed to be infected with Ebola. “The system was designed to prevent situations like this,” says Mansharamani, “but failed because judgement and common sense were outsourced to strict protocols.”
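The failure mode Mansharamani describes, judgement outsourced to a fixed cut-off, can be sketched in a few lines of Python. The threshold figure and patient details below are hypothetical illustrations, not the actual hospital or travel protocol:

```python
def cleared_by_protocol(temp_f, threshold_f=100.4):
    """A rigid screening rule: clear anyone whose latest
    reading falls below the threshold, ignoring all context."""
    return temp_f < threshold_f

# A patient with a falling fever, recent travel from an outbreak
# zone and severe symptoms is still 'cleared' by the rule alone.
patient = {
    "latest_temp_f": 100.1,        # has dipped below the cut-off
    "travel_history": "Liberia",   # ignored by the rule
    "symptoms": ["fever", "severe abdominal pain"],  # also ignored
}

print(cleared_by_protocol(patient["latest_temp_f"]))  # True: discharged
```

The rule is not wrong on its own terms; the error is treating one number as a substitute for the whole clinical picture.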
I put it to Mansharamani that this is essentially the same scenario that plays out in sport week after week, where support staff huddle over laptops on the sidelines crunching data rather than looking at the game. Why is it that, rather than trusting their instincts, these ‘sports scientists’ turn to computer algorithms to predict play and justify their decisions? Every enthusiast instinctively knows that the reason we are so addicted to watching sport is precisely because it cannot be predicted with any certainty. In fact, the reason the players turn up is to contest the outcome.
Mansharamani agrees, explaining that this is where complexity science kicks in, and refers me to his chapter dealing with the Cynefin framework that separates contexts (such as sport) into four groups: simple, complicated, complex and chaotic. Simple is the natural domain for computers: “It could be something like how much interest is due on a credit card. We know the interest rate, we know the balance. The computer can do this and it’s never wrong.” But as you move through the framework to greater levels of complexity, such as sport, “you find that data could give you misleading results.
“In my previous book I talk about the differences between puzzles and mysteries. You can solve a puzzle, like the interest rate on a credit card, but when the problem is more complex and unpredictable, it becomes a mystery and just analysing the data may not get you any closer to connecting the dots.”
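The ‘simple’ end of that spectrum really is computer territory. Mansharamani’s credit-card example reduces to one line of arithmetic; the balance and rate below are made up purely for illustration:

```python
def monthly_interest(balance, annual_rate):
    """Interest due for one month: a 'puzzle' with exactly one
    right answer, given the balance and the annual rate."""
    return balance * annual_rate / 12

# Hypothetical figures: a £1,200 balance at 18% APR.
print(monthly_interest(1200.00, 0.18))  # 18.0
```

A calculation like this has a single verifiable answer, which is precisely why a computer “is never wrong” here and why the same confidence does not transfer to complex domains such as sport.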
The point that Mansharamani keeps returning to is that at some stage we need to take control and regain our intellectual autonomy. We need to trust ourselves to make the right decision on when it is appropriate to call in outside expertise or rely on big data.
Which brings us to the central irony of this excellent, unconventional and stimulating book on thinking for yourself: for nearly 300 pages we’re being told what to think by a Harvard lecturer. Surely, if I were to accept his argument, the consistent thing to do would be to disregard his advice entirely and think for myself?
Mansharamani, to his credit, likes the idea, but he’s too clever to be caught out by such logic-chopping. “That’s nothing,” he says. “While pitching this book, I went to see a publisher who said to me: ‘Look, I’m the expert in this, so let’s forget all your ideas and stick with mine.’ To which I said: ‘Have you even read what my book is about?’ I went with another publisher in the end.” And that’s Mansharamani’s message in a nutshell. “Just because someone is an ‘expert’ in their field, or has access to all this data, that doesn’t mean that they are better equipped to take your decisions for you.”
‘Think for Yourself’ by Vikram Mansharamani is from Harvard Business Review Press, £22
The downsides of blindly relying on algorithms are exemplified by what happens when small errors surface in navigation software. GPS navigation aids allow us to take our focus away from navigating, sometimes with disastrous results.
In 2008, a bus carrying the Garfield High School softball team crashed into a pedestrian bridge in Seattle, sending 21 kids to the hospital. The driver’s GPS had routed him under the bridge even though it was too low for a bus. Why didn’t he pay attention to the low bridge as he approached it? One reason, perhaps the reason, is that the driver outsourced his thinking to the technology. An algorithm had given him the route, so he didn’t stop to think about the bridge’s height. You see, the GPS had a ‘bus’ setting.
Neither the driver nor the bus company had considered the possibility that the system could mislead. As the president of the bus company put it: “We just thought it would be a safe route because why else would they have a selection for a bus?” The bus setting gave them a false sense of security.
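The trap the bus company fell into, trusting a ‘bus’ setting without asking what data sits behind it, can be shown with a toy route filter in Python. The road names, heights and the assumption that segments with no recorded clearance are treated as passable are all illustrative, not how any real navigation system works:

```python
BUS_HEIGHT_FT = 11.5  # hypothetical vehicle height

# Toy map data: some segments have a recorded clearance,
# others carry no height information at all.
segments = [
    {"name": "Main St", "clearance_ft": 14.0},
    {"name": "Bridge underpass", "clearance_ft": None},  # unrecorded
]

def passable(segment, vehicle_height):
    """Naive filter: a segment with no clearance data is
    assumed safe -- the silent gap behind the crash."""
    c = segment["clearance_ft"]
    return c is None or c > vehicle_height

route = [s["name"] for s in segments if passable(s, BUS_HEIGHT_FT)]
print(route)  # both segments pass, including the low underpass
```

A ‘bus’ setting is only as good as the clearance data behind it; where that data is missing, the routing silently falls back to treating the road as safe, and the label on the setting hides the gap.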
In a similar case in 2013, Apple Maps routed drivers across an operating runway at Alaska’s Fairbanks International Airport. Drivers mindlessly continued beyond road signs warning them of the runway and drove onto the airport grounds. Listening attentively and focusing on those directions, the drivers stopped thinking about where they were actually driving. Clearly, looking out of the window rather than listening to computer-generated instructions would have been more productive. To avoid a real disaster and potential loss of life, airport officials quickly erected barricades in the hope of preventing more of the same risky outsourcing of thought.
Edited extract from ‘Think for Yourself’ by Vikram Mansharamani, reproduced with permission.