I really liked this book by Tim Harford. For one thing, it is very readable. It also seems well-researched: the back of the book is filled with citations, so the author is careful to back up his points. And he validated many of my own conclusions, which is gratifying.
I liked it so much that I tried a new experiment. I heard or read somewhere that to really understand a book you should read it three times. As a start, I tried reading this one twice, which was quite an eye-opener. The second time through did not feel redundant. It was certainly familiar, but there were many points and issues that I noticed or understood differently the second time. I'm glad I chose this book for a re-read, because it kept me engaged. I think I'd have to decide that a book was going to be a keystone of this project in order to read it a third time -- re-reading this one slowed me down significantly, and I'd only want to invest that kind of time in the right book.
For this book, I think I'll let the highlights and my notes (conveniently captured from my Kindle onto Amazon) speak for themselves.
Highlighted Passages

We badly need to believe in the potency of leaders. Our instinctive response, when faced with a complicated challenge, is to look for a leader who will solve it.
we have an inflated sense of what leadership can achieve in the modern world.
One of Tetlock’s more delicious discoveries was that the more famous experts – those who spent a lot of time as talking heads on television – were especially incompetent.
Biologists have a word for the way in which solutions emerge from failure: evolution.
Disconcertingly, given our instinctive belief that complex problems require expertly designed solutions, it is also completely unplanned.
Yet the blind evolutionary process produced marvellous things.
Note: can we simulate a better decision-making system?
strategies: different ways to run an airline or a fast-food chain.
Note: we simulate models in our brains and guess the outcome. The strength is that we can do this at all; the weakness is that our modeling is so often flawed.
Like surfing itself, this is harder than it looks.
Note: walk the plank for 'people are our most valuable asset'; fall off the world's edge for assuming we will always grow and nothing bad will happen.
biological process of evolution through natural selection is entirely blind;
Note: genetic algorithms make no assumptions; they try everything.
the evolutionary mix of small steps and occasional wild gambles is the best possible way to search for solutions
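Reading this, I wanted to see the idea running. Below is a minimal sketch of my own (not from the book) of an evolutionary search that mixes small steps with occasional wild gambles; the fitness landscape and all the parameters are invented purely for illustration.

```python
# A toy evolutionary search: small mutations most of the time,
# an occasional wild gamble, and selection of the fittest.
# The "rugged" fitness function is made up for illustration.
import math
import random

def fitness(x):
    # A rugged landscape: many local peaks, one global peak near x = 0.
    return math.cos(3 * x) - 0.1 * x * x

def mutate(x, gamble_rate=0.1):
    if random.random() < gamble_rate:
        return random.uniform(-10, 10)   # wild gamble: leap anywhere
    return x + random.gauss(0, 0.1)      # small step: local variation

def evolve(pop_size=50, generations=200):
    population = [random.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        # Variation: every survivor produces a mutated offspring.
        offspring = [mutate(x) for x in population]
        # Selection: keep the fittest half of parents plus offspring.
        population = sorted(population + offspring, key=fitness,
                            reverse=True)[:pop_size]
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print(f"best x = {best:.3f}, fitness = {fitness(best):.3f}")
```

Set gamble_rate to zero and the population tends to stall on a local peak; the occasional leap is what lets it find the global one.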
But whether we like it or not, trial and error is a tremendously powerful process for solving problems in a complex world, while expert leadership is not.
His method for dealing with this could be summarised as three ‘Palchinsky principles’: first, seek out new ideas and try new things; second, when trying something new, do it on a scale where failure is survivable; third, seek out feedback and learn from your mistakes as you go along.
There is a limit to how much honest feedback most leaders really want to hear; and because we know this, most of us sugar-coat our opinions whenever we speak to a powerful person. In a deep hierarchy, that process is repeated many times, until the truth is utterly concealed inside a thick layer of sweet-talk. There is some evidence that the more ambitious a person is, the more he will choose to be a yes-man – and with good reason, because yes-men tend to be rewarded.
Even when leaders and managers genuinely want honest feedback, they may not receive it. At every stage in a plan, junior managers or petty bureaucrats must tell their superiors what resources they need and what they propose to do with them. There are a number of plausible lies they might choose to tell, including over-promising in the hope of winning influence as go-getters, or stressing the impossibility of the task and the vast resources needed to deliver success, in the hope of providing a pleasant surprise. Actually telling the unvarnished truth is unlikely to be the best strategy in a bureaucratic hierarchy. Even if someone does tell the truth, how is the senior decision-maker supposed to distinguish the honest opinion of a Peter Palchinsky from some cynical protestation calculated to win a budget increase?
Traditional organisations are badly equipped to benefit from a decentralised process of trial and error. Static, solved problems are ideal for such organisations, as are tasks where generalised expertise counts for much more than local knowledge.
Accepting trial and error means accepting error. It means taking problems in our stride when a decision doesn’t work out, whether through luck or misjudgement. And that is not something human brains seem to be able to do without a struggle.
A recipe for successfully adapting: the three essential steps are to try new things, in the expectation that some will fail; to make failure survivable, because it will be common; and to make sure that you know when you’ve failed.
To produce new ideas we must overcome our tendency to fall in step with those around us, and overcome those with a vested interest in the status quo.
distinguishing success from failure, oddly, can be the hardest task of all: arrogant leaders can ignore the distinction; our own denial can blur it; and the sheer complexity of the world can make the distinction hard to draw even for the most objective judge.
The big picture becomes a self-deluding propaganda poster, the unified team retreats into groupthink, and the chain of command becomes a hierarchy of wastebaskets, perfectly evolved to prevent feedback reaching the top. What works in reality is a far more unsightly, chaotic and rebellious organisation altogether.
the three elements of the idealised, decisive hierarchy: a ‘big-picture’ view produced by the refined analysis of all available information; a united team all pulling in the same direction; and a strict chain of command. Johnson and McNamara managed to tick all those boxes, yet produce catastrophic results. The ‘big-picture’ information that could be summarised and analysed centrally wasn’t the information that turned out to matter. A loyal, unified team left no space for alternative perspectives. And the strict chain of command neatly suppressed bad news from further down the organisation before it reached Johnson.
McNamara thought that with enough computers and enough Harvard MBAs, he could calculate the optimal strategy in war, far from the front lines. That project brought the US Army no joy in Vietnam, but its spirit continued to animate Donald Rumsfeld.
McNamara himself looked for ‘team players’, declaring that it was impossible for a government to operate effectively if departmental heads ‘express disagreement with decisions’ of the President. This was the idealised organisation at its worst. Loyalty wasn’t enough. Merely to ‘express disagreement’ was a threat.
Nothing, it seems, had changed at the top of the US military establishment.
Note: is it even possible for valid information to reach the top of a hierarchy?
‘We willingly implement lessons learnt at the bottom end, because changing and adapting low-level tactics saves lives,’ one British general told me with an air of resignation. ‘But we rarely adapt and implement lessons learned at the top end.’
the right decisions are more likely when they emerge from a clash of very different perspectives.
organisations which ignore internal criticism soon make dreadful errors.
For an organisation that needs to quickly correct its own mistakes, the org. chart can be the worst possible road map.
What Hayek realised, and Allende and Beer did not seem to, was that a complex world is full of knowledge that is localised and fleeting. Crucially, the local information is often something that local agents would prefer to use for their own purposes.
Whichever way they sliced the data, Azoulay, Manso and Zivin found evidence that the more open-ended, risky HHMI grants were funding the most important, unusual and influential research. HHMI researchers, apparently no better qualified than their NIH-funded peers, were far more influential, producing twice as many highly-cited research articles. They were more likely to win awards, and more likely to train students who themselves won awards. They were also more original, producing research that introduced new ‘keywords’ into the lexicon of their research field, changing research topics more often, and attracting more citations from outside their narrow field of expertise.
The HHMI researchers also produced more failures; a higher proportion of their research papers were cited by nobody at all. No wonder: the NIH programme was designed to avoid failure, while the HHMI programme embraced it. And in the quest for truly original research, some failure is inevitable. Here’s the thing about failure in innovation: it’s a price worth paying.
Recall the work by the Santa Fe complexity theorists Stuart Kauffman and John Holland, showing that the ideal way to discover paths through a shifting landscape of possibilities is to combine baby steps and speculative leaps.
So far, we have discovered two vital principles for promoting new technology. First, create as many separate experiments as possible, even if they appear to embody contradictory views about what might work, on the principle that most will fail. Second, encourage some long-shot experiments, even though failure is likely, because the rewards to success are so great.
The key to unpicking a tangled knot is known as an ‘identification strategy’ – how you identify what causes what. If crops grow better in the shade of a rook-infested tree, is that because they prosper from the shade or the bird droppings?
the clearest identification strategy of all is a randomised trial, which hard-wires identification into the design of the experiment itself.
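To convince myself how randomisation "hard-wires" identification into the design, I wrote this small simulation (mine, not the book's; all the numbers are invented). A naive observational comparison mixes the treatment effect with a confounder, while a coin-flip assignment breaks that link by design and recovers the true effect.

```python
# Toy illustration: outcome = TRUE_EFFECT * treated + confounder + noise.
import random

TRUE_EFFECT = 2.0

def outcome(treated, confounder):
    return TRUE_EFFECT * treated + confounder + random.gauss(0, 1)

def mean_difference(data):
    treated = [y for t, y in data if t == 1]
    control = [y for t, y in data if t == 0]
    return sum(treated) / len(treated) - sum(control) / len(control)

random.seed(0)
n = 100_000

# Observational: people with a high confounder opt into treatment more
# often, so the naive comparison confuses treatment with confounder.
observational = []
for _ in range(n):
    c = random.gauss(0, 1)
    t = 1 if random.random() < (0.8 if c > 0 else 0.2) else 0
    observational.append((t, outcome(t, c)))

# Randomised: a coin flip assigns treatment, so the two groups have
# the same distribution of confounders on average.
randomised = []
for _ in range(n):
    c = random.gauss(0, 1)
    t = random.randint(0, 1)
    randomised.append((t, outcome(t, c)))

print(f"true effect:            {TRUE_EFFECT}")
print(f"observational estimate: {mean_difference(observational):.2f}")
print(f"randomised estimate:    {mean_difference(randomised):.2f}")
```

The observational estimate comes out well above 2.0 because the confounder rides along with treatment; the randomised estimate lands on the true effect.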
while a market provides a short, strong feedback loop, in public services the feedback loop is longer and looser. If parents don’t like the local school, they can complain to local politicians, or lobby the headmaster directly. They can also move to a different school, but this act has fewer direct consequences for the school than for a café.
‘We should not try to design a better world. We should make better feedback loops.’
economy needs to be protected from its host country’s own entrenched
Note: administration protects a team from the complexities of the environment.
a built-in mechanism for distinguishing the successes from the failures: ordinary people, voting with their feet.
harnessing the power of ordinary people as a selection mechanism
government regulations, by their very nature, tend to be somewhat impervious to the possibility of improvement.
Sugarcane ethanol can actually lower emissions by harnessing harmful byproducts such as methane; corn ethanol can actually be worse than gasoline, and palm-oil biodiesel grown on former rainforest land can be responsible for the release of over twenty times more carbon dioxide than good old gasoline. The impact of producing biofuels all depends on what crops are grown and how they are processed;
As the biochemist Leslie Orgel famously remarked, ‘Evolution is cleverer than you are’, meaning that when an evolutionary process is let loose upon a problem, it will often find solutions that no human designer would have dreamed of.
Perhaps the mascot of these unlovely consequences should be the great British bulldog
The dark side of Leslie Orgel’s law means that whenever we leap to conclusions about what a particular solution would look like – buildings with inbuilt renewable energy capacity, or cars that run on biofuels – we are likely to discover unwelcome consequences.
‘Any intelligent fool can make things bigger, more complex, and more violent. It takes a touch of genius – and a lot of courage – to move in the opposite direction.’ – attributed to E.F. Schumacher
But what if a system is both complex and tightly coupled? Complexity means there are many different ways for things to go wrong. Tight coupling means the unintended consequences proliferate so quickly that it is impossible to adapt to the failure or to try something different.
Professional domino topplers now use safety gates, removed at the last moment, to ensure that when accidents happen they are contained.
Peter Palchinsky’s second principle: make failures survivable.
Normally, carrying out lots of small experiments – variation and selection – means that survivability is all part of the package. But in tightly coupled systems, failure in one experiment can endanger all. That is the importance of successfully decoupling.
Most insidious are mistakes. Mistakes are things you do on purpose, but with unintended consequences, because your mental model of the world is wrong.
That suggests that the best-placed people of all to spot fraud – or indeed any kind of hidden danger in an organisation – are employees, who are at the front line of the organisation and know most about its problems. Sure enough, Dyck, Morse and Zingales found that employees did indeed lift the lid on more frauds than anyone else.
Above all, when we look at how future financial crises could be prevented, we need to bear in mind the two ingredients of a system that make inevitable failures more likely to be cataclysmic: complexity and tight coupling. Industrial safety experts regard the decoupling of different processes and the reduction of complexity as valuable ends in themselves. Financial regulators should, too.
‘One doesn’t have to be a Marxist to be awed by the scale and success of early-20th-century efforts to transform strong-willed human beings into docile employees.’ – Gary Hamel
Professor Endler’s guppy experiments are a modern classic in evolutionary biology, and a striking example of how a population adapts to a new problem.
went on to produce future generations of well-adapted baby guppies.
Note: hierarchies lose information. The top of the hierarchy can make bad decisions fast. We need to push decisions down, but also decouple to reduce the impact of mistakes.
Adapting is not necessarily something we do. It may well be something that is done to us.
There are three essential steps to using the principles of adapting in business and everyday life, and they are in essence the Palchinsky principles. First, try new things, expecting that some will fail. Second, make failure survivable: create safe spaces for failure or move forward in small steps. As we saw with banks and cities, the trick here is finding the right scale in which to experiment: significant enough to make a difference, but not such a gamble that you’re ruined if it fails. And third, make sure you know when you’ve failed, or you will never learn. As we shall see in the next chapter, this last one is especially difficult when it comes to adapting in our own lives.
Gary Hamel’s recent book, The Future of Management.
the best computer systems in the world cannot substitute for being there, talking about what’s going on and responding at once to subtle situational clues – or in Hayek’s now familiar words, ‘knowledge of the particular circumstance of time and place’.
H.R. McMaster criticised the idea that ‘situational understanding could be delivered on a computer screen’.
Timpson’s company training manual describes the twenty easiest ways to defraud the company, making it clear that the company understands the risks it is running and trusts its employees anyway – and many people respond to being trusted by becoming more trustworthy.
Peer monitoring does not always work, of course; peer groups can turn into self-serving or even corrupt cliques.
it was hard to forget seeing peer monitoring in action: the instant correction of a problem, no matter how small and no matter what the hierarchical relationship might be between head of safety and tea lady.
in a company where the selection mechanism is your teammates rather than top-down rules, there is no room for people who don’t play their part.
A sufficiently disruptive innovation bypasses almost everybody who matters at a company: the Rolodex full of key customers becomes useless; the old skills are no longer called for; decades of industry experience count for nothing. In short, everyone who counts in a company will lose status if the disruptive innovation catches on inside that company – and whether consciously or unconsciously, they will often make sure that it doesn’t.
Note: solution: remove status from the equation.
Peter Palchinsky’s principles: First, try new things; second, try them in a context where failure is survivable. But the third and final essential step is how to react to failure, and Tharp avoided several oddities of the human brain that often prevent us from learning from our failures and becoming more successful.
These, then, are the three obstacles to heeding that old advice, ‘learn from your mistakes’: denial, because we cannot separate our error from our sense of self-worth; self-destructive behaviour, because like the game-show contestant Frank, or Twyla Tharp when marrying Bob Huot, we compound our losses by trying to compensate for them; and the rose-tinted processes outlined by Daniel Gilbert and Richard Thaler, whereby we remember past mistakes as though they were triumphs, or mash together our failures with our successes. How can we overcome them?
when a market test is not available or not appropriate, we need to find other ways to test our ideas
John Kay's book: The Truth about Markets
‘Ever tried. Ever failed. No matter. Try again. Fail again. Fail better.’ – Samuel Beckett