Tim Harford is one of my favourite writers and broadcasters. I referenced his book, Messy, in an earlier blog. He recently recorded a new podcast called Cautionary Tales, and episode 3, La La Land: Galileo’s Warning, is especially interesting. In it, Harford tells the story of the 2017 Oscars ceremony, the one where Warren Beatty and Faye Dunaway read out that La La Land had won the coveted Oscar for Best Picture, only for it to emerge that the winner was actually Moonlight and a major faux pas had taken place. After a short post-mortem, blame for the error landed on an accountant called Brian Cullinan, whose job it was to hand the Best Picture envelope to Beatty and Dunaway. He mistakenly handed them a back-up envelope for Best Actress instead, possibly having been distracted backstage tweeting a photo of its winner, Emma Stone.
Now the outcome here was nothing like the outcome in the Sandra Bland case discussed at the end of my last blog. The Best Picture Oscar was rightfully bestowed on Moonlight and, apart from some embarrassment, life went on. But like Gladwell in the Sandra Bland case, Harford refuses to blame the prime suspect, in this instance Brian Cullinan. Rather, he questions the system in which Cullinan, Beatty, Dunaway, and many others operated. He argues that this was a special kind of system failure because the very mechanisms put in place to prevent failure contributed to its cause. A complex and meticulous system had evolved whereby two sets of envelopes were despatched to the venue in case one got lost. Yet the system was optimised for security rather than accuracy – it did nothing to mitigate the risk of a duplicate card being handed out, nor did the card itself make sufficiently clear to the presenters which award it referred to.
Normal accidents
Harford cites the work of Charles Perrow, a sociology professor whose book Normal Accidents looks at the conditions under which such errors occur. Perrow identifies two factors that make a system vulnerable to failure. The first is complexity. Some systems are linear: their parts interact in mostly visible and predictable ways. Others, by contrast, are complex: the parts interact in numerous indirect ways and it can be difficult to get a God’s eye view of the whole picture. The second is coupling: how much slack there is in the system. In tightly coupled systems the components need to be combined in a particular order and time frame, leaving no room for error. Most systems can be ranked according to the degree of their complexity and how tightly coupled they are; those that are both complex and tightly coupled are the most vulnerable. When he wrote his book in 1984, Perrow ranked systems according to the following matrix:

[Perrow’s matrix: interactions from linear to complex along one axis, coupling from loose to tight along the other, with systems placed in the resulting four quadrants.]
Based on Harford’s thinking, an updated chart would drop the Oscars into the dangerous top-right quadrant. The secrecy of the process, together with the redundancy introduced by the duplicate set of cards, injects complexity into the system, and the fact that the event goes out live on TV makes it tightly coupled.
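To make the framework concrete, here is a minimal sketch in Python of how the two axes might be encoded. The numerical scores are invented for illustration; the quadrant placements follow Perrow’s original chart and the examples discussed in this post, not any scoring of Perrow’s or Harford’s.

```python
# A rough sketch of Perrow's two-factor framework (my own encoding, not his).
# Each system gets a complexity score and a coupling score on a crude 0-1 scale;
# the dangerous quadrant is the one where both are high.

SYSTEMS = {
    "nuclear plant":      (0.9, 0.9),  # Perrow's archetype: complex and tight
    "dam (1984 view)":    (0.2, 0.9),  # linear but tightly coupled
    "post office (1984)": (0.3, 0.2),  # linear and loosely coupled
    "Oscars ceremony":    (0.8, 0.9),  # Harford: secrecy and redundancy, live TV
}

def quadrant(complexity: float, coupling: float) -> str:
    """Classify a system into one of Perrow's four quadrants."""
    horizontal = "complex" if complexity >= 0.5 else "linear"
    vertical = "tightly coupled" if coupling >= 0.5 else "loosely coupled"
    return f"{horizontal}, {vertical}"

for name, (cx, cp) in SYSTEMS.items():
    flag = "  <- danger zone" if cx >= 0.5 and cp >= 0.5 else ""
    print(f"{name:20s}: {quadrant(cx, cp)}{flag}")
```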
But it’s not just the Oscars. Since 1984, when Perrow wrote his book, systems have become increasingly complex.
Take dams. Perrow argued that the system around dams is tightly coupled but very linear (top-left quadrant). However, in their book Meltdown, Chris Clearfield and András Tilcsik cite research by a dam inspector showing that since the 1990s new technologies and regulations have changed the way dams in the US are operated. The introduction of sensors and remote operators has made them more complex, with the result that “the probability of a failure increases”.
Post offices were another system that Perrow placed outside the danger zone (bottom-left quadrant). The recent news in the UK of the Post Office settling a case brought by postmasters accused of theft shows how short-sighted that placement turned out to be. The case stems from the early 2000s, when the Post Office wrongly accused thousands of postmasters of dipping into the tills based on ‘evidence’ from its new Horizon IT platform. Many were told to pay back supposedly missing funds or face prosecution. Some were convicted and imprisoned, while many more pleaded guilty to lesser charges to end further action. According to the FT, an independent forensic review found “exceptionally complex systems that had trouble linking with other systems; a lack of proper training; and a business model that gave sub-postmasters all the responsibility for dealing with any problems.” In the words of the Meltdown authors: “It turns out that Horizon was both complex and tightly coupled.”
Increasing complexity
Brian Arthur, a leading complexity thinker at the Santa Fe Institute, has proposed three mechanisms through which systems become more complex as they evolve over time. The first is the formation of more and more niches within an ecosystem, which he calls growth in coevolutionary diversity. The example he gave, in 1994, was the proliferation of specialised products and processes in the computer industry. He finds a parallel in biology and quotes Darwin:
“The enormous number of animals in the world depends of their varied structure and complexity… hence as the forms became complicated, they opened fresh means of adding to their complexity.”
The second is structural deepening, where functionality is added to allow a system to break out of its performance limits, leading to the creation of subsystems. For example, today’s jet engine is significantly more complex than the one designed by Frank Whittle in the 1930s, harnessing many more parts and interconnections.
The last is where simple elements of one system are captured and re-tasked to different ends, which he calls capturing software. An example is the development of financial derivatives, where new ideas were overlaid on simpler trading markets to create new ones.
He argues that although bursts of simplicity will occasionally cut through, these mechanisms tend to drive complexity unsteadily upwards: “In this back-and-forth dance between complexity and simplicity, complication usually gains a net edge over time.”
The advent of artificial intelligence is likely to compound the trend towards complexity. The lack of transparency inherent in many algorithms is one of the defining features of complexity. In her book, Weapons of Math Destruction, Cathy O’Neil identifies three elements of a harmful algorithm, of which opacity is one (the others are whether it works against the subject’s interest and whether it can scale). So far, many systems that use artificial intelligence are not tightly coupled because they leave room for human intervention, but if coupling catches up with complexity, more systems will slip into the dreaded top-right quadrant.
In many cases safety concerns themselves are driving systems towards the top-right quadrant. Adding features in the name of safety risks increasing a system’s complexity. In the Oscars case, the practice of deploying two sets of cards did exactly that. Harford notes on his podcast that the Academy’s solution to its system failure, introducing a third set of cards, may not be the wisest response.
How we see systems
Yet even while systems are getting more complex, we continue to see them in very simple terms. Nassim Taleb makes the point in his book, Antifragile:
“Man-made complex systems tend to develop cascades and runaway chains of reactions that decrease, even eliminate, predictability and cause outsized events. So the modern world may be increasing in technological knowledge, but, paradoxically, it is making things a lot more unpredictable.”
Not only are systems more unpredictable ex ante; even after the fact, we frequently fail to see them for what they are. When a system fails, we tend to focus on one of its component parts rather than on how the whole thing hangs together. Perrow looks at where the blame was cast following the nuclear accident at Three Mile Island. The President’s Commission blamed primarily the operators, the equipment manufacturers blamed only the operators, the officials who ran the plant blamed the equipment, and the experts who did a study for the Nuclear Regulatory Commission blamed the design. Perrow concludes that the contribution of each of these was quite trivial; it was the complexity of the system as a whole that led to the accident.
It’s the same with the Brexit referendum. There are many suggestions as to why it swung the way it did: a populist uprising, post-truth campaigning, the role of social media, the positions taken by key characters in the campaign; the list goes on. But Dominic Cummings, campaign director of Vote Leave, writes:
“The cold reality of the referendum is no clear story, no ‘one big causal factor’, and no inevitability – it was ‘men going at it blind’. The result was an emergent property of many individual actions playing out amid a combination of three big forces… Many of these actions were profoundly nonlinear and interdependent and the result that we actually witnessed was very close.”
The same goes for the financial crisis. The list of proximate causes is a long one: low interest rates, government housing policy, unsustainable credit demand, weak credit underwriting standards, the mixed public/private mandate of Fannie Mae and Freddie Mac, China’s accumulation of dollar reserves, failures of financial regulation, the role of credit rating agencies, the growth of the shadow banking system, poorly designed compensation schemes, financial institution leverage, mark-to-market accounting, financial institution interconnectedness, the perception of ‘too big to fail’, short-selling, and more.
Indeed, everyone has their pet theory. Populists blame the bankers, libertarians blame monetary policy, bank executives blame short sellers. But it was the system as a whole that failed, as Ben Bernanke, Federal Reserve Chairman at the time, acknowledged when he spoke to the Financial Crisis Inquiry Commission: “The system’s vulnerabilities, together with gaps in the government’s crisis-response toolkit, were the principal explanations of why the crisis was so severe and had such devastating effects on the broader economy.”
Across all of these examples, there’s a tendency to blame the operator. That was the conclusion of the President’s Commission on Three Mile Island; in the case of Brexit, the “lies” supposedly told by prominent members of the Leave campaign have attracted far more attention than the “emergent property of many individual actions” that Cummings credits; and in the financial crisis the bankers ended up in the cross-hairs more than most.
The idea of personal responsibility is deeply rooted in our culture, and a man-made disaster will typically have us searching for human culprits. Perrow estimates that on average 60-80% of accidents involve human error; that error may not be the direct cause of the accident, but it is so much easier to blame the operator.
The reason we have difficulty seeing systems is that we are continuously distracted by events. Donella Meadows writes in her book, Thinking in Systems:
“Systems fool us by presenting themselves—or we fool ourselves by seeing the world—as a series of events. The daily news tells of elections, battles, political agreements, disasters, stock market booms or busts. Much of our ordinary conversation is about specific happenings at specific times and places. A team wins. A river floods. The Dow Jones Industrial Average hits 10,000. Oil is discovered. A forest is cut. Events are the outputs, moment by moment, from the black box of the system.”
Sometimes we move to the next level and consider behaviour. In fact, now that news has become so commoditised, more and more of this is going on. Writing in 1993, Meadows observed that much analysis in the world goes no deeper than events and that behavioural analysis was rare. Today, opinion writers comment frequently on how events accumulate into dynamic patterns of behaviour. The problem is that their focus tends to be on the short term, and many systems do not reveal themselves through their short-term behaviour. After the Three Mile Island accident, latent errors in the system were traced back two years; after the Challenger space shuttle crash they were traced back nine years. Cummings traces the result of the Brexit referendum back to the financial crisis eight years earlier, and the financial crisis itself can be traced back to regulatory changes put in place starting thirty years before that.
Many commentators also latch onto the parts of the system that fit their preconceptions. Because systems have so many moving parts, when they fail they throw up something for everyone.
Understanding complexity
Clearly there are numerous evolved reasons why we react like this and, as the last point makes clear, they may be compounded by incentives. But our very education is instrumental too. From a young age we are taught to start small and scale up. Whether it’s counting from one to ten as a foundation for counting to bigger numbers, drawing a small part of a landscape before expanding to the whole, or understanding history as a linear timeline of facts, we are taught to build up to a picture by focusing on the component parts. For many systems that works fine. But for the multitude of systems that are non-linear it falls flat. As Meadows writes, “If we’ve learned that a small push produces a small response, we think that twice as big a push will produce twice as big a response. But in a non-linear system, twice the push could produce one-sixth the response, or the response squared, or no response at all.”
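Meadows’ point is easy to demonstrate with a toy example. The saturating response function below is purely illustrative (my own choice, not hers): doubling the push comes nowhere near doubling the response.

```python
import math

# A purely illustrative non-linear (saturating) response: the system responds
# almost proportionally to small pushes, then barely at all to larger ones.
def response(push: float) -> float:
    return math.tanh(push)

small, big = 0.5, 1.0                     # "big" is twice the "small" push
print(response(small))                    # ~0.46
print(response(big))                      # ~0.76 -- far from twice the response
print(response(big) / response(small))    # ~1.65, not 2.0
```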
Indeed, such thinking is the source of many paradoxes:
- The Paradox of Thrift: Families save more during a recession, yet the collective good would be better served by everyone spending more.
- Abilene Paradox: Perhaps the opposite of the paradox of thrift, this is when a group makes a collective decision that is counter to the thoughts and feelings of its individual members. Named after the Texas town of Abilene, it reflects an anecdote in which family members agree to take a day trip in order to keep each other happy, learning later that no-one had really wanted to go.
- Fenno’s Paradox, which we saw in the last blog: In the US, according to Gallup, 53% of people approve of the way their own congressional representative is handling his or her job, yet overall approval of the way Congress is handling its job is only 20% (Jan 2019).
- Braess’ Paradox: Add more roads to a network and overall traffic may actually slow down. The new link can create a shortcut that becomes overused, funnelling traffic onto roads that end up congested. The network change creates a new game structure, in effect a multiplayer prisoner’s dilemma, in which a less efficient Nash equilibrium emerges (a worked example follows this list).
- Condorcet Paradox: While individual voters may have so-called transitive preferences – if they prefer A to B and B to C, they prefer A to C – that may not be true of the electorate as a whole. Scaled up, the electorate may prefer C to A. Although no individual is being inconsistent, we can be collectively, as conflicting majorities made up of different groups of voters emerge.
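To make the Braess point concrete, here is a minimal sketch in Python of the textbook four-node example (the numbers are the standard illustration, not drawn from any of the sources above): 4,000 drivers travel from Start to End, and adding a free shortcut leaves everyone worse off.

```python
# Two symmetric routes, Start->A->End and Start->B->End. The Start->A and
# B->End legs each take (cars on that leg / 100) minutes; the other two legs
# take a flat 45 minutes regardless of traffic.

def journey_time_without_shortcut(drivers: int = 4000) -> float:
    # With no shortcut, the equilibrium is an even split between the two routes.
    per_route = drivers / 2
    return per_route / 100 + 45          # 2000/100 + 45 = 65 minutes

def journey_time_with_shortcut(drivers: int = 4000) -> float:
    # A zero-minute shortcut from A to B makes Start->A->B->End the best reply
    # for every individual driver, so in equilibrium everyone takes it.
    return drivers / 100 + 0 + drivers / 100   # 40 + 0 + 40 = 80 minutes

print(journey_time_without_shortcut())  # 65.0 minutes before the new road
print(journey_time_with_shortcut())     # 80.0 minutes after it: slower for all
```

No driver can do better by switching back to one of the old routes (each would now take 40 + 45 = 85 minutes), which is what makes the slower outcome a Nash equilibrium.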
Another, less well-known way in which we struggle to reconcile the individual with the collective is when we think about expected values. I have been following the work of Ole Peters since being introduced to it in the context of investment returns in 2015. His theory of ergodicity concerns our confusion between what he calls an ensemble average and a time average. Both formally through maths and informally through our own intuition, we frequently assume that the average outcome when one person does something a thousand times is the same as when a thousand people do it once, at the same time. But simulations show that for many set-ups that just isn’t the case. What looks like a good bet for the ensemble as a whole can drive an individual to bankruptcy over time.
Take a bet where you put in a £100 stake: if you throw heads your wealth increases by 50%, but if you throw tails it falls by 40%. If a thousand people took this bet once, they would collectively end up with around £105,000: roughly 500 people would get £150 and 500 would get £60. That’s a 5% return for the group. But if I were to take the bet myself over a thousand tosses of the coin, I wouldn’t end up so well off. A few tails in a row would ruin me – after three of them, I would need four heads in a row just to recover what I’d lost. More often than not, I would be all but wiped out long before the thousandth toss.
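A quick simulation makes the gap visible. This is a minimal sketch using the bet’s numbers from above; the simulation set-up (10,000 people for the ensemble, a fixed random seed) is my own choice.

```python
import random

random.seed(42)
STAKE, UP, DOWN = 100.0, 1.5, 0.6

# Ensemble average: 10,000 people each take the bet once, at the same time.
people = [STAKE * (UP if random.random() < 0.5 else DOWN) for _ in range(10_000)]
print(sum(people) / len(people))   # ~105: roughly a 5% gain across the group

# Time average: one person takes the same bet 1,000 times, compounding.
wealth = STAKE
for _ in range(1_000):
    wealth *= UP if random.random() < 0.5 else DOWN
print(wealth)                      # typically a vanishingly small fraction of £100

# Why: the expected multiplier per toss is 0.5*1.5 + 0.5*0.6 = 1.05, but the
# typical (geometric mean) multiplier is (1.5*0.6)**0.5 ≈ 0.95, so an
# individual's wealth shrinks by about 5% per toss over the long run.
```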
The flaw has been discussed explicitly by Rory Sutherland in his book, Alchemy, previewed here, and in Nassim Taleb’s book, Skin in the Game. It was also identified in passing by Stephen Schwarzman in his book, What It Takes, reviewed here. I will leave further discussion for a later post, because it has all sorts of real-world consequences; suffice to say that treating a system as the additive sum of its parts can once again be a recipe for disaster.
The answers
More and more outcomes are determined by systems, and systems are getting more complex. Nuclear reactors are the archetypal complex system, so it is no surprise that Perrow began his work there. But alongside them we can now add political referendums, financial services infrastructure and post office management systems. In the last blog we saw Gladwell taking a holistic look at policing systems, and in this one we have taken a holistic look at the announcement of Oscar winners.
Yet we are failing to really grapple with this trend. Confronted with increasing complexity, we seem quicker than ever to leap to the simplest conclusion, the one that focuses on a single dimension of the system. Partly this stems from impatience – the work required to analyse the whole system is vast, and we are often in no mood to wait. Partly it reflects the natural human desire to apportion blame and identify a scapegoat. But our quick, one-dimensional responses feed tribalism. With so many components driving outcomes, there is something for everyone, as we saw in the aftermath of the financial crisis. And when systems are non-linear, hindsight offers many routes from A to B that don’t take in all the stops. The trend towards greater tribalism is well documented; it has many root causes, but it is also tied to the growth of system complexity.