I rather liked the idea of 'frozen accidents', expounded here by Shane Parrish. He quotes the physicist Murray Gell-Mann, who believed that the things we observe in the world are the result both of a set of fundamental laws and of random accidents - the unforeseen occurrences that create the opportunity for new possibilities but which, had they gone differently, would have led to a very different world. Once these accidents happen, they become frozen in time, having a potentially significant effect on subsequent developments. To understand how complex systems (natural ecosystems and evolution, but also markets and organisations) work, we need to understand the role that both can play in shaping the future:
'...most single accidents make very little difference to the future, but others may have widespread ramifications, many diverse consequences all traceable to one chance event that could have turned out differently. Those we call frozen accidents.'
Complexity comes from the combination of simple rules and these 'bounces' that could have resulted in many different outcomes but which, once part of history, are irreversible. Frozen accidents of history have a nonlinear effect on everything that comes afterwards. Shane tells a couple of great stories by way of illustration (which I'll quote in full) - the first from Eric Beinhocker's The Origin of Wealth, involving American sharpshooter Annie Oakley:
'In the late 1800s, “Buffalo Bill” Cody created a show called Buffalo Bill's Wild West Show, which toured the United States, putting on exhibitions of gun fighting, horsemanship, and other cowboy skills. One of the show's most popular acts was a woman named Phoebe Moses, nicknamed Annie Oakley. Annie was reputed to have been able to shoot the head off of a running quail by age twelve, and in Buffalo Bill's show, she put on a demonstration of marksmanship that included shooting flames off candles, and corks out of bottles. For her grand finale, Annie would announce that she would shoot the end off a lit cigarette held in a man's mouth, and ask for a brave volunteer from the audience. Since no one was ever courageous enough to come forward, Annie hid her husband, Frank, in the audience. He would “volunteer,” and they would complete the trick together. In 1880, when the Wild West Show was touring Europe, a young crown prince (and later, kaiser), Wilhelm, was in the audience. When the grand finale came, much to Annie's surprise, the macho crown prince stood up and volunteered. The future German kaiser strode into the ring, placed the cigarette in his mouth, and stood ready. Annie, who had been up late the night before in the local beer garden, was unnerved by this unexpected development. She lined the cigarette up in her sights, squeezed…and hit it right on the target.
Many people have speculated that if at that moment, there had been a slight tremor in Annie's hand, then World War I might never have happened. If World War I had not happened, 8.5 million soldiers and 13 million civilian lives would have been saved. Furthermore, if Annie's hand had trembled and World War I had not happened, Hitler would not have risen from the ashes of a defeated Germany, and Lenin would not have overthrown a demoralized Russian government. The entire course of twentieth-century history might have been changed by the merest quiver of a hand at a critical moment. Yet, at the time, there was no way anyone could have known the momentous nature of the event.'
The fact that such small non-events and inputs can have such hugely disproportionate effects on what happens later means that it is almost impossible to truly predict the future.
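This sensitivity to tiny differences is exactly what chaos theory formalises. As a toy illustration of my own (not from Beinhocker or Parrish), the logistic map is a classic one-line chaotic system: two trajectories that begin a hair's breadth apart end up wildly different, the numerical analogue of a quiver of the hand becoming a frozen accident.

```python
# Toy illustration of sensitive dependence on initial conditions,
# using the logistic map x -> r*x*(1-x) with r = 4 (chaotic regime).

def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map from x0 and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)            # one version of history
b = logistic_trajectory(0.3 + 1e-10)    # the same, plus a tiny 'quiver'

# The gap between the two histories starts microscopic, then explodes.
divergence = [abs(x - y) for x, y in zip(a, b)]
print(f"initial gap: {divergence[0]:.1e}, largest gap: {max(divergence):.3f}")
```

The starting difference of one part in ten billion is amplified roughly twofold at every step, so within a few dozen iterations the two 'histories' bear no resemblance to each other - which is why rigid long-range prediction in such systems fails.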
In a more modern example, Beinhocker also tells the story of Bill Gates, and how Microsoft created their first operating system.
'[In 1980] IBM approached a small company with forty employees in Bellevue, Washington. The company, called Microsoft, was run by a Harvard dropout named Bill Gates and his friend Paul Allen. IBM wanted to talk to the small company about creating a version of the programming language BASIC for their new PC. At their meeting, IBM asked Gates for his advice on what operating systems (OS) the new machine should run. Gates suggested that IBM talk to Gary Kildall of Digital Research, whose CP/M operating system had become the standard in the hobbyist world of microcomputers. But Kildall was suspicious of the blue suits from IBM and when IBM tried to meet him, he went hot-air ballooning, leaving his wife and lawyer to talk to the bewildered executives, along with instructions not to sign even a confidentiality agreement.
The frustrated IBM executives returned to Gates and asked if he would be interested in the OS project. Despite never having written an OS, Gates said yes. He then turned around and licensed a product appropriately named Quick and Dirty Operating System, or Q-DOS, from a small company called Seattle Computer Products for $50,000, modified it, and then relicensed it to IBM as PC-DOS. As IBM and Microsoft were going through the final language for the agreement, Gates asked for a small change. He wanted to retain the rights to sell his DOS on non-IBM machines in a version called MS-DOS. Gates was giving the company a good price, and IBM was more interested in PC hardware than software sales, so it agreed. The contract was signed on August 12, 1981. The rest, as they say, is history. Today, Microsoft is a company worth $270 billion while IBM is worth $140 billion.'
Whilst we may feel comfortable predicting some more obvious and certain outcomes, the point is that making rigid predictions in complex adaptive environments is largely a fool's game. Like mutations in evolution, random accidents are just as important in shaping the future as incremental improvements, and so, as Shane says, we must learn that 'predicting is inferior to building systems that don't require prediction'.
In other words, the systems and organisations that we build need to be far more responsive, able to capitalise rapidly on changing contexts, with in-built anti-fragility. Leadership in this context is about making smart decisions by recognising the small changes that can lead to disproportionate effects.