She also didn’t know that she was in Ethiopia, because this was millions of years before anybody had the bright idea of drawing lines on a map and giving the shapes names that we could have wars about.


Humans see patterns in the world, we can communicate this to other humans and we have the capacity to imagine futures that don’t yet exist: how if we just changed this thing, then that thing would happen, and the world would be a slightly better place.


Humans have a very long history of getting into pointless fights.


The problem is that history is slippery: nobody bothered to write down the vast majority of stuff that happened in it, and lots of the people who did write stuff down might have been mistaken, or mad, or lying, or extremely racist.


Exactly how many other species of humans were knocking around at that point is a matter of some debate. The business of taking fragmentary skeletons or fragmentary DNA and trying to work out exactly what counts as separate species, or subspecies, or just a slightly weird version of the same species, is a tricky one.


And then we repeat this many times in a hundred thousand different ways, over and over again, and what were once wild innovations turn into traditions, which spawn new innovations in turn, until eventually you end up with something that you’d call “culture” or “society.”


Wheels actually get invented surprisingly late in the scheme of things, well after civilization has been cheerfully muddling along without them for thousands of years. The first wheel in archaeological history, which pops up about 5,500 years ago in Mesopotamia, wasn’t even used for transport: it was a potter’s wheel.


The thing is that evolution, as a process, is not smart — but it is at least dumb in a very persistent way. All that matters to evolution is that you survive the thousand possible horrible deaths that lurk at every turn for just long enough to ensure that your genes make it through to the next generation. If you manage that, job done. If not, tough luck. This means that evolution doesn’t really do foresight. If a trait gives you an advantage right now, it’ll be selected for the next generation, regardless of whether or not it’s going to end up lumbering your great-great-grandchildren with something that’s woefully outdated. Equally, it doesn’t give points for prescience — saying, “Oh, this trait is kind of a hindrance now, but it’ll come in really useful for my descendants in a million years’ time, trust me” cuts absolutely no ice. Evolution gets results not by planning ahead, but rather by simply hurling a ridiculously large number of hungry, horny organisms at a dangerous and unforgiving world and seeing who fails least.

This means that our brains aren’t the result of a meticulous design process aimed at creating the best possible thinking machines; instead, they’re a loose collection of hacks and bodges and shortcuts that made our distant ancestors 2% better at finding food, or 3% better at communicating the concept “Oh shit, watch out, it’s a lion.”

Those mental shortcuts (they’re called “heuristics”) are absolutely necessary for surviving, for interacting with others and for learning from experience: you can’t sit down and work out everything you need to do from first principles.


That’s not a huge problem when it just means stuff like pointing at the stars in the night sky and going, “Oh, it’s a fox chasing a llama.” But once the imaginary pattern you’re seeing is something like, “most crimes are committed by one particular ethnic group,” it’s a big problem.


The idea that the moon makes people go weird is one that’s been around for centuries. It’s literally where the word lunacy comes from; it’s why we have the werewolf mythology.


Again, this is because of those mental shortcuts our brains use. Two of the main shortcuts are the “anchoring heuristic” and the “availability heuristic,” and they both cause us no end of bother.

Anchoring means that when you make up your mind about something, especially if you don’t have much to go on, you’re disproportionately influenced by the first piece of information you hear.

Availability, meanwhile, means that you make judgment calls on the basis of whatever information comes to mind easiest, rather than deeply considering all the possible information that might be available to you. And that means we’re hugely biased toward basing our worldview on stuff that’s happened most recently, or things that are particularly dramatic and memorable, while all the old, mundane stuff that’s probably a more accurate representation of everyday reality just sort of fades away.


Before I began researching this book, I thought that confirmation bias was a major problem, and everything I’ve read since then convinces me that I was right. Which is exactly the problem: our brains hate finding out that they’re wrong. Confirmation bias is our annoying habit of zeroing in like a laser-guided missile on any scrap of evidence that supports what we already believe, and blithely ignoring the possibly much, much larger piles of evidence that suggest we might have been completely misguided.


There’s even some evidence that, in certain circumstances, the very act of telling people they’re wrong — even if you patiently show them the evidence that clearly demonstrates why this is the case — can actually make them believe the wrong thing more. Faced with what they perceive as opposition, they double down and entrench their beliefs even more strongly. This is why arguing with your racist uncle on Facebook, or deciding to go into journalism, may be an ultimately doomed venture that will only leave you despondent and make everybody else very angry with you.


We’re a social animal, and we really don’t like the feeling of being the odd one out in a group. Which is why we frequently go against all our better instincts in an effort to fit in.

That’s why we get groupthink — when the dominant idea in a group overwhelms all the others, with dissent dismissed or never voiced thanks to social pressure.


Hardly any will say, “Oh yeah, I’m probably below average.” (The most common answer is actually outside the top 10%, but inside the top 20%, like a boastful version of ordering the second-cheapest glass of wine.)


Of all the mistakes our brains make, “confidence” and “optimism” may well be the most dangerous.


Humans are also very bad at risk assessment and planning ahead. That’s partly because the art of prediction is notoriously difficult, especially if you’re trying to make predictions about a highly complex system. But it’s also because once we’ve imagined a possible future that pleases us in some way (often because it fits with our preexisting beliefs), we’ll cheerfully ignore any contrary evidence and refuse to listen to anybody who suggests we might be wrong.


One of the strongest motivators for this kind of wishful-thinking approach to planning is, of course, greed. The prospect of quick riches is one that’s guaranteed to make people lose all sense — it turns out we’re very bad at doing a cost-benefit analysis when the lure of the benefit is too strong.


This is where all our cognitive biases get together and have a bigotry party: we divide the world up according to patterns that might not exist, we make snap judgments based on the first thing to come to mind, we cherry-pick evidence that backs up our beliefs, we desperately try to fit into groups and we confidently believe in our own superiority for no particular good reason.


The thing about deserts is that they’re quite dry and absorbent: as much as 75% of the diverted river water never even made it to the farms.


While the Aral Sea receded, the salt continued to hang around, making the waters saltier and saltier and less and less capable of supporting life.


Nobody cutting down any single tree was responsible for the problem until it was too late, at which point everybody was responsible.


Ecosystems are complicated, and messing with the delicate balance of nature will come back to bite you.


In the Americas, this rising inequality seems to hit a plateau after about 2,500 years of agriculture; but in the Old World, it just keeps on going up and up. At some point, these elites stop just being a bit richer than everybody else, and start actually ruling over them.


He did so while instituting a series of reforms that would set standards for how a modern country should be organized: reducing the influence of feudal lords and establishing a centralized bureaucracy, standardizing writing, money and measurement systems and building key communications infrastructure such as a huge network of roads and an early mail service. And he started work constructing the first sections of what would become the Great Wall.


Another problem with democracy is that people are generally big fans of it when they think it might give them power, but suddenly become notably less keen when it looks like it might take power away from them. As a result, democracy often involves a frankly exhausting amount of work simply to ensure it keeps on existing.


Don’t encourage immigration only to later turn against those same immigrant communities.

Don’t assume that you’ll always be a democracy, because that’s exactly when things go wrong.


America’s efforts to ban the drinking of alcohol between 1920 and 1933 did lead to fewer people drinking — but it also allowed organized crime to monopolize the alcohol industry, making crime soar in many places.


Unfortunately, this was one of those plans that sounds great when you say it, but entirely relies on your opponents doing exactly what you want them to do.


And what began as a drive for trade in Asia, Africa and the newly discovered Americas would quickly turn into missions of occupation and conquest.


One estimate of the deaths from European colonialism in the 20th century alone puts the figure in the region of 50 million, placing it up there with the crimes of Hitler, Stalin and Mao — and that’s in the century that colonial empires were collapsing.


Waiting for several hundred years to pass then doing a sort of retrospective cost-benefit analysis of your actions is not actually how humans generally distinguish right from wrong. That seems more like an after-the-fact attempt to justify what you already want to believe.


It’s a story of a country committing itself to grand but vague ambitions based on the proclamations of ideological true believers, of expert warnings not being listened to and of a stubborn refusal to acknowledge reality and change course, even when the world is sending you every clear signal that you might have made a mistake.


By the 1690s, the Spanish and the Portuguese had been absolutely coining it for the best part of two centuries on the resources they’d extracted from their American colonies; more recently, the English and the Dutch had joined the game to great success. The European scramble for global empires now covered Asia, Africa and the Americas, as the general strategy of “turn up with guns and take all their stuff” continued to promise untold riches, with no sign of slowing down.

The age of empire was also the age of financial revolution: as a result, much of the sharp end of colonialism was enacted not just directly by the states, but also by state-backed, publicly traded “joint-stock” companies that blurred the lines between mercantile business and geopolitics.


This was the Company of Scotland’s first lesson in the brutal realpolitik of global trade: that just because you say, “We want to do lots of international trade,” and furthermore that you want to do it on your own wish list of terms, doesn’t mean that the rest of the world is simply going to agree with you.


Diplomacy is the art of large groups of humans not being wankers to each other — or at the very least, managing to agree that okay, everybody is a wanker sometimes, but why don’t we try to take it down a notch?


It didn’t just give us lots of science, it gave us the idea of science itself, as something that was a distinct discipline with its own methods rather than just being one variant of “having a bit of a think.”


The reason science has a fairly decent track record is that (in theory, at least) it starts from the sensible, self-deprecating assumption that most of our guesses about how the world works will be wrong. Science tries to edge its way in the general direction of being right, but it does that through a slow process of becoming progressively a bit less wrong.


When we’re surrounded by shiny and unexpected new things all the time, those heuristics we use to make judgments get thrown out of whack. When we’re bombarded by ever more information, it’s not surprising if it gets too much to process and we fall back on picking out the bits that confirm our biases.