"We all suffer, to a greater or lesser extent, from an illusion of understanding, an illusion that we understand how things work when in fact our understanding is meager," Philip Fernbach asserts bluntly, early in his new book. Fernbach, a cognitive scientist at the University of Colorado Boulder, is co-author of "The Knowledge Illusion: Why We Never Think Alone."
Fernbach and his co-author, cognitive scientist Steven Sloman at Brown University, argue that this illusion of knowledge has led to political gridlock.
Fernbach speaks with Colorado Matters host Ryan Warner.
Read an excerpt:
Introduction: Ignorance And The Community Of Knowledge

Three soldiers sat in a bunker surrounded by three-foot-thick concrete walls, chatting about home. The conversation slowed and then stopped. The cement walls shook and the ground wobbled like Jell-O. Thirty thousand feet above them in a B-36, crew members coughed and sputtered as heat and smoke filled their cabin and dozens of lights and alarms blared. Meanwhile, eighty miles due east, the crew of a Japanese fishing trawler, the not-so-lucky Lucky Dragon Number Five (Daigo Fukuryū Maru), stood on deck, staring with terror and wonder at the horizon. The date was March 1, 1954, and they were all in a remote part of the Pacific Ocean witnessing the largest explosion in the history of humankind: the detonation of a thermonuclear fusion bomb nicknamed “Shrimp,” code-named Castle Bravo.

But something was terribly wrong. The military men, sitting in a bunker on Bikini Atoll, close to ground zero, had witnessed nuclear detonations before and had expected a shock wave to pass by about 45 seconds after the blast. Instead the earth shook. That was not supposed to happen. The crew of the B-36, flying a scientific mission to sample the fallout cloud and take radiological measurements, were supposed to be at a safe altitude, yet their plane blistered in the heat.

All these people were lucky compared to the crew of the Daigo Fukuryū Maru. Two hours after the blast, a cloud of fallout blew over the boat and rained radioactive debris on the fishermen for several hours. Almost immediately the crew exhibited symptoms of acute radiation sickness—bleeding gums, nausea, burns—and one of them died a few days later in a Tokyo hospital. Before the blast, the U.S. Navy had escorted several fishing vessels beyond the danger zone. But the Daigo Fukuryū Maru was already outside the area the Navy considered dangerous.
Most distressing of all, a few hours later, the fallout cloud passed over the inhabited atolls Rongelap and Utirik, irradiating the native populations. Those people have never been the same. They were evacuated three days later after suffering acute radiation sickness and temporarily moved to another island. They were returned to the atoll three years later but were evacuated again after rates of cancer spiked. The children got the worst of it. They are still waiting to go home.

The explanation for all this horror is that the blast force was much larger than expected. The power of nuclear weapons is measured in terms of TNT equivalents. The “Little Boy” fission bomb dropped on Hiroshima in 1945 exploded with a force of sixteen kilotons of TNT, enough to completely obliterate much of the city and kill about 100,000 people. The scientists behind Shrimp expected it to have a blast force of about six megatons, around three hundred times as powerful as Little Boy. But Shrimp exploded with a force of fifteen megatons, nearly a thousand times as powerful as Little Boy. The scientists knew the explosion would be big, but they were off by a factor of about three. The error was due to a misunderstanding of the properties of one of the major components of the bomb, an element called lithium-7. Before Castle Bravo, lithium-7 was believed to be relatively inert. In fact, lithium-7 reacts strongly when bombarded with neutrons, often decaying into an unstable isotope of hydrogen, which fuses with other hydrogen atoms, giving off more neutrons and releasing a great deal of energy. Compounding the error, the teams in charge of evaluating the wind patterns failed to predict the easterly direction of winds at higher altitudes that pushed the fallout cloud over the inhabited atolls.

This story illustrates a fundamental paradox of humankind. The human mind is both genius and pathetic, brilliant and idiotic. People are capable of the most remarkable feats, achievements that defy the gods.
We went from discovering the atomic nucleus in 1911 to megaton nuclear weapons in just over forty years. We have mastered fire, created democratic institutions, stood on the moon, and developed genetically modified tomatoes.

And yet we are equally capable of the most remarkable demonstrations of hubris and foolhardiness. Each of us is error-prone, sometimes irrational, and often ignorant. It is incredible that humans are capable of building thermonuclear bombs. It is equally incredible that humans do in fact build thermonuclear bombs (and blow them up even when they don’t fully understand how they work). It is incredible that we have developed governance systems and economies that provide the comforts of modern life even though most of us have only a vague sense of how those systems work. And yet human society works amazingly well, at least when we’re not irradiating native populations.

How is it that people can simultaneously bowl us over with their ingenuity and disappoint us with their ignorance? How have we mastered so much despite how limited our understanding often is? These are the questions we will try to answer in this book.

Thinking As Collective Action

The field of cognitive science emerged in the 1950s in a noble effort to understand the workings of the human mind, the most extraordinary phenomenon in the known universe. How is thinking possible? What goes on inside the head that allows sentient beings to do math, understand their mortality, act virtuously and (sometimes) selflessly, and even do simple things, like eat with a knife and fork? No machine, and probably no other animal, is capable of these acts.

We have spent our careers studying the mind. Steven is a professor of cognitive science who has been researching this topic for over twenty-five years. Phil has a doctorate in cognitive science and is a professor of marketing whose work focuses on trying to understand how people make decisions.
We have seen directly that the history of cognitive science has not been a steady march toward a conception of how the human mind is capable of amazing feats. Rather, a good chunk of what cognitive science has taught us over the years is what individual humans can’t do—what our limitations are. The darker side of cognitive science is a series of revelations that human capacity is not all that it seems, that most people are highly constrained in how they work and what they can achieve. There are severe limits on how much information an individual can process (that’s why we can forget someone’s name seconds after being introduced). People often lack skills that seem basic, like evaluating how risky an action is, and it’s not clear they can ever be learned (hence many of us—one of the authors included—are absurdly scared of flying, one of the safest modes of transportation available). Perhaps most important, individual knowledge is remarkably shallow, only scratching the surface of the true complexity of the world, and yet we often don’t realize how little we understand. The result is that we are often overconfident, sure we are right about things we know little about.

Our story will take you on a journey through the fields of psychology, computer science, robotics, evolutionary theory, political science, and education, all with the goal of illuminating how the mind works and what it is for—and why the answers to these questions explain how human thinking can be so shallow and so powerful at the same time.

The human mind is not like a desktop computer, designed to hold reams of information. The mind is a flexible problem solver that evolved to extract only the most useful information to guide decisions in new situations. As a consequence, individuals store very little detailed information about the world in their heads. In that sense, people are like bees and society a beehive: Our intelligence resides not in individual brains but in the collective mind.
To function, individuals rely not only on knowledge stored within our skulls but also on knowledge stored elsewhere: in our bodies, in the environment, and especially in other people. When you put it all together, human thought is incredibly impressive. But it is a product of a community, not of any individual alone.

The Castle Bravo nuclear testing program is an extreme example of the hive mind. It was a complex undertaking requiring the collaboration of about ten thousand people who worked directly on the project and countless others who were indirectly involved but absolutely necessary, like politicians who raised funds and contractors who built barracks and laboratories. There were hundreds of scientists responsible for different components of the bomb, dozens of people responsible for understanding the weather, and medical teams responsible for studying the ill effects of handling radioactive elements. There were counterintelligence teams making sure that communications were encrypted and no Russian submarines were close enough to Bikini Atoll to compromise secrecy. There were cooks to feed all these people, janitors to clean up after them, and plumbers to keep the toilets working. No one individual had one one-thousandth of the knowledge necessary to fully understand it all. Our ability to collaborate, to jointly pursue such a complex undertaking by putting our minds together, made possible the seemingly impossible.

That’s the sunny side of the story. In the shadows of Castle Bravo are the nuclear arms race and the cold war. What we will focus on is the hubris that it exemplifies: the willingness to blow up a fifteen-megaton bomb that was not adequately understood.

From The Knowledge Illusion. Published by Riverhead Books, a member of Penguin Group (USA) LLC. Copyright © Steven Sloman and Philip Fernbach, 2017.