The Bulletin of the Atomic Scientists, 11/1/16
There once was a civilization that led the world in science. Its foremost thinkers made great leaps in mathematics, engineering, astronomy, and medicine, and its institutes of higher learning drew scholars from around the known world. Unfortunately, thanks to a combination of religious zealotry and a failure to adopt new technology, the civilization abandoned its tradition of scientific inquiry and collapsed into closed-mindedness and despotism.
This great flourishing took place in the Islamic world from the eighth to thirteenth centuries, when Muslim scholars developed not just algebra and medical treatises but the scientific method itself. The eleventh-century thinker Ibn Al-Haytham, in his groundbreaking work Optics, argued that knowledge-seeking should begin with “an inspection of the things that exist and a survey of the conditions of visible objects,” and called for “exercising caution in regard to conclusions.” He and thinkers like him were gradually silenced by a rival school of thought led by the anti-rationalist theologian Abu Hamid al-Ghazali, who argued, as some Christian fundamentalists do today, that the only cause of anything is God. By that logic, just because heavy objects are observed to fall downward 10 times out of 10, it does not mean that God could not, on a whim, make them fall up. Al-Ghazali’s views prevailed in the Muslim world, and the West took up the torch of science.
This story is just one historical example of science denial followed by societal collapse cited by Shawn Otto in his big, bracing battle cry, The War on Science: Who’s Waging It, Why It Matters, What We Can Do About It. China’s Mao Zedong put ideology ahead of knowledge when he crafted plans for his Great Leap Forward, which, through grain-production quotas untethered to reality, helped lead to a massive famine that killed tens of millions of people (Dikotter 2010). In Russia under Stalin, biologist Trofim Lysenko was appointed head of the country’s main genetics institute after claiming he knew how to produce better crop yields. He did not, but he grew to wield so much power that research into the science-supported genetics he rejected was virtually outlawed. “Soviet agriculture, biology, and genetics were held back for forty years, weakening the Soviet Union and helping lead to its eventual downfall,” Otto writes.
Examples like these raise an obvious, scary question for Americans who often see their elected leaders frivolously disregarding science: Could it happen here? It is hard to imagine. Democracy in the United States, after all, has done a pretty good job of course-correcting over the years, such that government tends to more or less represent the will of the people. The country remains not just a dominant military power, but an economic and cultural beacon. During presidential elections, a lot of hyperbolic things get said about how if the wrong candidate wins, the nation will fall apart, but then the wrong candidate wins and the nation does not fall apart – not entirely, anyway – because even under the new regime there is still room for its most ardent detractors to advance their various causes. So could the system really be failing us when it comes to science? Otto would argue “yes.”
Snowballs in the Senate
The world’s democracies, for all their strengths, have a science problem. Otto opens his book with a quote from Thomas Jefferson: “Wherever the people are well informed they can be trusted with their own government.” The trouble is, the American people are no longer well informed on the issues that affect them.
To take but a few headline topics: Voters and politicians have had to make decisions in recent years about disappearing fisheries, drought, drug addiction, Ebola, extreme weather, genetically modified food, Internet access, mental health treatment, nuclear power, nuclear weapons, renewable energy, stem-cell research, and vaccinations. None of these can be comprehended without some understanding of, and appreciation for, the underlying science. Lacking that comprehension, voters and legislators act in ignorance, and the problem is only going to get worse, because the scientific revolution itself is going through what Otto calls a “phase change.” As he puts it, “There is a sudden, quantitative expansion of the number of scientists and engineers around the globe, coupled with a sudden qualitative expansion of their ability to collaborate with each other over the Internet.” In short, human knowledge is expanding more rapidly than ever before. Issues that require not merely novel solutions but whole new ethical frameworks will continue to crop up. What sort of limits should we place on autonomous weapons? What about on genetic editing? When does life begin for a clone? We have barely started asking the questions, much less figuring out the answers.
Meanwhile, in the face of mounting complexity, some leaders, whether out of political calculation, psychological impulse, or old-fashioned ignorance, have simply opted for the tried-and-untrue denial route. Take the example of global warming. Based on research going back many decades now, 97% of climate scientists agree that human carbon dioxide emissions are contributing to climate change. Yet only a third of the United States Congress accepts the science (Center for American Progress 2016). Just last year, in an attempt to argue that global warming is a hoax, Republican Senator James Inhofe of Oklahoma threw a snowball on the Senate floor and said, “It’s very, very cold out” (Sheppard 2015). Senator Ted Cruz, a Republican from Texas and at one point a serious presidential primary contender, flat-out told an NPR interviewer last year, “The scientific evidence doesn’t support global warming,” as though by saying so he could make it true (Gleick 2015). Voters are not far behind their leaders: According to a 2015 Pew Research Center poll Otto cites, 78% of Democrats said the Earth was getting warmer because of human activity, but only 10% of Republicans agreed.
So how, in a developed democracy – a land of great universities, technological innovation, and relative opportunity – did we get to a place where most members of one of the two major political parties choose to reject the scientific consensus on such an important issue?
The usual (and unusual) suspects
Otto does a fascinating job of teasing out the culprits, some of whom are more predictable than others.
The villains of the right, for instance, tend to be well known for their anti-science behavior. First come the religious forces who, like al-Ghazali when he attacked his scientifically minded fellow Muslims, or the Vatican when it threatened to torture Galileo for observing that the Earth circles the sun, cannot tolerate it when reality conflicts with scripture. Then there are the corporate forces who mount disinformation campaigns to mislead the public when science-based decision-making would harm profits, as when tobacco companies hid their knowledge that smoking causes lung cancer, or when the oil company Exxon obfuscated its own scientists’ evidence on climate change, as first reported by Inside Climate News (Banerjee, Song, and Hasemyer 2015). The vastness and intricacy of the effort to sow uncertainty about climate change among the US electorate, funded by oil companies and carried out by public-relations firms, think tanks, donation-accepting politicians, and corrupt or co-opted scientists, goes a long way toward explaining why there are still plenty of educated professionals who call themselves climate-change skeptics without any of the embarrassment that would presumably go along with claiming to be, say, gravity skeptics. Between 1999 and 2010 alone, the energy industry spent more than $2 billion to fight climate-change legislation (Otto 2016, 408).
As appalling as this is, the author finds fault on the other side, too, arguing that the intellectual underpinnings of contemporary anti-science trends come out of identity politics and postmodernism, movements typically associated with the academic left. It all began with the central postmodernist principle that there is no such thing as objective truth. This turned out to be a valuable insight for learning how to live in a multicultural society, for it contained the lesson that depending on who you are, you might experience the world very differently from someone with a different set of characteristics. The trouble with the postmodernist outlook is that there is, in fact, such a thing as objective reality. Whether you are black or white, Christian or Muslim, male or female, if you use a ruler to measure the amount of rain that fell in a bucket, you will arrive at the same answer. But postmodernism has treated science as just another way of knowing, and a suspect one at that, with all its white lab coats and claims to authority. With his enthusiasm for conspiracy theories – calling global warming a hoax, tweeting that it was invented by the Chinese – Republican presidential candidate Donald Trump is often accused of living in his own reality (Jacobson 2016). It was the postmodernists who taught us we could do that.
Journalists have not exactly acquitted themselves well, either. All too often, newspapers and talk shows still present scientific consensus and ill-informed opinion as two equal sides in a debate. “If one side presents knowledge and the other opinion, simply reporting both sides is not journalism,” Otto writes. “It constitutes malfeasance.”
Anti-science outbursts among the general public, meanwhile, appear from across the political spectrum. True, it is largely Republicans who have fought against teaching evolution and sex education in schools. But rejection of vaccines has emerged repeatedly in well-off liberal enclaves – sometimes with deadly consequences – even though the myth that vaccines cause autism has been thoroughly debunked, and they are generally safe and effective. The movement to ban genetically modified organisms from our food supply, largely a left-wing cause, ignores the many scientific studies that have failed to find any harmful health effects. In the 2016 election, the presidential candidate you would think might offer refuge to science supporters – Green Party contender Jill Stein – has waffled on vaccines, acknowledging that they are “an invaluable medication,” but also claiming that not all questions and concerns about vaccines “were completely resolved.” As a medical doctor, she presumably knows better, so it seems that she is choosing to pander to anti-vaxxers. With Libertarian candidate Gary Johnson having said he opposes mandatory vaccines, and Trump explicitly anti-vaccination, that leaves Democrat Hillary Clinton as the only major candidate with views on the subject that reflect the scientific consensus (Lopez 2016).
Heal thyself
There is another group that comes in for a drubbing by Otto, and that is scientists themselves, who he says have unnecessarily hurt their own image by turning inward in the last half-century or so.
As the Cold War took shape and Americans began to see investment in science as necessary to win it, federal science funding increased greatly, with the result that scientists “no longer had to impress the public,” Otto argues. Even though much of their funding ultimately came from taxes, they instead worked only to impress the university departments and government agencies funneling the research money. Those departments and agencies, in turn, offered no rewards to scientists for public outreach. Meanwhile, a communication gap between the “two cultures” – the sciences and the humanities – identified by the British physicist and novelist C.P. Snow in 1959 continued to widen. “From the public’s perspective, the science community had largely withdrawn into its ivory tower and gone silent,” Otto writes.
The fate of astronomer Carl Sagan illustrates this institutional embrace of insularity. Fearing an America in which “no one representing the public interest can even grasp the issues,” Sagan starred in the 1980 television series Cosmos, which may have done more to interest non-scientists in the field than anything else for at least a generation. Despite publishing some 500 scientific papers in his career, Sagan appears to have been punished for his celebrity by his peers when he was denied admission to the National Academy of Sciences.
In 1963, Snow argued in favor of a “third culture” that would bring together literary intellectuals and scientists (Brockman 1991). To the extent that one has emerged at all, it is tiny, but Otto has been an industrious midwife. A successful novelist and screenwriter – he wrote the screenplay for the Oscar-nominated House of Sand and Fog – he is also a cofounder of Science Debate, a campaign to get political candidates to discuss science issues. (Lawrence Krauss, a theoretical physicist and chair of the Bulletin’s board of sponsors, is also a cofounder.) The project succeeded in 2008, the year it was founded, and again in 2012 in getting US presidential candidates to answer lists of science questions in print. As of this writing, no live televised debate among presidential candidates devoted to science-based policy issues has taken place.
It is hard to see how one would even be possible, moreover, when the Republican presidential candidate dwells so proudly in his own non-reality-based reality, as suggested by his views on vaccines and climate change, and his threats to dismantle the Environmental Protection Agency, which enforces clean-air and clean-water regulations. Trump’s behavior even roused the editors of Scientific American recently, who wrote that they were “not in the business of endorsing political candidates” but felt compelled to “take a stand for science” during a race that “takes antiscience to previously unexplored terrain” (Scientific American 2016).
Despite everything he documents, Otto nonetheless has hope. A quarter of his book is devoted to “winning the war,” as he puts it. He even offers 14 “battle plans,” many of which call for some kind of ambitious cultural shift – church leaders should reach out to scientists, business leaders should form a progressive chamber of commerce, scientists should adopt a broad scientific code of ethics, and so forth.
Will there be a tipping point?
It is tempting to hope that the war on science may simply collapse under its own weight. There were times when, reading about efforts to spread climate disinformation, I was reminded of those stories of inept criminals who go to such elaborate lengths to pull off a heist that for all the good it ultimately does them, they would have been better off doing the right thing. The attorneys general of several states have launched investigations into whether Exxon (now Exxon Mobil) committed fraud by burying its own evidence on climate change. Rising insurance premiums caused by the increasing frequency and severity of climate-related catastrophes may increase costs so much that voters push harder for action. Otto interviews Douglas Holtz-Eakin, former head of the Council of Economic Advisers to President George W. Bush, who tells him, “I’ve always felt it’s in [conservatives’] political interest to not deny the science, that’s where the votes of the future are.” Ultimately, the fight over climate change is about whether it is okay to trash the commons for unshared gain, and historically, at least, both liberals and conservatives have concluded that it is not. No less a conservative luminary than Milton Friedman believed that adverse “neighborhood effects” – negative externalities, or costs in a transaction that end up transferred to a third party – justified government intervention.
Once a critical mass came around to believing the science on how cholera spread, the sanitary movement rose up, and eventually people stopped throwing their feces in the street. Such hopeful forces are at work today. But will their effects be quick enough? The cholera outbreaks of the nineteenth century killed hundreds of thousands of people. Today’s science policy questions, from AI to Zika, are just as urgent. It seems impossible that the wealthy West could slip back into a superstitious and authoritarian dark age, but democracy, which is supposed to prevent that from happening, depends on a free flow of knowledge, and the forces currently working to suppress knowledge in the United States – from elected leaders to corporate donors to charismatic but ignorant celebrities – are daunting.
Champions of cluelessness may always lose in the end. As absurd as it sounds now, Germans once indulged in anti-relativity rallies denouncing the “Jewish nature” of Einstein’s landmark general theory of relativity after it became politicized. Even the Vatican came around and conceded in 1992 – three and a half centuries after it took Galileo to task – that he was right. That does not mean the forces of ignorance will lose quickly enough, however, or without bringing down whole societies with them. The Muslim world never really regained its global preeminence in science, or much of anything else. Otto makes a convincing plea that the stakes are too high not to act.