Although we are healthier and live longer than our predecessors, and our food supply is more varied and inexpensive than ever, Americans are manifesting anxiety about all sorts of things—genetically engineered foods (derived from so-called "GMOs"), pharmaceuticals, chemicals, gluten, and even "chemtrails," to name just a few. As Dr. Alex Berezow of the American Council on Science and Health recently observed,
Chemistry and biotechnology—instead of being hailed as the revolutionary sciences that they are—have been mocked by an organic industry that is now worth $47 billion. The PR from companies like Whole Foods and Panera Bread is consistent: Other stores' food is toxic and dangerous, and only our food is safe... The insanity doesn't end there. Alternative medicine practitioners teach people to be afraid of their doctors. Lawyers instill fear about pharmaceutical companies.
Part of the reason that such fake news thrives is that science literacy is generally poor and Americans do not know whom to trust when it comes to evaluating scientific information. Moreover, in an age when so much of American life is politicized, polemicized and regulated, skepticism about "experts" is not entirely irrational. Who are the experts and who gets to decide that question? These are questions fraught with complexity and difficulty today.
In a 2017 poll, only about two-thirds of Americans said they are exposed to any science news at all, and fewer than one in five are active consumers of it. That doesn't, however, discourage non-experts from eagerly offering opinions on a spectrum of arcane scientific issues, many of which are inextricably linked to public policy. Indeed, the more closely related a scientific issue is to public policy, the more likely there is to be an army of self-appointed activists and purported experts — many of whom are actually promoting their own interests — at the ready to march before the public and policymakers presenting their alleged "wisdom."
President Dwight D. Eisenhower once famously said about opinionated know-nothings pontificating on agriculture, "Farming looks mighty easy when your plow is a pencil and you're a thousand miles from the corn field."
While it is certainly desirable for elected officials to create laws and regulations that accurately reflect the common sense of the people who put them in office, the death of expertise is not a trivial problem. It confounds policymakers and regulators who feel compelled to seek non-expert input on decisions, wasting time and taxpayers' money. The 18th-century Irish statesman and writer Edmund Burke emphasized the government's responsibility in a republic to make such determinations. He observed, "Your representative owes you, not only his industry, but his judgment; and he betrays, instead of serving you, if he sacrifices it to your opinion."
The National Academies of Sciences, Engineering, and Medicine have repeatedly endorsed public engagement on public policy issues that are predominantly scientific and technological in nature, but such recommendations, while politically correct, are arguably misguided. Although non-experts should be educated so that they can understand the rationale for government policy, it is less useful for them to formulate that policy. This is particularly true when complex issues of science and technology are involved. Science is not democratic. The citizenry does not get to vote on whether a whale is a mammal or a fish, or on the temperature at which water boils; and legislatures cannot repeal the laws of nature, although they have tried.
A frequently cited model for direct citizen involvement in public policy is Denmark, where non-experts are invited to bring to citizens-consensus conferences "a basic 'common sense' derived from worries, visions, general view and actual everyday experience as their basis for asking a number of essential questions concerned with the given subject."
But in America, this is why we have elections. We elect representatives to bring the "common sense" of the voters to matters of public policy. If they fail to do that, the remedy is to be found at the ballot box, not in presentations of "public sentiment" at public hearings. Who elected the people who show up to these meetings, anyway? How do we know they are an authentic representation of public opinion? Aren't they more likely to be a collection of agitators, activists, and even professional lobbyists? Do ordinary citizens regularly make time in their busy lives to attend such gatherings? In fact, economists have a term—"rational ignorance"—for citizens' lack of engagement on arcane issues whose outcome they feel they cannot affect.
Denmark's approach has been applied there to a broad spectrum of scientific and technological issues, including food irradiation, molecular genetic engineering techniques applied to agriculture and animals, setting limits on chemicals in the environment, and human-genome mapping. Danish populism has led to the adoption of excessively precautionary, harmful regulation of many products and technologies.
Other nations' experience is also far from positive. Consider, for example, the United Kingdom's "GM Nation" exercise in 2003, intended to gain insight into the public's views of genetic engineering (also known as "genetic modification," or "GM"). At great expense and effort, the UK government sponsored a series of public discussions around the country, as well as using more conventional approaches, such as focus groups. Local authorities and various organizations held hundreds of additional public meetings on the subject.
The result? Mark Henderson, science correspondent for The Times (London) newspaper, offered this view of the half-million-pound initiative:
The exercise has been farce from start to finish. I'm not sure I want the man in the street to set Britain's science, technology and agriculture policy. One of the six meetings...spent much of its time discussing whether the SARS [severe acute respiratory syndrome] virus might come from GM cotton in China. It's more likely to have come from outer space.
Mr. Henderson went on to say that the meetings were dominated by anti-technology zealots, the only faction that was organized and impassioned enough about the issue to attend. We see that as well in the highly orchestrated responses to U.S. government requests for public comment on proposed regulations: Special interests mobilize not only their base, but can also create an avalanche of fake comments via troll factories, as described by Sharyl Attkisson in her excellent book, The Smear: How Shady Political Operatives and Fake News Control What You See, What You Think, and How You Vote.
Another disappointment to those who advocate public engagement came courtesy of the National Science Foundation, whose primary mission is to support laboratory research across many disciplines. NSF funded a series of "citizens technology forums," at which ordinary, previously uninformed Americans were brought together to solve a thorny question of technology policy.
One of these, which focused on genetic engineering applied to agriculture and was conducted by investigators at North Carolina State University under a 2002 NSF grant, provided information to participants "from a range of content-area experts, experts on social implications of science and technology, and representatives of special interest groups." This was supposed to enable them to reach consensus and make recommendations. The resulting recommendations were, however, at odds with the views of government, academic, and industry scientists, which were based on expertise, data, and experience. I count that as a failure.
The 2008 citizens' forum on nanotechnology, also funded by NSF, is again instructive about the value of non-expert input on esoteric scientific issues. The organizers selected "from a broad pool of applicants a diverse and roughly representative group of seventy-four citizens to participate at six geographically distinct sites across the country." Participants were informed by "a 61-page background document—vetted by experts—to read prior to deliberating." They produced a hodgepodge of conclusions and recommendations, including "concern over the effectiveness of regulations" and "reduced certainty about the benefits of human enhancement technologies" but wanted "the government to guarantee access to them if they prove too expensive for the average American."
That outcome was predictable: The participants lacked an understanding of the risks and benefits but wanted the government to provide them with entitlements so they could avail themselves of the beneficial products of nanotechnology, should they appear!
Politicians like to pay lip service to public engagement on regulatory issues, even if those issues require understanding of sophisticated and complex issues. President Clinton's Secretary of Agriculture Dan Glickman once said that there must be public trust "in the regulatory process that ensures thorough review [of genetically engineered plants]—including complete and open public involvement."
How does one secure that trust? How many years of excessive, worthless regulation and billions of dollars squandered — to say nothing of untold opportunity costs — are necessary to assuage unwarranted public anxieties? Should we allow "complete and open public involvement" via referendum to determine when farmers can cultivate a new variety of canola? By analogy, should we take a vote on the approval for marketing of a new hepatitis C vaccine or novel cancer drug?
The bottom line is that there is no evidence that decades of soliciting public engagement on topics like nuclear power, molecular techniques of genetic engineering, or nanotechnologies have gained public trust or acceptance. Nor has the subordination of evidence-based policy-making to emotional or political calculations either increased public acceptance or encouraged innovation. So let's stop doing it.
Henry I. Miller, a physician and molecular biologist, is the Robert Wesson Fellow in Scientific Philosophy and Public Policy at Stanford University's Hoover Institution. He was the founding director of the FDA's Office of Biotechnology.