Susan Haack’s “Scientism and its Discontents,” Part 2 of 3.

APP Editors’ Note: Susan Haack’s “Scientism and its Discontents” was originally published by Rounded Globe in 2017.

With the permission of the author, APP is hereby re-posting/re-publishing it in three parts.

The full bibliography will appear at the end of the third part.

About the Author

Susan Haack is Distinguished Professor in the Humanities, Cooper Senior Scholar in Arts and Sciences, Professor of Philosophy, and Professor of Law at the University of Miami.

Her work ranges from philosophy of logic and language, epistemology, metaphysics, philosophy of science, pragmatism—both philosophical and legal—and the law of evidence, especially scientific evidence, to social philosophy, feminism, and philosophy of literature.

Her books include Philosophy of Logics; Deviant Logic, Fuzzy Logic: Beyond the Formalism; Evidence and Inquiry; Manifesto of a Passionate Moderate; Defending Science—Within Reason; Pragmatism, Old and New; Putting Philosophy to Work; Ciencia, Sociedad y Cultura; Evidence Matters: Science, Proof, and Truth in the Law; and (in 2015) Perspectivas Pragmatistas da Filosofia do Direito (São Leopoldo, Brazil: Editora UNISINOS) and Legalizzare l’epistemologia (Milan, Italy: Università Bocconi).

Haack’s work has been translated into French, German, Italian, Spanish, Portuguese, Polish, Russian, Croatian, Danish, Swedish, Romanian, Korean, and Chinese; and she is invited to lecture around the world.

Haack was included in Peter J. King’s One Hundred Philosophers: The Life and Work of the World’s Greatest Thinkers and in the Sunday Independent’s list, based on a BBC poll, of the ten most important women philosophers of all time. Haack’s work has been celebrated in two volumes of essays, Susan Haack: A Lady of Distinctions (2007) and Susan Haack: Reintegrating Philosophy (2016). In 2011 Haack was awarded the degree of Doctor Honoris Causa by Petre Andrei University, and in 2016 the Ulysses Medal, the highest honor given by University College, Dublin.

Contents

Lecture I: Science, Yes; Scientism, No

… the progress of science has seemed to mean the enlargement of the material universe and the diminution of man’s importance. The result is what one may call the growth of naturalistic or positivistic feeling. Man is no lawgiver to nature, he is an absorber. She it is who stands firm; he it is who must accommodate himself.

The romantic spontaneity and courage are gone, the vision is materialistic and depressing. … [But] you want a system that will combine both things, the scientific loyalty to facts and willingness to take account of them, … but also the old confidence in human values …—William James1

In “The Present Dilemma in Philosophy,” the 1906 lecture from which this passage is taken, James suggests that, behind the recurrent disagreements between those who focus on facts, on the concrete, on the scientific, and those who prize poetry, art, religion, values, lies a clash of fundamental philosophical temperaments, the “tough-minded” and the “tender-minded.” Under each heading, he lists a cluster of the philosophical ideas—not always, as he realizes, mutually compatible ideas—to which thinkers of these contrasting temperaments tend to gravitate. What’s needed, James continues, is a philosophy that can accommodate both the focus on facts and admiration for the scientific that appeals to the tough-minded and the literary or religious idealism that appeals to the tender-minded. And while our cultural landscape is very different from James’s, even today his words resonate: we still need a philosophy that accommodates both that admirable “scientific loyalty to facts” and the hankering for those desirable human values—without sacrificing either to the other.

“A man must be downright crazy,” C. S. Peirce wrote in 1903, “to doubt that science has made many true discoveries.”2 Indeed. Thanks to the work of many generations of scientists, we now know far more about the world than we did just a few hundred years ago; and this expansion of our knowledge has enabled us to transform our world, and to lengthen and improve our lives. But science is a human enterprise; and so inevitably combines (as Denis Diderot put it) “insight and blindness, … pettiness and grandeur.”3 Impressive as their achievements have been, the sciences are imperfect and limited, as all human enterprises are.

There’s vastly more the sciences have yet to find out; much of what scientists once thought they knew has by now been found to be false, or only approximately or only partially true; and doubtless some of what scientists now think they know will, likewise, be revealed by future work to have been mistaken. Extraordinary as the edifice of well-established scientific knowledge now is, the trash-heap of discarded ideas and theories is far larger. Scientific discoveries don’t come easily—and, like the rest of us, scientists sometimes cut corners when they are hard-pressed or over-anxious to succeed; like the rest of us, they are sometimes lazy, careless, or biased; like the rest of us, they are susceptible to the pressure of commercial interests, political ideology, and simple careerism. Scientific progress is ragged and uneven, sometimes astonishingly fast, sometimes painfully slow and halting; and there’s absolutely no guarantee that there will always be progress, that the sciences will always or inevitably advance. And, of course, the results of scientific work can be put to bad uses as well as to good.

There are, besides, many questions beyond the scope of the sciences; many other valuable forms of inquiry, such as the historical, the legal, the literary, and the philosophical, besides the scientific; and many other valuable human activities, such as music, art, story-telling, joking, building, and cooking, besides inquiry. Science is a good thing; but it’s an imperfect, human enterprise, and it’s by no means the only worthwhile human enterprise.

Well, yes, you may be thinking, but isn’t all this just obvious—too obvious even to need saying? It is, indeed, obvious. All the same, however, it needs not only to be said, but also to be spelled out. For the fact is that attitudes to science range all the way from naively uncritical admiration through distrust, resentment, and envy to denigration and outright hostility. Many underestimate the sciences, denying, or professing to deny, their achievements; and many overestimate them, denying, or professing to deny, their fallibility or their limitations. In short, both tough-minded admiration for science and tender-minded reservations about it can and all too often do take on indefensible forms: admiration for science can only too easily turn into scientism, and reservations about science into antipathy, resentment, even outright hostility. And there’s more at stake here than a superficial clash of temperaments, for both scientistic and anti-scientific attitudes rest on serious misconceptions of the sciences and their place in culture. We need to understand why—while both admiration for the achievements of the sciences and reservations about their limitations and dangers are entirely appropriate—neither scientism nor anti-science is well-founded.

Sometimes one of these faulty extremes predominates, sometimes the other. For much of the twentieth century, scientistic currents of various kinds—behaviorist, reductionist, etc.—were in the ascendant; until the closing decades, when there was a strong surge of anti-scientific sentiment. That’s why, when I wrote Defending Science—Within Reason (2003)—though I also argued against scientism—I focused primarily on what seemed to be the greater danger, the cynically anti-scientific attitudes of postmodernist, radical-feminist, and post-colonialist “science critics” and newly-ambitious radical sociologists, historians, and rhetoricians of science suspicious of, or outright skeptical about, the claim of the sciences to give us knowledge of the world. Priding itself on having seen through philosophers’ (and scientists’) illusions about science, this style of science-denigration was given to inveighing against “rationalist,” i.e., formal-logical, models of scientific method and scientific reasoning. If the idea was only that these models missed important aspects of the scientific enterprise, it was true enough.4 But it obviously didn’t follow, as so many triumphantly concluded, that there is, after all, nothing more to the sciences than power, politics, and rhetoric; and neither, I argued, is it true.

Before very long, however, the fashion for anti-scientific exaggerations was waning somewhat, and a newly confident scientism was on the rise. To be clear: this new wave of scientism didn’t sweep over us overnight; it crept up on us—though it crept up with remarkable rapidity. It’s tempting to speak of the “new” scientism; but what we’re really seeing is probably better described as a revival, a recrudescence, of attitudes that are far from new—attitudes that were already familiar to James, and that were at the heart of logical positivism, of Karl Popper’s insistence on the crucial importance of the “problem of demarcation,” the need for a criterion to distinguish genuine science from pretenders, and of W. V. Quine’s ambitious but ambiguous program to “naturalize” epistemology. At the same time, as we’ll soon see, the current wave of scientism is neither simply a return to older forms, nor simply an overreaction to the anti-scientific extravagances of those postmodern cynics; other intellectual upheavals—notably, the boom in evolutionary biology and, especially, in neuroscience, along with the rise of a newly evangelical atheism—have shaped the style, and influenced the tone, of the forms of scientism in vogue today.

In “Six Signs of Scientism,” which appeared in 2010, I suggested some ways to identify when appropriate respect for the achievements of the sciences crosses the line into the kind of over-enthusiastic and naively deferential attitude characteristic of scientism. At the time, I thought that was one job I could cross off my list—done. But no: triumphalist scientism grows apace—in the academy, in the legal system, and in our culture more generally; to the point where, by now, as some proudly adopt the word to describe their own positions,5 “scientism” seems gradually to be losing its long-established negative tone,6 and even becoming an honorific term. And these developments oblige me to dig deeper.

The first step is to articulate an understanding of the scientific enterprise that enables us both to appreciate its extraordinary achievements and to acknowledge its inevitable imperfections and limitations (§1); the next is to get some grip on the manifold manifestations of scientism (§2); and the last is to show in some detail that all these manifestations betray significant misunderstandings of what science is, what it does, and how it does it (§3).

1. SETTING THE STAGE: THE SCIENTIFIC ENTERPRISE

The root of the word “science” is the Latin, scientia, “knowledge.” And for a long time the English word “science” had a similarly broad scope, referring to any kind of systematized knowledge or inquiry—as the German word “Wissenschaft” still does; one could, for example, speak without any incongruity of the science of jurisprudence. What we would now call “science” was known, rather, as “natural” or “experimental” philosophy, and what we would now call “philosophy” as the “moral sciences.” But by the latter part of the nineteenth century, usage had gradually changed:7 with the remarkable achievements in physics, chemistry, and biology, the word “science” began to refer to these fields exclusively.8 By now, though there is still some residual resistance, psychology, sociology, economics, anthropology, etc., are normally also classified as sciences—if, sometimes, as “soft” sciences, by contrast with the “hard” natural sciences; but, with this addendum, a relatively narrow usage continues to hold sway.

So the phrase, “the sciences,” as it is now used, and as I shall use it here, refers to a loose federation of interrelated kinds of empirical inquiry including both the natural and the social sciences, but excluding pure mathematics, history, philosophy, jurisprudence, literary scholarship, and such. To be sure, these days scientists engage in many other activities besides inquiry: applying for grants to support their research, reviewing others’ grant proposals, writing up results for publication, refereeing others’ papers, designing experimental apparatus or computer programs to crunch data or simulate the consequences of this or that hypothesis, offering expert testimony in court, and so forth. The point isn’t that inquiry is the only business of the sciences, but that it’s their core business.9 And what does this core business involve? Inquiry, investigation, is an effort to discover the answer—of course, the true answer, or true answers—to some question or questions; as, for example, James Watson and Francis Crick tried, and eventually managed, to solve the structure of DNA, i.e., to come up with a true account of what that structure is.

Advocacy, say, or dancing, writing a novel, promoting a political program, or drafting a business plan aren’t science for the simple reason that they aren’t forms of inquiry. But what, you may ask, distinguishes the sciences from other fields of inquiry not classified as sciences—over and above the sociological fact that deans and librarians group them together? Well, it’s not simply by convention or purely by historical accident that these disciplines and not others are classified as “sciences”; but neither is it in virtue of their sharing some essential characteristics or some unique, distinctive method or procedure. Rather, the sciences form a kind of cluster or family of disciplines. One can say, to be sure, that the fields now classified as sciences are all forms of empirical, descriptive inquiry—which is why such disciplines as pure mathematics, logic, ethics, or aesthetics aren’t included; and that their goal is to explain natural and social phenomena, so that they focus primarily on general laws rather than particular events—which is why history isn’t included. But the boundaries are fuzzy, shifting, and frequently contested.

The boundaries are fuzzy: there are similarities and continuities, as well as differences and discontinuities, between the sciences and other kinds of inquiry—and between scientific inquiry and other human endeavors. Scientific work, no less than writing a novel or designing a skyscraper, requires imagination: to come up with possible explanatory hypotheses, to devise ways to put them to the test, to think what factors might possibly interfere and how to rule them out, to come up with alternative explanations, etc., etc.10 Cosmology and evolutionary biology are historical sciences; astronomy concerns itself with particular heavenly bodies. There’s no sharp line where cosmology ends and metaphysics begins, or where empirical psychology becomes philosophy of mind. And so on.

There are also many interrelations among the disciplines we call sciences, between the sciences and other kinds of inquiry, and even between the sciences and other human activities. In the early days of the mathematization of science, scientists borrowed techniques developed for purposes of double-entry book-keeping and perspective drawing.11 In the pioneering days of molecular biology, biologists borrowed tools from physics.12 In the late twentieth century, archeologists used neutron analysis, which showed that jasper found in a settlement in what is now Newfoundland contained trace elements present only in jasper from Greenland and Iceland, to confirm that Vikings reached North America long before Columbus;13 and historians borrowed a cyclotron to determine whether the ink in an old bible was the same as that in the Gutenberg Bible of 1450-55.14 And so forth.

The boundaries of science are shifting: over time, new fields and sub-fields develop. Moreover, because of the remarkable successes of the sciences, practitioners of disciplines not at present routinely included in the federation sometimes describe their fields as, also, “sciences.” As a result, the boundaries are often not only fuzzy and shifting, but also contested: there is controversy, for example, over whether forensic sciences such as hair analysis15 or bite-mark identifications16 are really rigorous enough to be included; over whether psychiatric theorizing is genuine science or an unworthy pretender; and—when courts must determine whether creation science or Intelligent Design Theory may constitutionally be taught in public high school science classes17—over whether these are genuinely scientific theories, or thinly disguised religious dogma.

And, even within the large, unruly family of disciplines we now call “sciences,” there is enormous variety. We speak of more and less “mature” sciences, because in some long-standing fields there is by now a substantial body of well-established theory, while in newer fields there may as yet be little more than so-far untested and unsupported speculation. And of course each field has its own special tools, methods, and procedures—and its internal disagreements: think, for instance, of all the elaborate protocols for conducting epidemiological studies and calculating their results developed since the earliest days of the discipline, when John Snow figured out that cholera is waterborne,18 and of the ongoing controversies about methods of meta-analysis of multiple studies.19 And every scientific field evolves, sporting new sub-specialties, new approaches, and new tools.

* * *

So what, if anything, is distinctive in the way scientists go about inquiring? Is scientific inquiry something absolutely unprecedented in the history of the human race, or something familiar and routine? As I see it, it is neither. Inquiry in the sciences has its roots in, and is recognizably continuous with, everyday empirical inquiry; but it has gone far, far beyond it.

Perhaps this idea sounds radical; and indeed, from the perspective of all those twentieth-century debates among proponents of inductivist, deductivist, game-theoretical, Bayesian, etc., models of the Scientific Method, it is radical. But this Critical Common-sensism, as I call it, would have been entirely familiar to Thomas Huxley, according to whom the “man of science … simply uses with scrupulous exactness, the method which we all, habitually and at every minute, use carelessly”;20 to Albert Einstein, who once observed that “the whole of science is nothing more than a refinement of common sense”;21 to John Dewey, who stressed how “[s]cientific subject-matter and procedures grow out of the direct problems and methods of common sense”;22 to Percy Bridgman, who commented that the “scientific method, in so far as it is a method, is doing one’s damnedest with one’s mind, no holds barred”;23 to James B. Conant, who wrote that “what the scientist does is simply to carry over, into another frame of reference, habits that go back to the caveman”;24 and to Gustav Bergmann, who described science as the “long arm” of common sense.25 And, I can now add, it would be entirely familiar to Steven Weinberg, who wrote in 2015 that even “[b]efore history, there was science, of a sort. … Observation of the world led to useful generalizations. … [And] here and there, some people wanted …. to explain the world.” 26

Exactly. From the beginning, in matters of immediate practical concern in everyday life, humans had to figure things out—where the best hunting is, what plants have useful medicinal properties, which beetles make the most effective poison for arrow-heads, and so on. In these early efforts, as in all inquiry, people made the best guesses they could and, so far as they were able, checked them out. Beyond a certain point, though—when it wasn’t possible to check just by looking, listening, smelling, or tasting, or by asking others with sharper senses or better opportunities to observe—people fell back on folklore, myth, mystery and, as civilizations grew, on appeals to religious or secular authority. Surely there were always some who were more curious than others, more given to experiment, more persistent in trying to figure things out. And certainly there were many precursors of what we now call “modern science”—some truly remarkable, such as the work of the Chinese astronomers who, millennia ago, made observations so accurate that astronomers can still rely on them today.27 But, for a variety of reasons,28 these precursors never quite took hold as modern science has done over the last several centuries.

And with the rise of modern science, empirical inquiry took on whole new dimensions. Indeed, as David Wootton shows, the period that saw the emergence of modern science also saw the development of the now-familiar vocabulary for talking about empirical inquiry: “experience,” “experiment,” “discovery,” “fact,” “hypothesis,” “theory,” “observation,” “evidence.”29 The word “scientist” itself is a surprisingly recent coinage. According to William Whewell, it was first proposed by an “ingenious gentleman” at a meeting of the British Association for the Advancement of Science in the early 1830s, as a general term that, like the German “Natur-Forscher,” would refer to people in all fields of science rather than to people in just one.30 (Sydney Ross tells us that the “ingenious gentleman” in question was Whewell himself.)31

Scientists aren’t unique in making educated guesses and checking them out as best they can; but over the centuries they have come up with innumerable ways to extend, amplify, and refine the process, developing new tools, new procedures, new ways to figure things out better—more thoroughly, more persistently, more precisely, more broadly, more deeply, more imaginatively, …, etc. They have devised new instruments of observation and new methods of purification, analysis, excavation, etc., to seek out more, and new kinds, of evidence; new techniques of measurement and calculation to determine more exactly where evidence points, and even techniques of meta-measurement to measure the accuracy and reliability of first-order measurements. And they have built on the results of previous investigations by earlier generations of scientists to make better-informed guesses, better-designed and better-controlled experiments, and better tools and instruments; developed more informative, more exact, and more discriminating theoretical vocabularies; found ways to stretch their unaided imaginative powers; and so on. Over several centuries, generation upon generation of scientists has developed—here I borrow a word from Francis Bacon—a vast and various array of “helps” to inquiry.32

Bacon was a high-ranking lawyer and a remarkable philosophical thinker, but he was no scientist; indeed, as William Harvey—who really was a scientist—complained, he wrote natural philosophy (i.e., science) “like a Lord Chancellor.”33 Nevertheless, for all its many flaws, Bacon’s New Organon is a visionary work. If we are to understand natural phenomena, Bacon saw, we must be willing to leave the study and the library and explore the world, to experiment, manipulate, get our hands dirty. And he also saw that, if we do this, we can anticipate both “light”—a greater understanding of the natural world; and “fruit”34—the power to predict natural phenomena and so to cope with some of the dangers, and take advantage of some of the opportunities, that they present. To us, now, this seems blindingly obvious; but it’s worth remembering that King James, to whom Bacon dedicated the book—and who was at the time one of the finest scholars in Europe—was completely baffled: Bacon’s vision seemed to him “like the peace of God, it passeth all understanding.”35 Only very gradually, as the sciences got on their feet, would the old habits of appeal to authority or resort to folklore and superstition be superseded; indeed, even today they have by no means been entirely banished.

So there is both continuity and discontinuity. Scientific inquiry uses the same underlying procedures and inferences as everyday inquiry; but by now scientists have enormously improved, refined, amplified, and augmented them. Scientific inquiry has been more persistent, more thorough, more searching, more sustained, more rigorous than everyday inquiry; it has been the work of generation after generation, each of which could build on the successes of the last; and it is by now the professional business of large numbers of people, sometimes cooperating, sometimes competing, allowing division of labor and pooling of evidence. And, while there are still “citizen scientists,”36 as in earlier generations there were “gentleman naturalists,” scientific work has become a recognized profession; and scientific communities have gradually developed a raft of internal social mechanisms that have served, up to a point—though only up to a point—to sustain intellectual honesty and encourage the sharing of evidence.

There is, in short, a constantly evolving array of scientific methods, tools, and techniques of inquiry—methods, tools, etc., often local to specific scientific fields, though sometimes proving useful elsewhere, too. Insofar as these methods, tools, and techniques stretch scientists’ imaginative powers, extend their unaided evidential reach, refine their appraisal of where evidence points, and help sustain honesty, provide incentives to the patience and persistence required by scientific work, and facilitate the communication of results, they enable progress: better measurements, better theories, more sensitive instruments, subtler techniques, finer-grained experimental design, more informative terminology, and so on. And nothing succeeds like success. Each step forward enables further steps, as well as allowing scientists to correct their own or others’ previous missteps—which is why it is sometimes said that science is “self-correcting.” But there’s no magic about it; only (only!)—on the part of many people, over many generations—curiosity, imagination, hard work, patience, persistence, attention to detail, and honest willingness to acknowledge failure, learn from it, and start over, perhaps again and again.

Scientific claims and theories are fallible, revisable: any claim or theory, no matter how well-established or how widely accepted, might be shown by new evidence to be false, or only approximately or only partially true. But there’s a kind of continuum, from scientific claims and theories well-rooted in a strong, dense, tightly-interlocking mesh of evidence, through others reasonably well-rooted and others again fairly well-rooted, to the as-yet wholly speculative and the outright wild and woolly. Most conjectures won’t survive as new evidence comes in; only a few will become part of the enduring edifice of scientific knowledge.

In any scientific community, probably, there will be some who are more radical, ready to try a new conjecture when the existing hypothesis most of their colleagues are content to work with encounters difficulties, and others who are more conservative, disposed to keep trying to modify and adapt the old idea; still, consensus will gradually form: this conjecture is probably correct, that idea is almost certainly mistaken. Ideally, we would find consensus among scientists in a field when, and only when, the evidence is sufficient to indicate that it’s probably safe to rely on this conjecture, but probably a waste of time pursuing work on that rival idea. There’s absolutely no guarantee, however, that scientific consensus will always faithfully track the state of the evidence.

The formation of scientific consensus is only too easily distorted by political pressures, by commercial interests, by the demands of litigation: think of Stalin’s sponsorship of Trofim Lysenko’s ideas about plant genetics37 or, at a more commonplace level, of pharmaceutical companies’ routine practice of withholding unfavorable results from publication.38 There may be strong resistance to acknowledging the evidence for a claim perceived as too new, too radical: think of Darwin’s wry comment that admitting he had come to believe that species aren’t fixed and immutable was “like confessing a murder,”39 or of the strongly skeptical reaction when it was first suggested that stomach ulcers might be caused, not by stress or a too-spicy diet, but by a bacterium.40 An individual, or a team, may have so much influence in a field, or so much control over funding or publication, that their approach continues to prevail even if the evidence is weak, as was perhaps the case with Cyril Burt’s work on the heritability of intelligence.41 And sometimes an unfounded idea somehow takes such firm root that everyone in a field just assumes it’s true: think of the “tetranucleotide hypothesis”—Phoebus Levene’s conjecture that DNA is a “stupid,” monotonous molecule in which the four bases occur in a regular sequence—which, though it was nothing more than a conjecture, was once widely taken for granted by molecular biologists.42 Indeed, its grip was so strong that, even after he’d completed his experiments subjecting the “hereditary principle” to every known test to discriminate the two substances, experiments whose results clearly pointed to this conclusion, Oswald Avery dared not say in print that DNA, and not protein, is the genetic material.43

The sciences have achieved remarkable things; but there can be no guarantee that they will always advance, let alone that they will always advance at a brisk pace; and no guarantee that there won’t be setbacks, wrong turns, and false starts. As I said, progress has been ragged, uneven, and unpredictable. Sometimes scientific advance is cumulative; sometimes it involves big upheavals—I would say, “revolutions,” except that I intend no Kuhnian implications44—when a key new idea emerges, or a central old idea is discredited; and sometimes, if a really bad idea takes hold, a field may go backwards before the bad idea is dropped and progress can again be made.

And neither, of course, are the sciences complete. It’s not just that, in any scientific field, there are questions yet to be answered. It’s also that, as scientific work proceeds, new and unanticipated questions—sometimes, even, whole new fields or sub-fields of inquiry—will emerge; and, most importantly, that the competence of the sciences is limited. Not every kind of question is susceptible to scientific inquiry; even the most comprehensive future science imaginable wouldn’t explain everything.

Moreover, while those technical helps to scientific inquiry (the instruments, computer programs, statistical methods, and so on) usually get better and better over time, it’s very clear that by now the social mechanisms for sustaining honesty and encouraging evidence-sharing (peer-reviewed publication, assignment of research funding, professional certification and standards, and so forth) are under considerable strain. As science grows bigger, more expensive, politically more consequential, and potentially more profitable, things can go badly wrong. The peer-review system, for example, always flawed,45 is by now seriously dysfunctional46—to the point where William Wilson writes despairingly that “if [it] is good at anything, it appears to be preventing unpopular ideas from being published.”47 It is, as Wilson continues, truly ironic that a newly aggressive scientism—he goes so far as to say, a Cult of Science—should be on the rise precisely as careerism and a bloated bureaucracy threaten to undermine the ideals and the integrity of the sciences.48

2. SPOTTING THE SIGNS: THE MANY MANIFESTATIONS OF SCIENTISM

Against this background, the distinctive characters of scientism and of anti-science come into sharper focus. As we shall see, however, both are manifested in many and various ways.

Anti-science comes in a whole variety of shapes and guises, not always mutually compatible. So far, my focus has been on the cynical strain encouraged by rivalry between traditional philosophers of science and the ambitious radical sociologists and historians who, in the latter part of the twentieth century, began to turn their attention to the sciences—a great, noisy chorus of academics proclaiming that the sciences are shot through with sexism, racism, and colonialism, that science is driven by power, politics, rhetoric, negotiation, not evidence, that supposedly “objective” facts and supposedly “objective” reality are man-made, scientists’ own creation, even that the concepts of inquiry, evidence, truth are nothing but ideological humbug. (Of course, if there really were no objective truths, it couldn’t be objectively true that science is shot through with sexism, racism, etc.; and if there really were no objective standards of better and worse evidence, there couldn’t be objectively strong evidence that what scientific theories get accepted primarily depends, not on the evidence, but on scientists’ class interests.) But the anti-science camp also includes religious fundamentalists who reject modern cosmology and the theory of evolution, as well as ordinary, non-partisan people disillusioned after reading too often of scientific fraud, of science distorted by politics, of corruption in the scientific peer-review process, or of large grants for what seem to be banal or incomprehensible projects.

Not surprisingly, scientism is no less complex, no less various, and no less rife with internal tensions than anti-science is. Some in the scientistic camp are simply so impressed by the remarkable achievements of the sciences that they are ready to accept any and every scientific claim unquestioningly, the wildly speculative no less than the well-established; to believe the pronouncements of well-known scientists even on matters far outside their professional competence; and to resist all criticism of the sciences or of current scientific orthodoxies.49 But, like anti-scientific dismissiveness, scientism also has its academic wing.

This academic wing has long included those who insist on criteria by which to distinguish real science, the genuine article, from “pseudo-science”; propose formal-logical or probabilistic models of scientific reasoning; and—relegating potentially awkward sociological and psychological factors influencing scientific work to the “context of discovery”—offer bromides about the “rationality” and “objectivity” of science. But now there is, besides, a great noisy chorus of academics newly energized by the rise of evolutionary psychology and neuroscience, the boom in evangelical atheism, and the scientism at work in our culture at large, urging that such hitherto-uncivilized disciplines as jurisprudence, art criticism, philosophy, etc., look to the sciences for answers to their questions. Others, going even further, proclaim that it’s time to abandon these outdated, pre-scientific disciplines altogether, and to pursue genuinely scientific projects instead. “Neurolaw,” we’re told, promises to transform or even displace the outdated field of jurisprudence,50 “neuroart” the outmoded fields of aesthetics, literary criticism, and such,51 and “neurophilosophy” such primitive, pre-scientific disciplines as ethics, epistemology, metaphysics, aesthetics, and the like.52

And, just like anti-scientific cynicism, scientism can harbor contradictions. The earliest advocates of neurophilosophy, hoping to understand human cognitive processes on the model of the workings of the nervous system of the sea-slug, proclaimed—no, like those postmodernist science critics, they boasted—that their approach undermined the legitimacy of core logical and epistemological concepts.53 (Of course, like the all-too-similar claims of anti-scientific cynics, this is self-defeating: if it were true, the neuroscientific discoveries on which they based their extravagant claims couldn’t be well-warranted by strong evidence, and neither could they be true.)54 And, as we’ll see in the next lecture, by now some self-proclaimed supporters of scientism boast of an even more sweeping nihilism that repudiates every kind of value—the moral and the aesthetic, for example, as well as the epistemological.55

If it is evolutionary psychology and neuroscience that have played the largest role in shaping the character of scientism today—its manifestation in, as Raymond Tallis puts it, “Darwinitis” and “neuromania”56—it seems to be the new atheism that has most marked its tone. Of course, there have long been, and still are, religious people who reject cosmologists’ theories about the origin of the universe and evolutionary biologists’ theories about the origin of mankind because they are at odds with scriptural accounts; and for as long as there has been science, probably, there have been atheists who have welcomed scientific discoveries as confirming their position—though it’s salutary to remember that there were atheists long before modern science got on its feet, and that religious people sometimes welcome scientific theories as confirming their position—stressing, for example, the supposed “fine-tuning” of the earth to human life.57

So what’s new, you might ask, about the “new” atheism?—less its content, it seems, than its style, its swagger. Taking for granted, what is a long way from obvious, that by now science has shown that religious claims are groundless, the new atheism often calls on evolutionary or neurophysiological accounts of the religious impulse to explain it away; perhaps more importantly, assuming that religious people are either scientifically ignorant or willfully blind, it seems to pride itself on its intellectual superiority. We, the new atheists proclaim, are the “Brights”58—which, intentionally or not,59 inevitably suggests that religious people are, well, dim. This adds a new layer of confusion: with the new atheists acting as cheerleaders for the revival of scientism, it can come to seem that anyone who resists scientism must, overtly or covertly, have a religious agenda. But this is a serious misperception; as we will see, there are good and sufficient reasons for resisting scientism quite independent of any religious assumptions.

* * *

Of course, there’s no simple formula to determine when the line between appropriate respect for the achievements of the sciences and inappropriate deference to science has been crossed. There are, however, some characteristic indicators, among which I would include:

  • Forgetting fallibility: i.e., being too ready to accept anything and everything bearing the label “science,” or “scientific,” and to believe any and every claim made by scientists of the day.

Excessive readiness to believe in the absence of good evidence is the epistemological vice of credulity.60 This vice comes in many forms: some people are too ready to believe bizarre Hollywood gossip, others the something-for-nothing promises made by aspiring politicians, others again the advertisements for miraculous dietary aids and other medical quackery, etc. And some—including not a few of those who, priding themselves on their “scientific” skepticism, scoff at claims about the Loch Ness Monster, haunted houses, fringe medical treatments, and the like—are too ready to believe any and every claim made by scientists, including the latest headline-catching study or speculation that will, more likely than not, turn out to be just plain wrong.61 They forget that science is an ongoing enterprise, and that much scientific speculation won’t survive the test of time. This kind of credulity about science is the simplest and most straightforward sign of scientism.

Such credulity naturally encourages what is by now a very common and familiar phenomenon, the use of “science” and its cognates as a kind of shorthand for “good, solid stuff.” This is another sign of scientism:

  • Sanctifying “science”: i.e., using the terms “science,” “scientific,” etc., honorifically, as terms of generic epistemological praise meaning something like “strong, reliable, good.”

There’s a real irony here: as we saw, “science” originally meant simply “systematic knowledge,” but gradually became restricted to physics, chemistry, biology, etc.—as we would now say, “the sciences”; but those who turn “science” and “scientific” into honorific terms are in effect restricting the meaning of “knowledge” so as to coincide with the newer, narrower meaning of “science.”

As they become honorific terms, “science” and “scientific” soon lose descriptive content and become near-vacuous expressions of approval. Advertisers urge us to buy their new, scientific detergent or to try their new, scientific dietary supplement; a historian criticizes a rival on the grounds that he has no scientific evidence for his claims;62 phrenology and the phlogiston theory are dismissed as pseudo-sciences; and so forth. This honorific usage even entered our jurisprudence when, in Daubert v. Merrell Dow Pharmaceuticals, the U.S. Supreme Court’s first-ever ruling on the standard of admissibility of expert testimony, Justice Blackmun argued for the majority that, in determining whether such testimony is sufficiently reliable to be heard by a jury, judges should determine that it is genuinely “scientific … knowledge.”63 And, inevitably, as “science” becomes an honorific term, practitioners of other disciplines begin to describe their fields as sciences: “Management Science,” “Library Science,” even “Mortuary Science”64—and, of course, “Creation Science.”

When “scientific” is used as equivalent to “epistemologically strong,” of course it seems enormously important to find some way to distinguish genuinely scientific, epistemologically strong work from epistemologically weak “pseudo-science”—i.e., to find some criterion by which to demarcate real science from pretenders. Hence the third sign of scientism:

  • Fortifying the frontiers: i.e., insisting on a strong, sharp line of demarcation between genuine science and pretenders.

Karl Popper, most famous of twentieth-century demarcationist philosophers of science, thought he had a simple way to distinguish work like Einstein’s (which he deemed good, genuinely scientific) from Marx’s “scientific socialism” and Freud’s and Jung’s psycho-analytic theories (which he deemed bad, pseudo-science at best): the mark of a genuinely scientific claim or theory is that it is falsifiable. This idea is at work in Justice Blackmun’s first, vaguely Popperian “Daubert factor,” suggesting that, in assessing whether expert testimony is really scientific, and hence reliable enough to be admitted, judges ask: “can it be (and has it been) tested?”;65 and in those constitutional cases where judges reach for vaguely-Popperian criteria to argue that creation science,66 or Intelligent Design Theory,67 isn’t really science at all.

The demarcationist impulse often manifests itself in the form of what you might call “methodism,” the idea that real science, the genuine article, can be identified by means of its distinctive method or procedure of inquiry. Hence the next sign of scientism:

  • Mythologizing “method”: i.e., supposing that what’s distinctive about the sciences, what qualifies them as genuinely scientific, is their method—a supposedly uniquely-scientific way of going about inquiry.

Philosophers argue about whether the Scientific Method is deductive, inductive, probabilistic, Bayesian, game-theoretical, error-theoretical, or what; scientists themselves, if put on the spot to say something about how they do what they do, sometimes parrot some half-understood idea from this tradition—most often, something vaguely Popperian in tenor. Textbooks (and scientific organizations) tend either to offer cook-book, step-by-step instructions so formulaic that they tell you nothing of substance—e.g., “make a hypothesis; design an experiment; conduct the experiment; write up the results; submit the results for peer-review”;68 or else describe the specialized procedures or techniques gradually developed over the years in their particular field of scientific work, so specific that they simply don’t apply in other fields.69

The idea that there is some distinctive method of inquiry used by all scientists and only by scientists inevitably encourages some to adopt what they take to be scientific methods, tools, and techniques as if this were sufficient by itself to make their work rigorous, “scientific” in the honorific sense. All too often, the result is—well, in Bentham’s phrase, it’s “nonsense upon stilts,”70 work lacking in real rigor but disguised in the technical trappings of science. This is another sign of scientism:

  • Dressing up dreck: i.e., adopting the tools and trappings of scientific work not to advance inquiry, but to disguise a lack of real rigor or seriousness.

Scientists themselves are not immune to this kind of scientism. For example, one of Merrell Dow’s epidemiological studies of Bendectin, the morning-sickness drug at issue in Daubert, though decked out with all the usual statistical apparatus, failed to distinguish women who took the drug during the period of pregnancy when fetal limbs are forming from those who took it at other stages—and then, predictably, concluded that there was no evidence that Bendectin caused limb-reduction birth defects;71 Merck’s VIGOR trial of the arthritis drug Vioxx, as a result of which the FDA approved it for sale in the U.S., though also dressed up with all the standard statistical apparatus, tracked gastro-intestinal effects for longer than it tracked cardiovascular effects, with the result that the several subjects who died of heart attacks and strokes after taking the drug could be excluded from the results, since their deaths occurred outside the study period.72

Dressing up dreck is even commoner, probably, in the social sciences, where lengthy introductory chapters on “methodology” are sometimes only window-dressing, and graphs, tables, and statistics sometimes focus attention on variables that can be measured rather than those that really matter, or represent variables so poorly defined that no reasonable conclusion can be drawn. David Abrahamsen’s Second Law of Criminal Behavior, “C = (T+S)/R,” is a classic: “[a] criminal act is the sum of a person’s criminalistic tendencies plus his total situation, divided by the amount of his resistance.”73 The forensic sciences are also susceptible to this kind of thing. The “ACE methodology” for fingerprint identification, for example, is little more than a list of steps—analysis, comparison, evaluation; and the addition of “V,” for “verification,” is much less reassuring than it sounds, since all it means is “get another fingerprint examiner to check.” “The scientific approach of the ACE-V process was detailed in [a 2009] article by the FBI,” write the authors of a recent report on the accuracy of fingerprint identifications;74 but when you look closer you find that this FBI report simply parrots a completely unhelpful textbook understanding of “scientific method”: make an observation, generate a hypothesis, conduct tests, generate conclusions, confirm through replication, record or present the conclusions.75

And now for my last two signs of scientism:

  • Colonizing culture: i.e., attempting to take over non-scientific disciplines and replace them by scientific substitutes.

Because the sciences have made many true discoveries, they enjoy considerable prestige. So, not surprisingly, some people come to imagine that science could solve virtually all our problems—for example, that it could provide well-founded responses to vexing questions of public policy. Moreover, it’s second nature for scientists to press outward, to tackle the new questions that inevitably arise as older ones are answered, to explore hitherto-unexplored phenomena, to try out tools and techniques that have proven useful in one area to see if they can also be helpful in others. So, not surprisingly, some begin to aspire to take over work hitherto left in the hands of less-prestigious non-scientific disciplines, and do a proper job of it.76 And some in those less-prestigious fields, feeling themselves the poor relations in the academy and aspiring to share in the prestige of the “scientific,” begin to call on one or another of the sciences to solve the problems with which they’ve been wrestling unsuccessfully—to provide evolutionary answers to questions of ethics, for example, or neuroscientific answers to puzzles in philosophy of mind.

Again not surprisingly, these scientistic efforts to colonize other areas of culture often fail. And when they do, many respond, not by acknowledging frankly that the sciences have overreached, but by casting aspersions on the questions that prove recalcitrant and the fields that resist colonization. Hence the last sign of scientism on my list:

  • Devaluing the different: i.e., denigrating the importance, or even denying the legitimacy, of non-scientific disciplines and activities.

This takes many forms: from government efforts to focus resources on science education at the expense of other fields, through dismissive attitudes to the study of aesthetic, ethical, or other values recalcitrant to scientific explanation, to outright denial of the legitimacy of whole fields of human endeavor.

These seven signs of scientism roughly parallel the many and various manifestations with which I began this section; and, as we have seen, they are intimately interconnected—the honorific use of “science” and its cognates leading very naturally to a preoccupation with finding a criterion of demarcation of the genuinely scientific, this in turn to a preoccupation with identifying some method distinctive of the sciences, and so on. Of course, these signs also reflect some of the tensions within scientism: e.g., between the concern to fortify the frontiers of genuine science, the real thing, and the hope of extending the domain of science to previously-unoccupied territories; between attempts to colonize other areas of culture, and the impulse to denigrate whatever falls outside the scope of the sciences; and so on. But the key point, as we’ll soon see, is that every one of them betrays some misunderstanding of the scientific enterprise.

3. MAPPING THE MISUNDERSTANDINGS: THE FALSE PRESUPPOSITIONS OF SCIENTISM

There’s some temptation simply to point out that, since what the word “scientism” means is excessive or undue deference to the sciences, it’s trivially true that scientism is undesirable. So it is; but this doesn’t get us very far. It may alert us to the need to understand why scientism is a bad thing, but it doesn’t, by itself, throw any light on the matter. There’s some temptation, also, to rely on pointing out that, no less than anti-scientific cynicism, scientism poses real dangers: it threatens to cloud our appreciation of our distinctively human mindedness and of the extraordinary array of intellectual and imaginative artifacts this mindedness has enabled us to create,77 and even to lead to its own kind of nihilism. But just stressing that scientism is a threat to the health of our culture, or even explaining why it is, doesn’t do the whole job, either. No: we can get to the root of the problem only by identifying the false presuppositions on which scientism rests; which is what I shall try to do.

(i) As my phrase “forgetting fallibility” suggests, credulity about scientific claims betrays a serious misunderstanding of how the sciences advance: not infallible step by infallible step, but by fits and starts, with numerous wrong turns and missteps along the way. Plenty of scientific studies and experiments are poorly-conceived, poorly-conducted, or both; and the results even of well-conceived and well-conducted studies and experiments can be misleading. While work on a scientific question is ongoing, capable scientists in a field may quite reasonably disagree about which of the rival approaches is likeliest to work out—and the rest of us just have to wait and see. The sciences have made many true discoveries, yes; but it obviously doesn’t follow that every claim made by a scientist, or every claim made by a scientist in his own field of expertise, let alone every pronouncement of a well-known scientist on whatever subject, will be true. Far from it: most scientific conjectures will probably turn out to be false, many will be found to be only partially or approximately true, and many will prove to be misleading; and of course nobody is an expert outside his own field.

(ii) The honorific use of the words “science,” “scientific,” etc. as generic terms of epistemological praise betrays a similar blindness to the often fumbling and always fallible character of scientific work. True, over many generations the sciences have found ways to inquire better, to overcome or mitigate some natural human cognitive limitations and weaknesses. But this doesn’t justify using “scientific” as shorthand for “strong, reliable, epistemologically good”; and neither does it justify pretending that bad science—poorly-conceived or poorly-conducted scientific work—isn’t really science at all.78

The sciences have devised sophisticated tools and techniques to bring previously-inaccessible evidence within their reach, found ingenious ways to design more-informative experiments, devised mathematical and other methods to appraise evidence more scrupulously, and so on. But it by no means follows that all or that only scientists are good, careful, thorough inquirers; and of course it isn’t true. Even with all these remarkable tools, scientific work is sometimes weak—relying on careless or biased observations or badly-designed experiments or studies, for example, or on botched statistical calculations, or trimmed or fudged results. Indeed, as I suggested earlier, nowadays the severe pressure on scientists to get grants and to publish, along with a burgeoning scientific bureaucracy and a badly broken peer-review system, positively encourage weak, flawed, and even fraudulent work. Moreover, plenty of excellent scientific work has been done without the benefit of sophisticated tools: think of those astronomers in ancient China, who managed without radio-telescopes; or of Charles Darwin, who sometimes checked the size of specimens against his handkerchief.79

(iii) Once you recognize that there can be poor scientific work as well as good, not to mention strong non-scientific work as well as weak, the “problem of demarcation” loses much of its urgency. This is just as well; for the task of identifying the frontiers to be fortified has proven quite intractable.

Popper’s idea that a claim or a theory is scientific just in case it is falsifiable has been enormously influential not only in philosophy of science but also among scientists themselves and even, as we saw, in the U.S. legal system; nonetheless, it is a badly confused idea. Popper purports to offer a theory of “objective scientific knowledge” based on the bold thesis that, while scientific theories can’t be shown to be true, they can be shown to be false; but what he actually gives us is nothing but a thinly-disguised skepticism. Why so? A theory is falsified, he tells us, when a basic statement with which it is incompatible is accepted. But he goes on to insist that what basic statements are accepted and what rejected is entirely a matter of convention, a “decision” on the part of the scientific community. So the fact that a theory has been “falsified,” in Popper’s sense, doesn’t mean that it is false; and Popper’s account implies that scientific theories can no more be shown to be false than they can be shown to be true. No wonder he couldn’t decide whether the theory of evolution is or isn’t science, and vacillated over whether the problem with “scientific socialism” is that it isn’t falsifiable, or that when it was falsified, its proponents didn’t give it up; no wonder, either, that by 1959 he had decided that his criterion of demarcation was itself nothing but an optional “convention.”80

Others have suggested that the distinguishing mark of real sciences is that they involve controlled experiments, that they make successful predictions, that their theories are well-tested, that they grow and progress, or …, etc. But astronomy doesn’t make controlled experiments; evolutionary biology makes no predictions;81 many scientific claims are not, as yet, well-tested; and not all scientific fields are always growing or progressing.

Larry Laudan writes that demarcationist projects “served neither to explicate the paradigmatic uses of ‘scientific’ … nor to perform the critical stable-clearing for which [they were] originally intended.”82 True enough; but I would stress, rather, that the preoccupation with demarcation betrays a seriously oversimplified conception of what is really a dense, complex mesh of similarities and differences among the disciplines we count as sciences, and of continuities and discontinuities between these disciplines and others not so classified. It loses sight of the elements of historical accident and of convention in our classification of disciplines, of the fuzzy, shifting, and contested boundaries of science, and of the sheer variety of the category “non-science.” And it tempts us to forget that “not science” includes many legitimate and valuable enterprises: writing fiction or making art, for example—excluded because they aren’t kinds of inquiry; pure mathematics, legal or literary interpretation, inquiry into moral, aesthetic, or epistemological values—excluded because they aren’t descriptive but normative; not to mention historical research or metaphysics—both, again, by my lights anyway, legitimate and valuable kinds of inquiry but not, in the modern sense, sciences.

Demarcationists often emphasize the importance of distinguishing science from pseudo-science. But this idea obscures more than it illuminates. It uses “pseudo-science” and “pseudo-scientific” as terms of generic epistemological disparagement, in much the same way that “science” and “scientific” are nowadays often used as terms of generic epistemological praise; but what we really want to know is what, specifically, is wrong with the work in question. Calling it “pseudo-science” is no help at all.

You may object that sometimes, e.g., in legal contexts, we really need some way to discriminate genuine science from pretenders. Perhaps the first thing to say is that, even if the law really did need to do this, that wouldn’t mean that the boundaries of science really are sharp and clear; after all, the law really does need to adopt precise definitions of, say, “adult,” or “drunk,” but that doesn’t mean that there really is a sharp line between adolescents and grown-ups, or between someone who’s drunk and someone who’s a bit tipsy. But, in any case, it’s not clear that the legal system really does need to distinguish genuine science from pseudo-science.

The serious issue in those cases about the standard of admissibility of expert testimony should have been, not how to tell whether expert testimony is genuinely scientific, but how to tell whether it’s reliable enough to be presented to a jury.83 The apparent need for a criterion of demarcation of science arose only because Justice Blackmun’s ruling in Daubert confused “reliable” and “scientific.” Similarly, the serious issue in those constitutional cases over the teaching of creation science or Intelligent Design Theory in public high-school biology classes should have been, not whether these are scientific theories, but whether they are religious. The apparent need for a criterion of demarcation of science arose only because one prong of the Lemon test84 for constitutionality under the Establishment Clause requires that a statute have a secular purpose,85 prompting proponents to argue that teaching creation science or IDT does have such a purpose—namely, improving science education; and it was this that obliged their opponents to argue that creation science and IDT simply aren’t scientific theories, and obliged judges to determine whether they are or not.

(iv) “Make an informed guess about what might explain a puzzling phenomenon; figure out what the consequences would be if this conjecture were true; check out how well those consequences stand up to any evidence you have and whatever further evidence you can lay hands on; and then use your judgment whether to accept the conjecture provisionally, modify it, drop it and start over, or just wait until you can get more evidence.” Fair enough; but this methodological advice, if you can call it that, applies no less to historical research, legal scholarship, detective work, or serious everyday inquiry than it does to inquiry in the sciences. “Design a randomized, double-blind, controlled study with a large-enough number of subjects and controls to compare the effects of drug X with the effects of a placebo; use these statistical techniques to calculate the results, those to check for statistical significance; …, etc.” Fair enough—for epidemiologists conducting clinical trials; but this methodological advice is no help to an astronomer, a molecular biologist, a sociologist, or an anthropologist.

And this pattern is no accident: contrary to what methodism assumes, there is no method used by all scientists and only by scientists. Rather there is, on the one hand, the familiar procedure of conjecture and checking common to all serious empirical inquiry; and, on the other, the myriad specialized techniques and procedures devised by scientists in various fields to get more evidence of the kind they need and a subtler sense of where it points. But those underlying procedures aren’t used only by scientists, and those special techniques, which are constantly evolving and often local to a specific field, aren’t used by all scientists.
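To make vivid just how field-specific such techniques are, here is a minimal sketch, in Python and with made-up numbers, purely for concreteness, of the kind of significance calculation the epidemiologist’s advice presupposes; nothing in it would be of the slightest help to the astronomer or the anthropologist.

# A purely illustrative sketch: comparing outcomes under drug X with
# outcomes under a placebo, and computing a conventional test statistic.
# All numbers are hypothetical.
import math
import random
import statistics

random.seed(42)

# Hypothetical trial: 200 subjects randomized to drug or placebo;
# outcomes are simulated symptom scores (lower is better).
drug = [random.gauss(4.5, 2.0) for _ in range(100)]
placebo = [random.gauss(5.0, 2.0) for _ in range(100)]

# Welch's t-statistic for the difference in means.
m1, m2 = statistics.mean(drug), statistics.mean(placebo)
v1, v2 = statistics.variance(drug), statistics.variance(placebo)
n1, n2 = len(drug), len(placebo)
t = (m1 - m2) / math.sqrt(v1 / n1 + v2 / n2)

print(f"mean(drug) = {m1:.2f}, mean(placebo) = {m2:.2f}, t = {t:.2f}")
# By the conventions of this field, a large enough |t| (roughly,
# beyond about 2 at these sample sizes) counts as statistically
# significant; a convention of no use whatever to the astronomer.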

(v) Those specialized tools and techniques have helped the sciences advance; and they have also sometimes been borrowed and put to good use by inquirers in other fields—such as the historians who borrowed medical-imaging techniques to distinguish traces of writing from the effects of weathering on the lead “postcards” Roman soldiers used to write home.86 But when scientific tools and techniques don’t make the work more rigorous but only disguise its lack of rigor, the trappings of science are mistaken for its substance. Of course, the idea that tables, graphs, mathematics, statistical and other technical jargon, etc., will somehow magically make what you do precise, rigorous, and accurate is an illusion. Borrowing the trappings of serious scientific work doesn’t, by itself, make your work serious—any more than getting your citations in perfect BlueBook form,87 by itself, makes you a serious legal scholar. Dressing up dreck in scientific trappings is, in short, the pretense of serious intellectual work, without the substance.

But, it may be objected, isn’t this exactly why we need the concept of pseudo-science, which you dismissed earlier as unhelpful? I don’t think so. After all, while it’s quite common for people in non-scientific fields to dress up dreck in hopes of making it look more rigorous than it really is, scientists themselves sometimes do the same thing. And a bad epidemiological study where elaborate statistical apparatus serves only to distract attention from a biased design is an epidemiological study nonetheless, albeit a poor one. It doesn’t, as the phrase “pseudo-science” suggests, falsely pretend to be science; it is (bad) science. The problem is pseudo-rigor, not pseudo-science.

(vi) By now, many questions once thought beyond the scope of the sciences have been found to be within their competence after all. In the seventeenth century philosophers debated whether a man born blind, if he were made able to see, would immediately be able to distinguish between a sphere and a cube, previously known to him only by touch, simply by looking at them;88 now, we see this as the kind of question to be settled by medical scientists.89 But it doesn’t follow that every kind of question is, or will eventually be, susceptible to resolution by the sciences, that the sciences can colonize every area of culture. The mistake here is to suppose that, because the sciences have made so many remarkable discoveries, there are no limits to their reach. To be sure, we can’t know now what future scientific work might be able to achieve; nevertheless, it’s clear enough that certain kinds of question are simply not susceptible to resolution by the sciences. Even though, as I argued earlier, the boundaries of science are fuzzy, shifting, and contested, not every legitimate kind of question falls within those fuzzy, shifting, contested boundaries.

This thought, however, needs careful handling. Religious people sometimes say that science can explain how things happen, but not why: how species evolved, for example, but not why, not for what purpose.90 Aggressive atheists predictably respond that these supposed why-questions aren’t really legitimate questions at all. I would put it differently: the theory of evolution is by now very well-warranted by a dense mesh of tightly interlocking lines of evidence, and provides an explanation of the origin of species entirely in terms of past causes, without postulating any purpose, plan, or goal. So in this instance, the answer to “for what purpose?” is: not for any purpose. Similarly, when religious people ask, as some did in the aftermath of Hurricane Andrew, “why did the storm destroy their church, but not ours?” my answer would be: presumably there is a meteorological explanation of why the hurricane hit there rather than here; but there was no reason why it hit this church rather than that—that was coincidence. More importantly, whatever your view about the legitimacy or otherwise of the kinds of question to which theology offers answers or the legitimacy or otherwise of the kinds of answer it offers,91 there are many other kinds of legitimate but non-scientific question.

Take a question of public policy, such as whether we should dam this river at this place. Quite properly, we want such decisions to be made on the basis of the best information available. So we look to specialists in hydro-electric engineering to tell us how much electricity we can expect the dam to supply; we look to environmental scientists to tell us what the effects on the ecology of the region would be; perhaps we ask economists or sociologists to estimate costs and benefits to local communities. The scientistic mistake is to imagine that this could be sufficient to tell us whether or not we should build the dam. It couldn’t; even if we had that hypothetical best account of the benefits and the costs, there would still be something left over: whether the benefits outweigh the costs—and this is a matter of judgment, not to be settled by additional factual information or, for that matter, by any decision-theoretic algorithm.

Or take efforts by evolutionary psychologists or neuroscientists to colonize ethics. Evolutionary psychology may be able to explain why altruism has survival value, or to teach us something about the biological origins of what are sometimes called the “moral sentiments”; neuroscience may be able to teach us something about what’s going on in the brain when we feel disgust, righteous indignation, empathy, envy, guilt, remorse, pride, etc. But none of this could tell us whether helping others is morally desirable or, if so, why it is, or how to weigh it against other morally-desirable things; nor could it tell us which of our hard-wired moral sentiments are truly morally desirable and which not—or, again, why.

(vii) When colonizing efforts fail, as inevitably they sometimes do, those of a scientistic turn of mind—perhaps on the principle that the best defense is a strong offense—may respond by denigrating the fields that resist colonization, suggesting that they are inherently inferior, that they are luxuries we really can’t afford, or even that they aren’t really legitimate fields at all. And sometimes, I suspect, scientistic disdain for the different really is little more than a matter of temperament. At any rate, some of those who look down on non-scientific endeavors seem to feel, consciously or otherwise, that making music or art, telling stories, dancing, and the like are—dare I say it?—effete, inherently inferior to the more manly task of forcing nature to give up her secrets.92 All I can say to this is that, as I see it, our culture, and my life, would be much poorer without the work of scientists, and much poorer, also, without the work of playwrights, poets, novelists, composers, artists, etc.—though poorer, naturally, in different ways; and that the too tough-minded simply fail to appreciate the richness of our many-faceted, intertwining human capacities.

Others of a scientistic bent seem to assume that the intellectual work of historians, musicologists, legal scholars and theorists, literary scholars, philosophers, etc., is inherently soft, squidgy, and weak compared with the clean, hard-edged intellectual work of the sciences. It’s true, as Percy Bridgman observed,93 that successful scientific inquiry demands an honest and unshrinking respect for the facts. (Think of Watson and Crick’s willingness to go back to the drawing board and start over after Rosalind Franklin pointed out that their early model had room for less than 10% of the water molecules DNA was known to contain.)94 But while in some non-scientific fields the pressure of facts is looser and less direct, nevertheless, the same honesty, the same humility is required of the serious inquirer whatever his subject-matter—of a historian or a legal scholar, for example, no less than of a physicist or a psychologist.

Do I mean, then, to ally myself with those who feel a professional obligation to defend the humanities against the present predilection of our universities for neglecting these areas and devoting resources instead to what the jargon calls “STEM” subjects (science, technology, engineering, and mathematics)? No, not exactly. That predilection is indeed a kind of institutionalized scientism. But by my lights the appropriate response is not to cast around for arguments that a degree in history or religious studies or Sanskrit or philosophy will really be just as useful on the job market as a degree in computer science, petroleum engineering, or accounting, but to think carefully through the complex pressures to which university administrators are responding: federal policy; half-articulated concerns about the needs of the economy; parents’ and students’ worries about graduates’ employability; and, I’m afraid, an inchoate but not entirely unjustified sense that, in recent decades, the humanities have been in serious decline, not to say near-collapse.

And what, you will be wondering, do I have to say to those who, going even further, maintain that non-scientific fields aren’t just relatively weak or just a little frivolous but outright misconceived, that there is no legitimate inquiry outside the sciences? It’s tempting just to repeat that, on the contrary, there’s obviously a whole host of perfectly legitimate kinds of question that not even the most sophisticated future science imaginable could answer. And for now, that’s about all I can do. But in the next lecture, where I turn my attention to the rising tide of scientism in philosophy specifically, I can take at least a few steps towards a fuller and more satisfying response. The scientistic philosophies in vogue today, I shall argue, are hollow at the core, in principle incapable of providing answers to crucial questions about how the world must be, and how we must be, if science is to be possible; and so they leave the very science on which they rely with no rational means of support. And to answer those crucial questions, I will continue, we need a philosophical approach that is neither purely a priori nor scientistic. Of course, as we’ll soon see, the devil is in the details.

NOTES

1. William James, “The Present Dilemma in Philosophy” (1906), pp. 15, 17.

2. C. S. Peirce, Collected Papers, 5.172 (1903).

3. Denis Diderot, Addition aux pensées philosophiques (c.1762); I rely on John Gross, ed., The Oxford Book of Aphorisms (1983), pp. 24-25.

4. Indeed, in chapter 2 of Defending Science I argued myself that formal models of “scientific reasoning” are inherently inadequate.

5. See, e.g., Michael Shermer, “The Shamans of Scientism” (2002); James Ladyman and Don Ross, with David Spurrett and John Collier, Every Thing Must Go: Metaphysics Naturalized (2007); Alex Rosenberg, The Atheist’s Guide to Reality: Enjoying Life without Illusions (2011).

6. The Oxford English Dictionary online gives two meanings for “scientism”: “a mode of thought which considers things from a scientific viewpoint”; “extreme or excessive faith in science.” The citations for the former, however, are mostly early; and the citations for the latter, described as “chiefly depreciative,” more recent. Friedrich von Hayek, “Scientism and the Study of Society” (1942), p. 269, describes scientism, the “slavish imitation of the method and language of science,” as a “prejudice.” E. H. Hutten, The Language of Modern Physics (1956), p. 273, describes scientism as “superstitious.” Peter Medawar, “Science and Literature” (1969), p. 23, describes scientism as an “aberration of science.”

7. According to von Hayek, although the earliest example given by Murray’s New English Dictionary was dated 1867, this narrower usage was already coming into play by 1831, with the formation of the British Association for the Advancement of Science. Friedrich von Hayek, “Scientism and the Study of Society,” p. 267, n.2, citing John T. Merz, History of European Thought in the Nineteenth Century (1896), vol. I, p. 89.

8. Something similar happened to the word “logic,” which used to mean “theory of whatever is good in the way of reasoning” but, with the rise of modern formal logic, became narrower in scope, and is now mostly confined to good ways of reasoning that can be formally, i.e., syntactically, represented.

9. The word “science” also sometimes refers, not to scientific inquiry, but to the knowledge that results from such inquiry—to the product rather than the process. In what follows, however, I will use the phrase “scientific knowledge” for the latter purpose.

10. Indeed, science fiction has been a significant influence on some scientists: Carl Sagan, for example, acknowledged that Edgar Rice Burroughs’s A Princess of Mars inspired his interest in that planet. See Michael Saler, “The Ship of the Imagination” (2015), reviewing Brian Clegg, Ten Billion Tomorrows (2015) and Matt Kaplan, Science of the Magical (2015).

11. David Wootton, The Invention of Science: A New History of the Scientific Revolution (2015), pp. 163-64 (double-entry bookkeeping), 164ff. (perspective drawing).

12. For example, Max Delbrück came to molecular biology from physics; and Maurice Wilkins (who in 1962 shared the Nobel Prize with Watson and Crick) worked on the Manhattan Project before he turned to DNA. Gunther Stent, “Introduction” to James D. Watson, The Double Helix (1968; critical edition, 1980).

13. Sharon Begley, “The Ancient Mariners” (2000), p. 54.

14. Robert Buderi and Joseph Wisnovsky, “Science: Beaming in on the Past” (1986), p. 75.

15. See, e.g., Spencer S. Hsu, “FBI Admits Flaws in Hair Analysis over Decades” (2015), according to which there were errors in the hair analysis offered by FBI examiners in 95% of the cases studied. The following year, Santae Tribble—exonerated after spending 28 years in prison—was awarded $13.2 million when it was found that the hair analysts who had “matched” hairs found at the crime scene to him were unable even to distinguish human hair from dog hair. Spencer S. Hsu, “Judge Orders D.C. to Pay $13.2 million in Wrongful FBI Hair Conviction Case” (2016).

16. See, e.g., Erica Beecher-Monas, “Reality Bites: The Illusion of Science in Bite-Mark Evidence” (2009).

17. See, e.g., McLean v. Arkansas Bd. of Ed., 529 F. Supp. 1255 (E. D. Arkansas, W. D., 1982), and Edwards v. Aguillard, 482 U.S. 578 (1987) (creation science); Kitzmiller v. Dover Area Sch. Dist., 400 F.Supp. 2d 707 (M.D. Pa. 2005) (Intelligent Design Theory).

18. John Snow, On the Mode of Communication of Cholera (1855).

19. See, e.g., Jacob Stegenga, “Is Meta-Analysis the Platinum Standard of Evidence?” (2011); Massimiliano Copetti et al., “Advances in Meta-Analysis: Examples from Internal Medicine to Neurology” (2013).

20. Thomas H. Huxley, On the Educational Value of the Natural History Sciences (1854), p. 13.

21. Albert Einstein, “Physics and Reality,” in Ideas and Opinions of Albert Einstein (1954), p. 290.

22. John Dewey, Logic, the Theory of Inquiry (1938), p. 66.

23. Percy Bridgman, “The Prospect for Intelligence” (1945), p. 535.

24. James B. Conant, Modern Science and Modern Man (1952), p. 22. More recently, some scientists have suggested that we can see precursors of scientific inquiry even in babies. Alison Gopnik, Andrew N. Meltzoff, and Patricia K. Kuhl, The Scientist in the Crib: What Early Learning Tells us about the Mind (1999).

25. Gustav Bergmann, Philosophy of Science (1957), p. 20.

26. Steven Weinberg, To Explain the World: The Discovery of Modern Science (2015), p. x (my italics).

27. John Maddox, What Remains to be Discovered: Mapping the Secrets of the Universe, the Origins of Life, and the Future of the Human Race (1998), p. 2.

28. For example, according to Sheldon Glashow what led to the decline of Arab science early in the second millennium was the influence of the doctrine of Taqlid, that there are no truths beyond the Koran; and what led to the decline of Chinese science in the fifteenth century was the idea that there was nothing beyond the Celestial Empire worthy of discovery. Sheldon Glashow, “The Death of Science” (1992), p. 28.

29. Wootton, The Invention of Science, pp. 53-54 (the words “experience” and “experiment” began to diverge in the seventeenth century); 103 ff. (the idea of “discovery,” previously used only in the context of geographical exploration, emerged from the Portuguese “descobrimento”); 251 ff. (“fact,” first used in English by Thomas Hobbes); 385 ff. (“hypothesis”); 400 ff. (“evidence”).

30. William Whewell, “On the Connexion of the Physical Sciences, by Mrs. Somerville” (1834), p. 60. “Natur-Forscher” means, literally, “nature-poker.” However, the new word “scientist” caught on only rather slowly, apparently because of its disagreeably mongrel ancestry: the “scient-” part comes from the Latin, but the “-ist” is Greek in origin.

31. Sidney Ross, “‘Scientist’: The Story of a Word” (1962), pp. 71-72 (citing Isaac Todhunter’s biography of Whewell).

32. Francis Bacon, Works, IV, p. 42 (The New Organon [1620], Aphorism II).

33. Editors’ Preface to De Interpretatione Naturae Proemium, Bacon, Works, III, p. 515.

34. Bacon, Works, IV, p. 107 (The New Organon, Aphorism CXXI).

35. My source is J. E. Roe, Sir Francis Bacon’s Own Story (1918), p. 61.

36. See, e.g., Rick Bonney et al., “Citizen Science: A Developing Tool for Expanding Science Knowledge and Scientific Literacy” (2009).

37. See e.g., George S. Counts and Nucia Lodge, The Country of the Blind (1949), chapter 6; Valerii Soifer, The Tragedy of Soviet Science (1994).

38. David Armstrong and Keith J. Winstein, “Antidepressants under Scrutiny over Efficacy—Sweeping Overview Suggests Suppression of Negative Data has Distorted View of Drugs” (2008).

39. Charles Darwin, letter to Joseph Hooker (January 11, 1844).

40. The discovery was made in 1982, by Australian scientists Barry Marshall and J. Robin Warren. Initially, their idea was very controversial; but in 2005 Marshall and Warren were awarded the Nobel Prize in medicine for this work. Francis Mégraud, “A Humble Bacterium Sweeps This Year’s Nobel Prize” (2005).

41. For a summary of this story and numerous references, see Haack, Defending Science, pp. 199 ff.

42. See Horace Freeland Judson, The Eighth Day of Creation: Makers of the Revolution in Biology (1979), p. 39.

43. Oswald T. Avery, Colin M. MacLeod, and Maclyn McCarty, “Studies on the Chemical Nature of the Substance Inducing Transformation of Pneumococcal Types” (1944).

44. Specifically, I don’t mean to suggest, as Kuhn did, that such revolutionary changes cannot be explained in terms of scientists’ assessment of evidence, but must be thought of in essentially political ways. Thomas Kuhn, The Structure of Scientific Revolutions (1962).

45. Susan Haack, “Peer Review and Publication: Lessons for Lawyers” (2007).

46. See, e.g., John Bohannon, “Who’s Afraid of Peer Review?” (2013) (Prof. Bohannon submitted 304 versions of a hoax article to open-access journals, more than half of which accepted it); Charlotte J. Haug, “Peer-Review Fraud—Hacking the Scientific Publication Process” (2015) (authors are suggesting “peer reviewers” whose e-mail addresses are actually fake accounts they have themselves set up).

47. William A. Wilson, “Scientific Regress,” First Things (2016), p. 42. See also Susan Haack, “The Integrity of Science: What It Means, Why It Matters” (2006).

48. Wilson, “Scientific Regress,” p. 42.

49. And at the most extreme, we see what Wilson calls the “Science Cult” manifested in the popular Facebook site “I f—ing love Science!” and the hashtag “#sciencedancing.” Wilson, “Scientific Regress,” p. 42.

50. See, e.g., the essays in Semir Zeki and Oliver Goodenough, eds., Law & the Brain (2004).

51. See, e.g., Paul Harris and Alison Flood, “Literary Critics Scan the Brain to Find Out Why We Love to Read” (2010).

52. See, e.g., Patricia Smith Churchland, Neurophilosophy: Toward a Unified Science of the Mind-Brain (1986).

53. See, e.g., Paul M. Churchland, Scientific Realism and the Plasticity of Mind (1979), and “Eliminative Materialism and the Propositional Attitudes” (1981); Patricia Smith Churchland, “Epistemology in the Age of Neuroscience” (1987).

54. As I patiently explained a quarter of a century ago. Susan Haack, Evidence and Inquiry (1993; expanded edition, 2009), chapter 8.

55. Alex Rosenberg, The Atheist’s Guide to Reality: Enjoying Life without Illusions (2011).

56. Raymond Tallis, Aping Mankind: Neuromania, Darwinitis, and the Misrepresentation of Humanity (2011).

57. See, e.g., Paul Davies, Cosmic Jackpot: Why Our Universe is Just Right for Life (2007); Robin Collins, “The Fine-Tuning Evidence is Convincing” (2013); Man Ho Chan, “Would God Create our Universe through Multiverses?” (2015).

58. The word was coined by Paul Geisert and Mynga Futrell, apparently in hopes that (like “gay” for “homosexual”) it might, unlike “godless” or “faithless,” convey a positive connotation.

59. This was, to be fair, apparently not Geisert’s and Futrell’s intention. But it was hardly surprising that such an ill-chosen new word would be taken to have this implication. Dennett didn’t help matters by writing that “we brights don’t believe in ghosts or elves or the Easter Bunny—or God.” Daniel Dennett, “The Bright Stuff” (2003); nor did Dawkins by writing that “brights constitute … a stunning 93% of those scientists good enough to be elected to the élite National Academy of Sciences.” Richard Dawkins, “The Future Looks Bright” (2003).

60. See Susan Haack, “Credulity and Circumspection: Epistemological Character and the Ethics of Belief” (2014).

61. See e.g., John P. A. Ioannidis, “Contradicted and Initially Stronger Effects in Highly Cited Clinical Research” (2005); Wilson, “Scientific Regress.”

62. Mary Lefkowitz, Not Out of Africa (1997), p. 157.

63. Daubert v. Merrell Dow Pharm., Inc., 509 U.S. 579 (1993). (The ellipses are in the original: Justice Blackmun has strategically excised some key words from the text of Federal Rule of Evidence 702, which speaks of “scientific, technical, or other specialized knowledge.”) See also Susan Haack, “Trial and Error” (2005).

64. In 1968 C. Truesdell gave a list based on a random search of graduate-school catalogues: “‘Meat and Animal Science’ (Wisconsin), ‘Administrative Sciences’ (Yale), ‘Speech Science’ (Purdue), …, ‘Forest Science’ (Harvard), ‘Dairy Science’ (Illinois), ‘Mortuary Science’ (Minnesota).” Truesdell, Essays in the History of Mechanics (1968), p. 75. The list, and especially “Mortuary Science,” became famous among philosophers of science when Jerome Ravetz cited it in Scientific Knowledge and Its Social Problems (1971), p. 387, n.25. The trend continues: “Management Science” and “Library Science” are now commonplace, and on a recent visit to Wayne State University I passed a university building bearing the sign “Mortuary Science.”

65. Daubert, 509 U.S. 579 (1993), 593.

66. McLean, 529 F. Supp. 1255 (E. D. Arkansas, W. D., 1982).

67. Kitzmiller, 400 F.Supp. 2d 707 (M.D. Pa. 2005).

68. See e.g., Peter Bock, Getting It Right: R&D Methods for Science and Engineering (2001), p. 168; Stephen S. Carey, A Beginner’s Guide to Scientific Method (fourth edition, 2011), pp. 305ff. Hugh G. Gauch, Scientific Method in Practice (2003) gives a brief statement (p. 11) of the kind of thing typically found in beginning college texts, but acknowledges its simple-mindedness—though he is himself quite naïve about the supposedly “incisive thinking and penetrating analysis” of Popper and Kuhn. See also the amicus brief submitted in Daubert by the Product Liability Council, telling us that the Scientific Method is: “(1) set forth a hypothesis, (2) design an experiment … to test the hypothesis, (3) conduct the experiment, collect the data, and then analyze those data, (4) publish the results so that they may … be subject to critical scrutiny, and (5) ensure that these results are replicable and verifiable.” Brief of Product Liability Council as Amici Curiae in Support of the Respondent (No. 92-102) 1993 WL 13006388, *3 and n.20.

69. A particularly good example is Sylvia Wassertheil-Smoller, Biostatistics and Epidemiology: A Primer for Health and Biomedical Professionals (third edition, 2004) which, after some brief and very naïve remarks about the “scientific method” in general, turns serious attention to the serious stuff—which, however, applies only in this specific field.

70. Jeremy Bentham, “Anarchical Fallacies: Being an Examination of the Declaration of Rights issued during the French Revolution,” in John Bowring, ed., The Works of Jeremy Bentham (1843), vol. II, p. 501. (Bentham was referring, of course, to the idea of natural rights.)

71. Olli P. Heinonen, Dennis Slone, and Samuel Shapiro, Birth Defects and Drugs in Pregnancy (1977); see in particular the description of the project design and data collection, pp. 8-29. The record in Blum v. Merrell Dow Pharms, Inc., 33 Phila. Co. Rep. 193 (1996), 215-17 shows that Dr. Shapiro admitted under oath that the study had failed to distinguish these two sub-groups of the sample.

72. Claire Bombardier et al., “Comparison of Upper Gastrointestinal Toxicity of Rofecoxib and Naproxen in Patients with Rheumatoid Arthritis” (2000) (the original article); David Armstrong, “Bitter Pill: How the New England Journal of Medicine Missed Warning Signs in Vioxx—Medical Weekly Waited Years to Report Flaws in Article that Praised Pain Drug—Merck Seen as ‘Punching Bag’” (2006) (explaining what went wrong). See also Susan Haack, “The Integrity of Science: What It Means, Why It Matters”; “Peer Review and Publication: Lessons for Lawyers.”

73. David Abrahamsen, The Psychology of Crime (1967), p. 37.

74. Igor Pacheco, Brian Cerchiai, and Stephanie Stoiloff, “Miami-Dade Research Study for the Reliability of the ACE-V Process: Accuracy and Precision in Latent Fingerprint Examinations” (2014), pp. 14-15.

75. FBI, Forensic Science Communications (2009).

76. See, e.g., E. O. Wilson, Consilience: The Unity of Knowledge (Knopf, 1998).

77. Susan Haack, “The World According to Innocent Realism: The One and the Many, the Real and the Imaginary, the Natural and the Social” (2014); “Brave New World: Nature, Culture, and the Limits of Reductionism” (forthcoming).

78. A kind of scientism to which, ironically, William Wilson succumbs when, apparently seduced by snappy rhetoric, he writes that “[t]he problem with science is that so much of it simply isn’t.” Wilson, “Scientific Regress,” p. 37.

79. Nora Barlow, Charles Darwin and the Voyage of the Beagle (1946), p. 151 (referring to Darwin’s Buenos Ayres [sic] notebook recording observations from late 1832 and early 1833, where he describes one specimen as “length: one handkerchief and half”).

80. Haack, “Just Say ‘No’ to Logical Negativism” (2011) gives comprehensive quotations and references for all these points; and shows that the distinguished British scientists who enthusiastically endorsed Popper’s ideas (Sir Peter Medawar, Sir John Eccles, and Sir Hermann Bondi) all misunderstood what those ideas were.

81. It is sometimes said that evolutionary biology does make predictions; but what is meant is only that it predicts what fossils you can expect to find at such-and-such places, not that it predicts how the evolution of species will go in the future.

82. Larry Laudan, “The Demise of the Demarcation Problem” (1983), p. 347.

83. As the Supreme Court finally acknowledged in the third of a trilogy of cases on the standard of admissibility of expert testimony, Kumho Tire Co. v. Carmichael, 526 U.S. 137 (1999), which concerned not scientific testimony but the testimony of a supposed expert on tire design. See also Susan Haack, “The Expert Witness: Lessons from the U.S. Experience” (2015).

84. So-called because it was introduced in Lemon v. Kurtzman, 403 U.S. 602 (1971).

85. Ironically, this was a (perhaps unintended) change from the earlier ruling where the Court had introduced the “purpose” idea, but had spoken of “purposes,” in the plural. Abington Sch. Dist. v. Schempp, 374 U.S. 203 (1963), 222.

86. “Wish you were here,” Oxford Today (1998).

87. The BlueBook, the Bible of law librarians and law-review editors, lays down the very elaborate rules governing legal citations.

88. John Locke discusses this question, put to him by William Molyneux, in the second edition of his Essay Concerning Human Understanding (1694), II.ix. See generally Brian Glenney, “Molyneux’s Question.”

89. Recent work on subjects who had cataracts removed indicates that at first they were unable to distinguish the shapes by sight, but could do so when subsequently re-tested. Richard Held, et al., “The Newly Sighted Fail to Match Seen with Felt” (2011).

90. Biologist Kenneth Miller (co-author of the biology text at the heart of the case), interviewed in Judgment Day: Intelligent Design on Trial, a Nova documentary on Kitzmiller.

91. See Susan Haack, “Fallibilism and Faith, Naturalism and the Supernatural, Science and Religion” (2005).

92. The idea of forcing nature to give up her secrets is Baconian; but to my surprise it turns out that it wasn’t Bacon himself who wrote of “putting nature on the rack,” but Leibniz, describing Bacon’s view. Leroy E. Loemker, Gottfried Wilhelm Leibniz: Philosophical Papers and Letters (second edition, 1970), p. 465 (from a letter to Gabriel Wagner, 1696).

93. Percy Bridgman, “The Struggle for Intellectual Integrity” (1933), especially section II.

94. The story is told by Watson in The Double Helix (1968), p. 59.