
Rational, adj. Devoid of all delusions save those of observation, experience, and reflection.

—Ambrose Bierce, The Devil's Dictionary

What Ambrose Bierce in his usual fierce deadpan manner meant is that observation, experience, reflection, and, we daresay, logic do not necessarily save us from delusions, as anyone with more than a skin-deep understanding of the history and philosophy of science will appreciate.1 Freeze or idealize any method and, presto, you have a catechism that rules out valid and valuable avenues of inquiry. Positivism, anyone? Skinnerian behaviorism? How about rational choice? All these secular priesthoods, as anyone who has crossed swords with them will attest, would like nothing better than to excommunicate anyone they deem heretics.

Scientism is the use of theory to explain reality, as one unknown wag concisely put it, while science is the use of reality to explain theory. Pushed beyond their proper domain, and with all caveats shaken off, these methods instead become barriers to investigative thought.2 “Science and religion have changed places,” philosopher Slavoj Žižek, who can always be counted on for hyperbolic accuracy, observes. “Today science provides the security religion once guaranteed.”3 John Gray adds, “For us, science is a refuge from uncertainties, promising—and in some measure delivering—the miracle of freedom from thought, while churches have become sanctuaries for doubt.”4 The ironies are acute since organized religion was the original culprit in radical diagnoses of alienation.

A pervasive hitch in jeremiads such as Chris Mooney's The Republican War on Science is that they attribute repugnant decisions by the George W. Bush or Trump administrations to sheer ignorance or mere misconstruing of science, rather than to the perpetrators’ interest-driven agendas, to which they force all else to conform. One need not dally much with Machiavelli to realize that it is customary for political factions to pounce on absolutely all events in order to interpret them to their advantage.5 Imagining that political figures care foremost about confining themselves to the dictates of science (or even of the “intelligence community”) is pretty much the Olympian height of naiveté.6 Expecting rationality to reign at these conniving high levels is, one might say, quite irrational. Demanding rationality is fine and necessary, but what exactly is it that is demanded?

The Republicans, in their collective continuous King Canute conniption fit, oppose global climate-change arguments, embryonic stem cell research, and evolution. Trump, on his sordid campaign trail, promised to dump the EPA, withdraw from the Paris climate accord, and indeed repudiated the very idea of climate change.7 Following the money, not scripture, is much more revealing of bedrock motives. Republican atavism is rooted less in medieval lore or tribal fancies than in pecuniary concerns of corporate players who enlist any credulous ally in the fight against restrictions on their quest to commodify everything in sight for power and profit.8 From their lofty Wall Street point of view it is always the perfectly rational thing to do. Nothing can be dispensed properly without shelling out a profit for it to some chokepoint supplier. Without serious mega-donors like the Kochs and their clout, the revived Scopes trial image of modern America, allegedly overrun by petit bourgeois Tea Partiers, would recede into marginalized backwater grousing where it belongs.9

So irrationality isn't evidenced only in braying that the government should keep its paws off social security or insisting that humans coexisted with velociraptors or reading horoscopes or imagining Elvis lives. Irrationality includes believing tax cuts make everybody better off, that austerity policies cure economic stagnation, that high-tech military coercion is wise foreign policy, that Saddam Hussein had nukes ready to fly in forty-five minutes flat, and that indecipherable financial derivative instruments make markets safe and sound.10 One finds plenty of reputable folks who espoused or espouse all those extremely dicey propositions. Some even accrue Nobel prizes and cabinet appointments. Distinctions between delusions and blatant lies might be made in all the preceding cases, but we usually cannot know for sure which is which. Bleeding, after all, once was the last word in medical practice; phrenology and eugenics were respectable and even revered sciences. So we're dealing with a continuum of irrationality wherever we tread.

The insidious predicament is that irrationality dons a host of pleasing shapes, including, most effective of all, the guise of bright shining rationality itself.11 This is hardly a surprise. Karl Mannheim (and Max Weber) had good reason to divide forms of rationality between instrumental (“subordination of all means to a single end”) and substantive (“insight into complex interrelations of matter”), because the former version encourages unchecked excess and ultimately mad actions. From this source we eventually get the paranoid certainties of closed system thinking;12 and the remorseless narrow centralizing view of what James C. Scott calls “seeing like a state”—which is applicable to any powerful entity, public or private.13

No enterprise, not even science, is immune to manipulation, to being bent to the purposes of the wily agent. The satanic figure in ecclesiastical garb haunts all religions, with Spanish inquisitions, Witchfinder generals, and Dostoevsky's not entirely fictional grand inquisitor sowing horrors in the quest to stamp out heretical irrationalities and rationalities alike. Determinism is just the secularization of church dogma, the subordination of intellect to a single narrow track of explanation for any phenomenon. “The mechanical philosophy of Robert Boyle was integral with his natural theology,” Robert M. Young notes:14

Newton's mathematical principles were of a piece with his Biblical ruminations. Charles Lyell's uniformitarian geology was carefully composed so as to be consistent with his views on Genesis, and he never deviated from believing in the separate creation of man, no matter how much he conceded to Darwin with respect to the evolution of other forms of life. Similar claims…were later to be convincingly made about Darwin.15

Early nineteenth-century French political theorist Henri de Saint-Simon, who apparently knew something about human psychology, started Newtonian temples based on allegedly scientific principles, which were “never simply contemplative, but consciously manipulative in design.”16 Charles Sanders Peirce later anticipated, in the purest pragmatic faith, arrival at the single “opinion that is fated to be ultimately agreed to by all” through scientific means, which is what scholars actually intend whenever on a futile quest to establish a unitary social science or a unitary discipline of any kind.17 The dream of all demagogic intellects is to force all thought, all imagination, all inquiry down a common route, congenial to their own emotional needs and institutional aims.

In economics, neoliberals want to liberate us through the tender mercies of the unchecked but state-aided market, which permits no other gods before it. Positivists want to free intellect by chaining it to their desiccated creed, which permits no examination of the social relations in which they themselves are reverently embedded.18 Science for them must have one format or else it just isn't science.19 Hermeneutics, phenomenology, and psychoanalysis do not count; they literally do not count in this perpetual haste to worship the mystique of numbers. Who then legitimately wears the mantle of rationality? “In fear of the growth of an irrational anti-science movement, researchers and those who yearn for the benefits of scientific research have doubled their efforts to dampen the effects of criticism,” Wendell Wallach recently noted.20 “In other words, even criticism contributed to driving forward the technological juggernaut.” That is our counterintuitive theme here.

REASON VERSUS RATIONALITY

Rationality can be as much a crass self-serving slogan as it is a term comprising a desirable means of handling reality; the capturing of the emotive term—and it is emotive—as a standard for one's paradigmatic preferences is one of the key political moves used every day in the social sciences in battles between different branches and schools. The reason Max Horkheimer and the Frankfurt School defined rationality as incorporating an emancipatory intent is that a one-dimensional form of rationality leads not only to dead ends but to dreadful ones too.21 Absent a conscious emancipatory intent to ground rationality in a dynamic critical view of social relations, science easily shifts into and is subordinated to elite agendas of domination. Rationality, when drawing unnecessary inflexible borders around admissible knowledge, detracts from reason.

The resistible rise of rational choice, especially in political science and sociology, is a case of primal urges—the desperate seeking of certainty—posing as saintly bearers of unalloyed truth, whose followers imagine everyone else is less “scientific” than they are. Does the fellow in the white lab coat have anything in common with the yahoo beneath white sheets and hood? They most certainly had crucial things in common during the heyday of eugenics when they shared a crusader's sense of serving the one true faith. These white-garbed warriors backed the Ahabian motto of all the means being sane but the goal being mad. In this same vein Jacques Ellul later dryly summed up the onset of “the technological society” as the increasing application of precise means to carelessly examined ends.22 Yet just a few decades ago this state of affairs had been severely challenged to the point of near extinction.

“Once the restrictive canons of positivism and the naive belief in objectivity were discredited,” Barry Richards wrote in the 1980s, “the reasoning for ignoring psychoanalysis or phenomenology or anything else on the grounds that it was not objective science—was fatally weakened.”23 True at the time. Unfortunately, they came roaring back because the underlying thrust, the emotional and the institutional craving for easy or complex certainties, propelled them back to prominence again. Genetic perfection (despite eugenics), “high tech war-winning weapons” (despite Vietnam), and wonder drugs (despite the anti-psychiatry movement) were far too attractive as pseudo-explanatory tropes to stay down for long.

By the 1970s technological determinism, as a ubiquitous cultural fixation, had lost much of the persuasive power it enjoyed in the early postwar era. Critiques by Lewis Mumford, David Noble, Reinhard Skinner, Langdon Winner, the Frankfurt School, and other eminent analysts subdued the naive zeal animating giddy visions of cure-all gadgetry.24 Yet technological determinism was only pushed shyly to the side, where it remained the creed of choice for technically proficient practitioners at places like MIT, the computer and telecommunications industries, and the wider encompassing military-industrial complex, who see no reason to quibble about the determinative power of the instruments they wield, especially as those instruments underwrite their paychecks.25

“What counts is what can be counted,” is a Laingian double-bind phrase that pretends to liberate inquiry at the same time as it crimps it.26 The slogan, paraphrased, is emblazoned above the Social Science Research Building at the University of Chicago. And it's crazily restrictive, the repository of the lazily industrious mind, all too pleased to settle on a socially approved proposition. The allure of premature certainty beckons us through this widespread Philosophy 101 version of rational activity, an activity, as Hannah Arendt admonished, that was ultimately dehumanizing, as we observe what once were fellow creatures but now are gnats from a distant Archimedean vantage point.27

One unhappy irony of the struggle against Republican Party intransigence on climate change, and similar idiocies, is that supposedly pro-science champions skip merrily past all the withering critiques that had worked so well to keep determinist fancies at bay. A restored certitude arises among ardent defenders of science that rivals any televangelical con man working the crowd or airwaves. Science becomes pretty much what clueless positivists say it is and nothing more. The scene begins to take on the character of a B-movie where a scientist assures everyone that there is a perfectly rational explanation just as the monster descends. One is supposed to forget that behind scientific authorities is a panoply of institutional interests they work for in a commercial realm of ever-more maldistributed power.28 Determinist ideologies of this sort suit authorities because they are peddled and packaged by the same authorities to shore themselves up.

So determinism, in many beguiling forms, slips back into public discourse. We now are supposed to know that there is a pinpointable gene for every behavioral quirk, a rising from the grave of eugenics in spiffy new embodiment. We supposedly know in the same dubious way that pharmaceuticals “correct” chemical imbalances and dopamine deficiencies. Half the United States now is drugged to the gills, mostly legally. Finally, we trust that statistical apparatuses are utterly reliable guides to military security matters. Underlying all these confident claims is a resurgent faith in the old-time religion of obsolete nineteenth-century notions of science. Against the allure of those notions Wallach cites the classic but deeply underappreciated conundrum that each decrease in “black swans” just “adds layers of complexity that breed more black swans,” which, as Schwartz long ago noted, leads to underestimates of the occurrence of “low-probability” misfortunes, and then “to a chain reaction of unanticipated events or underanticipated events.”29 Only a dynamic sensibility about science can cope with or avoid the primly static paths that exacerbate these problems.

In Vietnam, the Hamlet Evaluation System (HES) and the entire contrived menu of McNamara's cherished indicators attested to a progress that, under scrutiny, turned out to be a figment of the measures themselves. In economics, derivatives were hailed as wise instruments for market protection until they caved in the economy. The economists were and are bedazzled by the supra-reality of numbers, as if numbers themselves were not metaphorical. Defenders of science often embrace overhyped depictions of the causal magic of genes, of brain chemistry, of artificial intelligence, and of magic bullet warfare. These latter phenomena are modern siren songs, not irrefutable arguments, and as usual they obscure underlying key social forces and purposes at work.

An emblematic 2017 London event entitled “A Celebration of Science & Reason”—and a mere snip at 117 pounds a ticket (who can afford their brand of reason?)—featured Richard Dawkins and Sam Harris, two chaps as bound by blinkered views of science as any Bible belter is by their own vaunted strict interpretation. The counterattack on much plainer fools makes any professed enemy of superstition and tradition appear to be a gallant enlightenment hero. Scientism, the misapplication of a form of science from one domain to another to which it is not suited, is a persistent pestering presence. The best scientists know their techniques are not universal skeleton keys, but not all scientists are the best scientists, as any scan of scientific scandals will attest.30

Many scholars, apart from the Frankfurt School, warned against overvaluing dull instrumental logical apparatuses for unraveling tough human puzzles. The author of The Republican War on Science blithely regards anyone who disputes the soundness of genetic engineering as an idiot, and invokes Steven Pinker to grouse that “the genetic underpinnings of human behavior” have often gone unstudied because of a “general left-of-centre sensibility that anything to do with genes is bad.”31 Pinker is enamored of the “notion that a tendency toward violence might have genetic roots” even or especially if it raises among the spoilsport Left “the fear of Eugenic-style solutions.”32 This latter accusation conveniently is scored as an “abuse of science” for which conservatives award themselves points. Yet the fundamental question underlying these debates is, “Are we containers of objectified forces,” as Smail succinctly puts it, “or do we pursue our own needs and wants?”33

GENETICS: REGRESSION TO THE MEANEST?

Science and medicine in the nineteenth and early twentieth centuries, saturated by a casually racist and class-ridden Western culture, generated the grim crusade of eugenics, and there is nothing in twenty-first-century life to lead us to expect that genetic research, even if conducted by improbably angelic sages, will be free of originating eugenic ambitions. Even the British Medical Association soberly had to admit that despite “efforts constantly made to distinguish between eugenics and genetics, the two are clearly related.”34 The best stock, you see, must win out, although many fretful upper-class specimens concluded that they had to fight for their survival against a rising tide of mediocre biological material constituting the masses. The Nazis took this stern frame of mind to the limit and beyond; Nazism was, according to one pithy summation, “nothing but applied biology.”35 A happy eugenic ending requires a bit of ruthless pruning.

The seductive notion of good stock—always “us”—remains quietly pervasive in popular culture. In Western industrial countries this terribly self-flattering idea hooked up with technological supremacy, which was believed to accompany, through some sort of divine providence, one's superior biological heritage. For a “generation fearful of race decline, the new technology seemed to buttress the dominant position of the white race,” Michael Sherry noted of the turbulent interwar era. “The vision of air war that emerged before WW II was shaped by the broad currents of racism and faith in progress.”36 How lucky affluent WASPs were that, as Max Weber reminds us their clergy preached, virtue and power strutted hand in hand.37 The “nullity” of “specialists without spirit, sensualists without heart,” as Weber foretold, now runs rife in assured urbane pedigreed ways. It is, happily for the winners, genetic destiny.

Late nineteenth-century race theory imposed, in Mahmood Mamdani's words, a “marker dividing humanity into a few superhuman and the rest less than human, the former civilized, the latter putty for a civilizational project.”38 The harrowing Nazi experience was the crest point for eugenics, but the dotty dream of biological perfection—perfect relative to what?—persisted, and eugenic sterilization went on being practiced in some US states well into the 1970s, targeting the poor and minorities. In the tidy eugenic universe, if you lack resources to fight back you clearly count as a confirming instance of inferiority. Eugenics did not so much disappear after 1945 as go stealthy. The animating conceits clung tenaciously on.

A core conceptual conceit here is the notion that genes are perfectly self-contained, single-action entities designed to produce invariant outcomes: blue eyes, snub noses, schizophrenia, and inclinations to obey or break traffic laws. Yet the most reputable investigators discovered that genes are far from single-action or immune to environment: “redundancy seems to be built into gene functions, where, if one is ‘knocked out,’ others seem to supply the same function,” Evelyn Fox Keller summarizes, which ought to pose an insuperable problem for the popularized “explanatory framework of the genetic paradigm.”39 Nonetheless, the newspapers and media still overflow with breathless routine announcements of discoveries of genes for this or that or the other behavioral trait.

Negative eugenics augured mass murder; positive eugenics involves prenatal genetic testing and in vitro screening for the purpose of preventing the birth of “carriers” of undesired genes. The laudable aim of curing diseases is deployed to camouflage the underlying and undying elite dream of exerting complete control over the behavior and the beliefs of the population.40 A generously funded apparatus of labs, and an accompanying medley of credulous media outlets, have been tilting the nature-versus-nurture debate firmly toward nature. Yet the Human Genome Project, ballyhooed to the heavens, came up with distressingly paltry results, given what its architects expected to find.41

The first schizophrenia gene allegedly was nabbed at University College London in 1991, but nobody since has replicated this or any other such claim of genes for mental maladies.42 “The hypothesis that enduring psychosis/schizophrenia has strong genetic underpinnings,” a 2016 survey concludes, “is not supported by available evidence.”43 Robert Plomin remained hopeful, though puzzled, as to why genetic research into behavioral traits has yielded so little.44 The next backstop was the proposition that there are likely “a number of susceptibility genes which interact with one another and with environmental effects.”45 That extenuating collective tack did not work out either, but implacable pursuit squads of researchers remain sure that the right amount of money, technical devices, and pluck ultimately will snare the long-sought genetic culprits galloping away just over that next high ridge.

What then is the statistical significance outcome of all the claims that a gene, or a distinct group of genes, has been located with exactitude as causing inclinations to be gay or to bite your nails or to question authority? Well, it's around zero point zero something. On that patently flimsy basis one daily encounters scientists—securely cocooned in upper-middle-class lives—boldly stating that we must not be afraid to “face facts,” just as German scientists in the 1930s faced facts of their own making and did what was required. One need not look to Nazi Germany, though, which took its cue from Indiana and North Carolina, where sterilizations were prescribed for the lower orders, especially those not white enough. Let us suppose that hardnosed scientific authorities like Charles Murray or James Watson found that most unexpected of findings: that their own genetic composition was inferior or “tainted” (as eugenicists liked to put it). Would they then have agreed blithely to have their bloodlines rubbed out? This is the baleful blinkered breed of scientists that Kurt Vonnegut, putting his University of Chicago anthropology experience to some good use, mercilessly satirized in Cat's Cradle, and they are always around because class society is always around and neuroses too are always with us.

The cumulative result in the research sweepstakes for larger portions of funds for diminishing results is significance ratings approaching the odds of drawing a royal flush on first deal. Jay Joseph and Claudia Chaufan piquantly note a Colorado Adoption Project twins study that found “the mean personality scale correlation between birthparents and their adopted-away biological offspring”—a relationship that they considered “the most powerful adoption design for estimating genetic influence”—was zero.46

The meaning of heritability is conveniently confusing, misleading the public into believing that there is a direct percentage of probability for the occurrence of each disease or trait. Joseph and Chaufan, though, point out the elementary fact “that genes and environments do not build phenotypes independently, so there is no common unit of measurement that enables us to say that in one individual, genes ‘caused’ x percent of some trait, while the environment caused y percent.”47 The press, culturally gullible and opportunist, chooses to print the myths. The real loss is to studies of environmental influence on mental health maladies.48 According to the dogma, no parent, or set of social conditions, anywhere at any time can ever inflict mental damage on a child who is not predisposed to suffer it anyway. One can see why parental organizations adore this congenial line of argument.

What about Pinker's whining about left-wing interference? “In 1965 a team of serious researchers suggested that XYY males were more aggressive,” Alvin Rosenfeld noted. “That finding was disseminated widely and contributed to a heated debate about behavioral genetics. In 1993 a report for the National Academy of Science dismissed the link as unproven and nothing of the kind since has withstood scrutiny. In those intervening years, many parents were warned about their XYY children's potential for aggressiveness.” How many families were hurt, he asks, “because parents were worried that their children might grow up to be murderers?”49 Pinker, however, is dead sure that science will find otherwise. As David Bell aptly writes of Pinker's latest tome Enlightenment Now, his stance is one “in which data and code are all too often held to trump serious critical reasoning and the wealth of the humanistic tradition and of morally driven activism is dismissed in favor of supposedly impartial scientific and technological expertise.”50 Just keep pumping money and you're sure to find something that pleases Pinker. Or are you?

Ruth Hubbard reports that DNA neither “self-replicates” nor “self-transcribes” nor “self-translates”: the cell, in response to a host of internal and external signals, “transcribes segments of DNA and translates the resulting RNAs into proteins.”51 The notion that environmentally induced changes can be inherited has long been regarded as Lamarckian folly, but as Stuart Newman points out, it is now being incorporated, with whatever degree of chagrin, into evolutionary theory.52 As David Moore writes, “We now know that DNA cannot be thought of as containing a specific code that specifies particular predetermined (or context independent) outcomes.”53 A NASA study recently found that the expression of about 7 percent of an astronaut's genes remained altered after a year in space.54

In Evelyn Fox Keller's words, DNA “does not even encode a program for development.”55 The idea of a predetermined code, of DNA as the rigid template of heredity in which our fates are transcribed, is at odds with phenotypic plasticity, the ability of organisms to adapt to the demands of their particular ecological niches. What is it to say, asks Evan Charney, “that all the features of a person are in her genes, but to take all the attributes of a person and ascribe them (or their ‘predispositions’) wholesale to what is in effect a little latent person—the inherited homunculus-genome? The reigning ideology of DNA.”56 Indeed.

None of this is news to anyone paying attention outside the mission-oriented research bubbles of gene hunters.57 Psychiatrist Hervey Cleckley could observe presciently in 1941 that “distinctions between organic and psychogenic are sometimes far from absolute.”58 The organism changes in response to every item of experience. “It would not be profitable to confine our concept of what is organic to the cellular level with so much already known which indicates that molecular and submolecular changes (colloidal, electro-chemical, etc.) are regularly resulting from our acts of learning, or, if one prefers, from all of our conditioning.”

One can cite many more examples of clear-eyed misgivings about gene research, all the way back to the 1940s and earlier. All were ignored by researchers avid to prove otherwise. This current credulous phenomenon promises a farcical repeat of the eugenics reign, where a dubious but popular creed moved side by side with Mendelian scientists whose investigations refuted the whole basis on which eugenics enthroned itself. The word somehow did not get out. Ignoring refutations is the conventional researcher's first line of defense. Philosopher of science Thomas Kuhn, according to some of his interpreters, seemingly approved of this conservative methodological practice as one that assured that only the hardiest anomalies would rupture a reigning paradigm.59 Jonathan Leo recalls,

While there are famous adoption studies and there are famous twin studies, these are separate studies. The famous adoption studies did not study adopted away twins. I once heard the Department Chair of a major university Psychiatry Department make the same mistake in a seminar, and when challenged about it he, and the entire room of biological psychiatrists, stood by the claim. Be careful of the echo chamber in your field.60

Every scientific field, as is inevitable in any institution, features its echo chambers, sometimes known as “epistemological communities.”61 Physicians Michael Joyner, Nigel Paneth, and John Ioannidis found that $15 billion of the $26 billion in NIH extramural funding was linked to the search terms gene, genome, stem cells, or regenerative medicine—but yielded little in worthwhile results.62 One wonders whether, several decades hence, this enterprise will be viewed as a Western-style Lysenkoist folly, in which the weight of both corporate and enlisted state power expensively sustains a dead-end research program because it tickles the egoistic fancies and social agendas of the funders.63

BIG PHARMA AS PANACEA

Genetic determinism ties in all too symbiotically with pharmacological determinism. The familiar kindred credo is that one miracle pill makes you bigger and a different pill makes you smaller. The public is saturated with canny adverts—trailing interminable strings of small-print caveats about side effects—boasting that there is just the right drug available to elevate troubled people from or plunge them into any designated mood or physical condition.64 If you suffer from a personality disorder, it is because of a serotonin imbalance, or if you are depressed it is because of a lack of norepinephrine. Rather like a basket of limes cures a ship full of scurvy.65

The pharmaceutical industry wants everyone to understand that their patent medicines—like genetic determinists claim for genes—exert precise one-to-one actions targeting our individual ills, woes, and even existential insecurities. Cyanide or arsenic in doses adequate to the fatal purpose may display such properties, but few other drugs do. Opioids apparently work all too well in their limited domain of relieving pain, to the point of snuffing out 64,000 Americans (especially in Appalachia and the Southwest) by overdose in 2016 and being declared a public health emergency in 2017 (without needed funding to address it).66

In 2003, to curiously scant media fanfare, a senior executive at Britain's biggest drug firm admitted that most prescription medicines do not work on most people. Fewer than half of patients prescribed drugs seemed to benefit, however one defined “benefit.”67 In Britain his comments came shortly after it was learned that the NHS drugs bill had risen 50 percent in three years. Dr. Allen Roses, an academic geneticist from Duke University, cited figures showing that Alzheimer's disease drugs work in fewer than one in three patients, while those for cancer were effective in a quarter of patients. Drugs for migraines, osteoporosis, and arthritis work in half the patients. By the industry's own measures, efficacy rates in percentage terms were: Alzheimer's, 30; analgesics (Cox-2), 80; asthma, 60; cardiac arrhythmias, 60; depression (SSRI), 62; diabetes, 57; hepatitis C (HCV), 47; incontinence, 40; migraine (acute), 52; migraine (prophylaxis), 50; oncology, 25; rheumatoid arthritis, 50; schizophrenia, 60.68 Side effects don't even come into it. Nor, since the start of the millennium, does it seem to matter that 100,000 die a year from adverse drug reactions and a million or more are harmed by drug reactions that require hospitalization.69 For psychiatric drugs, “the latest best estimates as to the percentage of people who benefit over and above placebo effects are 20% for antipsychotics and even less for antidepressants.”70 John Read, Olga Runciman, and Jacqui Dillon reported that

A survey of 1829 people on antidepressants found the following rates: sexual difficulties (62%), feeling emotionally numb (60%), withdrawal effects (55%), feeling not like myself (52%), agitation (47%), reduction in positive feelings (42%), caring less about others (39%), and suicidality (39%).

They also note that in twenty-four of twenty-five nations surveyed the public believes that social factors play a bigger role than genes or chemical imbalances in causing mental health problems, the sole exception being the United States, which speaks to the incessant institutional drumbeat that genes and chemical imbalances cause everything we do and cannot do.71 Dawkins's selfish gene evidently is deeply neurotic.

What is most interesting for our essay about this admirably candid event is how the researchers tried to explain it. "I wouldn't say that most drugs don't work," one university researcher opined to the press.72 The problem is not the drugs themselves or the firm's policies but rather that the "recipients carry genes that interfere in some way with the medicine." It can't possibly be because the drugs do not work in the advertised way. Therefore, it stands to reason that the genetic structure of the patient perversely prevents some of the drugs from doing their jobs. Psychopharmacogenetics, as a field, arises with a mission to fit square pills into round mouths. Nothing much has changed regarding drug efficacy in the intervening years, but drug prescribing only goes up. Seven out of ten Americans gobble prescription drugs.73

At the same time, the US government since Nixon has conducted an extraordinarily extensive and harmful war against illegal drugs, regardless of their actual ill effects, while enshrining the right of pharmaceutical companies to peddle useless, addictive, and suppressive drugs as sure cures for the purely internal ailments (for which environmental factors supposedly do not matter) we suffer. Nothing in modern life, with its infliction of insecurities on the mass of the population, allegedly can cause any mental state tantamount to depression or schizophrenia (which seems to mean anyone the diagnosing shrink fails to relate to). That is a rock-solid article of faith, and a lot of already overpriced stock values are riding on it. The purpose of the drug war has far more to do with social control of the surplus population—surplus, that is, to the profit system's need for their services. Meanwhile, pharmaceutical firms casually raid the public purse for free R and D support, tax breaks, and grants.74

One cannot help but remark that if there were a smidgen of actual logic to these interacting determinist streams (genetics and pharmaceuticals), US authorities would give up on incarceration as punishment for an estimated 27 million illegal partakers, for they would be deemed victims of genetic urges they cannot help.75 Then again, the authorities, by dint of their rigid punitive cast of mind, only proved they were genetically coded to do what they did. One sees how tormented and contorted these glib determinisms can quickly become.

Against all these muddled deterministic frames of mind stood figures like Freud, whom the "hard scientists" despise. Author Peter Gay notes that psychologists of Freud's era thought he was "not biological enough," because they ultimately believed mental disturbance was rooted in "heredity or some physical trauma." Freud seemed just a soft-headed environmentalist:

Indeed, in 1912, Freud found it necessary to defend himself, a little irritably, against the charge that he had “denied the significance of inborn (constitutional) factors because I have stressed that of infantile impressions. Such a reproach stems from the narrowness of the causal needs—Kausalbedürfnis—of mankind,” which likes to posit a single cause if at all possible.76

So Freud tartly identified the primordial urge underlying determinism and its staying (and returning) power.

MILITARY MATH

The final case of the return of irrational determinism that we will address is a military instance. For all the routine reverential comments one hears in policy circles about heeding military theorist Carl von Clausewitz's admonitions about the “fog of war,” leaders, especially in superpowers, like to imagine they can enforce their will on others through the right algorithmic schemes. Take Vietnam: “[Robert] McNamara was both the product and the servant of a society that likes to express itself in the grammar of violence, and he was caught up in a dream of power that substituted the databases of a preferred fiction for the texts of common fact,” Lewis Lapham writes. “What was real was the image of war that appeared on the flowcharts and computer screens. What was not real was the presence of pain, suffering, mutilation, and death.”77

In what is really the crapshoot of war, the minders of militaries are drawn to formulaic solutions to vexing phenomena such as counterinsurgency operations in areas in full-scale revolt. In Vietnam, right up to and beyond the end of the war, authorities employed the statistical Hamlet Evaluation System (HES) to concoct a sense of steady progress in extinguishing resistance in the vast countryside, which they achieved largely by wiping out or rousting out the population. An 80 percent rural southern populace at the start of the US intervention was battered, pummeled, shelled, and pulverized down to 30 percent by war's end—and the other side won anyway.

The HES, surveying some twelve thousand South Vietnamese hamlets, was the numerical lynchpin of counterinsurgency science. There are of course certain queasy questions that the analysts were not allowed, let alone inclined, to ask about their codifying task. In mission-oriented science, the usual principles of science take a coerced vacation. The ruthless criticism of everything existing is an alien concept. Still, the practitioners regard their lore as a science, a science of domination, as if the whole purpose of science were to dominate man and nature. It does not pay to understand the enemy; humanizing the opponent can be quite inhibiting.

The Hamlet Evaluation System relied on reported values, not inherent ones. But once enshrined in a few digits, these values take on the alluring, vaporous facade of credibility. There is not much distance from the blithering statistical faith that animated the Hamlet Evaluation System in Vietnam to the green screens guiding drone programs today, which serve much the same function abroad and are only beginning to get public attention. To this day one encounters credulous reports that invoke the Hamlet Evaluation System survey to buttress claims that the US really won the war in Vietnam (by suppressing the southern guerrilla insurgency) but bugged out anyway.78 Moyar, Sorley, and other "revisionists" pounce on it.79 The only problem is that the facts the HES depicted were mostly a figment of military imagination and field-level expediency.

If you read between the lines of one US general's retrospective comments about his predicament in that Southeast Asian guerrilla war—that it "is difficult for this democracy of ours to deal with the political dimensions of insurgency" because the "arbitrary and often undemocratic controls required" do not "go down well back here at home"—one gets a sense of the chafing these people felt.80 It is terribly hard to be a soldier in a democracy, because one must strive to conceal what one needs to do. In rural Vietnam the Americans readily resorted to "recon by fire," torture, crop destruction, defoliation, killing of farm animals, and random killing of villagers, with every military-age man regarded as a traitor or instant draftee.81

From 1967 to 1973, HES, based on confidence scores across eighteen criteria, rated A, B, and C hamlets as "Secure," while D and E hamlets were "contested" and V hamlets "enemy controlled," and therefore candidates for obliteration. Would anyone out there raise a hand to signal they deserved the lattermost ranking? If not, might you feign cooperation? "The basic objective of increasing the population living in security from the enemy was indeed achieved," according to implacable CIA chief William Colby.82 While he claims the Mekong Delta was basically pacified by 1970, less enthusiastic observers saw a "progress" built not on converting the peasantry into fans of the South Vietnamese regime but on erasing them altogether from the landscape. "My own research suggests that it was not a carefully crafted military strategy of counterinsurgency that led to the apparent ‘pacification’ of the Mekong Delta and many other areas of Vietnam by 1971," writes David Elliott, "but a policy of rural depopulation that emptied much of the countryside—probably not a tactic that should be repeated in Iraq, or even one which is relevant to the more urbanized Iraqi society."83

The Hamlet Evaluation System indicated a 15 percent decline in the number of secure hamlets in February 1968. CIA analyst George Allen noted that although the "enemy had suffered heavy losses, their forces appeared to be regrouping and could mount further large-scale action in a matter of weeks."84 Historian Gabriel Kolko warned at the time that "the question is not who claims control but who really possesses it" and that "areas, villages, and large population concentrations the NLF [National Liberation Front] operationally controls frequently cooperate in Saigon-sponsored surveys and projects to spare themselves unnecessary conflict with US and Saigon forces."85 None of these nagging realities were heeded up the chain of command.

In January 1972, 42.17 percent of villages supposedly harbored no NLF (Viet Cong) infrastructure; 43.86 percent experienced sporadic insurgent covert activity; only 9.9 percent suffered regular covert activity and sporadic overt activity at night; the NLF controlled 1.88 percent at night only; and 2.12 percent were fully controlled by insurgents. Yet NLF forces usually maintained threatening footholds nearby.86 Two Vietnamese generals afterward reflected on the pacification program:

Experience indicated, however, that in due time those enemy units which had been destroyed were surfacing again. Apparently, they had been regrouped, refitted, and reorganized in base areas with the manpower and equipment infiltrated from North Vietnam. The maintenance of area security had therefore become a frustrating task, for no matter how dense our outpost system or how well motivated our troops were, the enemy could always find loopholes to penetrate and weaknesses to exploit. Ups and downs in village security were an inevitable reality we had to face.87

In the HES “fraud, though not rare, was less common than the understandable tendency to resolve all ambiguities in the direction the incentives for evaluation and promotion led. Gradually, it seemed, the countryside was being pleasingly pacified.”88 The operative word here is “seemed.” McNamara created a means for tracking “legible progress, but also blocked a wider-ranging dialogue about what might, under these circumstances, represent progress.”89 RAND researcher Anders Sweetland in 1968 noted, “Thus far, our search for a person who feels neutral about the HES has been fruitless. People are either for it or against it, with the ‘agins’ outnumbering the field six to one. No measure in the theater has been so thoroughly damned.”90 RAND colleague Austin Long reevaluated Sweetland much later, gamely observing that “Sweetland found HES to be a reliable set of metrics, if one accepted that no objective criteria for measuring pacification existed” (italics mine).91 This is rather like pondering, if pigs sprouted wings…

David Elliott contends that the boasted success of pacification in 1970–71 was "temporary and largely the result of the depopulation of large areas once controlled by the revolution, as a consequence of incessant bombing and shelling."92 Regarding the relativity of irrationality, he found that "rational calculations are not enough, however, to explain the tenaciousness of the hard core [of the NLF]."93 In 1971, supposedly the high-tide mark of pacification, HES conceded that 45 percent of the officially "safe" villages were located within one hundred meters of a recent terrorist incident, and that no official was safe at night.94 The Office of International Security Affairs (ISA) inside the Defense Department reported likewise that more troops were pointless and that any gains in hamlet evaluation "were results from accounting changes" and "not from pacification progress."95

Historian David Hunt emphasizes the peasantry's "refusal to be pawns of modernization" and highlights that the "irony was that the people's war launched in 1959 had been defeated, but the soldiers' war, which the United States had insisted on fighting during the 1960s with massive military forces, was finally won by the enemy."96 "In carrying out their task of making emergent guerrilla tactics legible as part of some overarching strategic vision," a recent evaluator delicately put it regarding HES, "these systems failed to approach the ontological question of what actually characterizes the supposed ‘rational’ or ‘obedient’ subject in asymmetric warfare."97 Rationality never mattered so long as authorities imagined they could afford to be irrational in pursuit of goals that have many international relations scholars still scratching their heads. What Freud called the seductive lure of the single answer, Kausalbedürfnis, intersected with arrogance, which explains rather a lot about the embrace of determinist frameworks and the consequent abandonment of reason.

No one ever officially conceded that the HES was fundamentally flawed in conception as well as execution. One can marvel at the extraordinarily strained and tortuous appreciations of it infesting refereed academic literature today.98 The determinist impulse behind it resurged in the war on terrorism—as has counterinsurgency theory, sans the good sense not to apply it—and is exemplified especially in all the mumbo-jumbo technology involved in drone warfare. The promise of precision guarantees victory, or so another generation of purportedly rational and highly lettered leaders imagines.99

Collate the data. Flip a switch. Out of the blue comes a Hellfire rocket pulverizing everyone in range. This time around one supposedly wins hearts and minds by selectively, rather than indiscriminately, killing relatives and friends. The United States devises an "approach to counterinsurgency warfare and border policing," writes Gusterson, "that is organized around new strategies of information gathering, precision targeting, and reconceptualizing enemy forces as a cluster of networks and nodal leaders."100 Collateral damage casualties are lowballed for US public consumption.101 Afghanistan is the most heavily drone-patrolled patch of digitized land on the planet. Drone strikes in Iraq receded under Obama from sixty in 2008 to none in 2009, and recommenced in Syria in 2014. Notice any decline in resistance as a result?

Military-technological determinist fancies are as intoxicating as ever. These "machine dreams" resurge because their sponsors believe it is perfectly rational to deploy the latest gadgets, because they anticipate a rational domestic audience will be too busy or gulled to care, because foreign hosts will see their use as the most rational choice among a lot of bad ones, and because, in their view, rationally speaking, might makes right. As in Vietnam.

CONCLUSION: LASHED TO THE MAST

In 1970, Robert M. Young correctly reckoned that "the history of brain and behavior research can be seen as the progressive abandonment of faith in a one-to-one correlation between the categories of analysis and the functional organization of the brain on the one hand, and the analogous variables in behavior on the other."102 The same should be said for the gene and for pharmacology, but too few people seem to know it. "Genetic Determinism," two British geneticists advised Guardian readers recently, "is not a concept used by practicing geneticists."103 Yet determinist simplicities easily insinuate themselves, with the help of profiting institutions, into public lore.

A social amnesia, to use Russell Jacoby's term, seems to have set in regarding the risks and perils of denying "scientific fallibilism," that is, the sober acknowledgement that "at no actual stage does science yield a final and unchanging result," as Nicholas Rescher writes: "All the experience we can muster indicates that there is no justification for regarding our science as more than an inherently imperfect stage within an ongoing development."104 "Of course scientific curiosity should be encouraged (though fallacious argument and investigation of silly questions should not)," Noam Chomsky adds. "But it is not an absolute value."105 The point is not to devalue science but to acknowledge that human interests can be concealed within scientific pronouncements, and to recognize that science occurs in a social sphere and, like it or not, has a social function that can be steered to partisan ends.

The sirens in Theodor Adorno and Max Horkheimer's Dialectic of Enlightenment represent overpoweringly sensuous irrationality, but Odysseus's resort to an instrumentally rational tactic, if instead adopted as standard operating procedure, may only end up substituting a different stern songstress to misguide the voyage. Lashed to the mast by his crew, Odysseus listens to the sweet tempting sirens without fear of acting on the dangerous urges they stir. "So instrumental reason can successfully combat myth," as Curtis Bowman notes, "but only at the price of re-establishing a new Myth," that is, a Myth of highly desirable instrumental control.106 The sirens "also represent a threat to narrative control," that is, a threat to the narrative control of proponents of instrumental reason, which Odysseus temporarily and expeditiously overcomes.107

The trouble is that such a choice is rarely temporary or situation-specific, as in The Odyssey. Instrumental reason vies with, and also intersects with, objective reason, or it should. Why then are the sirens in this rendition in distress? They are, after all, not the avatars of objective reason as Horkheimer and Adorno define it; but lacking the dramatic emotional experience they offer, one is left with nothing but a desiccated, one-dimensional instrumentalist frame.108 Odysseus has it both ways, and he must in order to be substantively rational in Mannheim's sense, or rational in the Frankfurt School sense. The sirens may lure him onto the rocks if they can, but what if he needs to hear, incorporate, and, as it were, inoculate himself with their voices to become truly rational, so as to navigate more treacherous realms ahead, inner and outer? It seems our shoddy modern versions of Odysseus are determined to plug their ears against the voices of objective reason too.