CHAPTER 7

The Pursuit of Genius

We do not know how or why genius is possible, only that—to our massive enrichment—it has existed, and perhaps (waningly) continues to appear.

HAROLD BLOOM, Genius

In 2000 when I embarked on what would prove to be a haphazard program of research on Einstein’s cerebrum, I posed a straightforward question: “Why does Einstein’s brain exert such irresistible attraction for neuroscientists and people from all walks of life?”1 My explanation for the unflagging interest in a brain preserved in formalin for over six decades was, and still is, best encapsulated in a single word—genius. And as we seek to understand what makes a genius tick, is there any more promising place to begin than with “the twentieth century icon of genius, Albert Einstein?”2

On the surface, I answered a simple question with a breezy answer, to be sure … but just how do we go about defining genius?

In this book about a brain, I must own up once again to the problems that inevitably arise when discussing one of the highest attainments of the human mind—genius. My default stance as a neurologist is that the brain has a lot to do with the mind, but this chapter does little or nothing to bridge the explanatory gap separating brain science and mind science (of which genius is a topic par excellence). So why read on? I can only presuppose that you are turning these pages because of your interest in: A) Einstein, B) brains, or C) genius. C now takes the stage, front and center, because it is emblematic of our personification of Einstein, it is a scarce and remarkable outcome for the human condition, and even if we disregard neuroscience with the concession that “a materialist definition of genius is impossible,”3 we are transfixed by it. Why?

Pick your favored cultural “carrots” (as opposed to “sticks”) along the lines of “fame, fortune, and happiness,” and it won’t be long before you seize upon genius as an effective means of grabbing a bunch. Genius as practiced by Einstein or Shakespeare can leave an indelible imprint on posterity (fame). Genius wielded by the likes of Steve Jobs or Bill Gates can lead to the accrual of staggering wealth (fortune). The happiness part of the genius triad is a little more uncertain and problematic … as in tortured geniuses. The psychologist Kay Redfield Jamison studied the incidence of mood disorders and suicide in thirty-six English and Irish poets born from 1705 to 1805. Byron, Keats, Shelley, Wordsworth, Samuel Johnson, Coleridge, and Blake were among these creative geniuses, and their mental health “scorecard” was discouraging—more than half suffered mood disorders, two committed suicide, and four were institutionalized in asylums.4

Okay, two out of three isn’t bad … and there are undoubtedly some happy and emotionally stable geniuses. For example, Einstein told C. P. Snow that “in his experience, the best creative work is never done when one is unhappy.”5 Emotional storm warnings aside, genius is simply an important subset of global cultural imperatives. Just ask the Tiger Mom sending junior to the “Mozarts and Einsteins” preschool. Just ask the teenager cramming for a perfect sixteen hundred (yup, they brought back the old scoring scheme in 2016) on her SATs. Just ask the job applicant who wants to ace her Google interview. Just ask the young assistant professor who receives an unsolicited $625,000 “genius” grant from the MacArthur Foundation for “extraordinary originality.” And, as at Enron, everyone wants to be “the smartest guy in the room,” Faustian bargain or no.

If you’re convinced that our contemporary perception of genius is an unchanging legacy from the time that humankind began to appreciate the life of the mind, you’d be mistaken. Our characterization of genius shifted from external to innate, likely sometime in the eighteenth century. Prior to that time, genius (from the Latin geno, meaning “beget”) in Roman mythology was a generative and protecting spirit or deity allotted to each man at birth. “The genius of a man, as his higher intellectual self, accompanies him from the cradle to the grave. In many ways he exercises a decisive influence on the man’s character and way of life.”6 It is interesting but entirely unfounded to speculate that the Arabic word jinn, denoting a supernatural being (later to become the genie of A Thousand and One Nights), may have shared common etymologic ground with the Latin genius when the Roman Empire encompassed Syria.

A snapshot of the changeover from the old version of genius to the new appears in Samuel Johnson’s A Dictionary of the English Language in 1755. Five different definitions are listed under the entry for genius, and the first epitomizes the ancient Roman conceit of “the protecting or ruling power of men, places, or things.” The new order of innate genius can be found in the second and third definitions: “A man endowed with superior faculties” and “mental power or faculties.”7 Possibly, the Enlightenment (from 1715 to 1789 for the Francophiles among us) set the stage for genius as an inborn human capacity, but certainly by the eighteenth century, the modern sense of genius was off to the races (and this is borne out statistically). According to a cursory Google Books Ngram Viewer search of more than five million books digitized as of 2008, the use of the term genius on the printed page increased by over 2,600 percent between 1700 and 1774.8
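
For readers who like the arithmetic spelled out, here is a minimal back-of-envelope sketch (in Python, with a made-up illustrative frequency rather than an actual Ngram value) of what an increase of over 2,600 percent amounts to: roughly a twenty-seven-fold rise in the word’s relative frequency.

```python
# Back-of-envelope sketch of what "an increase of over 2,600 percent" means.
# The 1700 frequency below is a made-up placeholder, not an actual Ngram value.
freq_1700 = 0.0005                 # hypothetical relative frequency of "genius" in 1700
percent_increase = 2600

# An increase of 2,600 percent means the later value equals the original
# plus 26 times the original, i.e., a 27-fold rise.
freq_1774 = freq_1700 * (1 + percent_increase / 100)

print(f"1700: {freq_1700:.6f}")
print(f"1774: {freq_1774:.6f} ({freq_1774 / freq_1700:.0f}x the 1700 level)")
```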

As further testimony to the shape-shifting idea of genius, it should be recalled that Isaac Newton (1642–1727) was a first-rank genius in every sense of the modern word, and he performed his greatest scientific work in the seventeenth century. Newton, with whom Einstein is pretty much in a dead heat for the title of Greatest Scientist bar none, died in 1727 and was interred in Westminster Abbey. The seventy-two-word Latin inscription placed on the monument in the nave in 1731 never mentioned “genius” in its brief hagiography. Alexander Pope’s epitaph was never chiseled into the monument but remains memorable: “Nature and nature’s laws lay hid in the night; God said ‘let Newton be’ and all was light.”9

The seventeenth-century idealization of genius of divine origin dated back nearly two millennia and was incisively expressed by what was left unwritten in the Westminster Abbey inscription. Pope obligingly upped the ante for divinely granted genius when he invoked the majesty of (Newton’s heretical non-Trinitarian vs. Pope’s Catholic) God rather than a minor tutelary deity of the ancient Romans.

It is safe to say that by the time Einstein came on the scene, the modern conception of innate genius held sway but lacked an operative definition. It took a while before Einstein was recognized for the genius he was. Going back to his miracle year in 1905, “it is pretty safe to say that, so long as physics lasts, no one will again hack out three major break-throughs in one year.… It took about four years for the top German physicists, such as Planck, Nernst and von Laue, to begin proclaiming that he was a genius.”10 Once Einstein got started, his ascent was steep, but it took a little while to garner the initial academic recognition. He didn’t make it to the academic big time until he was offered a full professorship in Prague in 1911, six years after he had established the existence of atoms, affirmed quantum theory, and created special relativity. C. P. Snow’s objections aside, although early-twentieth-century physics was a close-knit and fast-moving field, recognition of Einstein’s genius was not instantaneous. This takes us back to the problem of imprecision in a working definition of genius. Additionally, in Einstein’s case, the lack of a university affiliation in 1905 may have slowed down recognition by the old boys’ club of academic physicists. And the revolutionary nature of his discoveries may have been off-putting to the physics establishment. If the Nobel Prize can be regarded as a benchmark for genius, one of the supreme ironies of Einstein’s career is that he did not receive his 1921 (actually awarded in 1922) Nobel Prize in Physics for his greatest intellectual accomplishments—the theories of special and general relativity. The Nobel Prize Committee assigned the influential report on relativity to Alvar Gullstrand, an academic ophthalmologist who won the 1911 Nobel Prize in Physiology or Medicine for his work on accommodation (how the eye changes its focus from far to near and vice versa). “With little expertise in either the math or physics of relativity, he criticized Einstein’s theory in a sharp but unknowing manner.”11 This is evocative of Cambridge astronomer Arthur Eddington’s initial silence when told that people believed that only three scientists in the world (including Eddington) understood general relativity. Eddington then drily intoned, “I’m just wondering who the third might be.”12

Suffice it to say that Gullstrand was not one of those three, and “Einstein would not, as it turned out, ever win a Nobel Prize for his work on relativity and gravitation, nor for anything other than the photoelectric effect.”13

Over a century’s worth of hindsight assures us that Einstein was a genius of the first rank “but pressed to be more precise, we find it remarkably hard to define genius, especially among individuals of our own time.”14 Pertaining to genius, “we may very well know it when we see it,” but the following trenchant questions may put it into sharper focus.

Is there more than one kind of genius? In 1904, as Einstein was setting the stage for his imminent annus mirabilis, the English psychologist Charles Spearman “noted as the data from many different mental tests were accumulating, a curious result kept turning up: If the same group of people took two different mental tests, anyone who did well (or poorly) on one test tended to do similarly well (or poorly) on the other.”15 This statistically positive correlation led Spearman to propose “a unitary mental factor, which he named g, for ‘general intelligence.’ ” The hypothesis of g has weathered the buffets of scientific inquiry for nearly a century—“the g factor … accounts for about half of the variability in mental ability in the general population.”16
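
Spearman’s observation is, at bottom, a claim about a matrix of positive correlations. For the statistically curious, here is a minimal Python sketch, using simulated test scores rather than any real data, of how a single general factor can be summarized as the first principal component of a small battery of positively correlated tests; the loadings and noise level are arbitrary assumptions, chosen so that the leading component captures roughly half of the variance, echoing the figure quoted above.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 500 test-takers: one latent "general ability" plus test-specific noise.
# The loadings and noise level are arbitrary illustrative choices.
n_people, n_tests = 500, 4
g_latent = rng.normal(size=n_people)                     # hypothetical general factor
loadings = np.array([0.8, 0.7, 0.6, 0.5])                # assumed g-loadings per test
noise = rng.normal(size=(n_people, n_tests))
scores = g_latent[:, None] * loadings + 0.8 * noise      # simulated test battery

# Spearman's starting point: every pairwise correlation comes out positive.
corr = np.corrcoef(scores, rowvar=False)
print(np.round(corr, 2))

# A crude stand-in for g: the leading eigenvalue of the correlation matrix tells
# us how much of the variance a single common factor can account for.
eigvals = np.linalg.eigvalsh(corr)
print(f"Leading component explains {eigvals[-1] / eigvals.sum():.0%} of the variance")
```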

The pronouncement of Lord Rutherford, the proton’s discoverer, to the effect that “all science is either physics or stamp collecting” can be loosely applied to the foregoing classification of intelligence. In just about any scheme of classification (including philately), there are “lumpers and splitters,” and in contrast to g, which lumps intellect into a unitary mental entity, there is a dissenting categorization, according to Howard Gardner’s theory of multiple intelligences.17 Gardner “splits” human intelligence into ten (at last count) abilities:

  1. Spatial

  2. Linguistic

  3. Logical-mathematical

  4. Bodily-kinesthetic

  5. Musical

  6. Interpersonal

  7. Intrapersonal

  8. Naturalist

  9. Spiritual

10. Existential

11. Moral? (The jury is still out on this one.)

Gardner readily acknowledges that multiple intelligence theory “questions not the existence but the province and explanatory power of g … and centers on those intelligences and those intellectual processes that are not subsumed under g.”18 In Gardner’s view, “creators” are not one-trick ponies but “exhibit an amalgam of at least two intelligences.… Like most physicists, Einstein had outstanding logical-mathematical intelligence, but his spatial capacities were extraordinary even among physicists.”

As a neurologist I frequently work with a modular construct of brain anatomy. If a patient has impaired speech (expressive aphasia) or trouble recognizing faces (prosopagnosia), I review the MRI for signs of damage to Broca’s area in the (usually) left inferior frontal convolution and the fusiform gyrus of the ventral occipital lobe, respectively. In these cases there is diagnostic utility to embracing a splitter approach … but the modular scheme is not as successful when grappling with human capacities, such as creativity or judgment. We are a long way from sussing out stand-alone structural bases of multiple intelligences, and it may make “more sense now to speak of several brain areas involved in any complex intellectual activity.”19 To a neurologist in the early twenty-first century, any mental map such as Gardner’s (or for that matter, Freud’s venerable triad of ego, superego, and id) cannot be superimposed on our current knowledge of neuroanatomy.

Can genius be measured with IQ tests? Right off the bat, let me state that I don’t perform IQ (intelligence quotient) tests on my patients; most neurologists conduct abbreviated neuropsychological (as opposed to psychological) tests to screen for brain (as opposed to mind) disorders. I will administer a Mini-Mental State Exam, a Mini-Cog™, or the Montreal Cognitive Assessment to evaluate a patient for cognitive impairment, memory disorders, or disorders of language, such as aphasia. The measurement of IQ is not on my clinical “radar,” which is trained on neurologic disease and not on determining future academic or job performances (or, for that matter, eligibility for Mensa). The Wechsler Adult Intelligence Scale, version IV (WAIS IV), most commonly assesses these occupational and scholastic capacities. These tests have been around since 1939, and the latest iteration, WAIS IV (WAIS V is in the works), has ten core subtests and five supplemental subtests. It may be a benchmark (along with its closely correlated cousin, the SAT) for intelligence, but it has no subtest to measure creativity or wisdom.20 David Wechsler defined intelligence as “the aggregate or global capacity of the individual to act purposefully, to think rationally and to deal effectively with the environment.” Definitions and measurements are two different issues, and Wechsler observed that “the entity or quantity which we are able to measure by intelligence tests is not a simple quantity” and “that intelligence tests do not and cannot be expected to measure all of intelligence.”21

Shortcomings aside, intelligence tests could measure scholastic achievement in children, assess mental deficiency in adults, and classify drafted soldiers. However, interleaved with discussion of “defective” and “dull-normal” subjects, Wechsler’s remarks on the psychometric characterization of genius are cautiously limited to “levels of behavior which present certain patterns [that] … reach the other end of the scale where they are labelled very superior, precocious or genius.”22 The “I.Q. Range” assigned to “genius or near genius” (“140 and above”) was reproduced in a table from the educational psychologist Lewis Terman’s research thirty-three years earlier, in 1916.

While Wechsler was reticent to elaborate on the identification of genius by his intelligence test methodology, one of Terman’s graduate students, Catherine Cox, had no such qualms in 1926 when she compiled a ranking of 301 “young geniuses” who had lived between 1450 and 1850.23 Lists have a way of catching the public’s eye—for example, David Wallechinsky’s best-selling Book of Lists in 197724—and a list of geniuses, whether flawed or spot-on, is no exception. No one on Cox’s list was around to take the ur-intelligence tests, such as the Binet-Simon test (1905), which begat Terman’s Stanford-Binet test in 1916. She improvised her own methodology of historiographic IQ testing, which combined two intelligence ratings for each individual, the first based on the subject’s mastery of basic skills, such as speaking, reading, or writing, and the second on academic records and early professional careers. These were combined with personality characteristics25 and, for better or worse, Cox’s list was born. The Top Ten:26

  1. Goethe (IQ = 210)

  2. Leibnitz (IQ = 205)

  3. Grotius (IQ = 200)

  4. Wolsey (IQ = 200)

  5. Pascal (IQ = 195)

  6. Sarpi (IQ = 195)

  7. Newton (IQ = 190)

  8. Laplace (IQ = 190)

  9. Voltaire (IQ = 190)

10. Schelling (IQ = 190)

Everyone (geniuses not excepted) likes to make it to the medal round. Einstein is not on the list (even eleven years after general relativity) likely because of Cox’s 1850 cutoff date. However, scientists and mathematicians are inordinately well-represented in the top ten on her list of Worthies, and number seven, Isaac Newton, is pretty much the all-time poster boy for scientific genius. Only three (Grotius, Cardinal Wolsey, and Voltaire) had nothing to do with math, science, or natural philosophy. Lists, by virtue of their selectivity, can be capricious, and this is affirmed by Cox’s series that omitted Pasteur, Maxwell, Shelley, and Tolstoy, to name a few. Taking Cox’s retrospective IQ estimates at face value, many scientific geniuses had “stratospheric” IQs, but is a high score on an IQ test alone sufficient to designate a genius? Wechsler waggishly recounted the tale of “a prominent psychologist” who “answered an inquiry as to what he meant by intelligence by saying that it is what intelligence tests measure.”27 This “intelligence = intelligence test results” tautology makes no mention of the defining spark of true genius, and, for my money, that spark is creativity.

Cox’s study on genius employed the methodology of historiometry and recognized psychological indices. Her subjects were dead, and in academic medical research, we would classify this as a retrospective or chart review study based on the past records of living or dead patients. Another fundamental approach is termed prospective study, which was precisely the undertaking of her PhD supervisor, Lewis Terman. Beginning in 1921 Terman collected data on 1,528 “gifted” children in California. The so-called “Termites” had average IQ scores above 150, and Terman’s successors collected follow-up data on some of them for sixty-five years after the inception of the Genetic Studies of Genius.28 With the exception of the physicist J. Robert Oppenheimer, as “the cohort matured, its members did not produce a significant number of creative individuals.”29 Conversely, two future Californian Nobel laureates in physics, Luis W. Alvarez (for particle physics) and William Shockley (for discovery of the transistor effect), did not qualify to become “Termites.” Few, if any, longitudinal prospective studies have ever exceeded Terman’s magnum opus, which established that “genius (in the sense of creativity) was not the same as a high level of intelligence” (as measured by IQ tests). In the words of the neuroscientist and neuropsychiatrist Nancy Andreasen, “Intelligence is somewhat related to creativity but it is also different.”30

As if the lack of correlation between IQ testing and creativity did not pose enough of an obstacle to ferreting out creative people, in the 1980s James R. Flynn, a professor of political studies at the University of Otago, found that intelligence as measured by IQ testing was a moving target! It was well known that some IQ test scores change with age, but Flynn’s rigorous analysis of standardized IQ tests, such as the Wechsler and the Stanford-Binet, showed that “the ‘average’ person of the later generation was scoring way above the ‘average’ person of the earlier generation.… Flynn makes the telling point when he asks us to reflect on the fact that being born a generation or so apart can make a difference of fifteen IQ points. We have no good account for this change; it is officially mysterious.”31
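
To make that fifteen-point generational gap concrete, here is a small illustrative Python sketch; the raw scores and norming parameters are hypothetical, but the mechanics are the point: an IQ is always a raw score judged against the norms of a particular reference generation, and the Flynn effect is the drift of those norms.

```python
# Illustrative sketch of the Flynn effect. All numbers are hypothetical.
def iq(raw_score, norm_mean, norm_sd):
    """Convert a raw test score to an IQ scaled to mean 100, SD 15."""
    return 100 + 15 * (raw_score - norm_mean) / norm_sd

raw = 52                   # one person's raw score on a fixed set of items
old_norms = (50, 10)       # mean and SD of an earlier generation's norming sample
new_norms = (60, 10)       # a later generation does better on the same items

print(f"Against the older norms: IQ = {iq(raw, *old_norms):.0f}")   # 103
print(f"Against the newer norms: IQ = {iq(raw, *new_norms):.0f}")   # 88
# The same raw performance loses about fifteen points when judged against the
# later generation's norms -- the generational gap Flynn documented.
```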

Given that “beyond an IQ of about 120 … measured intelligence is a negligible factor in creativity,” are there any better tests to identify creative genius? “Divergent or lateral thinking” tests of creativity have been around for over sixty years. They differ from standardized IQ tests that measure logical or convergent thinking, but “there is no correlation between high scorers on divergent thinking tests and their creativity in real life.”32 However, one recently touted bellwether metric of creativity is the conjoint application of the SATs and tests of spatial ability from the DATs (Differential Aptitude Tests). The longitudinal Study of Mathematically Precocious Youth, now forty-five years old and still going strong, found “a correlation between the number of patents and peer-refereed publications that people had produced and their earlier scores on SATs and spatial-ability tests.”33

Is genius hereditary? To modern sensibilities raised on the nature versus nurture debate, this is a most dangerous question, bristling with implications of DNA/racial/gender elitism. In 1869, as far as Francis Galton, Charles Darwin’s cousin and a reader before he was three years old, was concerned, the answer could not have been more obvious. The title of his book, Hereditary Genius: An Inquiry into Its Laws and Consequences, leaves little doubt as to his conclusion. The first sentence on the first page declares, “I propose to show in this book that a man’s natural abilities are derived by inheritance.”34

Even this “eminent” Victorian had qualms as to the precise nature of the subject of his book’s title, decrying “the uncertainty that still clings to the meaning of the word genius in its technical sense.” (Author’s note: my sentiments exactly … 125 years later.) In the prefatory chapter of his 1892 reprint of Hereditary Genius, Galton wondered whether Hereditary Ability might have served as a better title. In 1869 he used the word genius to describe “an ability that was exceptionally high, and at the same time inborn.” If genius was a mental power sui generis or qualitatively different from high ability, Galton did not inform us, but he did set out to measure it.35

And his particular metric of intellect was that “high reputation is a pretty accurate test of high ability.” He sought to elucidate the “laws of heredity in respect to genius” by surveying familial relationships of what was in essence a Victorian slant on Alexander Pope’s hierarchical “Vast chain of being,”36 beginning with judges of England (1660–1868), statesmen at the time of George III, and premiers (prime ministers) of the last one hundred years. These were followed “in order” by illustrious commanders, men of literature and science, poets, painters, divines, modern scholars, and, in the realm of “physical gifts,” oarsmen and wrestlers.37

When Galton “crunched the numbers” of “no less than three hundred families containing between them nearly one thousand eminent men, of whom 415 are illustrious,” he was convinced that if there was “such a thing as a decided law of distribution of genius in families, it is sure to become manifest when we deal statistically with so large a body of examples.” The upshot was that “exactly one-half of the illustrious men have one or more eminent relations.” The chances of rising to eminence for a kinsman of an illustrious man varied from one out of four (son), to one out of seven (brother), to one out of one hundred (first cousin).38

Once he had established that heredity was the dominant factor for human ability and genius, he could resolutely declare all hope of improving the natural gifts of future generations to be “under our control.” The admonition that “we may not be able to originate, but we can guide”39 opened a Pandora’s box for the “benefits” of selective human breeding for “desirable” traits or, as it came to be known—eugenics—a term coined by Galton in 1883. Far from being repelled, his Victorian audience embraced his message, and the first edition (1869) of Hereditary Genius sold out, becoming “unpurchasable except at second-hand and at fancy prices.”

Galton’s vision of the dominance of intellectual pedigrees and cognitive bloodlines has not withstood the passage of nigh on 150 years. Ironically, he memorably delineated a major, if not the major, battleground of cognitive psychology when he coined the phrase/meme/war cry of “nature and nurture” which “separates under two distinct heads the innumerable elements of which personality is composed.”40 Galton’s view of heredity as the prime mover of intellect has been largely supplanted by the twentieth-century belief that nurture assumes the commanding role in the formation of intellect. Society’s commitment to this “can do” philosophy, centering on the most potent form of nurture, education, can be seen in the proliferation of Head Start programs or in the online curricula conferring doctoral degrees.

Was Galton (wielding an IQ of 200, as estimated by Lewis Terman)41 dead wrong and the modern educational credo that bright kids are “made, not born” 100 percent right? It’s a leading question that will not be answered in a book about the neuroscience of Einstein’s brain, but psychological studies of identical twins may shed some light on the topic of the wellsprings of intelligence. The Minnesota Study of Twins Reared Apart (MISTRA) tested fifty-six sets of identical (monozygotic—i.e., with 100 percent of their genes in common) twins and found that “70 percent of the observed variation in IQ in this population can be attributed to genetic variation.”42 “Identical twin pairs who spent their lives apart end up just about as similar in intelligence as those who spent their lives together.”43 Taken at face value, psychological studies of the modern era suggest that in the arena of IQ, nature dominates nurture. Although MISTRA seemingly did not get the memo that nurture inscribing a cognitive blank slate was the exclusive mechanism of assembling intellect, the investigators left a little wiggle room for educators and concerned parents when they allowed that their “findings do not imply that traits like IQ cannot be enhanced.”
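
The logic behind that 70 percent figure is worth a sketch. In the twins-reared-apart design, the IQ correlation between monozygotic co-twins is itself read as an estimate of heritability, since the pairs share all of their genes but (in principle) none of their rearing environment. The following Python sketch uses simulated pairs, not MISTRA data, with variances chosen so the expected correlation is about 0.7.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated IQs for 56 pairs of monozygotic twins reared apart (not MISTRA data).
# A shared "genetic" component plus independent "environmental" noise per twin;
# the variances are chosen so the expected twin-twin correlation is about 0.7.
n_pairs = 56
genetic = rng.normal(100, 12, n_pairs)          # component shared within each pair
twin_a = genetic + rng.normal(0, 8, n_pairs)    # independent environment, twin A
twin_b = genetic + rng.normal(0, 8, n_pairs)    # independent environment, twin B

# In this design the correlation itself serves as the heritability estimate.
r = np.corrcoef(twin_a, twin_b)[0, 1]
print(f"Estimated heritability (r of MZ twins reared apart): {r:.2f}")
```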

Before we can wholly consign our intellectual destiny to the genome, it bears mentioning that if mental ability genes exist, we “have no idea what those genes are.… The best guess among researchers is that mental abilities are influenced by an unquantifiable number of genes, each of which will have a small effect.”44 And what’s more, the sizes of the genome and the brain don’t match up. As of 2014 it is generally accepted that the three billion DNA base-pairs of the human genome contain nineteen thousand to twenty thousand protein-encoding genes. (Caveat: don’t focus on numbers. The one-millimeter roundworm Caenorhabditis elegans has 20,470 genes, albeit with a “no-frills” one hundred million base-pair count.) In stark contrast to the magnitude of our genome, the human brain has eighty-five billion to eighty-six billion neurons, with one thousand to ten thousand synaptic connections per neuron. The sheer quantity of possible arrangements of neuronal connectivity is staggering! Clinical neurologists are definitely not savants of cerebral architectonics (read: “wiring diagrams” of the brain), but if we insist (and I don’t) that nature/heredity/genome is the only game in town vis-à-vis neural organization, how can the genome possibly contain enough instructions to hardwire the currently unfathomable complexity of the brain? The DNA “true believers” will insist that protein-encoding genes are not the whole story behind “building” a brain. There may well be operating instructions for regulating gene expression in the other 98.5 percent of the genome that doesn’t assemble our amino acids into building-block proteins. Or, epigenetic mechanisms in which the environment alters gene expression without changing the DNA sequence may be in play.
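
The mismatch the paragraph gestures at can be put in rough numbers. Here is a minimal sketch, using only the rounded figures quoted above and treating the genome naively as two bits per base pair:

```python
# Back-of-envelope arithmetic for "the genome cannot hardwire every connection."
# Uses only the rounded figures quoted in the text; the two-bits-per-base figure
# deliberately ignores gene regulation, epigenetics, and noncoding function.
base_pairs = 3e9
genome_bits = base_pairs * 2                    # A/C/G/T -> 2 bits per base, naively

neurons = 86e9
synapses_low = neurons * 1e3
synapses_high = neurons * 1e4

print(f"Genome, naively:   ~{genome_bits:.0e} bits")
print(f"Synapses, roughly: ~{synapses_low:.0e} to {synapses_high:.0e}")
print(f"Ratio: {synapses_low / genome_bits:.0e} to {synapses_high / genome_bits:.0e} synapses per bit")
# Even at one bit per synapse, the wiring outstrips the genome's raw capacity by
# four to five orders of magnitude -- hence the appeal to regulatory DNA,
# epigenetics, and experience-driven (nurture) wiring in the text above.
```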

My somewhat informed hunch is that our genes, although incorporating basic organizing principles of the mammalian brain, fall short of containing a completely specific blueprint of central nervous system (CNS) architecture. And here nurture comes to the rescue. When Francis Galton wrote in 1874 that “nature is all that a man brings with himself into the world,” the two most basic elements of the nature of brain microanatomy, Cajal’s neurons and Sherrington’s synapses, would not be brought to light until 1906 and 1897, respectively. As knowledge of neurons, axons, dendrites, and synapses accrued, they were seen as constituent parts of the early-twentieth-century hardwired model of the brain that was likened to a telephone switchboard. Donald Hebb’s proposal of a dynamic synapse in 1949 called into question this static conception of brain wiring, and Hebb’s theory of synaptic plasticity as a basis for learning was confirmed in 2000 by Eric Kandel’s Nobel Prize–winning demonstration of the growth of new connections between motor and sensory neurons with the induction of long-term memory in the marine snail Aplysia.45 The neural paradigm changed for good with the recognition of neuroplasticity, which showed the brain’s wiring to be a dynamic process at the beck and call of nurturing influences, such as what we see and hear or think or learn in school.

Nature versus nurture as the origin of genius (or intellect for that matter) remains a question that is as contentious as it is complex. In-depth knowledge of the biological bases—the human genome and neuroplasticity—of Galton’s dyadic proposal have been on the scene for less than twenty years. The implications of this newfound biology are not confined to sequencing base-pairs of intertwined strands of DNA or tracing axonal connections. Extrapolating from Kandel’s synthesis of the data from his humble sea snail, the human genome and neuroplasticity may address the essence of human thought if we presume to equate the inborn genetic instructions for neural circuitry with Kant’s inborn a priori knowledge and neuroplasticity with Hume’s empiricism, in which knowledge is not innate but must be learned from sensory experience.46 Even Einstein’s penetrating introspection can’t help us choose between these two schools of philosophy of the mind. He rejected the tenets of both Kant’s and Hume’s philosophies47 (although Hume’s empiricism was a closer fit for Einstein’s approach to physics).

Did our examination of the “lost” photographs of Einstein’s brain land us firmly on the side of nature or nurture? Neither. We straddled the line and wrote that “the enlarged [motor cortex] ‘knob’ that represents the left hand … is probably associated with his early and extensive training on the violin. This particular feature of Einstein’s gross anatomy was likely the result of both nature and nurture”48 (italics added).

Does genius have a structural basis? Leaving aside the foregoing debate as to whether the unimaginable complexity of human neuroanatomy arises from the genomic instruction manual, sensory-driven neuroplasticity, or both, we come face-to-face with the question of the relationship of intelligence and brain structure. This particular question can be sidestepped if you’re a dualist who regards the mind and the brain as separate entities (and never the twain shall meet). As a neurologist dealing with structural brain disorders on a daily basis, I don’t know how to apply the principles of neuroscience to the “problem” of finding a basis for noncorporeal “mind-stuff,” spirit, or soul, which likely reside outside the boundaries of paleontologist Stephen Jay Gould’s “magisterium of science.”49

The concluding thrust of our research was that Einstein’s exceptional brain was the basis of his genius. This may well be a scientific truth that is forever unprovable, or in the words of the philosopher Colin McGinn, “The bond between the mind and the brain is a deep mystery. Moreover it is an ultimate mystery, a mystery that human intelligence will never unravel.”50 McGinn’s “depressingly negative conclusion” is straight out of the playbook of mysterianism, in which our species’ intellectual shortcomings “filter out what is crucial [to understanding] the brain’s real nature.”51

While McGinn has said dismissively that we’re not smart enough to realize that “there has to be some aspect of the brain that we are blind to, and deeply so,” the mathematician Brian Burrell has piled on and observed that no Einstein brain study “has revealed a credible anatomical basis for the man’s aptitude.” Not content with disparaging our peer-reviewed (scientific-ese for the Good Housekeeping seal of approval) anatomical findings, Burrell editorialized that the so-called failure of Einstein brain studies is a good thing because “discovery of substrates of talent—or lack thereof—in the brain would have troubling practical and ethical implications.”52

If you’re keeping track of the critical “slings and arrows” aimed at the philosophical underpinnings of our study, the list includes: (1) The brain is not the place to look (dualism), (2) If it is the place to look, we’re too dumb to recognize the physiological/anatomical “signature” of genius (mysterianism), and (3) It’s unethical.

Wow. That’s a discouragement “hat trick” for what seemed like a pretty good curiosity-driven piece of brain science that produced some fascinating and verifiable findings. Undeterred, Dean Falk took a direct and effective idea and ran with it when she drew a parallel between paleoanthropology and her analysis of Thomas Harvey’s photographs of Einstein’s cerebral hemispheres: “In order to glean information about hominin (or other) brains that no longer exist, details of external neuroanatomy that are reproduced on endocranial casts (endocasts) from fossilized braincases may be described and interpreted.… Albert Einstein’s brain no longer exists in an intact state, but there are photographs of it in various views. Applying techniques developed from paleoanthropology, previously unrecognized details of external neuroanatomy are identified on these photographs.”53

Putting the straightforward, transparent, and honest premises of our study aside, what is the evidence for a legitimate link between intelligence and brain structure? Nancy Andreasen used MRI scans to compare the brain size of sixty-seven healthy, normal volunteers with their IQ as measured by the WAIS. Her conclusion was, “The larger the brain, the higher the IQ.”54 The general term brain size subsumes disparate elements, including neurons, three types of glial cells, axons, myelin sheaths, dendrites, synapses, blood and blood vessels, meninges, and cerebrospinal fluid (CSF). Andreasen measured volumes of gray matter, white matter, and CSF separately and found a significant positive correlation between gray matter volume and verbal, performance, and full-scale IQ. This was an important study because it measured the size of the functioning brains in living people, as opposed to autopsied brains or indirect measures, such as head circumference or hat size. Writing well after the advent of sophisticated neuroimaging, Stephen Jay Gould impugned craniometry—the measurement of the skull and its contents—as a particularly egregious “mismeasure of man” in his book of the same title.55 A decade later Andreasen’s data flew in the face of Gould’s despairing observation that “the supposed intellectual advantage of bigger heads refuses to disappear entirely as an argument for assessing human worth.” As these two scientific hypotheses cudgel each other, I will again interject that Einstein’s brain weighed in at 1,230 grams, a little less than expected for a seventy-six-year-old man. The brain-size wars will undoubtedly continue. Read on.

Does evolution have anything to teach us about brain size and intelligence? “The 1.35 kg brain of Homo sapiens, supposedly the smartest creature on earth, is significantly exceeded by the brains of elephants and some cetaceans. Thus, a larger brain alone does not necessarily assure greater intelligence.”56 Bigger bodies require bigger brains. It stands to reason that a male African elephant weighing more than five tons and measuring ten feet tall at the shoulder needs a bigger brain to house the additional sensorimotor neurons and connections required to innervate yards of elephant hide, massive limb joints, a trunk, and thousands of pounds of muscle. Leaving aside the distinguishing (and mostly unknown) features of the intelligences of whales, pachyderms, and primates, it may be argued that absolute brain size provides little if any insight and that relative brain weight (the ratio of brain to body mass) may be a more useful index. Humans score highly on this index, with a relatively large brain at 2 percent of body mass, but we can’t hold a candle to shrews, which have brains weighing in at 10 percent of body mass. (Some shrews have the rare ability to echolocate, but if they have great intelligence they are holding their cards pretty close to their chests.) When neurobiology doesn’t deliver on our aspiration to be the smartest guys in the room (I mean, the animal kingdom), statistics come to the rescue with the encephalization quotient (EQ), a ratio that compares the departure of the brain size of a particular mammalian species, such as Homo sapiens, from the size of a “standard” mammal—the cat. “Humans have the highest EQ of 7.4–7.8, indicating that the human brain is 7–8 times larger than expected.”57 Despite this self-affirming finding, contradictory results with capuchin monkeys and chimpanzees led German neuroscientists Roth and Dicke to caution that “EQ is … not the optimal predictor of intelligence.” Or, we may be at the threshold of an alternative anatomical index for intellect if we count neurons rather than weigh whole brains. Primate (and, by extension, human) brains may have a greater density of neurons than whale or elephant brains: “Our superior cognitive abilities might be accounted for simply by the total number of neurons in our brain, which, based on the similar scaling of neuronal densities in rodents, elephants, and cetaceans, we predict to be the largest of any animal on Earth.”58
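
For the quantitatively inclined, here is a minimal sketch of how an encephalization quotient is computed, using one common formulation (Jerison’s power law, in which the expected brain mass is 0.12 times body mass raised to the two-thirds power, with both masses in grams). The constants differ from author to author, and the body and brain masses below are rough round numbers, so treat the outputs as illustrative rather than definitive.

```python
# Sketch of the encephalization quotient (EQ) using one common formulation
# (Jerison's): expected brain mass = 0.12 * body_mass ** (2/3), masses in grams.
# Constants differ across authors and the masses below are rough round numbers,
# so the exact outputs are illustrative only.
def eq(brain_g, body_g):
    expected_brain_g = 0.12 * body_g ** (2 / 3)
    return brain_g / expected_brain_g

print(f"Human    EQ ~ {eq(1_350, 60_000):.1f}")      # around 7, in line with the quoted range
print(f"Elephant EQ ~ {eq(4_800, 5_000_000):.1f}")   # huge brain, but a far bigger body
print(f"Cat      EQ ~ {eq(30, 3_500):.1f}")          # close to 1, the 'standard' mammal
```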

If the lessons of evolutionary brain development fail to validate the premise that (brain) “size does matter,” uncertainty only increases when we consider the cases for the Great Leap Forward and a hydrocephalic genius.

Around forty thousand years ago, Western Europe was populated by Neanderthals lacking art and much in the way of cultural innovation. “Then there came an abrupt change, as anatomically modern people appeared in Europe, bringing with them art, musical instruments, lamps, trade, and progress. Within a short time, the Neanderthals were gone.” Physiologist/evolutionary biologist/biogeographer Jared Diamond declared this Upper Paleolithic revolution “The Great Leap Forward.”59

The relentless pressure of natural selection consigned the less fit Neanderthals to oblivion as the Cro-Magnons superseded them. These anatomically modern humans arose in Africa one hundred thousand years ago but initially had no cultural or technological advantage over the Neanderthals. However, during the last sixty thousand years, Cro-Magnon technology and culture “leapt forward,” and these early modern humans migrated into Europe, sweeping away the doomed Neanderthal world. For the neurologic determinist, the “secret” to Cro-Magnon success would be bigger and better brains. “One might therefore expect the fossil record to show a close parallel between increased brain size and sophistication of tools. In fact, the parallel is not at all close. This proves to be the greatest surprise and puzzle of human evolution.”60 The neurologic determinist position is further eroded by the finding that as recently as forty thousand years ago, Neanderthals had brains even larger than those of modern humans.

If “anatomically modern Homo sapiens populations living in southern Africa 100,000 years ago were still just glorified chimpanzees as judged by the debris in their cave sites,” what caused the Great Leap Forward sixty thousand years later?61 Diamond’s guess is “the perfection of our modern capacity for language.”62 Please indulge this inveterate neurologist when I protest that language does not appear “out of the blue”; there must be neural underpinnings, such as asymmetry of the planum temporale, Broca’s area in the frontal lobe, Wernicke’s area in the temporal lobe, and the arcuate fasciculus, connecting the “eloquent” cortices of Broca and Wernicke. Although changes in laryngeal anatomy, allowing growls to become protowords and then human elocution, have been proposed as the origin of language, language is more than what comes out of your mouth. Language originates in the brain and is not restricted to oral expression. People who are unable to speak can express language by writing or signing.

My best surmise is that the Great Leap Forward, whether associated (or not) with the emergence of language, was neurally “engineered.” Paleoanthropology, by necessity, cannot directly examine hominin brains forty thousand years old; the soft tissue neural constituents have long since decomposed, and only the skulls or skull fragments remain. From endocasts of braincases, the size of the brain and some cortical landmarks, such as the lunate sulcus, can be approximated. When revisiting the Great Leap Forward, paleoanthropology can only reckon if brains were bigger (they weren’t) but can’t determine if they were better or, to be more precise, intrinsically different. The most painstaking inspection of the mineralized matrix of a Cro-Magnon skull that has weathered forty millennia would be blind to profound changes in cortical neuronal/glial cell count, synaptic density, myelination, and axonal pathways. I do not mean to downplay the profound role of culture and social interaction in advancing humankind by leaps and bounds. I strongly doubt that the brains of people during the Renaissance or the Enlightenment were anatomically different from those of a generation or more earlier, but the magnitude and sweeping changes of the Great Leap Forward—finely wrought stone tools, body adornments, art, burial rituals, language—more than countenance the necessity of neurologic reorganization, which remains, for the most part, invisible to present neuroscientific investigations. Lacking an Upper Paleolithic brain to study, my hypothesis remains unproven, and the field remains open to differing interpretations, such as that of cognitive archaeologist David Lewis-Williams, who cautions us to “not seek a neuronal event as the triggering mechanism for the west European ‘Creative Explosion,’ ”63 his term for the cultural upheaval forty thousand to fifty thousand years ago. He points us to “social circumstances” and the “role of art in social conflict, stress and discrimination.”

About thirty-six thousand years after an Upper Paleolithic “Michelangelo” began painting polychrome bison on the walls of Altamira Cave on the northern Spanish coast, Professor John Lorber of Sheffield University described a young university student who gained a first-class honors degree in mathematics. We will likely never know the neuroanatomy of the early (presumably non-Neanderthal) humans adorning the Western European caves, such as Chauvet, Lascaux, and Altamira, “where ‘art’ began,” but we are acquainted in detail with the paradoxical brain of the gifted math student.

Dr. Lorber’s patient had hydrocephalus (or more colloquially, water on the brain). Except it really isn’t water—it’s cerebrospinal fluid (CSF), a complex biological substance secreted by the choroid plexuses inside the brain’s hollow cavities (ventricles). If the free flow of CSF is obstructed, it backs up in some or all of the four ventricles present in every anatomically normal human’s brain. As the CSF accumulates, the intracranial pressure exceeds the normal upper limit of 150 millimeters of water, which can be measured with a manometer when I perform a spinal tap (a bad term—let’s call it a lumbar puncture … I studiously avoid the spinal cord during the bedside procedure) on a patient. The increased CSF pressure expands the ventricles like balloons and compresses the brain tissue. In the case of the math student, Lorber saw “that instead of the normal 4.5-centimeter thickness of brain tissue between the ventricles and the cortical surface, there was just a thin layer of mantle measuring a millimeter or so. His cranium is filled mainly with cerebrospinal fluid.”64 Despite the profoundly altered cortical anatomy of his hydrocephalic brain, Lorber’s patient had an IQ of 126! Needless to say, it is difficult to accept the premise that the brain generates mind or that intelligence is exclusively brain-based after encountering a severely hydrocephalic and intellectually gifted individual as just described. Before you enroll in Lorber’s school of cortical nihilism, consider an ill-fated embryologic experiment of nature known as anencephaly. Anencephalic humans have eyes, a brain stem, a cerebellum, and a spinal cord but no cerebral hemispheres. During the few days to weeks of survival, they exhibit stereotyped, reflexive movements, including decerebrate posturing (the rigid extension of the arms and the legs).65 They really have no brain above the level of the high brain stem (mesencephalon), and in contrast to Lorber’s hydrocephalic mathematician, the absent hemispheres of humans with anencephaly profoundly reduce their mental life and behavioral repertoire to a short-lived assemblage of automatic and involuntary fragments of movement.

I will not presume to disallow dualism (in which the mind is unharnessed from the brain) as a plausible interpretation for Lorber’s regrettably unpublished case study. It could be argued that if (substance) dualism, which distinguishes mind from matter, was good enough for René Descartes (1596–1650), why not for us? As an almost obligate materialist by way of my occupation, I will play the devil’s advocate on this highly charged mind/spirit point and offer some opposing contentions from the neurologist’s conceptual tool kit. Possibly, cerebral cognitive “reserves” and intact noncortical structures, including the thalami, are sufficient to compensate for the structural alterations of hydrocephalus. Alternatively, the compressive thinning of the neurons and the connecting white matter of the cortical mantle may not be a mortal wound on the cellular level, and these neural elements may continue to function.

Needless to say, not every brain disease targets intelligence. For example, amyotrophic lateral sclerosis (ALS) does cause the death of motor neurons in the spinal cord, brain stem, and frontal lobes. In most cases the pyramidal cells (so-called due to their triangular appearance under the microscope lens) of the precentral gyri of the frontal lobes atrophy and die. About 10 percent of ALS patients develop dementia. However, a notable exception to the features of ALS that may jeopardize intelligence is Stephen Hawking, arguably the world’s most renowned theoretical physicist and cosmologist. Hawking has had an unusual form of ALS, with protracted survival, for well over five decades. I can only speculate that his particular variant of ALS has been confined to the motor neurons, sparing the “intellectual” portions of his frontal lobes involved with abstraction, association, and executive functions.

The selective vulnerability and the selective resilience of particular neuronal subsystems are opposite sides of the same coin for many neurologic diseases. In Parkinson’s disease (PD), a subset (called extrapyramidal) of the motor pathways is damaged, and the sensory pathways are spared. In PD selected neurons are marked with a distinctive neuropathologic “signature” of damage (Lewy bodies), but we lack a basic understanding as to why one (motor) portion of the CNS dies, and another (sensory) portion thrives. Or how intellect can be preserved in the face of brain disease not because it exists in a shadowy “sanctuary” of nonphysical mind stuff but rather because the disease’s biology exempts cognitive cerebral networks.

Any ringing endorsement of dualism and the nonphysical mind should be tempered by what every clinical neurologist learns over time. To wit, the human brain and its myriad functions are awe-inspiring in their complexity, and consequently, we are unable to all-inclusively correlate CNS structure and functions. The absence of such correlations for particular functions may be more illustrative of our neuroscientific ignorance and should not serve as compelling proof of the mind loosed from the body. To a dyed-in-the-wool cerebral materialist, it is tempting to downplay our lack of knowledge and contemplate a seemingly reasonable proposition such as:

More brain cells = More mind

It’s a very rough assertion of parity in an unproven biological structure-function relationship. It may be dressed up as an equation, and we can certainly count neurons, but bear in mind that a neuronal census raises other questions. If we posit the total number of cortical neurons of each cerebral hemisphere to be the seat of intellect, it is perplexing to note that for each cortical neuron we have four cerebellar neurons66 (and although there is some evidence of cerebellar cognition, it does not factor in as a major player for human intellect).67 Moreover, we are at a loss when it comes to “counting” nonexistent units of the mind. Does “more” mind mean “higher” intelligence, “deeper” insight, “broader” perspective, or something else entirely? Assuming we can assign valid measurements to something as abstract as the mind, we’re still left with the discordant observation that plus-size whale and elephant brains do not house an overpowering intelligence that we can recognize at present. Putting aside the formidable methodological questions raised by interspecies IQ comparisons, consider a well-known aspect of brain development that defies my materialist catchphrase: “You can’t be too rich, too thin, or have too many neurons.” At least as far as it applies to neurons, apparently you can, and early on we do have too many nerve cells. “Remarkably, almost half the neurons generated in the mammalian nervous system are lost through a process known as programmed cell death.” This is in effect a suicide program known as apoptosis, which, when activated, shrinks the neuron, condenses the chromatin (DNA + RNA + protein), and fragments the cell body, which is gobbled up (phagocytosed) by scavenger cells such as microglia.68 Not only do neurons die in the developing nervous system, but the connections (synapses) are also pruned back in a separate process independent of neuronal apoptosis. Why evolution has, teleologically speaking, employed a “less is more” (as applied to neurons and synapses) developmental stratagem for growing brains is a profound mystery even (or especially) for non-Cartesian neurologists.

Regardless of their functionality, bigger brains pose another problem that is purely mechanical. “Whatever its cause the ongoing increase in hominin brain size caused parturition to become increasingly difficult.”69 Even with anatomical work-arounds, such as open skull fontanelles (“soft-spots”) and head molding, human head circumference at birth may be “topped out” at thirty-five centimeters in order to traverse a bony birth canal selected millions of years earlier to accommodate both the smaller hominin cranium of the baby and the upright bipedal locomotion of the mom. Bigger brains require bigger heads, and therein lies the “obstetrical dilemma” that may impose a biological limit on the size of neonatal brains. On a population-wide basis, the increasing global rate of cesarean sections, from 6.7 percent in 1990 to 19.1 percent in 2014, may validate this limit.70 Many medical causes other than cephalopelvic disproportion, such as eclampsia, serve as indications for cesarean section, and obstetricians may very reasonably object to my focus on the rising rate of cesarean sections as further proof of the obstetrical dilemma and its evolutionary implications. Nevertheless, the increase in c-sections is a thought-provoking data set, even if a clinical study encompassing twenty-four years71 is remarkably brief compared with the sweep of human evolution. If bigger brains confer an evolutionary advantage (leaving aside the relationship, if any, of bigger brains to intellect), possible biological solutions would include the enlargement of the pelvic outlet and an increase in postnatal brain growth. There is already significant brain (and head) growth after we are born. The pediatric neurologist’s rule of thumb for head circumference is “35-45-55” (thirty-five centimeters at birth, forty-five centimeters at one year old, and fifty-five centimeters as an adult). Even with the developmental thinning of synapses and neurons, the brain’s bulk increases over time with more neurons, glia, and myelin. A third solution would be the “neurological reorganization”72 of the connectome to permit enhanced or repurposed brain functions without adding to brain bulk.

One last obstacle blocking the evolution of bigger, better-wired, and more intelligent brains is the energy cost. This is not intuitively obvious. After a vigorous cardiovascular workout, we heat up, perspire, flush red as our capillaries dilate, feel the “burn” as lactic acid builds up, and breathe faster as our heart and respiration rates increase. Nothing so viscerally obvious occurs during intense intellectual exercise (so don’t expect to see a ten-gallon Gatorade cooler at your next chess tournament). Nevertheless, your eighty-five billion to eighty-six billion neurons and eighty-five billion glial cells consume a whopping 25 percent of your body’s total energy output. The formidable energy consumption of the average human brain is currently estimated to be 516 kilocalories daily (about what your body burns up during a four-mile run), leading Brazilian neuroscientist Suzana Herculano-Houzel to conclude that the “metabolic cost is a more limiting factor to brain expansion than previously suspected.”73
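
The arithmetic behind those figures is simple enough to sketch; the daily energy budget below is an assumed ballpark value for an average adult, and the per-neuron number is only an order-of-magnitude estimate.

```python
# Rough arithmetic behind the brain's energy bill. The 2,000 kcal daily budget
# is an assumed ballpark figure for an average adult, not a measured value.
brain_kcal_per_day = 516
body_kcal_per_day = 2_000
neurons = 86e9

print(f"Brain's share of the daily budget: {brain_kcal_per_day / body_kcal_per_day:.0%}")
print(f"Energy per neuron: ~{brain_kcal_per_day / neurons:.1e} kcal per day")
# About a quarter of the body's energy for roughly 2 percent of its mass -- the
# metabolic constraint Herculano-Houzel argues limits further brain expansion.
```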

At the end of the day (or at least the end of this chapter), are we any closer to assigning the mind’s intelligence (or its high-end relative, genius) to a particular portion or the entirety of the brain? The search continues, and I will briefly point out three different lines of inquiry—philosophy, psychologic testing, and neuroscience.

We do not have a localization for “the ghost in the machine,” which was Gilbert Ryle’s term for the concept of mind put forth by Descartes. The ghost was demoted to a category mistake of logic, and Ryle assured us that “the hallowed contrast between Mind and Matter will be dissipated.”74

Stephen Jay Gould, an articulate opponent of biological determinism and the belief that worth can be assigned to an individual by “measuring intelligence as a single quantity,” warned that Charles Spearman’s g (general intelligence) fell prey to “reification, or our tendency to convert abstract concepts into entities.” Gould continued, “Once intelligence becomes an entity, standard procedures of science virtually dictate that a location and physical substrate be sought for it.”75 Twenty years later Ian Deary reminded us that the analysis of the WAIS subtests for g “classifies the tests’ statistical associations: it does not discover the systems into which the brain partitions its activities.”76 Needless to say, the search for the neural location of g continues apace.

Christof Koch, chief scientific officer for the Allen Institute for Brain Science, has provided a “commonsense” definition in which consciousness is equated “with our inner mental life.” Intellect and consciousness are members-in-good-standing of our inner mental life and finding the neural basis for one is likely to abet the search for the other. Koch has pointed out that the whole brain in its entirety is not required for consciousness (or in my experience, intelligence). The cerebellum, nestled between the cerebral hemispheres and brain stem, contains sixty-nine billion neurons, and yet when I evaluate patients with cerebellar strokes or degeneration from hereditary or nutritional causes, their coordination may be impaired but not their consciousness or intelligence. Eschewing the hypothesis that you use your entire brain to achieve a conscious state, Koch began to seek out the neuronal correlates of consciousness (NCC) with Francis Crick in the early 1990s.77 The hunt for “the minimal neural mechanisms jointly sufficient for any one specific conscious percept” is going strong nearly thirty years later.78 (Yes, I know we’re a long way from a credible accounting for the scope and breadth of intelligence, but starting small with scientific reductionism is a well-trodden path.) Along the way, candidate NCCs have included thalamocortical relay circuits, the claustrum (a wafer of neurons sandwiched between the deep temporal lobe cortex and a wedge-shaped island of gray matter [putamen]), and most recently “a more restricted temporo-parietal-occipital hot zone.”79 We may hope that the NCC will blaze the trail to intelligence, but as I write this in 2017, “No single brain area seems to be necessary for being conscious … but a few areas are good candidates for both full and content-specific NCC.”80

Koch allows that “the sheer number of causal interactions in the brain together with the fleeting nature of many experiences pose challenges to even sophisticated experimental approaches to NCC.”81 Winnowing the wheat of consciousness from the chaff of ongoing cross talk between eighty-six billion neurons will require an unexpected, unforeseeable, and unprecedented breakthrough. No guarantees are forthcoming, but the history of science tells us that this has happened before. The solution to the puzzle of the circulation of the blood was announced a week before Shakespeare’s death in 1616 by William Harvey, who did not publish his finding until twelve years later when a “miserably printed little book of seventy two pages” (De motu cordis) appeared and “described the greatest discovery in the annals of medicine.”82 Harvey did not know how blood traveled from the arterial side of the circulatory tree to the venous side or how substances were exchanged between the blood and tissues. He may have suspected “that minute vessels would be found connecting arteries with veins; but he did not commit himself and referred merely to an indefinite soakage of blood through the ‘Porosities of the tissue.’ ”83 The discovery of the “minute vessels” had to await a new technology—the compound microscope—just coming into its own, with Leeuwenhoek’s refinements, in the latter half of the seventeenth century. Wielding the new technology in 1661, Marcello Malpighi found the missing puzzle piece—capillaries—linking the arteries to the veins by intensely studying the lungs of frogs (and by his own admission, “I have destroyed almost the whole race of frogs”).84

Is there a twenty-first-century neural “missing link” (akin to the capillaries discovered in the seventeenth century) just waiting to be discovered by a new technology (like a latter-day version of Leeuwenhoek’s microscope) that will bridge the explanatory gap dividing the mind (of an Einstein or otherwise) and the brain? The advent of Big Science, brought to bear on the brain, will continue to provide tantalizing glimpses of the frontiers of the brain and the mind. For example, in February 2017 Koch’s group took ten thousand cross-sectional images of a mouse brain to create a three-dimensional reconstruction of the branches of a particular neuron arising from the claustrum (see candidate NCCs above) and wrapping around the circumference of the murine brain like a “crown of thorns.” Koch had “never seen neurons extend so far across brain regions.” This hitherto unknown and far-reaching connectivity of the claustrum and the outer parts of a simpler mouse nervous system has rekindled Koch and Crick’s hypothesis that the “claustrum could be coordinating inputs and outputs across the brain to create consciousness.”85

Unlike physicists, who have framed an expert and effective knowledge system laying out the forces of nature (electromagnetic, weak, and strong) and elementary particles, neuroscientists still have no standard model of the brain even 110-plus years since Ramón y Cajal set forth the neuron doctrine. However, guiding principles (the propagation of the action potential and synaptic transmission) and anatomical units (neurons and glia) have allowed us to glimpse the grandeur and the immense complexity of the human brain. The task at hand is to gain a better intellectual purchase on the most complex (to date) biological structure in the universe—a structure that was “devised” by the blind forces of evolution, working over unimaginably long epochs of “deep” time. The next and final chapter will describe and appraise some of the future directions and limiting factors for neuroscientific research … and whether the contemplation of Einstein’s brain can take us any further.