12

…and Existential Disconnect

As I fleshed out this dispiriting diagnosis, I began to worry that I had dug myself into a conceptual hole from which I could hardly climb out. To cheer myself up, I kept thinking about the minority of students in my classes who were still performing at a seemingly high cognitive level. How could their mental sharpness be explained? Were their peculiar aptitudes merely the result of superior natural ability that no amount of overstimulation could short-circuit? What, then, would account for some anomalies in their thinking, particularly the more pragmatic and utilitarian focus they often displayed—to the point of expressing incredulity that an intelligent person could still opt for a non-utilitarian decision in moral dilemmas like the famous “trolley problem”? Or for the grammatical and syntactic problems still marking much of their writing? Moreover, there was the famous “Flynn effect” to contend with—the alleged gains in “intelligence” of each successive generation (as indicated by standard IQ tests, if by little else).1

A plausible answer to all these questions came in the form of another epiphany. It was triggered by a brief report on yet another neuroimaging study that arrived in my inbox in October 2012.2 I could have glossed over it and tried to file it away mentally or electronically. But as I read the summary of the experiments directed by neuroscientist Anthony Jack, and some of his subsequent comments, the pieces of my mental puzzle started to rearrange themselves into a new “neuroepistemological” framework. That framework, I hoped, could explain both the cognitive divergence I had observed and the ever-increasing predominance of “positivism” (or “scientism”) in the social sciences and policy research.3

Jack’s experiments fell within a line of research that had shifted attention from the activation of discrete brain centers to that of widely distributed networks and their interactions. He focused on the interplay between two major networks. One was the “default mode network,” which had already attracted growing attention. It had received its name from studies in the 1990s indicating that it was activated “by default”—during periods of rest, when experimental “subjects” were lying in the gut of the scanning machine with no cognitive tasks to perform. On the basis of such observations, the initial consensus among researchers was that this was a “task-negative” network involved mostly in internal neural processing. Its spontaneous, rather high activation level seemed to underlie the generation of overall self-awareness as it integrated signals coming from subcortical emotional centers, internal organs (“interoception”), or parts of the body (somatosensory and sensorimotor representations). At the conscious level, the default mode network thus appeared to be involved primarily in self-referential thinking and biographical recollections or projections.

The other network at the center of Jack’s study is commonly designated by an even more technical term—the “task-positive network.” In previous experiments, it had been consistently activated when participants were asked to perform demanding mental tasks. It was therefore thought to be involved mostly in cognitive control, the logical processing of information, planning, and decision-making. It was also recruited, however, for tasks requiring attentional control. In addition to the volitional focusing of attention, it was activated by strong sensory (mostly visual) stimuli, whether those required immediate attention or needed to be ignored. In recognition of this role in the algorithmic processing of sensory or more abstract information and in attentional control, this network is sometimes dubbed the “executive attention,” “executive control,” or “central-executive” network. In fact, it largely overlaps with the “reflective” system, one of the two networks I described earlier with reference to seeking behaviors and addiction, and with the “executive brain” more generally.

In this context, it was the default mode network whose workings seemed more intriguing. First, it is involved in a strange dance with the task-positive network. In technical terms, the activation of the two networks is “negatively correlated”—when one “lights up” on brain scans, activity levels in key centers of the other typically fall below baseline levels. Second, some neuroscientists have noticed that the default mode network is activated not just during periods of idleness and mind-wandering. Apparently, it is also recruited when individuals need to distinguish animate from inanimate objects, infer the motivations and intentions of others, assess their own relatedness to different individuals and groups, and keep track of everyone and of social situations, interactions, and hierarchies.4
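For readers who want a concrete sense of what “negatively correlated” means here, below is a minimal sketch in Python. Everything in it is made up for illustration (none of the numbers come from Jack’s experiments or any fMRI dataset): two toy activation time series constructed to rise and fall in opposition yield a Pearson correlation well below zero, which is the statistical signature such scanning studies report.

```python
# Purely illustrative: two made-up activation time series that rise and fall
# in opposition, standing in for average activity of the task-positive and
# default mode networks. Real studies estimate this from fMRI (BOLD) data;
# the anticorrelation here is built in by construction.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 300, 600)            # five minutes, sampled every 0.5 s
signal = np.sin(2 * np.pi * t / 60)     # a slow, minute-long activation cycle
task_positive = signal + 0.3 * rng.standard_normal(t.size)
default_mode = -signal + 0.3 * rng.standard_normal(t.size)

r = np.corrcoef(task_positive, default_mode)[0, 1]
print(f"Pearson r = {r:.2f}")           # strongly negative (around -0.8 here)
```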

With hindsight, this larger role of the default mode network should hardly have been surprising. It was already understood that social judgment requires the empathetic simulation of the mental and emotional states of others—a process built upon somatosensory and sensorimotor representations and affective responses in the brain.5 It did take time, though, for neuroscientists to begin to view and study the default mode network in this new light. What emerged from this line of research was a recognition of the substantial overlap between the default mode network and the “social brain”—the set of regions involved in the mapping and navigation of social relationships.6

This is where Jack and his team stepped in with their series of ingenious experiments. Those indicated that the default mode network was, indeed, involved not just in internal, self-referential processing, but also in a particular category of tasks requiring an outward focus of attention. The researchers asked participants to assess purely physical and social interactions—for example, whether water would flow from one container to another through a tube if there was a hole in it; or whether a young man thought the young woman sitting next to him (shown in a video clip) was angry. As expected, scans indicated that the physical puzzles activated the task-positive network in the brains of participants. Social scenarios, on the other hand, reliably activated the default mode network. In Jack’s interpretation, those results offered a critical demonstration that the supposedly “task-negative,” inwardly focused default mode network is also involved in the empathetic, affectively colored processing of social information.

Previous studies had already found that reasoning about physical objects and social judgment were associated with different brain regions. Jack’s study helped clarify this picture by establishing that an outward focus of attention did not necessarily quiet the default mode network. It also indicated that the two modes of reasoning were largely incompatible, since they required the recruitment of either the task-positive or the default mode network, whose activation alternated in a see-sawing fashion. One mode of thinking is analytic-empirical, appropriate for the causal explanation, utilitarian assessment, and prediction of physical phenomena and interactions. The other is empathetic, geared toward the partly intuitive, holistic understanding of social situations and interpersonal or communal give-and-take. The two modes do not normally mix. The human brain seems to have evolved to switch automatically between these two mental orientations or faculties—one social, the other non-social, physical, or mechanical—as appropriate to any given situation.7

Since social and mechanical reasoning are associated with what had been called the “default mode” and “task-positive” networks, Jack has given these more evocative labels, calling the former the “empathetic” and the latter the “analytic” network. These designations do seem preferable to the more technical terms they replace, but I would suggest a different term for Jack’s “empathetic” network. As he himself has pointed out, the default mode network is involved not just in the empathetic understanding of other human beings and social relationships. It is also recruited for “some more synthetic forms of non-social reasoning, such as insight problem solving and detecting broader patterns.”8 Its proper activation is thus needed not only for social connection, navigation, and positioning. It is also key to the holistic grasp of complex, indeterminate phenomena (“seeing the forest”), making distant associations between seemingly unrelated issues (“connecting the dots”), creative insight, an intuitive sense of what is relevant and significant, and so on.9 Crucially, Jack’s “empathetic” network is also involved in implicit learning: acquiring a sense of probabilistic relationships and patterns (such as those involved in complex syntax and grammar) without conscious effort.10
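To make “probabilistic relationships and patterns” less abstract, here is a minimal sketch, loosely modeled on classic word-segmentation experiments in the statistical-learning literature. The syllable stream and its hidden “words” are hypothetical; the point is that simple transition statistics, of the kind learners absorb from mere exposure, can mark where patterns begin and end without any explicit rule.

```python
# Purely illustrative: a toy version of statistical learning. The stream is
# built from three repeated hypothetical "words" (bi-da, ku-po, ti-go).
# High transition probabilities mark pattern-internal links; low ones mark
# likely boundaries -- the kind of regularity implicit learning is thought
# to absorb without conscious effort.
from collections import Counter

stream = "bi da ku po ti go bi da ti go ku po bi da ku po ti go".split()

pair_counts = Counter(zip(stream, stream[1:]))   # adjacent syllable pairs
first_counts = Counter(stream[:-1])              # how often each syllable leads

for (a, b), n in sorted(pair_counts.items()):
    print(f"P({b} | {a}) = {n / first_counts[a]:.2f}")
# Within-"word" transitions (bi->da, ku->po, ti->go) come out at 1.00;
# across-"word" transitions come out much lower.
```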

Taking into account this broader role of the default mode network in our existential involvement and ontological grounding in the larger world, I would rather refer to it as the “empathetico-intuitive” or, for short, the “intuitive” network. This designation still resonates with the “social brain hypothesis,” according to which “our brains have expanded so much over the course of evolution precisely because of the challenges involved in living in large social groups.”11 Evidently, navigating this intricate existential web has required the development of broader neurophysiological and mental capacities which facilitate dealing not just with social interactions but also with complexity and ambiguity in general.

It remains somewhat unclear what processes prompt the timely toggling between the analytic and intuitive networks and the related mental modes. Some neuroscientists have assigned this role to a third, related brain network—the so-called “salience network.” It is involved in the ongoing monitoring of internal and external neural signals, and on this basis it may prompt the activation of either the analytic or the intuitive network and the inhibition of the other. In any case, which network is recruited at any given moment seems to depend primarily on whether the task at hand or the context requires keen affective and visceral attunement, or whether such internal processing would interfere with the successful handling of external information or with abstracted logical analysis—as when excessive ruminations could distract us from seeing an approaching car, solving a hard logical problem, or following an algorithmic protocol. This is, at least, how an optimally functioning brain, well integrated internally and with the body, has evolved to work. As in the case of addiction, though, this intricate balance can easily be disrupted.

Such neurosomatic unbalancing is most obvious in some psychiatric patients. At one extreme, individuals on the autism spectrum tend to have an underactive and underconnected intuitive network, which is not coherently activated at rest and is not easily recruited in social situations. At the same time, they have an overactive analytic network which is very hard to disengage—a peculiarity reflecting a general tendency for some neural deficits to be accompanied by related overcapacity in other areas. In such individuals, the more difficult activation of the intuitive network, and the simultaneous disengagement of its analytic counterpart, may also be related to insufficient sensitivity in parts of the salience network to internally generated neural signals, or to their inadequate connectivity to other brain regions. At the other extreme, individuals diagnosed with schizophrenia and other psychotic disorders typically have a hyperactive and overconnected intuitive network which can hardly be quieted down.12

In both cases, the brains of affected individuals cannot switch smoothly between the analytic and intuitive networks and the related mental orientations. Those on the autism spectrum commonly remain stuck in a mode of external monitoring (often fixating on insignificant details), and of mechanistic analysis even in social contexts calling for empathetic sensitivity and a more intuitive judgment. A famous case in point is the 18th-century savant who was taken to see a Shakespeare play. Asked what he thought of it, he responded that the actors had uttered 12,445 words and the dancers had performed 5,202 steps—and he got the numbers right.13 Individuals suffering from schizophrenia, on the other hand, tend to remain trapped in their inner mental world. They experience fanciful representations projected within their own minds as external voices, consider their vivid hallucinations truthful, overinterpret trivial actions and words, and often develop paranoid delusions (like mathematician John Nash’s conviction that individuals wearing red neckties were part of a sinister communist conspiracy). Paradoxically, in both groups a failure to properly activate one network and suppress the other is socially debilitating as it entails a loss of touch even with significant others, and with the broader social “reality.”

Of course, these are clinical extremes. But there is no sharp line dividing such extremes from the mental functioning of “neurotypicals.”14 There are, for example, many software engineers and artists who appear at least quasi-autistic or quasi-psychotic,15 or in possession of “high-functioning” versions of the two potentially crippling conditions. Moreover, courtesy of brain plasticity, intense and prolonged engagement in tasks requiring different kinds of attention and mental processing can sharpen the contrasting tendencies and aptitudes of the two groups, eventually pushing some individuals over the edge. This danger is illustrated by the social ineptitude of someone like Christopher Langan (reputedly the holder of the highest IQ score on record), who was never able to hold down a job where he could employ his exceptional mental abilities,16 or by the descent into apparent madness or self-destruction of many strikingly talented artists (and some mathematicians like Nash17).18

Madness or weaker psychotic tendencies among all sorts of creative types have always provoked strong interest. But I have come to see as more relevant and indicative the peculiarities of individuals who have been diagnosed with Asperger’s19 (a high-functioning form of autism), have remained undiagnosed though perhaps meriting a diagnosis, or display weaker, subclinical symptoms. I have kept wondering if such quirks might somehow be related to the somewhat unbalanced cognitive strengths of some of my better students, and also of most social scientists and all sorts of analysts or “knowledge workers” employed outside of academia.

Some of Jack’s conclusions and comments may point in this direction. In his view, not only psychiatric patients but also “normal” individuals can depend too much on the activation of either the analytic or the “empathetic” (as he prefers to call it) network.20 But the odds of leaning toward either extreme do not seem equal. As Jack has noted, formal education is generally aimed at tuning up the analytic network21—a tendency which, I would add, grows stronger the higher one climbs the education ladder. My sense is that not only education and other forms of abstract, logical processing, but also the sensory overstimulation I described earlier and the overall increase in the cognitive demands generated by modern society, could have a similar effect.

This impact first became obvious in Britain, where industrialization prompted not only Romantic musings but also the quasi-autistic, obsessive theorizing and plans for social improvement championed by Jeremy Bentham and his disciples; and where Adam Smith’s more nuanced vision of society and the market was later replaced by cruder economic and social models. Those typically justified poverty and exploitation as providing necessary incentives for employment under the inhuman working conditions of early factories—or rationalized the extreme inequality generated by the “free market” as a necessary corollary to social progress through the “survival of the fittest.”22

We have come a long way since then—to a point where we inhabit a socio-technological pressure cooker which requires and fosters some fairly unnatural aptitudes. These include the navigation of multiple information streams; abstracted, utilitarian analysis; rapid switching of attention; and ongoing choosing and decision-making in the face of countless options. It is these “analytical skills” that are valued, selected for, and reinforced as hallmarks of mental fitness in a modern, technologically supercharged social environment23—at the expense of a keener sense of the overall significance of changing social practices and broader trends, and of non-utilitarian or nonmonetary valuations.

There was, however, one obvious problem with this theory as it was forming in my mind. If the sensory and cognitive workout provided by modern civilization had such a brain-building and mind-sharpening effect, why weren’t the majority of my students displaying heightened levels of such mental fitness? Why did they tend to separate into a minority of exceptional (and often utilitarian) “learners” and a majority whose thinking displayed varying degrees of confusion,24 detachment, and difficulty storing relevant information in long-term memory—and recalling it as needed? Why had the once solid group in the middle largely dissipated, as I and many of my colleagues had frequently complained?

As it turned out, these questions also had a plausible answer. It emerged in my mind as I stumbled upon an article co-authored by Mary Helen Immordino-Yang, a neuropsychologist who has done much research in the area of “neuroeducation.”25 The article highlighted the importance of the proper connectivity and activation of the “default mode network” for various mental competencies, overall human development, and education. It also contained, however, a stark warning: excessive demands on the attention of children and adolescents in and outside of school could require overengagement of the task-positive (or analytic) network, and thus sabotage the development of the default mode (intuitive) network in their brains. Since the latter appears to form the “structural core” of the neocortex,26 whose proper development and attunement is essential for coherent engagement with the larger world, its inadequate activation and connectivity can leave students ensnared in concrete thinking and immediate associations. They would be unable to generalize across experiences and information streams. They would also have difficulty seeing the bigger picture into which their personal lives fit, since they would lack a rich overall framework—whose development depends on ongoing implicit learning and the ability to make rich associations.

Another article by Immordino-Yang and several colleagues also suggested a link between this newer line of research focusing on the analytic and intuitive networks and previous studies of the localization of different functions in the two cerebral hemispheres of the human brain.27 Generally, the left hemisphere is described as the seat of focused, sequential, algorithmic, logical-analytic processing. It is more self-contained and tends to generate more detached, forward-looking, and optimistic plans or rationalizations for social behaviors. The right hemisphere, on the other hand, is associated with more synthetic and holistic representations. It is more strongly activated at rest28 and more closely connected to subcortical parts of the brain involved in affective responses and interoception. It contains hubs of the intuitive and salience networks, which are key to empathetic attunement, the integration of neural processing throughout the brain, and the overall integration of the brain and body.29

Curiously, the proper functioning of the left hemisphere depends on signals coming from the right hemisphere, with its stronger subcortical connections. In cases of right-hemisphere damage, affected individuals often succumb to false rationalizations, egocentric calculations, and even delusions; more subtle forms of right-hemisphere malfunction can result in subclinical forms of delusional thinking.30 These distortions are apparently produced by the inadequate integration of affective and visceral signals into higher cognitive processes.

The separation of functions between the two hemispheres of the human brain and that between the analytic and intuitive networks thus seem to partly overlap and to be mutually reinforcing. In the vocabulary of neuroscientists, the regions comprising the analytic network are “slightly left lateralized,” and those of the intuitive network are “slightly right lateralized.” This difference may turn out to underlie much of the differentiation in neural functioning and representation between the two hemispheres.31 It could also provide a counterpoint to the usual dismissal of the contrast between “left-brained” and “right-brained” thinking as a myth.32 Neuroscientists often point out that the two hemispheres are well connected and are harnessed in tandem to perform various tasks. This observation would carry less weight, however, if the activation of some key hubs in each is negatively correlated.

So here is the (provisionally) final picture which has emerged from all the cross-references and associations I have made so far. It seems the chronic overengagement of the analytic network (triggered by formal education from an early age, sustained cognitive effort, and our fast-changing socio-technological environment) and the corresponding withering of the intuitive network tend to produce different “learning outcomes” at a deeper neurosomatic level. At one extreme, some students do develop strong “analytical skills” and an ability to keep track of and process (though not always integrate) vast information currents through sophisticated abstract frameworks.33 Meanwhile, many remain caught in mostly concrete thinking detached from any larger frame of reference, some falling short of high school or college-level requirements. Such students tend to focus on issues and interactions deemed significant within their own “social networks” of similarly oriented peers (occasionally glancing at Facebook or other websites during class and school-related work outside of class). They remain understandably detached from the constellation of larger social issues I—and academics like Bauerlein and Edmundson—still find inherently significant.

There is also another twist to this story. The intuitive network is recruited not only for social judgment or the holistic grasp of complex patterns. It is also involved in metaphorical thinking and, even more importantly, in implicit learning—the unintentional, unconscious absorption of information or knowledge.34 This is the form of learning that underlies the acquisition of physical as well as mental skills and aptitudes. It is involved particularly in the mastering of complex rules and patterns (like those underlying grammar and syntax), but also in seeing the proverbial “big picture” and in developing the integrated understanding of the larger world that is ostensibly at the heart of liberal education.

Learning and automatically applying such skills and rules can, however, be disrupted by sustained mental effort and focused attention, which recruit the analytic network in the brain. Sensory overstimulation and constant access to screen-mediated information have a similar effect. Downtime—made ever more elusive by the ubiquity of digital devices35—is essential for recovery from the “executive fatigue” induced in such an environment, and for the proper development and maintenance of the intuitive network in the brain.

The almost constant recruitment of the analytic network may result in inadequate activation and connectivity of the intuitive network. This neurophysiological unbalancing could in fact be the key to understanding the difficulties students in my classes, even many of the stronger ones, have with English grammar and syntax, with class material that appears overly “theoretical” or otherwise removed from their immediate frame of reference, and with integrating what they have learned in different courses into a coherent mental framework to which they can assimilate new knowledge. Curiously, such difficulties seem to go hand in hand with either overly literal or rigorously logical thinking—and with a lack of critical distance and sufficient self-awareness.36

Initially, the two groups of students in my classes—those deemed excellent by most academic standards and the many falling behind—appeared to have contrasting inclinations and capacities. As I kept thinking, though, the two mental toolboxes started to appear in a new light. They seemed to have something in common, related to the “weird” mindset described by psychologists Joseph Henrich, Steven Heine, and Ara Norenzayan. In a much-discussed paper, they had criticized the tendency of their discipline to extrapolate from experiments with “weird” subjects (socialized within Western, Educated, Industrialized, Rich, and Democratic societies)—and to posit the highly unusual traits displayed by such exceptional individuals as psychological universals, common to the whole tapestry of humanity.37 The hallmarks of this “weird” mindset appear to be excessive individualism and disproportionate faith in personal agency.

The “weird” thesis recalls earlier observations by cultural psychologists who have questioned psychological universalism. They have described two very different worldviews—one typical of Western societies, the other most pronounced in East Asia but probably common to most non-Western regions38 (with some quasi-Westernized areas and social groups around the world perhaps occupying a middle ground). The Western outlook has commonly been described as more analytic, narrowly focused, egocentric, and utilitarian—generally overvaluing individual traits, preferences, and actions at the expense of broader social forces and influences. The non-Western perspective, on the other hand, appears to be more holistic, communal, contextual, and existentially grounded.

Cultural psychologists have traced the divergence between these opposing worldviews to antiquity, linking it to differences in agricultural practices (focusing primarily on the communal effort and extensive infrastructure needed for rice cultivation in China), social and political organization (competition among small city-states vs. centralized imperial administration), the institutionalization of intellectual pursuits (multiple private “academies” vs. state-controlled scholarship), etc.39 Such cultural disparities may reflect, however, not just lessons learned and transmitted within societies with different cultural practices and norms. They could also be related to differences at the level of neurosomatic engagement with the world and overall existential grounding (or disconnect).

This link is demonstrated by studies indicating that cultural differences start at the deepest, least conscious level of neural processing. For example, when shown a picture of an aquarium, American undergraduates will typically focus on and recall details about the central object or the biggest fish. Their Chinese and Japanese counterparts, on the other hand, will take in the whole picture and be able to describe the background more fully (while recalling fewer details about the main object). Also, Far Eastern experimental “subjects” tend to show activation in overlapping brain areas when thinking of themselves and their mothers, while in Americans these tasks trigger different patterns of neural activation.40

Within Western societies, pre-existing cultural and psychological tendencies have been reinforced by centuries of modernization. This is a complex process which is still not well understood, but it has involved the gradual submission of social life to the logic of the market economy (what economic historian Karl Polanyi once dubbed “the Great Transformation”41), industrialization, urbanization, the weakening of communal bonds, the expansion of formal education, growing technological saturation, various forms of overconsumption, sensory and social overstimulation, etc. The sum total of these trends has created more abstracted social relations, an overall rationalization and institutionalization of social life, increased social density and complexity, an accelerated pace of life, and the general sensory, social, and information overload I have already described.

Needless to say, these tendencies have been taken to a whole new level with the development and proliferation of information technology over the past six decades—a sensory revolution which started with the spread of TV in the 1950s and has been mightily accelerated by the information revolution. All these changes have imposed unprecedented demands on attention, and have required relentless analytic processing and a stream of minor or more consequential choices. Excessive exposure to screens may have a particularly insidious effect. Screens not only attract our attention, engaging the analytic network; they may additionally suppress activity in the intuitive network by inducing us to blink less frequently.42 All these influences could play a major role in the kind of affective and visceral desensitization I described earlier.

All this seems like a recipe for further skewing of the balance between analytic and intuitive thinking, and the related networks in the human brain. This slant away from keen affective and visceral attunement and holistic thinking could account for the changes I have observed on a smaller scale among my students—particularly the divergence between a minority of “analysts” (in the broad sense) with overdeveloped “nerdy” cognitive powers; and a larger mass whose thinking is similarly narrow and utilitarian, but detached from any complex conceptual framework.

The first group seems very similar to the “empirical kids”43 graduating from prestigious American universities (or at the top of their class in other institutions). The second group is more diverse and in some sense similar to their less stellar American counterparts—though not as self-absorbed and individualistic44 in their thinking and ambitions. Nevertheless, even the academically weaker students across cultures are likely to appear more “intelligent” than their parents and grandparents if intelligence is reduced to a narrow set of cognitive skills. They would do better on IQ tests (particularly with the more extensive drill in test taking and algorithmic thinking they have received in school), contributing to the rise in IQ scores that psychologists interpret as evidence of increased intelligence, known as the “Flynn effect.”45 While these two groups seem to have contrasting cognitive aptitudes, there is one quality they appear to share: the excessive disconnect from communal settings and from any larger existential horizon which Tocqueville feared individualism could eventually bring about.
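The arithmetic behind such rising scores is simple, and a minimal sketch may make it concrete. It assumes the commonly cited average gain of roughly three IQ points per decade, a rough figure that varies by test, country, and period; the numbers are illustrative, not drawn from any particular study.

```python
# Purely illustrative arithmetic for the Flynn effect. GAIN_PER_DECADE is an
# assumption: the commonly cited average of roughly 3 IQ points per decade.
GAIN_PER_DECADE = 3.0

def score_against_old_norms(current_score: float, years_since_norming: float) -> float:
    """Rough score a present-day test taker would get against older norms."""
    return current_score + GAIN_PER_DECADE * years_since_norming / 10

# Someone scoring an average 100 today would score about 109 against norms
# set three decades ago, without necessarily being any "smarter" overall.
print(score_against_old_norms(100, 30))  # -> 109.0
```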

As I already noted, this growing human estrangement was much resented by the Romantics. Later, it was captured by the string of philosophical, sociological, and psychiatric concepts I mentioned earlier, all pointing to an overall existential estrangement and anomie. Of course, the socio-technological trends I have described have affected non-Western societies as well, so the predispositions of children and adolescents there may be getting closer to the “weird” norm. Also, a dwindling number of individuals even in Western societies have been able to maintain a keener sensibility and a more holistic existential orientation (sometimes shading into mysticism or a “fundamentalist” yearning for moral purification). They, however, have often struggled and been marginalized within a social milieu to which their frame of mind and neurophysiological proclivities are not well adjusted46—a trend epitomized by the increasing marginalization of the traditional humanities in higher education.

Though their article is called “The Weirdest People in the World?”, Henrich, Heine, and Norenzayan have understandably tried to understate the “weirdness” of the disconnected mindset they have described. They have made a point of emphasizing that they did not intend to present any culturally shaped set of psychological traits as superior or inferior—just to suggest that academic psychologists should expand the pool of test subjects they use to make it more representative of humanity. Science writer Ethan Watters, however, has read a disturbing message between the lines of their original paper. He thinks it very much reconfirms findings from earlier psychological research indicating that the more analytic “Western mind is the most self-aggrandizing and egotistical on the planet: we are more likely to promote ourselves as individuals versus advancing as a group.”47 For the sake of fairness, Watters could have noted that this mindset also entails toleration for different beliefs, values, lifestyles, and sexual preferences. He has pointed out, though, that this individualist bias—at the expense of a more holistic grasp and communal belonging—has also constrained the thinking of mainstream Western social scientists.

In fact, Western social scientists and neuroscientists must have an even sharper analytic focus—which can perhaps make it difficult to grasp the bigger picture and to understand larger social patterns, trends, and influences within their own and other societies. Needless to say, they tend to consider their own predispositions “normal,” and to see fanaticism and intolerance as aberrations in need of explanation (and debunking—usually in the form of high-minded academic conspiracy theories48). This outlook has also allowed most Western social scientists to maintain unwavering faith in a next generation of social “interventions” that will provide solutions to intractable social problems like poverty and large-scale violence.49