We need more welders and less philosophers.
—Marco Rubio
Whenever I hear “culture”—I cock my pistol.
—SS officer Hanns Johst
When Marco Rubio quipped, during one of the 2016 GOP presidential debates, “Welders make more money than philosophers. We need more welders and less philosophers,” he was doing more than making a grammatical error.1 He was also guilty of several factual errors. First, as any economist will tell you, the fact that profession X is paid more than profession Y does not mean that the economy needs more people to do X. The supply and demand for X and Y may have achieved equilibrium at different salary levels from each other. (Does Rubio think that in an ideal economy neurosurgeons would be paid the same as chimney sweeps?) More importantly, philosophy majors on average earn more than welders. The median starting salary for those who studied “welding technology” was $37,000 per year, while those with undergraduate degrees in philosophy earned on average $42,000 per year; after ten to twenty years of experience, welders can expect to earn $53,000 per year, while philosophy majors on average will earn $82,000 per year.2 Ironically, the same data shows that those with a bachelor’s degree in political science (like Rubio) earn almost exactly as much on average as philosophy majors upon graduation; however, the philosophy majors overtake the political science majors after a few years of job experience, and earn more on average over their lifetimes.

But perhaps we are interpreting Rubio uncharitably. Maybe by “philosophers” Rubio meant only professors of philosophy. He is still wrong. The average starting salary for assistant professors of philosophy and religion is over $54,000 per year, already more than the salary a welder can expect after ten years of experience, and full professors earn an average of $86,000 per year.
My point is not that we need more philosophers and fewer welders. (Unlike Rubio, I believe in the free market system, and have faith that the invisible hand of supply and demand will determine how many philosophers and welders we need.) And I am certainly not trying to “stigmatize vocational education,” as Rubio accused some unspecified group of doing. Being a philosophy professor is not intrinsically more noble than being a welder, nor the reverse. But in this chapter I will insist on three points that Rubio’s comment missed: (1) Studying philosophy is a legitimate career choice, even from a narrowly vocational perspective. (2) It is a false dichotomy that one must either study philosophy or become a welder. A liberal arts education is valuable to every citizen of a democracy, and to the maintenance of democracy itself. (3) Philosophy has made immense contributions to our civilization. Moreover, by its nature it is impossible for philosophy ever to become obsolete.
PHILOSOPHY AND OCCUPATIONAL TRAINING
Philosophy majors earn more than those with any other humanities degree,3 and my own students have gone on to success in a variety of different professions, including medicine, secondary teaching, social work, and law enforcement. For those considering a law degree, it is worth knowing that undergraduate philosophy majors on average score higher than any other major on the Law School Admission Test.4 (As I am writing this, I have three former students attending top law schools: one at Columbia Law School, another at NYU Law School, and a third at the University of Michigan Law School. Their parents must be thinking: “Oh, if only they had majored in welding!”) Philosophy majors also have the highest average score on the GRE Verbal and GRE Analytical Writing, and are among the highest-scoring majors for the GMAT, the business-school admissions test.5 Perhaps most impressively, philosophy majors are admitted to medical school at a higher rate than any other major, including biology and chemistry!6 We shouldn’t be surprised that Darrell Kirch, MD, the CEO of the organization that administers the Medical College Admission Test, was an undergraduate philosophy major himself.7
What else do people who majored in philosophy do? A philosophy major can be president of Morgan Stanley (Robert Greenhill), founder and manager of a hedge fund (Don Brownstein), an investor (George Soros and Carl Icahn), CEO of Overstock.com (my former Stanford classmate Patrick Byrne), CEO of Time Warner (Gerald Levin), cofounder of PayPal (Peter Thiel), a Supreme Court justice (Stephen Breyer and David Souter), cofounder of Wikipedia (Larry Sanger), mayor of Los Angeles (Richard Riordan), US secretary of education (William Bennett), chair of the Federal Deposit Insurance Corporation (Sheila Bair), political activist (left-wing Stokely Carmichael and right-wing Patrick Buchanan), prime minister of Canada (Paul Martin, Jr.), president of the Czech Republic (Václav Havel), a network television journalist (Stone Phillips), a Pulitzer Prize–winning author (Studs Terkel), a Nobel Prize–winning author (Pearl Buck and Bertrand Russell and Jean-Paul Sartre and Albert Camus and Alexander Solzhenitsyn), a Nobel Peace Prize winner (Albert Schweitzer and Aung San Suu Kyi), host of an iconic game show (Alex Trebek), a comedian/actor/producer (Ricky Gervais and Chris Hardwick), an Academy Award–winning filmmaker (Ethan Coen), a four-star general in the US Army (Jack Keane), a fighter in the French Resistance in World War II (Stéphane Hessel), coauthor of the United Nations’ Universal Declaration of Human Rights (P. C. Chang and Charles Malik), a martyr of the German opposition to Nazism in World War II (Sophie Scholl), pope (John Paul II and Benedict XVI), or a seminal anthropologist (Claude Lévi-Strauss and Clifford Geertz)—just to give a few examples.
The practical value of our discipline relates to the fact that philosophy courses typically do particularly well at teaching the “three Rs” of a humanities education: reading, writing, and reasoning. Harvard Medical School professor David Silbersweig, MD, explained that his undergraduate major in philosophy
has informed and provided a methodology for everything I have done since. If you can get through a one-sentence paragraph of Kant, holding all of its ideas and clauses in juxtaposition in your mind, you can think through most anything. If you can extract, and abstract, underlying assumptions or superordinate principles, or reason through to the implications of arguments, you can identify and address issues in a myriad of fields. It has helped me in immeasurable ways along my trajectory from philosophy to an academic medical career, which suggest that Rubio [has] a number of serious misconceptions about education.8
Philosophy is not unique among humanities fields in teaching reading, writing, and reasoning, but philosophy classes typically put a special emphasis on clarity of expression, accuracy of interpretation, and cogency of argumentation that is sometimes lacking in other disciplines.9
I certainly wouldn’t say that most people should major in philosophy (any more than I would say most people should become welders). But some training in the reading, writing, and reasoning skills that are distinctive of philosophy courses is valuable to majors in a variety of different fields. I know an engineer who, as an undergraduate, fed me the line every humanities professor must have heard at some point: “Why should an engineer be required to take a humanities course instead of taking that one additional engineering course that would help him design a bridge that won’t collapse?!” First, any engineer who is only one course away from designing a bridge that collapses should not be anywhere close to graduation! I expect my bridges to be designed by engineers who are well over the absolute minimum skill level required.

More seriously, when you study a technical field, advanced knowledge is constantly changing. You will always use the concepts taught in the basic courses of your freshman and perhaps sophomore years. After that, your courses typically cover material that will be obsolete a few years after you graduate, or irrelevant to the particular job you end up in. You still get something out of those courses, but it is simply more practice in “thinking like an engineer” (or a businessperson, or a computer scientist, or whatever). This is certainly valuable. However, whatever sort of career you end up in, if you need a college degree for it, part of your job will be reading challenging texts with understanding and writing clearly and persuasively. If you end up in an advanced management position, your job may also involve understanding and knowledgeably discussing ethical issues.
How do I know all this? The engineer in question admitted it when I talked to him years later, after he had practical experience in his own profession.
Let us remind ourselves that the distinctive higher education system of the United States—which requires most students to take a liberal arts curriculum and, since World War II, has been increasingly open to people of all social classes—is the envy of the world. No wonder: the system that makes natural scientists study poetry and philosophy has produced nuclear power, computers, supersonic flight, the polio vaccine, lasers, transistors, oral contraceptives, CDs, the Internet, email, and MRIs, and has put the first people on the Moon. Ironically, as liberal arts education in the United States comes increasingly under fire, governments in China, India, Japan, Singapore, and South Korea are trying to re-create in their own countries the liberal arts model that they recognize is one of the keys to US technological and economic dominance.10
PHILOSOPHY AND DEMOCRATIC CITIZENSHIP
If failing to appreciate the practical value of studying philosophy were the only problem with Rubio’s comment, it would be gratuitously cruel to pair it with one by an SS officer as the epigraphs to this chapter. Rubio certainly did not intend to lead anyone toward anything that horrific. In fact, his speech suspending his 2016 presidential campaign after his loss to Trump in his home state of Florida included the most inspiring rhetoric of the Republican primaries and expressed the values that I believe are most important to him.11 But what Rubio fails to grasp is that the anti-intellectualism he indulged in early in his campaign is inconsistent with the democracy and justice he honored at its end.
Certainly, Rubio’s comment was not a political misstep in the context of contemporary Republican politics. Had Carly Fiorina (who also took part in the debate that night) confessed that she had majored in philosophy, her political ambitions would have been crushed faster than you can fall off a stage—which she later did.12 Rubio was not even the last person to explicitly target philosophy during that debate. Senator Ted Cruz, apparently noticing that Rubio’s line had gotten thunderous applause, criticized the “philosopher-kings” who run the Federal Reserve. (I somehow doubt that this is what Plato meant by “philosopher-kings.”) Not wanting to be left out, Governor John Kasich made the blanket statement, “Philosophy doesn’t work when you run something.” (This sentence makes less sense to me every time I read it.)
Ben Carson, Trump’s secretary of housing and urban development, expressed his disdain for philosophy in another context, when he opined that “political correctness” is a serious problem for the US because “it’s the very same thing that happened to the Roman Empire. They were extremely powerful. There was no way anybody could overcome them. But these philosophers, with the long flowing white robes and the long white beards, they could wax eloquently on every subject, but nothing was right and nothing was wrong. They soon completely lost sight of who they were.”13 Let’s try to take Carson’s comment seriously. By “these philosophers” Carson may mean either Academic or Pyrrhonian Skeptics. Carson’s reaction is much like that of Rome’s own archconservative, Cato the Elder (234–149 BCE), who had the skeptic Carneades (214–129 BCE) expelled from Rome. But neither ancient nor modern skeptics advocate ethical anarchy. Hellenistic Skeptics generally claimed that, while we cannot know what the truth is, we can and should act upon what appears to us to be most plausible.14 This position raises many delightful philosophical puzzles (is there a distinction between acting on what seems most plausible to us and believing in it?), but it hardly seems like something that would bring the greatest empire of the ancient West to its knees.
Moreover, as a Christian, Carson should also know that there was more to Hellenistic philosophy than Skepticism: the Book of Acts (17:18) reports that St. Paul debated the Epicureans and Stoics he met in Athens.15 Presumably, Carson would disapprove of the Epicureans’ materialistic conception of the universe and their view that good and evil are reducible to pleasure and pain. However, the Epicureans did not see either of these facts as entailing a sybaritic lifestyle: they advocated the moderate satisfaction of desire as most conducive to a happy life. In addition, it is hard not to admire Epicurus for admitting women and slaves to his school.
The Stoics actually had a fairly significant influence on the development of Christian thought, and Carson would approve of many of their teachings if he bothered to learn about them. Contrary to the Skeptics, the Stoics claimed that we can know the truth with certainty, and contrary to the Epicureans they argued that the only thing good in itself is virtue. The Stoics also believed that God exists and is identical with logos (reason). (We find similar language in the New Testament: “In the beginning was the Word, and the Word was with God, and the Word was God,” where “Word” translates logos [John 1:1, KJV].) The Stoics argued that the best way of life is to live in accordance with the natural law dictated by the reason that exists within each of us. (Compare this with Romans 2:14–15: “For when the Gentiles, which have not the law, do by nature the things contained in the law, these, having not the law, are a law unto themselves: Which shew the work of the law written in their hearts” [KJV].)
If the philosophers are not responsible for the fall of Rome, what is? Edward Gibbon (1737–94) argued in his classic The Decline and Fall of the Roman Empire that “the introduction, or at least the abuse of Christianity” was a contributing factor to the decline of the empire, because it preached “the happiness of a future life” over political activity in this life.16 Indeed, Gibbon claimed that the preference for the afterlife was so extreme among Christians during the period of persecution that they actively sought martyrdom, going unsummoned to the tribunal to gratuitously confess their faith and demand to be executed.17 Gibbon editorialized that, after Christianity was legalized by Emperor Constantine (r. 306–37), “the active virtues of society were discouraged … the last remains of military spirit were buried in the cloister” and the “sacred indolence of the monks was devoutly embraced by a servile and effeminate age.” When the Church did encourage activity, it was often counterproductive: “the church, and even the state, were distracted by religious factions, whose conflicts were sometimes bloody and always implacable.”18 Of course, Gibbon is not the last word on later Roman history.19 But almost all serious historians would agree with one point he makes: “instead of inquiring why the Roman empire was destroyed, we should rather be surprised that it had subsisted so long.”20 Rome fell for many complicated social, political, and economic reasons. Although it is a common conservative trope to compare the decadent United States to decadent Rome, there is no simplistic lesson to be learned from Rome’s fate.
To some extent, dismissals of philosophy by politicians simply reflect the anti-intellectualism that has been central to US culture for a very long time.21 Presidents have to affect an everyman demeanor, dressing up like a cowboy (Ronald “The Gipper” Reagan) or maintaining the hint of an accent they normally would have lost some time at Yale or Oxford (“Bill” Clinton). One of the reasons that Al Gore lost to George W. Bush in 2000 is that Gore never learned the art of hiding his intelligence and erudition; in contrast, Americans correctly saw in G. W. Bush someone they could relate to. Although Bush and Gore each went to an Ivy League school, Bush could be forgiven because (in his own words) he didn’t learn “a damn thing at Yale.”22 Gore reminded Americans of the smart kid in class, who’s in debate club, and always gets an A, and uses big words. Nobody likes that kid. G. W. Bush, in contrast, is the friendly goofball who’s on the cheerleading squad (which Bush actually was). Everyone likes him, and even though he barely got through school with Cs and Ds, he’ll be fine because when he graduates he’ll just join the family business (which Bush also did).
There is a big difference, though, between wearing a Stetson or playing a saxophone to show what a regular guy you are and actively condemning education. The latter is the direction the GOP has recently been taking. As conservative commentator Matt K. Lewis laments, “Too many of today’s conservatives deliberately shun erudition, academic excellence, experience, sagaciousness, and expertise in politics.”23 In other words, the conservative movement has been “Palinized.”24 When former Republican vice presidential candidate Sarah Palin compared conservative women like herself to “Mama Grizzlies,” who “kinda just know when something’s wrong,” the unfortunate implication is that innate and inarticulate intuition invalidates informed intelligence.25 Historian Stacy Schiff agrees that “Moms ‘do kinda just know when something’s wrong’ ” but notes that “ideally that category includes monitoring unprotected teenage sex under one’s own roof” (something that Palin failed to do with her daughter Bristol). Turning more specifically to the political implications of Palin’s metaphor, she states: “I’m all for saluting the maternal sixth sense, though I’m not sure I want a government run by intuition. We had one of those recently,” under G. W. Bush, who led our country to war because of imaginary weapons of mass destruction in Iraq.26 But it was precisely Palin’s anti-intellectualism that made her appealing to so many in the contemporary Republican base.
It used to be that the Democrats were the party of anti-intellectual populism. In the early nineteenth century, our first Democratic president, Andrew Jackson, dismantled America’s successful central bank, and was responsible for the ethnic cleansing of Native Americans known as the “Trail of Tears.” William Jennings Bryan was the face of the Democratic Party at the beginning of the twentieth century, running unsuccessfully three times for the presidency. When Republican Teddy Roosevelt invited African American leader Booker T. Washington to dine with him and his family at the White House, Bryan expressed outrage.27 Bryan also famously opposed evolutionary theory, taking the stand for the prosecution in the Scopes Monkey Trial, where he was skewered by Clarence Darrow.
In contrast, the Republican Party was once the party of thoughtful intellectuals like Lincoln (who boasted of “having studied and nearly mastered” the ancient classic of geometry, Euclid’s Elements, and whose Gettysburg Address was modeled on the Funeral Oration of the Athenian statesman Pericles),28 Teddy Roosevelt (Phi Beta Kappa and magna cum laude graduate of Harvard), Hoover (who spoke Chinese and translated the Renaissance work on metallurgy De Re Metallica from Latin),29 Eisenhower (graduate of West Point, war hero, and president of Columbia University), Nixon (who earned a degree from a liberal arts college and then went on to law school), and George H. W. Bush (Phi Beta Kappa at Yale). However, the GOP has now become the party of B-movie actor Ronald Reagan (who confessed that he could not remember whether he had violated his administration’s own policy by trading arms to Iran in exchange for hostages),30 C-student George W. Bush (“They misunderestimated me”),31 and D-list celebrity Donald Trump (“I’m very highly educated. I know words, I have the best words. I have the best, but there is no better word than stupid.”).32
What happened?
In his book Too Dumb to Fail, Lewis does an excellent job of diagnosing some of the causes of the rising anti-intellectualism of the GOP, including the need to please evangelical voters in southern red states, who often believe, mistakenly, that Christianity is inconsistent with education and reflectiveness. In every presidential election since 1980 (the year Reagan was first elected), the Republican presidential candidate has won in the Bible Belt states of Alabama, Mississippi, Oklahoma, South Carolina, and Texas. But pleasing this constituency sometimes leads to embarrassing results. None of the GOP candidates in 2016 would admit to believing in evolution. When Gov. Scott Walker was in the running for the Republican presidential nomination and visited the United Kingdom, he was ridiculed by a BBC interviewer for evading a question about whether he believed in evolutionary theory: “Any British politician, right- or left-wing, would laugh and say, ‘Yes, of course evolution is true.’ ”33 Ted Cruz also evaded questions about his views on the topic. However, even though “the son shall not bear the iniquity of the father” (Ezekiel 18:20, KJV), it is difficult not to quote Cruz’s father here: “Communism and evolution go hand and hand. Evolution is one of the strongest tools of Marxism because if they can convince you that you came from a monkey, it’s much easier to convince you that God does not exist.”34 Finally, Rubio, when asked how old he thought the Earth is, sounded like a student trying to bluff when he had not done the reading:
I’m not a scientist, man.… At the end of the day, I think there are multiple theories out there on how the universe was created and I think this is a country where people should have the opportunity to teach them all. I think parents should be able to teach their kids what their faith says, what science says. Whether the Earth was created in 7 days, or 7 actual eras, I’m not sure we’ll ever be able to answer that. It’s one of the great mysteries.35
In reality, there is no good reason to assume that the Bible is inconsistent with evolutionary theory. The notion that parts of the Bible should be read metaphorically is quite orthodox. No less an authority than St. Augustine explained that he was initially reluctant to embrace Christianity because parts of the Bible seemed implausible. However, St. Ambrose explained to him that many passages were to be read not literally, but as metaphors for a higher truth.36 Ambrose argued that this is part of the point St. Paul was making when he said, “for the letter killeth, but the spirit giveth life” (2 Corinthians 3:6, KJV). Augustine would himself provide an extensive metaphorical interpretation of the creation story in Genesis in the closing books of his spiritual autobiography, the Confessions.37 But the ultimate authority for metaphorical interpretations of the Bible is Jesus, who warned his disciples against literalism:
And when his disciples were come to the other side, they had forgotten to take bread. Then Jesus said unto them, Take heed and beware of the leaven of the Pharisees and of the Sadducees. And they reasoned among themselves, saying, It is because we have taken no bread. Which when Jesus perceived, he said unto them, O ye of little faith, why reason ye among yourselves, because ye have brought no bread? … How is it that ye do not understand that I spake it not to you concerning bread, that ye should beware of the leaven of the Pharisees and of the Sadducees? Then understood they how that he bade them not beware of the leaven of bread, but of the doctrine of the Pharisees and of the Sadducees. (Matthew 16:5–12, KJV)
Jesus was using the metaphor of yeast ruining what should be unleavened bread to warn the disciples not to be influenced by the self-righteousness of the Pharisees or the Sadducees (two sects that opposed Jesus). He chastises his disciples for their overly literal reading, and he would do the same for those who assume we must read the book of Genesis as if it were a newspaper instead of scripture.
In general, there is no intrinsic conflict between being a Christian (or any kind of theist) and being an intellectual. St. Paul said, “Beware lest any man spoil you through philosophy and vain deceit” (Colossians 2:8), but most Christians historically have taken this to mean only that philosophy—if practiced in a shallow or specious way—can be destructive of faith, not that it must be, nor that philosophy as such must be avoided. Francis Bacon (1561–1626), whose works were a major influence on empiricist philosophy of science, stated, “It is true that a little philosophy inclineth man’s mind to atheism; but depth in philosophy bringeth men’s minds about to religion.”38 The plausibility of Bacon’s claim is reflected in the impressive list of seminal philosophers who were theists (including Plato, Aristotle, Augustine, Anselm, Maimonides, Avicenna, Averroes, Aquinas, Descartes, Leibniz, Spinoza, Berkeley, and Kierkegaard). In the twentieth century, there have been many profound theologians,39 and a number of highly influential philosophers who are theists.40 This cuts in two directions. If those who believe in God are anti-intellectual, it is due to laziness, not religious principle. However, philosophers who are atheists should engage those with religious beliefs as seriously and respectfully as they engage those who disagree with them about the mind/body problem or consequentialism vs. deontology.
A mistaken conception of religion as incompatible with science or intellectual sophistication is one factor in the rise of political anti-intellectualism. And this became especially significant when the Moral Majority, a conservative Christian religious group, helped Reagan win the presidency in 1980. However, Reagan’s politics had an anti-intellectual slant long before then. Part of Reagan’s successful campaign for governor of California was predicated upon open disdain for the faculty and students of the state’s public universities. (Students were protesting in favor of the harebrained ideas that the Vietnam War was a bad idea, and that desegregation and ensuring the voting rights of African Americans were good ideas. Kids today!) Soon after being elected, Reagan complained that, by funding higher education, the state was “subsidizing intellectual curiosity.”41 It seems not to have occurred to Reagan that, as the Los Angeles Times editorialized, “If a university is not a place where intellectual curiosity is to be encouraged and subsidized then it is nothing.”42
Anti-intellectualism grew even stronger in the GOP during the presidency of Bush the Younger. After writing an article critical of the administration, journalist Ron Suskind was summoned to a meeting with Karl Rove (who was Bush’s campaign director and later senior advisor and deputy chief of staff in the White House). According to Suskind, Rove told him that
guys like me were “in what we call the reality-based community,” which he defined as people who “believe that solutions emerge from your judicious study of discernible reality.” I nodded and murmured something about enlightenment principles and empiricism. He cut me off. “That’s not the way the world really works anymore,” he continued. “We’re an empire now, and when we act, we create our own reality. And while you’re studying that reality—judiciously, as you will—we’ll act again, creating other new realities, which you can study too, and that’s how things will sort out. We’re history’s actors … and you, all of you, will be left to just study what we do.”43
This stunningly Nietzschean rejection of truth and evidence was followed, less than a year later, by the US invasion of Iraq, which was justified by the pursuit of imaginary weapons of mass destruction for which there was never any compelling evidence. (I still remember watching with incredulity as Secretary of State Colin Powell argued in favor of the war before the UN Security Council by waving a vial of what anthrax would look like—if we had any samples produced in Iraq, which we did not—and a drawing of what a mobile weapons lab would look like—if we had any photos of one, which we did not.) The Republican Party’s turn away from evidence and facts, which began with Reagan and accelerated under G. W. Bush, has now reached its climax in the “alternative facts” of the Trump administration.
In fairness, the GOP has also fielded some impressive candidates for the presidency since Reagan. George H. W. Bush, Bob Dole, and John McCain were all intelligent, eloquent, pragmatic war heroes with distinguished careers in public service. But notice the common thread here: George H. W. Bush lost to Clinton in 1992; Bob Dole lost to Clinton in 1996; and McCain lost to Obama in 2008. There were certainly complex reasons for these losses, including Bush reneging on his “read my lips: no new taxes” pledge and the surge of new young voters and voters of color who supported Obama. However, I think thoughtful, sophisticated statesmen like Bush the Elder, Dole, and McCain had two insurmountable weaknesses: they didn’t ignite the Republican base the way candidates offering simplistic solutions do, and they didn’t seem different enough from their Democratic counterparts to win over swing voters.
The serious ethical problem here is that anti-intellectualism is actually the worst kind of elitism, because it suggests that one must choose between being a welder and learning about philosophy. Why shouldn’t welders study philosophy too? After all, Socrates was a stonemason and Spinoza was a lens grinder. If they were alive today, they’d work at Home Depot and LensCrafters. (The Stoic philosopher Epictetus was a slave, so he’d be working at Walmart.)
Consider the following. Seven of the eight people on the stage when Rubio made his sneer had college degrees.44 Rubio and John Kasich both majored in political science; Carly Fiorina majored in philosophy and medieval history; Ben Carson has an undergraduate degree in psychology; Jeb Bush majored in Latin American studies; Ted Cruz studied public policy. These are all liberal arts majors. Why would anyone think that other liberal arts majors are more practical or employable or economically valuable than philosophy? Do we imagine a businesswoman at the office confiding to a coworker: “That was close. My son was going to major in philosophy. Can you imagine?! Thank God he switched his major to public policy. He’s going to be raking in the big bucks now!” The only person on the stage that night with a conventionally “practical” major was Donald Trump, who studied economics (and subsequently led multiple companies to bankruptcy and lost almost a billion dollars during an economic boom).
When Rubio stood on a stage with a group of people who, between them, have seven bachelor’s degrees (six of them in the liberal arts), two JDs, an MBA, and two MDs, yet denigrated liberal arts education, it sent a clear message to the electorate: “We are highly educated; you should not be. We will get the education and training that will allow us to effectively pursue our goals; you don’t need information or practice in thinking objectively or critically about the world. We will study something that opens our minds and helps us to choose our own futures; you should study something that makes you useful to the economic system that we run.” A century ago, John Dewey warned of the dangerous political implications of publicly funding only a narrow vocational education: “This movement would continue the traditional liberal or cultural education for the few economically able to enjoy it, and would give to the masses a narrow technical trade education for specialized callings, carried on under the control of others.”45 This would make the educational system nothing but “an instrument in accomplishing the feudal dogma of social predestination” and for “transferring the older division of … directed and directive class into a society nominally democratic.”46
There is an even more dangerous aspect of anti-intellectualism: it is fascistic. Stuart Hampshire was a leading British philosopher who served in military intelligence during World War II and participated in the debriefing of Nazi officers. Hampshire came to recognize that
below any level of explicit articulation, hatred of the idea of the Jews was tied to hatred of the power of intellect, as opposed to military power, hatred of law courts, of negotiations, of cleverness in argument, of learning and of the domination of learning: and in this way anti-Semitism is tied to hatred of justice itself, which must set a limit to the exercise of power and to domination.…47
The Nazi fury to destroy had a definite target: the target encompassed reasonableness and legality and the procedures of public discussion, justice for minorities, the protection of the weak, and the protection of human diversity.48
It is not coincidental that anti-Semitic themes are characteristic of every kind of American and European nativist and nationalist movement.
Consequently, in a democracy, philosophy courses are not an “intellectual luxury” (as Reagan suggested).49 They should not be a privilege of the wealthy. Philosophy belongs to whoever has the intelligence and the willingness to appreciate it, whether it be the child of the welder or the child of the CEO. Studying philosophy makes people more informed and more thoughtful citizens, more comfortable with the fact that others disagree with them, less vulnerable to manipulation and deception, and more willing to resort to discussion rather than violence.
It should come as no surprise, then, that making a broad education widely available to the populace is part of what is most distinctive about the United States. As Martha Nussbaum explains:
Unlike virtually every nation in the world, we have a liberal arts model of university education. Instead of entering college/university to study a single subject, students are required to take a wide range of courses in their first two years, prominently including courses in the humanities.… Nor is the emphasis on the liberal arts a vestige of elitism or class distinction. From early on, leading U.S. educators connected the liberal arts to the preparation of informed, independent, and sympathetic democratic citizens.50
Thomas Jefferson illustrates that Nussbaum’s view of the value of a liberal arts education is not some trendy, liberal conceit. Jefferson argued that the best way to combat tyranny is
to illuminate, as far as practicable, the minds of the people at large;… whence it becomes expedient for promoting the publick happiness that those persons, whom nature hath endowed with genius and virtue, should be rendered by liberal education worthy to receive, and able to guard the sacred deposit of the rights and liberties of their fellow citizens, and that they should be called to that charge without regard to wealth, birth or other accidental condition or circumstance; but the indigence of the greater number disabling them from so educating, at their own expence, those of their children whom nature hath fitly formed and disposed to become useful instruments for the public, it is better that such should be sought for and educated at the common expence of all, than that the happiness of all should be confided to the weak or wicked.51
Consequently, despite their claim to revere the wisdom of America’s Founding Fathers, the disdain of many contemporary conservatives for publicly funded liberal arts education stands in opposition to a significant part of what has made America “exceptional” among nations, a “shining city upon a hill.”
PHILOSOPHY’S VALUE TO CIVILIZATION
As I noted in chapter 1, some contemporary scientists take a dim view of philosophy. Noted edu-tainer Neil deGrasse Tyson warned that majoring in philosophy “can really mess you up.” This is ironic, because Tyson has a PhD, which stands for doctor of philosophy, reflecting the fact that all the sciences grew out of philosophy. The pre-Socratic philosophers were the first to experiment and speculate about the physical world and provide naturalistic explanations for phenomena, setting the stage for all later science. Anaxagoras (fifth century BCE) correctly surmised that the Sun was not a god but actually a hot physical object much larger than it appeared, that the Moon shone only because of light reflected from the Sun, and that eclipses were caused when objects came between the Earth and the Sun. Leucippus (fifth century BCE) and Democritus (460–370 BCE) developed the first version of the atomic theory, later confirmed by chemist John Dalton (1766–1844). Galileo (1564–1642) famously said that “philosophy,” which for him included physics and astronomy,
is written in this grand book, the universe, which stands continually open to our gaze. But the book cannot be understood unless one first learns to comprehend the language and read the letters in which it is composed. It is written in the language of mathematics, and its characters are triangles, circles, and other geometric figures without which it is humanly impossible to understand a single word of it; without these, one wanders about in a dark labyrinth.52
In saying this, Galileo was consciously echoing Plato (428?–348? BCE), who had argued that the universe can only be comprehended in mathematical terms.53
Plato’s student Aristotle (384–322 BCE) perhaps did more than anyone else to lay the foundations of Western science. He was a keen observer of the natural world, who described the development of chicken embryos that he observed by opening a series of eggs on successive days between laying and hatching.54 Carl Linnaeus (1707–78) is rightfully praised for elaborating the contemporary biological categorization of living things according to kingdom, phylum, class, order, family, genus, and species. However, his system is based on Aristotle’s method of definition by genus and differentia.55
It is a party game among shallow intellectuals to mock Aristotle for his hypotheses that turned out to be mistaken, like his claim that the Sun goes around the Earth. However, it does not make someone a bad scientist that she turned out to be mistaken about something. If it did, Galileo would be a bad scientist because he believed that comets were optical illusions,56 Lavoisier (the founder of modern chemistry) would be a bad scientist because he denied that meteorites came from space,57 and Einstein would be a bad scientist because he never accepted quantum mechanics.58 A good scientist is someone who theorizes based upon the best empirical evidence and most plausible assumptions available to him or her. Aristotle’s geocentric hypothesis made sense of the evidence available: the Sun does appear to move, and the Earth does not feel as if it is moving. Even more importantly, a good scientist lays the groundwork for later scientific advances by offering a theory that can be refined, tested, and even refuted. As Nietzsche noted, “It is certainly not the least charm of a theory that it is refutable; it is precisely thereby that it attracts subtler minds.”59
Those ignorant of history accuse Aristotle and his later followers of an inflexible dogmatism that held back the development of science. The reality is much more complex. As historian of science Thomas Kuhn explained: “Men who agreed with Aristotle’s conclusions investigated his proofs only because they were proofs executed by the master. Nevertheless their investigations often helped to ensure the master’s ultimate overthrow.”60 The phenomenon that Aristotelian physics had the most trouble accounting for was the motion of projectiles, like an arrow (or, later, a cannonball). Aristotle offered some tentative explanations for projectile motion, but acknowledged that none of them was fully satisfactory. In response to these problems, Aristotle’s medieval followers developed “impetus theory,” which laid the foundation for Galileo’s concept of inertia. In addition, as Jaroslav Pelikan observed, “By constructing the telescope and using it to observe empirically, Galileo was a more faithful Aristotelian than were those who quoted Aristotle’s Physics against his observations.”61 For synthesizing many earlier theories and starting science on a research program that would eventually transcend his own insights, Aristotle is one of the greatest scientists ever.
Even computers, the most revolutionary scientific achievement of our era, are a gift of the philosophers. (You’re welcome!) Binary arithmetic, which is the basis of all computers, was invented by G. W. Leibniz (whose Sinophilia we discussed in chapter 1). Leibniz had several acrimonious debates with Newton. One thing they quarreled over was who discovered calculus. The answer is that Newton made the discovery first, but Leibniz published his first, and it is Leibniz’s notation for the calculus that we use today. They also disputed whether location in space is relative or absolute. Newton argued for the latter, but Einstein would later prove Leibniz right. It’s tempting for the humanist to crow that the philosopher won that argument. However, Newton described himself as a “natural philosopher,” and he would have been genuinely offended at the suggestion that he was not really a philosopher.

Of course, most scientists nowadays are not philosophers. As Bertrand Russell (1872–1970) explained, this is because, once we know the proper methodology for solving problems on a certain subject, “this subject ceases to be called philosophy, and becomes a separate science.” Only the questions “to which, at present, no definite answer can be given remain to form the residue which is called philosophy.”62 So could science completely replace philosophy? Some people think so. Stephen Hawking recently pronounced that “philosophy is dead,” and that “scientists have become the bearers of the torch of discovery in our quest for knowledge.”63
Wrong.
There are at least three reasons why it is impossible that all of philosophy will be replaced by natural science, or that philosophy will become obsolete. First, the history of science alternates between long periods of “normal science” and brief periods of “revolutionary science.”64 During periods of normal science, scientists largely agree about the way the world works and the proper methodology for studying it. Normal science is an impressive activity, and those who do it, like Tyson and Hawking, deserve our utmost respect and admiration. However, revolutionary science is what the true geniuses of science do: people like Aristotle, Galileo, Newton, John Dalton, Darwin, Erwin Schrödinger, and Einstein. During scientific revolutions, scientists realize that their previous worldview and methodology do not do justice to some aspect of reality. Consequently, they have to radically restructure their concepts. For example, Einstein had to fundamentally rethink what space, time, and gravity were in order to formulate the Special and General Theories of Relativity. During periods of scientific revolution, scientists become philosophers, and draw on the work of other philosophers to help inform their views (for example, Galileo was influenced by Plato, Dalton drew on Democritus, and Einstein’s approach to science was shaped by Pierre Duhem). Consequently, when another scientist asked him about the importance of physicists learning about philosophy, Einstein replied:
I fully agree with you about the significance and educational value of methodology as well as history and philosophy of science. So many people today—and even professional scientists—seem to me like somebody who has seen thousands of trees but has never seen a forest. A knowledge of the historic and philosophical background gives that kind of independence from prejudices of his generation from which most scientists are suffering. This independence created by philosophical insight is—in my opinion—the mark of distinction between a mere artisan or specialist and a real seeker after truth.65
The second reason that natural science will never replace philosophy is that sciences like physics are successful precisely because they limit their inquiry to particular aspects of reality using particular methods. If someone argues that there is nothing to reality besides what is studied by physics, we can legitimately ask her why she thinks this. However, the answer to the question of whether there is anything beyond physics cannot be provided from within physics. Physics uses a particular methodology, M, to study reality insofar as it is physical, P. But the question we are asking is whether there is a reality that is not-P. Since M is the methodology we use to study reality only insofar as it is P, we cannot use that same methodology to explore whether there is anything that is not-P. More generally, you cannot show from within the limits of something that there is nothing outside that limit. You have to straddle a limit, conceptually speaking, in order to define it. So if you try to show that there is nothing beyond the factual and methodological limits of physics, you have already transcended the limits of physics.
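To display the shape of this argument schematically (a sketch only; the quantifier notation is mine, extending the M and P of the previous paragraph):

\[
\text{(1)}\quad \forall x\,\bigl(M(x) \rightarrow P(x)\bigr) \qquad \text{(whatever } M \text{ can study is physical)}
\]
\[
\text{(2)}\quad \forall x\, P(x) \qquad \text{(physicalism: everything is physical)}
\]

Any test of (2) must examine candidate counterexamples, that is, objects that are not P; by (1), no such object falls within M’s reach. So (2) can be affirmed or denied only from a standpoint outside physics, which is to say, a philosophical one.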
The final reason that philosophy will never be obsolete is that it includes ethics, political philosophy, and philosophical theology. These topics are intrinsically controversial but also inescapable. As I am fond of telling my students, whatever opinions you have in these areas have their origins, at least in part, in philosophical thought. Do you think that the purpose of life is to make the most out of your intelligence and contribute to your community? You’re an Aristotelian. Do you think that there is no purpose to life except for the one each of us chooses for herself? You’re an existentialist. Do you think that morality has to be explained psychologically, by our emotions and other motivations? You’re a Humean. Do you think that what is right is to do whatever produces the greatest happiness of the greatest number of people? You’re a utilitarian. Do you think that there are some actions that are intrinsically wrong and must never be done, even if they would result in desirable consequences? You’re a Kantian. Do you think that government is designed to protect our inalienable rights to life, liberty, and property? You’re a Lockean. Do you think that government must protect our freedoms, but wealth inequality is justifiable only insofar as it benefits those most in need? You’re a Rawlsian. Do you think that much of religious belief can be justified by philosophy? Please say hello to my friend Thomas Aquinas. Do you think we can legitimately have religious belief even though most of it must be accepted on faith? Go hang out with my buddies Pascal and Kierkegaard. Do you believe that religion is superstition that has had a largely negative influence on the world? Read Bertrand Russell or J. L. Mackie. Or do you dismiss philosophy as nothing but rationalizations for the will to power or structures of domination? Enjoy Nietzsche, Marx, Freud, and Foucault. (Oops! They’re philosophers too!) The question is not whether philosophy is important to you. It already is. The only question is whether you choose to become self-aware and critically reflective about the philosophical beliefs that you hold.
Erwin Schrödinger, one of the founders of quantum mechanics and a noted cat lover/hater, was expressing similar views about the limitations of science when he said that
the scientific picture of the real world around me is deficient. It gives a lot of factual information, puts all our experience in a magnificently consistent order, but it is ghastly silent about all and sundry that is really near to our heart, that really matters to us. It cannot tell us a word about red and blue, bitter and sweet, physical pain and physical delight; it knows nothing of beautiful and ugly, good or bad, God and eternity. Science sometimes pretends to answer questions in these domains, but the answers are very often so silly that we are not inclined to take them seriously.66
It is tempting to point out that the two scientists I have quoted who praise philosophy, Einstein and Schrödinger, each won the Nobel Prize in physics, while neither of the two scientists I quoted who disparage philosophy has won one. However, that would be a snarky observation to make, so I won’t do it.67
We have seen that philosophy is a valuable part of vocational training for many careers. Beyond that, philosophy is important to the maintenance of democracy itself: those who have been trained to argue rationally and constructively have learned to value discussion over violence, and pluralism over intolerance. Finally, philosophy is responsible for so much that we view as valuable in Western civilization, from natural science to the various formulations we use to talk about ethics, politics, and spirituality. For all these reasons, the anti-intellectualism that delights in degrading philosophy is both unwarranted and unfortunate. The value of philosophy was summed up nicely by John Cleese (who obtained a law degree from Cambridge University before becoming one of the founders of the legendary comedy troupe Monty Python):
Philosophy seems so harmless, and yet, among dictatorships, philosophers have always been among the first people to be silenced. Why have dictators bothered to silence philosophers? Maybe because ideas really matter: they can transform human lives.68
But what distinguishes philosophy from other fields in the humanities and social sciences that discuss similar topics? And to what extent are philosophers to blame for their discipline’s bad reputation in contemporary society? We turn to these questions in the final chapter.