Below are four cards, showing N, 4, A, and 3. They’re randomly chosen from a deck of cards in which every card has a letter on one side and a number on the other. Please indicate which of the cards you would have to turn over in order to find out whether the cards obey this rule: “If a card has a vowel on one side, then it has an even number on the other side.” Turn over only those cards that are necessary to establish that the rule is being followed. Commit yourself: if you’re reading this on an electronic gadget, highlight your choice in yellow; if you’re reading it in hard copy, check your choice with a pencil.
I must turn over:
a. Card 3 only
b. Cards 1, 2, 3, and 4
c. Cards 3 and 4
d. Cards 1, 3, and 4
e. Cards 1 and 3
We’ll return to this problem later in a different context.
* * *
Critical reasoning texts usually have a heavy dose of formal, deductive logic. This is done more because of ancient pedagogical tradition than because of any evidence of utility for everyday thought. And in fact, there are some reasons to suspect that most of what you read in this chapter on formal logic will be of limited value in solving problems in everyday life.
But there are nevertheless some good reasons to read about formal logic.
1. Formal logic is essential for science and mathematics.
2. The chapter sets up the stark contrast that exists between Western hyperrationality and Eastern dialectical habits of thought. The two systems of thought can both be applied to the same problem, generating different conclusions. The two systems also provide excellent platforms for critiquing each other.
3. An educated person should have some command over basic forms of logical reasoning.
4. Formal logic is interesting, at least to lots of people. (At any rate, in doses about the size of this chapter!)
The story about the Western origins of formal logic goes as follows: Aristotle got tired of hearing lousy arguments in the marketplace and the assembly. So he decided to develop reasoning templates to apply to arguments in order to analyze them for validity. An argument is valid if (and only if) its conclusion follows necessarily from its premises. Validity has nothing to do with truth. An argument can be invalid and yet have a true conclusion; an argument can be valid, having the proper structure, and yet have a false conclusion.
The concept of argument validity is important for many reasons. One reason is that you don’t want to allow someone to trick you (or allow you to trick yourself) by conferring plausibility on a conclusion because it follows from some premises, unless those premises are true and the conclusion follows necessarily from them. A second reason is that we don’t want to allow ourselves to disbelieve conclusions that we happen to dislike if the premises are clearly true and the form of the argument dictates that the conclusion must be true as well. A third reason is that if we have a clear grasp of the concept of validity as opposed to truth, we can assess whether a conclusion follows from its premises by stripping away the meaning from the premises and the conclusion, and thinking purely in terms of abstractions. As and Bs rather than birds and bees. That can show us that a conclusion does follow from its premises, so that even if the conclusion is highly implausible, it’s at least not the product of illogical reasoning.
Syllogisms
One of Aristotle’s major contributions to formal logic was the syllogism. The creation of syllogisms ballooned into a cottage industry in the Middle Ages, when monks generated dozens of them. From the Middle Ages until the late nineteenth century, philosophers and educators believed that syllogisms provided powerful rules for thought. Consequently they formed a large part of the curriculum in higher education in the West.
The issue of validity arises for syllogisms, which deal with categorical reasoning. Some kinds of categorical reasoning involve the quantifiers “all,” “some,” and “none.” The simplest syllogisms involve two premises and one conclusion. The simplest of those simple syllogisms, and one that we’re not normally at risk of getting wrong, is: All A are B; all B are C; therefore all A are C. The classic here is:
All clerks are human.
All humans are bipedal.
—————————————
All clerks are bipedal.
That argument is valid because its conclusion follows logically from its premises. The conclusion is also true. Now consider:
All clerks are human.
All humans have feathers.
——————————————
All clerks have feathers.
That argument is valid, too, though the conclusion is untrue. But the implausibility of the conclusion draws us toward feeling that the argument is also invalid. Substituting As, Bs, and Cs for clerks, humans, and feathers allows us to see the validity of the argument. That may force us to reconsider whether a conclusion is true, which can be useful.
The argument below is invalid, even though its premises and its conclusion are all true (or at least highly plausible).
All people on welfare are poor.
Some poor people are dishonest.
Therefore, some people on welfare are dishonest.
In abstract terms:
All A are B.
Some B are C.
Therefore, some A are C.
The exercise of abstracting the terms is useful because we may have the feeling that a conclusion is true because it seems plausible and because we have true premises that seem to support it logically. Finding that an argument is invalid strips away the feeling that the conclusion is necessarily true and may cause us to doubt it. (The key to recognizing the invalidity of the above argument is to realize that the As are a subset of the Bs.)
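If you find it easier to see the point with a concrete counterexample laid out explicitly, here is a minimal sketch in Python, with three small sets chosen purely for illustration. Both premises of the abstract form come out true, yet the conclusion comes out false, and that is all it takes to show that the form is invalid.

# Counterexample to the form "All A are B; some B are C; therefore some A are C."
# The sets are arbitrary stand-ins chosen for illustration.
A = {1}
B = {1, 2}
C = {2}

all_A_are_B = A <= B              # True: every member of A is in B
some_B_are_C = bool(B & C)        # True: B and C share a member
some_A_are_C = bool(A & C)        # False: A and C share nothing

print(all_A_are_B, some_B_are_C, some_A_are_C)   # True True False

The trick, as the parenthetical above notes, is that the As sit entirely inside the Bs, while the Bs that overlap the Cs can lie entirely outside the As.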
Things rapidly become more complicated from here: All A are B, some C are A, some C are B. Valid or not? No A are B, some C are B, no A are C. Valid or not?
You can spin these out till the cows come home. Medieval monks whiled away the weary hours by generating endless numbers of them. But I agree with the philosopher Bertrand Russell, who said that the syllogisms were as sterile as the monks. So much for twenty-six hundred years of pedagogy that assumed that syllogisms were crucial for effective thinking.
The most useful thing I’ve gained from instruction in categorical reasoning is learning how to draw Venn diagrams, named for the nineteenth-century logician John Venn, who invented a pictorial way of representing category membership. Every now and then I find these useful, even necessary, for representing relationships between categories. Figure 5 shows some of the more useful ones and will give you the general idea.
Figure 5. The intersection of categories that overlap with one another.
The top left picture in Figure 5 captures a particular syllogism that we do use in everyday life. It represents a situation in which some (but not all) A are B and some (but not all) B are A. A might stand for small furry animals and B might stand for duck-billed animals. As it happens, there is one animal in the intersection of A and B, namely the duck-billed platypus. Or that top left picture could represent a situation in which some but not all of the students who are English speakers at an international school also speak French and some but not all French speakers speak English. (Some but not all A are B and some but not all B are A.) The exclusive English speakers (A only) must study mathematics with Ms. Smith; the exclusive French speakers (B only) must study with M. Pirot. Students who speak both languages can study with either teacher.
The picture on the top right shows the much more complicated but not rare situation in which some A are B, some B are A, some A are C, some C are A, some B are C, and some C are B.
The bottom picture presents an actual example of that situation. It represents the intersections of letters that are found in Greek (top left), Latin (top right), and Russian (bottom). I defy you to reach the correct conclusion about overlap of categories solely by means of verbal propositions about categories. In any case, I’m sure I would end up merely with alphabet soup.
This is not enough on Venn diagrams to set you up for a very wide range of problems, but it gives you some of the basics on how to represent category inclusions and exclusions pictorially. You might find that learning more about Venn diagrams will be useful for you.
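If you would rather compute the regions of such a diagram than draw them, ordinary set operations do the same work. Here is a minimal sketch of the two-language situation described above, using invented student names; Ms. Smith and M. Pirot are the hypothetical teachers already mentioned.

# The top-left Venn situation as Python sets. Student names are invented for illustration.
english = {"Ann", "Boris", "Chantal", "Dmitri"}
french  = {"Chantal", "Dmitri", "Elodie"}

english_only = english - french    # must study mathematics with Ms. Smith
french_only  = french - english    # must study with M. Pirot
either       = english & french    # speak both languages; may study with either teacher

print(sorted(english_only))        # ['Ann', 'Boris']
print(sorted(french_only))         # ['Elodie']
print(sorted(either))              # ['Chantal', 'Dmitri']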
Propositional Logic
Syllogisms apply only to a tiny range of inferences that we need to make in everyday life. Much more important is propositional logic, which applies to an enormous range of inferential problems. Philosophers and logicians made sporadic contributions to propositional logic between roughly 300 B.C. and A.D. 1300. Beginning in the mid-nineteenth century, logicians started to make considerable progress on propositional logic, in particular focusing on “operators” such as “and” and “or.” “And” is concerned with “conjunctions,” such as “A is the case and B is the case; therefore A and B are both the case.” “Or” is concerned with “disjunctions,” such as “A is the case or B is the case; A is not the case; therefore B is the case.” Work on propositional logic in that era became the basis for computer design and programming.
At the beginning of the chapter, I asked you to solve a problem about cards. You can now see it was a problem requiring the application of conditional logic. If P then Q. “If a card has a vowel on one side, then it has an even number on the other side.” Before we see how well you did on the problem, let’s see how you do on the problem below.
You’re a police inspector. One of your tasks is to make sure that restaurants don’t sell alcohol to anyone under twenty-one. Your task is to say which of the customers below you have to check in order to see whether the customer is obeying this rule: “If a customer is drinking alcohol, the customer is at least twenty-one.” You should check only those customers necessary to establish that the rule is being followed.
The first table you see has four customers. You see that Customer 1 is fifty-plus, Customer 2 isn’t drinking, Customer 3 is drinking alcohol, and Customer 4 is under twenty-one.
You need to check:
a. Customer 1
b. Customers 1, 2, 3, and 4
c. Customers 3 and 4
d. Customers 1, 3, and 4
e. Customers 1 and 3
I’m betting that you said option c, Customers 3 and 4. Now look back at the card problem. I’m betting that you didn’t say option c, cards 3 and 4. Can we agree that you should have? The logical structure of the two problems is identical. Check my logic below.
CARD PROBLEM
Make sure this rule is not violated: Vowel? Better be an even number on the other side.
N—doesn’t matter whether there’s an even number on the back or not.
4—doesn’t matter whether there’s a vowel on the other side or not.
A—better be an even number on the other side. If not, the rule is broken.
3—better not be a vowel on the other side. If so, the rule is broken.
RESTAURANT PROBLEM
Make sure this rule is not violated: Drinking? Better be twenty-one.
Customer is fifty-plus—doesn’t matter whether customer is drinking or not.
Isn’t drinking—doesn’t matter whether customer is twenty-one or not.
Drinking—better be twenty-one. If not, the rule is broken.
Under twenty-one—better not be drinking. If so, the rule is broken.
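The two checklists share a single piece of logic, and it can be written down once. Here is a minimal sketch, with each visible face summarized by what it reveals, if anything, about P (vowel, or drinking) and Q (even number, or twenty-one); the True/False/None encoding is mine, not part of the original problems.

# A face must be checked only if what's hidden could reveal a violation of "if P then Q."
def must_check(p_visible, q_visible):
    if p_visible is True:       # P is showing: the rule demands Q, so Q must be verified
        return True
    if q_visible is False:      # not-Q is showing: the rule demands not-P, so P must be verified
        return True
    return False                # a visible not-P or a visible Q can never break the rule

# The four cards: what each face tells us about P (vowel) and Q (even number).
cards = {"N": (False, None), "4": (None, True), "A": (True, None), "3": (None, False)}
print([face for face, (p, q) in cards.items() if must_check(p, q)])    # ['A', '3']

# The four customers: what each tells us about P (drinking) and Q (twenty-one or over).
customers = {"fifty-plus": (None, True), "not drinking": (False, None),
             "drinking": (True, None), "under twenty-one": (None, False)}
print([who for who, (p, q) in customers.items() if must_check(p, q)])  # ['drinking', 'under twenty-one']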
Don’t feel bad if you didn’t get the card problem right. Less than 20 percent of Oxford University students can solve the abstract version of the card problem!
Why is the card problem so much harder than the restaurant problem? At first blush this seems strange, because both problems can be solved by applying conditional logic, in fact by applying the very simplest principle of conditional logic, modus ponens:
If P is the case, then Q is the case. (If the customer is drinking, then the customer is twenty-one.)
P is in fact the case. (The customer is drinking.)
Therefore, Q is the case. (Therefore, the customer is twenty-one.)
The conditional also licenses modus tollens: if Q is not the case, then P is not the case. An instance where Q (twenty-one or over) is not the case but P (drinking) is the case violates the conditional rule.
Note that P (drinking) is a sufficient condition but not a necessary condition for Q. It’s sufficient that P is the case in order for Q to be the case. Lots of other conditions might be sufficient to require that the person is twenty-one. Flying a plane, for example, or gambling.
For the biconditional, P is both necessary and sufficient in order for Q to be the case. This would include the (rather weird) rule that if you’re drinking you must be twenty-one and if you’re twenty-one you must be drinking.
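To see the difference between the ordinary conditional and the biconditional at a glance, a brute-force truth table will do. The sketch below is just an enumeration; it assumes nothing beyond the definitions of the two connectives.

# "If P then Q" (P sufficient for Q) versus "P if and only if Q" (P necessary and sufficient).
from itertools import product

for p, q in product([True, False], repeat=2):
    conditional   = (not p) or q      # violated only when P holds and Q fails
    biconditional = (p == q)          # violated whenever P and Q come apart
    print(f"P={p!s:5} Q={q!s:5}  if-then: {conditional!s:5}  iff: {biconditional}")

The two rules come apart in exactly one row, where P is false and Q is true: a twenty-one-year-old customer who isn’t drinking satisfies the ordinary rule but breaks the weird biconditional one.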
After some more consideration of conditional reasoning, we’ll discuss why the drinking problem is so easy.
Plausibility, Validity, and the Logic of the Conditional
As we’ve seen, syllogistic arguments can be valid, that is, have a form in which the conclusion follows necessarily from the premises, even when their conclusions aren’t true. The same holds for propositional logic.
Decide whether each of the following arguments, having two premises and one conclusion, is valid.
ARGUMENT A
Premise 1: If he died of cancer, he had a malignant tumor.
Premise 2: He had a malignant tumor.
——————————
Conclusion: Therefore, he died of cancer.
ARGUMENT B
Premise 1: If he died of cancer, he had a malignant tumor.
Premise 2: He didn’t die of cancer.
——————————
Conclusion: Therefore, he didn’t have a malignant tumor.
ARGUMENT C
Premise 1: If he died of cancer, he had a malignant tumor.
Premise 2: He died of cancer.
——————————
Conclusion: Therefore, he had a malignant tumor.
Only Argument C is valid. It maps onto modus ponens: If P (he died of cancer) then Q (tumor). P (cancer). Therefore Q (tumor). The plausibility of the conclusions in Arguments A and B tempts us into feeling that they’re valid. But Argument A maps onto an invalid argument form: If P (died of cancer) then Q (tumor). Q (tumor). Therefore P (died of cancer). This is called the converse error because the form of reasoning involves erroneously converting the premise If P then Q into If Q then P. (If he had a malignant tumor, then he died of cancer.) If that had been the premise, then we would indeed know that since Q is the case, P is also the case. But that wasn’t the premise.
We make converse errors all the time—if we’re not monitoring ourselves for the logical validity of our argument.
CONVERSE ERROR 1
If the car is not in our garage, then Jane went downtown.
Jennifer told me she saw Jane downtown.
Therefore, the car won’t be in the garage.
But of course Jane could have gotten downtown by some means other than taking the car, in which case the car probably will be in the garage. Making the error is more likely given some kinds of background knowledge. If Jane rarely goes anywhere without the car, we’re more likely to make the error; if she sometimes takes the bus and sometimes is driven by a friend, we’re less likely to make the error.
CONVERSE ERROR 2
If I have the flu, then I have a sore throat.
I have a sore throat.
Therefore, I have the flu.
But of course there are possibilities other than P (flu). A cold or strep throat, for example. We’re likely to make the error if people are dropping like flies from the flu, always with a sore throat as a symptom, and nothing much else is going around. We’re much less likely to make the error if the flu, colds, and pollen allergies are all going around at once.
Argument B above was: if died of cancer, then malignant tumor; didn’t die of cancer; therefore didn’t have malignant tumor. This is called the inverse error. The form of this invalid argument is: If P then Q; not P; therefore not Q. We make this error a lot, too.
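Before looking at everyday versions of the inverse error, you can verify for yourself which of the four forms just discussed are valid by brute enumeration. In the minimal sketch below, “valid” simply means that no assignment of truth values makes every premise true and the conclusion false; the encoding of premises and conclusions as little functions is mine.

from itertools import product

def valid(premises, conclusion):
    # A form is valid if no row of the truth table has all premises true and the conclusion false.
    return all(conclusion(p, q)
               for p, q in product([True, False], repeat=2)
               if all(premise(p, q) for premise in premises))

if_p_then_q = lambda p, q: (not p) or q

print(valid([if_p_then_q, lambda p, q: p],     lambda p, q: q))        # modus ponens: True
print(valid([if_p_then_q, lambda p, q: not q], lambda p, q: not p))    # modus tollens: True
print(valid([if_p_then_q, lambda p, q: q],     lambda p, q: p))        # converse error: False
print(valid([if_p_then_q, lambda p, q: not p], lambda p, q: not q))    # inverse error: False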
INVERSE ERROR 1
If it’s raining, then the streets must be wet.
It’s not raining.
Therefore, the streets must not be wet.
If we live in a city where the street sweepers operate frequently (thereby making the streets wet), or if it’s a blazing summer day in a city where fire hydrants are sometimes opened to cool people off, we’re less likely to make the error. If we live in rural Arizona, with no street sweepers and no fire hydrants, we’re more likely to make the error.
INVERSE ERROR 2
If President Obama is Muslim, then he’s not a Christian.
President Obama is not Muslim.
Therefore, President Obama is a Christian.
The conclusion would be valid if we had tacit knowledge operating as an additional premise to the effect that people can only be either Muslim or Christian. We don’t believe that, of course, but we may have gotten into a mood of thinking these are the only alternatives for Obama; for example, if the only alternatives for Obama’s religion ever discussed were Muslim and Christian.
An interesting and important thing to know about the converse and inverse errors is that their conclusions are invalid only as deductions. (That is, they don’t follow logically from their premises.) They may, however, be pretty good inductive conclusions. (That is, if the premises are true, the conclusion is more likely to be true.) It’s more likely that I have the flu if I have a sore throat than it is if I don’t have a sore throat. If it’s not raining, it’s less likely that the streets are wet than if it is raining. The plausibility of the inductive conclusion in these cases contributes to making the invalid deductive conclusion plausible as well.
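The same point can be made with numbers. Here is a toy calculation in the spirit of Bayes’ rule, with probabilities I’ve invented purely for illustration: a sore throat doesn’t prove the flu, but it does make the flu a better bet than it was before.

# Invented numbers, for illustration only.
p_flu         = 0.10      # base rate of flu
p_sore_if_flu = 0.90      # chance of a sore throat given flu
p_sore_if_not = 0.20      # chance of a sore throat without flu (colds, strep, dry air...)

p_sore = p_sore_if_flu * p_flu + p_sore_if_not * (1 - p_flu)
p_flu_given_sore = p_sore_if_flu * p_flu / p_sore

print(round(p_flu_given_sore, 2))    # 0.33: far from certain, but well above the 0.10 base rate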
The lists of argument forms and logical errors are very long. But these are some of the most common and important errors.
Pragmatic Reasoning Schemas
The abstract version of the conditional—if P then Q—is difficult to use. We do reason in accord with conditional logic all the time, but rarely by applying a completely abstract version of it. Instead, we’re more likely to use what I call pragmatic reasoning schemas, that is, sets of rules useful for thinking about everyday life situations.1 This book is full of such schemas. In fact, at some level, that’s what the book is all about. Some of the schemas map directly onto conditional logic. These include, for example, the schema distinguishing between independent and dependent events and the principle that correlation doesn’t prove causation. The sunk cost principle and the opportunity cost principle are deductively valid and can be derived logically from the principles of cost-benefit analysis. Economics courses teach these principles, though not as well as they could because they’re typically not very good at showing how the formal principles can be used pragmatically for everyday reasoning.
Some pragmatic reasoning schemas map onto conditional logic but fall short of being deductively valid because they don’t guarantee a correct answer. In fact, they’re not concerned with truth or validity at all but with assessing whether a person’s conduct is proper. This branch of logic is called deontic, from the Greek deon, meaning duty. It deals with what kind of situation constitutes an obligation, what kind gives permission, what’s optional behavior, what’s beyond the call of duty, and what ought to be done. Contractual schemas are one type of deontic schema that can be used to solve a wide range of problems related to permission and obligation.
The deontic schema that’s necessary for getting the drinking age problem right is called the permission schema.2 You want to drink (P)? Better be twenty-one (Q). Not twenty-one (not Q)? Better not be drinking (not P).
A kindred schema is the obligation schema.3 If you’re eighteen (P), you must register for the draft (Q). Didn’t register for the draft (not Q)? Then you’d better not be eighteen, or you’ve failed to meet your obligation.
Two years of law school improves deontic reasoning quite a bit, but two years of graduate training in philosophy, psychology, chemistry, or medicine does nothing for this kind of reasoning.4
A second type of pragmatic reasoning schema doesn’t map onto conditional logic at all (or at least it isn’t very profitable to try such a mapping), but applies to an enormous range of situations and can be described in purely abstract terms. Logical thinking is required for using these schemas, but the logic isn’t what makes them powerful; rather, it’s their ability to shed light on everyday problems. These include statistical schemas and schemas for scientific procedures such as the randomized control design. Statistics and methodology courses teach these concepts but don’t always succeed very well in creating pragmatic schemas that are helpful in everyday life. Both undergraduate and graduate courses in the social sciences and psychology, but not in the natural sciences or the humanities, do enhance the pragmatic schemas that help in applying statistical and methodological principles to everyday problems.5 Other highly general pragmatic reasoning schemas include Occam’s razor, the tragedy of the commons, and the concept of emergence, discussed in Chapter 15.
Finally, some powerful pragmatic reasoning schemas don’t constitute abstract blueprints for reasoning but are merely empirical principles that facilitate correct solutions to a broad range of everyday problems. These include the fundamental attribution error, the generalization that actors and observers tend to explain behavior differently, loss aversion, the status quo bias, the principle that some choice architectures are generally superior to others in the quality of choices they encourage, and the principle that incentives aren’t necessarily the best way to get people to change their behavior—among dozens of others in this book.
Abstract pragmatic schemas are tremendously useful, but purely logical schemas are of limited value. One reason I believe this is that a very high civilization, namely Confucian China, never developed purely logical formalisms. It’s that civilization’s dialectical tradition, and modern additions to it, that are covered in the next chapter.
Summing Up
Logic divests arguments of any references to the real world so that the formal structure of an argument can be laid bare without any interference from prior beliefs. Formal logic, contrary to the opinions of educators for twenty-six hundred years, doesn’t constitute the basis of everyday thought. It’s primarily a way of thinking that can catch some kinds of errors in reasoning.
The truth of a conclusion and the validity of the argument that produces it are entirely separate things. A conclusion is validly drawn only if it follows logically from its premises, but it may be true regardless of whether the premises are true or whether it follows from them. An inference need not be logically derivable from other premises, but it gains a stronger claim to credence if it can be shown to have logical as well as empirical support.
Venn diagrams embody syllogistic reasoning and can be helpful or even necessary for solving some categorization problems.
Errors in deductive reasoning are sometimes made because the invalid argument forms map onto inferences that are inductively strong. That’s part of the reason we’re susceptible to making deduction errors.
Pragmatic reasoning schemas are abstract rules of reasoning that underlie much of thought. These include deontic rules such as the permission schema and the obligation schema. They also include many inductive schemas discussed in this book such as those for statistics, cost-benefit analysis, and reasoning in accord with sound methodological procedures. Pragmatic reasoning schemas are not as general as the rules of logic because they apply only in specific situations, but some of them rest on logical foundations. Others, such as Occam’s razor and the concept of emergence, are widely applicable but don’t rest on formal logic. Still others are merely empirical generalizations of great practical utility, such as the fundamental attribution error.