LESSON 3: Know When You Are Being Misled, Deceived, Manipulated, or Outright Lied To
A "lie" is only the tip of the metaphorical iceberg when it comes to things that are not true. Misinformation comes in many forms, and in this lesson, I'll show you how to spot some of the most common ways people are misled, deceived, manipulated, and outright lied to.
Opinion or commentary, not facts.
People are often misled when opinion or commentary is being sold as fact. This is often the case with political commentators who appear to have the authority of news anchors, yet can say whatever they want without having to support their claims or preface them with "in my opinion." There is nothing wrong with opinion or commentary, as long as we recognize it as such and don't mistake it for fact reporting.
Selective reporting.
Media outlets lie by omission, revealing their biased worldview by running only stories that comport with their agenda. With this technique, we can make anyone look like a monster or a saint without technically lying. Sometimes stories cannot be ignored, but media outlets can present just one side of the story and do so in a biased way. They can hold back relevant facts and tell "half-truths." Similarly, one can focus on what did happen and leave out what didn't happen, or vice versa.
Misleading proportionality.
Media outlets can give the impression that legitimate controversy exists on issues where there is an overwhelming scientific consensus. They do this by giving "equal time" to those with outlier ideas. For example, although roughly 97% of climatologists agree on climate change, a media outlet might focus on the 3% who don't. This technique also helps cast doubt on facts. By giving equal time to people who deny facts, "alternative facts" are born.
Creating stereotypes.
A very common form of media manipulation is when an author suggests that one person's actions are representative of their group when this one person is an outlier. For example, a conservative press might run a story about a feminist who thinks all men should be killed and suggest, through careful wording, that this one person is representative of all feminists. This also occurs when qualifiers such as "all," "most," "some," "none," or "every" are conveniently ignored when making a claim about a group. For example, "Republicans hate Mexicans." All Republicans? Without a qualifier, either "all" or "most" can be assumed, when sometimes it should be "some," "a small percentage of," or even just "a couple."
Ambiguity.
The use of ambiguous, emotionally charged words with little or no substance is a common method of deceit and manipulation. This is like a politician emotionally shouting, "I support freedom and the American way of life!" What the heck does that mean? How does that translate to actual policies? Perhaps she means she wants to free all prisoners? Perhaps she means she wants to lock up many more people so other citizens can be "free" from the fear of harm? As long as specifics are left out, people are free to read in their own meanings, which very often leads to political polarization.
Suggest, imply, but don't directly state.
"Are you a doctor?" "Wow, you are very perceptive!" "What kind of doctor are you?" "I'm not a doctor." "But you said you were a doctor?" "No, I didn't. You said I was a doctor. I just said that you were perceptive." Suggesting and implying is common in most forms of communication as a "shortcut", but this is also often abused to absolve the communicator from any legal responsibility while still indirectly making their claim. Don't be afraid to ask for clarity or an affirmative or negative statement related to your question, and don't stop asking until you get a clear answer.
Ambiguous pervasiveness.
If one wants something to appear more frequent or pervasive than it actually is, one just needs to make an ambiguous statement such as "People all over the world believe in a flat earth." Technically, this is true. There might be a few people on each continent (and hundreds in America) who believe this. But "people all over the world" sounds like there are more of them than there actually are. Another example is saying something such as "Animosity against postal workers is a problem." Perhaps two postal workers were attacked in the last year because they were postal workers. Sure, even one attack is too many, but stating "Animosity against postal workers is a problem" implies much greater pervasiveness than is reflected in reality.
Labeling.
Don't be so quick to accept labels others apply to people and ideas. If people are called "racist" or "bigoted," and such labels aren't immediately justified with evidence, ignore the labels rather than parroting them. The same goes for labels given to ideas and events. Usually, these labels are given out of emotion and not reason. Perhaps that recently decided court case really was a "miscarriage of justice," but if the information source got lazy and didn't bother telling you why, find out for yourself.
Deflecting.
This is a common technique in debate or in an adversarial interview where the interviewer asks the interviewee a question, and the interviewee "deflects" by asking the interviewer a question, changing the subject, or otherwise simply not answering the question. Good interviewers won't let their interviewees get away with this, but many do. Don't be satisfied with a non-answer, and especially, don't assume the answer you want to hear.
Exaggeration or minimization.
Some people are more prone to exaggeration than others. Exaggeration is usually signaled by ambiguous words such as "greatest," "best," or "smartest," but it can also take the form of falsifiable claims such as "I won by the biggest margin ever." Minimization is the opposite of exaggeration: the enormity or seriousness of something is grossly understated.
“Just joking.”
Humor is wonderful, and sites like "The Onion" are hilarious. The problem arises when humor becomes passive-aggressive or when it is unclear that humor is being used. When people are proven wrong, claiming they were "just joking" or "being sarcastic" is one way to avoid taking responsibility for their false claims.
Puffery.
Puffery is frequently used in marketing. It is defined as "exaggerated commendation especially for promotional purposes." So when you see a sign that says "World's best cup of coffee," don't be like Buddy the Elf and assume that it really is the world's best cup of coffee. It's probably just a crappy cup of coffee.
Accusation with no evidence.
Unfortunately, this is very common, especially with less-reputable media sources or individuals who aren't known for their honesty. This is common because an accusation alone can change the minds of the public, regardless of the facts. People have had their lives destroyed by being falsely accused of a heinous crime. When the person is vindicated, very often the visceral disgust that was felt for the person does not go away. Don't accept accusations lightly—demand evidence.
Quotes out of context / fake quotes.
Headlines that interpret a person's words while still using quotation marks can be incredibly deceptive. For example, "Politician X: murdering people is fun." This headline clearly implies that politician X said that murdering people is fun. So what did politician X actually say? Once you dig deeper (which very few people do), you will see that the actual quote was "Violent crime is a major problem that needs to be addressed. Many criminals murder people because it gives them an adrenaline rush—a kind of high without needing drugs." Not quite the same, is it?
Claim causation where there is only correlation.
This is common with the social justice movement, where "racism," "sexism," "transphobia," "xenophobia," or other -isms and phobias are said to be responsible for actions or problems where no such causation has been established.
Use of anecdotes.
In science, anecdotes are among the worst forms of "evidence," yet outside of science, they are used all the time because they have a strong emotional appeal. Facts don't stand a chance against a strong, emotional anecdote. For example, evidence strongly supports the safety and necessity of vaccines. But all it takes is one story of a child who had an adverse reaction to a vaccine to change public opinion on the overall safety of vaccines.
What other ways do providers of information mislead, deceive, manipulate, and outright lie to us? Think about this question the next time you're surfing the Internet, going through your social media feeds, watching television, reading the newspaper, or otherwise consuming information. Critical thinking requires that you don't automatically accept information, but that you apply a healthy dose of skepticism when it comes to deciding which information to accept, and which to reject. The more you practice this, the better you'll get.