2

THE INFORMATION BEHAVIOR OF IT ALL

Information doesn’t exist in a vacuum; rather, it is surrounded and shaped by context, both internal and external. In addition to the context produced by the financial and business dimensions of the media, the concepts of post-truth and truthiness emphasize that a wide-ranging spectrum of motivations and emotions drives everyday information consumption. Information-seeking, information selection, information avoidance, and information usage (all part of the information behavior continuum) contribute to our understanding of how information is consumed on a daily basis, and help explain why consumers are susceptible to fake news.

LEARNING THEORY

A brief discussion of learning theory is appropriate here, as it provides a meta-level understanding of how people acquire and absorb the information around them. The subsequent discussion of information behavior is informed by these larger concepts of learning and understanding.

In his work The Three Dimensions of Learning (2002), Knud Illeris positions learning at the intersection of internal and external cognitive, emotional, and social learning processes. Drawing on the fields of education, psychology, and management, Illeris builds his model on two fundamental assumptions. First, learning involves two distinct processes: an internal psychological process in which new information is acquired and added to existing knowledge, and an external process in which the individual’s information acquisition is shaped and influenced by their interactions with their environment. Second, the learning that occurs during these internal and external processes encompasses three socially situated contexts: the cognitive domain of knowledge acquisition, the psychological dimensions of emotion and motivation, and the social domains of communication and cooperation.

Similarly, Char Booth (2011) has written about reflective teaching and learning and suggests that there are four factors of learning: memory, prior knowledge, environment, and motivation (42–46). In this typology, memory can be connected to the processing that one goes through when acquiring, filtering, and absorbing new information; information overload, which occurs when too much information is acquired and subsequently rejected, is also related to information processing. Prior knowledge refers to an individual’s existing mental schemas and shapes the way in which new information is accepted or rejected; rigid mental schemas, for example, can cause cognitive dissonance and reinforce confirmation bias and filter bubbles (more on this later in the chapter). An individual’s prior knowledge can also indicate their readiness, or lack thereof, to receive new and/or conflicting information. Environment refers to the physical or mental factors that can influence information acquisition or rejection; both anger and hunger, for example, could prevent an individual from being receptive to new information. The final factor, motivation, refers to the intrinsic and extrinsic factors that influence the procurement or dismissal of new information; one individual might be extrinsically motivated to learn new facts when faced with an academic exam, while another might be intrinsically motivated to learn sign language in order to connect with a hearing-impaired friend.

What is important to note about the models presented by Booth and Illeris is that they emphasize both the cognitive and the affective dimensions of learning and information acquisition. Learning, and consequently an individual’s information behavior, is complex, multifaceted, and dynamic. It is no wonder that the fake news phenomenon is so complicated and challenging to address and resist. The following sections address two of the primary cognitive aspects of fake news, and then several of its affective aspects.

MISINFORMATION/DISINFORMATION

The two cognitive dimensions of information behavior that are most applicable to fake news are misinformation and disinformation. Misinformation and disinformation (mis/dis) can be thought of as two sides of the same coin. Misinformation is simply information that is incomplete (Fox 1983; Losee 1997; Zhou and Zhang 2007), but it can also be defined as information that is uncertain, vague, or ambiguous. However, misinformation may still be “true, accurate, and informative depending on the context” (Karlova and Lee 2011, 3). The Oxford English Dictionary defines “disinformation” as “the dissemination of deliberately false information,” a concern that is especially acute when the information in question is likely to be broadly and quickly disseminated, such as information on the Internet. Fallis (2009, 1–3) provides a more nuanced definition by suggesting that disinformation is carefully planned, can come from individuals or groups, can be circulated by entities other than its creators (e.g., misinformation spread by a news organization), and is typically written or verbal information. Hernon concurs, warning that “we can put quotation marks around anything and change meaning,” and that mis/dis spreads so easily because “the person doing the misuse might only be guilty of making something publicly available, through a listserv or electronic journal or newsletter, without checking the original source” (1995, 136). The key to disinformation is that it is created with malicious or ill intent, although it can also be motivated by benevolence (e.g., little white lies meant to spare hurt feelings, or lying about a surprise) (Rubin 2010; Walczyk et al. 2008). In such cases, it is context that enables an individual to begin to make sense of the mis/dis (or information in general) being presented to them.

Because mis/dis is related to notions and discussions of credibility, trustworthiness, and deception, it can be hard to discern the motivations behind this type of erroneous information-sharing. These motivations are especially hard to discern in the online environment, where there is an abundance of information (both accurate and inaccurate) and often a lack of the visual and aural cues that in real life might alert an information consumer that something is amiss or false. Given the ubiquity of technology in today’s world, it is particularly important to be conscious of mis/dis, not only because it inhibits collective comprehension and intelligence, but because it can do real harm by prioritizing and upholding biased, misleading, or false agendas and opinions (i.e., propaganda). Zhou and Zhang (2007, 804) state: “with the growing use of Internet and ubiquitous information access, misinformation is pervasive on the Internet and disseminated through online communication media, which could lead to serious consequences for individuals, organizations, and/or the entire society at large.”

AN EMOTIONAL DIMENSION OF INFORMATION BEHAVIOR

In addition to considering mis/dis as part of the information consumer’s cognitive processing of fake news in the post-truth era, it is especially important to recognize the emotional, or affective, components of mis/dis; it is the affective dimension of learning and information behavior that enables us to understand how and why fake news has become so pervasive and hard to displace. One of the hallmarks of the post-truth era is that consumers will deliberately pass over objective facts in favor of information that agrees with or confirms their existing beliefs, because they are emotionally invested in their current mental schemas or emotionally attached to the people or organizations that the new information portrays. The affective dimension of information-seeking and usage circumvents the cognitive processes of information-gathering and selection. Among the examples of affective information behavior to be aware of are confirmation bias, filter bubbles (also known as echo chambers), information overload, satisficing, and information avoidance.

It’s easy for anyone, even information professionals, to become overwhelmed and overloaded by the sheer volume of information presented to us on any given day over the Internet and other forms of communication. Compounding that volume is information charged with political issues and potentially life-altering societal problems. The 2016 presidential election was such a time, no matter one’s political party affiliation or leanings; information-seeking and use in such a fraught environment is stressful. The information behavior researchers Donald Case and Lisa Given (2016) suggest that information-seekers during political campaigns may be “actively open to receiving new information and receiving it through serendipity, in an intense and condensed period of time” (30). Such information is often not as comprehensive or as rigorously vetted as one might prefer because the topics are so complex and “have such a wide range of opinions associated with them” (30). And “as the number of information items increases—or as the amount of available time decreases—people resort to simpler and less reliable rules for making choices to shorten their research time” (102). Contextually speaking, this information may also be shaped by personal experiences and viewpoints, and by a multitude of information sources “including news broadcasts, newspapers, magazines, the Internet, social media, and many personal conversations” (30) that may vary widely in depth and clarity.

Social media plays a significant role in information overload because it facilitates the rapid dissemination of information, fake or otherwise. Stories can be shared instantaneously, whether they have been read or not; the accepted online shorthand TL;DR (“too long; didn’t read”), for example, gives people license to share and comment on content they have not actually read, much less evaluated (Gil 2016; Dictionary.com n.d.). The instant gratification associated with sharing online stories, “liking” something first, and collecting friends’ reactions also encourages the dissemination of fake news.

Social media also encapsulates users in filter bubbles. Filter bubbles (or echo chambers) are the result of the careful curation of social media feeds, which surrounds users with like-minded people and with information aligned with their existing beliefs. Filter bubbles are further aggravated by confirmation bias, the tendency to actively seek and use information that already concurs with one’s existing mental models, prior knowledge, and memories, rather than seeking information from a variety of potentially conflicting sources. It is very easy for people to avoid distasteful, upsetting, or simply incongruent information while inside their social media filter bubbles. Filter bubbles are an example of selective exposure, or selective information-seeking, which is defined as the predisposition to “seek information that is congruent” with “prior knowledge, beliefs, and opinions, and to avoid exposure to information that conflicts with those internal states” (Case and Given 2016, 115).

Hand-in-hand with selective exposure is information avoidance. When purposeful choices are made about what information is obtained and used, purposeful decisions are also being made about what information is disregarded, evaded, or rejected in order to maintain existing states of belief (Case and Given 2016, 117). A final piece of the decision-making process, as it relates to information overload, selective exposure, and information avoidance, is satisficing. Satisficing is selecting information that is “good enough” to satisfy basic needs (36) or “choosing the first ‘acceptable answer’ to a question or solution to a problem” (102), “even if it means accepting a lower quality or quantity of information” (194). Satisficing could be a result of intellectual laziness, of being unwilling or unable to deal with information overload, or of lacking the information evaluation skills needed to reliably source information. Whatever the reason, satisficing contributes to the spread and inescapability of misinformation, disinformation, and fake news by allowing low-quality information to remain in circulation and be disseminated; it may not be the best information, but it is “good enough” not to be questioned or challenged.