Emotion and humor as misinformation antidotes

Edited by Czerne M. Reid, University of Florida, Gainesville, FL, and accepted by Editorial Board Member Susan T. Fiske January 25, 2021 (received for review February 10, 2020)
April 9, 2021
118 (15) e2002484118

Abstract

Many visible public debates over scientific issues are clouded in accusations of falsehood, which place increasing demands on citizens to distinguish fact from fiction. Yet, constraints on our ability to detect misinformation coupled with our inadvertent motivations to believe false science result in a high likelihood that we will form misperceptions. As science falsehoods are often presented with emotional appeals, we focus our perspective on the roles of emotion and humor in the formation of science attitudes, perceptions, and behaviors. Recent research sheds light on how funny science and emotions can help explain and potentially overcome our inability or lack of motivation to recognize and challenge misinformation. We identify some lessons learned from these related and growing areas of research and conclude with a brief discussion of the ethical considerations of using persuasive strategies, calling for more dialogue among members of the science communication community.
Many visible public debates over scientific issues are clouded in accusations of falsehood, which place increasing demands on citizens to distinguish fact from fiction. Doing so is challenging, as facts, half-truths, and falsehoods can seem indistinguishable (1). This is especially the case in a media environment where a consumer can pick and choose from numerous channels, formats, and types of information (2). Others in PNAS (3, 4) have dissected and quantified the scope of the misinformation problem, and we will not repeat those arguments here. It is sufficient, for this perspective, to assert that fake news is a salient issue in media today (e.g., refs. 5 and 6), and the challenge presented by science misinformation and misperceptions deserves attention.
In this essay, we first examine the constraints on our ability to detect misinformation that, when coupled with our inadvertent motivations to believe false science, result in a high likelihood that we will form misperceptions. We briefly engage with ability and motivation, drawing primarily from research in cognitive psychology. Then, extending prior research and discussions from previous colloquia (e.g., ref. 3), we focus our perspective on an important but relatively understudied area of research in science communication: emotion and humor.
As antiscience claims often appeal to emotions (e.g., ref. 7), a better understanding of the role of emotions in science communication can advance not only how we communicate and engage public audiences with science but also how we address misinformation. To this end, recent research on humor sheds light on how funny science can potentially combat misinformation and misperceptions through various mechanisms. We conclude with some lessons learned from this growing body of research and briefly touch on the ethical considerations, calling for more discussion about this area of science communication.

Our Ability to Recognize Misinformation Is Limited

In a 2016 survey, nearly 25% of adults in the United States said they shared inaccurate information on social media (8), a figure that is likely an underestimate given social desirability bias in self-reporting. Our ability to recognize and avoid misinformation is curtailed not only by overwhelming amounts of information and the nature of the scientific content we encounter online (2) but also by individual characteristics (e.g., science knowledge, media literacy) and structural constraints (e.g., local and regional news deserts).
Knowledge about basic science facts plays a role in our ability to parse accurate information from falsehoods and half-truths (e.g., ref. 9). But the level of science knowledge among US adults has been fairly stagnant for at least a decade; in 2018, American adults correctly answered about 5.5 of 9 true-or-false questions about basic science facts (10). Moreover, factual knowledge is not the only predictor of people’s perceptions about science (11), and merely filling the deficit in public knowledge is unlikely to remedy (mis)perceptions (12).
While knowledge about basic science facts is one aspect of science literacy, the term also encompasses an understanding of the practice of science and its role as a social process (13). As others have pointed out (3), knowledge about the practice and process of science, not just basic facts, is likely to be more relevant in the context of science misinformation. In the United States, these types of literacy have also been relatively unchanging. In 2018, only about 43% of adults correctly responded to several questions that measure understanding of the process of scientific inquiry (10).
Media literacy also augments our ability to evaluate information. Like science literacy, media literacy is a complex concept that generally refers to the ability to analyze and evaluate information (14, 15), much of which we now access online. Evaluating online information includes considering strategies used to create content; identifying a media producer’s purpose and perspective; recognizing the social, political, and historical contexts in which information is created and consumed (14); and determining credibility (16). Media literacy equips us with the ability to negotiate meaning and engage with information that is available in a variety of media formats. Yet, media literacy education in the United States has lagged behind that of other developed nations (17–19). Moreover, some argue that current media literacy education efforts focus on little more than familiarity with online and digital media tools that emphasize creativity and social connectivity (20). Education efforts that hone so-called “tool competence,” while useful, are unlikely to improve our ability to detect and avoid misinformation.
Media and science literacy are inextricably linked; media coverage of science issues is part of the social process of science (13). We need greater attention to media literacy education that focuses on critical engagement with media and its impacts on society. Better media literacy education coupled with an understanding of science literacy that includes considerations of media presentations of science (e.g., ref. 21) has the potential to improve our ability to discern science facts from falsehoods.
There are also structural constraints on our ability to detect misinformation. Two factors stand out: the shrinking of traditional science journalism and the prevalence of news deserts across the United States. Journalists play a key role in public understanding, legitimization, and support of science through their coverage of scientific topics in media (22), yet this practice faces increasing challenges. Over the last decade, the traditional news industry has lost readers, influence, and advertising revenue, leading the industry to decline in both size and average salary (23).
Science news is typically less popular than other topics in media (e.g., politics, sports), and this low priority has intensified in a digital environment, resulting in fewer science journalism careers (22). In addition, increasing demands are placed on science journalists, with constant, tighter deadlines across multiple media platforms (24) and an emphasis on simplifying material for audience consumption (25). Rather than the watchdog or gatekeeper roles that journalists traditionally held, many science journalists today find their role in science news coverage evolving into one that requires public relations skills to navigate politicized issues, polarized debates, and interest-driven coverage (26). With fewer journalists working faster and for less money to produce bite-sized stories in an oversaturated media market, the quality of science information that public audiences receive suffers.
Moreover, since 2004, more than one in five newspapers in the United States has closed its doors permanently, and others have switched to a completely digital format, leaving many communities without a local newspaper (27). The departure of local newspapers produces news deserts, or communities with no coverage of local news (28). For many, this information absence is filled by the Internet. Today, most individuals obtain scientific information online (10), a trend even more prominent among so-called digital natives (12). While the Internet allows for public communication about science in novel ways, the information provided in this participatory media environment does not always face the scrutiny upheld by established journalistic norms. As such, individuals other than scientists, such as politicians or religious leaders who may hold contrary opinions, may challenge facts supported by scientific consensus (29). The declining presence of traditional science journalism and the spread of news deserts underscore that citizens’ ability to identify credible information is not exercised in a vacuum and can be threatened by structural, as well as individual, constraints.

We Often Lack Motivation to Parse Misinformation

Scholarship in the basic sciences of human cognition offers abundant evidence that the ways in which we seek and process information are not conducive to discerning misinformation. Many scientific issues that society faces are complex and novel to the average information consumer. Thus, it requires significant effort by citizens to make sense of the information necessary to thoroughly understand a single scientific issue on the public agenda (30).
To manage the deluge of information, we rely on mental shortcuts, or heuristics, that reduce complex cognitive tasks into simple operations that enable us to make judgments and form opinions about scientific issues society faces. Abundant empirical evidence shows that we use heuristics in the context of science. Predispositions such as political ideology and religious values are employed when we seek information (31) and form opinions about issues ranging from nuclear energy (32) to nanotechnology (33). Online, heuristics become helpful tools that allow us to efficiently make sense of information in an often overwhelming environment; we are constantly inundated with complex science messages from a myriad of sources and in diverse formats.
In addition to helping us sift through large amounts of information, we use these mental shortcuts in our evaluation of information (30, 34, 35). We are motivated skeptics engaged in motivated reasoning; we process information in unconsciously biased ways (36). As others have pointed out (4), this mechanism can explain both the difficulty of detecting misinformation and the challenge of correcting misperceptions.
Misinformation is often packaged in simplistic and emotional formats (37). Stories containing misinformation are often framed as clickbait, with sensational titles that capture attention. Indeed, extant scholarship indicates that emotions such as anger tend to favor biased processing of misinformation, resulting in attitude-consistent misperceptions (38). Such mechanisms encourage our acceptance of misinformation without much cognitive effort. Perhaps because the scientific endeavor is traditionally viewed as cold, rational, dispassionate, and objective, we have overlooked the role of emotion in the formation of science opinions and attitudes. Yet, appeals to emotional reactions are often used in the framing of false information (7). This “cold” view of science is seemingly at odds with “hot” topics like emotion. But this constructed dichotomy fails to account for decades of research in the social sciences; emotion is a fundamental part of almost all human actions and decisions (39).

The Role of Emotion in Science Communication

Emotions are subjective feeling states that result from appraisal of a situation and give rise to approach or avoidance motivational processes. Functional emotion theorists argue that emotions arise from meaningful interpretation of an object (e.g., a scientific message). In other words, emotions are the result of meaning making that gives rise to action tendencies [i.e., approach or avoidance responses (40)]; each emotion has a core relational theme that guides our responses (41). Approach motivation is typically connected with incentive and reward, while avoidance is associated with aversion and threat (42). For example, fear is experienced in response to a physically or psychologically threatening object, resulting in an avoidance motivation (43, 44). In contrast, anger that results from appraisal of an object or message is associated with approach action tendencies, as individuals are motivated to defend themselves or rectify a perceived wrong (45). These appraisal tendencies are implicit predispositions used to evaluate future stimuli (46) and can affect depth of information processing (47–49) and thought content (50, 51). Emotions, therefore, are likely to influence people’s attitudes toward science and their risk judgments.
Although emotional appeals and affect have a long history of study in the context of health communication, emotions can also influence our attitudes toward scientific issues and how we process scientific information (e.g., refs. 52–55). For example, disgust elicited by a message about fecal microbiome transplants can increase people’s risk perceptions (55) and influence their attitudes toward policy and regulation (54). Fear and anger toward videos from the Discovery Channel’s Shark Week have also been found to drive shark conservation behaviors (53). Emotions have likewise been examined as potentially strengthening cognitive strategies such as gain-versus-loss framing, in which information is presented in terms of gains or losses that result from engaging in a behavior (56, 57). Using the context of sea star wasting disease, Lu (52) found that gain-framed messages containing a sadness appeal (relative to loss-framed messages and hope appeals) increased proenvironmental behaviors, policy support, and information seeking among individuals. Others have found that gain-framed messages that evoke hope can influence people’s attitudes toward climate advocacy and policy (58). Theoretical approaches to the study of emotion in science communication, including the cognitive functional model (59), highlight mechanisms that potentially explain why rectifying misperceptions remains a challenge.
Strong emotions can impair our ability to process science information rationally (49). If processing ability is impaired, we generally resort to using mental shortcuts, or heuristic processing, to make sense of new information. Then, if a science falsehood aligns with our priors, heuristic processing impairs our ability to detect misinformation, while increasing the possibility of acceptance. If processing ability is not impaired, whether we adopt systematic or heuristic processing depends on the availability of mental shortcuts. If these shortcuts are present, and we are motivated to engage with the information and expect it to satisfy an emotion-induced goal, then we are more likely to process information heuristically. In the same state of motivation and goal expectation, the absence of mental shortcuts makes it more likely that we will process the information systematically. In the latter case, priors and predispositions can serve as moderators of the resulting attitude, judgment, or (mis)perception. If our priors lead us to accept misinformation, the misperceptions that result are likely to be long lasting and relatively stable.
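The conditional structure of the dual-processing account above can be summarized schematically. The sketch below is our own illustrative simplification, not an established model implementation: the function name and boolean inputs are hypothetical stand-ins for what are, in reality, graded and interacting psychological factors.

```python
def processing_mode(ability_impaired, shortcuts_available, motivated_with_goal):
    """Schematic sketch of the dual-processing account described above.

    Returns "heuristic" or "systematic". All inputs are simplified
    booleans; actual processing depends on graded, interacting factors.
    """
    if ability_impaired:
        # Strong emotion impairs processing ability, so we fall back
        # on mental shortcuts regardless of other conditions.
        return "heuristic"
    if shortcuts_available and motivated_with_goal:
        # Shortcuts are present and we expect the information to
        # satisfy an emotion-induced goal.
        return "heuristic"
    # Otherwise, information is processed systematically; priors and
    # predispositions then moderate the resulting attitude or
    # (mis)perception.
    return "systematic"
```

In either branch, the account holds that priors aligned with a falsehood make acceptance, and thus stable misperceptions, more likely.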
A more recent theoretical framework is the emotional flow hypothesis (60, 61). While most communication research on emotion has focused on how a primary emotion affects downstream attitudinal and behavioral outcomes, message content can induce a series of emotional responses (62). This so-called emotional flow is defined as “the evolution of the emotional experience during exposure” to a message (60). Although primarily proposed and examined in the context of health messaging (e.g., refs. 63 and 64), an initial empirical test of emotional flow has been conducted in the context of climate change (58). Using gain-versus-loss framing coupled with threat and efficacy messages presented in succession, Nabi et al. (58) found that climate change messages designed to first elicit fear, then hope, were more effective in encouraging advocacy behavior when compared to messages that lacked emotional sequencing structure.
Although the evidence is still sparse, the emotional flow hypothesis might offer a means of correcting misinformation. One study found that a narrative containing corrective information with an emotional ending was more effective at rectifying attitudes than a corrective narrative without one (65). Even though this study did not test the emotional flow hypothesis specifically, these findings are promising, as shifts in emotion are central to narratives and storytelling (61).
The effect of emotions on the detection and acceptance of misinformation, the formation of misperceptions, and their correction is not straightforward. Indeed, the mechanisms reviewed and proposed require further empirical testing. Additional research in this area can shed much-needed light on how emotional appeals and affective reactions to science information might limit or enhance our ability and motivation to address misinformation. Advances in this area will complement existing research on the cognitive mechanisms associated with misinformation and the correction of misperceptions.

Funny Science: How Humor Influences Science Attitudes

Related to our understanding of the role of emotions and emotional shifts is the use of humor in science messaging. Humor is derived primarily from surprise, as incongruity often plays a role in the elicitation of humor (66, 67), and can, if one gets the joke, result in amusement or mirth. Humor and emotion have long been intertwined; emotional events are often retold or framed in humorous ways (68), and humor is regularly used in interpersonal emotion management (69). Today, we often see humorous content about current scientific issues that are emotionally charged (e.g., memes about mask wearing to prevent the spread of the coronavirus). Establishing a better understanding of humor, including its relationship to discrete emotions and the mechanisms that underlie shifts in emotion when we encounter funny, yet emotional, science content, is necessary to improve our understanding of the effects of humor and how it can be used in the practice of communicating complex scientific issues.
Humor is ubiquitous in daily life. We see funny messages in television advertisements (70, 71), and almost 30% of Americans say they learned something about politics from satirical programs such as The Daily Show, The Colbert Report, and Saturday Night Live (72). Humor is also prevalent in science. A recent content analysis of science content on Twitter and Instagram found that satire, wordplay, and anthropomorphism were relatively commonplace (73). The ubiquity of humor makes it an ideal subject of inquiry, as it allows researchers to examine theories of science communication in real-world settings, a research agenda that has been emphasized in a recent report of the National Academies (74).
In an era of (mis)information, humor has the potential to serve as a defense against falsehoods, but a better understanding of how humor influences public attitudes and decision-making is necessary. So far, research examining the use of humor to correct misinformation is inconclusive, though encouraging. Vraga et al. (75) compared the effectiveness of humor- vs. logic-based corrections of misinformation on Twitter and found that, of the three issues examined (climate change, HPV vaccinations, and gun control), only corrections about HPV vaccinations reduced misperceptions; for that issue, both humor- and logic-based corrections were effective. In a related study using eye tracking, researchers found that humor directed audiences’ attention to both the misinformation and the visual designed to correct it (76). Attending to the corrective image reduced people’s perceptions of the credibility of the misinformation and, indirectly, reduced misperceptions. Other research on Facebook has shown that fake news from a source that self-identifies as a satirical outlet can potentially reduce misperceptions by reducing perceptions of credibility (77).
While the evidence is far from unequivocal, these studies highlight why humor can be a valuable tool. First, it can serve as a means of drawing attention to issues to which audiences might not otherwise attend (e.g., refs. 78 and 79). Humorous messages also direct a viewer’s attention to information embedded within their content, which may be a result of the viewer marshaling cognitive resources to “get the joke” (67). In particular, visual forms of science humor (e.g., memes, comics) have the potential to capture attention (80), and some studies show that humor can also improve problem-solving skills and learning (81), although more research in this area is necessary. More importantly, humor impacts how we process information (e.g., ref. 82) and form attitudes and behavioral intentions (e.g., refs. 83 and 84).
Clearly, humor is already used to communicate science; scholars even recommend using humor for this purpose (85, 86). Yet, humor’s effects on people’s attitudes toward science and scientists largely remain an open empirical question. Studying humor in the context of science is integral to its application in practice. However, this is an emerging area of scholarship in science communication, and we look to areas with a longer history in the study of humor (e.g., education, advertising, political communication) for applicable insights.

Humor’s Effect on Source Evaluations

Audience perceptions of a communication source have long been recognized as important factors that impact the effectiveness of communication (87). Among the desirable attributes of a source, trustworthiness and likability play decisive roles in the persuasive impacts of messages. Trust has long been shown to affect people’s attitudes toward science (e.g., refs. 88–91). Although trust is a broad concept that can be measured in a variety of ways, source credibility is a common feature of numerous conceptualizations (e.g., refs. 92–95). To improve detection of misinformation and guard against misperceptions, then, we must consider the credibility of sources of scientific messages.
Related to credibility, source likability is typically conceptualized as an affective evaluation linked to an object (96). It is associated with traits that make a person likable in a general sense but are not necessarily relevant to the person’s expertise or credibility (97). Research has consistently shown that more-likable communicators are more likely to influence audiences’ views, even when their intention to persuade is explicitly expressed (98, 99). Taken together, source likability and credibility have potentially impactful roles as preventative and corrective measures against misinformation.
Humor has long been linked to source evaluations (100, 101). In advertising, its effects on source evaluations often depend on factors such as humor type (102). In education research, the relationship between humor and source evaluation is more consistent; humor is linked to more-positive evaluations of teachers (103, 104). In interpersonal communication research, inoffensive humor has been associated with attraction and building of rapport between individuals (105). When someone makes another person laugh, the recipient associates the source of humor with the pleasure of laughing. As a result, they view the source as more likable (106). In general, funny people are rated more favorably than others, a finding that has been replicated across diverse contexts (107).
Recent research has found supporting evidence in the context of science communication. Using a science joke on Twitter, we (108) found that people who found the content amusing also perceived the scientist who posted the joke as more likable. In another experiment manipulating the presence of a laugh track in a video clip featuring a scientist performing a standup comedy routine, Yeo et al. (109) found that laughter increased audiences’ perceptions of likability and expertise of the scientist. These findings are encouraging—scientists who use humor to engage audiences appear to be more likable, and, importantly, their credibility as a scientist is not undermined.
In addition to affecting perceptions of likability and expertise, funny content can impact downstream attitudes and behavioral intentions indirectly. Not only is a scientist performing a standup comedy routine perceived as more likable and credible, but greater perceptions of expertise are subsequently associated with perceptions of comedy as a valid source of science information (109). Experiencing humor as a result of funny science content also increases people’s motivations to follow more science on social media and their intentions to share and engage with such content (84, 108). Notably, these recent works on science humor are, for the most part, conducted with jokes that tend to be benign and inoffensive. However, satire and sarcasm are prevalent in online science humor (73), and it is to this type of biting, other-directed humor we now turn.

Regarding Satire and Sarcasm

Satire is commonly found in online science content (73) and exemplified by Twitter hashtags such as #overlyhonestmethods and #fieldworkfail. These hashtags are often used by researchers to express methodological frustrations that would not be considered appropriate for scholarly publication (110). The humor expressed in this content, instead of being self-deprecating, is other directed, poking fun at the scientific process (111). Some research on humor in science and health communication has probed the effects of satire on attitudes and information processing. For example, a satirical message about the importance of the measles, mumps, and rubella (MMR) vaccine led to less psychological reactance [a motivational state in which individuals feel their freedom is threatened (112)] and reduced defensive information processing for those who held misinformed beliefs about the MMR vaccine (113). In the context of climate change, viewers of a one-sided, sarcastic message mocking people who believe climate change is a hoax reported increased risk perceptions (114) and were encouraged to engage in more elaborative information processing (115). These findings and others (e.g., ref. 116) offer promising answers to the question of whether humor can be used to accomplish strategic science communication goals, including detecting and countering misinformation.
Although these few studies offer some understanding of satire’s role in science communication, this is still an understudied area. However, we can look to its treatment in other contexts to gain insight into its role in communicating complex topics. On the one hand, research in political communication demonstrates the promise of satirical content to foster learning, engagement, and message elaboration. For example, following the 2012 election, exposure to The Colbert Report was found to increase people’s perceptions of their knowledge about super PACs, while also increasing factual knowledge of campaign finance regulation (117). Others have found that political humor can increase knowledge (118, 119), message elaboration (120), and political participation (120, 121). Applied to communicating complex science issues, finding satirical ways to present novel and intricate topics might facilitate learning and engagement among broad audiences.
On the other hand, it is easy to imagine that satire could perpetuate misperceptions in science and negatively influence people’s perceptions of information sources and scientific actors. An analysis of climate change reporting on The Colbert Report found that, even though the issue was covered ironically, some audiences (primarily conservatives) took Colbert’s message about climate change being a “hoax” at face value (122, 123). Such backfire effects can thus perpetuate the misperception that climate change is a hoax and would also be a concern for other scientific issues (e.g., vaccines). Others have found parodies of political candidates to increase the salience of caricatured traits (124) and affect perceptions of a joke’s target (125–128). Much of the extant research on satirical impersonations in political communication (e.g., Tina Fey’s portrayal of Sarah Palin on Saturday Night Live) shows that satire negatively influences people’s evaluations of political candidates. If satirical political content can negatively impact evaluations of political actors, it seems reasonable to consider that the same might occur in the context of satirical science: Does satirical science humor potentially undermine trust in scientists and other scientific professionals? This and other questions about the effects of satire in science communication remain open and should be addressed empirically.

Looking Ahead

There is no simple remedy to the problem of science misinformation. Our best and most realistic option is to deploy multiple approaches in concert with one another. To this end, a better understanding of the roles of emotion and humor in the acceptance of misinformation and the formation and correction of misperceptions offers one more resource for science communicators’ efforts against misinformation. It is crucial that members of the science communication community (e.g., trainers, practitioners, researchers) form mutual collaborations to conduct translational research that addresses the misinformation challenge we face today. We believe science communication research needs more translation: empirical observations from research can and should be turned into best practices, strategies, and interventions that improve the health of science and its role in society.
Yet, this work raises critiques about the potential ethical implications of the recommendations and best practices that result from theoretical work. One perspective asserts that it is manipulative for science communicators to use persuasive strategies to achieve better scientific citizenship (e.g., higher science literacy, proscience attitudes). Although this consideration of ethics is not new to science communication (129, 130), even a recent report on the science of science communication from the National Academies (74) remained agnostic on this issue. We need to discuss the ethics of using communication strategies when engaging publics, and a recent edited volume initiates this conversation (131). Still, the question of whether science communication informs or persuades has not been adequately addressed (132). Not engaging with the goals and related ethics of science communication, and continuing to rely on “just the facts” of scientific issues, risks allowing misinformation and misperceptions to become more pervasive in our information ecosystem.
Communication strategies are not inherently deceitful or malicious—it is how we deploy these strategies that matters. Of course, how practitioners adopt communication practices translated from research depends on many factors, including goals and intentions. If the goals for current science communication are to correct misperceptions and inoculate ourselves against misinformation, we must engage with the moral complexities and make ethically grounded decisions about whether and how to implement persuasive communication tools to meet our goals.
Although empirically driven communication techniques may be advantageous for combating ephemeral misinformation wars, another concern is that they may inadvertently undermine public trust in science and scientists. For now, opinion surveys show that the public’s trust in science is relatively high. In 2018, 44% of Americans said they have a “great deal of confidence” in the scientific community (10), second only to confidence in the US military. Given the importance of trusted sources in communication, it is vital that this confidence in the scientific community not be eroded in the effort to diminish misinformation and reduce public misperceptions. Science communication researchers continue to advance our understanding of how emotion and humor, as strategic communication techniques, can be used in service to practice; collaborative projects between practitioners and researchers will only strengthen this knowledge base. But, although empirical discoveries may yield effective tools and techniques, the recommendations and best practices that result must be employed conscientiously. To this end, it is essential that we engage in discussions and dialogue about the ethical considerations and challenges that face science communication today.

Data Availability

There are no data underlying this work.

References

1
Pew Research Center, The future of truth and misinformation online (2017).
2
D. Brossard, New media landscapes and the science information consumer. Proc. Natl. Acad. Sci. U.S.A. 110, 14096–14101 (2013).
3
D. A. Scheufele, N. M. Krause, Science audiences, misinformation, and fake news. Proc. Natl. Acad. Sci. U.S.A. 116, 7662−7669 (2019).
4
M. A. Cacciatore, Misinformation and public opinion of science and health: Approaches, findings, and future directions. Proc. Natl. Acad. Sci. U.S.A. 118, e1912437117 (2021).
5
S. A. Baker, M. Wade, M. J. Walsh, Misinformation: tech companies are removing 'harmful' coronavirus content – but who decides what that means? The Conversation (2020). https://theconversation.com/misinformation-tech-companies-are-removing-harmful-coronavirus-content-but-who-decides-what-that-means-144534. Accessed 10 September 2020.
6
R. T. Garcia, Brazil’s “fake news” bill won’t solve its misinformation problem. MIT Technology Review (2020). https://www.technologyreview.com/2020/09/10/1008254/brazil-fake-news-bill-misinformation-opinion. Accessed 10 September 2020.
7
S. J. Bean, Emerging and continuing trends in vaccine opposition website content. Vaccine 29, 1874–1880 (2011).
8
M. Barthel, A. Mitchell, J. Holcomb, Many Americans believe fake news is sowing confusion (Pew Research Center, 2016).
9
G. Pennycook, J. McPhetres, Y. Zhang, J. G. Lu, D. G. Rand, Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy-nudge intervention. Psychol. Sci. 31, 770–780 (2020).
10
National Science Board, Science and Engineering Indicators 2020 (National Science Foundation, 2020).
11
M. J. Simis, H. Madden, M. A. Cacciatore, S. K. Yeo, The lure of rationality: Why does the deficit model persist in science communication? Public Underst. Sci. 25, 400–414 (2016).
12
D. A. Scheufele, Communicating science in social settings. Proc. Natl. Acad. Sci. U.S.A. 110, 14040–14047 (2013).
13
National Academies of Sciences, Engineering, and Medicine, Science Literacy: Concepts, Contexts, and Consequences (The National Academies Press, 2016).
14
R. Hobbs, “Expanding the concept of literacy” in Media Literacy in the Information Age: Current Perspectives, R. Kubey, Ed. (Transaction, 1997), pp. 163–183.
15
M. Bulger, P. Davison, The Promises, Challenges, and Futures of Media Literacy (Data & Society Research Institute, 2018).
16
T. A. Callister Jr., Media literacy: On-ramp to the literacy of the 21st century or cul-de-sac on the information superhighway. Adv. Reading Lang. Res. 7, 403–420 (2000).
17
D. Kellner, J. Share, Media literacy in the U.S. MedienPädagogik 11, 1–21 (2005).
18
R. Kubey, F. Baker, Has media literacy found a curricular foothold? Educ. Week 19, 56–58 (1999).
19
P. Mihailidis, Media literacy in journalism/mass communication education: Can the United States learn from Sweden? J. Mass Commun. Educ. 60, 415–428 (2005).
20
R. Hobbs, A. Jensen, The past, present, and future of media literacy education. J. Media Lit. Educ. 1, 1 (2013).
21
D. Brossard, J. Shanahan, Do they know what they read? Building a scientific literacy measurement instrument based on science media coverage. Sci. Commun. 28, 47–63 (2006).
22
M. S. Schäfer, “How changing media structures are affecting science news coverage” in The Oxford Handbook of the Science of Science Communication, K. H. Jamieson, D. M. Kahan, D. A. Scheufele, Eds. (Oxford University Press, 2017), pp. 51–57.
23
S. Dunwoody, “Science journalism: Prospects in the digital age” in Routledge Handbook of Public Communication of Science and Technology, M. Bucchi, B. Trench, Eds. (Routledge, ed. 2, 2014), pp. 27–39.
24
G. Brumfiel, Science journalism: Supplanting the old media? Nature 458, 274–277 (2009).
25
B. Goldacre, Bad Science: Quacks, Hacks, and Big Pharma Flacks (McClelland & Stewart, 2010).
26
D. J. Ashwell, The challenges of science journalism: The perspectives of scientists, science communication advisors and journalists from New Zealand. Public Underst. Sci. 25, 379–393 (2016).
27
P. M. Abernathy, The Expanding News Desert (University of North Carolina Press, 2018).
28
J. Miller, News Deserts: No News is Bad News (Manhattan Institute for Policy Research, 2018).
29
T. Bubela et al., Science communication reconsidered. Nat. Biotechnol. 27, 514–518 (2009).
30
S. T. Fiske, S. E. Taylor, Social Cognition (McGraw-Hill, ed. 2, 1991).
31
S. K. Yeo, M. A. Xenos, D. Brossard, D. A. Scheufele, Selecting our own science: How communication contexts and individual traits shape information seeking. Ann. Am. Acad. Pol. Soc. Sci. 658, 172–191 (2015).
32
S. K. Yeo et al., Partisan amplification of risk: American perceptions of nuclear energy risk in the wake of the Fukushima Daiichi disaster. Energy Policy 67, 727–736 (2014).
33
M. A. Cacciatore, D. A. Scheufele, E. A. Corley, From enabling technology to applications: The evolution of risk perceptions about nanotechnology. Public Underst. Sci. 20, 385–404 (2011).
34
S. L. Popkin, The Reasoning Voter: Communication and Persuasion in Presidential Campaigns (University of Chicago Press, 1991).
35
J. H. Kuklinski, P. J. Quirk, “Reconsidering the rational public: Cognition, heuristics and mass opinion” in Elements of Reason: Cognition, Choice, and the Bounds of Rationality, A. Lupia, M. D. McCubbins, S. L. Popkin, Eds. (Cambridge University Press, 2000), pp. 153−182.
36
C. S. Taber, M. Lodge, Motivated skepticism in the evaluation of political beliefs. Am. J. Pol. Sci. 50, 755–769 (2006).
37
S. Lewandowsky, U. K. H. Ecker, C. M. Seifert, N. Schwarz, J. Cook, Misinformation and its correction: Continued influence and successful debiasing. Psychol. Sci. Public Interest 13, 106–131 (2012).
38
B. E. Weeks, Emotions, partisanship, and misperceptions: How anger and anxiety moderate the effect of partisan bias on susceptibility to political misinformation. J. Commun. 65, 699–719 (2015).
39
A. Damasio, Descartes’ Error: Emotion, Reason, and the Human Brain (Penguin, reprint ed., 2005).
40
J. E. Newhagen, TV news images that induce anger, fear, and disgust: Effects on approach‐avoidance and memory. J. Broadcast. Electron. Media 42, 265–276 (1998).
41
R. S. Lazarus, Progress on a cognitive-motivational-relational theory of emotion. Am. Psychol. 46, 819–834 (1991).
42
A. J. Elliot, A. B. Eder, E. Harmon-Jones, Approach–avoidance motivation and emotion: Convergence and divergence. Emot. Rev. 5, 308–311 (2013).
43
N. H. Frijda, The Emotions (Cambridge University Press, 1986).
44
R. S. Lazarus, Emotion and Adaptation (Oxford University Press, 1991).
45
C. E. Izard, Human Emotions (Springer, 1977).
46
J. S. Lerner, D. Keltner, Beyond valence: Toward a model of emotion-specific influences on judgement and choice. Cogn. Emotion 14, 473–493 (2000).
47
J. S. Lerner, L. Z. Tiedens, Portrait of the angry decision maker: How appraisal tendencies shape anger’s influence on cognition. J. Behav. Decis. Making 19, 115–137 (2006).
48
D. A. Small, J. S. Lerner, Emotional policy: Personal sadness and anger shape judgments about a welfare case. Polit. Psychol. 29, 149–168 (2008).
49
L. Z. Tiedens, S. Linton, Judgment under emotional certainty and uncertainty: The effects of specific emotions on information processing. J. Pers. Soc. Psychol. 81, 973–988 (2001).
50
D. Keltner, P. C. Ellsworth, K. Edwards, Beyond simple pessimism: Effects of sadness and anger on social perception. J. Pers. Soc. Psychol. 64, 740–752 (1993).
51
R. Raghunathan, M. T. Pham, All negative moods are not equal: Motivational influences of anxiety and sadness on decision making. Organ. Behav. Hum. Decis. Process. 79, 56–77 (1999).
52
H. Lu, The effects of emotional appeals and gain versus loss framing in communicating sea star wasting disease. Sci. Commun. 38, 143–169 (2016).
53
J. G. Myrick, S. D. Evans, Do PSAs take a bite out of Shark Week? The effects of juxtaposing environmental messages with violent images of shark attacks. Sci. Commun. 36, 544–569 (2014).
54
Y. Sun, S. K. Yeo, M. McKasy, E. C. Shugart, Disgust, need for affect, and responses to microbiome research. Mass Commun. Soc. 22, 508–534 (2019).
55
S. K. Yeo, Y. Sun, M. McKasy, E. C. Shugart, Disgusting microbes: The effect of disgust on perceptions of risks related to modifying microbiomes. Public Underst. Sci. 28, 433–448 (2019).
56
D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica 47, 263–291 (1979).
57
A. J. Rothman, P. Salovey, Shaping perceptions to motivate healthy behavior: The role of message framing. Psychol. Bull. 121, 3–19 (1997).
58
R. L. Nabi, A. Gustafson, R. Jensen, Framing climate change: Exploring the role of emotion in generating advocacy behavior. Sci. Commun. 40, 442–468 (2018).
59
R. L. Nabi, A cognitive-functional model for the effects of discrete negative emotions on information processing, attitude change, and recall. Commun. Theory 9, 292–320 (1999).
60
R. L. Nabi, Emotional flow in persuasive health messages. Health Commun. 30, 114–124 (2015).
61
R. L. Nabi, M. C. Green, The role of a narrative’s emotional flow in promoting persuasive outcomes. Media Psychol. 18, 137–162 (2015).
62
P. Carrera, D. Muñoz, A. Caballero, Mixed emotional appeals in emotional and danger control processes. Health Commun. 25, 726–736 (2010).
63
N. Alam, J. So, Contributions of emotional flow in narrative persuasion: An empirical test of the emotional flow framework. Commun. Q. 68, 161–182 (2020).
64
R. L. Nabi, J. G. Myrick, Uplifting fear appeals: Considering the role of hope in fear-based persuasive messages. Health Commun. 34, 463–474 (2019).
65
A. Sangalang, Y. Ophir, J. N. Cappella, The potential for narrative correctives to combat misinformation. J. Commun. 69, 298–319 (2019).
66
C. R. Gruner, The Game of Humor: A Comprehensive Theory of Why We Laugh (Routledge, 1997).
67
J. Suls, “Cognitive processes in humor appreciation” in Handbook of Humor Research, P. E. McGhee, J. H. Goldstein, Eds. (Springer, New York, NY, 1983), pp. 39–57.
68
S. Meisiek, X. Yao, C. E. J. Hartel, W. J. Zerbe, N. M. Ashkanasy, “Nonsense makes sense: Humor in social sharing of emotion at the workplace” in Emotions in Organizational Behavior, C. E. Härtel, W. J. Zerbe, N. M. Ashkanasy, Eds. (Lawrence Erlbaum Associates, Inc., 2005), pp. 143–165.
69
L. E. Francis, Laughter, the best mediation: Humor as emotion management in interaction. Symbolic Interact. 17, 147–163 (1994).
70
C. S. Gulas, K. K. McKeage, M. G. Weinberger, It’s just a joke: Violence against males in humorous advertising. J. Advert. 39, 109–120 (2010).
71
F. K. Beard, One hundred years of humor in American advertising. J. Macromark. 25, 54–65 (2005).
72
J. C. Baumgartner, J. S. Morris, Laughing Matters: Humor and American Politics in the Media Age (Routledge, 2008).
73
M. McKasy, S. K. Yeo, M. A. Cacciatore, L. Y.-F. Su, Z. Oldroyd, Poster 24727, American Association for the Advancement of Science (AAAS) Annual Meeting, February 14–17, 2019, Washington, DC.
74
National Academies of Sciences, Engineering, and Medicine, Communicating Science Effectively: A Research Agenda (The National Academies Press, 2017).
75
E. K. Vraga, S. C. Kim, J. Cook, Testing logic-based and humor-based corrections for science, health, and political misinformation on social media. J. Broadcast. Electron. Media 63, 393–414 (2019).
76
S. C. Kim, E. K. Vraga, J. Cook, An eye tracking approach to understanding misinformation and correction strategies on social media: The mediating role of attention and credibility to reduce HPV vaccine misperceptions. Health Commun. (2020).
77
R. K. Garrett, S. Poulsen, Flagging Facebook falsehoods: Self-identified humor warnings outperform fact checker and peer warnings. J. Comput. Mediat. Commun. 24, 240–258 (2019).
78
A. B. Becker, D. J. Waisanen, From funny features to entertaining effects: Connecting approaches to communication research on political comedy. Rev. Comm. 13, 161–183 (2013).
79
P. R. Brewer, J. McKnight, Climate as comedy: The effects of satirical television news on climate change perceptions. Sci. Commun. 37, 635–657 (2015).
80
S.-F. Lin, H. Lin, L. Lee, L. D. Yore, Are science comics a good medium for science communication? The case for public learning of nanotechnology. Int. J. Sci. Educ. Part B 5, 276–294 (2015).
81
M. Farinella, The potential of comics in science communication. J. Sci. Commun. (JCOM) 17, Y01 (2018).
82
E. K. Vraga, C. N. Johnson, D. J. Carr, L. Bode, M. T. Bard, Filmed in front of a live studio audience: Laughter and aggression in political entertainment programming. J. Broadcast. Electron. Media 58, 131–150 (2014).
83
M. A. Cacciatore, A. B. Becker, A. A. Anderson, S. K. Yeo, Laughing with science: The influence of audience approval on engagement. Sci. Commun. 42, 195–217 (2020).
84
S. K. Yeo, L. Y.-F. Su, M. A. Cacciatore, M. McKasy, S. Qian, Predicting intentions to engage with scientific messages on Twitter: The roles of mirth and need for humor. Sci. Commun. 42, 481–507 (2020).
85
J. Goodwin, M. F. Dahlstrom, Communication strategies for earning trust in climate change debates. Wiley Interdiscip. Rev. Clim. Change 5, 151–160 (2014).
86
A. Baram-Tsabari, B. V. Lewenstein, An instrument for assessing scientists’ written skills in public communication of science. Sci. Commun. 35, 56–85 (2013).
87
C. I. Hovland, W. Weiss, The influence of source credibility on communication effectiveness. Public Opin. Q. 15, 635–650 (1951).
88
L. Berdahl, M. Bourassa, S. Bell, J. Fried, Exploring perceptions of credible science among policy stakeholder groups: Results of focus group discussions about nuclear energy. Sci. Commun. 38, 382–406 (2016).
89
M. Siegrist, The influence of trust and perceptions of risks and benefits on the acceptance of gene technology. Risk Anal. 20, 195–203 (2000).
90
M. Siegrist, M. Connor, C. Keller, Trust, confidence, procedural fairness, outcome fairness, moral conviction, and the acceptance of GM field experiments. Risk Anal. 32, 1394–1403 (2012).
91
D. Sleeth-Keppler, R. Perkowitz, M. Speiser, It’s a matter of trust: American judgments of the credibility of informal communicators on solutions to climate change. Environ. Commun. 11, 17–40 (2017).
92
B. W. Hardy, M. Tallapragada, J. C. Besley, S. Yuan, The effects of the “war on science” frame on scientists’ credibility. Sci. Commun. 41, 90–112 (2019).
93
A. Malka, J. A. Krosnick, G. Langer, The association of knowledge with concern about global warming: Trusted information sources shape public thinking. Risk Anal. 29, 633–647 (2009).
94
K. A. McComas, C. W. Trumbo, Source credibility in environmental health-risk controversies: Application of Meyer’s credibility index. Risk Anal. 21, 467–480 (2001).
95
O. Renn, D. Levine, “Credibility and trust in risk communication” in Communicating Risks to the Public, R. E. Kasperson, P. J. M. Stallen, Eds. (Springer, 1991), pp. 175–217.
96
D. R. Roskos-Ewoldsen, R. H. Fazio, The accessibility of source likability as a determinant of persuasion. Pers. Soc. Psychol. Bull. 18, 19–25 (1992).
97
V. A. Stone, H. S. Eswara, The likability and self-interest of the source in attitude change. Journalism Mass Commun. Q. 46, 61–68 (1969).
98
J. Mills, E. Aronson, Opinion change as a function of the communicator’s attractiveness and desire to influence. J. Pers. Soc. Psychol. 1, 173–177 (1965).
99
M.-A. Reinhard, M. Messner, S. L. Sporer, Explicit persuasive intent and its impact on success at persuasion: The determining roles of attractiveness and likeableness. J. Consum. Psychol. 16, 249–259 (2006).
100
D. Markiewicz, Effects of humor on persuasion. Sociometry 37, 407–422 (1974).
101
B. Sternthal, C. S. Craig, Humor in advertising. J. Mark. 37, 12–18 (1973).
102
M. G. Weinberger, C. S. Gulas, The impact of humor in advertising: A review. J. Advert. 21, 35–59 (1992).
103
J. Bryant, P. W. Comisky, J. S. Crane, D. Zillmann, Relationship between college teachers’ use of humor in the classroom and students’ evaluations of their teachers. J. Educ. Psychol. 72, 511–519 (1980).
104
D. Zillmann, J. Bryant, Guidelines for the effective use of humor in children’s educational television programs. J. Child. Contemp. Soc. 20, 201–221 (1989).
105
C. P. Wilson, Jokes: Form, Content, Use, and Function (Academic, 1979).
106
E. E. Graham, M. J. Papa, G. P. Brooks, Functions of humor in conversation: Conceptualization and measurement. West. J. Commun. 56, 161–183 (1992).
107
M. B. Wanzer, M. Booth‐Butterfield, S. Booth‐Butterfield, Are funny people popular? An examination of humor orientation, loneliness, and social attraction. Commun. Q. 44, 42–52 (1996).
108
S. K. Yeo, M. A. Cacciatore, L. Y.-F. Su, M. McKasy, L. O’Neill, Following science on social media: The effects of humor and source likability. Public Underst. Sci. (2021).
109
S. K. Yeo, A. A. Anderson, A. B. Becker, M. A. Cacciatore, Scientists as comedians: The effects of humor on perceptions of scientists and scientific messages. Public Underst. Sci. 29, 408–418 (2020).
110
J. D. Stemwedel, “#overlyhonestmethods: Ethical implications when scientists joke with each other on public social media” in Ethical Issues in Science Communication: A Theory-Based Approach, J. Goodwin, M. F. Dahlstrom, S. Priest, Eds. (Iowa State University Digital Press, 2013).
111
M. Simis-Wilkinson et al., Scientists joking on social media: An empirical analysis of #overlyhonestmethods. Sci. Commun. 40, 314–339 (2018).
112
S. S. Brehm, J. W. Brehm, Psychological Reactance: A Theory of Freedom and Control (Academic, 1981).
113
E. Moyer-Gusé, M. J. Robinson, J. Mcknight, The role of humor in messaging about the MMR vaccine. J. Health Commun. 23, 514–522 (2018).
114
A. A. Anderson, A. B. Becker, Not just funny after all: Sarcasm as a catalyst for public engagement with climate change. Sci. Commun. 40, 524–540 (2018).
115
A. B. Becker, A. A. Anderson, Using humor to engage the public on climate change: The effect of exposure to one-sided vs. two-sided satire on message discounting, elaboration and counterarguing. J. Sci. Commun. (JCOM) 18, A07 (2019).
116
C. Skurka, J. Niederdeppe, R. Romero-Canyas, D. Acup, Pathways of influence in emotional appeals: Benefits and tradeoffs of using fear or humor to promote climate change-related intentions and risk perceptions. J. Commun. 68, 169–193 (2018).
117
B. W. Hardy, J. A. Gottfried, K. M. Winneg, K. H. Jamieson, Stephen Colbert’s civics lesson: How Colbert super PAC taught viewers about campaign finance. Mass Commun. Soc. 17, 329–353 (2014).
118
Y. M. Baek, M. E. Wojcieszak, Don’t expect too much! Learning from late-night comedy and knowledge item difficulty. Communic. Res. 36, 783–809 (2009).
119
X. Cao, Political comedy shows and knowledge about primary campaigns: The moderating effects of age and education. Mass Commun. Soc. 11, 43–61 (2008).
120
J. Matthes, R. Heiss, Funny cats and politics: Do humorous context posts impede or foster the elaboration of news posts on social media? Communic. Res. 48, 100−124 (2019).
121
L. H. Hoffman, D. G. Young, Satire, punch lines, and the nightly news: Untangling media effects on political participation. Commun. Res. Rep. 28, 159–168 (2011).
122
J. C. Baumgartner, J. S. Morris, One “nation” under Stephen? The effects of The Colbert Report on American youth. J. Broadcast. Electron. Media 52, 622–643 (2008).
123
H. L. LaMarre, K. D. Landreville, M. A. Beam, The irony of satire: Political ideology and the motivation to see what you want to see in The Colbert Report. Int. J. Press/Polit. 14, 212–231 (2009).
124
D. G. Young, Late-night comedy and the salience of the candidates’ caricatured traits in the 2000 election. Mass Commun. Soc. 9, 339–366 (2006).
125
J. C. Baumgartner, J. S. Morris, N. L. Walth, The Fey effect: Young adults, political humor, and perceptions of Sarah Palin in the 2008 Presidential election campaign. Public Opin. Q. 76, 95–104 (2012).
126
S. Esralew, D. G. Young, The influence of parodies on mental models: Exploring the Tina Fey–Sarah Palin phenomenon. Commun. Q. 60, 338–352 (2012).
127
J. C. Baumgartner, J. S. Morris, J. M. Coleman, Did the “road to the White House run through” Letterman? Chris Christie, Letterman, and other-disparaging versus self-deprecating humor. J. Polit. Mark. 17, 282–300 (2018).
128
A. B. Becker, B. A. Haller, When political comedy turns personal: Humor types, audience evaluations, and attitudes. Howard J. Commun. 25, 34–55 (2014).
129
M. F. Dahlstrom, Using narratives and storytelling to communicate science with nonexpert audiences. Proc. Natl. Acad. Sci. U.S.A. 111, 13614–13620 (2014).
130
M. F. Dahlstrom, S. S. Ho, Ethical considerations of using narrative to communicate science. Sci. Commun. 34, 592–617 (2012).
131
S. Priest, J. Goodwin, M. F. Dahlstrom, Ethics and Practice in Science Communication (University of Chicago Press, 2018).
132
C. Wilkinson, Ethics and practice in science communication. J. Sci. Commun. (JCOM) 17, R02 (2018).
133
R. J. Fisher, Social desirability bias and the validity of indirect questioning. J. Consum. Res. 20, 303–315 (1993).

Information & Authors

Information

Published in

Proceedings of the National Academy of Sciences
Vol. 118 | No. 15
April 13, 2021
PubMed: 33837148

Submission history

Published online: April 9, 2021
Published in issue: April 13, 2021

Keywords

  1. science communication
  2. misinformation
  3. emotion
  4. humor

Notes

This paper results from the Arthur M. Sackler Colloquium of the National Academy of Sciences, "Advancing the Science and Practice of Science Communication: Misinformation About Science in the Public Sphere," held April 3−4, 2019, at the Arnold and Mabel Beckman Center of the National Academies of Sciences and Engineering in Irvine, CA. NAS colloquia began in 1991 and have been published in PNAS since 1995. From February 2001 through May 2019, colloquia were supported by a generous gift from The Dame Jillian and Dr. Arthur M. Sackler Foundation for the Arts, Sciences, & Humanities, in memory of Dame Sackler's husband, Arthur M. Sackler. The complete program and video recordings of most presentations are available on the NAS website at http://www.nasonline.org/misinformation_about_science.
This article is a PNAS Direct Submission. C.M.R. is a guest editor invited by the Editorial Board.
*Social desirability bias is the tendency for survey respondents to answer questions in ways that they perceive to be socially acceptable (133).

Authors

Affiliations

Department of Communication, University of Utah, Salt Lake City, UT 84112-0491;
Department of Communication, Utah Valley University, Orem, UT 84058-6703

Notes

1
To whom correspondence may be addressed. Email: [email protected].
Author contributions: S.K.Y. and M.M. wrote the paper.

Competing Interests

The authors declare no competing interest.
