Preparing learners to deal with the faulty information they encounter in their lives has become another task educators are expected to accomplish. This expectation is a reasonable response to the mixed quality of online resources including some attempts to purposefully mislead viewers.
What follows is a lengthy post about approaches that can be applied to deal with exposure to misinformation. The primary focus is on a technique called “prebunking,” which represents a general approach for helping individuals avoid being taken in by misinformation. By general, I mean that the technique does not involve rejecting specific misinformation by introducing convincing information after the initial exposure. Prebunking involves approaches that prepare individuals to reject misinformation before they encounter it and, as a general strategy, has the advantage of not having to be tailored to address false understandings after false beliefs have taken hold.
For those who want a quick alternative to reading my entire post: prebunking involves familiarization with common approaches used to encourage the acceptance of false information. The study I will describe created this sensitivity through short videos. These videos are publicly available, and educators may find them useful in their classes. The videos can be found here:
https://inoculation.science/inoculation-videos/
This post uses some language that may be new to those who don’t read the scientific research I read. Allow me to first offer definitions for these terms as used in this research. These concepts are interrelated and I have attempted to identify some of these connections.
Debunk – to provide evidence intended to call into question faulty beliefs
Prebunk/Inoculation – to provide explanations of faulty beliefs before they are encountered in an attempt to prevent acceptance of these flawed beliefs.
Conceptual change – the attempt to bring into awareness and then counter faulty information accepted by a learner
Cognitive conflict – the proposal that a learner must be aware of the inconsistency between an existing belief and information relevant to this belief before change can occur (related to conceptual change and inert knowledge)
Naive theory – a personal theory based on an interpretation of life experiences
Inert knowledge – stored knowledge that is called into awareness only when certain contextual conditions are met. Inert knowledge implies that a second stored understanding also exists that is activated under different conditions. This term is often used to explain how naive theories that are flawed can persist despite learning more appropriate things in an educational setting. Hence, one understanding is activated in a school setting and a different understanding in day-to-day situations outside of school.
Motivated cognition – a psychological concept that refers to the tendency of individuals to interpret and process information in ways that align with their preexisting beliefs, values, and desires. This phenomenon can occur across various domains, such as politics, religion, social issues, and personal beliefs.
Confirmation bias – one example of motivated cognition that involves the selection or interpretation of inputs to sustain existing beliefs.
Conceptual change and naive beliefs
I think of misinformation in terms of what I know about conceptual change. This is a way to understand learning and also changes in understanding. I think of learning in terms of personal knowledge building. Each of us builds personal knowledge as models of how the world works. We use these models to interpret new experiences, and when new experiences do not fit our understanding of how something works (a model), we may make adjustments to our model. Piaget called these two complementary processes assimilation and accommodation. We interpret experiences in terms of an existing model (assimilation), and when this will not work, we adjust or update our model (accommodation). The mismatch between experience and model, when recognized, is described as cognitive conflict and results in a motivation to make an adjustment.
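For readers who like a concrete analogy, here is a minimal sketch of that predict-compare-update loop in Python. The Learner class, its methods, and the example models are my own illustrative inventions, not anything from Piaget or the conceptual change literature.

```python
# Toy illustration only: assimilation and accommodation rendered as a
# predict-compare-update loop. All names here are hypothetical.

class Learner:
    def __init__(self, model):
        self.model = model  # a callable: experience -> predicted outcome

    def encounter(self, experience, observed, revised_model):
        prediction = self.model(experience)  # assimilation: interpret via the current model
        if prediction == observed:
            return "assimilated"             # the experience fits; the model is kept
        # The mismatch, once noticed, is cognitive conflict...
        self.model = revised_model           # ...which motivates accommodation
        return "accommodated"

naive = lambda event: "heavy object lands first"
newtonian = lambda event: "both land together"

learner = Learner(naive)
print(learner.encounter("drop a bowling ball and a tennis ball together",
                        "both land together", newtonian))  # -> "accommodated"
```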
My exposure to conceptual change theory occurred within the context of science education. There are many concepts in the formal study of science that explain phenomena we experience all of the time (e.g., gravity, inertia). Before we are educated in formal explanations, we develop our own models of these phenomena. For example, what I sometimes describe as the “roadrunner” model of inertia and gravity imagines a roadrunner speeding off a cliff and continuing through the air. At some point, the roadrunner realizes it is no longer on solid ground and then plunges straight down. This model is an example of a naive theory – it kind of works, but it is not how inertia and gravity actually work. Eventually, we learn a more accurate understanding. Assuming that heavy and light objects (say a bowling ball and a tennis ball) fall at different rates is another example. It seems logical that the heavier object should fall faster, but it isn’t accurate – dropped together, the two hit the ground at the same time.
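To see how the naive and formal models diverge, the short sketch below computes positions under actual projectile motion and under the roadrunner model. The launch speed and the one-second “realization delay” are invented illustration values, not measurements.

```python
# Contrast the naive "roadrunner" model with actual projectile motion.
# The speed and "realization delay" below are made-up illustration values.

G = 9.8   # gravitational acceleration, m/s^2 (independent of the object's mass)
V = 10.0  # horizontal speed leaving the cliff, m/s

def physics_position(t):
    # Inertia carries the horizontal motion forward while gravity
    # accelerates the object downward from the very first instant.
    return V * t, -0.5 * G * t ** 2

def roadrunner_position(t, realize=1.0):
    # Naive model: travel level until "noticing" the cliff edge,
    # then plunge straight down.
    if t < realize:
        return V * t, 0.0
    return V * realize, -0.5 * G * (t - realize) ** 2

for t in (0.5, 1.0, 1.5):
    print(f"t={t}s physics={physics_position(t)} roadrunner={roadrunner_position(t)}")
```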
Some naive theories have an interesting characteristic: they may persist even after learners have acquired a more accurate account of a phenomenon. A learner may store and retain inconsistent models – one active in daily life and the other in the school setting. This is the challenge of inert knowledge. It is thought that this is possible because recall is context dependent, and there are some interesting demonstrations that formal knowledge is more likely to be activated when a question about a phenomenon is preceded by a cue suggesting the right context. For example, reminding someone of the story of Galileo’s famous Tower of Pisa experiment before asking whether a heavy or a light object falls faster changes the context. Without that prompt and its reminder of the school setting, it might seem logical that the heavier object falls faster.
Inert knowledge is a significant challenge. How does education (one context) prepare learners for functioning in a different context (daily life)? Learning alone is not enough. It is also necessary to activate and modify existing ways of understanding that are incorrect. That two-step process – activate an existing belief and then experience its limitations – is cognitive conflict. Physical demonstrations work well when learners first predict an outcome and the result is unanticipated. Computer simulations can in some cases provide similar experiences. Even mentioning common misconceptions before providing accurate explanations can be successful, and textbook authors can use this strategy. This approach to conceptual change might be described as debunking.
What is frustrating is that in some situations calling out false understandings and then providing information that supports a different understanding seems inadequate. Our present circumstances with political differences of opinion are a good example. We find it completely illogical when we point out what seem to be obvious contradictions in an argument and the other person persists in a flawed understanding anyway. We have encountered the challenge of motivated cognition.
Motivated cognition is a psychological concept that refers to the tendency of individuals to interpret and process information in ways that align with their preexisting beliefs, values, and desires. My favorite example when I was teaching was the predictable reaction of sports fans who witness a close call (say, charging or pass interference) and come to opposite opinions about what the correct call should be. Same data, different interpretations, easily predicted from the team each fan was rooting for. Such examples involve a cognitive bias in which people are more likely to accept, remember, and give greater weight to information that supports their existing views while disregarding or downplaying information that contradicts their beliefs. In essence, motivated cognition can lead to selective perception and interpretation of information to maintain a preferred mindset or belief system.
This phenomenon can occur across various domains, such as politics, religion, social issues, and personal beliefs. Motivated cognition can significantly influence how people form opinions, make decisions, and engage in discussions or debates. It plays a crucial role in the formation and reinforcement of attitudes, as well as in the persistence and spread of misinformation.
What can be done in such situations, which have become predictive of how people take positions on such important issues as climate change or the value of vaccinations? Prebunking, originally called inoculation in the research literature, proposes an intervention before flawed inputs have been fully processed. It is technically a little different from techniques that attempt to create cognitive conflict by acknowledging flawed beliefs, as might be the case in a textbook, but it is similar. I came across a field research study making use of short videos to point out common misinformation techniques. The idea is that by labeling misinformation as it is encountered, the processing of that information will be modified or the information ignored.
The prebunking intervention in this study (reference appears below) consisted of short videos explaining five different manipulative strategies – using strong emotional language, using incoherent arguments, presenting false dichotomies, scapegoating individuals or groups, and ad hominem attacks (attacking the person rather than the argument). Exposure to these videos (experimental vs. control) resulted in more accurate detection of misinformation both immediately and at a delayed follow-up. The researchers also tested their technique in the field by posting two of their videos on YouTube as ads and then comparing those who had and had not viewed the ads on a dependent variable – their reaction to misinformation.
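To make that dependent variable concrete, here is a minimal sketch of one way such a measure might be scored. The scoring scheme and function names are my assumptions for illustration; the study’s actual materials and measures differ.

```python
# Hedged sketch (my assumption, not the study's actual procedure): score each
# participant on how accurately they classify posts, then compare group means.

from statistics import mean

def accuracy(judgments, answer_key):
    """Proportion of posts classified correctly as 'manipulative' or 'neutral'."""
    return mean(j == a for j, a in zip(judgments, answer_key))

def group_difference(inoculated, control, answer_key):
    """Mean detection accuracy for ad viewers minus non-viewers."""
    return (mean(accuracy(p, answer_key) for p in inoculated)
            - mean(accuracy(p, answer_key) for p in control))
```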
Other writers have recognized the potential of prebunking in the context of predictions that AI will only increase the amount and personalization of misinformation. https://thedispatch.com/article/fake-news-meets-artificial-intelligence/
While the researchers do demonstrate statistically significant benefits from exposure to the prebunking videos, it is important to recognize that the practical magnitude of the benefits is not great. Prebunking videos, perhaps like other educational efforts to sensitize learners to propaganda techniques, do not come close to eliminating the problem. Debunking efforts must continue as well.
Again, I think educators could make use of the videos the researchers have made available. https://inoculation.science/inoculation-videos/
References:
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220.
Roozenbeek, J., Van Der Linden, S., Goldberg, B., Rathje, S., & Lewandowsky, S. (2022). Psychological inoculation improves resilience against misinformation on social media. Science Advances, 8(34), eabo6254.