PSYC206 Research Design and Statistics II
What is Misinformation?
We often receive information which turns out to be false, yet at the time we consider it to be true (Lewandowsky et al., 2012; Eysenck & Keane, 2015/2020). There is often no immediate way of assessing the truth of information as one encounters it, but its likely truthfulness can often be assessed on the basis of other sources of evidence: for example, the motivation a speaker might have for making a claim, the base rate at which such claims are true (Eysenck & Keane, p. 624), and the internal coherence of the claim. Some individuals seem to be more sensitive to the likely truthfulness of information, and therefore less susceptible to misinformation. One context which has become quite topical in public life is the phenomenon of false ("fake") news. This term has frequently been applied both to factual news and to news which does turn out to be wrong (misinformation), and in both cases it is often used for political purposes. Yet our social, political, and economic systems would function better if we were all better at discriminating the truth value of information in the public domain. Is a claim likely to be true, or is it likely to be false?
Once misinformation is accepted, it can be very difficult to remove, even after an explicit retraction has been made. This is known as the Continued Influence Effect. For this reason it is better to reduce initial misinformation acceptance through training. We can also examine individual differences to understand why some individuals might be less susceptible to misinformation than others.
How can we reduce misinformation acceptance?
There are various approaches to reducing initial misinformation acceptance. Some involve warnings, sometimes repeated, that specific types of information may be incorrect. This approach focusses on prospective content, and results have been mixed (Lewandowsky et al., 2012). It is an example of a general strategy of increased scepticism towards any new information. While this may reduce misinformation acceptance, being overly sceptical also has negative outcomes; for example, it limits one's scope for adopting valid new ideas. A more active approach, which focusses on the process of reasoning rather than on information content, is to consider the nature of and motivation behind claims one encounters in the public domain, through cognitive inoculation (Roozenbeek & van der Linden, 2019).
Inoculation against misinformation involves training people to be more resistant to prospective misinformation by experiencing the process of misinformation creation. This can be achieved by playing the Bad News game, in which the participant pretends to be a fake news creator (Roozenbeek & van der Linden, 2019). The underlying premise is that by creating misinformation one becomes better at identifying it ("it takes one to know one"). The aim of the game is to become familiar with each of six misinformation strategies while attracting many followers (who accept the misinformation) and still appearing credible. Roozenbeek and van der Linden conclude that inoculation appears to have its effect by "making people more attuned to specific deception strategies" (p. 7), rather than by making people more sceptical. In this way the intervention attempts to heighten participants' sensitivity to misinformation statements (e.g., fake news) by having them actively engage in the process of generating misinformation, rather than simply (passively) exposing them to warnings about specific content. This focus on process is arguably more effective in inoculating participants, making them less susceptible to misinformation.
What mental processes are associated with misinformation acceptance?
The idea that susceptibility to misinformation is due to insufficient processing (i.e., inadequate analytical reasoning) rather than to scepticism or pre-existing belief states (i.e., motivated reasoning) was pursued by Pennycook and Rand (2019). They showed that it was reduced application of analytical reasoning which predicted misinformation acceptance. The measure of analytical reasoning used by Pennycook and Rand was the Cognitive Reflection Test (CRT). This presents problems which invite an intuitive but incorrect answer, so that the correct solution requires analytical reasoning; individual differences in analytical reasoning can then be identified. However, the CRT involves a reasonably complex task, which fails to isolate the fundamental mental processes that may be specifically responsible for these individual differences.
The fundamental processes underlying reasoning are executive processes: a set of general-purpose (domain-general) control processes which regulate our thoughts and behaviours. These are reasonable candidates for understanding analytical thought (see Friedman et al., 2006) and therefore, if failures in analytical thinking are related to increased acceptance of misinformation (Pennycook & Rand, 2019), for understanding why some individuals might be more willing to accept fake news than others. Specific processes include inhibition, shifting, and updating (Miyake & Friedman, 2012); also refer to Eysenck and Keane (pp. 254-262, 597-599 [220-224]). Each of these processes can be measured using dedicated performance tasks which directly reflect the corresponding processes used in reasoning. Updating was chosen for this study because inherent to the acceptance of new information is the updating of pre-existing information (the selection and retrieval of information). More efficient updating is likely to allow better identification of flaws in new information and so minimise misinformation acceptance.
Stronger updating ability is also likely to correlate with more effective inoculation training.
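To make the notion of a dedicated updating task concrete, here is a minimal scoring sketch in Python for a 2-back task, a commonly used updating measure in which the participant judges whether each stimulus matches the one presented two trials earlier. The task format, function name, and data below are hypothetical illustrations, not the specific task used in this study:

```python
def score_two_back(stimuli, responses):
    """Score a 2-back updating task (hypothetical format).

    responses[i] is True if the participant judged stimuli[i]
    to match the stimulus two trials back (stimuli[i - 2]).
    Returns (hits, misses, false_alarms, correct_rejections).
    """
    hits = misses = false_alarms = correct_rejections = 0
    for i in range(2, len(stimuli)):  # the first two trials are unscorable
        is_target = stimuli[i] == stimuli[i - 2]
        if is_target and responses[i]:
            hits += 1
        elif is_target:
            misses += 1
        elif responses[i]:
            false_alarms += 1
        else:
            correct_rejections += 1
    return hits, misses, false_alarms, correct_rejections

# Toy trial sequence of letters, with responses aligned to stimuli
stimuli = list("ABABCAC")
responses = [False, False, True, True, False, False, True]
print(score_two_back(stimuli, responses))  # (3, 0, 0, 2)
```

Higher hit and correct-rejection counts (relative to misses and false alarms) would indicate more efficient updating of the contents of working memory.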
So, if participants are asked to rate the likely accuracy of true and false media claims, an individual's ability to correctly discriminate between true and false items should correlate positively with their performance on a task measuring the executive function of updating. It might also be possible to train individuals to think more critically about media claims, so that discrimination after (post) training is better than before (pre) training. Updating performance might also correlate positively with the degree of discrimination gain acquired from training.
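The first of these predictions can be sketched computationally. In the illustration below (Python; all data, scales, and function names are hypothetical, shown only to make the predicted analysis concrete), a discrimination score is the mean reliability rating given to true items minus the mean given to false items, and the prediction is a positive Pearson correlation between this score and updating performance:

```python
from statistics import mean, stdev

def discrimination_score(true_ratings, false_ratings):
    """Mean reliability rating for true items minus that for false
    items; higher values indicate better true/false discrimination."""
    return mean(true_ratings) - mean(false_ratings)

def pearson_r(x, y):
    """Pearson correlation between two equal-length score lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Toy data for three participants (ratings on a 1-7 scale; illustrative only)
ratings = [([6, 7], [2, 3]), ([5, 6], [3, 4]), ([4, 5], [4, 5])]
updating = [0.9, 0.6, 0.4]  # updating-task accuracy per participant
disc = [discrimination_score(t, f) for t, f in ratings]
print(disc)                                 # [4.0, 2.0, 0.0]
print(round(pearson_r(updating, disc), 2))  # 0.99
```

In this toy data set, participants with better updating accuracy also discriminate true from false items more sharply, yielding a strongly positive correlation, which is the pattern the hypothesis predicts.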
Research Questions directing this study
There are three research questions directing this study:
1. Do individuals with stronger executive updating performance show less susceptibility to misinformation as measured by the perceived reliability of misinformation statements?
2. Can we actively train people to be less susceptible to misinformation? If so, then reliability ratings for misinformation statements will be lower after training than before training.
3. If we are successful in reducing misinformation acceptance through inoculation training (reduced reliability ratings), does the effect of this training correlate with an individual’s executive updating performance?
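The analyses implied by the second and third questions can likewise be sketched (Python; the data, rating scale, and helper names below are hypothetical illustrations, not the study's actual analysis). The training effect is each participant's drop in reliability ratings for misinformation items from pre to post, testable with a one-sample t statistic on the difference scores (equivalent to a paired t-test); the third question then correlates these gains with updating performance:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(diffs):
    """One-sample t on pre-minus-post difference scores: a large
    positive t indicates reliability ratings dropped after training."""
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Toy reliability ratings of misinformation items (1-7 scale; illustrative)
pre = [5.0, 4.5, 6.0, 5.5]   # before inoculation training
post = [3.5, 4.0, 4.0, 5.0]  # after inoculation training
gains = [p - q for p, q in zip(pre, post)]
print(gains)            # [1.5, 0.5, 2.0, 0.5]
print(paired_t(gains))  # 3.0
# For the third question, `gains` would then be correlated with each
# participant's updating-task score.
```

A positive mean gain (here, every participant rated the misinformation items as less reliable after training) with a sufficiently large t would answer the second question affirmatively.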