The effect of correction timing on the continued influence effect of misinformation with different relevance


Lina Jia
Cite this article:  Jia, L. (2024). The effect of correction timing on the continued influence effect of misinformation with different relevance. Social Behavior and Personality: An international journal, 52(12), e13662.


Abstract

Exploring the optimal correction timing for the appearance of corrective information can help to reduce the continued influence effect of misinformation, and thus mitigate its negative impact. I conducted two studies with 86 participants to pinpoint the most effective timing for corrections using relatively efficient correction strategies, also considering the relevance of the information as a factor. The findings revealed that the effect of the timing between misinformation and its correction on the continued influence effect of misinformation varied depending on the relevance of the information and the correction approach used. Additionally, the best timing for corrections differed across various correction methods. The results of these two studies offer guidance and a framework for more precise and targeted misinformation correction efforts in the future.

Research has shown that even after misinformation is corrected, the initial false information can continue to influence people’s reasoning and decision making, a phenomenon known as the continued influence effect of misinformation (CIEM; Lewandowsky et al., 2012; Swire et al., 2017). In everyday situations, it is often not feasible to immediately provide corrective information when misinformation spreads, and media corrections can sometimes come hours or even days later. This makes it critical to understand the effects of the time delay between misinformation and its correction. Rich and Zaragoza (2020) examined whether the timing of a correction—immediately after the misinformation (10 minutes) or much later (2 days)—affected its effectiveness. They evaluated the effect of correction timing on the CIEM on the basis of participants’ belief ratings in the misinformation and their responses to inference questions, and found that delayed corrections did not significantly impact the CIEM. Similarly, Miller et al. (2022) investigated the role of age and delay intervals (10 minutes vs. 2 days) and found no significant main effect of the timing of the correction.
 
Rich and Zaragoza (2020) also pointed out in their discussion that it remains uncertain whether their findings would extend to other news stories or hold true across longer delay intervals. This suggests a need for further investigation into various correction timings to identify the most effective timing for correction. Additionally, Rich and Zaragoza did not account for the potential impact of the type of content in their experiment. Jin et al. (2022) showed that information with higher (vs. lower) relevance to the individual produced a larger (vs. smaller) CIEM. Rothman and Schwarz (1998) demonstrated that people employ different judgment strategies depending on the relevance of the information. Therefore, in studying the effects of different correction timings, it is crucial to also consider the role of information relevance to identify the most effective correction timings for misinformation of varying relevance.
 
In summary, in this study I examined the effect of correction timing on the CIEM for information with high or low relevance. In the manipulation of the time interval for correcting information, I used three correction times: immediate correction, shorter interval (1 day), and longer interval (1 week). In addition, previous research has found that providing alternative explanations during corrections (Ecker et al., 2020; Lewandowsky et al., 2012) and corrections from high-authority sources (Pluviano et al., 2020) can be relatively effective in reducing the CIEM, so I incorporated these two types of corrections in my study. Therefore, I proposed the following hypotheses:
Hypothesis 1a: There will be a main effect of information relevance, with highly relevant information being more effectively corrected than low-relevance information.
Hypothesis 1b: There will be a main effect of correction interval, with longer (vs. shorter) intervals leading to less (vs. more) effective corrections.
Hypothesis 2a: There will be an interaction effect between relevance and the timing of corrections, such that for high-relevance information there will be no difference between different correction times, but for low-relevance information longer correction intervals will lead to a larger continued influence effect of misinformation.
Hypothesis 2b: The interaction between information relevance and correction intervals will vary according to the method of correction.

Study 1

Method

Participants and Procedure

I utilized G*Power to determine the required sample size for my study, leading to the recruitment of 99 undergraduates. However, 13 participants were unable to complete all experimental tasks and withdrew from the study partway through, leaving 86 participants, including 11 men (12.8%) and 75 women (87.2%), with an age range of 18 to 22 years (Mage = 19.64, SD = 0.99). All participants were randomly assigned to one of three groups: immediate correction (n = 31), 1-day interval (n = 25), and 1-week interval (n = 30). All participants had normal or corrected-to-normal visual acuity and signed an informed consent form before beginning the study.
 
Participants were required to first read the text containing the misinformation, after which they completed the precorrection questions. Subsequently, according to their assigned group, participants read the remainder of the story after the specified interval, then completed the postcorrection questions.

 

Materials and Design

The experimental materials for this study consisted of four stories: two low-relevance fictional news stories (e.g., “A fire broke out at a textile factory...”) taken from Gordon et al. (2017) and two high-relevance campus life stories that I wrote (e.g., “Students have been frequently discussing issues with the cafeteria...”). All stories were corrected using alternative explanations. Each story consisted of six sentences and each sentence comprised 10–20 words (text length range = 60–120 words). Before beginning the formal experiment, I asked 20 participants (five men, 15 women; Mage = 22.05 years, SD = 3.19) to assess the degree of relevance of the information to themselves using a 7-point Likert scale (1 = not at all relevant to 7 = fully relevant). This revealed a significant difference in relevance between the high- and low-relevance materials, t(1) = –14.00, p = .045, Mhigh relevance = 4.73, SD = 0.04 vs. Mlow relevance = 4.03, SD = 0.11. In addition, the difference in familiarity was not significant, t(1) = 0.91, p = .53.
 
The experiment had a 2 (relevance: high, low) × 3 (correction interval: immediate, 1 day, 1 week) mixed design, where relevance was a within-subjects variable, and the correction interval was a between-subjects variable. The dependent variables were change in belief (i.e., belief after correction vs. belief before correction) and inference scores. A smaller belief differential or lower inference score indicated a smaller CIEM.
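The belief difference score described above can be illustrated with a minimal sketch. This is not the study's actual analysis code; the operationalization (postcorrection belief minus precorrection belief, so that a smaller value reflects a smaller CIEM), the function names, and the ratings are all illustrative assumptions.

```python
from statistics import mean

def belief_difference(pre, post):
    # Postcorrection belief rating minus precorrection belief rating
    # (one plausible operationalization; the article does not state the
    # exact formula). A smaller value means the correction reduced
    # belief in the misinformation more, i.e., a smaller CIEM.
    return post - pre

def condition_mean(scores):
    # Average belief difference across (pre, post) rating pairs for one
    # correction-interval condition.
    return mean(belief_difference(pre, post) for pre, post in scores)

# Hypothetical 7-point belief ratings (precorrection, postcorrection)
immediate = [(6, 2), (5, 2), (6, 3)]   # correction works well
one_week = [(6, 5), (5, 4), (6, 5)]    # belief persists after delay

# Consistent with the low-relevance pattern reported below: a larger
# (less negative) belief difference at the 1-week interval indicates
# a larger CIEM than after immediate correction.
assert condition_mean(immediate) < condition_mean(one_week)
```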

Results

Belief Difference Scores

Factual questions following corrections were first analyzed for high- and low-relevance information. There was no significant difference in participants’ awareness or understanding of subsequently corrected information with high and low relevance, t(85) = 1.85, p = .07.
 
I used a repeated measures analysis of variance to examine the main effects and interactions. Results of a 2 (relevance: high, low) × 3 (correction interval: immediate, 1 day, 1 week) analysis of variance on the belief difference scores (see Figure 1) showed that the main effects of relevance and correction interval were not significant, p > .05. The interaction between relevance and correction interval was significant, F(2, 83) = 3.24, p = .04, η² = .07. A further simple effects analysis revealed that in the low-relevance condition the belief difference at a 1-week interval was significantly greater than that after both immediate correction, p = .046, and a 1-day interval, p = .043, while the belief difference scores between immediate correction and a 1-day interval were not significantly different, p = .89. For high-relevance information, there were no significant differences between the three conditions of immediate correction, a 1-day interval, and a 1-week interval, p > .05.


Figure 1. Belief Difference Scores Between High-Relevance and Low-Relevance Information Conditions at Different Intervals of Alternative Explanations

Inference Scores

Figure 2 shows there was a significant main effect of relevance, F(1, 83) = 28.46, p < .001, η² = .26, with high-relevance information having significantly lower inference scores than did low-relevance information, indicating a smaller CIEM for high-relevance information. There was also a significant main effect of correction interval, F(2, 83) = 3.52, p = .03, η² = .08, whereby inference scores were significantly higher for a 1-week interval than for an immediate correction (p = .03) and a 1-day interval (p = .02), while inference scores were not significantly different between immediate corrections and 1-day intervals, p > .05. The interaction was not significant, F < 1, p = .53.

Figure 2. Inference Scores for High-Relevance and Low-Relevance Information at Different Intervals of Correction Timing

Study 2

Method

Participants and Procedure

After the participants had completed Study 1, they proceeded to complete Study 2.
 

Materials and Design

The stories featured the same inference questions as those I used in Study 1. For example, one of the stories in Study 1 had the theme of an office building that collapsed due to a fire, and subsequent corrections found no evidence of a fire. In Study 2, the story with the same theme involved the evacuation of passengers before take-off due to a fire in the cabin, which was subsequently corrected to say that no signs of fire were found. Thus, the two parallel stories with the same theme were followed by the same inference question, “The fire brigade must arrive at the scene.” The number of words in the experimental materials was the same as in Study 1, and the experimental design was the same except that Study 2 used authoritative corrections. These corrections were characterized by high levels of expertise and credibility, akin to those disseminated by recognized authoritative entities.

Results

Belief Difference Scores

I found no significant differences between high- and low-relevance information for the factual questions, t(85) = –1.91, p > .05. Belief difference scores (see Figure 3) showed that neither of the main effects nor the interaction between relevance and correction interval was significant, F < 1, p > .05.
 

Figure 3. Belief Difference Scores Between High- and Low-Relevance Information at Different Intervals of High-Authority Correction Timing

Inference Scores

The main effect of relevance was significant, F(1, 83) = 3.98, p = .049, η² = .05, with high-relevance information inference scores being lower than low-relevance information scores (see Figure 4). The main effect of interval group was also significant, F(2, 83) = 3.40, p = .04, η² = .08: inference scores were significantly higher after a 1-week interval than after a 1-day interval (p = .01), but did not differ significantly from the immediate-correction scores (p = .15), nor did scores differ significantly between immediate correction and a 1-day interval (p = .23). The interaction was not significant, F < 1, p = .90.
 

Figure 4. Inference Scores for High-Relevance and Low-Relevance Information at Different Intervals of High-Authority Correction Timing

General Discussion

This study delved into the CIEM concerning high- and low-relevance information, assessing the impact of various correction timings when employing alternative explanations and authoritative source-correction approaches. The findings from Study 1, regarding belief difference scores, revealed a significant interaction between relevance and correction timing. For information of low relevance, corrections made immediately or within a 1-day interval were most effective. This finding contrasts with those of Miller et al. (2022) and Rich and Zaragoza (2020), who did not observe varying impacts based on correction timing. This may be due to differences in the materials used as well as differences in the intervals in each study. In my study I made a further distinction between the relevance of the information materials, while Rich and Zaragoza (2020) did not distinguish between the degree of relevance of the information. Further, the corrections I used were alternative explanations and high-authority corrections. The discernible effect of different intervals for low-relevance information in this study might be attributed to a more nuanced differentiation of the information materials and the use of alternative explanations as a method of correction. The inference scores for high-relevance information were significantly lower. This may be due to the existence of a self-processing advantage for highly relevant information (Yin et al., 2019); thus, participants may have been more likely to correct previous misinformation and accept current corrected information.

No significant effects were found in any of the analyses of belief difference scores in Study 2. A main effect of relevance was found in the analysis of inference scores, which is consistent with the results of the inference scores analysis in Study 1. In addition, corrections were least effective at longer intervals (1 week), that is, there was a larger CIEM. When the correction interval was longer, the misinformation may have left a solid trace in the participants’ long-term memory (Brydges et al., 2018, 2020; Lewandowsky et al., 2012). However, unlike Study 1, there was no significant difference in inference scores between the 1-week interval and immediate correction.

The results showing that high-relevance information consistently exhibited a smaller CIEM for inference scores, but not for belief difference scores, supported Hypothesis 1a. This may be due to differences in the assessment of dependent variables, where inference scores were more indirect and belief scores were direct. Additionally, the effectiveness of corrections diminished with longer intervals between the misinformation and the correction, particularly after 1 week, which supported Hypothesis 1b. This result suggests that when misinformation appears in some media, it should be investigated promptly and corrections issued quickly to prevent individuals from forming irrational judgments.

Furthermore, I found an interaction between information relevance and correction interval for the alternative explanation corrections: in the low-relevance condition, the belief difference at a 1-week interval was significantly greater than after either immediate correction or a 1-day interval, while for highly relevant information there were no significant differences between correction timings. This finding supported Hypothesis 2a. The current study shows that under the alternative explanation correction approach, the effective correction timing was immediate correction or a 1-day interval. Under the correction method of high-authority sources, the best timing was a 1-day interval, which supported Hypothesis 2b.

There are limitations to this study that future research could address. First, I focused primarily on the impact of correction intervals and did not include appropriate control conditions in the CIEM tasks. Further, I relied solely on rating scales for assessments. Future research could refine the experimental manipulations. Moreover, the applicability of the study findings is limited to college students, as the manipulation of information relevance was centered around this demographic. Future studies could broaden the participant pool to include a more diverse range of demographics.

References

Brydges, C. R., Gignac, G. E., & Ecker, U. K. H. (2018). Working memory capacity, short-term memory capacity, and the continued influence effect: A latent-variable analysis. Intelligence, 69, 117–122.
 
Brydges, C. R., Gordon, A., & Ecker, U. K. H. (2020). Electrophysiological correlates of the continued influence effect of misinformation: An exploratory study. Journal of Cognitive Psychology, 32(8), 771–784.
 
Ecker, U. K. H., O’Reilly, Z., Reid, J. S., & Chang, E. P. (2020). The effectiveness of short-format refutational fact-checks. British Journal of Psychology, 111(1), 36–54.
 
Gordon, A., Brooks, J. C. W., Quadflieg, S., Ecker, U. K. H., & Lewandowsky, S. (2017). Exploring the neural substrates of misinformation processing. Neuropsychologia, 106, 216–224.
 
Jin, H., Jia, L., Yin, X., Wei, S., & Xu, G. (2022). The influence of information relevance on the continued influence effect of misinformation. Journal of Psychiatry and Psychiatric Disorders, 6(3), 203–218.
 
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106–131.
 
Miller, A. L., Wissman, K. T., & Peterson, D. J. (2022). The continued influence effect: Examining how age, retraction, and delay impact inferential reasoning. Applied Cognitive Psychology, 36(3), 708–723.
 
Pluviano, S., Della Sala, S., & Watt, C. (2020). The effects of source expertise and trustworthiness on recollection: The case of vaccine misinformation. Cognitive Processing, 21(3), 321–330.
 
Rich, P. R., & Zaragoza, M. S. (2016). The continued influence of implied and explicitly stated misinformation in news reports. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(1), 62–74.
 
Rich, P. R., & Zaragoza, M. S. (2020). Correcting misinformation in news stories: An investigation of correction timing and correction durability. Journal of Applied Research in Memory and Cognition, 9(3), 310–322.
 
Rothman, A. J., & Schwarz, N. (1998). Constructing perceptions of vulnerability: Personal relevance and the use of experiential information in health judgments. Personality and Social Psychology Bulletin, 24(10), 1053–1064.
 
Swire, B., Ecker, U. K. H., & Lewandowsky, S. (2017). The role of familiarity in correcting inaccurate information. Journal of Experimental Psychology: Learning Memory and Cognition, 43(12), 1948–1961.
 
Yin, S., Sui, J., Chiu, Y.-C., Chen, A., & Egner, T. (2019). Automatic prioritization of self-referential stimuli in working memory. Psychological Science, 30(3), 415–423.



This study was funded by Tianjin University of Commerce research start-up funds.

The author thanks Hua Jin and Xiaojuan Yin for assistance with experimental design and data collection.

The data that support the findings of this study are available on request from the author.

Lina Jia, Department of Psychology, College of Law, Tianjin University of Commerce, Guangrong Road 409, Tianjin 300134, People’s Republic of China. Email: [email protected]


© 2024 Scientific Journal Publishers Limited. All Rights Reserved.