People often stumble across conspiracy content while scrolling online, sometimes out of curiosity rather than conviction. New research suggests a simple mental exercise may quietly change how people interact with that material, reducing the time they spend engaging with it even when their core beliefs stay the same. The findings were published in the Journal of Experimental Social Psychology.
Valentin Mang, PhD, a researcher at the Economic and Social Research Institute, explained: “We know from other research that counterfactual thinking can trigger what we call a flexibility mindset, which can help mitigate strong views and reduce polarisation. We were curious if we could harness these insights to tackle conspiracy theories.”
The study examined counterfactual thinking, a psychological process in which individuals imagine alternative outcomes to past events. The researchers tested whether reflecting on the negative consequences of believing conspiracy theories would influence behaviour in realistic information environments.
Mang added: “What we found was that simply reading counterfactuals about the negative consequences of conspiracy beliefs makes people consider opposing viewpoints and reflect on their views about conspiracy theories more than reading a non-counterfactual version of the same text. This suggests that we indeed triggered a flexibility mindset in people.”
Across four experiments involving 2,487 participants, volunteers read a short interview with a fictional former conspiracy believer. Some versions included reflections such as imagining life if the person had never embraced conspiratorial ideas, while others presented the same story without those reflections.
Participants who encountered the reflective version became slightly more willing to consider opposing viewpoints. The effect was modest but consistent across studies, suggesting the mental exercise encouraged cognitive flexibility rather than persuasion.
Crucially, the intervention did not significantly reduce belief in conspiracy theories themselves. People did not suddenly abandon existing views after reading the text, even when they already had strong conspiracy tendencies.
Mang noted the same pattern in the behavioural data: “While reading these counterfactuals did not reduce people’s existing beliefs in various conspiracy theories, it changed people’s behaviour in a reading task. We told participants that they would have some time to read about cultivated meat and then showed them a list of fictional articles; half of them were conspiratorial and half of them were not. We then tracked their clicks and reading times for both types of articles.”
Instead, the shift appeared in behaviour. In a simulated news-browsing task, participants chose between conspiracy-themed and factual articles about cultivated meat. Those exposed to counterfactual reflections selected fewer conspiracy headlines and spent less time reading them.
He continued: “The people who read counterfactuals about the negative consequences of conspiracy beliefs before this task clicked on fewer conspiracy compared to non-conspiracy headlines than those who did not. Entertaining these counterfactuals also reduced the time people spent reading conspiracy articles, while it did not substantially change how much time people spent reading non-conspiracy articles. In other words, it reduced engagement with conspiracy theories, without making people engage less with non-conspiracy content.”
On average, reading time for conspiracy material dropped by more than 12% after a single exposure to the exercise. Non-conspiracy reading behaviour stayed largely unchanged, indicating a selective effect rather than a general disengagement from information.
The researchers argue this matters because belief formation often depends on repeated exposure. If people interact with conspiratorial content less often, the chances of adopting those beliefs may fall over time, particularly on social media platforms whose recommendation systems are driven by engagement.
The findings also suggest that prevention may be easier than correction. Previous research has shown that entrenched conspiracy beliefs are resistant to direct challenge, so influencing how people process information before beliefs solidify could prove more effective.
Reflecting on the broader implications, Mang said: “Future studies will have to tease apart if it is counterfactual thinking, reading about the potential harms of conspiracy theories, or both, that causes the effects we found. It is also too early to say to what extent this approach can prevent people from falling for conspiracy theories in the real world; more research is certainly needed to test this. However, our studies provide some initial promising evidence that encouraging counterfactual thinking could help tackle conspiracy theories.”
The authors propose that public campaigns or online prompts encouraging reflection about consequences could act as a subtle digital literacy tool. Because the technique does not directly argue against a claim, it may provoke less defensiveness than fact-checking or confrontation.

