Ullrich Ecker

169 Followers
298 Following
74 Posts

I am a cognitive psychologist studying effects of #misinformation.

I am an Australian Research Council Future Fellow at the University of Western Australia's School of Psychological Science, and a Fellow of the UWA Public Policy Institute.

I am an Associate Editor at Experimental Psychology and the Journal of Applied Research in Memory and Cognition.

Views are my own.

Lab website: https://www.emc-lab.org
University website: https://research-repository.uwa.edu.au/en/persons/ullrich-ecker

Happy to announce publication of a Special Issue on Misinformation in European Psychologist (with several #OpenAccess papers).

https://econtent.hogrefe.com/toc/epp/current

Big thanks to the contributors; it was a privilege to work with you all!

@KirstiJylhae @ejrclarke @PhilippMSchmid @Sacha_Altay @ldscherer @ryanburnell @stworg

Special Issue: Misinformation

European Psychologist

Happy to announce that the paper introducing our Misinformation Game simulator (led by the amazing Lucy Butler) is out in Behavior Research Methods today: https://rdcu.be/dgCo6
 
It's a social-media simulation for experimental research. Full experimental control, open source, Qualtrics integration, no coding skills required.

Posts can be text, image (incl. GIF), or both. You can choose between a feed mode or page-wise presentation.

You can edit source handles and avatars, as well as engagement metrics (e.g. number of likes). There's a source-credibility badge and follower counts. Source-post assignments can range from fully random to fully determined.

Participants can like, dislike, share, or flag content, and they can comment. Their credibility score and follower count change dynamically based on the choices they make. And all these features can easily be switched on or off.

Note that posts *can* be classified as true vs. false but they don't have to be. We developed the tool with misinformation experiments in mind but it can be used for many other purposes. We hope it'll be useful!
 
Full access here: https://misinfogame.com

The (Mis)Information Game: A social media simulator

New paper out today! Another study providing evidence against backfire effects, here with vaccine misinformation corrections.

Ecker UKH, Sharkey CXM, & Swire-Thompson B (2023). Correcting vaccine misinformation: A failure to replicate familiarity or fear-driven backfire effects. PLOS ONE, 18(4), e0281140. https://doi.org/10.1371/journal.pone.0281140

Correcting vaccine misinformation: A failure to replicate familiarity or fear-driven backfire effects

Individuals often continue to rely on misinformation in their reasoning and decision making even after it has been corrected. This is known as the continued influence effect, and one of its presumed drivers is misinformation familiarity. As continued influence can promote misguided or unsafe behaviours, it is important to find ways to minimize the effect by designing more effective corrections. It has been argued that correction effectiveness is reduced if the correction repeats the to-be-debunked misinformation, thereby boosting its familiarity. Some have even suggested that this familiarity boost may cause a correction to inadvertently increase subsequent misinformation reliance, a phenomenon termed the familiarity backfire effect. A study by Pluviano et al. (2017) found evidence for this phenomenon using vaccine-related stimuli. The authors found that repeating vaccine “myths” and contrasting them with corresponding facts backfired relative to a control condition, ironically increasing false vaccine beliefs. The present study sought to replicate and extend this work. We included four conditions from the original Pluviano et al. study: a myths-vs.-facts condition, a visual infographic, a fear appeal, and a control condition. The present study also added a “myths-only” condition, which simply repeated false claims and labelled them as false; theoretically, this condition should be most likely to produce familiarity backfire. Participants received vaccine-myth corrections and were tested immediately post-correction, and again after a seven-day delay. We found that the myths-vs.-facts condition reduced vaccine misconceptions. None of the conditions increased vaccine misconceptions relative to control at either timepoint, or relative to a pre-intervention baseline; thus, no backfire effects were observed.
This failure to replicate adds to the mounting evidence against familiarity backfire effects and has implications for vaccination communications and the design of debunking interventions.