TY - CHAP
T1 - Reducing misinformation and conspiracy theories on social media
AU - van Prooijen, Jan-Willem
PY - 2025/3/18
Y1 - 2025/3/18
AB - Policy-makers often focus on algorithms to reduce the spread of misinformation and conspiracy theories online. However, the main reason misinformation and conspiracy theories proliferate on social media is that human users decide to share them. One of the Digital Services Act’s main goals is to compel digital service providers, specifically Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs), to enhance measures against online misinformation. To do so effectively, it is crucial to understand people’s motivations for being active on social media. People often share false information to serve their identity needs and appease their in-group. For example, false information that disparages political opponents may gain so-called likes and other forms of social approval from like-minded others. Research suggests that believing and sharing misinformation is often not due to incompetence or an intention to mislead others; instead, it is due mainly to people’s attention being focused on social connections rather than accuracy. Shifting people’s focus to the possible accuracy or inaccuracy of information can reduce their belief in misinformation and their willingness to share it. This policy brief reviews interventions that successfully shift people’s focus to accuracy. One such intervention is warning labels presented simultaneously with misinformation; these can be quite effective if implemented correctly. Moreover, interventions delivered either before (prebunking) or after (debunking) encountering misinformation can be effective, although their effects tend to be small and decrease over time. Even though emotions and identity needs are the primary reasons people believe misinformation and conspiracy theories, raising public awareness of possible inaccuracies and rationally refuting and correcting such false information makes a difference for many citizens.
M3 - Chapter
T3 - Studio Europa Maastricht Policy Brief Collection
SP - 28
EP - 34
BT - SEM Policy Brief Collection: Digitalisation
A2 - Verduyn, Philippe
PB - Maastricht University Press
CY - Maastricht
ER -