Misinformation, emotions, and violence: the dark side of conspiracy theories

Posted in: Digital, PhD, Research, Social media, Technology

Darja Wischerath is a PhD student working in the Applied Digital Behaviour Lab and the Bath Institute for Digital Security and Behaviour. Here, they explain how social media platforms such as Parler can act as breeding grounds for violence by allowing unchecked discussion of conspiracy theories.

In today's digital world, misinformation is more common than ever. From claims that ‘5G spreads Covid’ to assertions that the Earth is flat, conspiracy theories are easily found on many social media platforms.

While some may seem harmless or even amusing, others have a darker side that can lead to real-world violence. This raises an important question: why do certain conspiracy theories push people toward violent actions?

To explore this, we analysed posts relating to five different conspiracy theories (associated with varying degrees of violence) on Parler, an alternative social media platform known for minimal content moderation.

Using sentiment analysis, we examined how discussions around these specific topics stirred strong emotions – particularly anger, contempt and disgust.
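To give a flavour of what this kind of analysis involves, here is a minimal sketch of emotion scoring using the Hugging Face transformers library. The model named below is an off-the-shelf emotion classifier chosen purely for illustration (it covers anger and disgust, though not contempt), and the example posts are hypothetical; neither is necessarily what was used in the study itself.

```python
# Minimal sketch: scoring posts for emotions with an off-the-shelf
# classifier. The model below is illustrative only, not necessarily
# the one used in the study described in this post.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
    top_k=None,  # return a score for every emotion label
)

# Hypothetical example posts, standing in for the Parler data.
posts = [
    "They are replacing us and nobody is doing anything about it.",
    "Wake up, the Earth is flat!",
]

results = classifier(posts)  # one list of {label, score} dicts per post
for post, labels in zip(posts, results):
    scores = {d["label"]: d["score"] for d in labels}
    print(f"anger={scores['anger']:.2f}  disgust={scores['disgust']:.2f}  {post!r}")
```

Aggregating scores like these over many posts per conspiracy theory is one way to compare the emotional tone of the discussion around each one.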

Overall, we found that conspiracy theories associated with more violence (such as the great replacement conspiracy) sparked higher levels of anger, contempt and disgust than conspiracy theories associated with little or no violence (such as flat Earth conspiracies).

This suggests that there is a link between certain conspiracy theories, heightened emotions and the endorsement of violence.

Legitimising violence

Conspiracy theories often tap into powerful emotions that can drive people toward extreme actions. Anger arises when individuals believe they face injustice or threats – such as harm from vaccines – compelling them to ‘fight back’ against perceived oppressors.

Contempt develops when others – such as scientists or immigrants – are viewed as inferior or corrupt, fostering a sense of superiority and disdain.

Disgust emerges from dehumanising others by seeing them as impure or contaminating, as seen in theories that certain groups are ‘polluting’ society. This makes it easier to justify hostile actions against them.

Such emotions can lead individuals to rationalise harmful behaviour, turning abstract anger into concrete actions. This is exacerbated when combined with in- and out-group dynamics that contribute to the justification of violence against ‘the other’.

Finding community

For individuals inclined toward conspiracy theories, platforms such as Parler, Gab or Odysee provide a space where these ideas can thrive with minimal content restrictions. These platforms emulate the functionality of mainstream social media but typically operate with fewer regulations on speech.

This approach appeals to users who feel their views are marginalised or censored elsewhere, creating a favourable environment for extremist ideas to circulate unchecked.

Participation in these ‘alt-tech’ communities fosters a sense of belonging but can also entrench users within echo chambers. In these spaces, individuals repeatedly reinforce one another’s beliefs, often escalating the intensity of their language.

This can lead to what researchers refer to as ‘violent talk’ – language that endorses violence as a legitimate response to perceived threats or injustices. Over time, such expressions can influence real-world behaviours, particularly when anger, contempt and disgust are involved.

Real-world consequences

The transition from online talk to real-world violence isn't just theoretical: some conspiracy theories have been associated with a range of violent acts.

For example, misinformation around Covid-19 vaccines spurred threats against, and assaults on, healthcare workers and clinics. On a larger scale, the great replacement conspiracy theory has been cited in terrorist manifestos, including that of the Christchurch attacker, and is often discussed in far-right spaces.

These incidents demonstrate that the emotional intensity generated online doesn't stay confined to the digital world; it spills over, affecting real lives and communities.

Understanding this pathway from discussion to action is crucial for prevention. Here's how we can address it:

  • Promote media literacy: educating people on how to critically evaluate information can reduce the spread of misinformation. Schools and community programmes can do so by teaching fact-checking and critical thinking skills.
  • Encourage diverse interactions: social media platforms can design algorithms that expose users to a variety of viewpoints, breaking the echo chamber effect.
  • Implement responsible moderation: while respecting free speech, platforms should enforce policies against content that incites violence. Advanced tools can detect and flag harmful language (a sketch of one such approach follows this list).
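
As an illustration of the kind of flagging mentioned above, the sketch below runs posts through a publicly available toxicity classifier and surfaces those whose 'threat' score crosses a threshold. The model choice and the threshold are assumptions made for illustration; real moderation systems are considerably more sophisticated and typically keep a human reviewer in the loop.

```python
# Illustrative sketch: flagging posts whose "threat" score exceeds a
# threshold, using a publicly available toxicity classifier. The model
# and the 0.5 cut-off are assumptions, not a production recipe.
from transformers import pipeline

detector = pipeline(
    "text-classification",
    model="unitary/toxic-bert",
    top_k=None,  # scores for all labels: toxic, threat, insult, ...
)

THRESHOLD = 0.5  # hypothetical cut-off for escalation to human review

def flag_for_review(posts):
    """Return (post, threat_score) pairs that cross the threshold."""
    flagged = []
    results = detector(posts, function_to_apply="sigmoid")
    for post, labels in zip(posts, results):
        scores = {d["label"]: d["score"] for d in labels}
        if scores.get("threat", 0.0) >= THRESHOLD:
            flagged.append((post, scores["threat"]))
    return flagged
```

A filter like this would only be a first pass: flagged posts go to human moderators, since automated classifiers make mistakes in both directions.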

Not all conspiracy theories lead to violence, and questioning official narratives is a healthy part of a democratic society. However, when these theories dehumanise others and incite harm, intervention becomes necessary.

Balancing freedom of thought with the need to prevent violence is, therefore, a complex but essential task.


Read the research paper
