
The Negativity Effect

Last week the Communities Secretary Michael Gove laid out a new definition of extremism.  Extremism is now defined as “the promotion or advancement of an ideology based on violence, hatred or intolerance”.

The change shifts the emphasis away from action to ideology and is a response to what the Prime Minister Rishi Sunak described as “a shocking increase in extremist disruption and criminality”.

Politicians and commentators on both sides of the political divide have been critical of the change. What both sides agree on, however, is that we live in an increasingly divided and intolerant society. Why is this?

To my mind it is increasingly clear that social media is at the very least a significant contributor to this intolerance.

The negativity effect is a quirk of human behaviour that leads us to pay far greater attention to negative information than to positive. There is increasing evidence that this cognitive bias has a huge effect online. The business models of nearly all digital media platforms are built on maximising user screen time: more users spending more time on your platform means more advertising revenue and more data harvested and available for sale. Consequently, whilst the algorithms social media platforms use to suggest content to their users vary all the time, one thing is consistent: they are programmed to recommend material that will keep you looking at your screen. The profit motive and human psychology mean that these algorithms curate feeds full of updates, news and videos that make you outraged and angry, something borne out by the data. According to one site monitoring YouTube trends, the most effective way to get your video selected by the algorithm is to include the words “hates”, “obliterates”, “slams” and “destroys” in the title.
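The ranking logic described above can be sketched as a toy model. To be clear, the scoring function, field names and weights below are illustrative assumptions, not any platform's actual algorithm; only the list of outrage words comes from the trend data quoted in the text.

```python
# Toy sketch of an engagement-maximising feed ranker.
# The weights and scoring formula are hypothetical assumptions used
# for illustration; they are not any real platform's algorithm.

OUTRAGE_WORDS = {"hates", "obliterates", "slams", "destroys"}

def engagement_score(post):
    """Estimate how long a post will keep a user on screen."""
    title_words = post["title"].lower().split()
    outrage_hits = sum(1 for w in title_words if w in OUTRAGE_WORDS)
    # Base score from past watch time, boosted for outrage language.
    return post["avg_watch_seconds"] * (1 + 0.5 * outrage_hits)

def rank_feed(posts):
    """Order the feed so the most attention-grabbing items come first."""
    return sorted(posts, key=engagement_score, reverse=True)

posts = [
    {"title": "Calm explainer on tax policy", "avg_watch_seconds": 40},
    {"title": "Senator DESTROYS rival in debate", "avg_watch_seconds": 40},
]
for post in rank_feed(posts):
    print(post["title"])
```

Even with identical watch histories, the outrage-laden title is ranked first, which is the dynamic the paragraph above describes.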

A major study at New York University found that for every word of moral outrage you add to a tweet, your retweet rate goes up by an average of twenty percent; the words that increased the retweet rate the most were “attack”, “bad” and “blame”. An investigation by the Pew Research Center, a non-partisan organisation that conducts opinion polling, demographic research and media content analysis, found that if you fill your Facebook posts with “indignant disagreement” you will double your likes and shares. What impact does this diet of disagreement, blame and hate have on our society? In our quest for likes and follows, driven by algorithms that reward indignant disagreement and penalise empathy, we have become slow to understand but quick to condemn. Worse, as we consume more and more of this content we become desensitised to it, and ever more shocking material is required to keep us scrolling and staring at our screens; that is exactly the diet digital media feeds us. It is an unwholesome diet full of ‘fake news’: according to a study by MIT, ‘fake news’ travels six times faster on Twitter than real news, and during the 2016 US Presidential election falsehoods on Facebook outperformed the top stories at nineteen mainstream news sites combined.
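The NYU retweet figure quoted above can be illustrated with simple arithmetic. Assuming, as a simplification, that the twenty percent uplift per outrage word compounds multiplicatively:

```python
# Illustration of the NYU finding quoted above: each word of moral
# outrage raises the retweet rate by roughly 20%. The multiplicative
# compounding here is a simplifying assumption for illustration.

def expected_retweets(baseline, outrage_words):
    """Baseline retweet count scaled by 20% per outrage word."""
    return baseline * (1.2 ** outrage_words)

baseline = 100
for n in range(4):
    print(n, round(expected_retweets(baseline, n), 1))
```

On this simplified model, a tweet containing three such words, say “attack”, “bad” and “blame”, would be expected to earn roughly 73% more retweets than the same tweet without them.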

In 2020, internal Facebook documents were leaked to the Wall Street Journal. They were the work of a group tasked with discovering whether the company's own algorithms were promoting intolerance and radicalising users. The group reached a definite conclusion.

‘Our algorithms exploit the human brain's attraction to divisiveness…if left unchecked [they will promote] more and more divisive content in an effort to gain user attention and increase time on the platform.’

Social media fosters intolerance and radicalises people!

It is a conclusion supported by two major studies. The first asked white supremacists how they became radicalised; the majority cited the internet, with YouTube as the site that most influenced them. The second, a separate study of far-right Twitter users, found YouTube to be, by a significant margin, the website they turned to most. The evidence of these studies is further supported by a second internal Facebook team whose work was also leaked to the Journal. That team found that 64% of all the people joining extremist groups had found their way to them because Facebook's algorithms were directly recommending them.

YouTube, Facebook, Twitter and Instagram all carry age ratings of 13+, but these ratings provide a false sense of security. Social media is fostering intolerance and promoting extremism, and the genie cannot be returned to its bottle: social media is an inescapable part of young people's lives. We must therefore approach it with caution, guide young impressionable minds, monitor their consumption and educate them about its dangers.