DISINFORMATION: ‘A moral case based on rigorous technical research can bring about change’

CIVICUS speaks with Imran Ahmed, founding Chief Executive Officer of the Centre for Countering Digital Hate (CCDH), about the rise of disinformation and hate speech in the context of the pandemic, and the roles civil society can play in countering them. CCDH is an international civil society organisation that seeks to disrupt the architecture of online hate and misinformation. Founded in 2018, it develops strategies and runs campaigns to strengthen tolerance and democracy, as well as counterstrategies to resist new forms of hate and disinformation.

Imran Ahmed

How did the Centre for Countering Digital Hate get started and what is it trying to achieve?

The Centre for Countering Digital Hate seeks to disrupt the production and distribution of hate and misinformation in digital spaces. It exists because digital channels have become one of the primary means through which we transmit information, establish social mores and behavioural or attitudinal norms, and create value as a society.

As it happens, those spaces have been colonised by malignant actors who have undermined some of the basic precepts of our democracy. They use trolling to undermine tolerance and the liberal values that give everyone an equal voice in those spaces and use misinformation not only to destabilise the fundamental tenets of the scientific method but also to spread hate.

We try to counter this by making malignant activity more costly. We use exposure and inoculation to make it more difficult and create costs, whether political, economic, or social, for those undertaking malignant activity.

How did your work change under the COVID-19 pandemic?

As early as February 2020, we pivoted the entire organisation towards fighting COVID-19 misinformation. We saw that extremist groups that were already on our radar were having discussions about COVID-19 as an opportunity, and any opportunity for a neo-Nazi is a threat to a civilised democratic society.

We always try to put our efforts where there is most need. A few months earlier, in December 2019, we had done a study on vaccines and disinformation for the UK parliament’s All-Party Parliamentary Group on Vaccinations for All, so we were already aware that anti-vaxxers were a sophisticated group of misinformation actors. In a paper that we put together for the UK and US governments in April 2020, we expressed concern about a surge in xenophobia driven by the pandemic and rooted in psychological, social-psychological and neurological factors. There is a correlation between disgust sensitivity – which is heightened in a pandemic – and xenophobia. We also realised that if such sophisticated propagandists were able to professionalise the production of COVID-19 misinformation, they would cause a lot of trouble.

How does COVID-19 disinformation connect with identity-based hate?

At a very simple level, it comes down to the correlation between disgust sensitivity and xenophobia. Research in social psychology by Michael Bang Petersen, and explanations by neuroendocrinologists such as Robert Sapolsky, tell us that disgust sensitivity and group thinking are co-located in the insular cortex of the brain. For a year and a half we have warned that this is a problem: people have been primed at a very basic level, so that viewing anyone who is different from you and outside your group as a potential threat triggers frantic in-group thinking in the brain.

We know this is going to be an ongoing problem, but we do not know its long-term ramifications. This could potentially set back some of the work we’ve done, for example on migrants’ rights, or on climate change and taking responsibility for what happens to the world and not just to yourself. There is a lazy assumption that we are going to ‘build back better’ because people feel positive once they sense we are coming out of the pandemic, yet for the past year and a half we have been neurologically and psychologically primed to be very insular.

What programmes and campaigns have you developed to reduce disinformation and hate?

One of the things we do well is produce actionable intelligence. I think what is key about our model is that we do not produce raw data, or research, or even insight, which is the analysis of data in context. We produce actionable intelligence, which is insight plus an understanding of what it is that you can do to change things.

Part of the problem with digital misinformation and hate is that people do not know what they can do about it because the platforms are resistant to doing anything and absolve themselves of the problem. We challenged this understanding through our work on anti-vaxxers.

First, in late 2020 Facebook stated that anti-vax misinformation wasn’t banned on their platform, and then they changed that as a result of our research showing that misinformation causes harm. It may sound trite to say that misinformation causes harm in a pandemic, but it does, on a scale that is both massive and grave, and we had to go out and prove it. Second, their platforms were uniquely being used by these bad actors to organise, and we had to prove that as well. Third, we produced the ‘Disinformation Dozen’, an analysis that showed that just 12 anti-vaxxers were responsible for almost two-thirds of anti-vaccine misinformation circulating on social media platforms.

When we put out this research, everyone from President Biden to physicians begged social media platforms to change their behaviour and take responsibility as publishers. They have the biggest audience of any publishing company in the world, 4.5 billion users, and they must take that responsibility seriously. Recently Google announced that they are going to take action against the Disinformation Dozen. This took CCDH 18 months of campaigning. We were told it was a freedom of speech issue and that it would lead nowhere, but we have shown that if you present a moral case based on rigorous technical research, you can shift views and force people to confront the ramifications of the technology they have created. I think we have shown that change is possible, and I am very proud of that.

There are many areas affected by misinformation, from public health and migrants’ rights to sexual health and reproductive rights. In the last few months, for instance, we have taken on anti-abortion campaigners and violent extremist neo-Nazis in Ukraine, using the same model of rigorous research and strong campaigning. We put out a report showing that Google and Facebook were taking money from anti-abortion campaigners by running their ads. This means they were enabling terrible organisations to spread misinformation that undermines women’s reproductive rights. In response to our report, they removed those ads the next day. What is more, due to our campaign, in the last few weeks Heartbeat International and Live Action were banned from advertising on Google.

How can civil society come together to put more pressure on governments and big tech companies to hold them accountable?

We need more people who not only have good technical skills but also understand persuasion, campaigning and activism, and who can combine the moral argument with an understanding of the technology. In a risk society, where human-made risks and scientifically generated negative externalities increasingly make up what we campaign on, whether it is big tech undermining democracy and public health or climate change and the energy mix, it is more important than ever to understand that technical problems require moral arguments. You need to make the moral case and have the courage to make it, while also having a strong technical understanding of what is really going on.

For example, if you want to make the case, as President Biden did, that Facebook are killing people, you have to nail down exactly how their technology functions and be absolutely certain before you state it. That is what we do on the basis of our research. It is important to start reaching out beyond our usual allies and build alliances across science, technology and campaigning.

Get in touch with the Centre for Countering Digital Hate through its website or Facebook page, and follow counterhate on Instagram and @CCDHate on Twitter. 
