15 Things To Learn About Disinformation, Propaganda And The Role Of Tech Platforms
Takeaways from 'AI x Disinformation', a conference co-hosted by BOOM and the University of Munich.
Sometimes it helps to state the obvious. Sometimes the obvious eludes us. In the spirit of both such cases, we present 15 takeaways from a conversation on the world of AI-fuelled disinformation, extreme speech, propaganda and more.
These lessons come from three conversations [YouTube link] that we co-hosted on 20th January at AI x Disinformation. The other co-host was LMU Munich (University of Munich) led by Professor Sahana Udupa.
The conversation on AI and propaganda featured three experts: Alaphia Zoyab, Advocacy Director at Reset.Tech, Shivam Shankar Singh, an author and former political consultant, and Karen Rebelo, BOOM’s Deputy Editor.
I’ve arranged the 15 takeaways into themes and have provided direct quotes from the speakers. (Points 11 and 12 were new to me too.)
Disinformation and propaganda don't always spread through WhatsApp and social media. Also, the mainstream media's role is underestimated and poorly understood.
“Disinformation and propaganda spreads across various media: mainstream media, cultural content, entertainment, etc.”
“The absolute crisis [is] in the mainstream media and the way that tacks onto social media. So for example, Fox News content goes crazy viral. Extreme content as disseminated from mainstream media has a double advantage on social media because the algorithms are primed to promote extreme speech, and you have this additional problem of slightly higher levels of trust in mainstream media anyway, and so therefore the combination of the two: the social media algorithm + disinformation emerging from mainstream media, is a real crisis.”
“And that’s something we have to address from potentially media regulation and media reform rather than social media reform.”
- Alaphia Zoyab
Research on disinformation is specific to Western countries and focused on English.
“If you think of the lifecycle of disinformation: the production, the distribution and the consumption of disinformation, research is focused very heavily on the aspect of distribution and particularly distribution on social media.”
“The glut of research on disinformation tends to be currently very heavily from the western world. From Europe, North America and very heavily focused on English. Which means that a lot of the interventions that we may be designing to tackle the problem are not actually global. And they may not actually fulfil needs of communities far away from the global north.”
“So what we need is … scaled-up research [which is] linguistically diverse, driven by different communities that understand the dynamics of disinformation and propaganda as it is targeted to them.”
“So I just really want to underscore the diversity point because we can’t design solutions if we haven’t engaged those communities and the problem to begin with.”
- Alaphia Zoyab
“We don’t have enough rigorous research on what are the interventions that actually work. So it’s very hard to know at the consumption level what sort of intervention has worked within a community, how durable is it, are the effects of media literacy programs really short-term or are they long-term? So we need much more rigorous research on that aspect: the consumption of disinformation.”
- Alaphia Zoyab
Tech platforms don’t have an incentive to alter their algorithms against extreme speech, propaganda and disinformation.
“…the way platforms are currently designed (which is maximised for growth), they have absolutely no incentive to change any of their policies and systems, and only regulation can make them do so. And that’s why we [Reset] lobby very heavily for regulation.”
- Alaphia Zoyab
Fact-checking and media literacy are only two of several approaches to fighting disinformation.
“I think there has been an overemphasis on fact-checking and a lot of research also shows that — banish this idea that facts are going to fight against disinformation because of this emotional appeal of some of these narratives. And so there are other ways to fight it, not just fact-checking.”
- Alaphia Zoyab
Editor’s Note: This fact has been obvious to us at least since 2018. Here’s a proposal we called the #5pFramework.
Social media and other tech platforms are not a mirror of society. Rather, they are a megaphone.
“‘Platforms are but a mirror of society’ is still a very dominantly held view. So the role of algorithms in changing our information ecosystem and our behaviour, from being just a mirror to being a megaphone, is still very invisible to most people. And even those of us in spaces where academics and researchers work day in and day out on this problem, I think we’re all collectively disconnected from the real world, where people still think there’s a seamless reflection of society on platforms.”
- Alaphia Zoyab
Europe is showing us how to regulate technology.
“The European Parliament has just voted on the DSA [Digital Services Act]. And the promise of regulation emerging from a slightly more progressive space is that it can hopefully set a precedent for other parts of the world. [Also it can] enable civil society from other parts of the world to demand the same things that platforms are forced to provide citizens in spaces where there is actually robust regulation.”
- Alaphia Zoyab
Your election data is misused to target you with propaganda.
“What we did was profiling constituencies, profiling voters, [and] putting them into different categories. After that someone else took over that data and used it for messaging. And when you have voters sorted into different caste buckets, different religious buckets, different socioeconomic buckets, different age groups, it's much easier to craft a message that will resonate with one group of people while sending a completely different set of messages to another group of people.”
“So this is the kind of thing that social media and digital propaganda allows, which makes it very different from conventional propaganda. If you look at, say, what happened during the Nazi regime in Germany: through movies, through television channels, through radio broadcasts, there was a lot of propaganda, but it was…propaganda for the entire population. Everyone heard the same messages. Everyone was to be convinced of the same thing.”
“That isn’t true of the world that we live in now. Now, what political parties want is for every specific group to believe a different message altogether. One caste group will get messages saying that they'll be favoured over other groups, while another group will be getting messages from the same party saying that they will be favoured instead.”
"If you look at digital propaganda, a major component of it is actually profiling. Facebook is a brilliant tool because it already allows you to profile people based on their likes, based on their activity on the platform itself. But a lot of the profiling is done by political parties themselves. They do it using the voter…just by using people's last name, just by using proxies like people's electricity bills that give you an idea of socioeconomic status, because if you have a higher bill then you probably have a bigger house with more appliances, so you are richer, that kind of thing. So right now, because we don't have a data protection law, things like people's phone numbers, things like people's electricity bills, even things like people's enrollment as beneficiaries of government schemes are not protected under any law.”
- Shivam Shankar Singh
Power in today’s world is not about military and economic power.
"[In] the world that we live in, power is no longer based on the army that you have or the amount of money that you have. Power is based on your ability to shape a reality for a population. If you're able to convince a certain section of society that okay, this is what is real and this is what is fake, you are probably going to have power.
Who controls the information that people see? It is these digital platforms. It's Facebook, it's Google, Instagram, WhatsApp, even all of the new platforms that are coming up…
This is especially true during COVID, because most of us are sitting at home and not actually moving out and interacting with the world. So if you consistently bombard anyone with a certain type of messaging, that becomes the basis of reality. That becomes the foundation of their worldview.”
- Shivam Shankar Singh
Governments around the world are trying to rein in big tech, but they profit from the status quo too.
"The issue that I primarily see with the world right now is that what you are allowed to post on Facebook, what goes viral on Facebook, what goes viral on Twitter, is right now determined by private entities. These private entities decide who's going to get blocked, who is going to get amplified, how the algorithms function, and what messages go viral. This is an incredible power in the modern world. It has the power to shape people's thoughts.”
"Governments are slowly starting to realize that this power is too much for a private entity. So what you see around the world right now is a fight between governments and these private entities where governments are trying to take back control.”
“…[Governments] realise how powerful the social media giants are, and they will want to take some of that power to themselves. And essentially, that is what I see a lot of the regulation in India being framed as: It's not to fix the problem of disinformation. It's not to fix the problem of fake news. It's to make the government the arbiter of what should be on a platform and what should be taken down and what should be going viral. So that is completely in line with the government's incentive; that is completely in line with the political parties’ incentive structure, where they get the power and control.”
- Shivam Shankar Singh
Political parties are abusing tech to get votes.
“I think technology lends itself to, you know, being abused. And it's a very powerful tool to, like some of the things that she said, personalise your message to a group of people, to reach a lot of people at the same time, and do a lot of harm as well. So we've been in this space for quite some time. We are always quite alarmed with everything that happens.”
“It seems to be like an arms race in terms of the tech. So you have bad actors who will stop at no cost; who have no incentive to self-regulate. There is no regulatory oversight. There is not even a discussion, even though deep fakes have been used and they've been used in elections in India.
We've seen an example of this in 2020, where, before the Delhi elections, the BJP Delhi unit created a deep fake of Manoj Tiwari; you know, one was a video where he speaks about Arvind Kejriwal. [It] was a message that was tailor-made for specific WhatsApp groups, but as much as it shocked all fact-checkers when that story came out…nothing after that happened. There wasn't any noise by any politician or even opposition parties to say that, hey, this should not be happening. Even the Election Commission didn't seem to care and it just died down. So one is that, you know, when we think of AI and propaganda, the average person thinks of deep fakes. Yes, it is very evolved and takes a lot of expertise and money as well.”
- Shivam Shankar Singh
It’s time we had audit powers over big tech.
“We don't have a clue as to what's actually happening in the backend of these platforms. We are unable to look inside the black box. So all of the research is based on a very narrow view of what we are able to research.”
“How are these platforms algorithmically amplifying disinformation and propaganda? The law—the DSA—enables regulators to look under the hood. It gives regulators audit powers, and that is one very significant advance from the status quo that we have today. And also I should say, the DSA is not yet law. It was passed by the European Parliament today [20th January], but it still has to be negotiated with the European Commission and the council and only then it gets passed. So we're still some ways off from a law actually, you know, being implemented, but these are all really positive signs.”
- Alaphia Zoyab
‘Environmental risk assessment’ is for tech companies too.
"[There’s an] elegant solution to this debate. We always end up in the Free Speech debate. [The solution] is that it forces the platforms to do risk assessment. So in the way that a factory or a mine would have to do an environmental risk assessment before it sets up, platforms now routinely have to do a risk assessment as to how their products and policies may be harming people. It is not an obligation they've ever had to fulfill.
And because we've sort of emerged from this world of tech utopia where algorithms and AI were going to solve the problems of humanity, everyone had a very rosy view of tech platforms. And this notion of risk assessment, or the notion that they could be deeply dangerous to society, was never entertained. But it is now enshrined in that piece of legislation, which will force the platforms to do risk assessments.”
- Alaphia Zoyab
Propaganda picks up during election season.
“[Election time is] game time for political parties. And sadly, propaganda has become an accepted part of nearly all elections. All parties become extremely active on social media during elections, in terms of pushing out their own message or in terms of targeting the opposition party.”
"So, yes, it's been a busy time…we are primarily focused on Uttar Pradesh. It's a huge state, it sends a lot of legislators to the parliament. From last time [2017], the difference that we've seen is that the other parties, the SP and the BSP, have gotten a lot more active. They've upped their social media game. I guess they've realized that if you want to take on the BJP, then you have to play the game as well. So we see a lot [of propaganda]: six months prior, almost a year before.”
- Karen Rebelo
Political advertisements on Facebook don’t have to be fact-checked. And it’s cheaper to spread anger on social media than positive messages.
"That was one of the most shocking revelations [of Facebook whistleblower Frances Haugen] for me—that it's actually cheaper to spread hate. Not just that it wins you votes, it's actually cheap, much cheaper as well.
I think one of the bigger problems, though, with the advertising space is the loophole that Facebook created around not fact-checking ads…particularly when they came from, like, politicians. I think that is absolutely unjustifiable.
The fact that you can post anything, the fact that it has to pass no fact-checking standards if it comes from political parties…I think that was one of the huge, huge gaps in that [it] enabled the further spread of disinformation and hate. So I'd say that is a huge loophole that Facebook, I don't recall if they've already plugged it, but that enabled a much bigger spread of misinformation.”
- Alaphia Zoyab
The Facebook Oversight Board is about self-regulation.
"So the Oversight Board, first and foremost, is still self-regulation. And self-regulation, we know, has failed because the platforms don't have the incentives to change their behavior. The second huge problem with the Oversight Board is that they look at individual pieces of content, which is like looking for the needle in the haystack.
The Oversight Board does not have the power to demand the kind of data access that I spoke about earlier, which hopefully will be enshrined in law: to look at how your algorithms are actually promoting X and Y pieces of content. Looking at individual pieces of content is utterly pointless. And so this big, grand institution [is] set up to do very little; to look at a very narrow aspect of the problem.
But I just want to add one more point, which goes to, I think, one of the questions in the chat: okay, regulation is all very well, but when the government is part of the problem, you know, what is the use of regulation? And I totally agree that there are some governments who you don't want legislating and interfering in content moderation questions. You're still, however, left with the problem that it is private entities regulating speech and thought.”
- Alaphia Zoyab
Note: I transcribed the whole conversation using AI tools. Ha, the irony!