
Fighting disinformation needs interdisciplinary approach

3 August 2023

The article at a glance

The first Cambridge Disinformation Summit ends with a call to lobby for access to social and sharing media platform data.

The first Cambridge Disinformation Summit ended with a call for an interdisciplinary lobby to gain timely access to social and sharing media platform data to better understand how advertising targeting and sharing recommendations affect the spread of deliberately harmful information. 

“I propose that this community pool resources to persuasively lobby for transparency to platform data and processes so we can understand what is actually happening in these systems,” conference organiser Professor Alan Jagolinzer said in remarks closing the Summit. 

Professor Alan Jagolinzer opening the summit.
Alan Jagolinzer with panel members.

Young people should be included in the fight against disinformation 

The Professor of Financial Accounting at Cambridge Judge Business School also called for a collaborative effort from the assembly of global academic, policy, and practice leaders to support grade-school teachers in developing curricula in critical thinking and media literacy. This call followed a discussion with invited speakers from Finland and Sweden, who shared how their countries’ children develop resilience to disinformation through early-age literacy education, including the ‘Don’t be fooled’ campaign.

Among the next steps following the Summit, which was organised by Cambridge Judge Business School, will be an event to share interdisciplinary research and policy proposals at the University of Zurich, at a date to be announced.

The 27-28 July Summit, held at King’s College, University of Cambridge, brought together 210 people, including psychologists, accountants, management scientists, political scientists, legal experts, journalists, and others involved in studying and combating the rise of disinformation in societies around the world. Another 200 watched via live stream.

Six common elements in different types of disinformation 

Opening the Summit, Professor Jagolinzer, who is also Co-Director of the Cambridge Centre for Financial Reporting & Accountability, said that he observes the same six key elements in all disinformation campaigns, regardless of the setting in which they operate:

  1. The existence of a malign actor (or actors) 
  2. An incentive or benefit 
  3. An intentionally false and emotionally charged message 
  4. Specifically selected disinformation channels 
  5. A selected target audience 
  6. The intention to exploit that audience for intended benefit. 

He suggested this pattern appears in examples of financial fraud, authoritarian governments, sexual predation, and attempts to marginalise out-groups. 

Systems need to be democratically governed 

The featured speaker at the Summit’s closing dinner was Frances Haugen, an advocate for social media accountability and transparency, known for blowing the whistle on Facebook by disclosing internal company documents in 2021.

She said that Facebook was adept at “making false choices”, such as “you can either have safety or freedom of speech, when there are 20,000 pages of documents saying you have a lot of other options. It’s either ‘you let us do whatever we want or China wins’”.

She concluded her remarks, however, on a more optimistic note, saying that while new technologies have disrupted society for hundreds of years, thoughtful citizens have always found ways to fight back against the resulting problems and abuses.

While cheap printing presses allowed disinformation in newspapers that provoked wars, “we learned and we responded. We developed journalism schools and journalistic ethics. We started doing things around media concentration laws and ownership disclosure laws like that you can’t own all the newspapers in town. 

“This is our moment to respond. We are learning, every single time before we have learned and we have responded, so I want you to go forth in each of your roles in the ecosystem of accountability, and press for the idea that we deserve to democratically govern these systems, because we will respond again.”

The promise and risk of AI 

The Summit opened with an address on the definition, history, strategy and consequences of disinformation from Stephen Jolly, a former senior British Ministry of Defence official and Fellow Commoner at St. Edmund’s College, University of Cambridge.

Summit sessions examined why disinformation is effective, its societal implications, platform accountability versus free speech, journalistic ethics, open-source intelligence, the profitability of running disinformation campaigns, and the potential promise and risks of generative artificial intelligence (AI), along with the efficacy of an array of interventions to stem the impact of disinformation.

There was also a private screening of ‘The YouTube Effect’ documentary at the Cambridge Union, with a Q&A featuring actor-director Alex Winter and producer Gale Anne Hurd, whose credits include the ‘Terminator’, ‘Aliens’, and ‘The Walking Dead’ franchises.

Sander van der Linden, Professor of Social Psychology at the University of Cambridge, who supported several discussions at the Summit, said that while disinformation has been around for thousands of years, AI makes it much “faster and more persuasive”, to the benefit of bad actors. He said that “pre-bunking”, or introducing a small dose of a pending manipulative message along with context about why it is manipulative, is one tactic to help inoculate audiences against the negative effects of disinformation.

Microtargeting: targeting individuals based on their public data 

A session on the microtargeting of audiences for disinformation, chaired by Michael Willis, Management Practice Associate Professor at Cambridge Judge, included an explanation from David Stillwell, Professor of Computational Social Science at Cambridge Judge, of how microtargeting is aimed not at groups or communities but solely at individuals, based on their data.

“It’s sent to an individual, the rest of us don’t know, and the individual doesn’t know they’re being targeted, so it’s a black box,” David said. Other panellists noted that the intention is to isolate someone who has revealed – through public data footprints – that they would be most vulnerable to being fooled by disinformation.

David added that microtargeting can be made more open, such as the way that Netflix recommends movies based on a customer’s past viewing patterns. “I know that they’re doing it, I have some insight into how they are doing it, I have some control over that.”

Fighting disinformation is not censorship

Professor Jagolinzer acknowledged that there are active campaigns to undermine academic research on disinformation, typically by intentionally conflating research on disinformation with “censorship.” He displayed a contemporaneous social media post that falsely implied the Summit was intended to undermine free speech rights afforded under the US Constitution.

In his closing remarks, the conference organiser proposed that the Summit community should continue efforts to understand the mechanics and impact of malign influence campaigns. 

Professor Jagolinzer analogised that “everyone would agree that we need to understand why Elizabeth Holmes and Bernie Madoff engaged in their disinformation campaigns, why their intentionally false messaging sounded compelling, why they chose each of their victims, and why their victims could not see through the deception. This is not about censorship, but rather about understanding the deception process to enable prevention and accountability.”