
Tackling disinformation through an interdisciplinary approach

18 July 2023

The article at a glance

Why me? Professor Alan Jagolinzer explains why an expert in tightly regulated financial markets organised the first Cambridge Disinformation Summit, to be held 27-28 July, to find ways to tackle the lack of clear disinformation accountability elsewhere in society.

Category: AI and technology

The first Cambridge Disinformation Summit

The first Cambridge Disinformation Summit will be held 27-28 July at King’s College, Cambridge, bringing together thinkers from psychology, journalism, financial reporting, political science and other fields to explore this timely and critical topic.  

Panels over the two days range from the history of disinformation, to why disinformation is effective, to the balance between platform accountability and free speech. Participants will also discuss the benefits and risks of artificial intelligence (AI), journalistic integrity, and ways to combat disinformation. The keynote discussion on Day 2 focuses on social media responsibility and accountability, featuring Frances Haugen, Co-Founder of the nonprofit Beyond the Screen.

The Cambridge Disinformation Summit is the brainchild of Alan Jagolinzer, Professor of Financial Accounting at Cambridge Judge Business School. In this opinion article, he explains why he decided that disinformation needs urgent attention through an interdisciplinary approach, and why someone with a background in markets and financial accounting took on such a sprawling topic. 

Organising a global conference on disinformation

As the first Cambridge Disinformation Summit on 27-28 July draws nearer, I am frequently asked: “Why are you leading this project?” 

Many people have told me they are surprised that a Professor of Financial Accounting is organising a global interdisciplinary conference on the broad societal damage caused by disinformation, which includes exacerbating excess pandemic deaths, undermining democratic institutions, and fuelling wars of aggression and genocidal campaigns.

But accounting is an information science that has developed considerable infrastructure over many decades to minimise disinformation about a company’s financial health (which we call fraud) or about its environmental impact (which we call greenwashing). We have, for example, reporting standards, independent audits, audit committees, regulators who can enact civil or criminal penalties, and shareholder legal rights to hold managers accountable for intentionally misleading and harmful information, which is, by definition, disinformation. 

Personal episodes convinced me of the need for a broad discussion of disinformation 

During the pandemic, a close relative of mine died unnecessarily because of disinformation about vaccine efficacy and safety. I felt a chasm grow within my family and old friendship networks, where bringing evidence to challenge entrenched beliefs led to ad hominem attacks and indefinite breaks in engagement. I saw direct attacks on democratic institutions from politicians who once proudly supported those same institutions but had shifted to rhetoric that inspires random violence and is easily falsifiable. I also saw a war of aggression begin a few hours' flight east of Cambridge, one that has unnecessarily killed thousands and triggered geopolitical shockwaves including forced human migration, sizeable changes in energy supplies and pricing, and disruptions to other global supply chains.

What struck me, as I watched these episodes unfold, was that the underlying disinformation campaign behind each seemed to operate on the same fundamental psychological mechanisms I had seen used to perpetrate accounting fraud. In other words, I began to observe patterns and analogies common to all of these settings.

This led me to wonder: “Why do we have so much infrastructure to mitigate societal harm from intentional manipulation of information in a financial reporting setting, but virtually none of the same infrastructure to mitigate societal harm from intentional manipulation of information in other settings, like social media and journalism, where arguably the societal damage is significantly greater?” 

Can we bring the accountability infrastructure of accounting to other areas? 

In other words, why can chaos agents lawfully disseminate harmful disinformation on other platforms under the auspices of “free speech rights,” yet face significant penalties for doing so in my field? Could we incorporate similar accountability infrastructure into other information domains? If so, how? Who would be responsible for enforcement, and could that entity earn trust without trampling free speech rights in the process? I also began to wonder why most people who read corporate financial reports are inherently sceptical and heavily consume alternative data to make better-informed decisions, yet many who read social media or mainstream journalism reject alternative data outright, sometimes violently, because it does not align with their prior belief systems. This led me to deepen my learning about entrenched belief structures, and to explore how one might penetrate information echo chambers or create opportunities for people in different belief systems to relearn how to talk safely with each other and perhaps find common ground.

Collectively, these questions led me to seek out scholars beyond my field, from journalism, authoritarian studies, social psychology, psychiatry, public policy, computer science, law, and other information sciences, to help me better understand the nature of these information environments and the approaches each discipline is examining to mitigate the societal damage caused by malignant information chaos actors. What I found were pockets of highly knowledgeable and dedicated academics, policymakers, and practice professionals who shared my concerns about the global existential risks of disinformation, but who were approaching the problem from different angles and frameworks. I felt compelled to convene these groups, because I sensed the potential to amplify our collective impact if we shared learning across disciplines.

And that’s how the Cambridge Disinformation Summit was born at the Cambridge Judge Business School, with significant support from the accounting faculty groups at the University of Notre Dame and the University of Zurich, the Cambridge Psychometrics Centre, the Cambridge Social Decision-Making Laboratory, and the Cambridge Overcoming Polarization Initiative. 

Broad summit agenda ranging from history of disinformation to free speech 

The immediate goal of the Summit is to convene global thought leaders to discuss the history and consequences of disinformation, why disinformation is effective, its societal impact, platform accountability versus free speech rights, and fact checking, including the use of artificial intelligence and open-source intelligence. We will also examine who profits from disinformation, journalistic integrity, the customisation of disinformation for micro-targeting, and the efficacy of methods to combat disinformation. We are delighted to welcome our closing-dinner keynote speaker, Frances Haugen, Co-Founder of Beyond the Screen, who will discuss her advocacy work for accountability and transparency in social media. Frances is best known for her testimony to lawmakers in the US, Canada, and UK as the whistleblower who exposed internal evidence of harm from Facebook's business practices.

The longer-term objective of the Summit is to facilitate interdisciplinary collaboration on projects that can enhance efforts to mitigate global societal damage from disinformation. For example, I am working with academic, policy, and practice colleagues who will attend the Summit to develop business school curricula on the business risks and responsibilities of disinformation. We plan to explore direct business risks, such as stock or product market manipulation by chaos actors seeking financial profit. We also plan to explore strategies that chaos actors use to damage CEO or auditor reputations for political or financial gain.

We will also explore indirect business risks, such as geopolitical shocks that expose a business to disruption or loss. One example might be supply-chain disruption or stranded assets arising from a poor pandemic response or civil conflict. Another is BP's exposure to the Russian invasion of Ukraine, an invasion supported by sophisticated domestic and global disinformation efforts. BP had previously purchased a large stake in Russia's oil enterprise Rosneft and had placed its CEO on Rosneft's board. BP unexpectedly had to impair the value of its investment by US$25 billion and announce its CEO's departure from the Rosneft board.

Cambridge is uniquely positioned to address this societal issue 


In addition, we will explore business leaders' ethical and regulatory responsibilities as businesses trend towards collecting and disseminating considerable volumes of consumer information. Many new businesses, for example, are developing services or products that rely on health data collected from consumer wearables (such as smartwatches). We plan for students to discuss and debate the ethics of collecting mental health data from online subscribers when selling that data might be lucrative and government rules about data protection are lax. We also plan to discuss the increasing regulatory obligations arising from the growth of digital services legislation around the globe.

We are excited to host the Cambridge Disinformation Summit, and we look forward to disseminating some of the ideas generated after the Summit concludes. Why me? Because I am very sensitive to the global existential risk, and I recognise that Cambridge has a historical precedent of facilitating deep learning and interdisciplinary dialogue that can materially support humanity across future generations.
