
After reflecting on the dialogue at the 2026 Cambridge Disinformation Summit, I think we have finally brought proper focus to the profound problems and core actors we face.
This year, we applied a macro lens to better understand the widespread harms from malign cognitive influence and disinformation campaigns. This offered a systemic view – which hovers well above a traditional focus on narrative veracity or quality – to help explore fundamental malign influence architecture and discuss global policy reform and accountability.
Our speakers reflected the full political spectrum: Liberal Democrats, Green, Labour, Conservative, Democrat and Republican – yet all converged on core fundamentals.
Summit takeaways include a focus on corruption and social media harm
Here in my view are the primary Summit takeaways:
1. Disinformation lays the groundwork for corruption and harm. Summit attendees engage in anti-corruption work, not censorship, and it will take cross-party collaboration to hold corrupt actors accountable and build safer infrastructure.
2. Social media infrastructure is currently designed for hate and rage profiteering, which also increases violence risk. Powerful influencers – particularly key global political leaders and some tech platform owners – exploit algorithmic asymmetry to threaten and marginalise targeted communities, influence election outcomes, set the political agenda, entrench loyal followers and harvest outsized wealth.
3. AI relationship apps, social media influencers and algorithm designs lure in, and potentially addict, audiences at moments when many are most vulnerable – feeling shame, anger, sadness, confusion or loneliness.
4. The risks and potential harms are acute for children and for members of communities who have long histories of being targeted.
The Summit reflected an urgency to hold tech platforms to account
I sense a shared urgency for collective action, mirroring rising public sentiment – including recent jury verdicts cited by several Summit speakers – to hold platforms and their leadership accountable for downstream harms. These verdicts included one in New Mexico against Meta for failing to protect children from material on its platforms, and one in California against Meta and YouTube involving addictive features and failure to adequately warn users of potential harms.
I also sense shared momentum for major policy change and accountability enforcement by global lawmakers.
In his Summit remarks, Sir Sadiq Khan, Mayor of London, warned that London faces a “dark blizzard of disinformation” about a city in alleged decline, fuelled by an “outrage economy” that breeds division and allows profiteering from such division.
Utah Governor Spencer Cox said he initially believed that social media would bring societies together, but that has proved sadly wrong:
We’ve seen a significant increase in anxiety, depression, self-harm, loneliness, suicides sadly, and it correlates almost perfectly to the advent of social media and the explosion of smart phones with our youth. The research continues to drive that point home, that this has caused more division not to mention polarisation in our politics and the discord that we’re seeing in so many of our nations.
Harms to targeted communities
Esosa Osa, from Onyx Impact, Mobashra Tazamal, from the Bridge Initiative, and Meredith Clark from the University of North Carolina discussed how tech platforms proliferate harms to targeted communities by profiteering off rage and tribalistic rhetoric.
Clark noted how social media platform ownership allows tech executives to influence the public agenda: “You can’t break up the existence and the background and the history [of the online black community], but you can control the means of information production. So, you buy the platform.”
Input from members of parliaments
Member of European Parliament Alexandra Geese and Member of UK Parliament Anneliese Dodds echoed concerns that media and platform executives influence the political agenda and elections, and discussed the need to preserve digital sovereignty. Participants also discussed investment in alternative infrastructure to protect digital sovereignty and offer safer online platforms.
The importance of these issues discussed at the Summit was underlined by post-Summit remarks in the House of Lords by Baroness Sarah Teather, a Liberal Democrat peer and former MP who was Minister of State for Children and Families. During a debate on Masculinity and Misogyny in Schools, Baroness Teather asked:
“My Lords, last week I attended the Cambridge Disinformation Summit run by Cambridge Judge Business School, where a key takeaway for me was that restricting young people’s access to social media is not on its own a sufficient response to the risks we’re discussing today.
“Does [Baroness Smith of Malvern, the Minister of State, Department for Education and Department for Work and Pensions] agree that we need accountability from social media companies on algorithms that promote and target extremist content to both adults and children?”
Baroness Smith replied:
“I largely agree with the noble Baroness.”
Looking ahead to a discussion of tech CEO culture
What I took away from the Summit is that we need to better analyse, and intervene against, the actors who wield – and the structures that sustain and amplify – the corruption of knowledge to advance corrupt acts.
This is why we will next host a policy-informing discussion to explore tech CEO culture, with Silicon Valley-based podcaster and journalist Kara Swisher, on Saturday 9 May 2026. Find out more and register for the event.
Related content
Watch a video of Baroness Sarah Teather’s remarks on tech platform accountability, UK House of Lords, 15 April 2026
Watch selections of quotes from speakers at the Disinformation Summit




