What is it about?

During the global pandemic, it became clear that the people we listen to are not always the people who know the most. While scientists and public health experts carefully explained what was happening, many turned to podcasts, social media posts, and influencers who spoke with confidence but without expertise. This is not just about misinformation. It is about a system designed to reward attention rather than accuracy. Social media algorithms amplify posts that provoke strong emotions, like fear or anger, while careful, evidence-based advice often goes unnoticed.

This trend is happening across many areas, not just health. Legal advice, household tips, and lifestyle recommendations are all shared by charismatic but untrained voices. People listen to them because they speak with certainty, even when they are wrong. Research shows that false information spreads faster than the truth and is much harder to correct once believed. Over time, this erodes trust in real experts and makes it harder for people to tell what is reliable.

Our article argues that expertise itself is being undervalued and replaced by viral charisma. The consequences are serious: lower vaccination rates, public distrust of science, conspiracy theories, and poor decision-making in critical areas. To address this, experts and institutions must engage the public directly, clearly, and often. Staying silent is no longer an option. We also need to rethink incentives in academia and media, so that public communication and engagement are recognized as vital work. Ultimately, understanding the difference between popular opinion and informed knowledge is essential for protecting public health, democracy, and the decisions that shape our world.

Why is it important?

This article highlights a new and urgent problem: algorithms reward charisma, not competence, creating an epistemic crisis in which expertise is sidelined. Unlike previous discussions of misinformation, this work links social media design, public engagement, and institutional incentives to show how society increasingly mistakes confidence for authority. Its timeliness comes from the ongoing consequences of this trend, from public health failures during pandemics to political polarization and environmental misinformation. By framing the issue as systemic rather than individual, the article calls for proactive engagement strategies from experts and institutions, making it both a warning and a practical guide for maintaining trust in knowledge in an era of viral influence.

Perspectives

Working on this article made me confront how much the structures of attention and influence have shifted in the digital age. In writing it, I realized that social media algorithms are not neutral: they reward confidence, charisma, and emotional impact over expertise. Having observed this in public health debates and political discussions, I understand that we are not facing isolated cases of misinformation but a systemic epistemic crisis. As a co-author, I feel the urgency of this moment. Experts cannot sit on the sidelines, and institutions must rethink how they support public engagement. For me, this work is a call to action. Knowledge alone is not enough; communicating it effectively in a world of viral attention is now part of the responsibility of scholarship.

Assoc. Prof. Ezra N. S. Lockhart
National University

Read the Original

This page is a summary of: Influence without insight: how algorithmic charisma is replacing experts, AI & Society, November 2025, Springer Science + Business Media, DOI: 10.1007/s00146-025-02706-y.
