What is it about?
This study explores how to make fact-checking systems that use AI more trustworthy, transparent, and fair. As AI technologies increasingly help decide what is true or false online, they also shape how people understand and debate information. These systems do not simply verify facts. They influence who is seen as credible, what evidence counts as valid, and how truth itself is organized and communicated.

The paper reviews major global policies and governance models, such as the European Union's AI Act, the United States' Blueprint for an AI Bill of Rights, and UNESCO's recommendations on AI ethics. It compares how different regions, including countries in the Global South, are responding to the challenges of misinformation, deepfakes, and algorithmic bias. Using a structured review process (the PRISMA method), the study examines how governments and organizations are developing tools like algorithmic audits, transparency reports, and oversight councils to ensure accountability.

It highlights the need for participatory governance, meaning that citizens, civil society, and local communities should have a role in shaping how these AI systems work. The paper argues that fact-checking should be seen not just as a technical or journalistic task, but as part of a larger democratic process that safeguards knowledge, fairness, and public trust. To do this effectively, policies must be adaptive, inclusive, and sensitive to cultural and regional differences.

In short, the study provides a roadmap for how policymakers, researchers, and technology designers can work together to build AI systems that strengthen democratic accountability and uphold diverse ways of knowing in the digital age.
This page is a summary of: Trustworthy AI and the governance of misinformation: policy design and accountability in the fact-checking system, Transforming Government: People, Process and Policy, November 2025, Emerald. DOI: 10.1108/tg-09-2025-0273.