The Art of the (Platform) Deal

Tech Platforms, Fact Checkers, and the Politics of Truth

Authors

1 Introduction

Meta’s decision to end its third-party fact-checking (3PFC) program in the United States can be viewed as part of a corporate deal-making strategy to leverage its substantial and far-reaching content-moderation apparatus to court political favor with right-wing politicians and their allies. We argue that this potentially marks a turning point for digital platforms that more explicitly barter information integrity initiatives to retain their agency and monopoly by subverting politically targeted regulation. Fact checkers once again face challenges to their legitimacy, funding, access to information, and public reach. These changes further institutionalize an epistemology based on ideology and transaction rather than evidence seeking and truth telling (Barrett, 2025; Gilbert, 2025). Meta’s recent move also provides a political deal-making playbook for other US-based digital platforms, stepping away from its earlier “network governance” approaches (Caplan, 2023).

In this commentary, we argue that recent platform developments implicating the field of independent fact checking set a precedent for deal-making rather than policy-making between industrial and political leaders. Meta’s recent decision might create a new status quo for relationships between populist leaders, tech platforms, and fact checkers, which moves away from practical, evidence-informed public-interest regulation towards oligarchical patronage (Meese, 2023), deal-brokering, and political alliance-building. We discuss the possible, far-reaching implications of this new status quo for social media and its regulation. First, we reconsider the changes to Meta’s 3PFC program in light of the historical role that fact checkers have played in political discourse. We consider Meta’s stated intentions for the program over the years and how it has been received politically. We then examine the evolving relationships between other platforms and fact checkers and the role that crowd-sourced approaches play in content moderation. Finally, we discuss what this new phase of deal-making means for fact checking and the future of information integrity.

2 Ripping Up the Rule Books: The Inception of Meta’s Third-Party Fact-Checking Program

In 2016, Meta’s (then Facebook’s) announcement of the launch of its 3PFC program read: “It’s important to us that the stories you see on Facebook are authentic and meaningful” (Mosseri, 2016, para. 9). The tone and approach adopted by the platform company almost 9 years later could not be more different. Appearing on the Joe Rogan Experience podcast (Rogan, 2025), Mark Zuckerberg described Meta’s experience with professional fact checking as “something out of like, you know, 1984” (06:23). Zuckerberg’s recent statements capture Meta’s reluctance to be perceived as a publisher and echo past public statements responding to criticism of the platform’s decision to fire human curators working on Meta’s newsfeed algorithm, as well as complaints from global journalists after Meta removed a Pulitzer Prize-winning photograph of the Vietnam War (Bucher, 2018). Zuckerberg now uses the word “censorship” to describe what journalists generally understand as “editorial responsibility.”

Conservative critics have long equated fact checking with censorship (Stewart, 2021; Barrabi, 2024; Uscinski & Butler, 2013). Fact checkers have vehemently denied accusations of censorship, stating that these are often attacks from actors who mislead publics intentionally and seek political gain (Gilbert, 2025). Meta’s recent decisions should also be considered in light of other moves by big tech companies in the United States that are part of the race to capture the global artificial intelligence (AI) business over the next four years, with US$500 billion invested in infrastructures and development (OpenAI, 2025). Meta could indeed be appealing to politicians to avoid missing out on these investment opportunities. Essentially, Meta’s new political stance distances its current US deal-making approach from previous efforts to establish regulatory mechanisms through policy consultation with foreign governments to strengthen moderation countering mis- and disinformation.

Although it is not clear how Meta’s crowdsourced fact-checking model will work in the United States, the remainder of the global fact checkers working for Meta’s 3PFC program face an uncertain future, given that the consequences of this new US platform deal-making strategy are yet to unfold. Fact checkers, as only one part of the platform content-moderation apparatus, engage in “soft” content moderation, which focuses on verifying particular kinds of problematic content that could cause public harm (Graves et al., 2023). Their role as “de facto moderators” has long been debated, with researchers investigating how fact-checking organizations may be prioritizing the interests of platforms (Vinhas & Bastos, 2023; Riedlinger et al., 2024) given the “convoluted relationship between the democratic mission of fact checking and the content moderation bureaucracy of social platforms” (Vinhas & Bastos, 2023, p. 3).

3 Controlling the Conversation: Meta’s Attack on Fact Checking

Meta’s announcement has enabled the platform company to gain control of the narrative by casting doubt on fact checkers’ neutrality instead of reassuring users of the continued importance of content moderation and outlining what a new paradigm might look like on its platforms. The chief reasons provided by Zuckerberg for Meta’s abandonment of the program in the US were that fact checking did not successfully address misinformation on Meta’s platforms, that it stifled free speech, and that it led to widespread censorship. These accusations have surprised fact checkers, who have challenged Zuckerberg’s statement that the 3PFC program demonstrated biases (see, for example, Holan, 2025; Schenk, 2025a).

While Zuckerberg blames the 3PFC program for being too political, public-interest fact checking – and indeed content moderation – is intrinsically linked to politics. The 3PFC program was a policy response to international concern among audiences and regulators about the role that Facebook played in international democratic processes in 2016 (Ohlheiser, 2016; Stray & Sneider, 2025). The timing of this policy suggests that Facebook acted in anticipation of regulation, involving civil society actors to address legitimacy concerns (Caplan, 2023) and favoring self-governance (Suzor, 2019). The original remit of the 3PFC program was to safeguard democracy by tackling viral claims on issues that are timely and consequential for democratic citizens. Meta particularly emphasized major issues such as “COVID-19, global elections, natural disasters, conflicts and other events” (Meta, 2021, p. 1). Major global events, mis- and disinformation, politics, and fact checking are inherently connected, and although fact checkers have never claimed to use scientifically rigorous methods (Amazeen, 2015), their methods have always sought to reflect scientific consensus, separating institutionally recognized facts from value-laden content (Yarrow, 2021). Yet, science itself – especially climate science and health science – is also politically contested (see, for example, Sarewitz, 2004). The COVID-19 pandemic brought to light the critical role of fact checking in countering harmful public health claims, but it also exposed its limitations (Burel et al., 2020; Luengo & García-Marín, 2020). COVID-19 was a case study of extreme politicization around the distribution of vaccines, isolation measures, and the allocation of blame (Forman et al., 2021), especially in the United States (Stroebe et al., 2021; Kerr et al., 2021). 
During the pandemic, fact checkers continued to fulfill their watchdog roles, at times maintaining significantly higher online engagement when verifying claims by elite actors, including politicians, alternative-health influencers, and other public figures (Riedlinger et al., 2024). However, as a result of these practices, fact checkers were targeted by conservative critics for allegedly being biased (Moon et al., 2023), in particular for exhibiting the kind of progressive bias often associated with knowledge-making industries.

Academic critiques of fact-checking practices focus on these organizations’ normative conflation of empirical data with objectivity and their imposition of binary truth/falsity metrics on political speech (Uscinski & Butler, 2013). These practices ignore the complex political nature of selective data generation and analysis and normative reasoning (Yarrow, 2021). Yet, fact checkers also re-situate contentious and consequential claims within their social, political, and empirical contexts (Amazeen, 2015; Watt et al., 2024). Studies suggest correlations between US conservative voters, a tendency to downplay the harms of misinformation (Park, 2024), and a tacit belief that fact checkers are biased (Walter et al., 2020). Political communication scholars coined the term factual belief polarization to describe the tendency for actors with opposing ideological beliefs to differ in what they consider valid knowledge-seeking practices and legitimate evidence (Lee et al., 2021; Rekker, 2022). Actors’ attempts to discredit information and erase complexities are indeed aspects of destructive polarization (Esau et al., 2024). Fact checkers have attempted to move against partisanship and destructive polarization by reintroducing complexity and following evidence in a politically agnostic way (Watt et al., 2024). Members of the 3PFC program also commit to non-partisanship and a transparent fact-checking process when they become signatories to the International Fact-Checking Network’s (IFCN) Code of Practice (Meta, 2021).

Meta’s own Oversight Board tempered Zuckerberg’s accusations of bias but welcomed changes to a system that had attracted controversy (Oversight Board, 2025). Still, Zuckerberg’s accusations remain unsubstantiated beyond a tacitly held partisan belief, and even Meta’s publications support the 3PFC program. According to Meta, the program has been successful in what it set out to do, which is to reduce the spread of misinformation (Meta, 2021): “We know this program is working and people find value in the warning screens we apply to content after a fact-checking partner has rated it” (para. 7).

More recently, a report by Meta in response to the European Union’s (EU) Digital Services Act demonstrated Meta’s support for the 3PFC program and indicated that it stood by the decisions of its fact checkers, at least in the EU. According to the report, the fact checkers in Meta’s 3PFC program flagged 13,876,973 pieces of misinformation content in the EU and debunked them “as False, Altered, or Partly False” (Meta, 2024, p. 13), leading to the content’s algorithmic demotion. The data in the report shows that misinformation flagged by fact checkers was by far the largest group of algorithmically demoted content (excluding deleted policy-violating content). Only 1,849 of these algorithmic demotions were reversed, which amounts to the lowest reversal rate among all categories of demoted content (0.01%). At the time of writing, there is no comparable report on 3PFC program operations in the US or data to suggest different findings from the EU.

Although Meta’s decision cannot – at least yet – be justified empirically, it can be explained by a recent strategic and ideological shift towards US conservative politics. Meta may have severed ties with fact checkers as a necessary but calculated concession to maintain its political and business interests in the United States. US President Donald Trump himself described it as likely a reaction to threats he made to the company and Zuckerberg in the lead-up to his election (Chan et al., 2025). This is tied to a “jawboning” controversy in which the Biden administration was alleged to have unduly influenced Meta’s removal of heavily politicized content around elections and COVID-19 (Chung, 2024). Retreating from its US fact-checking partnerships might be Meta’s only option for reasserting control over its content policies and avoiding further political backlash. In the meantime, even though Meta is not the only platform stakeholder in independent fact checking, the 3PFC program has become a primary source of income for many independent fact checkers (IFCN, 2023), and there are no clear funding alternatives matching its current scale of operation. Alongside funding from Meta, IFCN-accredited fact checkers receive modest investments from other platform companies, including Google through its Google News Initiative and TikTok via its fact-checking partners program (Wake et al., 2024). The future of these programs remains uncertain.

4 One Deal May Set a Trend

If maintaining the goodwill of powerful political players in the US continues to be prioritized over information integrity, then it is important to consider what Meta’s recent policy changes mean for other tech giants and, consequently, for fact checkers. Given what we know about platform conformity from institutional theory – notably Caplan and boyd’s (2018) investigations of algorithmic isomorphism – we can assume that there will be little platform appetite or support for professional fact checking in the United States for quite some time. The recently signed US Executive Order Restoring freedom of speech and ending federal censorship specifically targets the use of platform-supported fact checkers in content moderation (The White House, 2025). But this is not the first time that the content moderation decisions of platform companies have signaled a lack of commitment to responsible content provision. After taking over Twitter, Elon Musk changed the platform’s policies to enable users to share misinformation about COVID-19 and vaccines and disbanded its Trust and Safety Council. YouTube subsequently changed its moderation policy to allow content that could attribute the results of the 2020 presidential election to widespread fraud, and Meta scaled back its trust and safety team and fact-checking efforts (Field & Vanian, 2023).

In light of Meta’s most recent changes, Google has announced that it will not comply with the EU’s Digital Services Act’s requirements to implement fact checking on Search and YouTube in EU countries (Weatherbed, 2025). However, in other areas, Google has demonstrated significant support for fact checkers, investing $13.2 million in 2022 in grants to IFCN members through its Global Fact Check Fund, and the Google News Initiative’s Fact Check Explorer streamlines access to fact checker–verified content. It appears that YouTube’s approach to content moderation will remain institution-driven for now. YouTube’s Priority Flagger Program prioritizes partnerships with non-governmental organizations and government agencies to flag content that violates the platform’s Community Guidelines rather than supporting individual volunteer participation or professional fact checking (Google, 2023).

Bluesky, a small but fast-growing platform, is resisting the move away from professional fact checking. A well-timed – albeit modest – grant to develop an independent fact-checking presence on its platform was recently announced (Schenk, 2025b). However, the dynamics of platform moderation are shifting. TikTok, which has established a fact-checking partnership similar to Meta’s through its TikTok Fact-Checking Partners program, has remained notably silent on the issue, likely due to its ongoing challenges with US lawmakers. Still, with Meta’s 3PFC program gone, TikTok could play a pivotal role in shaping the future of platform-supported fact-checking initiatives.

5 Trading Truth: Meta’s Exit from Fact Checking and the Future of Online Information Integrity

Meta initially stated that its decision to end the 3PFC program would only apply to the US, but representatives have since indicated that they expect Meta’s content moderation policy to eventually roll out to other regions (Haeck, 2025). This shift in content policy is likely to have significant downstream effects on other information integrity initiatives, including crowd-sourced content contextualization approaches like X’s Community Notes, which, according to one study, cites fact checks as its third most common source after Wikipedia and other posts on X (Maldita, 2025). Another study, published as a preprint, examined how fact checkers’ articles were cited in Community Notes and found that fact checking was critical to community moderation practices, especially when addressing claims tied to larger narratives (Borenstein et al., 2025). The gap left in the US platform fact-checking landscape is likely to empower cynical fact checking by motivated actors (Montaña-Niño et al., 2024), exemplified by the Russian state’s Global Fact-Checking Network (EDMO, 2024). This will further empower state propagandists and facilitate the suppression of dissent by authoritarian leaders and violations of press freedom around the world. Fact-checking organizations have long maintained that fact checks must go hand in hand with other initiatives that increase the resilience of audiences and accountability for powerful actors operating in their own interests (Full Fact, 2019). These activities, which include upstream advocacy and lobbying, media-literacy campaigns, and technological interventions for screening out the most harmful content, are made possible by the steady and substantial source of income provided by Meta’s 3PFC program to fact checkers, many of whom rely on it (IFCN, 2023).

As we have argued in this article, Meta’s recent changes are part of a larger shift by US tech companies to align with conservative leaders, which will have implications for global efforts towards information integrity. Meta’s decision sets a precedent for private platform deal-making with populist leaders and lays the foundations for platforms and populist leaders to achieve greater influence over global information flows. We argue that this places the world’s largest social media platform ecosystem on a trajectory that runs against information integrity. Meta’s strategic rebranding comes at a pivotal time when big tech has been under heightened scrutiny by powerful stakeholders, including Trump. Essentially, the move aligns with the current trend of sycophancy by the US tech elite towards the US administration (AP News, 2024), an alignment likely intended to shield big tech from both legal and reputational accountability for harmful content circulating on digital platforms (Gilbert, 2025).

Given this reduced capacity for content verification, it is likely that political deal-making of this kind will contribute to further misinformation-related harm and damage democratic institutions for years to come. The recently signed US Executive Order Restoring freedom of speech and ending federal censorship, combined with this political deal-making approach, undermines the leverage of democratic institutions and voting publics to respond. This does not bode well for the politics of truth; as Trump put it, “Leverage: don’t do deals without it” (Trump & Schwartz, 1987, p. 54).

References

Amazeen, M. A. (2015). Revisiting the epistemology of fact-checking. Critical Review, 27(1), 1–22. https://doi.org/10.1080/08913811.2014.993890

AP News. (2024, December 12). Amazon to contribute $1 million to Trump’s inauguration fund. Meta is also donating $1M. https://apnews.com/article/trump-meta-zuckerberg-inauguration-donation-c540bf7c638def11b8428e633965c718

Barrabi, T. (2024, November 18). Incoming FCC chair Brendan Carr vows to “dismantle” Big Tech’s “censorship cartel.” New York Post. https://nypost.com/2024/11/18/business/incoming-fcc-chair-brendan-carr-vows-to-dismantle-big-techs-censorship-cartel/

Barrett, P. M. (2025, January 27). Some facts about fact-checking: Defending the imperfect search for truth in an era of institutionalized lying. Tech Policy Press. https://techpolicy.press/some-facts-about-fact-checking-defending-the-imperfect-search-for-truth-in-an-era-of-institutionalized-lying

Borenstein, N., Warren, G., Elliott, D., & Augenstein, I. (2025). Can community notes replace professional fact-checkers? arXiv preprint arXiv:2502.14132. https://arxiv.org/pdf/2502.14132

Bucher, T. (2018). If... then: Algorithmic power and politics. Oxford University Press.

Burel, G., Farrell, T., Mensio, M., Khare, P., & Alani, H. (2020). Co-spread of misinformation and fact-checking content during the COVID-19 pandemic. In S. Aref et al. (Eds.), Social informatics. SocInfo 2020 (Lecture Notes in Computer Science, Vol. 12467, pp. 28–42). Springer International Publishing. https://doi.org/10.1007/978-3-030-60975-7_3

Caplan, R., & boyd, d. (2018). Isomorphism through algorithms: Institutional dependencies in the case of Facebook. Big Data & Society, 5(1). https://doi.org/10.1177/2053951718757253

Caplan, R. (2023). Networked platform governance: The construction of the democratic platform. International Journal of Communication, 17, 3451–3472. https://ijoc.org/index.php/ijoc/article/view/20035

Chan, K., Ortutay, B., & Riccardi, N. (2025, January 7). Meta eliminates fact-checking in latest bow to Trump. AP News. https://apnews.com/article/meta-facts-trump-musk-community-notes-413b8495939a058ff2d25fd23f2e0f43

Chung, A. (2024, June 26). US Supreme Court will not curb Biden administration social media contacts. Reuters. https://www.reuters.com/legal/us-supreme-court-wont-curb-biden-administration-social-media-contacts-2024-06-26/

European Digital Media Observatory. (2024, November 29). Putin’s new plan to undermine fact-checking. EDMO. https://edmo.eu/publications/putins-new-plan-to-undermine-fact-checking/

Field, H., & Vanian, J. (2023, May 26). Tech layoffs ravage the teams that fight online misinformation and hate speech. CNBC. https://www.cnbc.com/2023/05/26/tech-companies-are-laying-off-their-ethics-and-safety-teams-.html

Forman, R., Shah, S., Jeurissen, P., Jit, M., & Mossialos, E. (2021). COVID-19 vaccine challenges: What have we learned so far and what remains to be done? Health Policy, 125(5), 553–567. https://doi.org/10.1016/j.healthpol.2021.03.013

Full Fact. (2019). Fact checking doesn’t work (the way you think it does). https://fullfact.org/blog/2019/jun/how-fact-checking-works/

Gilbert, D. (2025, January 7). Meta’s fact-checking partners say they were ‘blindsided’ by decision to axe them. Wired. https://www.wired.com/story/metas-fact-checking-partners-blindsided/

Google. (2023). About the YouTube Priority Flagger program. https://support.google.com/youtube/answer/7554338?sjid=10340825454164077783-NA

Haeck, P. (2025, February 4). Meta chief lobbyist slams EU tech laws and fines. Politico. https://www.politico.eu/article/meta-chief-lobby-eu-tech-artificial-intelligence-fines-marketplace-joel-kaplan/

Holan, A. D. (2025). Statement as International Fact-Checking Network Director on Meta ending its US factchecking program [Social media post]. LinkedIn. https://www.linkedin.com/posts/angieholan_heres-my-statement-as-international-fact-checking-activity-7282421486085107712-VaKN/

IFCN. (2023). State of the fact-checkers report. Poynter. https://www.poynter.org/wp-content/uploads/2024/04/State-of-Fact-Checkers-2023.pdf

Kerr, J., Panagopoulos, C., & Van Der Linden, S. (2021). Political polarization on COVID-19 pandemic response in the United States. Personality and Individual Differences, 179, 1–9. https://doi.org/10.1016/j.paid.2021.110892

Lee, N., Nyhan, B., Reifler, J., & Flynn, D. J. (2021). More accurate, but no less polarized: Comparing the factual beliefs of government officials and the public. British Journal of Political Science, 51(3), 1315–1322. https://doi.org/10.1017/S000712342000037X

Luengo, M., & García-Marín, D. (2020). The performance of truth: Politicians, fact-checking journalism, and the struggle to tackle COVID-19 misinformation. American Journal of Cultural Sociology, 8(3), 405–427. https://doi.org/10.1057/s41290-020-00115-w

Maldita. (2025, February 13). Faster and more useful: The impact of fact-checkers in X’s Community Notes. https://maldita.es/nosotros/20250213/community-notes-factcheckers-impact-report/

Meese, J. (2023). Digital platforms and the press. Intellect. https://intellectdiscover.com/content/books/9781789388336

Meta. (2021). How Meta’s third-party fact-checking program works. Meta for Media. https://www.facebook.com/formedia/blog/third-party-fact-checking-how-it-works

Meta. (2023). Meta response to the Australian code of practice on disinformation and misinformation. https://digi.org.au/wp-content/uploads/2024/05/Meta-Transparency-Report-2024-Australian-Code-of-Practice-on-Disinformation-and-Misinformation.pdf

Meta. (2024). Regulatory transparency reports. Meta Transparency Center. https://transparency.meta.com/sr/dsa-transparency-report-apr2024-facebook

Meta. (2025). Where we have fact checking. https://www.facebook.com/formedia/mjp/programs/third-party-fact-checking/partner-map

Montaña-Niño, S., Vziatysheva, V., Dehghan, E., Badola, A., Zhu, G., Vinhas, O., ... & Glazunova, S. (2024). Fact-checkers on the fringe: Investigating methods and practices associated with contested areas of fact-checking. Media and Communication, 12, 1–19. https://doi.org/10.17645/mac.8688

Moon, W. K., Chung, M., & Jones-Jang, S. M. (2023). How can we fight partisan biases in the COVID-19 pandemic? AI source labels on fact-checking messages reduce motivated reasoning. Mass Communication and Society, 26(4), 646–670. https://doi.org/10.1080/15205436.2022.2097926

Mosseri, A. (2016). Addressing hoaxes and fake news. Meta. https://about.fb.com/news/2016/12/news-feed-fyi-addressing-hoaxes-and-fake-news/

Ohlheiser, A. (2016, November 19). Mark Zuckerberg outlines Facebook’s ideas to battle fake news. Washington Post. https://www.washingtonpost.com/news/the-intersect/wp/2016/11/19/mark-zuckerberg-outlines-facebooks-ideas-to-battle-fake-news/

OpenAI. (2025). Announcing the Stargate Project. https://openai.com/index/announcing-the-stargate-project

Oversight Board. (2025, January 7). Oversight Board to engage with Meta over its fact-checking replacement. https://www.oversightboard.com/news/oversight-board-to-engage-with-meta-on-its-fact-checking-replacement

Park, C. S. (2024). Why people rely on fact-checkers? Testing theses of “Perceived severity of fake news” and “Disappointment in news media.” Journalism Studies, 25(1), 1–18. https://doi.org/10.1080/1461670X.2023.2289878

Rekker, R. (2022). Political polarization over factual beliefs. In J. Strömbäck, A. Wikforss, K. Glüer, T. Lindholm, & H. Oscarsson (Eds.), Knowledge resistance in high-choice information environments (pp. 222–236). Routledge.

Riedlinger, M., Montaña-Niño, S., Watt, N., García-Perdomo, V., & Joubert, M. (2024). Fact-checking role performances and problematic Covid-19 vaccine content in Latin America and Sub-Saharan Africa. Media and Communication, 12, 1–25. https://doi.org/10.17645/mac.8680

Rogan, J. (Host). (2025, January 10). #2255 Mark Zuckerberg. In The Joe Rogan Experience. https://open.spotify.com/episode/3kDr0LcmqOHOz3mBHMdDuV

Sarewitz, D. (2004). How science makes environmental controversies worse. Environmental Science & Policy, 7(5), 385–403. https://doi.org/10.1016/j.envsci.2004.06.001

Schenk, M. (2025a, January 7). Meta ends third-party fact-checking partnership in the U.S. Lead Stories. https://leadstories.com/analysis/2025/01/end-of-meta-partnership.html

Schenk, M. (2025b, January 28). Lead Stories awarded $100,000 ENGAGE grant to build fact check labeling service for Bluesky. Lead Stories. https://leadstories.com/analysis/2025/01/lead-stories-engage-grant-bluesky-labeling-service.html

Stewart, E. (2021). Detecting fake news: Two problems for content moderation. Philosophy & Technology, 34(4), 923–940. https://doi.org/10.1007/s13347-021-00442-x

Stray, J., & Sneider, E. (2025, February 4). Meta dropped fact-checking because of politics. But could its alternative produce better results? Tech Policy Press. https://www.techpolicy.press/meta-dropped-fact-checking-because-of-politics-but-could-its-alternative-produce-better-results/

Stroebe, W., Vandellen, M. R., Abakoumkin, G., Lemay Jr, E. P., Schiavone, W. M., Agostini, M., ... & Leander, N. P. (2021). Politicization of COVID-19 health-protective behaviors in the United States: Longitudinal and cross-national evidence. PLoS ONE, 17(1), 1–22. https://doi.org/10.1371/journal.pone.0256740

Suzor, N. P. (2019). Lawless: The secret rules that govern our digital lives. Cambridge University Press.

The White House. (2025). Restoring freedom of speech and ending federal censorship. https://www.whitehouse.gov/presidential-actions/2025/01/restoring-freedom-of-speech-and-ending-federal-censorship

Trump, D. J., & Schwartz, T. (1987). Trump: The art of the deal. Random House.

Uscinski, J. E., & Butler, R. W. (2013). The epistemology of fact checking. Critical Review, 25(2), 162–180. https://doi.org/10.1080/08913811.2013.843872

Vinhas, O., & Bastos, M. (2023). The WEIRD governance of fact-checking and the politics of content moderation. New Media & Society, Article 14614448231213942.

Wake, A., Ambrose, D., & Grenfell, D. (2024). Fact-checking and verification: The changing role of professional journalists. In A. Wake (Ed.), Transnational broadcasting in the Indo Pacific: The battle for trusted news and information (pp. 159–176). Springer International Publishing. https://doi.org/10.1007/978-3-031-47571-9

Walter, N., Cohen, J., Holbert, R. L., & Morag, Y. (2020). Fact-checking: A meta-analysis of what works and for whom. Political Communication, 37(3), 350–375. https://doi.org/10.1080/10584609.2019.1668894

Watt, N., Montaña-Niño, S., & Riedlinger, M. (2024). Examining constructive (de)polarisation: The case of Australian independent fact checkers. Selected Papers in Internet Research 2024: Research from the Annual Conference of the Association of Internet Researchers. https://spir.aoir.org/ojs/index.php/spir/article/view/14089/11953

Weatherbed, J. (2025, January 17). Google rejects EU fact-checking commitments for Search and YouTube. The Verge. https://www.theverge.com/2025/1/17/24345747/google-eu-dsa-fact-checks-disinformation-code-search-youtube

Yarrow, D. (2021). From fact-checking to value-checking: Normative reasoning in the new public sphere. The Political Quarterly, 92(4), 621–628. https://doi.org/10.1111/1467-923X.12999

Date accepted: March 2025

Published

17-04-2025

Section: Voices for the Networked Society