Systematic meta-analysis of research on AI tools to deal with misinformation on social media during natural and anthropogenic hazards and disasters

Vicari, R. & Komendantova, N. (ORCID: https://orcid.org/0000-0003-2568-6179) (2023). Systematic meta-analysis of research on AI tools to deal with misinformation on social media during natural and anthropogenic hazards and disasters. Humanities and Social Sciences Communications 10 (1). DOI: 10.1057/s41599-023-01838-0.

s41599-023-01838-0.pdf - Published Version
Available under License Creative Commons Attribution.

Project: sCience and human factOr for Resilient sociEty (CORE, H2020 101021746)

Abstract

The spread of misinformation on social media has led to the development of artificial intelligence (AI) tools to deal with this phenomenon. These tools are particularly needed when misinformation relates to natural or anthropogenic disasters such as the COVID-19 pandemic. The major research question of our work was as follows: what kind of gatekeepers (i.e., news moderators) do we wish social media algorithms and users to be when misinformation on hazards and disasters is being dealt with? To address this question, we carried out a meta-analysis of studies published in Scopus and Web of Science. We extracted 668 papers containing key terms related to the topic of “AI tools to deal with misinformation on social media during hazards and disasters.” The methodology included several steps. First, we selected 13 review papers to identify relevant variables and refine the scope of our meta-analysis. We then screened the remaining papers and identified 266 publications as significant for our research goals. For each eligible paper, we analyzed its objective, sponsor’s location, year of publication, research area, type of hazard, and related topics. As methods of analysis, we applied descriptive statistics, network representation of keyword co-occurrences, and flow representation of research rationale. Our results show that few studies come from the social sciences (5.8%) and humanities (3.5%), and that most of those papers are dedicated to the COVID-19 risk (92%). Most of the studies deal with the question of detecting misinformation (68%). Funding for the development of the topic is concentrated in a few countries. These results allow several inferences. The social sciences and humanities seem underrepresented for a topic that is strongly connected to human reasoning. A reflection on the optimum balance between algorithm recommendations and user choices seems to be missing. Research results on the pandemic could be exploited to enhance research advances on other risks.

Item Type: Article
Research Programs: Advancing Systems Analysis (ASA)
Advancing Systems Analysis (ASA) > Cooperation and Transformative Governance (CAT)
Depositing User: Luke Kirwan
Date Deposited: 19 Jun 2023 07:27
Last Modified: 19 Jun 2023 07:27
URI: https://pure.iiasa.ac.at/18854
