The rise in popularity of AI has led to an increased need for Wikipedia to be as accurate as possible, with academics well placed to be its most reliable editors, writes Dr Raffaele Ciriello.
A RECENT ARTICLE in Nature called on academics to edit Wikipedia for enhanced artificial intelligence (AI) reliability.
The case for academic involvement has been clear for 20 years. AI alone will not overcome the systemic barriers that have prevented it: misaligned incentives and cultural elitism.
Academia’s incentive structures prioritise prestigious journals, grants and sometimes teaching evaluations, not knowledge curation for the public good. Tenure committees and funders rarely (if ever) recognise Wikipedia contributions. Scholars already face heavy workloads and funding cuts. Without institutional support, editing Wikipedia remains a someday hobby.
Elitism makes this worse. Despite its vast influence, many academics still dismiss Wikipedia as “non-scholarly”. Being corrected by “non-experts” does not sit well with academia’s big egos, just as boasting about one’s own work does not resonate with Wikipedia editors.
Bridging this gap requires institutional training programs, partnerships with Wikimedia and policies that treat Wikipedia contributions as legitimate scholarly work. Most importantly, academia must rethink its values. The real question is not whether scholars should engage with Wikipedia, but why they still have not.
AI’s reliance on Wikipedia does not change these barriers. Instead, the threat of AI-driven job cuts in academia may finally push universities and funders to recognise Wikipedia as a critical public knowledge infrastructure in need of expert stewardship.
Dr Raffaele F Ciriello holds a BSc in Information Systems from the University of Stuttgart and an MSc and PhD from the University of Zurich (2017). He is a Senior Lecturer in Business Information Systems at the University of Sydney, specialising in compassionate digital innovation.

