
Wikipedia vs. the Machines

How the World’s Encyclopedia Is Policing AI

Wikipedia is one of the internet’s most human projects. Now it’s figuring out how to coexist with the least human technology yet: generative AI.

The world’s largest open-access encyclopedia is facing a wave of AI-generated content—some accurate, much of it promotional or misleading—while exploring ways the technology could help its volunteer editors work faster and more effectively.

A 2024 Princeton study found that over 5% of new English Wikipedia pages contained significant AI-written text. In response, the community launched WikiProject AI Cleanup and updated speedy deletion policies to target what editors call “AI slop.”

At the same time, the Wikimedia Foundation is testing AI in controlled contexts. Tools like the Objective Revision Evaluation Service (ORES) already score edit quality, and newer pilots aim to assist with translation, citation checks, and factual updates. In April 2025, Wikimedia reiterated that humans—not algorithms—will remain in charge of editorial decisions.
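For readers curious what machine-assisted edit scoring looks like in practice, here is a minimal sketch that queries the classic public ORES v3 REST endpoint for one revision's "damaging" score. The revision ID is hypothetical, and the endpoint and response shape may have changed, since Wikimedia has been migrating model hosting from ORES to its newer Lift Wing service.

```python
# Minimal sketch: ask an ORES-style scoring service how likely one edit is
# to be damaging. URL and response layout follow the classic ORES v3 API;
# both are assumptions and may differ on current Wikimedia infrastructure.
import json
import urllib.request

REV_ID = 123456789  # hypothetical revision ID
URL = (
    "https://ores.wikimedia.org/v3/scores/enwiki/"
    f"?models=damaging&revids={REV_ID}"
)

with urllib.request.urlopen(URL, timeout=10) as resp:
    data = json.load(resp)

# Drill into the model's output for this revision.
score = data["enwiki"]["scores"][str(REV_ID)]["damaging"]["score"]
print("predicted damaging:", score["prediction"])
print("probability damaging:", score["probability"]["true"])
```

Scores like these do not delete or approve anything on their own; they surface suspect edits for human patrollers, which is consistent with the Foundation's position that editorial decisions stay with people.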

But the AI shift isn’t only about content quality. Crawlers harvesting training data for large language models scrape Wikipedia at scale, driving a reported 50% spike in bandwidth usage and raising sustainability concerns. Recruiting younger editors, especially Gen Z, has also become a priority for the site’s resilience.

Wikipedia’s next chapter may be defined by this hybrid approach: AI as a back-end assistant, paired with a community-led “immune system” to preserve accuracy and trust in an automated information landscape.
