
Wikipedia Is Under Attack. Here's How It Can Survive.

The site’s volunteers face threats from Trump, billionaires, and AI.

“Most editors on Wikipedia are English-speaking men, and our coverage is of things that are of interest to English-speaking men,” said a retired market analyst in Cincinnati who has been editing for over 20 years. “Our sports coverage is second to none. Video games, we got it covered. Wars, the history of warfare, my god. Trains, radio stations... But our coverage of foods from other countries is very low, and there is an absolute systemic bias against coverage of women and people of color.” For her part, she tries to fill gaps around food, creating new articles whenever she encounters a Peruvian chili sauce or African fufu that lacks one.

Yet efforts like these have come under attack as “DEI” by conservative influencers and Musk, who called for Wikipedia to be defunded until “they restore balance.”

These accusations of bias, familiar from attacks on the media and social platforms, encounter some unique challenges when leveled against Wikipedia. Crucially, if you think something is wrong on Wikipedia, you can fix it yourself, though it will require making a case based on verifiability rather than ideological “balance.”

Over the years, Wikipedia has developed an immune response to outside grievances. When people on X start complaining about Wikipedia’s suppression of UFO sightings or refusal to change the name of the Gulf of Mexico to Gulf of America, an editor often restricts the page to people who are logged in and puts up a notice directing newcomers to read the latest debate. If anything important was missed, they are welcome to suggest it, the notice reads, provided their suggestion meets Wikipedia’s rules, which are explained on the linked pages. That is, Wikipedia’s first and best line of defense is to explain how Wikipedia works.

Occasionally, people stick around and learn to edit. More often, they get bored and leave.

It was not unusual for skirmishes to break out over the Wikipedia page for Asian News International, or ANI. It is the largest newswire service in India, and as its Wikipedia article explains, it has a history of promoting false anti-Muslim and pro-government propaganda. It was these facts that various anonymous editors — not logged into Wikipedia accounts, so appearing only as IP addresses — attempted to remove last spring.

As typically happens, an experienced editor quickly reinstated the deleted sentences, noting that they had been removed without explanation. Then came another drive-by edit: actually, ANI is not propaganda and very credible, someone wrote, citing a YouTube video. Reverted: YouTube commentary is not a reliable source. Then another IP address, deleting a sentence about ANI promoting a false viral story about necrophilia in Pakistan. Reverted again. Another IP address, deleting the mention of propaganda with the explanation that the sources were “leftist dogs and swine.”

As the edit battle escalated, an editor locked the page so that only people who were logged in and had made a certain number of edits could make changes, ending the barrage of IP addresses.

Two months later, ANI sued.

The lawsuit revealed that several of the IP addresses had belonged to representatives of ANI attempting to remove unflattering information about the company. Blocked from doing so, ANI sued for defamation under a recent amendment to India’s equivalent of Section 230 that places stricter requirements on platforms to moderate content. When the Wikimedia Foundation declined to reveal the identities of three editors who had defended the page, the presiding judge said he would ask the government to block the site, threatening to cut off the country with the highest number of English Wikipedia readers after the US and the UK. “If you don’t like India,” the judge said, “please don’t work in India.”