Why does Wikipedia (mostly) work?
Why We Wrote This
As tech giants like Google and Facebook battle misinformation, one online platform has managed to remain above the fray. What is Wikipedia doing right that Silicon Valley is getting wrong?
Visit the Wikipedia entry for Apollo 11, and you’ll read about how NASA landed two men on the moon. Visit the Wikipedia entry for the Rothschild family, and you’ll see nothing about shapeshifting lizard people. Go to the entry on the Earth, and you’ll learn that the planet is basically spherical.
No political rants. No conspiracy theories presented as fact. It’s almost like not being on the internet.
As the information age has shaped up to be the misinformation age, with multi-billion-dollar Silicon Valley giants struggling to deal with hatemongers, propagandists, and all manner of crackpots, Wikipedia, the world’s fifth-most-visited website, has seen its credibility grow. By hewing close to a set of core principles, the free, collaborative encyclopedia recalls the optimism of the web’s early days, before the like buttons, clickbait headlines, and political bots began to strain the relationship between technology and the truth.
“Wikipedia now is one of the lone survivors of that original Web 1.0 type of world,” says Joseph Reagle, an associate professor at Northeastern University and author of “Good Faith Collaboration: The Culture of Wikipedia.”
To be sure, it’s not hard to find an error on Wikipedia, especially if you include errors of omission. Groups marginalized in real life, such as women and ethnic minorities, are likely to be marginalized on Wikipedia. But those same groups are less likely to face the kind of overt vilification seen on Facebook, YouTube, or Reddit.
To find on Wikipedia the kind of rancor common to those sites, one has to peel back the encyclopedia’s outer layer – the neutral and authoritative entries on nearly every subject – and visit each entry’s “talk” page. There, you can see the back-and-forth that goes into each article.
Sometimes these exchanges can get contentious, on topics ranging from the monumental to the mundane. In the event of an “edit war,” where dueling editors continually overwrite each other’s work, the site’s volunteer administrators can step in and restrict how a page is updated and by whom.
That’s what happened, beginning in 2005, to the entry for “Hummus,” which became a proxy battleground for the Arab-Israeli conflict, with each side claiming to have invented the mashed garbanzo bean dip. Admins intervened, and today, any would-be “Hummus” contributor must have made at least 500 edits to other Wikipedia entries and have had an account for at least 30 days. As of this writing, the dish is described as “Levantine” in origin.
But even on the talk pages, a sense of shared purpose generally prevails, with adversaries on all sides agreeing, for the most part, that the goal is to create a high-quality encyclopedia entry.
“Indeed the purpose of Wikipedia, to create a free encyclopedia that anyone can edit,” says Professor Reagle, “that singularity of focus is the thing that holds Wikipedia in good stead in light of all of the propaganda and misinformation that we see now.”
According to Wikipedia’s entry for itself, the site was founded by tech entrepreneur Jimmy Wales and philosopher Larry Sanger in 2001 as a way of quickly expanding the number of entries on their Nupedia project, a more traditional online encyclopedia written by experts. Wikipedia’s initial aim was to turn a profit, but it transitioned to a nonprofit model in 2003. Unless you count the year-end fundraising appeals, Wikipedia runs no ads.
“If you are in the business of making money out of people’s attention, fake news would be so helpful for your business,” says Mostafa Mesgari, an assistant professor of management information systems at Elon University in Elon, N.C. “Wikipedia is very different from social media.”
Today, Wikipedia – a collection of about 300 encyclopedias in various languages – is maintained by a community of about 200,000 volunteers and operated by the nonprofit Wikimedia Foundation, which also runs Wiktionary, Wikiquote, Wikisource, Wikimedia Commons, and other sister sites that offer free content produced and curated by volunteers.
Early on, the site was derided for its open-access approach. A 2006 piece in The New Yorker described it as “a system that does not favor the Ph.D. over the well-read fifteen-year-old.” In 2007, a New Jersey middle-school librarian made headlines around the country when she posted signs that read “Just say no to Wikipedia.” That same year, then-Sen. Ted Stevens (R) of Alaska proposed banning the site in public schools.
“Wikipedia inevitably will be overtaken by the gamers and the marketers to the point where it will lose all credibility,” wrote law professor Eric Goldman in a 2005 blog post titled “Wikipedia Will Fail Within 5 Years.”
Instead, it seems the opposite happened. Studies on Wikipedia’s reliability began comparing the site favorably with the centuries-old Encyclopædia Britannica. More and more professors began suggesting to their students that the site can be a good starting point for further research. As Professor Mesgari and his colleagues noted in 2014, the preponderance of studies comparing Wikipedia with professionally produced encyclopedic information finds that Wikipedia is a “generally reliable source of information.”
“I don’t think any of their predictions and ruminations came true about Wikipedia,” says Reagle of the site’s critics. “But I think some of the concerns did apply more widely.”
Wikipedia credits its resiliency against misinformation to its three core content policies: a neutral point of view, verifiability, and no original research. “Anybody using the encyclopedia can check that information comes from a reliable source,” says Kui Kinyanjui, a spokeswoman for the Wikimedia Foundation. “We don’t publish original research. What we cover is what is out there and what has been reliably cited by knowledgeable sources.”
This epistemic conservatism often leads Wikipedia to reproduce the biases that exist in the larger culture. For instance, up until she won the Nobel Prize in Physics on Tuesday, University of Waterloo physicist Donna Strickland lacked her own Wikipedia entry, an oversight that Ms. Kinyanjui chalks up to an absence of media coverage about her.
“We consider ourselves a mirror of the world,” she says. “And that means that we are a mirror of some of the shortfalls of the world.”
Sometimes the demographics of Wikipedia’s community – which is overwhelmingly male and more than a little nerdy – can distort that mirror. For instance, just 17 percent of the biographies in the English-language Wikipedia are of women. And the entry for the Xhosa language, which is spoken by nearly 20 million people, is shorter than the entry for the fictional Klingon language. “It’s not a completely rosy picture when it comes to the content we have,” says Kinyanjui. “But it is a work in progress.”
Despite its gaps, Wikipedia remains a far better launching point for research than most other sites hosting user-generated content. (If you doubt this, just imagine a high-school student researching, say, chemtrails, and whether she would be better served by starting off with Wikipedia or with YouTube.)
YouTube recognized this particular shortcoming earlier this year, and now it posts links to Wikipedia or Encyclopædia Britannica at the top of search results for particularly charged search terms, such as “Holocaust” or “climate change.” The move marks something of a concession by Google, YouTube’s parent company – an acknowledgment that even a company worth nearly a trillion dollars can’t outperform a nonprofit that relies on volunteers committed to getting the facts right.
“It’s the best sum of all human knowledge,” says Mesgari.