Reliability of Wikipedia


The reliability of Wikipedia (primarily of the English-language edition), compared to other encyclopedias and more specialized sources, has been assessed in many ways: statistically, through comparative review, through analysis of historical patterns, and through examination of the strengths and weaknesses inherent in Wikipedia’s unique editing process.

Several studies have been done to assess the reliability of Wikipedia. A notable early study, published in the journal ‘Nature’ in 2005, found that Wikipedia’s scientific articles came close to the level of accuracy of ‘Encyclopædia Britannica’ and had a similar rate of ‘serious errors.’ The study was disputed by ‘Encyclopædia Britannica,’ and ‘Nature’ later answered this refutation with both a formal response and a point-by-point rebuttal of Britannica’s main objections.

Between 2008 and 2010, studies comparing Wikipedia articles in medical and scientific fields such as pathology, toxicology, oncology, and pharmaceuticals with professional and peer-reviewed sources found that Wikipedia’s depth and coverage were of a high standard. Concerns regarding readability were raised in a study published by the American Society of Clinical Oncology. However, omissions sometimes remained an issue, at times due to the removal of adverse product information by public relations editors.

Wikipedia is open to anonymous and collaborative editing, so assessments of its reliability usually include examinations of how quickly false or misleading information is removed. An early study conducted by IBM researchers in 2003—two years following Wikipedia’s establishment—found that ‘vandalism is usually repaired extremely quickly — so quickly that most users will never see its effects’ and concluded that Wikipedia had ‘surprisingly effective self-healing capabilities.’ A 2007 peer-reviewed study stated that ‘42% of damage is repaired almost immediately… Nonetheless, there are still hundreds of millions of damaged views.’ Several incidents have also been publicized in which false information has lasted for a long time in Wikipedia.

In May 2005, a user edited the biographical article on American journalist John Seigenthaler Sr. so that it contained several false and defamatory statements. The inaccurate information went unnoticed until September of that year. After it was removed from Wikipedia, it remained for another three weeks on sites that mirror Wikipedia content. A biographical article in the French Wikipedia portrayed Léon-Robert de L’Astran as an 18th-century anti-slavery ship owner, which led Ségolène Royal, a presidential candidate, to praise him. A student investigation later determined that the article was a hoax and that de L’Astran had never existed. Jimmy Wales, the de facto leader of Wikipedia, stresses that encyclopedias of any type are not usually appropriate as primary sources and should not be relied upon as authoritative.

The Wikipedia model allows anyone to edit, and relies on a large number of well-intentioned editors to overcome issues raised by a smaller number of problematic editors. It is inherent in Wikipedia’s editing model that misleading information can be added, but over time quality is anticipated to improve in a form of group learning as editors reach consensus, so that substandard edits will very rapidly be removed. This assumption is still being tested, and its limitations and reliability are not yet a settled matter. Wikipedia is a pioneer in communal knowledge building of this kind. It contrasts with many more traditional models of knowledge and publishing, which attempt to limit content creation to a relatively small group of approved editors in order to exercise strong hierarchical control. In order to improve reliability, some editors have called for ‘stable versions’ of articles, or articles that have been reviewed by the community and locked from further editing.

Wikipedia allows anonymous editing: contributors are not required to provide any identification, or even an email address. A 2007 study at Dartmouth of Wikipedia noted that, contrary to usual social expectations, anonymous editors were some of Wikipedia’s most productive contributors of valid content. The Dartmouth study was criticized by John Timmer of the Ars Technica website for its methodological shortcomings.

While Wikipedia has the potential for extremely rapid growth and harnesses an entire community—much in the same way as other communal projects such as Linux—it goes further in trusting the same community to self-regulate and become more proficient at quality control. Wikipedia has harnessed the work of millions of people to produce the world’s largest knowledge-based site along with software to support it, resulting in more than nineteen million articles written, across more than 280 different language versions, in less than twelve years. For this reason, there has been considerable interest in the project both academically and from diverse fields such as information technology, business, project management, knowledge acquisition, software programming, and sociology, to explore whether the Wikipedia model can produce quality results, what collaboration in this way can reveal about people, and whether the scale of involvement can overcome the obstacles of individual limitations and poor editorship which would otherwise arise.

Another reason for inquiry is a growing and widespread reliance on Wikipedia by both websites and individuals, who use it as a source of information, raising concerns over such a major source being susceptible to rapid change, including the whimsical introduction of misinformation. It is the responsibility of those who intend to use such a dynamically changing, multi-authored source to ascertain the quality and reliability of articles, and the degree of usefulness, misinformation or vandalism which might be expected, in order to decide what reliance to place upon them. A helpful safeguard is always to reference Wikipedia accurately when it is quoted to allow false or unreliable material to be identified and corrected.

The reliability of Wikipedia articles can be measured by the following criteria:

- Accuracy of the information provided within articles
- Appropriateness of the images provided with the article
- Appropriateness of the style and focus of the articles
- Susceptibility to, and exclusion and removal of, false information
- Comprehensiveness, scope and coverage, both within articles and across the range of articles
- Identification of reputable third-party sources as citations
- Stability of the articles
- Susceptibility to editorial and systemic bias
- Quality of writing

The first four of these have been the subjects of various studies of the project, while the presence of bias is strongly disputed on both sides, and the prevalence and quality of citations can be tested within Wikipedia.

In 2005, the British newspaper ‘The Guardian’ published a story titled ‘Can you trust Wikipedia?’ in which a panel of experts was asked to review seven entries related to their fields, scoring each article out of ten. Scores ranged from 0 to 8, but most articles received marks between 5 and 8. The most common criticisms were:

- Poor prose, or ease-of-reading issues
- Omissions or inaccuracies, often small but including key omissions in some articles
- Poor balance, with less important areas given more attention and vice versa

The most common praises were:

- Factual soundness and correctness, with no glaring inaccuracies
- Much useful information, including well-selected links, making it possible to ‘access much information quickly’

In December 2005, ‘Nature’ conducted a single-blind study comparing the accuracy of a sample of articles from Wikipedia and ‘Encyclopædia Britannica.’ The sample included 42 articles on scientific topics, including biographies of well-known scientists. The articles were compared for accuracy by academic reviewers who remained anonymous, a customary practice for journal article reviews. Based on their reviews, the average Wikipedia article contained four errors or omissions and the average ‘Britannica’ article three. Only four serious errors were found in Wikipedia, and four in ‘Encyclopædia Britannica.’ The study concluded: ‘Wikipedia comes close to ‘Britannica’ in terms of the accuracy of its science entries,’ although Wikipedia’s articles were often ‘poorly structured.’

‘Encyclopædia Britannica’ expressed concerns, leading ‘Nature’ to release further documentation of its survey method. Based on this additional information, ‘Encyclopædia Britannica’ denied the validity of the study, stating that it was ‘fatally flawed.’ Among Britannica’s criticisms were that excerpts rather than the full texts of some of their articles were used, that some of the extracts were compilations that included articles written for the youth version, that ‘Nature’ did not check the factual assertions of its reviewers, and that many points which the reviewers labeled as errors were differences of editorial opinion. ‘Nature’ acknowledged the compiled nature of some of the ‘Britannica’ extracts, but denied that this invalidated the conclusions of the study. ‘Encyclopædia Britannica’ also argued that while the ‘Nature’ study showed that the error rate between the two encyclopedias was similar, a breakdown of the errors indicated that the mistakes in Wikipedia were more often the inclusion of incorrect facts, while the mistakes in Britannica were ‘errors of omission,’ making ‘Britannica far more accurate than Wikipedia, according to the figures.’ ‘Nature’ has since rejected Britannica’s response, stating that any errors on the part of its reviewers were not biased in favor of either encyclopedia, that in some cases it used excerpts of articles from both encyclopedias, and that Britannica did not share its particular concerns with ‘Nature’ before publishing its ‘open letter’ rebuttal.

In 2006, Roy Rosenzweig, a professor specializing in American history, published a comparison of the Wikipedia biographies of 25 Americans with the corresponding biographies found in ‘Encarta’ and ‘American National Biography Online.’ He wrote that Wikipedia is ‘surprisingly accurate in reporting names, dates, and events in U.S. history,’ with some of the errors being ‘widely held but inaccurate beliefs.’ However, he stated that Wikipedia often fails to distinguish important from trivial details, and does not provide the best references. He also complained about Wikipedia’s lack of ‘persuasive analysis and interpretations, and clear and engaging prose.’ Wikipedia’s policies on original research, which bar unpublished synthesis of published data, disallow new analysis and interpretation not found in reliable sources.

In a 2004 interview with ‘The Guardian,’ self-described information specialist and Internet consultant Philip Bradley said that he would not use Wikipedia and was ‘not aware of a single librarian who would. The main problem is the lack of authority. With printed publications, the publishers have to ensure that their data are reliable, as their livelihood depends on it. But with something like this, all that goes out the window.’ A 2006 review of Wikipedia by ‘Library Journal,’ using a panel of librarians, ‘the toughest critics of reference materials, whatever their format,’ asked ‘long standing reviewers’ to evaluate three areas of Wikipedia (popular culture, current affairs, and science), and concluded: ‘While there are still reasons to proceed with caution when using a resource that takes pride in limited professional management, many encouraging signs suggest that (at least for now) Wikipedia may be granted the librarian’s seal of approval.’ A reviewer who ‘decided to explore controversial historical and current events, hoping to find glaring abuses’ concluded ‘I was pleased by Wikipedia’s objective presentation of controversial subjects’ but that ‘as with much information floating around in cyberspace, a healthy degree of skepticism and skill at winnowing fact from opinion are required.’ Other reviewers noted that there is ‘much variation’ but ‘good content abounds.’

In 2007, Michael Gorman, former president of the American Library Association (ALA), stated in an ‘Encyclopædia Britannica’ blog that ‘A professor who encourages the use of Wikipedia is the intellectual equivalent of a dietician who recommends a steady diet of Big Macs with everything.’ The library at Trent University in Ontario states of Wikipedia that many articles are ‘long and comprehensive,’ but that there is ‘a lot of room for misinformation and bias [and] a lot of variability in both the quality and depth of articles.’ It adds that Wikipedia has advantages and limitations, that it has ‘excellent coverage of technical topics’ and that articles are ‘often added quickly and, as a result, coverage of current events is quite good,’ a speed traditional sources cannot match. It concludes that, depending upon the need, one should think critically and assess the appropriateness of one’s sources, ‘whether you are looking for fact or opinion, how in-depth you want to be as you explore a topic, the importance of reliability and accuracy, and the importance of timely or recent information,’ and adds that Wikipedia can in any event be used as a ‘starting point.’

An article for the Canadian Library Association (CLA) discusses the Wikipedia approach, process and outcome in depth, commenting for example that in controversial topics, ‘what is most remarkable is that the two sides actually engaged each other and negotiated a version of the article that both can more or less live with.’ The author comments that: ‘In fact Wikipedia has more institutional structure than at first appears. Some 800 experienced users are designated as administrators [Update: As of 2009 some 1600 on English Wikipedia alone], with special powers of binding and loosing: they can protect and unprotect, delete and undelete and revert articles, and block and unblock users. They are expected to use their powers in a neutral way, forming and implementing the consensus of the community. The effect of their intervention shows in the discussion pages of most contentious articles. Wikipedia has survived this long because it is easier to reverse vandalism than it is to commit it…’

Academics have also criticized Wikipedia for its perceived failure as a reliable source, and because Wikipedia editors may not have degrees or other credentials. For these reasons, the use of Wikipedia is not accepted in many schools and universities when writing a formal paper; some educational institutions have banned it as a primary source, while others have limited its use to that of a pointer to external sources. This criticism, however, may apply not only to Wikipedia but to encyclopedias in general: some university lecturers are not impressed when students cite print-based encyclopedias in assigned work.

Science and medicine are areas where accuracy is of high importance and peer review is the norm. While some of Wikipedia’s content has passed a form of peer review, most has not. A 2008 study examined 80 Wikipedia drug entries. The researchers found few factual errors in this set of articles, but determined that these articles were often missing important information, such as contraindications and drug interactions. One of the researchers noted that ‘If people went and used this as a sole or authoritative source without contacting a health professional…those are the types of negative impacts that can occur.’ The researchers also compared Wikipedia with Medscape Drug Reference (MDR) by looking for answers to 80 different questions covering eight categories of drug information, including adverse drug events, dosages, and mechanism of action. They determined that MDR provided answers to 82.5 percent of the questions, while Wikipedia answered only 40 percent, and that Wikipedia’s answers were also less likely to be complete. None of the answers from Wikipedia was determined to be factually inaccurate, while four inaccurate answers were found in MDR. But the researchers found 48 errors of omission in the Wikipedia entries, compared to 14 for MDR. The lead investigator concluded: ‘I think that these errors of omission can be just as dangerous [as inaccuracies],’ and he pointed out that drug company representatives have been caught deleting information from Wikipedia entries that makes their drugs look unsafe.

A 2009 survey asked US toxicologists how accurately they rated the portrayal of health risks of chemicals in different media sources. It was based on the answers of 937 members of the Society of Toxicology and found that these experts regarded Wikipedia’s reliability in this area as far higher than that of all traditional news media: ‘In perhaps the most surprising finding in the entire study, all these national media outlets are easily eclipsed by two representatives of ‘new media’: WebMD and Wikipedia. WebMD is the only news source whose coverage of chemical risk is regarded as accurate by a majority (56 percent) of toxicologists, closely followed by Wikipedia’s 45 percent accuracy rating. By contrast, only 15 percent describe as accurate the portrayals of chemical risk found in the ‘New York Times,’ ‘Washington Post,’ and ‘Wall Street Journal.’’

In a 2004 piece called ‘The Faith-Based Encyclopedia,’ Robert McHenry, a former editor-in-chief of ‘Encyclopædia Britannica,’ stated that Wikipedia errs in billing itself as an encyclopedia, because that word implies a level of authority and accountability that he believes cannot be possessed by an openly editable reference. McHenry argued that ‘the typical user doesn’t know how conventional encyclopedias achieve reliability, only that they do.’ He added: ‘[H]owever closely a Wikipedia article may at some point in its life attain to reliability, it is forever open to the uninformed or semiliterate meddler… The user who visits Wikipedia to learn about some subject, to confirm some matter of fact, is rather in the position of a visitor to a public restroom. It may be obviously dirty, so that he knows to exercise great care, or it may seem fairly clean, so that he may be lulled into a false sense of security. What he certainly does not know is who has used the facilities before him.’

Similarly, Britannica’s executive editor, Ted Pappas, was quoted in ‘The Guardian’ as saying: ‘The premise of Wikipedia is that continuous improvement will lead to perfection. That premise is completely unproven.’ In the September 12, 2006 edition of ‘The Wall Street Journal,’ Jimmy Wales debated with Dale Hoiberg, editor-in-chief of ‘Encyclopædia Britannica.’ Hoiberg focused on a need for expertise and control in an encyclopedia and cited American historian Lewis Mumford; he argued that overwhelming information could ‘bring about a state of intellectual enervation and depletion hardly to be distinguished from massive ignorance.’ Wales emphasized Wikipedia’s differences, and asserted that openness and transparency lead to quality. Hoiberg claimed that he ‘had neither the time nor space to respond to [criticisms]’ and ‘could corral any number of links to articles alleging errors in Wikipedia,’ to which Wales responded: ‘No problem! Wikipedia to the rescue with a fine article,’ and included a link to the Wikipedia article ‘Criticism of Wikipedia.’

In one article, ‘Information Today’ (2006) likens comparisons between Wikipedia and Britannica to ‘Apples and Oranges’: ‘Even the revered ‘Encyclopædia Britannica’ is riddled with errors, not to mention the subtle yet pervasive biases of individual subjectivity and corporate correctness… There is no one perfect way. Britannica seems to claim that there is. Wikipedia acknowledges there’s no such thing. Librarians and information professionals have always known this. That’s why we always consult multiple sources and counsel our users to do the same.’

Jonathan Sidener of ‘The San Diego Union-Tribune’ wrote that ‘vandalism and self-serving misinformation [are] common particularly in the political articles.’ BBC technology specialist Bill Thompson wrote that ‘Most Wikipedia entries are written and submitted in good faith, and we should not let the contentious areas such as politics, religion or biography shape our view of the project as a whole,’ that it forms a good starting point for serious research but that: ‘No information source is guaranteed to be accurate, and we should not place complete faith in something which can so easily be undermined through malice or ignorance… That does not devalue the project entirely, it just means that we should be skeptical about Wikipedia entries as a primary source of information… It is the same with search engine results. Just because something comes up in the top 10 on MSN Search or Google does not automatically give it credibility or vouch for its accuracy or importance.’ Thompson adds the observation that since most popular online sources are inherently unreliable in this way, one byproduct of the information age is a wiser audience who are learning to check information rather than take it on faith due to its source, leading to ‘a better sense of how to evaluate information sources.’

Criticism and concerns have been expressed about other sources (such as newspapers) which silently use Wikipedia as a reference source. The danger is that if the original information in Wikipedia was false, the fact that it has been reported in other media means that there is now a reliable source to reference the false information in Wikipedia, giving it apparent respectability. This in turn increases the likelihood of the false information being reported in other media. A known example is the Sacha Baron Cohen article, where false information added in Wikipedia was apparently used by two newspapers, leading to it being treated as reliable in Wikipedia. This process of creating reliable sources for false facts has been termed ‘Citogenesis’ by webcomic artist Randall Munroe.

Fernanda Viégas of the MIT Media Lab and Martin Wattenberg and Kushal Dave of IBM Research studied the flow of editing in the Wikipedia model, with emphasis on breaks in flow (from vandalism or substantial rewrites), showing the dynamic flow of material over time. From a sample of vandalism edits on the English Wikipedia during May 2003, they found that most such acts were repaired within minutes, summarizing: ‘We’ve examined many pages on Wikipedia that treat controversial topics, and have discovered that most have, in fact, been vandalized at some point in their history. But we’ve also found that vandalism is usually repaired extremely quickly—so quickly that most users will never see its effects.’ They also stated that ‘it is essentially impossible to find a crisp definition of vandalism.’ A study in late 2007 systematically inserted inaccuracies into Wikipedia entries about the lives of philosophers. Depending on how exactly the data are interpreted, either one third or one half of the inaccuracies were corrected within 48 hours.

In 2007, ‘WikiScanner,’ a tool developed by Virgil Griffith of the California Institute of Technology, was released to match anonymous IP edits in the encyclopedia with an extensive database of addresses. News stories appeared about IP addresses from various organizations such as the Central Intelligence Agency, the Democratic Congressional Campaign Committee, Diebold, Inc., and the Australian government being used to make edits to Wikipedia articles, sometimes of an opinionated or questionable nature. The BBC quoted a Wikimedia spokesperson as praising the tool: ‘We really value transparency and the scanner really takes this to another level. Wikipedia Scanner may prevent an organization or individuals from editing articles that they’re really not supposed to.’ However, WikiScanner only reveals conflicts of interest when the editor does not have a Wikipedia account and their IP address is used instead. Conflict-of-interest editing done by editors with accounts is not detected, since those edits are anonymous to everyone – except for a handful of privileged Wikipedia admins.
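The kind of matching WikiScanner performed can be sketched in outline: organizations hold registered blocks of IP addresses, and the IP recorded for an anonymous edit can be tested for membership in each block. The following Python sketch illustrates only the general technique; the organization names and CIDR ranges are hypothetical placeholders, and this is not WikiScanner’s actual implementation or database.

```python
import ipaddress

# Hypothetical organization IP ranges (CIDR blocks) -- illustrative only,
# not WikiScanner's real address database.
ORG_RANGES = {
    "Example Corp": ipaddress.ip_network("203.0.113.0/24"),
    "Example University": ipaddress.ip_network("198.51.100.0/24"),
}

def attribute_edit(editor_ip: str) -> str:
    """Return the organization whose registered range contains the IP,
    or 'unknown' if no range matches."""
    ip = ipaddress.ip_address(editor_ip)
    for org, network in ORG_RANGES.items():
        if ip in network:  # membership test against the CIDR block
            return org
    return "unknown"

print(attribute_edit("203.0.113.42"))  # inside Example Corp's block
print(attribute_edit("192.0.2.1"))     # matches no registered block
```

As the paragraph above notes, such matching is only possible for edits made without an account, where the IP address is publicly recorded alongside the edit.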

The story was also covered by ‘The Independent,’ which stated that many ‘censorial interventions’ by editors with vested interests on a variety of articles in Wikipedia had been discovered: ‘[Wikipedia] was hailed as a breakthrough in the democratization of knowledge. But the online encyclopedia has since been hijacked by forces who decided that certain things were best left unknown… Now a website designed to monitor editorial changes made on Wikipedia has found thousands of self-serving edits and traced them to their original source. It has turned out to be hugely embarrassing for armies of political spin doctors and corporate revisionists who believed their censorial interventions had gone unnoticed.’ Not everyone hailed WikiScanner as a success for Wikipedia. Oliver Kamm, in a column for ‘The Times,’ argued instead that: ‘The WikiScanner is thus an important development in bringing down a pernicious influence on our intellectual life. Critics of the web decry the medium as the cult of the amateur. Wikipedia is worse than that; it is the province of the covert lobby. The most constructive course is to stand on the sidelines and jeer at its pretensions.’

Wikipedia has been accused of systemic bias, which is to say that its general nature leads, without necessarily any conscious intention, to the propagation of various prejudices. Although many newspaper articles have concentrated on minor, indeed trivial, factual errors in Wikipedia articles, there are also concerns about large-scale, presumably unintentional effects from the increasing influence and use of Wikipedia as a research tool at all levels. In an article in the ‘Times Higher Education’ magazine (London), philosopher Martin Cohen accuses Wikipedia of having ‘become a monopoly’ with ‘all the prejudices and ignorance of its creators,’ which he describes as a ‘youthful cab-drivers’ perspective. Cohen draws a grave conclusion from these circumstances: ‘To control the reference sources that people use is to control the way people comprehend the world. Wikipedia may have a benign, even trivial face, but underneath may lie a more sinister and subtle threat to freedom of thought.’ That freedom is undermined, he argues, by what matters on Wikipedia: ‘not your sources but the ‘support of the community.’’

Critics also point to the tendency to cover topics in a detail disproportionate to their importance. For example, Stephen Colbert once mockingly praised Wikipedia for having a ‘longer entry on ‘lightsabers’ than it does on the ‘printing press.’’ In an interview with ‘The Guardian,’ Dale Hoiberg noted: ‘People write of things they’re interested in, and so many subjects don’t get covered; and news events get covered in great detail. In the past, the entry on ‘Hurricane Frances’ was more than five times the length of that on ‘Chinese art,’ and the entry on ‘Coronation Street’ was twice as long as the article on ‘Tony Blair.’’ This critical approach has been satirized as ‘Wikigroaning,’ a term coined by Jon Hendren of the website ‘Something Awful.’ In the game, two articles (preferably with similar names) are compared: one about an acknowledged serious or classical subject and the other about a popular topic or current event. Defenders of broad inclusion criteria have held that the encyclopedia’s coverage of pop culture does not impose space constraints on the coverage of more serious subjects. As Ivor Tossell noted: ‘That Wikipedia is chock full of useless arcana (and did you know, by the way, that the article on ‘Debate’ is shorter than the piece that weighs the relative merits of the 1978 and 2003 versions of ‘Battlestar Galactica’?) isn’t a knock against it: Since it can grow infinitely, the silly articles aren’t depriving the serious ones of space.’

Wikipedia has been accused of deficiencies in comprehensiveness because of its voluntary nature, and of reflecting the systemic biases of its contributors. Former ‘Nupedia’ editor-in-chief Larry Sanger stated in 2004, ‘when it comes to relatively specialized topics (outside of the interests of most of the contributors), the project’s credibility is very uneven.’ ‘Nupedia’ was an expert-written online encyclopedia project founded by Jimmy Wales.

Wikipedia’s notability guidelines, and the application thereof, are the subject of much criticism. Nicholson Baker considers the notability standards arbitrary and essentially unsolvable: ‘There are quires, reams, bales of controversy over what constitutes notability in Wikipedia: nobody will ever sort it out.’ Criticizing the ‘deletionists,’ Baker then writes: ‘Still, a lot of good work—verifiable, informative, brain-leapingly strange—is being cast out of this paperless, infinitely expandable accordion folder by people who have a narrow, almost grade-schoolish notion of what sort of curiosity an on-line encyclopedia will be able to satisfy in the years to come. […] It’s harder to improve something that’s already written, or to write something altogether new, especially now that so many of the World Book–sanctioned encyclopedic fruits are long plucked. There are some people on Wikipedia now who are just bullies, who take pleasure in wrecking and mocking people’s work—even to the point of laughing at nonstandard ‘Engrish.’ They poke articles full of warnings and citation-needed notes and deletion prods till the topics go away.’

Yet another criticism of the deletionists is this: ‘The increasing difficulty of making a successful edit; the exclusion of casual users; slower growth – all are hallmarks of the deletionists approach.’ Complaining that his own biography was on the verge of deletion for lack of notability, Timothy Noah, a senior editor of ‘The New Republic,’ argued that: ‘Wikipedia’s notability policy resembles U.S. immigration policy before 9/11: stringent rules, spotty enforcement. To be notable, a Wikipedia topic must be ‘the subject of multiple, non-trivial published works from sources that are reliable and independent of the subject and of each other.’ Although I have written or been quoted in such works, I can’t say I’ve ever been the subject of any. And wouldn’t you know, some notability cop cruised past my bio and pulled me over. Unless I get notable in a hurry—win the Nobel Peace Prize? Prove I sired Anna Nicole Smith’s baby daughter?—a ‘sysop’ (volunteer techie) will wipe my Wikipedia page clean. It’s straight out of Philip K. Dick.’ In the same article, Noah mentions that the Pulitzer Prize-winning writer Stacy Schiff was not considered notable enough for a Wikipedia entry before she wrote an extensive ‘New Yorker’ article on Wikipedia itself.

Another criticism is that a politically liberal bias is predominant. According to Jimmy Wales: ‘The Wikipedia community is very diverse, from liberal to conservative to libertarian and beyond. If averages mattered, and due to the nature of the wiki software (no voting) they almost certainly don’t, I would say that the Wikipedia community is slightly more liberal than the U.S. population on average, because we are global and the international community of English speakers is slightly more liberal than the U.S. population. There are no data or surveys to back that.’ Andrew Schlafly created ‘Conservapedia’ because of his perception that Wikipedia contained a liberal bias. Conservapedia’s editors have compiled a list of alleged examples of liberal bias in Wikipedia. In 2007, an article in ‘The Christian Post’ criticized Wikipedia’s coverage of intelligent design, saying that it was biased and hypocritical. Lawrence Solomon of the ‘National Review’ stated that Wikipedia articles on subjects like ‘global warming,’ ‘intelligent design,’ and ‘Roe v. Wade’ are slanted in favor of liberal views.

In a 2010 issue of the conservative weekly ‘Human Events,’ Rowan Scarborough presented a critique of Wikipedia’s coverage of American politicians prominent in the approaching midterm elections as evidence of systemic liberal bias. Scarborough compares the biographical articles of liberal and conservative opponents in Senate races in the Alaska Republican primary and the Delaware and Nevada general election, emphasizing the quantity of negative coverage of tea party-endorsed candidates. He also cites some criticism by Lawrence Solomon and quotes in full the lead section of Wikipedia’s article on the conservative wiki Conservapedia as evidence of an underlying bias.

Tim Anderson, a senior lecturer in political economy at the University of Sydney, said that Wikipedia administrators display a U.S.-oriented bias in their interactions with editors and in their determination of which sources are appropriate for use on the site. Anderson was outraged after several of the sources he used in his edits to the article on Hugo Chávez, including ‘Venezuela Analysis’ and ‘Z Magazine,’ were disallowed as ‘unusable.’ Anderson also described Wikipedia’s neutral point of view policy to ‘ZDNet Australia’ as ‘a facade,’ saying that Wikipedia ‘hides behind a reliance on corporate media editorials.’

Justine Cassell, a professor and the director of the Human-Computer Interaction Institute at Carnegie Mellon University, has criticized Wikipedia for lacking not only women contributors but also extensive and in-depth encyclopedic attention to many topics regarding gender. An article in ‘The New York Times’ cites a Wikimedia Foundation study which found that fewer than 13% of contributors to Wikipedia are women. Sue Gardner, the executive director of the foundation, said increasing diversity was about making the encyclopedia ‘as good as it could be.’ Factors the article cited as possibly discouraging women from editing included the ‘obsessive fact-loving realm,’ associations with the ‘hard-driving hacker crowd,’ and the necessity to be ‘open to very difficult, high-conflict people, even misogynists.’

Although Wikipedia states that it is not a primary source, it has been used as evidence in legal cases. In 2007, ‘The New York Times’ reported that U.S. courts vary in their treatment of Wikipedia as a source of information, with over 100 judicial rulings having relied on the encyclopedia, including those involving taxes, narcotics, and civil issues such as personal injury and matrimonial matters. In 2012, ‘The Wall Street Journal’ reported that in the five years since that story, federal courts of appeals had cited Wikipedia about 95 times. The story also reported that the U.S. Court of Appeals for the Fourth Circuit tossed out convictions in a cockfighting case because a juror used Wikipedia to research an element of the crime, expressing in its decision concerns about Wikipedia’s reliability. Several commentators have drawn a middle ground, asserting that the project contains much valuable knowledge and has some reliability, even if the degree is not yet assessed with certainty. Many of the librarian and academic reviewers of Wikipedia cited above take such a view.

While experienced editors can view an article’s history and discussion page, it is not so easy for ordinary readers to check whether information from Wikipedia is reliable. University projects from California, Switzerland, and Germany try to improve this through methods of formal analysis and data mining. ‘Wiki-Watch’ from Germany, which was inspired by ‘WikiBu’ from Switzerland, shows a rating of up to five stars for every English or German article in Wikipedia. Part of this rating comes from the Californian tool ‘WikiTrust,’ which indicates the trustworthiness of individual passages of Wikipedia articles with white (trustworthy) or orange (not trustworthy) markings.

Inaccurate information may persist in Wikipedia for a long time before it is challenged. The most prominent cases reported by mainstream media involved biographies of living persons. The Seigenthaler incident demonstrated that the subject of a biographical article must sometimes fix blatant lies about their own life. Wikipedia content is often mirrored at other sites, which means that incorrect information can be replicated alongside correct information across a number of web sources. Such information can develop a misleading air of authority because of its presence at such sites: ‘Then [Seigenthaler’s] son discovered that his father’s hoax biography also appeared on two other sites that took direct feeds from Wikipedia. It was out there for four months before Seigenthaler realized and got the Wikipedia entry replaced with a more reliable account. The lies remained for another three weeks on the mirror sites downstream.’

Internet activist Seth Finkelstein reported in an article in ‘The Guardian’ on his efforts to have his own biography page removed from Wikipedia after it was subjected to defamation: ‘Wikipedia has a short biography of me, originally added in February 2004, mostly concerned with my internet civil liberties achievements. After discovering in May 2006 that it had been vandalized in March, possibly by a long-time opponent, and that the attack had been subsequently propagated to many other sites which (legally) repackage Wikipedia’s content, the article’s existence seemed to me overall to be harmful rather than helpful. For people who are not very prominent, Wikipedia biographies can be an ‘attractive nuisance.’ It says, to every troll, vandal, and score-settler: ‘Here’s an article about a person where you can, with no accountability whatsoever, write any libel, defamation, or smear. It won’t be a marginal comment with the social status of an inconsequential rant, but rather will be made prominent about the person, and reputation-laundered with the institutional status of an encyclopedia.’’ In the same article Finkelstein recounts how he argued that his own biography was ‘not notable enough’ in order to have it removed from Wikipedia. He goes on to recount a similar story involving Angela Beesley, previously a prominent member of the foundation which runs Wikipedia.

In 2005, the biography of Jens Stoltenberg, the Norwegian Prime Minister, was edited to contain libelous statements, including accusations of pedophilia and prison time. Taner Akçam, a Turkish history professor at the University of Minnesota, was detained at the Montreal airport in 2007 after his Wikipedia article was vandalized by Turkish nationalists. Although that mistake was resolved, he was arrested again in the United States on the same suspicion two days later. Also in 2007, it was reported that Hillary Rodham Clinton had been incorrectly listed for 20 months in her Wikipedia biography as valedictorian of her class of 1969 at Wellesley College. (Hillary Rodham was not the valedictorian, though she did speak at commencement.) The report included a link to the 2005 Wikipedia edit. After the report, the inaccurate information was removed the same day. Between the two edits, the wrong information had stayed in the Clinton article while it was edited more than 4,800 times over 20 months.

Attempts to perpetrate hoaxes may not be confined to editing Wikipedia articles. In 2005, Alan McIlwraith, a former call center worker from Scotland, created a Wikipedia article in which he claimed to be a highly decorated war hero. The article was quickly identified by other users as unreliable. However, McIlwraith had also succeeded in convincing a number of charities and media organizations that he was who he claimed to be: ‘The 28-year-old, who calls himself Captain Sir Alan McIlwraith, KBE, DSO, MC, has mixed with celebrities for at least one fundraising event. But last night, an Army spokesman said: ‘I can confirm he is a fraud. He has never been an officer, soldier or Army cadet.’’

There have also been instances of users deliberately inserting false information into Wikipedia in order to test the system and demonstrate its alleged unreliability. For example, Gene Weingarten, a journalist, ran such a test in 2007 by anonymously inserting false information into his own biography. The fabrications were removed 27 hours later by a Wikipedia editor who was regularly watching changes to that article. Television personality Stephen Colbert lampooned this drawback of Wikipedia, calling it ‘wikiality.’ ‘Death by Wikipedia’ is a phenomenon in which a person is erroneously proclaimed dead through vandalism. Articles about the comedian Paul Reiser, British television host Vernon Kay, and the West Virginia Senator Robert Byrd, who died in 2010, have been vandalized in this way.

Wikipedia considers vandalism as ‘any addition, removal, or change of content in a deliberate attempt to compromise the integrity of Wikipedia.’ The Wikipedia page ‘Researching with Wikipedia’ states: ‘Wikipedia’s radical openness means that any given article may be, at any given moment, in a bad state: for example, it could be in the middle of a large edit or it could have been recently vandalized. While blatant vandalism is usually easily spotted and rapidly corrected, Wikipedia is certainly more subject to subtle vandalism than a typical reference work.’

While Wikipedia policy requires articles to have a neutral point of view, it is not immune from attempts by outsiders (or insiders) with an agenda to place a spin on articles. In 2006 it was revealed that several staffers of members of the U.S. House of Representatives had embarked on a campaign to cleanse their respective bosses’ biographies on Wikipedia and to insert negative remarks about political opponents. References to a campaign promise by Martin Meehan to surrender his seat in 2000 were deleted, and negative comments were inserted into the articles on U.S. Senator Bill Frist and Eric Cantor, a congressman from Virginia. Numerous other changes were made from an IP address assigned to the House of Representatives. In 2008, ‘The New York Times’ ran an article detailing the edits made to the biography of Sarah Palin in the wake of her nomination as running mate of John McCain. During the 24 hours before the McCain campaign announcement, 30 edits, many of them adding flattering details, were made to the article by the Wikipedia single-purpose account Young Trigg. This person later acknowledged working on the McCain campaign and having several Wikipedia user accounts.

Larry Delay and Pablo Bachelet write that from their perspective, some articles dealing with Latin American history and groups (such as the Sandinistas and Cuba) lack political neutrality and are written from a sympathetic Marxist perspective which treats socialist dictatorships favorably at the expense of alternate positions.

In 2009, Wikipedia banned the Church of Scientology from editing any articles on its site. The ban came after members of the group edited the Wikipedia articles concerning Scientology to improve its portrayal.

In 2010, Al Jazeera’s Teymoor Nabili suggested that the article ‘Cyrus Cylinder’ (an ancient clay cylinder) had been edited for political purposes by ‘an apparent tussle of opinions in the shadowy world of hard drives and ‘independent’ editors that comprise the Wikipedia industry.’ He suggested that after the 2009 Iranian presidential election and the ensuing ‘anti-Iranian activities,’ a ‘strenuous attempt to portray the cylinder as nothing more than the propaganda tool of an aggressive invader’ was visible. According to his analysis of the edits during 2009 and 2010, they represented ‘a complete dismissal of the suggestion that the cylinder, or Cyrus’ actions, represent concern for human rights or any kind of enlightened intent,’ in stark contrast to Cyrus’ reputation as documented in the Old Testament and among the people of Babylon.

In 2008, the Boston-based Committee for Accuracy in Middle East Reporting in America (CAMERA) organized an e-mail campaign to encourage readers to correct perceived Israel-related biases and inconsistencies in Wikipedia. Excerpts of some of the e-mails were published in ‘Harper’s Magazine’ under the title of ‘Candid camera.’ CAMERA argued the excerpts were unrepresentative and that it had explicitly campaigned merely ‘toward encouraging people to learn about and edit the online encyclopedia for accuracy.’ According to some defenders of CAMERA, serious misrepresentations of CAMERA’s role emanated from the competing ‘Electronic Intifada’ group; moreover, it is said, some other Palestinian advocacy groups have been guilty of systematic misrepresentations and manipulative behaviors but have not suffered bans of editors amongst their staff or volunteers.

Five editors involved in the campaign were sanctioned by Wikipedia administrators. Israeli diplomat David Saranga said that Wikipedia is generally fair in regard to Israel. When confronted with the fact that the entry on Israel mentioned the word ‘occupation’ nine times, whereas the entry on the Palestinian People mentioned ‘terror’ only once, he replied: ‘It means only one thing: Israelis should be more active on Wikipedia. Instead of blaming it, they should go on the site much more, and try and change it.’ Political commentator Haviv Rettig Gur, reviewing widespread perceptions in Israel of systemic bias in Wikipedia articles, has argued that there are deeper structural problems creating this bias: anonymous editing favors biased results, especially if those Gur calls ‘pro-Palestinian activists’ organize concerted campaigns as has been done in articles dealing with Arab-Israeli issues, and current Wikipedia policies, while well-meant, have proven ineffective in handling this.

In 2010, it was reported that the Yesha Council, together with ‘Israel Sheli’ (‘My Israel’), a network of online pro-Israel activists committed to spreading Zionism online, was organizing a workshop in Jerusalem to teach people how to edit Wikipedia articles in a pro-Israeli way. Around 50 people took part in the course. The project organizer, Ayelet Shaked, was interviewed on ‘Arutz Sheva Radio.’ She emphasized that the information has to be reliable and meet Wikipedia rules. She cited some examples, such as the use of the term ‘occupation’ in Wikipedia entries, as well as the editing of entries that link Israel with Judea and Samaria and with Jewish history. ‘We don’t want to change Wikipedia or turn it into a propaganda arm,’ commented Naftali Bennett, director of the Yesha Council. ‘We just want to show the other side. People think that Israelis are mean, evil people who only want to hurt Arabs all day.’ ‘The idea is not to make Wikipedia rightist but for it to include our point of view,’ he said in another interview. Following the course announcement, Abdul Nasser An-Najar, the head of the ‘Palestinian Journalists Syndicate,’ said there were plans to set up a counter group to ensure that the Palestinian view is presented online, as the ‘next regional war will be [a] media war.’ In 2011, Jimmy Wales stated in retrospect about the course organized by Israel Sheli, ‘we saw absolutely no impact from that effort whatsoever. I don’t think it ever – it was in the press but we never saw any impact.’

In 2012, members of the public relations industry created the Corporate Representatives for Ethical Wikipedia Engagement (CREWE) Facebook group with the stated goal of maintaining accurate articles about corporations. In 2007, Australian programmer Rick Jelliffe claimed that Microsoft had offered him compensation in exchange for his future editorial services on OOXML (a Microsoft file format). A Microsoft spokesperson, quoted by CBS, commented that ‘Microsoft and the writer, Rick Jelliffe, had not determined a price and no money had changed hands, but they had agreed that the company would not be allowed to review his writing before submission.’ CBS also quoted Jimmy Wales as having expressed his disapproval of Microsoft’s involvement: ‘We were very disappointed to hear that Microsoft was taking that approach.’

In a story covered by ‘InformationWeek,’ Eric Goldman, an assistant law professor at Santa Clara University in California, argued that ‘eventually, marketers will build scripts to edit Wikipedia pages to insert links and conduct automated attacks on Wikipedia,’ thus putting the encyclopedia beyond the ability of its editors to provide countermeasures against the attackers, particularly because of a vicious circle in which the strain of responding to these attacks drives core contributors away, increasing the strain on those who remain. However, Wikipedia operates bots to aid in the detection and removal of vandalism, and uses nofollow and a CAPTCHA to discourage and filter additions of external links.

In 2008, British technology news and opinion website ‘The Register’ stated that a prominent administrator of Wikipedia had edited a topic area where he had a conflict of interest to keep criticism to a bare minimum, as well as altering the Wikipedia policies regarding personal biography and conflict of interest to favor his editing. Some of the most scathing criticism of Wikipedia’s claimed neutrality came in ‘The Register,’ which in turn was allegedly criticized by founding members of the project. According to ‘The Register’: ‘In short, Wikipedia is a cult. Or at least, the inner circle is a cult. We aren’t the first to make this observation. On the inside, they reinforce each other’s beliefs. And if anyone on the outside questions those beliefs, they circle the wagons. They deny the facts. They attack the attacker. After our Jossi Fresco story, Fresco didn’t refute our reporting. He simply accused us of ‘yellow journalism.’ After our article, Wales called us ‘trash.”

Articles of particular interest to an editor or group of editors are sometimes ‘commandeered’ and ‘sanitized’ to continually reflect a point of view that sheds a favorable light on the subject or group. Editors essentially ‘squat’ on pages, watching for negative entries and immediately reverting them. This is especially true of pages on politicians, as shown by the U.S. Congressional staff edits to Wikipedia. The page on Scientology has also been subject to vandalism and has been protected under Wikipedia’s protection policy.

The 2005 ‘Nature’ study also gave two brief examples of challenges that Wikipedian science writers purportedly faced on Wikipedia. The first concerned the addition of a section on violence to the schizophrenia article, which, in the view of one of the article’s regular editors, neuropsychologist Vaughan Bell, was little more than a ‘rant’ about the need to lock people up; editing it, he said, prompted him to look up the literature on the topic. The second dispute reported by ‘Nature’ involved the climatologist William Connolley and protracted disputes between editors of climate change topics, in which Connolley was placed on parole and several opponents were banned from editing climate-related articles for six months; a separate paper commented that this was more about etiquette than bias and that Connolley did ‘not suffer fools gladly.’

