We Live in a Wikiworld; Should Our Students?

Spring 2010

By Brett Potash

In 2006, Stephen Colbert of The Colbert Report challenged his viewers to change the Wikipedia entry on African elephants to indicate that the population had tripled in the past six months. Any student doing a report on African elephants may well have noted the population increase in a paper due the next day. Herein lies the reason why so many educators prohibit the use of Wikipedia in their classrooms. Never mind that access to the African elephant page on Wikipedia was subsequently restricted, and that the information on the page quickly reverted to form. The point is that Wikipedia is susceptible to change. By anybody.

Or is it? This is the question that, as a teacher of history and English, I have struggled with over the past few years. Like many other educators, I had forbidden my students to use Wikipedia as a source in research papers. Then, two years ago, I began teaching a course on postmodernism. In one of the units, we study the nature of truth. Is it objective? Does knowledge exist independently, waiting for us to uncover it? Or, as the Wikipedia model suggests, does truth depend on consensus and subjective interpretation? 

This article attempts to discern if and how Wikipedia should be used in the classroom, against the backdrop of broader epistemological questions concerning the nature of truth and knowledge. Given that Wikipedia is perhaps the most easily accessible source for our students, how should we, as educators, present it to them? I say “present” because, as is obvious to many of us by now, we must take an active role in discussing Wikipedia with our students, lest it become the… well… the elephant in the room.

Verifiability

Wikipedia contains about ten million articles in 250 languages. At any given time, there are roughly 75,000 active contributors and seven million registered users. More experienced users may become one of the roughly 1,500 “administrators” — article editors selected by the site’s community of users to serve as watchdogs. They ensure that users hew to Wikipedia’s guidelines and policies; they also have the power to restrict access to certain articles (think African elephants) or even lock a page so that it can’t be edited. The latter may occur when a controversial article attracts edits involving biased or even racist language, both of which are against Wikipedia’s guidelines.

Given the expanse of Wikipedia’s domain, it is no surprise that when students type a topic into Google, Wikipedia is often a top result. Google’s algorithms rank pages in part by the number of links pointing to them. Basically, a link is like a vote, and Wikipedia gets a lot of votes. Because of Wikipedia’s dominance in the field, even Encarta, Microsoft’s online encyclopedia, folded in October 2009.
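The “link as a vote” idea can be made concrete with a small sketch. The snippet below is a toy version of link-based ranking run on an invented four-site graph; it is not Google’s actual algorithm (which weighs many other signals), and the site names and parameters are made up. It simply shows why a page that many other pages link to, as Wikipedia is, floats toward the top of such a ranking.

```python
# Toy link-based ranking (not Google's actual algorithm).
# Each page's score is built from the scores of the pages linking to it.
# The link graph below is entirely hypothetical.

links = {
    "wikipedia.org": ["blog-a.com", "news-b.com"],   # pages each site links to
    "blog-a.com":    ["wikipedia.org"],
    "news-b.com":    ["wikipedia.org", "blog-a.com"],
    "forum-c.com":   ["wikipedia.org"],
}

def toy_pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}        # start everyone equal
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)
            for target in outgoing:                     # each link passes a "vote"
                if target in new_rank:
                    new_rank[target] += share
        rank = new_rank
    return rank

for page, score in sorted(toy_pagerank(links).items(), key=lambda kv: -kv[1]):
    print(f"{score:.3f}  {page}")
```

Run on this tiny graph, the heavily linked “wikipedia.org” node ends up with the highest score, which is the intuition behind Wikipedia’s perennial spot near the top of search results.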

Popularity, however, does not ensure credibility. A study reported in Communications of the Association for Computing Machinery found that 11 percent of Wikipedia articles have been “vandalized” at some point in their life (Garfinkel). But truth in Wikiworld is not determined by consistency or observability. What creates truth for Wikipedia is verifiability: the information must simply exist in some other publication. The more reliable that other publication (peer-reviewed journals and books published by university presses rank highest), the more it is preferred by Wikipedians. In fact, on Wikipedia’s own page concerning verifiability, the first sentence reads, “The threshold for inclusion in Wikipedia is verifiability, not truth — that is, whether readers are able to check that material added to Wikipedia has already been published by a reliable source, not whether we think it is true.” This is consensus-based knowledge, with no claim to objective truth.

Reliability of Wikipedia

But how reliable is it? In 2007, Virgil Griffith, a computer science graduate student, created “WikiScanner,” a program capable of tracing the anonymous IP addresses of Wikipedia editors back to the owner of the computer network that made the change. Suddenly, connections between corporations and their Wikipages emerged. A user on a Wal-Mart computer altered information about employee compensation, and Diebold erased information revealing its voting machine irregularities. Exxon Mobil not only deleted information about the harmful effects of the Exxon Valdez oil spill, but also trumpeted the company’s environmental efforts since then (Hafner).

In Wikipedia’s defense, these pages quickly reverted to their original state. Moreover, Wikipedia makes no claim that its information will always be unbiased. As an open-source project, it acknowledges that “many articles start their lives as partisan, and after a long process of discussion, debate, or argument, they gradually take on a neutral point of view reached through consensus.” The idea that the hordes of contributors will eventually reach some sort of unbiased balance is a thoroughly democratic one. But can knowledge be democratized? Or is it, rather, an objective reality, impervious to our interpretations? A postmodernist would argue that all knowledge is subjective and that it is only from the discourse of local cultures and individuals that we can derive meaning. If this is “true,” then Wikipedia stands as the ultimate consortium of information.

In my course on postmodernism, I ask students to perform a Colbert-inspired activity as homework: change a Wikipedia entry of their choice and track the change. Presuming that the change is false (e.g., changing the date of Barbie’s debut from 1959 to 1859), it is typically changed back within a day or two, if not a few hours. I have been largely impressed by the Wikipedia community’s quick response to vandalism. Other students have made accurate edits that have lasted longer. The “Aurora Borealis” entry still displays relevant information written by one of my students.

More scientific studies have been conducted regarding Wikipedia’s reliability. In December 2005, a notable study reported in Nature compared Wikipedia to Encyclopaedia Britannica (Giles). Across the 42 articles reviewed, the study found an average of four errors or omissions per Wikipedia article and three per Britannica article (MSNBC). The press largely saw this as a vindication of Wikipedia, though three or four mistakes per article still seems unreliable to me. In a comprehensive review of the study, Eiffert concluded that Wikipedia is “generally well researched and substantiated by footnoting and linking to sources, allowing readers to judge the quality of information being used.” Moreover, Wikipedia entries often have more, and more current, information.

To further combat errors and vandalism, Wikipedia has recently taken the controversial step of requiring that some changes pass through editors before they are posted. If an anonymous user attempts to change a page about a living person, that edit is subject to approval by a Wikipedia editor. Though proposed by Jimmy Wales, founder of Wikipedia, and approved by 80 percent of users in an online poll, the decision (now in a trial period) has elicited a volley of complaints (“Wikipedia to launch…”). Purists claim that any oversight is an assault on Wikipedia’s core principles, though I would argue that, even with the editors, it is still a form of democracy. The decision was ironically preceded by a new Britannica policy under which registered users can now propose contributions to the site that, if approved by an editor, will be included on the page (Hutcheon). The two sites are not so entrenched in their objective and subjective corners as it might seem.

Further studies are finding Wikipedia to be surprisingly reliable. Library Journal, which claims to have “the toughest critics of reference materials, whatever their format,” conducted a 2006 review of Wikipedia articles, in which it concluded that “while there are still reasons to proceed with caution when using a resource that takes pride in limited professional management, many encouraging signs suggest that (for now) Wikipedia may be granted the librarian’s seal of approval” (Miller). The review focused on three areas typically acknowledged to be Wikipedia’s strong suits: current affairs, pop culture, and science. Other areas are more prone to unprofessional commentary or disorganized work.

As Wikipedia grows into young adulthood, its credibility among academics also seems to be growing. Though Middlebury College prohibits students from citing Wikipedia in history papers, a review of the ScienceDirect database reveals that citations of Wikipedia in academic journals have grown sharply every year since 2003, when the first citation appeared. By 2008, there were over 500 citations. In a remarkable development, the journal RNA Biology now requires that article submissions have a peer-reviewed Wikipedia entry before they are accepted (Timmer). Though RNA Biology is a young journal, its progressive mandate not only affirms Wikipedia’s arrival on the academic stage, it also attempts to harness the open-source encyclopedia as a purveyor of reliable information. If everyone is going to look at Wikipedia, the editors reason, they might as well get it right. The U.S. National Institutes of Health agrees. In July 2009, it held a conference with the explicit goal of “teaching health professionals how to edit Wikipedia’s health pages” (Grossman). Despite the theoretical objections to Wikipedia, its profile is growing.

A Different Kind of Knowledge

Why does it work so well? Although the notion of collective intelligence has been ridiculed by observers, including Colbert, current research suggests that it may, in fact, be a natural law. In adding the word “wikiality” to the English lexicon, Colbert was criticizing the idea that we can create a reality simply by agreeing on it. For Colbert, a reality that can be altered by the click of a mouse is not a democratic reality, but rather an unstable one. But new research on “swarm intelligence” is indicating that rather elaborate processes occur in groups that can result in sophisticated knowledge structures. 

A recent National Geographic article, for example, states that “[a]nts aren’t smart. Ant colonies are” (Miller). Colonies routinely carry out such complicated tasks as finding the shortest route to food, dividing labor, and defending themselves from neighbors. Though there is no hierarchical control in an ant colony, decentralized knowledge seems to spread quickly and effectively.
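To make the decentralized “shortest route” effect concrete, here is a minimal simulation loosely in the spirit of the classic two-bridge ant experiments. The path lengths, ant counts, and pheromone parameters are invented for illustration, not drawn from the National Geographic article: each simulated ant follows only local pheromone, yet the colony as a whole converges on the shorter route.

```python
# Minimal sketch of decentralized route-finding via pheromone feedback.
# No single ant compares the two routes; the colony still picks the short one.
import random

paths = {"short": 1.0, "long": 2.0}        # relative lengths of two routes to food
pheromone = {"short": 1.0, "long": 1.0}    # both routes start out equally attractive
EVAPORATION = 0.05                         # fraction of pheromone lost each round

def choose_path():
    """Each ant picks a route with probability proportional to its pheromone."""
    total = sum(pheromone.values())
    return "short" if random.uniform(0, total) < pheromone["short"] else "long"

for _ in range(200):                       # 200 rounds of foraging
    for _ in range(20):                    # 20 ants per round
        path = choose_path()
        pheromone[path] += 1.0 / paths[path]   # shorter trips reinforce their trail faster
    for p in pheromone:                    # evaporation keeps stale trails from dominating
        pheromone[p] *= (1 - EVAPORATION)

share = pheromone["short"] / sum(pheromone.values())
print(f"Share of pheromone on the short path after 200 rounds: {share:.0%}")
```

The positive feedback loop (more pheromone, more ants, more pheromone) is the same kind of bottom-up reinforcement that, the swarm-intelligence researchers argue, lets a crowd of editors converge on a stable article without anyone in charge.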

A prime example of collectivized knowledge formation occurred in April 2007, after the Virginia Tech shootings. Within three hours of the shooting, over 500 edits had been made to the newly created page. Within the next few weeks, over 2,000 contributors “created a polished, detailed article on the massacre, with more than 140 separate footnotes” (Cohen). Even The Roanoke Times acknowledged that Wikipedia “has emerged as the clearing house for detailed information on the event” (Cohen). The masses, essentially, were the most trusted source of information, though not a single person was in control.

As the horde flocks to Wikipedia, it is no wonder that some form of intelligence is developing there as well. Consider this description of swarm intelligence from Thomas Malone of MIT: “No single person knows everything that’s needed to deal with problems we face as a society… but collectively we know far more than we’ve been able to tap so far” (Miller). Now, compare that to a statement by Natalie Erin Martin, a frequent Wikipedia contributor: “There is no one person at the top saying this is what you need to do. It has all been out of a sense of personal responsibility” (Cohen). Perhaps Wikipedia, this grand experiment in collectivized knowledge, has some natural impetus behind it. The old quip that Wikipedia “only works in practice; in theory, it can never work” has become an erroneous cliché: theory, it seems, is catching up with practice.

Pedagogical Implications 

If teachers are opposed to Wikipedia in theory, should they be opposed to it in practice? From everything I can gather, the answer is no. In the quote above, Martin raises the issue of accountability that so many teachers worry about. Somehow, in Wikipedia, there is a shared community ethic of accountability. Why else tag an article with “This section is written like an advertisement” (Wikipedia: The Webb Schools) or “its neutrality is disputed” (Wikipedia: Communism)? In its constant editing and checking, Wikipedia seems to impose its own simple rules on the formation of knowledge, rules that have led to my students receiving comments like “Your contribution was not constructive,” or having their IP addresses blocked from making further edits. And this negative feedback was strangely reassuring to me, indicating that the click of a mouse is not enough to alter reality, even wikiality.

So, if the Wikipedia model is here to stay, and if our students will inevitably use it, it becomes our responsibility as teachers to instruct them in how to use it. As is true of any research, we must teach them to read critically, to explore links and footnotes, and to heed such warnings as “its factual accuracy is disputed.” The key is not to look for an infallible source. Even Britannica comes up short here. The key is to teach critical skills that can be applied to any source. Sifting through information in a digital age is the real challenge that comes with the opportunity of increased accessibility. After I ask my students to change a Wikipedia article, we discuss Wikipedia’s reliability and whether or not they would be comfortable using it in a research paper. What most students discover is that, in order to trust an article, they must examine its “History” page and “Talk” forum (both accessible from tabs at the top of the article). Every source, whether Wikipedia or not, has background information that must be checked. Magazines have publishing companies connected to commercial ventures. Authors and researchers often have vested (or funding) interests. All sources must be checked for bias and subjectivity. If we teach this lesson, we transcend the problems inherent in Wikipedia and teach the critical skills made necessary by the glut of available information.
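For teachers or students who want to see this background-checking in action beyond the “History” tab itself, the sketch below pulls an article’s recent edit history through Wikipedia’s public MediaWiki API, which exposes the same information the tab displays. The article title, the number of revisions requested, and the User-Agent string are arbitrary choices for illustration, not part of the classroom exercise described above.

```python
# Sketch: fetch an article's recent edit history via the public MediaWiki API.
import json
import urllib.parse
import urllib.request

def recent_revisions(title, limit=10):
    params = urllib.parse.urlencode({
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvlimit": limit,
        "rvprop": "timestamp|user|comment",   # who edited, when, and why
        "format": "json",
    })
    url = "https://en.wikipedia.org/w/api.php?" + params
    request = urllib.request.Request(
        url, headers={"User-Agent": "classroom-history-check/0.1"})
    with urllib.request.urlopen(request) as response:
        data = json.load(response)
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

for rev in recent_revisions("African elephant"):
    print(rev["timestamp"], rev.get("user", "?"), "-", rev.get("comment", ""))
```

Scanning the edit summaries and usernames this way makes the same point the “History” tab makes in class: an article is not a fixed fact but a visible record of contributions, reversions, and debate.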

While some of my students are surprised at how easy it is to edit, they are also surprised (and even a bit disappointed) at how quickly their contributions revert. And herein lies the crux of Wikipedia’s success. While mischievous edits provide a fleeting five minutes of fame, truly thoughtful ones provide much more. If we all want a little recognition, Wikipedia offers that opportunity to anyone. Is it any surprise, then, that editors take this forum seriously and guard it so carefully against vandals? Many of us, as educators, would be pleased if our students took ownership of a lesson the way Wikipedians take ownership of their contributions.

Wikipedia, it turns out, adheres remarkably well to the “best practices” we try to implement as educators: the work is meaningful, it has a practical purpose, and it utilizes different forms of media. My students get genuinely excited when they enter a conversation with a prominent editor, or make a lasting contribution to the James Joyce page that has been hyperlinked by other contributors. Suggestions for bringing Wikipedia into the classroom are given on the site itself. Students can, for example, act as editors for an article, correcting grammar and factual errors. Or they can practice research and citation skills by adding citations to articles, which also enhances the credibility of existing information. All of these are consistent with best practices, which “allow students to collaboratively work on meaningful tasks… construct their own knowledge… and learn through active involvement rather than sitting and listening and watching” (“Best Practices”).

The larger issue, though, is how we, as educators, perceive the nature of knowledge. Is it, as a scientist would argue, objective, existing in the world for us to uncover, impervious to interpretation? Or is it, as a relativist would argue, constructed by culture? The answer, I suspect, lies somewhere in between, and depends greatly on the nature of the knowledge being considered. While there is little doubt that Columbus reached the Americas in 1492, there is considerable debate over whether he “discovered” America at all. Although Wikipedia is clearly more constructivist than not, it keeps a foot in both camps, hoping to arrive at some sort of unbiased consensus based on the subjective contributions of thousands of people. The deeper one delves into Wikiworld, the more viable it seems.

Works Cited 

“Best Practices: Instructional Strategies and Techniques.” www.centralischool.ca/~bestpractice/coop/index.html. Retrieved Dec. 8, 2008.

Cohen, Noam. “Wikipedia Serves as Essential Internet News Source on the Virginia Tech Shootings.” International Herald Tribune, April 23, 2007. www.iht.com/articles/2007/04/23/america/wiki.php. Retrieved April 23, 2007.

Garfinkel, Simson. “Wikipedia and the Meaning of Truth.” Technology Review, November/December 2008. www.technologyreview.com/web/21558/page1. Retrieved Oct. 23, 2008.

Giles, Jim. “Internet Encyclopaedias Go Head to Head.” Nature 438 (2005): 900–901. www.nature.com/nature/journal/v438/n7070/full/438900a.html. Retrieved Dec. 9, 2008.

Grossman, Lisa. “Should You Trust Health Advice from the Web?” New Scientist, July 29, 2009. www.newscientist.com/article/mg20327185.500-should-you-trust-health-advice-from-the-web.html. Retrieved Aug. 9, 2009.

Hafner, Katie. “Corporate Editing of Wikipedia Revealed.” International Herald Tribune, Aug. 19, 2007. www.iht.com/articles/2007/08/19/business/wiki.php.

Hutcheon, Stephen. “Watch Out Wikipedia, Here Comes Britannica 2.0.” Sydney Morning Herald, Jan. 22, 2009. www.smh.com.au/news/technology/biztech/britannica-takes-on-wikipedia-and-google/2009/01/22/1232471469973.html. Retrieved Aug. 22, 2009.

Miller, Peter. “The Genius of Swarms.” National Geographic. July 2007. http://ngm.nationalgeographic.com/2007/07/swarms/miller-text. Retrieved Nov. 20, 2008.

MSNBC. “Science Journal Finds Wikipedia Pretty Accurate.” www.msnbc.msn.com/id/104782071. Retrieved Nov. 20, 2008.

Timmer, John. “Journal Requires Peer-Reviewed Wikipedia Entry to Publish.” Ars Technica, Dec. 19, 2008. http://arstechnica.com/old/content/2008/12/journal-requires-peer-reviewed-wikipedia-entry-to-publish.ars. Retrieved April 19, 2010.

“Wikipedia: Reliability of Wikipedia.” http://en.wikipedia.org/wiki/Wikipedia:Reliability_of_Wikipedia. Retrieved Nov. 25, 2008.

“Wikipedia to Launch Page Controls.” BBC News, August 25, 2009. http://news.bbc.co.uk/2/hi/technology/8220220.stm. Retrieved Sept. 2, 2009.

Brett Potash

Brett Potash has taught at the American College of Sofia in Bulgaria and The Webb Schools (California).