Dr. Shaun Duke, Professional Nerd

Editor. Writer. Professor. Host.

Thoughts on Wikis, Responsibility, and Cultural Shifts

I’m currently re-reading Robert E. Cummings’s introductory chapter from Wiki Writing: Collaborative Learning in the College Classroom, entitled “What Was a Wiki, and Why Do I Care? A Short and Usable History of Wikis.” This is one of the readings for my class on digital rhetoric, and it serves as a handy introduction to the invention of wikis, the reactions to them in the “ancient times,” and some of the key concerns about their impact on knowledge production. Basically, it’s some nerd shhhhh.

That said, it has got me thinking a lot about the role of wikis in our culture and, more importantly, just how much has changed since I was a kid. While there are still people running about saying you should never use wikis, for the most part, even academics have softened on them. A lot of you probably remember when that wasn’t the case. Hell, I remember when that wasn’t the case for me as a teacher. Mind you, I was never the type to outright fail a student for using Wikipedia, but I did strip away points.

Today, most of the teachers I know tell their students not to use Wikipedia as a primary source. But as a starting point for research? Have at it!

And as I prep for another introduction to wikis “conversation” for my digital rhetoric class, I’m thinking a lot about how digital technologies have seeped into our everyday lives while most of us didn’t notice it happening. Today, I’m deep in the digital rhetoric well, a consequence of social media activity and teaching a lot of composition. I might not have noticed way back when, but I am fully aware now even if I can’t figure out where the shift took place.

Cummings’s article attempts to pinpoint this shift. Folks who have studied or followed the history of wikis are familiar with the study in Nature on the accuracy of Wikipedia compared to Encyclopaedia Britannica. While the study only looked at scientific pages on both sites (obviously), it concluded that accuracy levels were fairly similar. At the time, Britannica revolted, which is understandable given that encyclopedia work was effectively their domain, and Wikipedia was offering the same type of material for free.

In addition to the Nature study and the backlash, countless articles were written about how Wikipedia was bad for us. Too many inaccuracies (eh, not quite true). Too much emphasis on low-effort learning (probably true). Jimmy Wales, the founder of Wikipedia, even remarked that university students complaining about receiving poor grades after using his website as a primary source didn’t have a leg to stand on. Some people misinterpreted this to mean “even Wales says his site is full of bad info,” but Cummings rightly notes that Wales was really pointing out that college students should be researching on a deeper level. And, well, I agree. Casual users are not likely looking for in-depth analysis; they’re on Wikipedia to find a quick fact (a date, info about an event, etc.). Students, however, are supposed to be conducting research and learning. And there’s only so much you can learn from the CliffsNotes (which, yes, are perfectly fine to use in a lot of contexts).

Basically, all of this hubbub amounted to…nothing. Because Wikipedia is clearly here to stay, and it has become such a staple of our everyday lives that it’s almost hard to believe that there was a time when nearly everyone within higher education was screaming NO WIKIS at the top of their lungs.

And I’m here thinking about how we got here, but more importantly, how so many technologies we use on a regular basis showed up, were attacked, and wormed their way in anyway. Cellphones, MP3 players, social media, self-driving cars (OK, so there’s still work to be done there), and so on and so forth. Technology, it seems, has this profound ability to seep and seep and seep, and those who push hard against it constantly end up looking like angry people screaming at clouds in fields of despair. I’m sure if you’re one of those “history of technology” people, the story is more complicated and goes back centuries. For me, I’m just having one of those weird “OMG, what is all this stuff around me and how did it get here” moments.

All that said, I don’t want to give Wikipedia a total out. While it has changed how we think about knowledge production and has irrevocably changed our society, I share the same concerns as many digital rhetoric scholars and critics about how we measure responsibility for these tools. To pull a quote from Cummings:

The Nature study showed Wikipedia as generally accurate or at least not substantially less accurate than online encyclopedias produced under the traditional print paradigm. True, if Seigenthaler’s false biography had been posted on Encyclopedia Britannica Online, while he might not have had legal recourse, there would have been a clear author and editor to hold accountable. Wikipedia could not provide this. Instead, Wikipedia relies on those invested in a knowledge community on a volunteer basis to provide edits, and the failure of that system is aptly noted in Seigenthaler’s case since one Wikipedian looked at the article after its first post and merely corrected a misspelling, leaving the false content in place. In essence, all that the Wikipedia model could offer Seigenthaler is the opportunity to join this knowledge community and continually monitor his own biography on Wikipedia. Hardly a workable solution.

Responsibility, in other words, is probably the biggest issue for sites like Wikipedia. Who is responsible for what is written on the site? Who can be held accountable? How do you have “responsibility” when knowledge can be produced by dozens of people, most of whom are likely contributing to the project honestly?

Wikipedia’s response has been to entrench the authority of knowledge production in the hands of a few. “Oversight” is one such entrenchment, designed to solve a problem but producing new problems of its own. Initially, the idea behind Wikipedia was to help democratize knowledge by using consensus to provide neutral accounts of things (this is largely my interpretation). But restricting power to the hands of a few means you run into a lot of the same issues that arise in other systems of knowledge. The biggest “obvious thing” here is the accurate criticism of gender bias among Wikipedia’s editorial ranks. Curiously enough, Wikipedia has a page about this, and the Wikimedia Foundation (i.e., the managers of Wikipedia) agrees with the criticism (neato).

Good luck, I guess. If the last decade of Internet activity has taught me anything, it’s that you should never underestimate a certain segment of our digital world when it comes to sabotaging good-natured efforts to make our society more equitable. But I’m a bit of a pessimist…

All of which is to say that I’m fascinated by how society has changed while remaining deeply skeptical of those changes. When I discuss wikis with my students, the potential issues are always part of that conversation. That’s part of what I’d call my pedagogic approach: I do not simply introduce the technologies and ask students to study them; I want us to consider how society has changed as technology envelops our everyday lives. For wikis, I am always thinking about how they change knowledge production, what impacts they have on how we understand information and truth, and what we can do to integrate them into our lives while mitigating the harm they might do to our culture.

And so here we go…another day teaching digital rhetoric!

With that in mind, I’ll leave you with a quote to ponder:

While no one wants to undergo an operation from a physician who has just referenced the procedure on Wikipedia, similarly we all want surgeons to share their knowledge from procedures among themselves. There are as many possibilities for knowledge creation on wikis as there are authors and audiences. The key lies in shared definitions of truth: it is very unlikely that a wiki created by disgruntled Wal-mart employees will produce the same types of knowledge claims as a wiki created for astronomers. But as long as there is an agreed-upon scope for any particular wiki, there is no reason not to apply this tool of networked consciousness to almost any endeavor.

