Intervening in Wikipedia
Feminist inquiries and opportunities
Monika Sengul-Jones
Introduction
Wikipedia, the ubiquitous open-access resource and open-source collaborative online encyclopedia project, presents opportunities and challenges for feminist media practitioners in a digital age. The promise of Wikipedia is rooted in the aspirational rhetoric of the late 1990s and early 2000s, when hopes that digital networks would enable new, democratic forms of collaboration seemed to find a ready vehicle in Wikipedia’s 2001 launch. Wikipedia elicited curious and critical examinations from media studies scholars and popular commentators alike regarding the ways that internet users might use a wiki content management system to actively participate in the construction of their mediated worlds. Today, Wikipedia is the fifth most-visited website globally, as well as the only non-profit, community-run platform of that scale in a commercial internet economy.
Yet Wikipedia is far from the utopian example of “non-market peer production” that Yochai Benkler described in 2006 (5). While it is an encyclopedia that everyone can edit, not everyone does: Wikipedia’s editorial community has a well-documented “gender gap”: around 85% to 90% of volunteers are white men from the global north (Wikimedia Foundation 2011; Herring et al. 2011).1 In this way, Wikipedia has not delivered on the promise of Web 2.0 as a conduit to a more open and inclusive democracy, though its failures are not unique; the Wiki-movement resembles other male-dominated free and open-source software projects (Reagle 2013). Indeed, Wikipedia falls short of being a feminist platform by any definition. Though its community of volunteers has accomplished something remarkable in collaboratively developing the largest compendium of information ever written, the epistemological foundations of Wikipedia’s social and technical infrastructure reproduce a Western masculine bias throughout the project (Ford and Wajcman 2017). In other words, not only are voices missing from its volunteer base, but biases are woven into the norms of wiki-based collaborative editing and the mandate to establish authority by citing secondary sources. At the same time, these iterations of gendering are opportunities to explore how axes of power work through the interlocking of social processes and technical procedures, what I’ll call “socio-technical” processes in this essay. Also in motion are opportunities for intervention and activism. This chapter pivots feminist critiques of Wikipedia’s infrastructure to identify ways that media practitioners can understand, and begin to ameliorate, the reproduction of oppressions that has materialized as discourse, process, and participation on Wikipedia.
As a feminist writer, I wish to be cognizant of my own situated perspectives and their limits. I have written this chapter as a “critical Wikipedian,” a hat I have worn since 2012, when I began as a volunteer Wikipedia editor, researcher, and event organizer at an edit-a-thon for feminist academics in higher education. More recently, I have served as Wikipedian-in-Residence (WiR) with the global library cooperative Online Computer Library Center (OCLC), brought on to strengthen ties between public library staff and Wikipedia by designing and delivering training programs to hundreds of public library staff, a position partially funded by grants from the Knight Foundation and the Wikimedia Foundation. It is thanks to my experiences as a Wikipedia trainer, feminist critic, academic, volunteer organizer, and editor that I write this chapter.
A trusted authority: unpacking the neutrality of Wikipedia
In 2018, amid public distrust of for-profit social media and concern for the health of democracy, Wikipedia has emerged in the United States as an example of a successful online community, where ordinary people have collaborated to develop and adhere to processes for writing and sharing neutral, fact-checked information. For many years, Wikipedia was considered amateur; its reputation has since changed. For instance, Katherine Maher, currently Executive Director of the Wikimedia Foundation, gestured to this recasting of authority in Wikipedia’s community processes on a “fake news” panel at Wikipedia Day, held in New York City in January 2016. “Now Wikipedia is giving advice to The New York Times on how to build public trust and transparency,” Maher joked to a Times editor, at a moment when trust in both traditional journalism and social media giants was under critique. In 2017, The Economist’s editors used Wikipedia as an example of an online community that Facebook might look to, were social media to be treated as a public service (“Once Considered a Boon to Democracy” 2017). By early 2018, social media companies such as YouTube and Facebook had begun embedding Wikipedia content in newsfeed posts as a guidepost for readers, in order to address the circulation of conspiracy theories and disinformation (Herrman 2018). Wikimedia content also powers semantic web content and knowledge graphs for Google, Yahoo, and Apple (Matsakis 2018). Though, as I have pointed out, Wikipedia is far from achieving its promise as a beacon of participatory media for democracy, it is considered by many to be a bright spot and is now an increasingly influential fixture in social life.
Neutrality, the “bird’s eye view”
With Wikipedia’s heightened prominence in the social arena, feminist media practitioners can help guide students to think critically about what neutrality is on Wikipedia and how it connects to feminist epistemological frameworks. Wikipedia’s participatory guidelines encourage editors to adhere to a set of best practices in style, tone, and referencing. According to the Neutral Point of View policy page, this means the tertiary reference should be “representing fairly, proportionately, and, as far as possible, without editorial bias, all of the significant views that have been published by reliable [secondary] sources on a topic.” For feminists, this is an opportunity to question what constitutes a reliable source and to introduce the concept of situated knowledges, while also acknowledging the differences between fringe theories and marginalized perspectives and why we may be drawn to the latter. Conversations can acknowledge that what is deemed “legitimate” by other knowledge-making institutions, such as libraries or academic indexing services, can carry prejudices and biases (Christen 2015). What counts as a “legitimate” secondary source has been shaped historically by legacies of injustice and erasure, and these omissions can be carried forward through library cataloging systems and academic indexes.
For feminist researchers concerned with the pervasiveness of internet-mediated information in daily life, Wikipedia provides an opportunity to pause and examine the epistemological position one must take to presume there can be a “sum” of human knowledge, as well as a chance to highlight feminist and other scholarly traditions that describe knowledge as situated and foreground the role of context and power in understanding how knowledges are constructed and legitimated (Harding 1991; Haraway 2003). In other words, feminist practitioners can use Wikipedia as a participatory medium through which to contextualize and interrogate how authority is derived.
“Anyone can edit,” but who can do what?
In spite of its use as an authoritative reference with 5.4 million articles in English, Wikipedia does not claim to be comprehensive; it is a community-run reference with constantly changing content and meticulous version control that “anyone can edit” (Ford and Wajcman 2017). Yet, as mentioned, not everyone edits. Among those who do, not all editing accounts have the same features. On English Wikipedia, there are approximately 300 edits per minute by more than 100,000 editors monthly, including bots, which are automated accounts that other editors have reviewed and approved.2 Editors who volunteer to contribute call themselves “Wikipedians” and are identified by Wikipedia usernames (which are often not real names); technical privileges, such as the ability to create a new article, are earned according to edit counts and reviews of editing behavior (Jemielniak 2014).3 No payment can be exchanged for editing; paid contributions without explicit disclosure violate Wikimedia’s terms of use (Wikimedia Foundation 2018; see section “Refraining from certain activities”), and users with more than one account (“sockpuppetry”) may be blocked. The total number of Wikipedia editors exceeds 170,000, yet fewer than 3,500 editing accounts make more than 25 edits per month, and this number has been shrinking. With Wikipedia’s maturation since 2007, there has been a slow decline of active editors and higher barriers to entry for newcomers (Jemielniak 2014, 102; Halfaker et al. 2013; van Dijck 2013, 134). The barriers to entry are, in part, due to different technical permissions. However, a user’s authority is not just about the technical attributes ascribed to the account; authority is also an expression of the extent to which the user is fluent in wiki-specific social and technical processes and able to marshal “wiki-slang” in their edits. The “Wikipedian,” then, performs membership in the online environment not only by bringing new information to the encyclopedia, but also through skill and adeptness with insider knowledge and the confidence to act upon the wiki-encyclopedia’s inner workings. Thus, we can see that Wikipedia’s free and open transparency, while distinct from proprietary social media companies, maintains and even amplifies other forms of social and cultural power.
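Much of this activity is publicly observable through the MediaWiki API that Wikipedia exposes. What follows is a minimal sketch, assuming Python and the third-party requests library, that samples the most recent changes on English Wikipedia and tallies how many were made by flagged bot accounts; the endpoint and query parameters belong to the standard MediaWiki API, while the tally itself is only illustrative.

```python
import requests

# Public MediaWiki API endpoint for English Wikipedia.
API = "https://en.wikipedia.org/w/api.php"

# Sample the 500 most recent changes, requesting user names and flags.
params = {
    "action": "query",
    "list": "recentchanges",
    "rcprop": "user|title|timestamp|flags",
    "rclimit": 500,
    "format": "json",
}
changes = requests.get(API, params=params).json()["query"]["recentchanges"]

# Edits by approved bot accounts carry a "bot" flag in the response.
bot_edits = [rc for rc in changes if "bot" in rc]
print(f"{len(bot_edits)} of {len(changes)} sampled edits were made by bots")
```

Running a sample like this at different times of day makes concrete how much of Wikipedia’s constant churn is automated rather than human.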
Wikipedia’s thought collectives
Volunteers’ knowledge about the inner workings of Wikipedia is necessary to participate fully in decisions about representation and visibility on the platform. Not only do people need to want to edit, but they also need to know how to participate in the peculiar socio-technical culture. A peek at the “View history” and “Talk” tabs of an article will reveal the many layers of conversation that have taken place on any given article (the short sketch following the quotation below shows how these histories can also be read programmatically). Are these ideal versions of equitable collaboration with a general public (and what might we even expect that to look like)? Not surprisingly, the answer is no. A large-scale analysis of editorial conversations about articles on women or themes typically pertaining to them (such as women’s fashion or health) found that such articles receive more scrutiny from editors and are held to higher notability standards.4 An essay by Kristin Menger-Anderson, musing on reasons for the invisibility of female mathematicians on Wikipedia, provides context for how this plays out; in this case, articles about female mathematicians are four times more likely to be nominated for deletion than those about their male counterparts (2018). Her critique resonates with the insights of historians and scholars of science and technology on the role of collective context in determining scientific authority (Fleck 1981):
In the vast gray area between a Nobel prize winner and unpublished newcomer, what and who is notable enough for record cannot be separated from the community that feels passionate enough to document or delete this information. Nor can we separate greatness from the criteria we use to define it.
(Menger-Anderson 2018, para. 20)
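These layered conversations are themselves public records. As a minimal sketch, assuming Python, the requests library, and a placeholder article title, the MediaWiki revisions API can retrieve the edit history of any article’s Talk page, making visible who has debated a topic, when, and with what stated rationale:

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

# Placeholder title: any article's Talk page can be substituted here.
params = {
    "action": "query",
    "prop": "revisions",
    "titles": "Talk:Ada Lovelace",
    "rvprop": "user|timestamp|comment",
    "rvlimit": 20,
    "format": "json",
}
pages = requests.get(API, params=params).json()["query"]["pages"]

# The API keys results by internal page ID; print each revision's
# timestamp, author, and edit summary.
for page in pages.values():
    for rev in page.get("revisions", []):
        print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```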
The criteria defining notability on Wikipedia require editors to learn and apply a specific communicative vocabulary. Moreover, establishing notability can require emotional labor for marginalized voices seeking to establish authority (Menking and Erickson 2015); at the very least, it is a time-consuming process to defend an article before it is fully developed. This leads some Wikipedia editing trainers to suggest that new editors not start articles themselves or, if they do, that they cite a minimum of five verifiable, independent sources and submit the article for review.
There is evidence that Wikipedia’s socio-technical culture, specifically its policies on notability and neutrality, can produce new biases. A 2015 analysis of biographies demonstrates that there are biases endogenous to the content creation process that cannot be attributed to existing prejudices in the secondary literature: women’s biographies are shorter and have fewer out-links, and women are overrepresented in some biographical categories without obvious gender emphases, such as “Fictional Character” and “Royalty.” In other words, far from being a bastion of objective knowledge or a mirror of preexisting knowledge, Wikipedia’s neutral point of view and notability are accomplished through the iterative work of subjective volunteer editors and automated bots, whose procedural policing and debate give the online encyclopedia its impression of authority and reliability.
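The metrics in that analysis, article length and out-links, can be inspected on a small scale with the same public API. Below is a minimal sketch, again assuming Python and the requests library, with two placeholder biography titles; a systematic comparison would of course require a large sample and statistical controls, as the 2015 study performed.

```python
import requests

API = "https://en.wikipedia.org/w/api.php"

def article_stats(title):
    """Return (byte length, internal out-link count) for one article."""
    params = {
        "action": "query",
        "titles": title,
        "prop": "revisions|links",
        "rvprop": "size",   # byte length of the latest revision
        "pllimit": "max",   # up to 500 links per request; no paging here
        "format": "json",
    }
    pages = requests.get(API, params=params).json()["query"]["pages"]
    page = next(iter(pages.values()))
    return page["revisions"][0]["size"], len(page.get("links", []))

# Placeholder biographies; substitute any pair of articles to compare.
for title in ("Ada Lovelace", "Charles Babbage"):
    size, links = article_stats(title)
    print(f"{title}: {size} bytes, {links} out-links (first batch)")
```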
Labors of participating
There are many pleasant and helpful Wikipedia editors who follow the community mandate to “assume good faith” in others. Yet, as elsewhere on the open web, Wikipedia editors who self-identify as women or non-binary must at times tolerate hostile, sexist discourse from other editors, sometimes directed at them personally. Editors who identify as female on their user page are more likely to have their edits deleted (reverted) than male editors (Lam et al. 2011, 15). Consider the case of the user Lightbreather, who was told by an administrator, a special class of editor, that civility was not something she could expect. When she complained about obnoxious, sexualized comments made in response to her edits, she was banned from editing (Paling 2015). More mundane versions of sexism also persist, with editors posting sexist comments on articles related to femininity or women’s health. In learning how this particular collaborative community has developed, there is the opportunity to imagine what it might look like done differently. What would an alternative collaborative space look like, and how might it remain open yet free from harassment and devaluation?
Beyond emotional labor, Wikipedia has been a nexus for harassment. In some cases, harassment between Wikipedia editors has continued off the site, leading to doxxing, the public sharing of personal information, such as an address or phone number, for the purposes of harassment (Paling 2015). There is also evidence that non-Wikipedians have gone to Wikipedia specifically to harass by editing living persons’ entries with false or derogatory information, as was the case with Gamergate (Mandiberg 2015; Sarkeesian 2012). To find redress, editors who have experienced harassment have had to marshal support working within Wikipedia’s peer-produced policies and guidelines, which prioritize transparency and freedom to create. Consider the efforts of Pax Ahimsa Gethen, user: Funcrunch, to protect their user page from transphobic defacement. For Gethen, persistent harassment such ...