Wikipedia’s New Code Of Conduct Gets One Thing Right; Another Will Be A Struggle

A major social network announced a new set of rules for its members Tuesday, and by itself that might not rate as news. 

But Wikipedia isn’t just any social network, and its new rulebook stands apart from the terms of service handed down by commercial social platforms like Facebook and Twitter.

The “Universal Code of Conduct” announced Tuesday by the Wikimedia Foundation, the San Francisco nonprofit that hosts Wikipedia and related projects, isn’t a top-down product. Instead, Wikipedians collaborated to write it, much as almost 356,000 of them regularly create or edit entries in that online encyclopedia. 

“More than 1,500 Wikipedia volunteers from 19 different Wikipedia projects representing five continents and 30 languages participated in the creation of the universal code of conduct,” Wikimedia’s announcement notes. 

That goes well beyond earlier moves by commercial social platforms to borrow the collective wisdom of their crowds. See, for example, Twitter adopting the foundational features of @ mentions and hashtags from its early users, or Facebook letting users vote on new terms of service before scrapping that experiment in 2012 after too few people bothered to cast virtual ballots.

At Wikimedia, the collective drafting of the new code began with input from around the world about the need for revisions to its earlier terms and involved months of collaboration. 

“They’re an alternative model to the private social experience that exists almost everywhere else,” said Alex Howard, director of the Demand Progress Education Fund’s Digital Democracy Project.  

The results also differ from many other codes of conduct by virtue of being unusually short—under 1,700 words, or less than 1,300 if you subtract the introductory paragraphs. 

The operative text starts not on a thou-shalt-not note, but with a you-should list of expected behavior of any user: “Practice empathy”; “Assume good faith, and engage in constructive edits”; “Respect the way that contributors name and describe themselves”; “Recognize and credit the work done by contributors,” among others.

“The organization is saying, here are our values,” Howard said. “They’re giving people scaffolding to interact with each other.” 

An “Unacceptable behavior” list follows, including a broadly constructed ban on harassment. This covers the usual categories—for instance, insults targeting personal characteristics, threats, and doxing—but also the broader category of being a jerk.

That’s both necessary, because people who punch down a little in public often do so more in private, and tricky, because these lesser fouls aren’t as obvious.

“People at times assume that it’s unintentional,” said Caroline Sinders, founder of Convocation Design + Research and an expert in online harassment research who’s worked with the Ford Foundation, Amnesty International and others (including an earlier stint at Wikimedia itself).

Or, she added, the offense will go unrecorded and then forgotten without a “ladder of accountability” that recognizes how unchecked minor abuses can lead to more toxic behavior. 

These provisions also cover behavior outside Wikimedia projects. For example, the doxing clause notes that “sharing other contributors’ private information, such as name, place of employment, physical or email address without their explicit consent” is out of line “either on the Wikimedia projects or elsewhere.”

There’s a complicating factor here in Wikimedia’s understandable lack of a real-names policy—enforcing one would endanger marginalized communities, and in particular those living under abusive governments. Wikipedia doesn’t even require an email address to create a contributor account.

Wikimedia Foundation communications lead Chantal De Soto noted this issue in an email: “enforcing any breaches of conduct that happen on other platforms is often very difficult—verifying connections between Wikimedia accounts, and, for example, a Twitter account, is often not straightforward.” 

But it’s important that Wikimedia communities make that effort, considering all the evidence now available of how online radicalization can erupt in the physical world.

“All we have to do is look at January 6 to get a sense of what happens when that goes too far,” Howard said of the riots that took place at the U.S. Capitol.

The next chapter in Wikimedia’s effort will involve more collaboration on enforcement policies and mechanisms. This may be the most difficult part, since it will involve setting up structures that can work at scale and across cultures. 

“A community needs to think about how they’re going to document these cases, who has access to them, how are they keeping track of things, how are they going to respond to harassment,” said Sinders. 

Done right, this may require hiring more dedicated trust-and-safety professionals. 

“In open-source communities, a lot of this arduous labor is falling to volunteers,” Sinders warned. “And that leads to community burnout.”
