
Delving Deeper Into Facebook's Murky Community Standards


Since addressing the issue of Facebook's "community standards" last month, this reporter has found that it isn't just military history that can somehow be in violation. Vintage advertisements, old comic books, antiques and even toys can all run afoul of Facebook's algorithms, and because the social network tends to err on the side of caution, anything even slightly controversial can be flagged.

It is true that Facebook is trying to maintain community standards to ensure the safety and well-being of its users. But does flagging a swastika on a vintage comic book (where the Nazis are clearly presented as the "bad guys") or banning the posting of an old (and arguably offensive) piece of advertising really address the issue?

Instead of creating a space for reasonable discourse among members – including in so-called private groups – Facebook simply uses its community standards to block it all.

Is Pop Culture Next?

Even old movies and TV shows could be a problem, based on how Facebook has flagged images or certain "hot button" keywords. As metadata and metatags are increasingly used to help search engines sort and catalog content online, that same information could, by its very nature, result in content being blocked on such platforms.

"There is no question that older films could run afoul of 'community standards,'" explained Andrew Nelson, chair of the Department of Film & Media Arts and associate professor of Film Studies at the University of Utah.

There is some irony here, as Hollywood's "Hays Code" was used to maintain community standards for decades.

"Movies reflect, to a certain degree, the standards and morals of the time they were made, and we all know that standards and morals change," said Nelson. "With that said, I think that the majority of people who are inclined to watch older films understand this. Old movies are often presented in a 'curated' fashion – on TCM, or on Blu-ray with supplementary features – where context is provided. There is also something to be said about the history of cinema, which is partly a history of overcoming censorship."

Whitewashing History

The danger of using a catch-all "community standards" policy on social media is that it silences all discussion, at least in the open. It essentially blocks content that could be even slightly controversial and stops any meaningful conversation in the process.

"In the interest of not propagating hate and racism, we need to be careful not to white wash our history," suggested Nathaniel Ivers, department chairman and an associate professor in the Online Master's in Counseling Program at Wake Forest University.

"As the cliché goes, if we don't study and learn from history, we are likely to repeat it," warned Ivers. "What I'm less certain about is where the line should be drawn in terms of social media posts and the virtual sharing of comics, model cars and airplanes, etc."

Banning images for being potentially offensive – such as a vintage advertisement or a historic photo – presents its own problems.

"In retrospect, we can truly say that the images used to advertise (some vintage) products were, at minimum, irresponsible," noted James R. Bailey, professor of leadership at the George Washington University School of Business. "Culpably racists, though? Unlikely. It was just marketing, as crass as it was." 

The fact that such an old ad campaign might not have been culpably racist wouldn't matter in terms of social media community standards. The mere fact that it could offend would be enough for it to be deemed a problem.

That in itself is worrisome, added Bailey. "The last time America went through this exercise, we burned books, such as Harper Lee's To Kill a Mockingbird. We don't burn books anymore. But we do ban them, or any images that could be interpreted as offensive, on social media. Be that pictures of innocent childhood toys, restaurants of bygone days, or licorice candies. These things are not subversive. They are history. Let us have a wry smile at them, and then understand them for what they were."

An Enforcement Issue

At the heart of the matter, Facebook can be forgiven for how it handles these problems. Who can really defend an old ad that would be considered racist today, and who can really defend the display of a swastika, even if it is on a flag that granddad brought back from Normandy Beach 76 years ago?

Yet the blanket term "community standards" ensures that neither item can even be discussed in the open on the platform.

"Put simply, Facebook cannot figure out how to consistently enforce its own standards," said David Kirsch, associate professor of Management and Entrepreneurship in the online MBA program at the University of Maryland.

"Leaving everything up to the algorithm results in the experiences you have documented – seemingly innocuous, historically legitimate photos, materials and contributions being improperly flagged as violations," explained Kirsch. "Finding exactly where the Facebook AI draws the line between an inappropriate piece of Nazi propaganda and a legitimate historical inquiry into that very same topic is impossible because the algorithm itself is proprietary and therefore invisible to us. We only see the results."

How those results are handled is also at issue.

"Some items are blocked that shouldn't be – false positive – while others get through that perhaps shouldn't – false negatives, like the questionable Trump posts," said Kirsch. "Many grey areas seem to beg for clarity, but we get nothing. Solutions to this problem are not easy, given the scale of the challenge, but it is not impossible. For instance, we could imagine a community standards board constructed such that someone in your position would expressly flag your article for review, thereby signaling to the board your awareness of the boundaries you are nearing."

Facebook – like any social media company – is not really there to create a platform for meaningful discussions, and that perhaps is a key point to remember. In some ways its community standards are no different than the local watering hole having a policy that bans discussions of religion or politics, or an academic club that bans all business talk.

Then there is the fact that Facebook simply can't be bothered with becoming a platform for meaningful discussion.

"I can understand how social media companies might make sweeping policy decisions to restrict these types of posts because it would be very difficult and costly to develop and execute assessments that determine users' intentionality behind their posts," added Ivers. "I suspect these companies are trying to avoid loopholes that hate groups could use to push vitriolic agendas."
