Professor examines ways in which governments, private companies censor online speech
LAWRENCE — Work by a University of Kansas researcher challenges the notion that the Internet is a democratic and liberating forum that facilitates an unimpeded exchange of ideas, free of interference and the censorship of unpopular views. Instead, it is a place where private businesses and governments around the world have at their disposal a multitude of ways to censor speech, the professor says.

“Right now the most important chapter in the story of free speech is being written by Internet companies,” said Jonathan Peters, assistant professor of journalism and faculty affiliate at the Information and Telecommunication Technology Center. “They are acting as arbiters of free speech globally in ways that we traditionally reserved for courts.”

Peters outlines how various intermediaries can control online speech in “The Spectrum of Internet Chokepoints,” a paper he recently presented at the International Symposium on Digital Ethics. The paper is part of a larger body of work Peters is undertaking to examine free speech and the Internet. The big question he is trying to answer is whether the First Amendment can be applied to companies that exercise the power to remove speech from online sources. Generally, the First Amendment limits only the government’s ability to restrict speech.

The first piece in the series outlines the various ways speech can be controlled or censored. Among the most common mechanisms are the terms of use and the community guidelines adopted by popular sites, such as YouTube, Twitter and Facebook. They tell users what types of content they may and may not post. Notably, the companies often claim to be devoted to democratic principles (for example, in their mission statements or founding documents), but in practice they sometimes censor content with inconsistent reasoning, the professor said.

Peters shares the example of “Innocence of Muslims,” a video that sparked outrage in countries throughout the Middle East for its perceived criticism of Islam. While YouTube allowed the video to remain online in the United States, saying the video broke no laws here, the company removed the video in countries where it violated local laws. However, in Libya and Egypt, the video was removed or blocked even though the company said it did not violate local laws. YouTube issued only vague statements about its decisions, according to the researcher.

That is but one example of the inconsistency that has become the rule rather than the exception, Peters said. Facebook has been criticized over the years for what content it allows. At one point, a team of six Facebook staffers was tasked with making final decisions about what content to remove in response to user complaints — guided by highly imprecise standards. Further complicating matters, the company has used lower-level content moderators to act as first responders when complaints come in. Located all around the world, they come from different cultures and belief systems, making it nearly impossible to establish consistent practices, Peters said.

In another area, copyright law has protected content from unauthorized use, but it has also been used creatively as a mechanism to censor certain types of speech, chiefly through the threat of legal action. Many Internet service providers routinely remove content upon receiving a takedown request on copyright grounds, because complying is cheaper and easier than fighting potentially expensive legal proceedings, and federal copyright law shields providers from liability in some cases as long as they remove the content promptly.

Perhaps more troubling, Peters argues, is the ability of authoritarian countries to create an official version of reality by blocking certain queries from search engines and by controlling access to the network itself. In 2011, Egypt did just that when it effectively turned off the Internet and shut down cellphone service, causing a 90 percent drop in data traffic to and from the country. The intent and effect were to neutralize a tool that antigovernment protesters used to organize and speak during the Egyptian Revolution.

“For a government to have the capacity to control so easily, on such a large scale, what its citizens read and hear — that’s really troubling,” Peters said. “It can foreclose entire areas of thought. It can stunt intellectual growth. It can retard scientific progress. It can do all manner of bad things, all in service of societal harmony or citizen compliance.”

Throughout his research, Peters examines a number of other chokepoints, including content hosts, search and application providers, and filtering software. He also explores legal liability for intermediaries under the Communications Decency Act. In future research, he plans to address how the state-action doctrine distinguishes public and private actors and how that doctrine applies to Internet companies, and to devise a state-action theory suitable for the digital world. The overall project is meant to inform policy discussions about creating a governance structure for a single, global Internet.

In the meantime, intermediaries hold a large amount of influence over online information and speech, often keeping the public in the dark about how and why they might decide to censor content.

“The companies themselves are wrestling with these questions,” Peters said. “And then all of us, as users, tend to have no idea how this works, and that’s often because the companies won’t tell us. I think we deserve more than that, and I’m trying to figure out legally if we can demand more than that.”

Tue, 01/27/2015

Mike Krings