Sunday, July 30, 2017



Silicon Valley Censorship

Google's latest project is an application called Perspective, which, as Wired reports, brings the tech company "a step closer to its goal of helping to foster troll-free discussion online, and filtering out the abusive comments that silence vulnerable voices." In other words, Google is teaching computers how to censor.

If Google's plans are not quite Orwellian enough for you, the practical results are rather more frightening. Released in February, Perspective's partners include the New York Times, the Guardian, Wikipedia and the Economist. Google, whose motto is "Do the Right Thing," is aiming its bowdlerism at public comment sections on newspaper websites, but the potential is far broader.

Perspective works by identifying the "toxicity level" of comments published online. Google states that Perspective will enable companies to "sort comments more effectively, or allow readers to more easily find relevant information." Perspective's demonstration website currently allows anyone to measure the "toxicity" of a word or phrase, according to its algorithm. What, then, constitutes a "toxic" comment?
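For illustration, here is a minimal sketch of how a publisher might query Perspective's scoring programmatically. It assumes Google's publicly documented v1alpha1 commentanalyzer endpoint, the Python requests library, and a registered API key; the API_KEY placeholder and the exact field names are assumptions for illustration, not details taken from this article.

# Minimal sketch: asking the Perspective API for a "toxicity" score.
# Assumes the public v1alpha1 commentanalyzer endpoint and a valid API key;
# API_KEY is a placeholder, and field names may differ from current docs.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder; issued via the Google API console
URL = ("https://commentanalyzer.googleapis.com/v1alpha1/"
       "comments:analyze?key=" + API_KEY)

def toxicity(text):
    """Return the estimated probability that `text` is perceived as toxic."""
    payload = {
        "comment": {"text": text},
        "languages": ["en"],
        "requestedAttributes": {"TOXICITY": {}},
    }
    response = requests.post(URL, json=payload, timeout=10)
    response.raise_for_status()
    scores = response.json()["attributeScores"]
    return scores["TOXICITY"]["summaryScore"]["value"]

if __name__ == "__main__":
    # The article reports a score of roughly 0.87 for this sentence.
    print(toxicity("ISIS is a terrorist group"))

A site could then hide, flag, or down-rank any comment whose returned score crosses a threshold of its choosing, which is precisely the sorting and filtering use that Google describes.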

The organization with which I work, the Middle East Forum, studies Islamism. We work to tackle the threat posed by both violent and non-violent Islamism, assisted by our Muslim allies. We believe that radical Islam is the problem and moderate Islam is the solution.

Statements rated as "toxic" by Google's Perspective software.
Perspective does not look kindly on our work; a selection of statements and their scores appears above. No reasonable person could claim that saying "radical Islam is a problem" is hate speech. But the problem is not limited to opinions: even factual statements are deemed highly "toxic." Google considers the statement "ISIS is a terrorist group" to have an 87% chance of being "perceived as toxic," and it assigns 92% "toxicity" to a statement of the publicly declared objective of the terrorist group Hamas.

Google is quick to remind us that we may disagree with these results. It explains: "It's still early days and we will get a lot of things wrong." The Perspective website even offers a "Seem Wrong?" button to provide feedback.

These disclaimers, however, are very much beside the point. If it is ever "toxic" to deem ISIS a terrorist organization, then -- regardless of whether that figure is the result of human bias or an under-developed algorithm -- the potential for abuse, and for widespread censorship, will always exist.

SOURCE

1 comment:

ЯΞ√ΩLUT↑☼N said...

Pepsi..?