Fighting Online Hate Speech
September 23rd, 2014 | Published in Google Public Policy
Posted by Christine Y. Chen, Senior Manager, Public Policy
Earlier today, the Anti-Defamation League (ADL) released its “Best Practices for Responding to Cyberhate.” For two years, Google has participated in an industry working group convened by the ADL where, together with several other companies, NGOs, and academics, we have exchanged insights and ideas on how to balance the need for responsible discourse with the principles of free expression. The best practices set forth by the ADL grew out of these conversations, and we are excited to see them being shared with the wider Internet community.
In line with the practices set forth by the ADL, we work hard at Google to combat the spread of hateful content in order to maintain safe and vibrant communities on platforms like YouTube, Blogger, and Google+. We don’t allow content that promotes or condones violence or that has the primary purpose of inciting hatred on the basis of race or ethnic origin, religion, disability, gender, sexual orientation or gender identity, age, nationality, or veteran status.
To make sure these communities stay vibrant, we also depend on our users to let us know when they see content that violates our policies. The Google Safety Center gives an overview of the tools that people can use to report content that violates our user policies on different products.
Here are more details about some of our content policies and how to flag violations:
- YouTube: If you see a video that runs afoul of our Community Guidelines, you can report it by clicking the flag icon below the video player. Then select the reason that best fits the violation — such as “hateful or abusive content” — and add any additional information that will help our reviewers make a decision. We have teams around the world reviewing content flagged by users 24 hours a day, 7 days a week, and they will remove any videos that violate our guidelines.
- Blogger: Our content policies describe what kinds of content are and are not allowed in blog posts. If you’re on a blog that seems problematic, click the “More” drop-down at the top of the page, then click on “Report abuse” and follow the prompts to alert us about any policy violations. If the blog owner has hidden that link, you can still report it by going to this Help Center page. Select the type of content policy violation you’re reporting—such as “hate speech” or “harassment”—and click through to enter the URL of the blog in question.
- Google+: Our user content policy outlines how we want to ensure a positive experience for our users. If you see inappropriate content, this Help Center page explains what to do. In a Google+ post, click the arrow in the upper right of the post, then click on “Report this post” to open a pop-up where you can select the reason for your report—like “hateful, harassing or bullying.” To report a Google+ comment for a policy violation, click on the gray flag next to the comment.
These reporting systems operate much like an online neighborhood watch. We ask for your help in maintaining a community that provides a positive and respectful experience for everyone. The Internet has enabled anyone to become an artist, a writer, or a creator by simply using a keyboard and a few clicks to reach out to the rest of the world. The release of the ADL’s best practices is a good reminder that we must all work together to keep the Internet a safe and open place to exchange information and ideas, where people can connect and engage with each other in unprecedented ways.