Google broadens takedown of extremist YouTube videos

Alphabet Inc’s Google has in recent months begun removing extremist videos from YouTube even when they do not depict violence or preach hate, YouTube said on Monday, a major policy shift as social media companies face increasing pressure from governments.

The new policy affects videos featuring people and groups designated as terrorists by the US or British governments but lacking the gory violence or hateful speech that YouTube already barred.

A YouTube spokesperson, who asked not to be named for security reasons, confirmed the policy in response to questions. The company would not specify when the policy went into effect.

YouTube’s terms of service already barred “terrorists” from using the platform; the new policy keeps out videos uploaded by others that militants would likely try to distribute if they could have accounts, according to the spokesperson.

Hundreds of videos of slain al Qaeda recruiter Anwar al-Awlaki lecturing on the history of Islam, recorded long before he advocated violence against the United States, were among those removed under the new policy, the spokesperson said.

Governments and human rights groups have pressed YouTube for years to crack down on extremist videos, arguing that such propaganda radicalizes viewers and has contributed to deadly attacks.

British Home Secretary Amber Rudd amplified the pressure during visits with tech companies in Silicon Valley in July and a speech in Washington, DC last week. European Union and US lawmakers this year have threatened consequences for tech companies if concerns are not addressed.

Legislation could resemble a German law approved in June to fine social media companies up to 50 million euros ($57 million) if hateful postings are not promptly removed.

YouTube said discussions with outside experts prompted the new policy, but it was unclear why the company decided to act only recently. In June, the company announced that “inflammatory religious or supremacist content” that did not violate its policies would be allowed with warning labels and a restriction making it ineligible for ad revenue.

At the time, Google General Counsel Kent Walker said in a blog post, “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”

The latest step goes further and was praised by critics such as Paul Barrett, deputy director of the New York University Stern Center for Business and Human Rights.

“If the terrorist is in the business of recruiting and inciting people to make violent attacks, you’ve got to draw the line” against any of their content, Barrett said.

The new policy does not affect news clips or educational videos about terrorism. But experts said YouTube will not always find the distinction easy to draw, pointing to tactics such as overlaying extremist commentary on news footage to evade censors.

YouTube has resisted imposing more editorial control because it fears making it harder for important videos to get a wide audience, Juniper Downs, YouTube’s global director of public policy, told a San Francisco conference sponsored by the Anti-Defamation League on Monday.

“We will lose something very valuable if we completely transform the way these platforms work,” she said during a panel discussion.

Internet freedom advocates such as the Electronic Frontier Foundation have urged tech companies to be cautious and transparent in responding to government pressure.

YouTube is relying on government lists of terrorists and terrorist groups for enforcement. Content moderators check the listings and make removal decisions after fielding reports from an automated system, users or partner organizations such as the Anti-Defamation League and The Institute for Strategic Dialogue.

Al-Awlaki, whom the US killed in a 2011 drone strike, was designated a terrorist by the US Treasury the year prior.

The New York Times first reported the removal of al-Awlaki videos.

Reuters
