YouTube Deletes 5 Million Videos for Content Violations

These are the steps YouTube says it has taken against extremism, misinformation and child endangerment.

YouTube deleted about 5 million videos from its platform for content policy violations in last year's fourth quarter before any viewers saw them, it said in a new report that highlighted its response to pressure to better police its online community.

YouTube has been criticized by governments that say it does not do enough to remove extremist content, and by advertisers such as Procter & Gamble Co and Under Armour, which briefly boycotted the service when their ads unwittingly ran alongside videos the companies deemed inappropriate.

YouTube said in the report Monday that automating enforcement through software "is paying off" in quicker removals. The company said it did not have comparable data from prior quarters.

YouTube said it still needed an in-house team of humans to verify the automated findings on an additional 1.6 million videos, which were removed only after some users had watched the clips.

The automated system did not identify another 1.6 million videos that YouTube took down once they were reported to it by users, activist organizations and governments.

"They still have lots of work to do, but they should be praised in the interim," said Paul Barrett, who has followed YouTube as deputy director of the New York University Stern Center for Business and Human Rights.

Facebook also said on Monday it had removed or put a warning label on 1.9 million pieces of extremist content related to ISIS or al-Qaeda in the first three months of the year, or about double the amount from the previous quarter.

Corralling problematic videos, whether through humans or machines, could help YouTube, a major driver of Google's revenue, stave off regulation and a sales hit. For now, analysts say demand for YouTube ads remains robust.

The following are steps that YouTube has taken:

1. Extremism

YouTube officials say the company removes videos that contain hate speech or incite violence. It issues "a strike" to the uploader in each instance and bans uploaders with three strikes in a three-month period. Also banned are government-identified "terrorist organizations" and materials such groups would upload if they could. YouTube shares the digital fingerprints of removed videos with a consortium of tech companies.

White supremacists give the 'Roman salute' to antagonize protesters in Shelbyville, Tennessee, US, on 28 October 2017.
Shay Horse/NurPhoto via Getty Images

Borderline videos get stamped "graphic" and stripped of features that would give them prominence. YouTube added options for advertisers to avoid sponsoring these videos last year.

YouTube's automated scans have sped up takedowns of videos tied to ISIS or al-Qaeda, but the company has struggled to draw a line on views espoused by white right-wing extremists, who tend to know the rules well and stop short of overt hate speech.

3D plastic representations of the Twitter, Facebook and YouTube logos are seen in front of a displayed ISIS flag in this photo illustration shot February 3, 2016.
REUTERS/Dado Ruvic/File Photo

2. Misinformation

YouTube said it would be difficult to enforce a "truth" policy, so the company instead looks for other policy violations as grounds to remove videos that contain misleading information.

For instance, YouTube could delete a fabricated news report on the grounds that it harasses its subject.

Since autumn, it has promoted "authoritative sources" such as CNN and NBC News in search results to push down problematic material. YouTube also plans to display Wikipedia descriptions alongside videos to counter hoaxes.

But YouTube is still cited as slow to identify misinformation during major breaking news events, when video bloggers quickly upload commentary. The company preserves other challenged clips that have public interest value or come from politicians.


3. Child endangerment

YouTube last year began removing videos and issuing strikes when the filming may have put a child in danger or when a cartoon character is used inappropriately.

YouTube does not alert law enforcement or intellectual property owners about these videos because, it says, it cannot easily identify uploaders and rights holders. Rights holders who believe a video violates guidelines or infringes their copyright or trademark can report it to YouTube.

The company last year began stepping up moderation of comments that inappropriately reference children.

Reporting by Paresh Dave; Editing by Greg Mitchell and Susan Thomas
