Facebook and YouTube Now Automatically Block Extremist Content


Facebook and YouTube, two of the web’s premier platforms for video content, have implemented automated processes for removing extremist videos from their sites.

Internet companies have come under mounting pressure from government leaders to eliminate violent propaganda and other video content espousing extremist views, especially as terrorist attacks continue to proliferate and dominate headlines across Europe and America.

The technology that these two internet behemoths are likely using is called hashing, which was originally developed to protect copyrighted material from piracy and infringement. The system assigns each video a “hash,” which functions as a digital fingerprint. Once a video is flagged for removal, the technology blocks and takes down all other videos with a matching hash.
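Neither company has published its implementation, so the following is only a minimal sketch of the exact-match hashing idea described above. All names here are hypothetical, and a cryptographic hash (SHA-256) stands in for the more robust, alteration-tolerant fingerprints a production system would likely use:

```python
import hashlib

# Hypothetical blocklist of fingerprints of videos already flagged for removal.
BANNED_HASHES: set[str] = set()

def fingerprint(video_bytes: bytes) -> str:
    """Compute a digital fingerprint (hash) of a video's raw bytes."""
    return hashlib.sha256(video_bytes).hexdigest()

def flag_for_removal(video_bytes: bytes) -> None:
    """Add a flagged video's fingerprint to the blocklist."""
    BANNED_HASHES.add(fingerprint(video_bytes))

def is_blocked(video_bytes: bytes) -> bool:
    """Reject any upload whose fingerprint matches a banned video."""
    return fingerprint(video_bytes) in BANNED_HASHES
```

Note the limitation this sketch shares with the approach reported here: a byte-identical repost of a flagged video is caught, but brand-new footage (or, with a plain cryptographic hash, even a slightly altered copy) produces a different fingerprint and passes through.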

While such measures help stem the dissemination of inappropriate videos by preventing them from being reposted, they would be unable to stop new Islamic State propaganda from being posted in the first place.

Google, which owns YouTube, and Facebook have yet to comment on or reveal any details regarding their anti-extremism measures. It is unknown how much human input is involved in identifying banned content and what standards the companies are using to determine whether content is extremist. Such standards are likely subject to further debate and refinement given that, unlike child pornography and pirated material, extremist propaganda is difficult to define and is largely protected by the First Amendment.

The notion of a content-blocking system was first advanced earlier this year by a non-government organization called the Counter Extremism Project, which was founded by Frances Townsend, a homeland security advisor to former President George W. Bush.

While internet companies expressed wariness of letting a third party act as an arbiter of acceptable content, it’s also clear that, amid pressure from the government and public, they have decided that user-driven flagging and removal of content is no longer sufficient and that more drastic intervention may be necessary.