Facebook CEO and EU President - Terror Takedown | iTMunch

Since the war on terror has expanded to social media, EU lawmakers are drafting policies to tackle the growing problem. Trending in the latest tech news, the European Union has drafted a new law to counter online terrorist activity. Under the law, any social media platform will face a fine for failing to take down terrorism-related content within 60 minutes. Currently, the removal of such content by social networking sites like Facebook, Twitter, and YouTube is voluntary. Once this legislation is enacted, these internet companies will be required to delete material flagged as extremist by law enforcement agencies within an hour.

Facebook, Twitter and YouTube to Face Fines by EU

This latest tech development could be construed as controversial. Previously, policymakers have been accused of not doing enough to curb online terrorist activity. Julian King, the EU security commissioner, has refrained from divulging details about the scope of the law and how its measures would be implemented. However, he has hinted that the legislation would most likely include a mandate to remove inflammatory content within an hour, turning the existing voluntary guidelines into a mandatory requirement.

EU Will Take Stronger Action to Protect Its Citizens

According to EU officials, a law was deemed necessary because the existing voluntary approach was not showing “enough progress.” They also stated that European countries could not afford to become “complacent” or take terrorism-related matters lightly.

Although it might be a while before any definitive action is taken, the procedure to publish the legislation is already underway. The EU will publish the draft in September; however, EU member nations as well as the European Parliament will still have to vote on it.

How Does the Draft Affect Social Media Platforms?

Contrary to popular belief, a law regulating extremist content would not pose much of a challenge for the large social media platforms. Most of these companies, such as YouTube, already flag and take down terrorist content regularly, and tech giants like Facebook have the resources to hire content moderators to address such concerns. However, the law might prove challenging for smaller enterprises: budget constraints, limited staff, and logistics may make it difficult for them to match the response speed of their bigger counterparts.

SEE ALSO: Social Mapper to Use Facial Recognition to Target Users on Social Media

The tightening of the guidelines comes a few months after Facebook announced that it had taken down nearly 2 million pieces of content relating to ISIS and Al-Qaeda. The platform also stated that 99 percent of that content had been removed before it was reported. Meanwhile, Google has announced that it will add 10,000 employees to its workforce by the end of the year solely to tackle content guideline violations. YouTube, which is owned by Google, currently uses machine learning to counter terrorism-related material; almost 90 percent of such content is identified and removed by automated systems.

If the EU’s proposed legislation is approved, it will mark a milestone for the European Commission in its effort to regulate how tech giants handle illegal content. Subscribe to iTMunch for the latest news and developments in technology.