Jean-Claude Juncker, in his State of the Union address to the European Parliament, said that one hour was the “decisive time window”.

Internet companies were given three months in March to show they were moving faster to take down extremist posts.

But EU regulators said the voluntary approach had achieved too little.


Under the proposal, content flagged as extremist by national authorities would have to be removed from the web within one hour. Internet companies that fail to comply would face fines of up to 4% of their annual global turnover.

The proposal will need the backing of EU member states as well as the European Parliament before it becomes law.


In response to the proposals, Facebook said: “There is no place for terrorism on Facebook. We share the European Commission’s goal of fighting it, and believe that it is only through a common effort across companies, civil society and institutions that results can be achieved.

“We have made significant strides finding and removing terrorist propaganda quickly, but we know we can do more.”

A YouTube spokeswoman said the site shared “the European Commission’s desire to react rapidly to terrorist content and keep violent extremism off our platform”.

“So we have invested heavily in people, technology and cooperation with other tech companies on these efforts.”

Internet platforms will need to develop new tools to police content, but it is not yet clear what form these will take.


“We need strong and targeted tools to win this online battle,” said Justice Commissioner Věra Jourová.

While companies such as Google emphasise the growing use of machine learning to flag material quickly, they also rely on large teams of human reviewers to assess extremist content.

The Commission has previously relied on a voluntary code of conduct on hate speech, agreed with Facebook, Microsoft, Twitter and YouTube.

What is already being done to tackle the problem?

In 2017, Google said it would dedicate more than 10,000 staff to rooting out violent extremist content on YouTube.
YouTube said those staff had reviewed about two million videos for violent extremism between June and December 2017.
More than 98% of the videos removed were flagged automatically, YouTube said, and more than half had fewer than 10 views.
Since 2015, industry members have worked together on a shared database of “digital fingerprints” of previously identified extremist material, to speed up its removal. By December 2017, it contained more than 40,000 “hashes”.
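The idea behind the shared database is that once one platform identifies a piece of extremist material, a fingerprint of it lets every other platform block re-uploads without reviewing the content again. A minimal sketch in Python, assuming a simple exact-match cryptographic hash (the real industry database uses perceptual hashes that survive re-encoding and cropping; the function names here are illustrative, not from any actual system):

```python
import hashlib

# Hypothetical in-memory stand-in for the shared industry database.
known_hashes: set[str] = set()

def fingerprint(data: bytes) -> str:
    """Compute a content fingerprint. SHA-256 is used here for simplicity;
    real systems use perceptual hashing so edited copies still match."""
    return hashlib.sha256(data).hexdigest()

def register_flagged(data: bytes) -> None:
    """Add previously identified extremist material to the shared database."""
    known_hashes.add(fingerprint(data))

def is_known(data: bytes) -> bool:
    """Check an upload against the database before it goes live."""
    return fingerprint(data) in known_hashes

# Usage: one platform registers flagged material, another checks an upload.
register_flagged(b"example flagged video bytes")
print(is_known(b"example flagged video bytes"))  # True
print(is_known(b"different content"))            # False
```

An exact-match hash like this fails the moment a video is re-compressed, which is why the industry effort centres on perceptual fingerprints rather than cryptographic ones.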


In 2017, Facebook said it had removed 99% of Islamic State and al-Qaeda-related content before users flagged it. The social network said 83% of the remaining material was identified and removed within one hour.
Between August 2015 and December 2017, Twitter said, it suspended more than 1.2 million accounts to stem the spread of extremist propaganda. It said 93% of those were flagged by its internal tools, and 74% were suspended before their first tweet.
