
Monday, March 18, 2019

Facebook, YouTube and Twitter caught in a game of whack-a-mole to delete mosque shooting videos

In the hours after a shooting suspect in New Zealand broadcast his rampage at a mosque across social media, internet companies worked quickly to remove versions of the video that continued to pop up on their platforms.

Facebook said Saturday it removed 1.5 million videos of the attack in the first 24 hours after it was originally livestreamed. The company said 1.2 million of those videos "were blocked at upload," which means roughly 300,000 copies were taken down only after they had already been posted. Facebook did not immediately respond to CNBC's inquiry about how many people viewed the videos of the attack before their removal.

Google-owned YouTube, Twitter and Reddit also took steps in the hours after the attack to remove copies of the video that continued to populate their sites. Reddit banned a forum where a video of the attack had been posted, saying it violated its policies by "glorifying or encouraging violence." But hours after the shooting, which took 50 lives and was declared an act of terrorism by New Zealand's prime minister, the videos were still available online as tech companies continued to play whack-a-mole with duplicate versions.

YouTube removed tens of thousands of videos from its platform following the attack and bypassed the human review step in its usual content moderation process in order to take down violent content related to the massacre more quickly, according to a spokesperson. The company also "terminated hundreds of accounts created to promote or glorify the shooter," the spokesperson said in a statement.

"The volume of related videos uploaded to YouTube in the 24 hours after the attack was unprecedented both in scale and speed, at times as fast as a new upload every second," the spokesperson said in the statement. "In response, we took a number of steps, including automatically rejecting any footage of the violence, temporarily suspending the ability to sort or filter searches by upload date, and making sure searches on this event pulled up results from authoritative news sources like The New Zealand Herald or USA Today. Our teams are continuing to work around the clock to prevent violent and graphic content from spreading, we know there is much more work to do."

YouTube has previously taken steps to prioritize news reports during a trending event, rather than videos that could potentially spread misinformation. But some of the copied videos of the New Zealand shooting were altered in ways that YouTube's automated systems couldn't detect, The Washington Post reported. The YouTube spokesperson said the company suspended the ability to sort search results by upload date to make the violent videos harder to find while it worked to remove them, though it's unclear how quickly this step was taken.
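The Post's reporting hints at why altered re-uploads slip past automated matching. An exact file hash changes completely after any re-encode or crop, so platforms typically rely on perceptual fingerprints that tolerate small edits. The article does not describe YouTube's actual system; the sketch below is purely illustrative, assumes Python with the Pillow imaging library installed, and uses hypothetical frame filenames as stand-ins for an original frame and a frame from a re-encoded copy.

```python
# Illustrative sketch only, NOT YouTube's system: why exact hashes miss
# altered copies, while a simple perceptual "average hash" still matches.
# "frame1.png" and "frame2.png" are hypothetical stand-in filenames.
import hashlib
from PIL import Image  # pip install Pillow


def exact_hash(path: str) -> str:
    """SHA-256 of the raw bytes: any re-encode or crop changes every bit."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()


def average_hash(path: str, size: int = 8) -> int:
    """Tiny perceptual hash: shrink to an 8x8 grayscale thumbnail, then set
    one bit per pixel depending on whether it is brighter than the mean.
    Visually similar frames yield hashes with a small Hamming distance."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")


if __name__ == "__main__":
    # A re-encoded copy almost certainly fails the exact-bytes comparison...
    print(exact_hash("frame1.png") == exact_hash("frame2.png"))
    # ...but its perceptual hash stays close, so a small distance
    # threshold can still flag it as a copy of known footage.
    dist = hamming(average_hash("frame1.png"), average_hash("frame2.png"))
    print(dist <= 5)
```

The same property explains the whack-a-mole dynamic the article describes: edits aggressive enough to push the distance past the match threshold, such as mirroring, heavy cropping, watermarking or speed changes, defeat the matcher and force another round of manual takedowns.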

Twitter and Reddit did not immediately respond to requests for comment on the number of videos removed from their platforms after the attack.



