Facebook said Friday it's launching a new AI tool to detect revenge porn before it's reported, sparing victims of unwanted intimate posts the time and ordeal of getting the posts taken down.
It's Facebook's latest attempt to rid the platform of abusive content. The company has recently come under fire for the working conditions of contracted content reviewers who moderate posts on the site, so a successful AI detection tool could be a step in the right direction.
"Finding these images goes beyond detecting nudity on our platforms. By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram," Facebook's Global Head of Safety Antigone Davis said in a blog post.
The tool is trained to recognize a "nearly nude" photo — a lingerie shot, perhaps — coupled with derogatory or shaming text that would suggest someone uploaded the photo to embarrass or seek revenge on someone else. The flagged post is then sent to a human reviewer for confirmation.
In most cases where the flagged post is found to be in violation of the company's community standards, Facebook said, it will disable the associated account.
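Facebook hasn't published technical details of the detector, but the description above suggests a multimodal setup: one model scores the image for near-nudity, another scores the accompanying text for shaming language, and a post is routed to human review only when both signals fire. A minimal sketch of that kind of pipeline in Python, with all names, scores, and thresholds hypothetical:

```python
# Hypothetical sketch of the kind of multimodal flagging pipeline described
# above. Facebook has not published its implementation; the scores,
# thresholds, and names here are illustrative placeholders, not real APIs.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    image_score: float  # output of a hypothetical near-nudity image model
    text_score: float   # output of a hypothetical shaming-language text model


def should_flag_for_review(post: Post,
                           image_threshold: float = 0.8,
                           text_threshold: float = 0.7) -> bool:
    """Flag only when both signals fire: a nearly nude image alone is not
    enough; it must be paired with derogatory or shaming text. Flagged
    posts go to a human reviewer rather than being removed automatically."""
    return (post.image_score >= image_threshold
            and post.text_score >= text_threshold)


if __name__ == "__main__":
    posts = [
        Post("a", image_score=0.95, text_score=0.90),  # both signals -> review
        Post("b", image_score=0.95, text_score=0.05),  # image only -> no flag
        Post("c", image_score=0.10, text_score=0.90),  # text only -> no flag
    ]
    for p in posts:
        verdict = "send to human review" if should_flag_for_review(p) else "no action"
        print(p.post_id, "->", verdict)
```

Gating on both signals mirrors the distinction Davis draws: the goal is not nudity detection per se, but detecting nudity shared with the apparent intent to shame.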
Facebook also said it's expanding a previously announced pilot program that lets Facebook users preemptively upload and flag images that they fear could be posted as revenge porn. Davis said the company has received positive feedback from the program, but called it an "emergency option."
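The article doesn't explain how the preemptive matching works. Systems of this kind typically store a perceptual hash of the flagged image, a compact fingerprint that survives resizing and re-encoding, so the image itself never has to be kept, and then compare new uploads against the stored fingerprints. A rough sketch of that general technique using the open-source Python imagehash library (not Facebook's actual tooling):

```python
# Rough sketch of hash-based preemptive matching, a common technique for
# this kind of program. It is NOT Facebook's published implementation;
# only the open-source imagehash library's real API is used here.

from PIL import Image
import imagehash

# Fingerprints of images a user has preemptively flagged. Only the hash
# needs to be retained; the image can be discarded after hashing.
flagged_hashes: list[imagehash.ImageHash] = []


def register_flagged_image(path: str) -> None:
    """Store a perceptual hash of an image the user fears will be posted."""
    flagged_hashes.append(imagehash.phash(Image.open(path)))


def matches_flagged_image(path: str, max_distance: int = 8) -> bool:
    """Compare a new upload against stored fingerprints. Subtracting two
    hashes gives their Hamming distance, so near-duplicates (resized or
    recompressed copies) still match even though the exact bytes differ."""
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in flagged_hashes)
```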
The company is also launching a support hub for victims of revenge porn, called "Not Without My Consent," developed with experts and victims' organizations.
—The Associated Press contributed to this report.