Facebook, which can’t seem to control its own content, hopes we’ll trust it with our nude pictures. It’s an interesting, if brazen, plan to combat revenge porn.
This highly unusual measure is likely to split opinion. For those not familiar, revenge porn is the publication of explicit material portraying someone who has not consented to the image or video being shared. About 4% of US internet users have been victims of revenge porn, according to a 2016 report from the Data & Society Research Institute. The figure rises to 10% for women under the age of 30, and women account for around 80% of victims.
Facebook, however, believes the solution is for you to upload nude pictures of yourself to its system before anyone else can. The social network has developed an anti-revenge-porn system that uses artificial intelligence to recognize and block specific images, and is testing it in Canada, the US, the UK and Australia.
“The safety and well-being of the Facebook community is our top priority,” said Antigone Davis, Facebook’s Head of Global Safety.
Send Facebook your nude pictures
Facebook users who fear their nude pictures may go public can pre-emptively send the image in question to themselves via Messenger and flag it as a non-consensual file. If somebody later tries to upload that same image, it will carry the same digital fingerprint, or “hash” value, and the upload will be blocked. But that raises the question: what is Facebook doing with the photos?
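The mechanism described above can be sketched in a few lines. Note this is a simplified illustration, not Facebook’s actual system: Facebook uses perceptual photo-matching, whereas the SHA-256 hash used here only catches exact byte-for-byte copies. The function names and the blocklist are hypothetical.

```python
import hashlib

# Hypothetical blocklist of hashes of images flagged as non-consensual.
flagged_hashes = set()

def flag_image(image_bytes: bytes) -> str:
    """Record the hash of an image its owner has flagged, not the image itself."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    flagged_hashes.add(digest)
    return digest

def is_blocked(image_bytes: bytes) -> bool:
    """Check an attempted upload against the blocklist before accepting it."""
    return hashlib.sha256(image_bytes).hexdigest() in flagged_hashes

flag_image(b"original-photo-bytes")
print(is_blocked(b"original-photo-bytes"))   # True: an exact copy is caught
print(is_blocked(b"slightly-edited-bytes"))  # False: any change defeats an exact hash
```

The last line is exactly why real systems can’t rely on cryptographic hashes alone: cropping or re-compressing a photo changes every byte, so the match has to be perceptual rather than exact.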
e-Safety Commissioner Julie Inman Grant explains: “They’re not storing the image, they’re storing the link and using AI and other photo-matching technologies.”
The system, however, will only protect you from revenge porn on Facebook and Instagram. People would still be able to post your nude pictures elsewhere or send them via email.
This builds on Facebook’s existing tools for dealing with revenge porn. In April, it released reporting tools that let users flag nude pictures posted without their consent. Flagged images are sent to “specially trained representatives” who review them and remove any that violate Facebook’s community standards. Once a picture has been removed, photo-matching technology is used to ensure it isn’t uploaded again.
Facebook’s proposed new plan sounds as potentially counterintuitive as stopping a burglary by leaving the front door open. In the end, this is either going to be smart technology, or one that’s sure to end in tears and a class-action lawsuit against Facebook.
Photo matching technology
PhotoDNA’s hash-matching technology was first developed by Microsoft in 2009. Facebook and other tech companies now use this type of photo-matching technology to tackle other types of content, including child sex abuse and extremist imagery. The technology makes it possible to identify known illegal images even if someone has altered them. Facebook, Twitter and Google all use the same hash database to identify and remove illegal images.
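PhotoDNA itself is proprietary, but the principle behind matching altered images can be illustrated with a toy average-hash (aHash): each pixel contributes one bit depending on whether it is brighter than the image’s mean, and near-duplicate images are detected by counting differing bits. Everything here is a simplified sketch, not PhotoDNA’s actual algorithm.

```python
def average_hash(pixels):
    """pixels: 2D list of grayscale values. Returns a bit string with
    1 where a pixel is above the mean brightness, 0 otherwise."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming(a, b):
    """Number of differing bits between two hashes; small = near-duplicate."""
    return sum(x != y for x, y in zip(a, b))

image = [[10, 200], [30, 220]]     # tiny 2x2 "photo"
altered = [[12, 198], [33, 217]]   # the same photo, slightly edited

h1, h2 = average_hash(image), average_hash(altered)
print(hamming(h1, h2))  # 0: small edits leave the perceptual hash intact
```

This robustness to alteration is what lets a single shared hash database catch re-uploads of known images across Facebook, Twitter and Google, even after cropping or re-compression has changed the underlying bytes.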
As it relates to advertising, find out why Facebook banned this ad of a woman shaving her legs. Let us know your thoughts on Facebook’s approach to combating revenge porn. Do you think uploading nude pictures is the best way to prevent them from being shared?