Uploading all your nudes to Facebook isn't such a bad idea

An industry-wide database of image hashes could stop non-consensual pornography, or revenge porn, at source

Facebook users in Australia are being invited to upload nude images of themselves in a bid to stem the flow of non-consensual pornography. It’s a significant move against an issue that many will know all too well.

More catchily, if inappropriately, dubbed ‘revenge porn’, image-based abuse often works like this: you’re in a relationship with someone, send them nude pictures, break up and then find that they’ve been sharing them online. Or perhaps your phone was stolen or hacked. All cases have one thing in common: explicit images are distributed without knowledge or consent.

Networks such as Facebook and Instagram unwittingly play a major role in distributing such images. But maybe that’s about to change. In a limited trial, Facebook has partnered with Australia’s eSafety Commissioner to let people upload nude images of themselves to Messenger so they can be preemptively hashed. Similar trials will soon launch in Canada, the UK and the USA.

It’s not as dumb as it sounds. People who want to submit an image will be asked to send it to themselves on Facebook Messenger. Authorities then notify Facebook of the submission so the network’s community operations team can create a hash, or digital fingerprint, of the image. This hash should, in theory, prevent it from being uploaded or shared on any Facebook-owned platforms. Facebook could also share this hash with other platform owners such as Google and Microsoft. And let’s be clear: this isn’t Facebook asking people for their nudes, it’s a partnership with the authorities to tackle online abuse.
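Facebook hasn’t published the exact fingerprinting technique it uses here (Microsoft’s PhotoDNA, the industry standard for this kind of matching, is proprietary), but the general idea is straightforward. A perceptual hash, unlike a cryptographic one, barely changes when an image is resized or recompressed, so near-identical copies can still be matched. As a rough, unofficial sketch, here is how such a check might look using the open-source Pillow and imagehash Python libraries; the file names and the distance threshold are purely illustrative:

```python
from PIL import Image
import imagehash

def fingerprint(path: str) -> imagehash.ImageHash:
    # Perceptual hash: visually similar images produce similar
    # fingerprints, unlike a cryptographic hash such as SHA-256,
    # where one changed pixel yields a completely different digest.
    return imagehash.phash(Image.open(path))

# Illustrative file names: the submitted image and a re-encoded copy.
original = fingerprint("submitted_image.jpg")
reupload = fingerprint("resized_copy.jpg")

# Subtracting two hashes gives the Hamming distance between them:
# 0 means identical fingerprints, small values mean near-duplicates.
if original - reupload <= 5:  # threshold chosen for illustration only
    print("Likely a copy of a known image: block the upload")
```

The crucial property is that the platform only needs to keep the short fingerprint, not the image itself, to recognise future copies.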


Frances Rideout, deputy director of the Legal Advice Centre at Queen Mary University of London and an expert on the sharing of intimate images online, says the problem has become “endemic” and welcomes a new technological tool to help tackle it. “It’ll be interesting to see how well it works,” she says. Rideout also believes that more and more people are becoming aware of the consequences of sharing such images. “About a year ago I would’ve said yes, it is a growing problem. But I think potentially the message is starting to get out and people are more aware of the legalities.”

She challenges the perception that revenge porn is only an issue for teenagers and twenty-somethings. Older generations sometimes use images as a means of extortion, or share images online without realising the consequences of their actions, she explains. Ultimately though, it’s all about stopping the images at source so they can’t spread widely online.

There are, potentially, other solutions. Last month, two 21-year-old entrepreneurs from UC Berkeley announced an app called Nude. Marketed as a “secure vault for all your naughty photos”, the app uses machine learning to scan your camera roll for nude photos (videos aren’t supported) and hide them away in a PIN-protected vault. If anyone tries to guess your PIN and fails, the app automatically takes a photo of their face with the front-facing camera. All analysis of images is done locally, on-device.


Disturbed by the prospect of sending intimate pictures to Facebook? Don’t be. Apple can already tell if you’ve taken a picture of yourself in your underwear – brassiere is just one of the nearly 4,500 categories hidden in the iPhone’s Photos app, along with ammunition, rocket salads, spatulas and gastropod. Your iPhone can also categorise your photos based on how greedy, disgusted, neutral, scream-ey, smiley, surprised or suspicious the person or people in it look. Google’s artificial intelligence has been rummaging through anything stored in its Photos app for years. In Apple’s case, the analysis is done on-device, limiting its potential. But Google’s trove of data and intelligence lets you search for images based on even the most obscure terms. Want to find a picture you took of your partner eating an ice cream while wearing a hat on a beach in Malaga? It’s only a quick search query away. Facebook is using similar systems to analyse and categorise images.

People are understandably unnerved by this kind of intrusion into their lives, but in the case of Facebook the latest data-grab should be welcomed. The social network won’t retain a copy of the image, only the hash, and the new database should help to stop this kind of abuse at source. The technology to analyse and match images already exists, and it is exploited at scale by people carrying out acts of sexual abuse online. Now, belatedly, those same tools are being turned against the abusers.

Hashing of images and videos has been commonplace for years. YouTube has been operating its Content ID system since 2007. Based on a vast database of known copyrighted audio and video material, Content ID scans all new videos uploaded to the site to check for piracy. Copyright holders can then either block a video or make money from it. In 2015, the UK-based Internet Watch Foundation (IWF) partnered with Microsoft to use the firm’s PhotoDNA technology to create an extensive list of hashes for images and videos of child sexual abuse. That database is shared with major technology firms. In 2016 alone, 122,972 hashes were added to the list, with 60,821 of those depicting the rape or sexual torture of children.
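The mechanics of sharing such a list are simple in principle: each member platform holds a copy of the fingerprints and compares every new upload against them. Continuing the earlier sketch, and assuming a plain in-memory blocklist (real deployments like PhotoDNA use proprietary hashes and indexed databases rather than a linear scan), a check might look like this:

```python
from PIL import Image
import imagehash

# Hypothetical fingerprints, standing in for a shared hash list of
# the kind the IWF distributes to member platforms.
BLOCKLIST = [
    imagehash.hex_to_hash("d1c4f0e3a2b59687"),
    imagehash.hex_to_hash("ffe0c3810f1e3c78"),
]

MAX_DISTANCE = 5  # illustrative tolerance for near-duplicate copies

def should_block(upload_path: str) -> bool:
    """Return True if an upload matches a fingerprint on the list."""
    candidate = imagehash.phash(Image.open(upload_path))
    return any(candidate - known <= MAX_DISTANCE for known in BLOCKLIST)
```

Because only fingerprints change hands, platforms can cooperate on blocking known abusive material without ever exchanging the images themselves.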

The IWF hash list for child sexual abuse presents an interesting model for how major networks might eventually handle revenge porn. The notion of an industry-wide hash list is daunting, but it would go a long way towards stopping abusive images at source. We’re already used to sharing intimate images of ourselves with those we trust, but are we ready to put that trust in Facebook?

This article was originally published by WIRED UK