Facebook is asking users to “send nudes.” Yep, you heard that right.

In an effort to tackle revenge porn, the social media giant is asking its users to send the company nude photos, hoping to give some control back to victims of the abuse.

Individuals who have shared intimate or sexual images with their partners, and who are worried that those partners might distribute them without consent, can use Messenger to send the images to Facebook. The company will then “hash” those images, converting each one into a unique digital fingerprint that can be used to identify and block any attempt to re-upload the same image.
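To illustrate what that “fingerprint” is, here is a minimal sketch in Python using the standard hashlib module. It is an assumption made purely for illustration: a plain cryptographic digest like SHA-256 matches only exact byte-for-byte copies of a file, whereas the technology Facebook actually uses (derived from the Microsoft system mentioned further down) is designed to also catch resized or re-encoded copies. The file name is hypothetical.

```python
import hashlib

def fingerprint(image_path):
    """Reduce an image file to a fixed-length, human-unreadable fingerprint.

    SHA-256 is used purely for illustration; it is not Facebook's algorithm,
    and it only matches exact copies of the same file.
    """
    with open(image_path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

# The fingerprint can be stored and compared without keeping the image itself.
print(fingerprint("reported_photo.jpg"))  # hypothetical file name
```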

Facebook is piloting the technology in Australia in partnership with a government agency, the Office of the eSafety Commissioner, headed by Julie Inman Grant. According to her, the move would allow victims of “image-based abuse” to take action before pictures are posted to Facebook, Instagram or Messenger.

“We see many scenarios where maybe photos or videos were taken consensually at one point, but there was not any sort of consent to send the images or videos more broadly,” she told the ABC.

Even though the company’s intentions are good and the program is meant to help its users, such a request can seriously confuse or frighten people. So Facebook has published a blog post, “The Facts: Non-Consensual Intimate Image Pilot,” to explain the program and clear the air.

Here is what Antigone Davis, Facebook’s Global Head of Safety, says:

“With this new small pilot, we want to test an emergency option for people to provide a photo proactively to Facebook, so it never gets shared in the first place. This program is completely voluntary. It’s a protective measure that can help prevent a much worse scenario where an image is shared more widely. We look forward to getting feedback and learning.”

Here’s how the pilot works, according to Facebook itself:

1. Australians can complete an online form on the eSafety Commissioner’s official website.

2. To establish which image is of concern, people will be asked to send the image to themselves on Messenger.

3. The eSafety Commissioner’s office notifies us of the submission (via their form). However, they do not have access to the actual image.

4. Once we receive this notification, a specially trained representative from our Community Operations team reviews and hashes the image, which creates a human-unreadable, numerical fingerprint of it.

5. We store the photo hash, not the photo, to prevent someone from uploading the photo in the future. If someone tries to upload the image to our platform, it is run, like all photos on Facebook, through a database of these hashes, and if it matches we do not allow it to be posted or shared. (A simplified sketch of this matching step follows the list.)

6. Once we hash the photo, we notify the person who submitted the report via the secure email they provided to the eSafety Commissioner’s office and ask them to delete the photo from the Messenger thread on their device. Once they delete the image from the thread, we will delete the image from our servers.
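To make the “store the hash, not the photo” step concrete, here is a minimal Python sketch of a hypothetical hash database and an upload check against it. The function names and the in-memory set are assumptions for illustration only; Facebook’s real pipeline relies on perceptual hashing that also matches altered copies, not a plain SHA-256 set lookup.

```python
import hashlib

# Hypothetical in-memory hash database; only fingerprints are kept,
# the submitted photos themselves are deleted after hashing.
blocked_hashes = set()

def register_reported_image(image_bytes):
    """Hash a reported image and store only its fingerprint."""
    blocked_hashes.add(hashlib.sha256(image_bytes).hexdigest())

def allow_upload(image_bytes):
    """Check an incoming upload against the hash database; block on a match."""
    return hashlib.sha256(image_bytes).hexdigest() not in blocked_hashes

# Example: once an image is registered, re-uploading the identical file is refused.
reported = b"...raw bytes of the reported photo..."
register_reported_image(reported)
print(allow_upload(reported))        # False -> upload blocked
print(allow_upload(b"other photo"))  # True  -> unrelated photo passes
```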

Roughly 4% of internet users in the United States have been victims of revenge porn, according to a 2016 report. The proportion rises to 10% among women under the age of 30.

The technology used to hash the images was first developed by Microsoft in 2009 to clamp down on images of child sexual abuse being circulated on the internet over and over again.

A Facebook spokeswoman says the company is also exploring additional partners and countries.