The pilot program is designed to prevent intimate images of users from appearing against their will on the social network’s platform.
- Users worried an inappropriate image might appear on Facebook’s platforms are asked to send an intimate image via Messenger, a preventive measure designed to flag the images before they’re shared.
- The limited pilot program is available in three other countries: the U.S., U.K., and Canada.
To prevent revenge porn, Facebook will look at user-submitted nude photos
Pilot program goals are laudable, but is the remedy as bad as the ailment it treats?
DAN GOODIN, Technica
Nov 8, 2017: Facebook is experimenting with a new way to prevent the posting of so-called revenge porn that involves a highly questionable requirement. Potential victims must send nude pictures of themselves through the social network’s official messenger so the images can be viewed, in full, unedited form, by an employee of the social network.
A Facebook spokeswoman said the employee would be a member of the company’s community operations team who has been trained to review such photos. If the employee determines the image violates site policies, it will be digitally fingerprinted to prevent it from being published on Facebook and Facebook-owned Instagram. An article posted by the Australian Broadcasting Corporation reported that the service is still being tested with help from Australian government officials. To use it, potential victims will first complete an online form and then send the images to themselves over Facebook Messenger.
The Facebook spokeswoman said she was unable to confirm details published earlier by The Daily Beast that said Facebook would continue to store blurred versions of the images for an unspecified amount of time after the hash was taken. The spokeswoman agreed to describe the new program on the condition the discussion be kept on background, an arrangement that prevents this post from naming or directly quoting the representative.
The service is designed to block the unauthorized posting of a person’s nude images by former romantic partners or others who have obtained the pictures. Over the past decade, revenge porn has emerged as a major Internet scourge. Earlier this year, a private US Marine Corps Facebook group calling itself Marines United was caught posting nude photos of female service members without permission. The group had almost 30,000 followers.
Extraordinary undertaking, unprecedented trust: Critics began questioning Facebook’s pilot project as soon as it came to light. The first issue it raises is the unprecedented trust it requires in both Facebook technology and employees. The social network is telling people to provide unedited, intimate images that have yet to be published.
Ensuring those images are viewed only by a limited number of trained employees, are eventually destroyed permanently, and are never inadvertently or intentionally leaked is an extraordinary undertaking, particularly when done in the mass numbers Facebook typically deals with. There’s no doubt the problem of revenge porn is real. But in an age when some of the most careful and security-conscious companies get hacked or suffer insider breaches, it’s possible Facebook’s remedy may be as bad as the ailment it seeks to treat.
Other questions revolve around the technology used to uniquely identify images. Traditional cryptographic hashing is easy to defeat: changing an image’s metadata, size, or format completely changes the long string of numbers and letters that serves as a standard digital fingerprint. Facebook Chief Security Officer Alex Stamos seemed to take on these doubts on Tuesday when he said on Twitter that the hashing algorithms Facebook will use work differently and can fingerprint photos and videos in a way that’s resilient to such changes.
“I hate the term ‘hash’ because it implies cryptographic properties that are orthogonal to how these fingerprints work,” he explained.
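The distinction Stamos is drawing can be illustrated with a toy sketch. Facebook’s actual fingerprinting algorithm is not public, so the example below stands in a simple “average hash” perceptual fingerprint (every function and image here is hypothetical, written for illustration): a cryptographic digest like SHA-256 flips entirely when a single pixel value shifts, while the perceptual fingerprint survives the same change.

```python
import hashlib

def average_hash(pixels, hash_size=8):
    """Toy perceptual "average hash": downscale to hash_size x hash_size,
    then set one bit per pixel: 1 if the pixel is brighter than the mean.
    Small edits move few pixels across the mean, so the hash barely changes."""
    h, w = len(pixels), len(pixels[0])
    # Naive nearest-neighbor downscale.
    small = [[pixels[y * h // hash_size][x * w // hash_size]
              for x in range(hash_size)]
             for y in range(hash_size)]
    flat = [p for row in small for p in row]
    mean = sum(flat) / len(flat)
    return sum(1 << i for i, p in enumerate(flat) if p > mean)

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# A toy 16x16 grayscale "image": a bright square on a dark background.
img = [[200 if 4 <= x < 12 and 4 <= y < 12 else 30
        for x in range(16)] for y in range(16)]
# The "same" image lightly altered: every pixel nudged by +1,
# standing in for recompression or metadata-driven re-encoding.
img2 = [[p + 1 for p in row] for row in img]

# Cryptographic hash: the tiny change produces a completely different digest.
sha_a = hashlib.sha256(bytes(p for row in img for p in row)).hexdigest()
sha_b = hashlib.sha256(bytes(p for row in img2 for p in row)).hexdigest()
print(sha_a == sha_b)                                   # False

# Perceptual hash: the fingerprints are identical (Hamming distance 0).
print(hamming(average_hash(img), average_hash(img2)))   # 0
```

Real systems (such as Microsoft’s PhotoDNA, widely used for this kind of matching) are far more sophisticated, but the principle is the same: match on what the image looks like, and compare fingerprints by distance rather than exact equality, so trivial re-encodings cannot evade the block.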