Meta launches tool to stop revenge porn from spreading on Facebook and Instagram

Meta launches tool to stop revenge porn from spreading on Facebook and Instagram – but users concerned about being victimized must submit their images and videos as a case to a hashing database

  • The tool is a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images’
  • People concerned their intimate images or videos have been posted or might be posted to Facebook or Instagram can create a case through the website
  • Users go to the website, which opens the camera roll on their device and lets them select the photos and videos they wish to submit as a case 
  • Meta says the content is turned into a digital fingerprint used to detect the explicit content, and claims no human eyes ever see it 

Meta rolled out a new tool on Thursday that stops revenge porn from spreading on Facebook and Instagram.

When someone is concerned their intimate images or videos have been posted or might be posted to either of the social media platforms, they can create a case through a global website called StopNCII.org, which stands for ‘Stop Non-Consensual Intimate Images.’

To make a case, users press the ‘Select photos/videos’ option on the website, which opens the device’s camera roll and allows users to select media that could be used as possible revenge porn against them. 

Each photo or video receives a digital fingerprint, or unique hash value, which is used to detect and track copies shared or posted without the person’s permission – but a Meta spokesperson told DailyMail.com that the images or videos themselves do not leave the smartphone. 

Meta says it ‘will not have access to or store copies of the original images.’ 

A Meta spokesperson told DailyMail.com in an email that ‘only the person submitting a case to StopNCII.org has access to their images/videos’ and ‘all of the computations necessary to compute an image’s hash happen in the browser,’ which means ‘images never leave the person’s device.’

‘Only cases are submitted to StopNCII.org and hashes of the person’s images or videos are shared with participating tech companies like Meta,’ the spokesperson added. 
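The in-browser hashing workflow the spokesperson describes can be sketched roughly as follows. This is an illustrative example only: it uses a standard cryptographic hash (SHA-256) for simplicity, whereas systems like StopNCII.org use perceptual image hashing, and the function names here are hypothetical.

```python
import hashlib

def compute_fingerprint(image_bytes: bytes) -> str:
    # Hashing happens locally on the user's device; only this
    # fixed-length fingerprint would ever be transmitted.
    return hashlib.sha256(image_bytes).hexdigest()

def build_case(selected_media: list[bytes]) -> list[str]:
    # A "case" is just the list of fingerprints for the selected
    # photos/videos - the original media stays with the user.
    return [compute_fingerprint(data) for data in selected_media]

# Two locally selected files produce two fingerprints to submit.
case = build_case([b"photo-1-bytes", b"video-1-bytes"])
print(len(case), len(case[0]))  # 2 fingerprints, each 64 hex characters
```

One caveat worth noting: a cryptographic hash like the one above only matches exact byte-for-byte copies, which is why real image-matching systems rely on perceptual hashes that also catch resized or lightly edited duplicates.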


Meta rolled out a new tool on Thursday that stops revenge porn from spreading on Facebook and Instagram

‘Only hashes, not the images themselves, are shared with StopNCII.org and participating tech platforms,’ Antigone Davis, global head of safety for Meta, shared in a blog post.

‘This feature prevents further circulation of that NCII content and keeps those images securely in the possession of the owner.’ 

StopNCII.org builds on a pilot program launched in Australia in 2017, which asked the public for photos of themselves to create hashes that could be used to detect similar images on Facebook and Instagram.

Meta originally planned to set up Facebook to let people submit their intimate images or videos to stop them from spreading, but the sensitive media would have been reviewed by human moderators before being converted into unique digital fingerprints, NBC News reports. 

When someone is concerned their intimate images or videos have been posted or might be posted to either of the social media platforms, they can create a case through a global website called StopNCII.org

Knowing this, the social media firm opted to bring in a third party, StopNCII, which specializes in image-based abuse, online safety and women’s rights.

StopNCII.org is for adults over 18 years old who think an intimate image of them may be shared, or has already been shared, without their consent.

The website was developed by UK-based Revenge Porn Helpline, with technical and financial support from Meta, and can be used to stop revenge porn from spreading on other platforms besides Meta’s services.

According to a 2019 report by NBC, Meta identifies nearly 500,000 cases of revenge porn every month.

To deal with the influx of revenge porn, Facebook used human moderators in tandem with an algorithm developed to identify nude images, which helped vet reports and take pictures down.

But with StopNCII.org, human moderators are replaced with hashes that can detect and identify the images – after potential victims submit a case through the website. 
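The matching step described above can be sketched in a few lines. Again, this is a simplified illustration under the same assumptions as before (exact-match SHA-256 rather than the perceptual hashing such platforms actually use, and hypothetical names throughout): an upload is checked against the database of submitted fingerprints, with no reviewer ever seeing the image.

```python
import hashlib

# Hypothetical case database: fingerprints submitted via StopNCII.org.
case_hashes = {hashlib.sha256(b"reported-image").hexdigest()}

def should_block(upload_bytes: bytes) -> bool:
    # An upload is flagged if its fingerprint matches a submitted case;
    # the comparison uses only hashes, never the images themselves.
    return hashlib.sha256(upload_bytes).hexdigest() in case_hashes

print(should_block(b"reported-image"))   # matches a submitted case
print(should_block(b"unrelated-image"))  # no match, allowed through
```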

WHAT CONSTITUTES REVENGE PORN ON FACEBOOK?

Much of what constitutes revenge porn is covered under Facebook’s rules on nudity.

In March 2015, however, the social network brought in specific community guidelines to address the growing problem of revenge porn.

The section, entitled ‘Sexual Violence and Exploitation’, deals specifically with the subject.

The guidelines say: ‘We remove content that threatens or promotes sexual violence or exploitation. 

‘This includes the sexual exploitation of minors and sexual assault.

‘To protect victims and survivors, we also remove photographs or videos depicting incidents of sexual violence and images shared in revenge or without permission from the people in the images.

‘Our definition of sexual exploitation includes solicitation of sexual material, any sexual content involving minors, threats to share intimate images and offers of sexual services. 

‘Where appropriate, we refer this content to law enforcement.’
