March 18, 2019:
Called "Not Without My Consent," the resource hub will help victims find organisations and resources to support them, including steps they can take to remove the content from Facebook and prevent it from being shared further, Antigone Davis, Facebook's Global Head of Safety, said in a statement.
"We're also going to make it easier and more intuitive for victims to report when their intimate images were shared on Facebook," Davis said.
Facebook said it will also build a victim support toolkit to give people around the world more information along with locally and culturally relevant support.
"By using machine learning and artificial intelligence, we can now proactively detect near nude images or videos that are shared without permission on Facebook and Instagram," Davis said.
"This means we can find this content before anyone reports it, which is important for two reasons: often victims are afraid of retribution so they are reluctant to report the content themselves or are unaware the content has been shared," she added.
A specially trained member of Facebook's Community Operations team will review the content found by its technology.
"If the image or video violates our Community Standards, we will remove it, and in most cases we will also disable an account for sharing intimate content without permission," Davis added.