X (formerly Twitter) takes swift action to remove deepfake nude images when they're reported as copyright violations, but not when they're reported under its "nonconsensual nudity" policy, a study has found.
The paper, published by researchers at the University of Michigan and Florida International University, is an audit of X's reporting systems and hasn't yet been peer-reviewed, 404 Media reported. Researchers created five AI "personas" of young white women (to prevent further variables of race, gender, and age) and then made 10 replica images of each, resulting in 50 images. In terms of the ethics around generating deepfake porn themselves, researchers said these images underwent a "rigorous verification process" to ensure they didn't represent an existing individual.
They posted these images to X on 10 "poster accounts" they created, and then created five X accounts to report the images. Twenty-five images were reported as Digital Millennium Copyright Act (DMCA) violations, and the other 25 were reported as nonconsensual nudity.
Researchers then waited three weeks to see the results of these reports. All 25 images reported for copyright were removed from X within 25 hours. In contrast, none of the images reported for nonconsensual nudity were removed within the three-week waiting period.
"Our findings reveal a significant disparity in the effectiveness of content removal processes between reports made under the DMCA and those made under X's internal nonconsensual nudity policy," the study states. "This highlights the need for stronger and directed regulations and protocols to protect victim-survivors."
X owner Elon Musk dissolved the platform's trust and safety council in 2022, but the site has recently opened up two dozen safety and cybersecurity positions in the U.S. Mashable has reached out to X for comment.
Earlier this year, WIRED found that victims of nonconsensual deepfake porn leveraged copyright laws to take down deepfakes on Google.
If you have had intimate images shared without your consent, call the Cyber Civil Rights Initiative's 24/7 hotline at 844-878-2274 for free, confidential support. The CCRI website also includes helpful information as well as a list of international resources.