Digital Event Horizon
GitHub’s Deepfake Porn Crackdown: The Unyielding Challenge of Moderating Open-Source Material
In a recent investigation, WIRED found more than a dozen GitHub projects linked to deepfake porn videos evading detection on the platform. Despite GitHub's efforts to curb this content, much of it persists online.
GitHub is struggling to enforce its rules on deepfake porn, with much of the offending content continuing to evade detection. A deepfake video featuring Charli D'Amelio's face was uploaded despite the platform's new policies, and the code used to create the deepfake model remains accessible on GitHub even after being disabled for violating its rules. GitHub's moderation efforts are hindered by the nature of open-source material, which is difficult to remove once it has been made publicly available. Repositories with near-identical branding, or clearly described as "NSFW" or "unlocked" versions of deepfake models, continue to evade detection. The proliferation of deepfake content poses significant risks, including intimate image abuse and misinformation.
GitHub, the world's largest developer platform, has been grappling with a contentious issue: enforcing its rules on deepfake porn content. Despite new policies and measures introduced to curb such material, much of it continues to evade detection.
In late November, a deepfake porn maker claiming to be based in the US uploaded a sexually explicit video to the world’s largest site for pornographic deepfakes, featuring TikTok influencer Charli D’Amelio’s face superimposed onto a porn performer’s body. Despite the influencer presumably playing no role in the video’s production, it was viewed more than 8,200 times and captured the attention of other deepfake fans.
The creator of this video, “DeepWorld23,” has claimed that the program used to create the deepfake model was hosted on GitHub, a platform known for its open-source projects. The program had reportedly been "starred" by 46,300 users before being disabled in August 2024, after the platform introduced rules banning projects designed to synthetically create nonconsensual sexual images.
However, the code has persisted in various repositories on GitHub, with more than a dozen programs linked to deepfake porn videos evading detection. This is a stark reminder of the challenges platforms such as GitHub face when moderating open-source material online.
Henry Ajder, an AI adviser to tech companies like Meta and Adobe, notes that removing content from the platform can be a difficult task. “It’s not easy to always remove something the moment it comes online,” he says. “At the same time, there were red flags that were pretty clear.” He also points out that once a model has been made open source and is publicly available for download, there is no way to roll it back.
GitHub has taken some steps to address this issue. In June 2024, it implemented a new policy banning projects that are "designed for, encourage, promote, support, or suggest in any way the use of synthetic or manipulated media for the creation of nonconsensual intimate imagery or any content that would constitute misinformation or disinformation."
However, some repositories have popped up elsewhere on the platform, including those with near-identical branding or those clearly described as "NSFW," "unlocked," or "bypass" versions. This has raised concerns about the effectiveness of GitHub's moderation efforts.
One project identified by a news outlet in December 2024 had branding almost identical to that of a major project described as the “leading software for creating deepfakes.” That major project's repository had been disabled by GitHub for several months last year for violating its terms of service, yet an archived version of it is still available on the platform.
At least six other repositories based on the model were present on GitHub as of January 10, including another branded almost identically to the first one. The existence of these repositories highlights the ongoing challenge faced by platforms like GitHub in policing open-source material online.
The proliferation of deepfake content continues to pose significant risks, particularly with regard to intimate image abuse and misinformation. The concern is not only psychological harm but also potential knock-on effects, such as the intimidation and manipulation of women, minorities, and politicians.
In response, at least 30 US states have implemented legislation addressing deepfake porn, including outright bans. The UK government has also announced plans to criminalize both the creation and the sharing of sexually explicit deepfakes.
While there are steps being taken to address this problem, it is clear that much work remains to be done. Platforms like GitHub must continue to adapt and evolve their moderation strategies in response to emerging threats.
In conclusion, GitHub's struggle to curb deepfake porn underscores the broader challenges posed by open-source material online. While efforts to address the problem are underway, much of this illicit material continues to evade detection.
Related Information:
https://www.wired.com/story/githubs-deepfake-porn-crackdown-still-isnt-working/
Published: Thu Jan 16 06:29:00 2025 by llama3.2 3B Q4_K_M