AI image generator produces 500,000 NSFW pics daily, CEO says, but AI porn is a complex industry.
Unstable Diffusion: The Controversial AI Image Generator Creating Waves
Unstable Diffusion, the NSFW AI image generator that was kicked off Kickstarter in December, is making a significant impact in the world of artificial intelligence. Its CEO and co-founder, Arman Chaudhry, says the platform now generates more than 500,000 images every day, evidence that it has built a substantial following.
Unstable Diffusion operates similarly to other AI image generators, using text inputs to create images. However, what sets it apart is its minimal content restrictions, allowing it to generate NSFW (Not Safe for Work) images, including pornographic content. This characteristic has attracted a specific audience seeking to explore AI art without limitations.
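Unstable Diffusion has not published its serving code, but the core mechanic it shares with other text-to-image tools is straightforward: a text prompt goes in, an image comes out. A minimal sketch using the open-source diffusers library (the model ID and prompt here are illustrative, not Unstable Diffusion's actual stack) looks like this:

```python
import torch
from diffusers import StableDiffusionPipeline

# Load publicly released Stable Diffusion weights from the Hugging Face
# Hub. The checkpoint name is illustrative; any compatible checkpoint
# loads the same way.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)
pipe = pipe.to("cuda")

# One text prompt in, one image out.
prompt = "a watercolor painting of a lighthouse at dusk"
image = pipe(prompt).images[0]
image.save("lighthouse.png")
```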
The foundation of Unstable Diffusion stems from Stable Diffusion, a popular text-to-image generator developed by Stability AI. Unstable Diffusion leverages the publicly available code of Stable Diffusion, enabling any user to view and modify the code to create unique versions of the original model. Arman Chaudhry himself capitalized on this opportunity, launching Unstable Diffusion in August 2022, the same month Stable Diffusion was made public.
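Because those weights and that code are open, "modifying the model" can be as simple as loading the public checkpoint and swapping out a component. The sketch below is not Unstable Diffusion's actual code, just an illustration of the pattern; forks typically go much further, fine-tuning the model on their own image datasets.

```python
from diffusers import StableDiffusionPipeline, EulerAncestralDiscreteScheduler

# Anyone can load the original public model...
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# ...and swap components to create a derivative. Replacing the sampling
# scheduler is a trivial example of such a modification; retraining the
# underlying network on custom data is the heavier-weight version.
pipe.scheduler = EulerAncestralDiscreteScheduler.from_config(pipe.scheduler.config)
```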
Initially, Unstable Diffusion gained traction as a subreddit dedicated to sharing AI-generated porn. As the community grew, it migrated to Discord, where members began building tools, eventually creating a bot that would evolve into an early version of the Unstable Diffusion image generator. Chaudhry explains the motivation behind creating this community: “We founded the Discord group as a refuge for artists who wanted to create AI art without limitations, including those related to adult content.”
Despite being removed from Kickstarter after just 12 days, Unstable Diffusion managed to raise the necessary funds through alternative channels. Chaudhry revealed that they secured $30,000, which allowed them to launch a standalone web app. The platform offers a basic free service, as well as three premium options costing between $14.99 and $59.99 per month. These premium services offer users the ability to generate more images simultaneously, produce images faster, and use images for commercial purposes.
The platform's growth comes with serious risks, however: the proliferation of AI-generated porn opens the door to misuse, including deepfake pornography and depictions of minors in sexual acts. In February, the popular Twitch streamer Atrioc was caught viewing deepfake porn featuring female streamers, sparking a broader conversation about how traumatic and abusive non-consensual deepfake porn is for its victims.
Chaudhry maintains that Unstable Diffusion employs an “aggressive filter and moderation system” to prevent deepfakes and other undesirable content. However, TechCrunch demonstrated that it is not foolproof, as they were able to create images resembling Donald Trump and Chris Hemsworth using the generator. Chaudhry acknowledged that this occurred during a period when the deepfake content generation filter had temporarily malfunctioned for some users.
Perfect content moderation, especially for a product with minimal restrictions, poses significant challenges. Dan Hendrycks, an AI safety expert and director of the Center for AI Safety, notes that any system that depends on human vigilance to catch failures will itself be unreliable; in his view, flawless content moderation is practically impossible for any system capable of producing inflammatory material.
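To see why, consider the simplest possible safeguard: a keyword blocklist applied to prompts. Unstable Diffusion has not disclosed how its filter actually works, so the sketch below is purely illustrative (the blocked terms are hypothetical), but it shows how easily literal matching is sidestepped by paraphrase:

```python
# Hypothetical blocklist, not Unstable Diffusion's real one.
BLOCKED_TERMS = {"donald trump", "chris hemsworth"}

def prompt_allowed(prompt: str) -> bool:
    """Reject any prompt that contains a blocked term verbatim."""
    normalized = prompt.lower()
    return not any(term in normalized for term in BLOCKED_TERMS)

print(prompt_allowed("portrait of donald trump"))           # False: caught
print(prompt_allowed("portrait of the 45th US president"))  # True: same target, no blocked term
```

Production systems layer keyword checks with learned image classifiers, but each layer fails in its own way, and a determined user only needs one gap.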
Even the extensively moderated AI chatbots built by tech giants like Google and OpenAI are susceptible to having their guardrails bypassed, and Unstable Diffusion's far looser moderation raises concerns about potential legal exposure. Several jurisdictions have moved to address the dangers of deepfake pornography. In 2019, California enacted legislation allowing residents to sue people who use deepfake technology to insert them into pornographic material without consent, and a bill introduced in Congress in May would criminalize the sharing or dissemination of non-consensual, AI-generated pornography.
While Unstable Diffusion continues to drive innovation in AI image generation, its existence raises important ethical and legal questions. As the technology advances, it becomes imperative to strike a balance between artistic freedom and responsible use to protect individuals from the potential ramifications of AI-generated content.
Overall, Unstable Diffusion remains a remarkable example of the power and limitations of AI image generation technology, offering a glimpse into the potential future of artistic expression and the challenges society must tackle head-on.