Creator kills app that uses AI to fake naked images of women
An app developer who created an algorithm that can digitally undress women in photos has pulled the plug on the software after high traffic and a viral backlash convinced him that "the world is not ready" for it yet.
DeepNude used artificial intelligence to create the "deepfake" images, presenting realistic approximations of what a woman (the app was not designed to work on men) might look like without her clothes.
The free version of DeepNude placed a large watermark on images it generated. The $50 version, however, applied only a small stamp reading "FAKE" in the upper-left corner of the pictures. As the online magazine Motherboard noted, the stamp could easily be cropped out.
When Motherboard first reported on the app Thursday, its creator, a programmer who goes by "Alberto," insisted he was "not a voyeur" but merely a technology enthusiast who was driven to create the app by "fun and enthusiasm."
Soon after Motherboard's report, traffic caused the server to crash. On Thursday evening, after further coverage and outrage on social media, Alberto announced DeepNude's end on Twitter, saying the chances of people abusing the app were too high.
\"We don\'t want to make money this way, \" the tweet read. \"The world is not yet ready for DeepNude.\"
Though pornographic deepfake images don't technically count as revenge porn, since they aren't actually images of the real women's bodies, they're still capable of causing psychological damage. California is considering a bill that would criminalize pornographic deepfakes, which would make it the only state to date to take legislative action against them.