2019-08-03
Creator kills app that uses AI to fake naked images of women

An app developer who created an algorithm that can digitally undress women in photos has pulled the plug on the software after high traffic and a viral backlash convinced him that "the world is not ready" for it, "yet."

DeepNude used artificial intelligence to create the "deepfake" images, presenting realistic approximations of what a woman — it was not designed to work on men — might look like without her clothes.

The free version of DeepNude placed a large watermark on the images it generated. The $50 version, however, added only a small stamp reading "FAKE" in the upper-left corner of the pictures. As the online magazine Motherboard noted, it could easily be cropped out.

When Motherboard first reported on the app Thursday, its creator, a programmer who goes by "Alberto," insisted he was "not a voyeur" but merely a technology enthusiast who was driven to create the app by "fun and enthusiasm."

Soon after Motherboard's report, traffic caused the server to crash. On Thursday evening, after further coverage and outrage on social media, Alberto announced DeepNude's end on Twitter, saying the chances of people abusing the app were too high.

"We don't want to make money this way," the tweet read. "The world is not yet ready for DeepNude."

Though pornographic deepfake images don't technically count as revenge porn, since they aren't images of the women's actual bodies, they're still capable of causing psychological damage. California is considering a bill that would criminalize pornographic deepfakes, which would make it the only state to date to take legislative action against them.