The world is not yet ready for DeepNude
An app developer who created an algorithm that can digitally undress women in photos has pulled the plug on the software after high traffic and a viral backlash convinced him that the world is not ready for it.
DeepNude used artificial intelligence to create the “deepfake” images, presenting realistic approximations of what a woman — it was not designed to work on men — might look like without her clothes.
Deepfake technologies enable the creation of highly realistic, difficult-to-debunk fake audio and video. Soon it will be easy to depict a person doing or saying something they never did or said, and hard to debunk such digital impersonations before they cause significant damage.
Though much has been made of the technology’s threat to national security, it has also been harnessed to make a torrent of fake porn, including widely circulated videos of celebrities such as Gal Gadot and Scarlett Johansson. Although sites including Reddit, Twitter and Pornhub have tried to ban pornographic deepfakes, they have had limited success. The technology is cheap and easily accessible, and the opportunities for misuse are limitless.
The free version of DeepNude placed a large watermark on the images it generated. The $50 version, however, applied only a small “FAKE” stamp in the upper-left corner of each picture, which, as the online magazine Motherboard noted, could easily be cropped out.
The app, which was available for Windows and Linux, was based on an open-source algorithm developed at the University of California at Berkeley, its creator, who goes by the name Alberto, told Motherboard. DeepNude was taught to create convincing nudes by training on 10,000 nude images.
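Motherboard’s reporting identified the Berkeley algorithm as pix2pix, a conditional GAN for paired image-to-image translation published by UC Berkeley researchers in 2017. For readers curious how that class of model works, here is a minimal PyTorch sketch of the general pix2pix recipe: a generator trained against a patch discriminator, plus an L1 pixel loss. It is an illustration of the published technique on dummy data, not DeepNude’s code; every architectural detail below is a simplified assumption.

```python
# Illustrative sketch only: a pix2pix-style conditional GAN for paired
# image-to-image translation. This is NOT DeepNude's actual code; the
# architecture and hyperparameters are simplified assumptions.
import torch
import torch.nn as nn

class Generator(nn.Module):
    """Tiny encoder-decoder that maps a source image to a translated image."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)

class PatchDiscriminator(nn.Module):
    """Scores (source, candidate) pairs patch-by-patch as real or fake."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(6, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=1, padding=1),  # one logit per patch
        )

    def forward(self, src, img):
        return self.net(torch.cat([src, img], dim=1))

G, D = Generator(), PatchDiscriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
adv_loss, l1_loss = nn.BCEWithLogitsLoss(), nn.L1Loss()

# Dummy batch standing in for paired training photos (source -> target).
src = torch.randn(4, 3, 64, 64)
tgt = torch.randn(4, 3, 64, 64)

# Discriminator step: real pairs should score 1, generated pairs 0.
fake = G(src).detach()
real_logits, fake_logits = D(src, tgt), D(src, fake)
d_loss = (adv_loss(real_logits, torch.ones_like(real_logits)) +
          adv_loss(fake_logits, torch.zeros_like(fake_logits)))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: fool the discriminator, plus an L1 pixel loss pulling the
# output toward the paired target (the pix2pix paper weights this by 100).
fake = G(src)
fake_logits = D(src, fake)
g_loss = adv_loss(fake_logits, torch.ones_like(fake_logits)) + 100 * l1_loss(fake, tgt)
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

The L1 term is what makes paired translation work: the adversarial loss alone produces plausible textures, while the pixel loss keeps the generated output aligned with the input photo.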
DeepNude’s creator said he mulled the ethics of his software but ultimately decided the same results could be accomplished through any number of photo-editing programs.
Soon after Motherboard’s report, a surge of traffic crashed DeepNude’s servers. Late Thursday, after further coverage and outrage on social media, Alberto took to Twitter to announce DeepNude’s end, saying the chances of people abusing the app were too high.
“We don’t want to make money this way,” the tweet read. “Surely some copies of DeepNude will be shared on the web, but we don’t want to be the ones who sell it.”
“The world is not yet ready for DeepNude,” it said.
Pornographic deepfake images don’t technically count as revenge porn because they aren’t actual images of real women’s bodies, but they can still cause psychological damage. California is considering a bill that would make pornographic deepfakes illegal; it would be the first state to do so.