The creators of the app “DeepNude” have shut it down, citing the high probability of its misuse. The application allowed users to virtually “undress” women using algorithms powered by Artificial Intelligence (AI). It received a lot of flak on social media over its potential for abuse. The creators said the software was launched several months ago for “entertainment” purposes and that they “greatly underestimated” the demand for it.
"We never thought it would be viral and (that) we would not be able to control the traffic. Despite the safety measures adopted (watermarks), if 500,000 people use it, the probability that people will misuse it is too high. We don't want to make money this way,” the DeepNude creators, who listed their location as Estonia, said on Twitter. The app used Deepfake technology, which combines and superimposes existing images and videos onto source images or videos using a machine learning technique.
— deepnudeapp (@deepnudeapp) 27 June 2019

“Surely some copies of DeepNude will be shared on the web, but we don't want to be the ones who sell it. Downloading the software from other sources or sharing it by any other means would be against the terms of our website. From now on, DeepNude will not release other versions and does not grant anyone its use, not even the licenses to activate the Premium version,” the creators said.
Several media reports have noted how the app could be used to take a photo of a clothed woman and transform it into a nude image. The Cyber Civil Rights Initiative (CCRI), which campaigns against “revenge” porn, tweeted, “This is a horrifically destructive invention and we hope to see you soon suffer consequences for your actions.” CCRI President Mary Anne Franks later tweeted, “It's good that it's been shut down, but this reasoning makes no sense. The app's INTENDED USE was to indulge the predatory and grotesque sexual fantasies of pathetic men.”