The Scariest Thing About DeepNude Wasn’t the Software
<p>At the end of June, <a href="https://www.vice.com/en_us/article/kzm59x/deepnude-app-creates-fake-nudes-of-any-woman" rel="noopener ugc nofollow" target="_blank"><em>Motherboard</em> reported on a new app called DeepNude</a>, which promised — “with a single click” — to transform a clothed photo of any woman into a convincing nude image using machine learning. In the weeks since that report, the app has been <a href="https://twitter.com/deepnudeapp/status/1144307316231200768" rel="noopener ugc nofollow" target="_blank">pulled by its creator</a> and <a href="https://www.vice.com/en_us/article/8xzjpk/github-removed-open-source-versions-of-deepnude-app-deepfakes" rel="noopener ugc nofollow" target="_blank">removed from GitHub</a>, though open source copies have surfaced there in recent days.</p>
<p>Most of the coverage of DeepNude has focused on the specific dangers posed by its technical advances. “DeepNude is an evolution of that technology that is easier to use and faster to create than deepfakes,” wrote Samantha Cole in <em>Motherboard</em>’s initial report on the app. “DeepNude also dispenses with the idea that this technology can be used for anything other than claiming ownership over women’s bodies.” With its promise of single-click undressing of any woman, the app made it easier than ever to manufacture fake nudes — and, by extension, to use those images to harass, extort, and publicly shame women everywhere.</p>