
On Wednesday afternoon, I clicked on a picture of a woman on a website called. Suddenly, her outfit disappeared, and naked breasts were on my computer screen. I felt like I had just peeped through a stranger's window, utterly violating her privacy.

A day later, that website had disappeared; its creator apparently had a crisis of conscience. If you type in the URL now, you'll see a blank, white page and the words "not found." But before it disappeared, it offered visitors like myself free previews of a horrific AI-enhanced world where photos of women - any woman, really - could be undressed via algorithms and shared with reckless abandon.

The website, initially reported on by Samantha Cole at the Vice site Motherboard on Wednesday, had begun selling a $50 Windows and Linux application just a few days earlier that could take a photo of a clothed woman and, using artificial intelligence, replace it with a fairly realistic-looking naked image of her. Like the woman I saw, the resulting nudes weren't real. There was a free version, too, that placed a big watermark on the resulting images (the paid version of the app, according to Vice, instead had a "FAKE" stamp in one corner).

[Photo caption: Adam Mosseri speaks during the Samsung Electronics Co. Unpacked launch event in San Francisco, California.]

Instagram head says company is evaluating how to handle deepfakes

DeepNude was meant to work on women, specifically. The images it created in the online samples I saw didn't look perfect, but they were good enough to make a casual observer gasp. (Vice reported that it would insert a vulva, in place of pants, in a photo of a man.) It is the latest example of how it's getting increasingly easy to use technology to shame and demean women, in particular, online.
