
Fake prudes: Catholic uni AI bot taught to daub bikinis on naked chicks

Send nudes plz... for the purposes of training this machine-learning software

NSFW Artificially intelligent software is increasingly used to automatically detect and ban nude images on social networks and similar sites. However, today's algorithms and models aren't perfect at clocking racy snaps, so a lot of content moderation still falls to humans.

Enter an alternative solution: use AI to magically draw bikinis on photos to, er, cover up a woman’s naughty bits. A group of researchers from the Pontifical Catholic University of Rio Grande do Sul, Brazil, have trained generative adversarial networks to perform this very act, and automatically censor nudity.

In a paper for the IEEE International Joint Conference on Neural Networks (IJCNN) in Rio de Janeiro earlier this month, the eggheads presented some of their results. And it looks as though their AI code is quite hit or miss. Sometimes the computer-drawn bikinis don’t look realistic enough, and in some examples they are lopsided, too skimpy, or not well positioned to cover up enough skin.

Examples of bikinis generated by different models. The quality of the second row of images is okay, but the third row not so much. Image credit: More et al (NOT SAFE FOR WORK)

At the moment, the project is more of a proof of concept. Rodrigo Barros, coauthor of the paper, told The Register on Thursday: “The approach described in our paper is an interesting and novel way of performing censoring on images. We believe that our approach for censoring nudity, in its current state, can't be applied practically simply because it isn't robust enough yet.”

The researchers pulled together their dataset by downloading photos from “image packs via torrent files,” according to Barros. More than 2,000 images were collected in all: 1,965 were used for training, and the remaining 220 for testing the model.
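For the curious, a holdout split like that takes only a few lines of code. The sketch below is our own illustration, not the team's script; the directory path, file extension, and fixed seed are all assumptions.

```python
# A minimal sketch of a train/test holdout split, sized to match the
# proportions reported above (1,965 training images, 220 held out).
# The directory path and seed are hypothetical.
import random
from pathlib import Path

def split_dataset(image_dir: str, n_test: int = 220, seed: int = 0):
    """Shuffle every image in image_dir and hold out n_test for testing."""
    images = sorted(Path(image_dir).glob("*.jpg"))
    random.Random(seed).shuffle(images)
    return images[n_test:], images[:n_test]  # (train, test)

train_imgs, test_imgs = split_dataset("dataset/images")
print(f"{len(train_imgs)} training images, {len(test_imgs)} test images")
```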

Pictures of swimwear-clad women were fed into the system to teach it what a bikini looks like. To work out where the beachwear should be pasted, the software also had to be exposed to nude pictures. Armed with these photo sets, the software boils down to an image-to-image translator.

Here, wear this

“When we train the network, it attempts to learn how to map data from one domain – nude pictures – to another domain – swimsuit pictures. The distributions of the images from both domains are quite different,” Barros explained.
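The paper's exact architecture isn't spelled out here, but mapping between two unpaired domains in both directions, as Barros describes, is the hallmark of CycleGAN-style translators: networks trained so that translating an image into the other domain and back again recovers the original. Here's a toy PyTorch sketch of that cycle-consistency idea, with tiny placeholder networks standing in for the real generators:

```python
# Sketch of the cycle-consistency idea behind unpaired image-to-image
# translation (CycleGAN-style). The tiny conv nets are placeholders,
# not the architecture used in the paper.
import torch
import torch.nn as nn

def toy_generator() -> nn.Module:
    """A stand-in translator: three-channel image in, three-channel image out."""
    return nn.Sequential(
        nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
    )

G = toy_generator()  # maps domain X (nude pictures) to domain Y (swimsuit)
F = toy_generator()  # maps domain Y back to domain X

x = torch.rand(1, 3, 128, 128)                  # a batch from domain X
cycle_loss = nn.functional.l1_loss(F(G(x)), x)  # F(G(x)) should recover x
cycle_loss.backward()  # gradients flow through both generators at once
```

Only G produces the censored output the researchers care about; F exists so the cycle loss can be computed at all, which is why, as Barros notes further down, the reverse mapping can be thrown away once training is done.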


Because the system is just interested in recreating bikinis and pasting them over other pictures, it doesn't matter that the skin tones, body shapes, and faces differ wildly between the snaps in the clothed and naked domains. The generated swimwear isn't always perfect, however, since the system also takes some surrounding pixels from the training data into account, which ends up as noise on the final rendering.

“One thing that we noticed in particular is that several of our swimsuit images are photo-shoot pictures, shot on a white background, while the background of the nude pictures is often quite complex,” said Barros. “The network, therefore, implicitly learns to associate ‘background types’ for each domain. Thus, when translating, the background is often disturbed significantly.”

Essentially, when the model is given a naked picture of a woman, the generator network should automatically stick a bikini on her. It’s trained against a separate discriminator network, which tries to work out whether the images spat out by the generator are real or not. As the two networks battle it out during training, the generator improves to the point where it begins to trick the discriminator successfully.
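Condensed into code, one round of that tug-of-war might look like the toy PyTorch step below. The networks, image batches, and hyperparameters are all stand-ins for illustration, not the paper's models.

```python
# One adversarial training step: the discriminator learns to separate
# real swimsuit images from the generator's fakes, then the generator
# updates to fool it. All networks and data here are toy placeholders.
import torch
import torch.nn as nn

gen = nn.Sequential(nn.Conv2d(3, 3, 3, padding=1), nn.Tanh())
disc = nn.Sequential(
    nn.Conv2d(3, 8, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(8, 1),
)
bce = nn.BCEWithLogitsLoss()
opt_g = torch.optim.Adam(gen.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)

nude = torch.rand(4, 3, 64, 64)      # stand-in batch from the nude domain
swimsuit = torch.rand(4, 3, 64, 64)  # stand-in batch of real bikini shots

# Step 1: train the discriminator. detach() stops this step from
# updating the generator's weights.
opt_d.zero_grad()
d_loss = (bce(disc(swimsuit), torch.ones(4, 1)) +
          bce(disc(gen(nude).detach()), torch.zeros(4, 1)))
d_loss.backward()
opt_d.step()

# Step 2: train the generator to make the discriminator label its
# output as real, the "battle" that sharpens it over time.
opt_g.zero_grad()
g_loss = bce(disc(gen(nude)), torch.ones(4, 1))
g_loss.backward()
opt_g.step()
```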

So far, so good. But be warned, this technology could be tweaked to work in reverse. Developers could, in theory, use the same code to erase the bikinis and generate nipples for the breasts.

“Once training is done, we can safely discard the model that maps from domain Y (swimsuit pictures) to domain X (nude pictures) and keep only the model that we are interested in,” Barros said. “We would like to emphasize that removing clothing was never our intended goal, it is merely a side-effect of the method, which is discarded anyway.”

Let's hope this sort of thing doesn't fall into the wrong hands – remember Deepfakes?

The team hopes to continue improving its system to make the internet a safer space for children, and to implement it as a browser application in the future. ®
