Designers built an AI penis detector to protest Google’s prudish doodles

Cast your mind back to 2016 and you might recall Quick, Draw! — an AI experiment from Google that guessed what users were doodling. It was basically AI Pictionary, with Google later releasing the millions of sketches it collected as an open-source dataset.

But there was one doodle that Google’s AI never recognized and that never appeared in its data: the humble penis.

It sounds childish, but it’s sort of a big omission. The penis is perhaps the most significant and durable doodle of all time. It’s a sigil that’s been scrawled on surfaces for thousands of years — everywhere from Roman walls to medieval manuscripts — and variously signifies good luck, virility, or just “I’m a man and I was here.”

To rectify Google’s mistake, the Mozilla Foundation commissioned Dutch design studio Moniker to build an AI penis doodle detector. It’s a bit of silly fun, but Moniker and Mozilla say they’re also making a serious point: in an age where US tech giants control so much of what we see online, should we be worried about the moral standards they get to set?

You can test out the penis detector here. When you doodle a penis, it’ll say “we assume this was a mistake” and erase it, warning users: “Don’t take individual expression too far!” Draw enough of them and it will go on a mad tirade, doodling itself into a frenzy.

Moniker’s Roel Wouters tells The Verge that the inability of Google’s AI to recognize a penis doodle is certainly trivial in the grand scheme of things, but it’s still a potent symbol of tech giants’ power. He cites Facebook and Instagram’s ban on the nipple as a more serious instance of American prudishness being imposed on the world.

“The point is that we think our moral compasses should not be in the hands of big tech,” says Wouters. “We question the fact that we gave the responsibility for our social infrastructure away in exchange for ‘free’ usage without even realizing. Don’t you think it’s a bit weird that Instagram’s ‘community guidelines’ for sharing images are imposed on all the world’s citizens and all cultures?”

Wouters says he personally loves Google’s Quick, Draw! project, and even used the company’s AI software TensorFlow to build his alternative. But he points out that as companies use more artificial intelligence to moderate online platforms, the potential for mistaken censorship — even self-censorship — increases. Knowing that an AI might spot our profane thoughts or feelings, we might never express them in the first place.

The project isn’t really about “freedom of speech,” says Wouters, but it’s a reminder “about the unwanted powers of big tech and their governmental paternalistic tendencies.” He adds: “To us doodling a penis is a light-hearted symbol for a rebellious act.”

That’s been true for thousands of years — whether corporate AI recognizes it or not.