+ Both Hands Full

From Slide 06b · ~10 min

Name what you see.

We've said the word “bias” so many times it's lost meaning. Algorithmic bias. Generic. Safe. Abstract. Stop saying bias. Name what you're seeing.

Call it what it is

Three case studies

What “bias” is hiding.

The marketing professor prompt

The image generator has a gender bias.

When asked for a marketing professor, the system gave a man in a Harvard blazer with authority and gravitas. When asked for a female marketing professor, it gave a timid school teacher. That's misogyny embedded in image generation.

Bias is a math word. Misogyny names the actual content of the discrimination. Names point at people. Math points away from them.

Joy Buolamwini · Gender Shades

Facial recognition has a “performance gap.”

Facial recognition systems misclassify darker-skinned women at error rates up to 34.7%, versus 0.8% for lighter-skinned men. That's a 40× gap. That's racism and sexism embedded in code.

“Performance gap” sounds like a benchmark issue. The 40× ratio sounds like what it is: discrimination, automated.
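The 40× figure is nothing more exotic than the ratio of the two reported error rates, which is worth seeing plainly:

```python
# Error rates cited above from the Gender Shades audit:
# darker-skinned women vs. lighter-skinned men.
darker_skinned_women = 0.347   # 34.7% misclassification rate
lighter_skinned_men = 0.008    # 0.8% misclassification rate

ratio = darker_skinned_women / lighter_skinned_men
print(f"{ratio:.1f}x")  # prints "43.4x" — commonly rounded down to "a 40x gap"
```

The exact quotient is about 43×; “40×” is the conservative round number.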

1,800 photos · the consent story

AI was trained on “public data.”

I released thousands of Creative Commons photos on Flickr over two decades. When I checked a dataset search tool, I found ~1,800 of my images inside a major training corpus. Nobody asked.

“Public” implies consent. A CC license isn't the same as “free for any commercial purpose.” Saying “public data” launders the missing ask.
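The license distinction can be sketched in code. The image IDs and license strings below are made up for illustration; real dataset metadata would come from an actual corpus export, not this list:

```python
# CC licenses whose terms forbid commercial use (the NC clause).
NONCOMMERCIAL = {"CC BY-NC", "CC BY-NC-SA", "CC BY-NC-ND"}

# Hypothetical metadata for images found inside a training corpus.
found_images = [
    {"id": "flickr_001", "license": "CC BY"},
    {"id": "flickr_002", "license": "CC BY-NC"},
    {"id": "flickr_003", "license": "CC BY-NC-ND"},
]

# "Public" or not, these entries were never licensed for commercial training.
flagged = [img["id"] for img in found_images if img["license"] in NONCOMMERCIAL]
print(flagged)  # prints ['flickr_002', 'flickr_003']
```

Two of three “public” images in this toy sample carry terms a commercial training run would violate; that gap between visible and licensed is exactly what “public data” papers over.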

Now you

Pick one system. Rewrite the criticism.