Fallen Apples

Idiom: The apple does not fall far from the tree

Artificial Intelligence (AI) isn’t going away. It also hasn’t conquered us. Yet. But it has held up a looking glass to our own flaws, prejudices, and biases.

In 2016, Microsoft’s chatbot Tay spent less than a day chatting with people online. The dewy-eyed innocent was learning, trying to be like them. Like us. The plug was pulled when it became apparent that Tay was spouting all manner of racist, hate-filled blather. Garbage in, garbage out.

“AI is in its infancy, and like children, how it grows reflects how we raise and nurture it.”

[Cartoon: how our children can grow up like us]

We are flawed. And we can pass those same flaws on to our children, real or artificial. Especially when we aren’t aware of them.

Arvind Narayanan co-authored a 2017 paper analyzing the meanings an AI assigned to words it learned from us. We associate male names with words like “executive” and female names with words like “marriage”. The AIs in the study did the same.
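Here is a rough sketch of the kind of measurement that paper made: comparing how close word vectors sit to different attribute words. This is only an illustration, not the paper’s own code. The name and attribute lists below are stand-ins I picked, and it assumes the gensim library with its downloadable GloVe vectors.

```python
# A rough sketch of the paper's idea: measure how strongly sets of names
# associate with attribute words, via cosine similarity of word vectors.
# Assumes gensim; the name/attribute lists here are illustrative stand-ins.
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-100")  # pretrained GloVe vectors

male_names = ["john", "paul", "mike", "kevin"]
female_names = ["amy", "donna", "lisa", "sarah"]

def mean_similarity(names, attribute):
    """Average cosine similarity between a set of names and one attribute word."""
    return sum(model.similarity(name, attribute) for name in names) / len(names)

for attribute in ["executive", "marriage"]:
    print(attribute,
          "male:", round(mean_similarity(male_names, attribute), 3),
          "female:", round(mean_similarity(female_names, attribute), 3))
```

The vectors weren’t told anything about gender. They simply absorbed the associations baked into the text we wrote.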

It is important to understand that this is bigger than just one person or just one bias. Our individual biases are all part of the system, part of the great cloud of data and metadata that we have created, we are creating, we will create. It isn’t just in the code.

[Image: “That’s Not My Name”, lyrics from a song by the musical group the Ting Tings]

Even if the programmer checks their biases at the door and writes clean, unprejudiced code, the very data sets the AI samples from can be skewed. News-scanning software asked to complete the statement “Man is to computer programmer as woman is to X” replied, “homemaker.” Translation software replaced the gender-neutral “they are” with the gendered “He is a doctor” and “She is a nurse.”
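That “homemaker” headline comes from a simple vector-arithmetic probe you can try yourself. A minimal sketch, assuming gensim and its downloadable word2vec Google News vectors (the token `computer_programmer` is how that model stores the phrase):

```python
# A minimal sketch of the analogy probe behind that headline.
# Vector arithmetic: computer_programmer - man + woman = ?
import gensim.downloader as api

model = api.load("word2vec-google-news-300")

results = model.most_similar(positive=["woman", "computer_programmer"],
                             negative=["man"], topn=5)
for word, score in results:
    print(f"{word}: {score:.3f}")
```

The model isn’t editorializing. It is faithfully reporting the geometry of the text it was trained on.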

So far we have only examined incidental errors caused by bias. Even the best AI can be intentionally misled, seeing dogs as skiers or cats as guacamole. Funny in the lab. Not so funny when a self-driving vehicle is deceived.
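Those cat-as-guacamole tricks are “adversarial examples.” One common recipe is the fast gradient sign method (FGSM): nudge every pixel slightly in whichever direction most increases the model’s error. A minimal sketch, assuming PyTorch and torchvision; `image` and `true_label` are placeholders you would supply:

```python
# A minimal sketch of the fast gradient sign method (FGSM), one common
# way adversarial images are crafted. Assumes PyTorch/torchvision;
# `image` (1x3x224x224, pixels in [0,1]) and `true_label` are placeholders.
import torch
import torchvision.models as models

model = models.resnet18(weights="IMAGENET1K_V1").eval()
loss_fn = torch.nn.CrossEntropyLoss()

def fgsm_attack(image, true_label, epsilon=0.01):
    """Nudge each pixel slightly in the direction that increases the loss."""
    image = image.clone().requires_grad_(True)
    loss = loss_fn(model(image), true_label)
    loss.backward()
    # A perturbation this small is invisible to us, yet it can flip the
    # model's answer from "cat" to something else entirely.
    adversarial = image + epsilon * image.grad.sign()
    return adversarial.clamp(0, 1).detach()
```

The unsettling part: the change is imperceptible to a human eye, yet decisive to the machine.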

[Image: computer representation of the space around a vehicle]

If you are at all intrigued, alarmed, or just plain curious, you should watch Kate Crawford’s “The Trouble with Bias” keynote from the 2017 Conference on Neural Information Processing Systems (NIPS). I can’t say it any better than she does.