As we get closer and closer to creating artificial intelligence that can think and reason in ways that mimic a human brain, it’s becoming increasingly clear that allowing a machine mind to learn from humans is a very bad idea. We’ve seen examples of this in the past, but a new study on AI bias reveals that training an artificial brain not only creates biases, but that those leanings reinforce many of the societal issues around race and gender that plague humanity today.
The study, conducted by scientists at Princeton University and published in the journal Science, sought to determine not just whether an AI’s behavior exhibits specific biases, but whether the machine learning systems that produce that behavior inherently lean one way or the other. To do this, the team trained an AI using standard datasets that are popular choices for machine learning. Sets of this kind include millions of words and are often gathered from many sources, including the internet.
The AI studies the words themselves, how they’re used, and which words they appear alongside, in order to produce natural language responses and answers that we can understand. As it turns out, it also picks up some of our more unfortunate quirks.
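To make that mechanism concrete, here is a minimal sketch of how a system can learn word associations purely from the text it reads, using gensim’s Word2Vec on a hypothetical toy corpus. This is a stand-in to illustrate the general technique, not the study’s exact setup, which worked with far larger training sets.

```python
# Minimal sketch: a model absorbs word associations from raw text.
# The toy corpus below is invented for illustration; real training sets
# contain millions of words gathered from sources like the internet.
from gensim.models import Word2Vec

corpus = [
    ["the", "flower", "smelled", "pleasant", "and", "sweet"],
    ["a", "pleasant", "garden", "full", "of", "flowers"],
    ["the", "weapon", "looked", "dangerous", "and", "unpleasant"],
]

# Words that appear in similar contexts end up with similar vectors,
# which is exactly how biases present in the text get absorbed.
# workers=1 plus a fixed seed keeps the tiny example reproducible.
model = Word2Vec(
    sentences=corpus, vector_size=50, window=3,
    min_count=1, seed=0, workers=1,
)

# The model can now report how strongly it associates two words.
print(model.wv.similarity("flower", "pleasant"))
print(model.wv.similarity("weapon", "pleasant"))
```

On a corpus this small the numbers are noise, but the principle scales: feed the model human-written text, and the associations humans make come along for the ride.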
After training the AI, the scientists tested how it associates various words with one another. For example, “flower” is more likely to be associated with “pleasant” than “weapon” is. That, of course, makes perfect sense. However, the trained AI also had a habit of associating typically Caucasian-sounding names with words it considered “pleasant” more readily than it did African-American-sounding names. The AI also shied away from pairing female pronouns with mathematical terms, instead often associating them with artistic ones.
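An association test of this kind essentially compares distances between word vectors. The sketch below shows the single-pair comparison at its core, using hand-picked three-dimensional vectors chosen purely so that “flower” lands closer to “pleasant” than “weapon” does; in practice the vectors come from a trained model, and the study aggregated such comparisons over whole sets of target and attribute words.

```python
# Illustrative association test: which target word sits closer to the
# attribute word "pleasant"? Vectors are made up for demonstration.
import numpy as np

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two word vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical 3-d embeddings, hand-crafted so the expected
# "flower is pleasant" association holds.
vectors = {
    "flower":   np.array([0.9, 0.1, 0.0]),
    "weapon":   np.array([0.0, 0.2, 0.9]),
    "pleasant": np.array([0.8, 0.2, 0.1]),
}

print("flower-pleasant:", round(cosine(vectors["flower"], vectors["pleasant"]), 3))
print("weapon-pleasant:", round(cosine(vectors["weapon"], vectors["pleasant"]), 3))
```

Swap in names or pronouns as the target words and “pleasant” or “math” as the attributes, and the same arithmetic surfaces the racial and gender leanings the researchers measured.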
This is obviously a huge issue: in the name of creating AI that sounds and behaves more human, some of the standard training materials being used carry with them some of the worst parts of us. It’s a very interesting problem, and one you can bet will get a lot of attention now that the evidence seems to be mounting.