Is prejudiced AI inevitable?
We are prejudiced, even when we try not to be. AI learns from us.
AI becomes prejudiced.
Amazon recently binned its sexist recruitment AI. It didn’t like women much, and showed a distinct preference for male developers. Where on Earth could it have picked up that attitude?
From the past experience it was given to learn from, of course: the tens of thousands of CVs of successful Amazon candidates, most of them men. And this is the crux of the problem. We want our AI to be aspirational, leading us in thought and morality, better than we are. We want AI that lacks our flaws, our prejudices, our cultural and institutional biases. We want AI to be an improvement on ourselves, not a mirror.
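Here’s a toy sketch of the mechanism, with completely invented numbers (nothing like Amazon’s actual system): train a model on historical hires that skewed male, and it learns to prefer men all by itself. Nobody told it to.

```python
# A minimal, made-up illustration of a hiring model inheriting bias
# from its training data. All numbers are invented for this sketch.
from sklearn.linear_model import LogisticRegression
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Two features per candidate: gender (1 = male, 0 = female) and a genuine skill score.
gender = rng.integers(0, 2, n)
skill = rng.normal(0, 1, n)

# Historical hiring label: past recruiters rewarded skill, but with a thumb
# on the scale for men. The model only ever sees who was hired, never why.
hired = (skill + 1.5 * gender + rng.normal(0, 1, n)) > 1.5

model = LogisticRegression().fit(np.column_stack([gender, skill]), hired)

# Two candidates identical in skill, differing only in gender:
male, female = model.predict_proba([[1, 1.0], [0, 1.0]])[:, 1]
print(f"male: {male:.2f}, female: {female:.2f}")  # the man scores higher
```

The model never sets out to discriminate. It just faithfully compresses the past, thumb on the scale and all.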
Unfortunately, this ain’t gonna happen. AI learns from us, just like the justice bots. They have an itty bitty prejudice against defendants of colour: they predict they’ll reoffend. They don’t predict white defendants will reoffend nearly so often. Even when they do.
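The same toy-data trick shows what that looks like in the numbers. Everything below is made up for illustration, not drawn from any real risk-scoring system: both groups reoffend at exactly the same rate, but a biased score wrongly flags one group far more often.

```python
# A made-up illustration of disparate false positive rates. Both groups
# behave identically; only the (biased) score treats them differently.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

group = rng.integers(0, 2, n)      # 0 and 1: two hypothetical groups
reoffended = rng.random(n) < 0.3   # identical true reoffence rate in both

# A biased risk score: it adds risk for group 1 regardless of behaviour.
score = 1.0 * reoffended + 0.6 * group + rng.normal(0, 0.5, n)
flagged = score > 1.0

# Among people who did NOT reoffend, how often were they flagged as high risk?
for g in (0, 1):
    innocent = (group == g) & ~reoffended
    print(f"group {g}: false positive rate = {flagged[innocent].mean():.0%}")
# Same behaviour, very different odds of being wrongly branded a future criminal.
```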
AIs are like our children. They watch what we do, listen to what we say, and then shape themselves in the hope of pleasing us. Awww.
But we don’t want them to grow up to be like us, to make our mistakes. We want them to be pure, and perfect, and fair, and loving.
Can’t we get a nanny?