The slightly prejudiced algorithm will see you now.

By bird_lovegod | 22 January 2019, 3:13pm | Business News, News and Views

First published by Bird Lovegod in the Yorkshire Post newspaper.

There are few business activities left that companies aren’t developing software to automate. It’s a bit of a closed loop sometimes: the tech giants develop more tech to streamline and automate their own processes, and that mindset and approach leaks out into mainstream society. Whatever the issue or challenge, the first and only thought is ‘how can we write a programme to fix this?’

The assumption is that everything can be, and should be, automated. This mindset is integral to their businesses. It’s all digitised, and at any point where actual people get involved there’s going to be what the industry terms ‘friction’. The worst kind of friction is where people have to do something manually, like responding in person to a user or getting involved in a decision in any way. But sometimes, just sometimes, it’s better left to human beings.

Amazon tried to solve its recruitment burden by developing software that could pick out the top 5% of candidates from the CVs it was presented with. The AI system, a self-teaching, machine-learning programme, ‘read’ tens of thousands of CVs of successful candidates and used that information to create its own rules for identifying the most suitable prospects. Starting in 2014 and running until 2017, the system downgraded CVs it identified as being from women and developed a bias towards men. The AI was sexist, and Amazon scrapped it as a bad job. The problem was that the AI was using human decisions as the model for automating the process. The gender bias already existed in the human system: the successful candidates were usually men. So when those CVs were used as the material for the AI to absorb and build its own rules from, it incorporated that bias and found its own ways of expressing it. Clever, but wrong. And it’s a frequently recurring feature.
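To see how that happens in miniature, here’s a minimal sketch in Python, using scikit-learn and a handful of invented toy CVs. It isn’t Amazon’s system or data, just the same mechanism: train a screener on biased historical decisions and it learns a gendered proxy all by itself.

```python
# A toy CV screener trained on biased historical decisions.
# Invented data for illustration only (not Amazon's system).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

cvs = [
    "captain of the men's chess club, python developer",
    "men's football team, software engineer",
    "men's rowing club, data analyst",
    "captain of the women's chess club, python developer",
    "women's football team, software engineer",
    "women's rowing club, data analyst",
]
hired = [1, 1, 1, 0, 0, 0]  # the biased human outcomes the model learns from

vec = CountVectorizer()
X = vec.fit_transform(cvs)
model = LogisticRegression().fit(X, hired)

# The skill words appear in both classes, so their weights are near zero;
# 'women' ends up with the most negative weight purely because of the labels.
weights = dict(zip(vec.get_feature_names_out(), model.coef_[0]))
print(min(weights, key=weights.get))  # -> 'women'
```

Nobody programmed the rule ‘downgrade women’; the model inferred it from the examples, which is exactly what made the bias hard to spot.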

In the US they have a partially automated judicial system. Algorithms are used to predict the likelihood of reoffending for people convicted of crimes, and these algorithms have a distinct bias against people of colour. Have a look at the ‘Machine Bias’ report on ProPublica.org for a long read on the subject. And it’s relevant to us here in the UK, because decisions regarding all manner of activity are becoming digitised. Decisions relating to child services are being made by software systems here in the UK, in Hackney and other London councils. They actually have computer software to ‘predict’ which children are vulnerable or at risk of abuse.
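ProPublica’s test was, at heart, an audit of error rates by group: among people who did not go on to reoffend, black defendants were flagged as high risk far more often than white defendants. Here’s a toy version of that kind of audit, with invented records rather than real court data:

```python
# Toy audit of a risk score: invented records of
# (group, flagged_high_risk, actually_reoffended).
records = [
    ("A", True,  False), ("A", True,  False), ("A", False, False), ("A", True, True),
    ("B", False, False), ("B", False, False), ("B", True,  False), ("B", True, True),
]

def false_positive_rate(group):
    # Of the people in this group who did NOT reoffend,
    # what fraction were wrongly flagged as high risk?
    flags = [flagged for g, flagged, reoffended in records
             if g == group and not reoffended]
    return sum(flags) / len(flags)

for group in ("A", "B"):
    print(group, round(false_positive_rate(group), 2))  # A: 0.67, B: 0.33
```

A score can look ‘accurate’ overall and still raise far more false alarms for one group, which is the shape of the disparity the report documents.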

Some people might suggest this strategy has nothing to do with actually safeguarding children and everything to do with cost-effectiveness. If the software is required to find 300 families that year, it will. If next year the target is set at 500, the threshold will be adjusted and it will identify more (see the sketch below). And of course, no one will be able to challenge the results in any meaningful way, because it’s a case of ‘computer says so’. It’s definitely a concern that our own, sometimes unconscious, prejudices and biases can find their way into computer software that institutionalises those problems even further. There’s often talk of institutions being sexist or prejudiced, be it in women’s wages, lower than men’s, or, the next transparency issue, the wages of different races. These issues arise from deeply held cultural biases that we, as individuals, would probably claim not to hold. Very few people would actually state that women should be paid less than men, and yet in large organisations, on average, they are.
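Here’s a hypothetical sketch of that quota effect, with invented risk scores and a made-up helper function. Nothing about the families changes; only the target does, and the threshold quietly follows.

```python
import random

# Invented risk scores for 10,000 families (illustration only).
random.seed(0)
scores = [random.random() for _ in range(10_000)]

def threshold_for_quota(risk_scores, quota):
    """Return the score cut-off that flags exactly `quota` cases."""
    return sorted(risk_scores, reverse=True)[quota - 1]

for quota in (300, 500):
    cut = threshold_for_quota(scores, quota)
    flagged = sum(s >= cut for s in scores)
    print(f"quota={quota}: threshold={cut:.3f}, flagged={flagged}")
```

Raise the target and the cut-off drops to meet it; ‘computer says so’ turns out to mean ‘the target says so’.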

The good news is that we, as a nation, are actually very focused on improving ourselves as a culture and society. We want to be better. We want to be more inclusive, more accepting, more transparent, and fairer.

The problem we really need to avoid is developing computer systems based on how we are now, rather than how we want to be in the future.

Our computer systems should drive us forward, into a better society, not loop us into repeating the mistakes of the present. That means the AI needs information not just on who we are, but on who we hope to become. How to give it that, I do not know.
