Trevor Paglen is an artist and one of the pair behind an app that exposed racist and sexist labels in a colossal database used to train AI. He has warned that these same terms could be present in systems developed by big technology companies.
The flaws could have spread to companies including Google, Microsoft, Facebook and Huawei if they used it as a “seed” database, he has claimed.
Paglen says we can assume similar things are happening in the databases of companies such as Google and Facebook, but we cannot see it happening.
He also says these systems are often trade secrets, which is a huge problem for the field of machine learning in general, especially in applications that touch people’s everyday lives.
Paglen has called for “a lot more transparency” from companies about how machine learning systems are being used and how they are classifying people, in order to prevent those systems from making biased decisions.
Mr Paglen’s app, created with the AI researcher Kate Crawford and called ImageNet Roulette, revealed that pictures of black and ethnic minority people generated race labels such as “negroid” or “black person”, while results for Caucasian faces varied more widely, including “researcher”, “scientist” or “singer”.