[kictanet] Values, biases automated
S.M. Muraya
murigi.muraya at gmail.com
Fri Apr 14 13:04:30 EAT 2017
While we Africans argue, play politics, loot, and fail to fund our own innovation,
other cultures document and automate their values and biases.
https://www.theguardian.com/technology/2017/apr/13/ai-programs-exhibit-racist-and-sexist-biases-research-reveals
And the AI system was more likely to associate European American names with
pleasant words such as “gift” or “happy”, while African American names were
more commonly associated with unpleasant words.
The findings suggest that algorithms have acquired the same biases that
lead people (in the UK and US, at least) to match pleasant words and white
faces in implicit association tests.
These biases can have a profound impact on human behaviour. One previous
study showed that an identical CV is 50% more likely to result in an
interview invitation if the candidate’s name is European American than if
it is African American. The latest results suggest that algorithms, unless
explicitly programmed to address this, will be riddled with the same social
prejudices.
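The association test the article describes can be sketched in a few lines: measure how close a name's word vector sits to a set of "pleasant" words versus a set of "unpleasant" words. This is a toy illustration only; the vectors below are made up for demonstration (a real test, like the one in the research, would use embeddings trained on large text corpora, e.g. word2vec or GloVe), and the names are illustrative.

```python
# Toy sketch of an embedding association test (WEAT-style).
# The 3-d "embeddings" here are invented for illustration; real tests
# load vectors from a trained model instead.
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def mean_assoc(word, attributes, vecs):
    """Mean cosine similarity of `word` to a set of attribute words."""
    return sum(cosine(vecs[word], vecs[a]) for a in attributes) / len(attributes)

# Hypothetical vectors: one name placed near the pleasant words,
# one near the unpleasant words, to mimic the reported effect.
vecs = {
    "gift":    (0.9, 0.1, 0.0),
    "happy":   (0.8, 0.2, 0.1),
    "abuse":   (-0.7, 0.1, 0.2),
    "filth":   (-0.8, 0.0, 0.1),
    "Emily":   (0.6, 0.3, 0.1),
    "Lakisha": (-0.5, 0.2, 0.2),
}
pleasant = ["gift", "happy"]
unpleasant = ["abuse", "filth"]

for name in ("Emily", "Lakisha"):
    # Positive score: closer to pleasant words; negative: closer to unpleasant.
    score = mean_assoc(name, pleasant, vecs) - mean_assoc(name, unpleasant, vecs)
    print(f"{name}: association score {score:+.2f}")
```

With these invented vectors, the first name scores positive and the second negative, mirroring the pattern the researchers found in real embeddings: the bias is read straight out of geometric distances the model learned from human-written text.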