The term “machine learning” is becoming increasingly popular, but what is less often discussed is that machines don’t learn per se. They are programmed by human beings to process more sophisticated information in more complex ways, which enables the machine to do more without ongoing human intervention. But software is language, remember. And language tends to lend itself to bias, whether intentional or unintentional. Here’s a fantastic explanation of what bias looks like when realized through a machine.