Racist algorithms: how Big Data makes bias seem objective

The Ford Foundation’s Michael Brennan discusses the many studies showing how algorithms can magnify bias, such as the prevalence of police background-check ads shown against searches for black-sounding names.

What’s worse is the way that machine learning magnifies these problems: if an employer only hires young applicants, a machine learning algorithm trained on those decisions will learn to screen out all older applicants without anyone having to tell it to do so. Worst of all, the use of algorithms to accomplish this discrimination lends a veneer of objective respectability to racism, sexism and other forms of discrimination.

I recently attended a meeting about some preliminary research on “predictive policing,” which uses these machine learning algorithms to allocate police resources to likely crime hotspots. The researchers at the Human Rights…
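A minimal sketch of the hiring example above, not taken from the article: it assumes scikit-learn, and the data and feature names are entirely invented. The point is only that a model trained on biased historical decisions reproduces the bias without ever being told the rule.

```python
# Toy illustration: a model trained on an employer's age-biased hiring
# decisions learns to screen out older applicants on its own.
# (Invented data; scikit-learn assumed.)
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Synthetic applicant pool: age and years of relevant experience.
ages = rng.integers(22, 65, size=1000)
experience = rng.integers(0, 30, size=1000)
X = np.column_stack([ages, experience])

# Historical "hired" labels produced by a biased process: qualified
# applicants (more than five years of experience) were hired only if
# they were under 40. The age rule is never written down anywhere --
# it exists only in the labels.
hired = (experience > 5) & (ages < 40)

# The model is told nothing about age limits, yet it recovers them
# from the historical decisions.
model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, hired)

# Two equally experienced applicants, one 55 and one 30: the model
# screens out the older one, because that is what the past data did.
print(model.predict([[55, 25], [30, 25]]))  # -> [False  True]
```

The same dynamic is what makes the “objective” veneer so misleading: the discriminatory rule is never stated, it is simply inferred from past behaviour and then applied at scale.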

