Can gender bias be coded out of the algorithms that power software like Siri?

Machine learning algorithms, such as those that power Google’s search or Apple’s Siri, learn by extracting information from large amounts of text, images, or other data. Unfortunately, research has shown that these algorithms not only learn to understand language but also learn to replicate human biases, including implicit biases that humans aren’t even aware they hold. In July, researchers from Boston University and Microsoft Research published the results of an attempt to forcibly remove bias from a language model created by a machine learning algorithm. The new technique, which they call “debiasing,” promises to eliminate linguistic bias without altering the meaning of words. It works by identifying gender-stereotypical analogies learned by the algorithm, such as “man is to computer programmer as woman is to homemaker,” and then…
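To give a rough sense of how this kind of debiasing can work, the sketch below (a simplified illustration with made-up toy vectors, not the researchers’ actual method or data) shows one common idea: estimate a “gender direction” from a definitional word pair like “he”/“she,” then project that direction out of words that should be gender-neutral, such as “programmer.”

```python
import numpy as np

# Toy 4-dimensional word vectors (illustrative values, not real embeddings).
vecs = {
    "he":         np.array([ 1.0,  0.2,  0.3,  0.1]),
    "she":        np.array([-1.0,  0.2,  0.3,  0.1]),
    "programmer": np.array([ 0.4,  0.8,  0.1,  0.5]),
}

# Estimate a "gender direction" from a definitional pair:
# the normalized difference between "he" and "she".
g = vecs["he"] - vecs["she"]
g = g / np.linalg.norm(g)

def neutralize(v, direction):
    """Remove the component of v along the bias direction,
    leaving the rest of the vector unchanged."""
    return v - np.dot(v, direction) * direction

debiased = neutralize(vecs["programmer"], g)

# After neutralization, "programmer" has zero component along
# the gender direction, so it no longer leans toward "he" or "she".
print(np.dot(debiased, g))
```

In this toy example, the projection removes only the gender-aligned component of the vector, which is why such techniques can claim to leave the rest of a word’s meaning intact.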


Link to Full Article: Can gender bias be coded out of the algorithms that power software like Siri?