Surveillance, recruitment, housing … Artificial intelligence now governs many aspects of our daily lives. Directed by Shalini Kantayya, the documentary “Coded Bias: Algorithms and Discrimination” highlights its many biases.
When Joy Buolamwini, a researcher at the Massachusetts Institute of Technology Media Lab, discovered that the facial recognition software used in the university's lab did not recognize her face, she initially thought it was a mistake. In fact, she had put her finger on a major flaw in the kind of tools used by Microsoft or IBM: they are unable to recognize the faces of non-white women.
Since the 1950s, white men have shaped AI in their own image and embedded their prejudices in the technology. So it is not surprising that, half a century later, the algorithms behind facial recognition software in particular are riddled with racist, sexist, and transphobic biases. In the documentary Coded Bias: Algorithms and Discrimination, director Shalini Kantayya follows Joy Buolamwini's brilliant battle for a legal framework for AI and for a better understanding of its flaws.
Toward algorithmic ethics?
Coded Bias does not stay locked inside lines of code: it points to the tangible results of these processes of algorithmic discrimination. In the UK, Silkie Carlo, director of the Big Brother Watch association, works to educate citizens about the flaws of the facial recognition software used by law enforcement agencies. It is known, in particular, that black men are more likely to be victims of misidentification. Joy Buolamwini also discusses an AI recruitment tool used by Amazon that ended up excluding all female profiles … She also explains that transgender people are regularly misgendered by facial recognition software.