Smile, you’re on covert camera. No, this is not a reference to an episode of Black Mirror. Director Shalini Kantayya brings us Coded Bias, a documentary that rightly feeds fears about a subject many have blithely ignored: the increasing control technology exerts over the world, in every aspect of life. Worse, that technology is pervasively anchored in bias, expanding disparities in wealth, education, health, safety, and much more. The film is an eye-opening examination of just how little the public knows about how and when they are being watched, categorized, pigeonholed, and discarded, often because of their gender, race, or background, and more often than not erroneously.

Much of the film centers on ‘poet of code’ Joy Buolamwini, an MIT researcher, digital activist, and woman of color who, while working on a project, discovered that facial recognition worked inconsistently for women, and even less reliably for women of color. She explains some of the ways AI is structured to diminish women of color, and the need to be aware and astute about how data is gathered. She founded the Algorithmic Justice League to fight this impending control, which in the United States is advancing through the corporate/capitalist environment. Says Buolamwini, in reference to anyone but white men: “Expect to be discredited. Expect to be dismissed.” This is no surprise, given that it is white men who head the tech giants spearheading the algorithms that determine who gets tenure, who gets the job, who gets a mortgage, and who gets accepted into the most prestigious colleges. Apple co-founder Steve Wozniak famously exposed the fact that a credit card algorithm gave him ten times the credit limit it gave his wife, though the couple holds no separate accounts. There is absolutely no transparency about how these algorithms are created or how they choose who wins and who loses, who is seen as a threat and who is seen as safe.

Coded Bias also shows that in England facial recognition is increasingly used by the government, yet it often misidentifies supposedly dangerous individuals. China has built a ‘social credit score’ system on facial recognition, and whenever citizens negatively reference the Communist Party, their score is affected, as are the scores of their family and friends.

For the entire review, go HERE.