Sundance review: 'Coded Bias' reveals the unfairness in the algorithms that run everything
‘Coded Bias’
★★★1/2
Playing in the U.S. Documentary competition of the 2020 Sundance Film Festival. Running time: 90 minutes.
Screens again: Monday, Jan. 27, 3 p.m., Redstone 7 (Park City); Thursday, Jan. 30, 9 p.m., PC Library (Park City); Friday, Jan. 31, 11:59 p.m., Tower (Salt Lake City); Saturday, Feb. 1, noon, Park Avenue (Park City).
——
The old computer-programming mantra of “garbage in, garbage out” takes on new and scary immediacy in “Coded Bias,” an urgent examination of the flaws embedded in the algorithms that rule our lives.
It starts with Joy Buolamwini, an MIT student working on a project that involved facial-recognition software. The problem she discovered: The software wouldn’t recognize her face because of her dark skin — but when she put on a white mask, the software could read it just fine.
Thus began an odyssey in which Buolamwini studied how artificial intelligence systems, and the algorithms they use, were built from databases that began with the people who programmed them. Those people were usually white men, so the resulting algorithms read white male faces more readily than those of women or people of color.
Director Shalini Kantayya follows Buolamwini as she meets other experts on the unseen and often unacknowledged bias within algorithms, who describe the problems it causes. From facial-recognition systems being used by police in London to Microsoft’s A.I. Twitter creation Tay turning racist and misogynist within 16 hours, the biases are everywhere — from China, where the government monitors everyone, to the United States, where the job has been farmed out to commercial interests like Facebook, Google and Amazon.
The movie is chock-full of slick graphics to illustrate the issues. But the draw is Buolamwini’s charming personality, so powerful that when she testified before Congress, she got liberal icon Alexandria Ocasio-Cortez and right-wing bomb-thrower Jim Jordan to agree on something. Here’s hoping more people hear the message of “Coded Bias,” and get angry enough to take action.