
ARTIFICIAL INTELLIGENCE

February 1, 2020

THE INFILTRATION OF PREJUDICE IN AI: WHAT ‘CODED BIAS’ MEANS FOR AMERICA TODAY

UTAH TEENS TALK FILM: Sundance 2020

Our digital footprint has become an invaluable product, marketed and sold to corporations. So where does it all stem from? Joy Buolamwini, the founder of the Algorithmic Justice League and the main subject of the film, explains that bias is written into the very code of these systems. The accuracy rate of Amazon’s facial recognition software in correctly identifying women of color tops out at just 68.7%, compared to a virtually 100% accuracy rate for the same software identifying white males (Buolamwini, 2019). This is the same flawed technology knowingly being sold to the CIA, the FBI, and police forces to identify and track wanted individuals. From mortgages to job applications to the prison system, algorithms affect almost every aspect of our lives in modern America. While the pool of consumers AI affects is large and diverse, the group of people programming the systems themselves is largely white, cisgender men. Although they may not program these algorithms with malicious intent, their personal biases and inclinations tend to seep into the data and intelligence of the system itself, hence the danger of having such a homogeneous group shape big tech.

Director Shalini Kantayya poses two parallel problems in big tech: the inaccuracy of AI and facial recognition in identifying minorities, particularly women and people of color, and the misuses and biases that could follow even when the technology does work accurately. Increased police harassment, loss of personal privacy and discretion, and deeply targeted political and social advertisements, job openings, and other opportunities only scratch the surface of what could come next if AI remains unchecked. So what does it say about our consumerist culture that we are willing to accept these risks, often at the expense of women, people of color, and other minorities, for the sake of convenience?

Kantayya tosses the viewer into a rabbit hole of dystopias: China’s social credit system, Hong Kong’s protests against government use of AI, police surveillance in England, and even the possibility of the United States becoming a mass surveillance state. This may further perpetuate a society of environmental determinism and manifest destiny based on algorithms and risk assessment, removing “choice.” This loss of individualism and increase in hyper-awareness and discipline take away our humanity, pushing the working class into a world of consumerism and mindless obedience for fear of consequences under constant tracking and surveillance. That may sound bold, but it also may very well be the world we are facing. Buolamwini constantly forces the viewer to question how much they are willing to stake on the status quo of the digital world, and it sticks.

Through a web of perspectives from many different researchers, authors, consumers, and developers, Kantayya meaningfully portrays a diverse array of voices, elevating the ignored and highlighting a wide range of the concerns surrounding technology today. Coded Bias somehow conveys this complexity and variety while wrapping it neatly into a bite-sized 90 minutes. Viewers can expect to leave the theater with a mind-twisting plethora of new questions and a digital conscience that may not have existed for them before. Through the film’s multitude of topics and perspectives, a message of the importance and beauty of humanity rings clear, and that is something no machine will ever be able to replicate.

By Riley Arnold
