
Derek Curry
Artwork Description:
Boogaloo Bias is an online application, interactive installation, and research project that highlights known problems with law enforcement agencies’ use of facial recognition technologies, including the practice of ‘brute forcing’: in the absence of high-quality images of a suspect, agents have been known to substitute images of celebrities the suspect is reported to resemble. The Boogaloo Bias facial recognition algorithm is trained on the faces of characters from the 1984 movie “Breakin’ 2: Electric Boogaloo.” The film is the namesake of the Boogaloo Bois, an anti-law-enforcement militia that emerged from 4chan meme culture and has been present at US protests since January 2020. The system uses the movie characters’ faces to brute force the generation of leads for finding members of the Boogaloo Bois in live video feeds, videos of protest footage, and images uploaded to the Boogaloo Bias website. All matches made by the system are false positives. No images or information are saved or shared in either the live or online version of the project.
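The matching described above can be sketched as nearest-neighbor search over face embeddings: a probe face is compared against a fixed “gallery” of character embeddings, and a permissive similarity threshold guarantees that some lead is produced. This is a minimal, hypothetical illustration of the general technique, not the project’s actual implementation; real systems derive embeddings from a deep network, whereas the character names, toy vectors, and threshold here are invented for demonstration.

```python
# Hypothetical sketch: nearest-neighbor matching against a gallery of
# character face embeddings. Toy 4-D vectors stand in for real network
# embeddings; names and threshold are illustrative assumptions.
import numpy as np

# One embedding per movie character (invented values).
gallery = {
    "Ozone": np.array([0.9, 0.1, 0.0, 0.2]),
    "Turbo": np.array([0.1, 0.8, 0.3, 0.0]),
    "Kelly": np.array([0.2, 0.2, 0.9, 0.1]),
}

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def match(probe, threshold=0.5):
    """Return (best-matching character, score), or (None, score) if the
    best score falls below the threshold. A permissive threshold means a
    probe face that is none of the characters still yields a 'lead'."""
    name, score = max(
        ((n, cosine(probe, e)) for n, e in gallery.items()),
        key=lambda t: t[1],
    )
    return (name, score) if score >= threshold else (None, score)

probe = np.array([0.7, 0.3, 0.1, 0.1])  # a face that matches no character
print(match(probe))  # still returns a confident-looking false positive
```

Lowering the threshold is the code-level analogue of brute forcing: the system always answers, and every answer is a false positive by construction.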
Technical Information:
Boogaloo Bias can be interacted with through an online app where users can upload images to be processed by the Boogaloo Bias facial recognition program (no images or information are saved by the website). The project website also includes two interactive social network analysis (SNA) diagrams that show connections between characters and resemble those used by law enforcement to understand, predict and intervene in targeted groups. One SNA diagram was created manually based on an analysis of the characters’ relationships in the film, while the second SNA diagram was created algorithmically based on how many times a character was recognized in the same video frame as another character. Interestingly, the most prominent connections in both diagrams are the same, but the algorithmically generated diagram also contains many spurious connections and false positives. Finally, the website features a resource library that links to research about the unregulated use of facial recognition in the US, information about the Boogaloo Bois militia, and social network analysis techniques.
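The logic behind the algorithmically generated SNA diagram can be sketched as co-occurrence counting: whenever the recognizer reports two characters in the same video frame, the edge between them gains weight. This is a hypothetical sketch under that stated assumption; the character names and per-frame detections below are illustrative stand-ins for real recognizer output.

```python
# Hypothetical sketch of the co-occurrence logic behind the second SNA
# diagram: an edge's weight is the number of frames in which two
# characters were recognized together. Detections are invented examples.
from collections import Counter
from itertools import combinations

def cooccurrence_edges(frames):
    """frames: list of sets of character names recognized in each frame.
    Returns a Counter mapping sorted name pairs to co-occurrence counts."""
    edges = Counter()
    for detected in frames:
        for pair in combinations(sorted(detected), 2):
            edges[pair] += 1
    return edges

frames = [
    {"Ozone", "Turbo"},
    {"Ozone", "Turbo", "Kelly"},
    {"Kelly"},                      # a lone detection adds no edges
    {"Turbo", "Kelly"},
]
print(cooccurrence_edges(frames).most_common())
```

Because every recognition is a false positive, spurious edges accumulate alongside real ones, which is exactly the divergence the project observes between the manual and algorithmic diagrams.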
Relation to the Theme:
The placement of police cameras, patrol routes, and checkpoints to monitor and control specific areas is an important aspect of the production of space. These spatial practices contribute to the creation of a surveilled space, where certain behaviors are regulated and monitored by authorities, and they directly impact what Lefebvre called “lived space,” the everyday experiences and practices of individuals within a given space. Police surveillance can intrude upon and alter the lived experiences of communities, affecting their sense of privacy, freedom, and safety. For instance, constant surveillance may lead to feelings of paranoia or distrust among residents, fundamentally changing their relationship with their surroundings. Facial recognition software adds a virtual layer of surveilled space by automating the identification of suspects. However, automating these practices can introduce technical problems and opportunities for misuse. For example, the location, position, and fidelity of surveillance cameras, as well as the ambient lighting, can all affect the accuracy of facial recognition. In response, law enforcement agencies using these systems have adopted techniques for “brute forcing” the technology to produce results. Boogaloo Bias helps members of the public develop technical literacies about facial recognition algorithms by pushing the logic behind the current forms and uses of facial recognition in law enforcement to an extreme, highlighting the absurdity of how this technology is being developed and used.
The project draws from a number of academic and journalistic sources, including a study by the Georgetown Law Center on Privacy and Technology, which found that because there are “no rules when it comes to what images police can submit to face recognition algorithms to generate investigative leads,” agents have been known to substitute not only low-quality images from CCTVs, but hand-drawn forensic sketches, proxy images generated from artist sketches, and images of celebrities thought to resemble a suspect (Garvie, 2019). Using images of celebrity doppelgängers to find suspects aligns perfectly with a post-truth world where non-human intelligent agents increasingly rely on models and proxies rather than empirical observation.
The interactive experience in Boogaloo Bias demonstrates how surveillance technology deployed without regulation or public oversight can produce absurdly erroneous results. Part of the project is a corporate fiction in which the facial recognition system is a product marketed by a near-future startup, emphasizing how the profit motive can exacerbate technological problems.
References
Garvie, Clare. “Garbage In, Garbage Out: Face Recognition on Flawed Data.” Georgetown Law Center on Privacy and Technology, May 16, 2019.