Locating Technology: Against Recognition

From our partners at Art Practical, today we bring you an essay by Emily K. Holmes, who analyzes the work of artist Zach Blas: “Blas creates space for facial-recognition technology to be not only strange, but dangerous and deserving of our critical questioning.” This article was originally published on April 16, 2014.

Zach Blas. Facial Weaponization Suite Communiqué: Fag Face, 2012 (video still); HD video; 08:10. Courtesy of the Artist. Photo: Zach Blas.

Biometric technologies aim to “authenticate” and “verify” individuals by digitally scanning physical traits of the body, ranging from iris scans and fingerprint matching to facial-recognition technology. These technologies scan bodies at borders to administer international mobility in the form of digitized passports, and act as proxy guards in high-security workplaces. Biometric technologies also appear in popular social media, such as when Facebook asks us to tag our friends in uploaded photographs (the company is researching ways to increase accuracy), as well as in smartphones, tablets, and some gaming consoles. Before we get too swept up in the excitement of unlocking iPhones with our grinning faces, artist Zach Blas suggests we question the normalization of facial-recognition technology.

In Blas’ digital video Facial Weaponization Suite: Fag Face Communiqué (2012; HD video; 8:10), he presents a montage of clips surveying the current uses of facial-recognition technology, similar to those listed above. Other than sheer proliferation, why should we rethink facial-recognition technology? In one scene, a clip taken from a biometric company’s promotional video demonstrates the detection of an “unwanted individual.” A slender white man enters through the doors of a building’s lobby; through the CCTV monitor, we watch a square latch onto his face as he walks through the space. The man’s face triggers a match in a database, and the screen pairs a previously stored image with the words “unwanted individual detected.” Although this use of biometrics is framed as protective and beneficial, Blas imagines a possible future in which the detection of “unwanted” individuals might be turned against socially marginalized groups.

Read the full article here.
