Colorado Students Were Unknowingly Used to Train Facial Recognition

Line of surveillance cameras.

It all began in 2012, when Dr. Terrance Boult attached a camera to a building on the University of Colorado Colorado Springs campus and began photographing students who had no idea they were being filmed. Over twenty days, the camera captured images of more than 1,700 people. Those images became a dataset called the Unconstrained College Students dataset, one of the newest resources for training facial recognition algorithms. The project was funded by the U.S. government, and its methods have struck many as unethical; in reality, however, it's perfectly legal.

The Rise of Facial Recognition

Facial recognition has steadily been on the rise, becoming more sophisticated and more useful in both everyday and military applications. Over six years ago, Dr. Terrance Boult began a study on the Colorado Springs campus, funded by the U.S. government, to determine whether algorithms could identify faces in poor lighting, from a distance, and through obstacles.

After setting up a camera in a high-traffic spot on campus, researchers sorted through the photos and combined them into what is now known as the “Unconstrained College Students” dataset. [1] The photos are candid and lower in detail than those in other databases, which, precisely because they reflect real-world conditions, makes the dataset especially useful to scientists trying to develop smarter facial recognition algorithms.

Although this whole procedure might seem sketchy, an infringement on people's privacy, or even illegal, it isn't. Photographing people in public places is completely legal, and Boult cleared the study with the school, including the UCCS Institutional Review Board, before conducting it. “The research protocol was analyzed by the UCCS Institutional Review Board, which assures the protection of the rights and welfare of human subjects in research,” wrote University spokesperson Jared Verner in a statement. “No personal information was collected or distributed in this specific study. The photographs were collected in public areas and made available to researchers after five years when most students would have graduated.” [2]

A More Diverse Dataset

Recently, facial recognition has repeatedly been cast in a bad light: San Francisco banned city use of facial recognition earlier in May, [3] and Uber has been accused of using racist facial recognition software. [4] As Boult points out, however, the fault lies less with the algorithms themselves than with the datasets they are trained on, which consist mostly of Caucasian faces.

The Unconstrained College Students dataset was made to be more diverse than the average dataset, but Boult's bigger goal is getting an algorithm to admit when it can't recognize someone. An algorithm shouldn't settle for a close match; it needs an exact match, and if it can't find one, it has to recognize that it can't, Boult explained. [5]
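To make that idea concrete, here is a minimal, hypothetical sketch of this kind of "refuse to answer" behavior, sometimes called open-set rejection. It is not Boult's actual system: a probe face embedding is compared against a small gallery of known identities, and the match is rejected when the best similarity score falls below a threshold. The embeddings, names, and threshold value are all illustrative assumptions.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings (1.0 = identical direction)."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def identify(probe: np.ndarray, gallery: dict, threshold: float = 0.7):
    """Return the best-matching identity, or None if no match is confident enough."""
    best_name, best_score = None, -1.0
    for name, embedding in gallery.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    # Open-set rejection: below the threshold, the system admits the face is unknown
    # rather than returning its closest guess.
    if best_score < threshold:
        return None, best_score
    return best_name, best_score


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical 128-dimensional embeddings for two enrolled identities.
    gallery = {"student_a": rng.normal(size=128), "student_b": rng.normal(size=128)}
    probe = rng.normal(size=128)  # an unenrolled face
    print(identify(probe, gallery))  # likely (None, low score) -> "unknown"
```

In practice the threshold trades off false matches against false rejections; the point of the sketch is simply that "unknown" is a valid answer the system is allowed to give.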


Notes:

  1. Harvey, Adam. “MegaPixels: UnConstrained College Students Dataset.” MegaPixels, 28 May 2019, megapixels.cc/datasets/uccs.
  2. “Colorado college students were secretly used to train facial recognition.” Engadget, 28 May 2019, www.engadget.com/2019/05/28/uccs-facial-recognition-study-students.
  3. “San Francisco bans city use of facial recognition.” Engadget, 28 May 2019, www.engadget.com/2019/05/14/san-francisco-bans-city-use-of-facial-recognition.
  4. “Uber faces racism claim over facial recognition software.” Telegraph, 28 May 2019, www.telegraph.co.uk/technology/2019/04/23/uber-faces-racism-claim-facial-recognition-software.
  5. Stanley, J. Adrian. “UCCS secretly photographed students to advance facial recognition technology.” Colorado Springs Independent, 28 May 2019, www.csindy.com/coloradosprings/uccs-secretly-photographed-students-to-advance-facial-recognition-technology/Content?oid=19664437.
