Retail Facial Recognition Comes Of Age
Written by Mark Rasch
Attorney Mark D. Rasch is the former head of the U.S. Justice Department’s computer crime unit and today serves as Director of Cybersecurity and Privacy Consulting at CSC in Virginia.
Some years ago, I demoed an ATM that had no card, no chip, no PIN and only a limited keyboard. The ATM used facial recognition software to identify me (after registration), so I only had to walk up to the machine, type in $20 from checking and, voila! Money dispensed. Assuming that everything works as promised and that facial recognition software is close to 100 percent accurate and reliable (more on this later), retailers should consider the legal, privacy and compliance issues related to biometrics before rushing in. Like all innovative technologies (from credit cards to loss prevention devices), it’s not clear yet whether consumers will embrace or reject the new technology, or how regulators will ultimately react.
The legal issues for biometric technology track the various phases of its implementation. Capture. Enrollment. Storage and protection. Sharing. Comparison. Use. De-enrollment and purging. And this says nothing about the technical issues.
How do you get the image you are going to use for the facial recognition? Not an easy question. Sure, if it’s an ATM or payment-card replacement, the person can voluntarily sit down and give consent for a picture to be taken. But what about passive capture? Setting up a camera in a store or elsewhere and taking images of those who walk in? Benetton recently announced it was testing (but had not deployed) a technology called EyeSee, a camera and facial recognition software deployed inside mannequins. The technology captures shoppers at eye level, and it can be used for loss prevention, for trend analysis (what kinds of people are doing what types of things in the store) and, ultimately, for identification of customers by comparison with other databases.
This type of “passive capture” is particularly problematic from a legal perspective. Although we may have convinced the consuming public that they have no “right to privacy” in their images while they are in the store or mall (outside bathrooms or dressing rooms), the concept of creating a database of individual actions and movements based on facial recognition software takes that privacy expectation to a new level.
There’s a fundamental difference between monitoring traffic and monitoring individuals. Do people in your parking lot know they are consenting to your capture of their license plate numbers (and images of the number, race, gender and age of the occupants of their vehicles)? Once you add the possibility of facial recognition to “ordinary” capture devices (like theft prevention cameras), you have converted the data into personally identifiable information (PII). So how do you get the image that matters? If you get a picture taken at Costco for its membership, are you consenting to the chain’s use of that image for facial recognition and tracking?
What’s worse, retailers can “capture” images from publicly (or semi-publicly) available databases or social networking sites. Is it “legal” for a company like, say, WalMart to scour Facebook, LinkedIn or PhotoBucket to capture names and images to create a database? This would depend partly on the Terms of Use or Terms of Service of these entities and on whether each permits “scraping” and commercial use of its services, in addition to the privacy expectations of the users. Generally, if an image is placed on a publicly accessible portion of a social networking site, it is, well, publicly accessible. That doesn’t mean the images are accurate, however. Just ask Notre Dame’s Manti Te’o about that one! Moreover, even if it is legal, it’s really creepy.
The next issue is enrollment. How do you link a captured image to a specific person? Again, people can voluntarily enroll—like those credit cards that have pictures on them. Or they can be forced to enroll—like a person who is arrested for shoplifting, has a picture taken and then is banned for life not only from the individual store but from all of the chain’s stores and its affiliates forever. Stores use facial recognition software to create a nationwide database of such “banned” persons and to enforce the ban. Was consent required? Most likely not.
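To make the enrollment-and-comparison step concrete: facial recognition systems typically reduce each captured image to a numeric “embedding” vector, then flag a visitor when his or her vector is close enough to one already enrolled. The sketch below is illustrative only; the function names, the toy vectors standing in for real embeddings, and the similarity threshold are all assumptions, not any vendor’s actual system.

```python
import math

# Assumed similarity cutoff; real systems tune this, trading false
# matches against missed matches.
MATCH_THRESHOLD = 0.9

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Enrollment database: identity -> embedding of the enrolled image.
banned_db = {}

def enroll(person_id, embedding):
    """Enrollment: link a captured image's embedding to a specific identity."""
    banned_db[person_id] = list(embedding)

def check_visitor(embedding):
    """Comparison: return the enrolled identity that matches, if any."""
    for person_id, enrolled in banned_db.items():
        if cosine_similarity(embedding, enrolled) >= MATCH_THRESHOLD:
            return person_id
    return None

# Toy three-number vectors standing in for real face embeddings.
enroll("shoplifter-042", [0.1, 0.9, 0.2])
print(check_visitor([0.11, 0.88, 0.21]))  # close to the enrolled vector
print(check_visitor([0.9, 0.1, 0.05]))    # dissimilar vector
```

Note that the whole scheme hinges on the threshold: set it loose and innocent shoppers get flagged as banned; set it tight and the ban is easy to evade. That tuning decision, made by a vendor, carries real legal exposure for the store.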
Reader comment, January 31st, 2013 at 12:57 pm:
And would facial recognition be fooled by a mask or photo? Fingerprints might be hard to get a good enough image of to duplicate, but the image of my face? Not so much. So if my bank uses facial recognition ATMs, could anyone with a good photo of me withdraw from my account?
Thanks, but I’ll take my chances with an ATM card and a PIN!