Biometrics and Privacy: In the Court of Public Opinion
This past month has been a hard one for biometrics, and facial recognition in particular. First, the British press exploded in mid-May with reports that police were using facial recognition technology to identify people in public spaces. Later in May, privacy advocacy organizations in the United States asked Amazon to refrain from selling facial recognition technology to police departments.
The exact circumstances and political implications differ slightly between the two countries, but the privacy groups' underlying concern is the same: "This technology has the potential to be misused and abused. The ends do not justify the means."
In a free society, police and even private organizations cannot simply do whatever they want. When it comes to people's private information, the public will have the final word. This year it has become clear that the public does not like the idea of others collecting or using their personal information. With GDPR taking effect in Europe last week, the European Union has set a new standard for how both public bodies and private companies must treat personal data, and that standard is having a ripple effect worldwide. This change has been many years in the making.
So, what are biometric companies and their customers to do? Biometrics, like most technology, has many potential implementations and uses, most of which are beneficial and help people.
Let's take a non-technical example that is easy to understand: knives. A knife can obviously be used as a weapon to harm someone. But knives are also used daily to help people eat their food. You would never dream of banning butter or cake knives from your kitchen because other people use knives for destructive purposes. The same reasoning applies to many tech products, including computers and smartphones.
Blacklist vs. Whitelist Biometrics
The same is true of biometrics and facial recognition. The products in the articles mentioned earlier are what we call "blacklist"-based systems: they match faces against images that are often obtained without consent and used without the subject's knowledge. This is the heart of the problem that privacy advocates fear.
By contrast, FST Biometrics and the companies that operate in our space sell "whitelist" products. Whitelist systems use facial recognition and similar technologies to identify users who have opted into the system and are authorized to access a specific facility. An image is never captured or used without consent, and users know what the system is and what it is used for.
Despite this technical difference, a technology is often judged on its reputation. And building a good reputation takes more than being an opt-in, whitelist system. You also have to make sure that the technology you employ, biometric or otherwise, is there to help people and make their lives better in some way. If people think your company exists to "catch" them, you will never win their favorable opinion.
Biometrics: Are you Doing Good or Harm?
At FST Biometrics, we pride ourselves on making identification for secure access more convenient. We believe we are helping people by making their lives easier and more secure. In the corporate world, we want to make sure that no one ever again gets stuck outside their office because they forgot their keycard. In the construction industry, we are saving lives by closing off dangerous areas to non-certified workers and the public.
When you employ a new technology on a mass scale, you need to ask, "How am I making people's lives better?" If you can answer that question, you will have friends supporting your mission. If you can't, expect to have your day in the court of public opinion.