Facial Recognition Monster

Considerable debate took place during 2019 over the current and future use of facial recognition technology in policing. The debate has mainly focused on the ethical use of the technology and how it may impact an individual's privacy, but there has been very little discussion of the benefits of facial recognition.

It appears the media and lobby groups have little confidence in facial recognition being an effective capability for protecting the public or supporting crime prevention and detection. They do make a good case that the use of facial technology can be intrusive and inaccurate, and those privacy concerns should be acknowledged.

However, it should also be acknowledged that the technology is, and has been, used in several other industries, including banking (FinTech), computer log-in, and border control. These uses appear not to attract the same vigorous scrutiny as policing, yet they may still impact a person's privacy. The scrutiny of facial recognition in policing suggests that databases exist containing facial mappings of many thousands of people, collected without consent. Large technology companies such as Amazon, Microsoft and Google are believed to hold large databases with facial mapping indexes; in some cases these databases are accessible to commercial companies willing to pay. This should also raise questions about whether true consent was obtained and whether privacy may have been breached.

So why is the Facial Recognition Monster so much bigger and fiercer in policing, and, so it would appear, smaller and tamer elsewhere?

That question may be answered by future conversations around public safety, safeguarding, and crime reduction, detection and prosecution, a perspective which appears to be lost in the current debate.

To conclude: facial recognition technology will be a useful tool for policing, but it will need strong and ethical management in order to be accepted by society, and therefore it should not be viewed as an uncontrollable monster.