All Party Parliamentary Group (APPG) – Facial Recognition

Ross Edwards – IORMA Technology Director

FACIAL AND EMOTION RECOGNITION IN SOCIETY – June 8, 2020

APPG AI Chairs, Stephen Metcalfe MP and Lord Clement-Jones CBE, and the Secretariat at Big Innovation Centre

In his introduction, Lord Clement-Jones quoted the Metropolitan Police Commissioner, who had referred to facial recognition as Orwellian.

Professor Nadia Berthouze – UCL Computing

Professor Nadia Berthouze has years of experience investigating technology that can sense people’s emotions.

Looking at commercial uses of this technology, Nadia said it could help deliver environments that foster people's wellbeing, not just improve productivity.

Finding the triggers that make people more aware of their biases could also help salespeople recognise honest responses, which could lead to more ethical sales.

She also pointed out one of the ethical issues surrounding this subject: people need to know when they are being analysed.

Matt Celuszak – Human Measurement Systems

Matt has spent 16 years in the human measurement business; he said the last 15 of those have been spent fixing its problems, given the many challenges involved.

They have collected over 450,000 emotional experiences across 87 countries, amounting to 2.5 billion data points on attention, emotion and thought behaviour, and linked them to around 40 different businesses and their metrics.

Matt said it is important to understand the why and the how rather than the when and the where.

His second point was that it is important to use the data in an informative and not directive manner.

Regarding projects in this area, Matt said it is far better to start at the end, with the outcome the company is looking for, and work backwards; he also stressed the importance of diversified data sets.

Matt said that some companies are currently making million-dollar decisions using data sets that cover only 2% of the population.
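To make that concrete, here is a minimal sketch of how one might compare a training set's demographic mix against population shares to flag under-representation. All group labels and numbers below are made up for illustration; none of this comes from Matt's talk.

```python
# Illustrative only: hypothetical group labels and counts, not real data.
from collections import Counter

# Assumed population shares for three made-up demographic groups.
population_share = {"group_a": 0.40, "group_b": 0.35, "group_c": 0.25}

# A hypothetical, heavily skewed training set of 1,000 labelled samples.
training_labels = ["group_a"] * 920 + ["group_b"] * 60 + ["group_c"] * 20

counts = Counter(training_labels)
total = sum(counts.values())

for group, pop in population_share.items():
    share = counts.get(group, 0) / total
    flag = "  <-- under-represented" if share < 0.5 * pop else ""
    print(f"{group}: {share:.1%} of data vs {pop:.0%} of population{flag}")
```

Running this flags two of the three groups, including one that makes up 2% of the data while being a quarter of the population, the kind of gap Matt was describing.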

His remedy is to move away from universal data sets, work with more individual sets, and build trust; he says people will trust you if you are transparent and give them control.

Finally, Matt made three recommendations:

1.  Openness

2.  Quality of Data

3.  Make safe by design

Dr Seeta Pena Gangadharan – London School of Economics

Dr Seeta was very concerned about the use of flawed data sets, where inaccuracy stems from historical data combined with misclassified data.

She also pointed out that face recognition software might be attacked, that false positives can invalidate the process, and that the process can be corrupted at many points along the way.
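The false-positive point is essentially base-rate arithmetic. As a hedged illustration with assumed numbers (none of these figures were given at the session), even an apparently accurate system produces mostly false alerts when genuine matches are rare in the crowd:

```python
# Base-rate arithmetic with assumed numbers, for illustration only.
crowd_size = 100_000            # faces scanned at an event
watchlist_rate = 1 / 10_000     # assumed share of the crowd on a watchlist
true_positive_rate = 0.99       # assumed chance a watchlisted face is flagged
false_positive_rate = 0.01      # assumed chance an innocent face is flagged

targets = crowd_size * watchlist_rate                         # 10 people
true_alerts = targets * true_positive_rate                    # ~9.9
false_alerts = (crowd_size - targets) * false_positive_rate   # ~999.9

precision = true_alerts / (true_alerts + false_alerts)
print(f"{true_alerts + false_alerts:.0f} alerts, "
      f"only {precision:.1%} of them genuine")
```

Under these assumptions, roughly 99% of all alerts point at innocent people, which is why false positives can undermine the process in practice.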

Dr Seeta finished by posing the question: if there is a need for face recognition technology, is society willing to pay its costs?

Matthias Spielkamp – AlgorithmWatch

AlgorithmWatch is a non-profit organisation whose aim is to shed light on algorithmic decision-making processes that are relevant to society; it is financed by foundations and individual donations.

A lot of his current research looks at automated decision-making.

One of his questions concerned the reliability and accuracy of the data. He considers this a very broad topic, but he mentioned one case in particular: an image of someone holding a stick was classified as a man holding binoculars, yet the same image with a dark-skinned hand was classified as holding a gun.

Matthias said we should be calling for a ban on biometric mass surveillance and similar technologies in public spaces, as they do not comply with basic EU human rights.

Andrew Bud – Founder and CEO of iProov

iProov is a business involved in face-based biometrics.

Andrew made the point that there is a big difference between face recognition and face verification. In the way his system works, a person's identity does not need to be known for facial verification to take place. The system is currently in use with the Home Office.
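The distinction can be made concrete in code. Below is a minimal sketch, not a description of iProov's actual system: verification compares a face against one claimed template (1:1), while recognition searches a whole database of identities (1:N). The `embed_face` function is a hypothetical stand-in for any face-embedding model.

```python
import numpy as np

def embed_face(image) -> np.ndarray:
    """Hypothetical stand-in: map a face image to a unit-length vector."""
    raise NotImplementedError  # a real embedding model would go here

def similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Cosine similarity; equals the dot product for unit-length vectors.
    return float(np.dot(a, b))

def verify(image, claimed_template: np.ndarray, threshold: float = 0.8) -> bool:
    """1:1 verification: does this face match ONE stored template?
    The system never needs to search for, or even know, who you are."""
    return similarity(embed_face(image), claimed_template) >= threshold

def recognise(image, database: dict, threshold: float = 0.8):
    """1:N recognition: search EVERY enrolled identity for the best match."""
    probe = embed_face(image)
    name, template = max(database.items(),
                         key=lambda item: similarity(probe, item[1]))
    return name if similarity(probe, template) >= threshold else None
```

The privacy difference follows from the shapes of the two functions: `verify` touches only the single template the user presents, while `recognise` must hold and search everyone's.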

On the subject of protecting citizens' information, he said what matters is how the technology is applied: it should be used by the individual citizen, chosen by the individual citizen, and with the consent of the individual citizen.

Consent is an absolutely central part of the verification process, along with applying the appropriate GDPR regulations.

Silkie Carlo – Big Brother Watch, a privacy and civil liberties non-profit organisation

Silkie spoke passionately against the use of live facial recognition and thought the prospect of emotion recognition would be frightful. She argued that this kind of surveillance could lead to people being exploited, and stressed the need to have regulations and standards in place. She pointed out that where this technology is used in shopping centres and sports stadiums, people are effectively being checked to see if they are criminals.

