The seemingly universal condemnation of China over its surveillance techniques, not to mention the mass detentions of the Uyghur ethnic minority group in camps (the so-called “re-education centres”) in the northwestern province of Xinjiang, has some experts and human rights activists worried that facial recognition is a step too far.
Situations like these help explain why people fear the consequences AI could have for their future liberties. Such concerns have prompted a group of fifty investors, managing a combined portfolio of more than $4.5 trillion in assets, to press companies such as Amazon and Facebook to develop and deploy their facial recognition technology ethically.
Led by asset manager Candriam, a European division of U.S. financial services company New York Life, the investor group — which includes the UK’s Aviva Investors, Royal London Asset Management, Canada’s BMO Global Asset Management, Dutch-based NN Investment Partners, and Norway’s KLP — announced in a statement the technology could contravene an individual’s privacy rights, due to lack of consent of those being identified.
Candriam noted there is presently no global framework governing the gathering and use of biometric data. The European Union (EU), however, has proposed the first-ever legal framework, while Beijing has published a draft standard. The EU’s privacy watchdog said in April the technology should be banned in Europe because of its “deep and non-democratic intrusion” into people’s private lives.
The initiative shows why fund managers are increasingly willing to take up policy issues once considered marginal to shareholders, as retail investors pour billions into funds whose ethical and sustainability criteria have become a selling point.
Facial recognition technology can be used to verify access to bank accounts, but it can also be used by governments to monitor their citizens and, as human rights activists say is happening in China, to quash political opponents and signs of dissent.
In the statement, the investor group said it would begin a two-year process of engagement with companies developing or using the technology. It identified 34 companies as leaders in facial recognition, including Amazon, Facebook, and Asian tech giants Alibaba and Huawei.
Asked to comment, an Amazon spokesperson declined.
“Technology should only ever be used to enhance human, social, and environmental well-being. We encourage a global conversation to develop ethics and governance standards around emerging technologies and we continue to play our part in this conscious, ongoing, and collaborative effort,” said a Huawei representative.
The other companies, when contacted by Reuters, did not immediately comment.
Only last month, Amazon told Reuters that it was extending a moratorium it had imposed on police use of its facial recognition technology. Many human rights groups have warned that erroneous matches could lead to people being wrongly arrested or detained.
“For investors to be able to fulfil our own responsibility to respect human rights, we call on companies to proactively assess, disclose, mitigate and remediate human rights risks related to their facial recognition products and services,” said Rosa van den Beemt, Responsible Investment Analyst at BMO Global Asset Management, one of the investors that have signed up to the initiative.
According to Facial Recognition Market by Technology, a report published last year by Adroit Market Research, the facial recognition market is set to reach $12 billion by 2025 — little wonder the investor group has its concerns.
“The increasing deployment and use of facial recognition technologies have human rights implications which are not fully being considered by companies,” said Louise Piffaut, Senior ESG analyst at Aviva Investors.
This is a first step, and one that will be warmly welcomed by many. For although facial recognition software and AI bring real benefits, we must not forget the implications they could have for people’s civil liberties as well.