The New Zealand Police, Customs, the Ministry of Justice, DIA and other government agencies have a wide arsenal of Facial Recognition technologies with little or no regulation or consultation with Māori. This will likely lead to further widespread discrimination and culturally unsafe practices that directly impact Māori, as we have already seen overseas with other minorities being discriminated against by law enforcement agencies using Facial Recognition systems. Moreover, a number of supermarkets and other retailers in New Zealand have also implemented Facial Recognition systems in store without any regulation except the Privacy Act.
Facial recognition systems are the digital equivalent of the old colonial practice of collecting Māori heads, or mokomokai. Our faces and images, including our moko, are being taken without permission by the government and used in ways we are still not certain of, though we are learning a lot from Official Information Act requests by the media and, more recently, academia.
Some of our images could be sold by collectors (CCTV and other vendors) to, or collected by, government agencies who will use the images with no consideration of cultural values. These images are being stored overseas and are subject to other countries' laws.
Any digital images of Māori faces are likely to become the legal property of others, as is the case with many social media services now. Due to this and the lack of regulation, we have already seen the appropriation of Māori faces and moko on shower curtains, emojis, cigarette and alcohol labels, to name a few. As facial technology spreads, the risk of further abuse of and discrimination against Māori grows.
Background
This is the sixth in a series of articles I am writing about Māori ethics with AI, data sovereignty and robotics. The earlier articles are:
- Article 5: Māori Data Sovereignty with AI, Algorithms, IOT and Machine learning. Rights afforded to Māori & Crown obligations with legal instruments
- Article 4: Treaty Clause Required for NZ Government AI Systems and Algorithms
- Article 3: Māori Ethical considerations with Artificial Intelligence Systems
- Article 2: Māori ethics associated with AI systems architecture
- Article 1: Māori cultural considerations with Artificial Intelligence and Robotics
Overview
Currently, New Zealand Customs uses face recognition technology with its eGates[i]. The Department of Internal Affairs uses Facial Recognition for RealMe[ii]. Many local councils, the New Zealand Police and many other government department buildings and shops have Closed Circuit Television that constantly records our movements. The New Zealand Police have in the past not ruled out using council and government CCTV footage for facial recognition to combat crime, assist with missing persons and for traffic violations[iii]. We now understand that the Police use passport photos to ascertain a person's identity. The Police have already purchased an image management and facial recognition system to allow them to identify people from CCTV[iv]. This new technology will allow Tā Moko and Moko Kauae to be stored and used to identify people from CCTV footage[v].
Anyone who visits a New World supermarket (and possibly other supermarkets) or a Z Energy petrol station will be recorded and their face entered into a Facial Recognition system[vi]. The possibility that these retail chains will read your facial expressions in response to their advertising, use your image for internal marketing or, perhaps in the future, cross-check your image against an offender list is feasible; it raises a number of ethical issues and further concerns about the lack of regulation.
If we look to our colonial oppressor, the United Kingdom, a country we are similar to: 59 percent of fashion retailers in the UK use facial tracking, capturing the faces of shoppers before cross-referencing their biometric data against known criminals[vii]. If this is occurring with retailers in New Zealand, the risks have already been highlighted internationally: the technology is biased against people of colour. Ohio became the latest of several state and local governments in the United States to stop law-enforcement officers from using facial-recognition databases. The technology is not designed for people of colour, Māori and Pacific Islanders. Therefore, it would require a manual check by a person who would need special training and would hopefully not be biased against Māori.
Facial Recognition technology will likely make our communities safer, but at what cost to personal privacy, and with what risks of discrimination against minorities? Police and legal system bias means that Māori need to consider and voice the cultural, privacy and technological concerns that will likely impact Māori if these systems do not address their inherited bias. The sketch below illustrates the kind of bias AI facial recognition systems have shown against minorities.
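As one illustration of how that bias can be demonstrated, here is a minimal sketch, using entirely invented match-score data, group labels and threshold, of how the false match rate of a face-matching system can be compared across groups. Real audits (such as large international vendor tests) use large labelled datasets, but the idea is the same: if one group's false match rate is higher, its members are more likely to be wrongly "identified".

```python
# Minimal sketch: measuring how a face-matching threshold treats different
# groups. All data below is invented for illustration only.

from collections import defaultdict

# Each record: (similarity score from a hypothetical matcher, whether the two
# faces truly belong to the same person, self-identified group of the subject).
results = [
    (0.91, True,  "Group A"), (0.42, False, "Group A"), (0.55, False, "Group A"),
    (0.88, True,  "Group B"), (0.73, False, "Group B"), (0.69, False, "Group B"),
    (0.62, False, "Group B"), (0.95, True,  "Group A"), (0.35, False, "Group A"),
]

THRESHOLD = 0.6  # scores above this are treated as a "match"

false_matches = defaultdict(int)   # different people wrongly declared a match
impostor_pairs = defaultdict(int)  # all pairs that are truly different people

for score, same_person, group in results:
    if not same_person:
        impostor_pairs[group] += 1
        if score >= THRESHOLD:
            false_matches[group] += 1

for group in sorted(impostor_pairs):
    fmr = false_matches[group] / impostor_pairs[group]
    print(f"{group}: false match rate = {fmr:.0%}")

# A higher false match rate for one group means its members are more likely
# to be wrongly "identified" -- the kind of disparity reported overseas.
```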
2020 Academic Report into Facial Recognition Technology in New Zealand
Victoria University researchers released their research report “Facial Recognition Technology in New Zealand: Towards a Legal and Ethical Framework”, which highlights, amongst many other concerns, bias and ethical issues affecting Māori. The report concludes with 15 recommendations.
The report identified that multiple government agencies have their own Facial Recognition and AI systems, and that access to at least some of these systems is granted to other government organisations; the Department of Internal Affairs and Corrections also have similar systems. The Five Eyes and other organisations also have access to New Zealand's facial recognition systems.
New Zealand Police AI arsenal
The New Zealand Police have a wide-ranging arsenal of AI and Facial Recognition technologies, including:
- Clearview AI – Trialled AI software that compares a photograph of a face against a database of images for matches.
- BriefCam – Developed in Israel and now owned by the Japanese multinational Canon. It aggregates footage recorded at different times so it can be analysed as if the events were simultaneous. Used to analyse CCTV footage.
- NewX – Searches unstructured data and platforms for faces, guns, and body markings (tattoos).
- Cellebrite – Dubbed by CNN the FBI’s “go-to phone hacker”, it includes a facial recognition capability. It can extract personal data from Android mobiles or iPhones, even locked or encrypted ones, and reaches beyond the device into over 50 social media and cloud-based sources or apps, including Snapchat and Instagram, without needing any permission from Apple, Google etc.
- Automated Biometric Information System – Search capability across scars, marks and tattoos.
- RPAS – Remotely-piloted aircraft systems (a.k.a. ‘drones’).
Politicians’ Reactions
At this stage some ministers appear to support the findings in the report. Minister for the Digital Economy and Communications David Clark told RNZ he would be seeking more advice on the topic, adding: “It’s a priority for me to ensure that we maintain and enhance the ethical frameworks in place when it comes to use of data, protecting human rights, and upholding the provisions of the Privacy Act.”
Green Party justice spokesperson Golriz Ghahraman said she had deep reservations about the use of the technology.
While her comments reflected somewhat of a privileged position, National Party spokesperson on digital media Melissa Lee was cautious about the impact of more regulation.
Police Minister Poto Williams said in a statement that she would not comment on individual recommendations when asked if she supported them, as it was an operational police matter.
The Māori Party have not made any comment that I can find, nor has the Labour Māori Caucus.
Facial Recognition – Current issues unique to Māori
- There is no regulation and no Te Tiriti protection mechanisms with the current facial recognition systems that could protect Māori.
- According to RNZ, Police Facial Recognition systems have been installed or trialled secretly (sometimes without the knowledge of the Minister of Police or the Police Commissioner), and many other government agencies have also installed Facial Recognition systems without any consideration of co-design and co-governance with Māori.
- Images are shared with external agencies and international organizations.
- Māori are more likely to be subjected to discrimination in all aspects of the legal system due to biases and inherited racism.
- Internationally, people of colour are unfairly targeted by Facial Recognition systems and incorrectly identified as the wrong person. This is likely to occur to Māori as the systems have not been fixed or trained on Māori and Pacific faces.
- If our images are being analysed by foreign entities and people, the possibility of false positives in identification will further increase the bias and discrimination against Māori.
- Māori who wear Tā Moko, Moko Kauae and Mataroa will have their sacred intellectual property used, stored and analysed by the system. It is likely that Facial Recognition systems will be unable to differentiate between various artworks, and wearers of moko could be subject to intense and unfair wrongful identification and discrimination. Note that some systems do not recognise facial tattoos at all.
- It is culturally inappropriate for our photos to be placed with images of the dead. Despite that, there are no regulations to protect Māori cultural beliefs.
- Māori Data Sovereignty issues have been ignored. Our images will be stored overseas and accessed by multiple entities in various countries.
- Tikanga Māori states that an image of a person has a spiritual connection to the person the image is taken from. This has been ignored as has the fact that pictures of Māori people are a taonga and should be protected by Te Tiriti obligations.
Current and Future risks of images of Māori online
While this is not directly related to Facial Recognition, social media companies do claim intellectual property rights over images posted online and do use facial recognition, so this section discusses some of the other issues currently occurring with images of Māori online and overseas.
3D Printing and image reproduction
Any image of your face on social media and the Internet can be used for products, marketing, fraud and artwork. The latter is a concern for Māori with Tā Moko and Moko Kauae. Anyone with an interest in moko could simply take a photo from social media or the Internet and use a 3D printer to literally recreate your head. As we have recently seen, artists and overseas companies are using images of Māori with moko taken from the Internet on paintings and other products, which are then retailed online with no consultation or engagement with the person the art is based on. Some examples include: shower curtains, magazines, tea towels, game characters, artwork, paintings, and food and beverage labels.
Cartoon pornography/erotic animation/adult animation
There is nothing stopping creators in the pornography industry from using your image as the face of one or more of their characters. As with many other socially unacceptable fetishes, a fetish serving this niche market likely already exists.
Deep Fakes
Technology now allows your face to be superimposed onto a video of another person so that the video appears to show you. There have been many examples of this occurring on the Internet, and the resources to do so are readily and freely available.
Employment
Face recognition is popular in Europe for employment recruitment. The algorithm looks at a candidate’s facial expressions in response to situations. A good candidate will react the same way as the senior management on whom the algorithm is based. This will create a new era of racial discrimination for Māori in employment. Previous generations of Māori were discriminated against in jobs for being Māori or having a Māori name. Facial recognition will likely not account for Māori cultural facial expressions unless the senior management team is Māori. Common Māori facial traits include raising the eyebrows, puckering the lips, a unique sense of humour and not engaging in prolonged eye contact. These cultural expressions will see Māori discriminated against and add to unemployment issues.
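To make the mechanism concrete, here is a minimal sketch, assuming hypothetical expression features and a simple cosine-similarity score, of how a screening tool that compares candidates to an existing senior-management profile would rank someone whose cultural expressions differ from that profile. The feature names, numbers and scoring method are assumptions for illustration only, not any vendor's actual algorithm.

```python
# Minimal sketch of a similarity-based screening score. All feature names and
# numbers are invented; this is not any vendor's actual algorithm.

import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Hypothetical expression features measured while watching the same prompt:
# [smile intensity, eyebrow raise, sustained eye contact, lip movement]
senior_management_profile = [0.8, 0.2, 0.9, 0.1]  # the "ideal" reference

candidates = {
    "Candidate matching the reference culture": [0.75, 0.25, 0.85, 0.15],
    "Candidate with different cultural expressions": [0.5, 0.8, 0.3, 0.6],
}

for name, features in candidates.items():
    score = cosine_similarity(senior_management_profile, features)
    print(f"{name}: similarity score = {score:.2f}")

# The second candidate scores lower purely because their expressions differ
# from the reference group, not because they would be a worse employee.
```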
Emojis
Emojis are popular with some communities, and recently Apple and others launched software to create personalised emoji based on a photo. The latest Apple update rectified a number of racial issues. At the time of writing, this software could not recognise Tā Moko, Moko Kauae or Mataroa, but this will likely change in the future, so some consideration needs to be applied.
In 2017 a number of Māori-designed emojis were released with their own branding. A number of the emojis used photos of dead tipuna with no consultation, and of some living Māori leaders without their knowledge. In a later update, some Māori leaders approved the use of their face as an emoji. Among the risks is that the emojis can be, and were, used in all manner of ways, including some that were socially unacceptable and others culturally inappropriate.
Game Characters
There have already been multiple instances of international gamers using both made-up moko and moko copied from images of Māori people for their gaming characters. There is nothing in the law to prevent this. The only way to deal with it is through education, awareness and vigilante tactics that fall within the law, such as writing en masse to the developers.
Killer Robots, AI armed guns and Drones
This is no longer science fiction. On the day this section was written, Newshub reported an alleged state-sanctioned assassination carried out by a satellite-controlled machine gun using AI and facial recognition.
World superpowers and their militaries are using killer robots that rely on AI and facial recognition. A new generation of autonomous weapons or “killer robots” could accidentally start a war or cause mass atrocities, a former top Google software engineer has warned.
Māori face on a robot or hologram
Statistically, there is an increasing number of lonely people who fall in love with a hologram. In Japan, people are getting married to holograms. These holograms learn from human emotions, text the human and act as though they love the human. We have an increasing number of kaumatua and young people becoming isolated. The ability to add a deceased person's likeness to a robot or hologram already exists and needs consideration and some sort of regulation.
Reimaging the dead
Archaeologists who dig up ancient human remains have for many years attempted to mould physical features onto the skull or a replica. Today, advanced computer systems can do the modelling, using facial images as a source of probability for what the person looked like. This is a potential risk for Māori, who could be targeted in this area based on iwi and hapū affiliations to the bones of the dead. One example from Norway is a computer-generated facial reconstruction of an ancestor.
Robots and virtual assistants
With no regulation, with physical and virtual robots increasingly being designed in New Zealand to reflect society, and with Moko Kauae and Mataroa becoming more and more common, there is a likelihood that a real Māori face and/or moko could be used on the face of a robot or virtual assistant.
Positive Cultural usages for Facial Recognition
There are myriad potential benefits of Facial Recognition systems that could protect Māori culture and knowledge that is not commonly held or has even been lost. For instance, when mokomokai are returned to New Zealand, their identity is often not known. There may be intrusive methods to ascertain the whakapapa of the head, but perhaps a Facial Recognition system could be modified to look at the moko style and cross-reference it against a database of known artists, iwi patterns and images to estimate the likelihood of where the person was from or who they were. This could also be applied to carvings whose origin is not known. A rough sketch of how such a matching tool might work is given below.
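The following is a minimal sketch of that idea, assuming hypothetical "style feature" vectors for known patterns and a simple nearest-neighbour comparison. Everything here (the feature idea, pattern names and numbers) is an assumption for illustration; a real system would need to be co-designed with Māori and built on expert-validated data, and its output treated only as a starting point for human expertise.

```python
# Minimal sketch: ranking likely origins of an unidentified moko by comparing
# hypothetical style-feature vectors. All names and numbers are invented.

import math

def euclidean_distance(a, b):
    """Smaller distance = more similar style (under these made-up features)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Hypothetical reference database: pattern label -> style features, e.g.
# [spiral density, line thickness, symmetry, facial coverage]
reference_styles = {
    "Pattern style A": [0.7, 0.3, 0.9, 0.6],
    "Pattern style B": [0.2, 0.8, 0.4, 0.9],
    "Pattern style C": [0.5, 0.5, 0.7, 0.3],
}

# Features extracted (hypothetically) from the unidentified image.
unknown_moko = [0.65, 0.35, 0.85, 0.55]

ranked = sorted(
    reference_styles.items(),
    key=lambda item: euclidean_distance(unknown_moko, item[1]),
)

for label, features in ranked:
    dist = euclidean_distance(unknown_moko, features)
    print(f"{label}: distance = {dist:.2f}")

# The closest entries are only candidates for further assessment by human
# experts, never an automatic identification.
```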
Many photos of Māori emerge generations later, and there is often uncertainty about exactly who the person or people in the photo are. A modified Facial Recognition system to help identify those people would be another useful resource for Māori and could assist with family research.
Modified facial recognition systems could also offer a non-invasive method of tracking our land-based taonga species.
But these systems will need to be co-designed with Māori and iwi to ensure appropriate cultural protection.
Conclusion and Additional Recommendations that provide safeguards for Māori with Facial Recognition
These additional recommendations build on and complement the 15 recommendations in the Victoria University researchers' report “Facial Recognition Technology in New Zealand: Towards a Legal and Ethical Framework”, but are uniquely designed to recognise Te Tiriti and to prevent bias and discrimination against Māori.
- Add enforceability and oversight to the New Zealand Algorithm Charter, as recommended by the report; but the Algorithm Charter must also recognise data as a taonga and Māori Data Sovereignty, which it currently does not, breaching the previous Data Stewards' commitment to Māori. The charter must also include co-governance and co-design, and have a clearer Te Tiriti clause with guidelines on how to implement it.
- Revamp the Privacy Act to include biometrics and a Biometrics Commissioner, and implement a mandated number of Māori seats similar to the recommendations made in the Law Commission's report on “DNA in Criminal Investigations”.
- A cultural audit of all NZ government Facial Recognition systems.
- As recommended by the Victoria University report, a privacy code of practice for biometric information under s. 32 of the Privacy Act 2020, but one that is co-written and co-researched with Māori and iwi.
- The technology available to New Zealand may be at an acceptable level for Asian and European physical features, but the people using those systems should be offered extensive training to increase awareness and diversity, enabling them to identify any system biases.
- Other international agencies, such as the Five Eyes, have access to facial images. If international personnel are making a call on a Māori face, there could be unintended biases. Therefore, Māori liaison and support should be in place for cultural safety purposes.
- The introduction of an opt-out scheme, and a process for images of a deceased person to be removed.
- All New Zealand Facial Recognition Systems to have a mandatory Te Tiriti clause.
APPENDIX
Useful New Zealand media reactions to Facial Recognition in the past year
December 2020
Police using technology riddled with controversy overseas https://www.rnz.co.nz/news/national/432271/police-using-technology-riddled-with-controversy-overseas
Facial recognition regulations will be reviewed – minister December 2020
‘Regulation gap’ for facial recognition technology, law expert says December 2020
Police facial recognition: new study calls for greater oversight in NZ December 2020
Government to seek advice on facial recognition tech laws December 2020
Biometric tracking for parolees and prisoners grows, Supercom and Tyler Tech win contracts December 2020
November 2020
Biometric ID systems in prisons, but no facial recognition – Corrections November 2020
Audit reveals new tech tools in police’s digital armoury November 2020
October 2020
Government facial recognition tech deal offers wide access October 2020
Minister not briefed on police facial recognition move, office says October 2020
Fears police facial recognition system will falsely accuse Māori October 2020
Facial recognition deal too weak to protect Māori data sovereignty – specialists October 2020
September 2020
‘Matter of time’ before police AI leads to Māori or pacific person’s wrongful arrest – expert September 2020
Company run by Chinese military made bid to run NZ Police facial recognition September 2020
Concerns about facial recognition system being set up by police September 2020
Blurred lines – the police and facial recognition technology September 2020
Cost of huge CCTV combined network sees Auckland Council pull out September 2020
August 2020
Police setting up $9m facial recognition system which can identify people from CCTV feed August 2020
Global facial recognition company working closely with NZ govt August 2020
Global facial recognition company NEC working closely with New Zealand Government August 2020
New Zealand renews airport facial biometrics contract with NEC and partner for $20M August 2020
Global facial recognition company working closely with Government August 2020
May 2020
Police trialled facial recognition tech without clearance May 2020
Police trial of facial recognition technology ‘a matter of concern’ – Andrew Little May 2020
Police searched for suspects in unapproved trial of facial recognition tech, Clearview AI May 2020
Police ‘stocktake’ surveillance tech after Clearview AI facial recognition trial May 2020
Correction: $727,000 Spend On Face Recognition Was Separate To Clearview Initiative May 2020
January 2020
Why you could be fired for refusing to use fingerprint or face scanners January 2020
AI cameras could help catch litterers in New Zealand January 2020
The quiet creep of facial recognition systems into New Zealand life January 2020
December 2019
Police facial recognition: US academic says Kiwis deserve answers December 2019
Worker fired for declining a face scan awarded $23,200 December 2019
September 2019
The man whose state surveillance revelations rocked the world speaks exclusively to the Guardian about his new life and concerns for the future https://www.theguardian.com/us-news/ng-interactive/2019/sep/13/edward-snowden-interview-whistleblowing-russia-ai-permanent-record
Victoria sends identity data to national facial recognition system to stay ‘ahead of the pack’ https://www.themandarin.com.au/116125-victoria-provides-identity-data-to-national-facial-recognition-system-to-keep-its-agencies-ahead-of-the-pack/
August 2019
Police open to using facial recognition from Auckland Transport CCTV cameras August 2019
Auckland Transport’s $4.5m plan could mean 8000 cameras watching the city August 2019
Footnotes
[i] eGate https://www.customs.govt.nz/personal/travel-to-and-from-nz/travelling-to-nz/egate/
[ii] Daon provides biometric onboarding for New Zealand government-backed mobile credential https://www.biometricupdate.com/201808/daon-provides-biometric-onboarding-for-new-zealand-government-backed-mobile-credential
[iii] Police open to using facial recognition from Auckland Transport CCTV cameras https://www.rnz.co.nz/news/national/396716/police-open-to-using-facial-recognition-from-auckland-transport-cctv-cameras
[iv] New Zealand police reviewing proposals for new facial recognition system https://www.biometricupdate.com/201808/new-zealand-police-reviewing-proposals-for-new-facial-recognition-system
[v] Police eyeing up newer, smarter CCTV facial recognition technology https://www.stuff.co.nz/national/crime/103196220/police-eyeing-up-newer-smarter-cctv-facial-recognition-technology
[vi]Revealed: Supermarkets in NZ using facial recognition tech https://www.rnz.co.nz/news/national/357293/revealed-supermarkets-in-nz-using-facial-recognition-tech
[vii] Revealed: how facial recognition has invaded shops – and your privacy https://www.theguardian.com/cities/2016/mar/03/revealed-facial-recognition-software-infiltrating-cities-saks-toronto