🔬 Research summary by Ameen Jauhar and Jai Vipra.
Jai Vipra works at the Centre for Applied Law and Technology Research, Vidhi Centre for Legal Policy, with a focus on digitalisation and development in the Global South. Ameen Jauhar is a senior resident fellow leading the Centre for Applied Law & Technology Research. His work focuses on law and policy issues around data governance, internet and cyber regulation, and AI ethics in India.
Overview: The Centre for Applied Law & Technology Research (ALTR), which is part of the Vidhi Centre for Legal Policy, recently published the third and final working paper in a three-part series. The objective behind this working paper series is to discuss how facial recognition technology (FRT) is being deployed at the state level by local law enforcement. India joins a growing number of countries across the globe aiming to integrate emerging technologies into state surveillance apparatuses, amid increasing concerns around their legal, ethical, and social ramifications.
Our working papers covered three distinct themes:
- Working Paper 1 discusses the risks posed by FRT in the Indian context, drawing on global literature covering these issues in different jurisdictions. Specifically, the paper identifies three broad categories of risks associated with the use of FRT in law enforcement, namely, legal and constitutional issues, ethical risks, and problems for the criminal justice system.
- Working Paper 2 was an empirical case study of how FRT is likely to affect citizens in the Indian capital of Delhi. Through a spatial analysis of existing CCTV networks, the paper examined how FRT integrated into these systems would place vulnerable populations of the city, especially Muslim-inhabited areas, at a heightened risk of surveillance.
- Working Paper 3 titled Procurement of Facial Recognition Technology for Law Enforcement in India: Legal and Social Implications of the Private Sector’s Involvement deals with a relatively underexplored issue in the larger ethical discourse around the use of FRT. It explores how state agencies, particularly law enforcement bodies, procure such technology from private corporations. The role of the private sector in the Indian context is quite opaque and raises several red flags which have been examined in this final paper of the series.
This post aims to provide a concise overview of the papers, and concludes with some thoughts on how the conversation around the use of FRT in India can be developed from a regulatory perspective.
Working Paper 1: Indian law enforcement’s ongoing use of automated facial recognition technology – Ethical risks and legal challenges
The rapid growth of the Indian AI industry has also triggered concerns around its risks. Arguably, FRT as a surveillance technology has become the most visible challenge to India's commitment to adopting responsible and safe AI. Both internationally and within India, the rapid deployment of FRT by law enforcement agencies, for ostensible surveillance and security purposes, has raised concerns. These can broadly be categorised into (a) legal and constitutional issues; (b) ethical risks; and (c) problems for the criminal justice system in India.
Legal and constitutional risks: Foremost among these is the risk that FRT poses to an individual's privacy. The right to privacy has been read into the Indian Constitution as a fundamental right by the Indian Supreme Court. It entitles a person to autonomy over whether her data is collected, and how it is processed and shared. However, the 1:many form of live facial recognition, which is typically deployed in surveillance and monitoring formats, undermines such informational autonomy in most systems. Stemming from this is the potential violation of due process. With local police enjoying unbridled (and often arbitrary) authority to use this technology in the absence of any legal safeguards, executive overreach is likely, with local police targeting individuals or certain strata of society and eroding the procedural due process that the Indian Constitution also guarantees in any criminal investigation or trial. All in all, such unregulated application can have a chilling effect on the freedom of speech and expression. There have already been reports of police forces in Delhi using sophisticated FRT to surveil and target protestors. This represents considerable function creep, given that these technologies were originally acquired for the limited (and self-claimed) purpose of locating missing children in the national capital.
Ethical risks around FRT: FRT also poses certain ethical risks like biased and inaccurate results, as well as lack of transparency, both in the underlying algorithms and the process of procuring such technologies by law enforcement.
With respect to design flaws, the paper discusses how limitations in training datasets have often been a significant cause of biased or inaccurate outcomes. In the Indian context specifically, the absence of local, Indian datasets is likely to result in false positives or false negatives. Even within India, the physical features of individual faces vary starkly from region to region, and any effort to create pan-Indian or multi-ethnic FRT systems is likely to run into serious logistical difficulties.
The other major concern is the lack of transparency, both in terms of the algorithm and on the administrative side. The lack of design transparency is a general critique of several use cases of AI, and poses a high risk when human liberties may be compromised (as with the use of FRT in criminal justice). What makes the Indian experiment with FRT even riskier is the complete lack of transparency around how the systems have been designed and deployed. For the working paper, we solicited official responses from different states regarding their respective FRT applications. Mostly, we were informed that there were no official documents recording this activity, nor formal bidding processes for procurement. We were also not given any information about police manuals or regulations governing the use of FRT by local police forces.
Problems with the use of FRT in criminal justice: An issue that is likely to affect the use of FRT by Indian law enforcement agencies is the lack of legal frameworks to integrate it into the criminal justice system. The use of FRT will struggle to make any real improvement in policing because the Indian Evidence Act, 1872 does not contemplate the use of such technologies to prosecute individuals. Our paper delves into why the absence of such legal recognition is likely to impede any successful prosecutions attributable to the use of FRT. In its absence, there is a legitimate concern: to what end is this surveillance system being deployed?
Working Paper 2: The use of Facial Recognition Technology for policing in Delhi
This paper is an attempt to empirically examine the potential discrimination that can occur due to the uneven distribution of police resources that are fortified with FRT. It found that Muslims were more likely to be targeted with the use of this technology in Delhi.
Premises: The paper uses existing literature to establish the following premises:
- FRT has a non-zero error rate, which can lead to innocent people being identified as suspects;
- In India and in Delhi specifically, there is a policing bias against Muslims;
- The distribution of police stations is an indicator of the distribution of policing resources in the city of Delhi; and
- The addition of FRT to policing in Delhi would exacerbate existing biases.
Police stations: First, we mapped police stations and their jurisdictions for Delhi. On top of this, we mapped ward-wise populations, which allowed us to see the population per police-station jurisdiction. Areas with a lower population per jurisdiction were relatively over-policed, while areas with a higher population per jurisdiction were relatively under-policed. After excluding areas with low civilian populations, we examined the demographic characteristics of the ward-wise population of the most over-policed quintile. We found that the over-policed quintile had an over-representation of Muslims compared to the rest of the city: nearly half of the remaining over-policed areas are estimated to have a significant Muslim presence (defined as a share greater than Delhi's average share of Muslim population, 12.86%).
Thus our primary conclusion from this data was that the relatively over-policed parts of Delhi had a significant Muslim population. It is reasonable to assume that a population that is subjected to more policing (and more FRT), will be subjected to more FRT errors in absolute numbers than a relatively under-policed population. Combined with policing biases against Muslims, the use of FRT in Delhi would likely lead to worse outcomes for Muslims in Delhi.
Note that the data did not prove that Muslim areas in Delhi were deliberately over-policed, nor that all Muslim areas in Delhi were relatively over-policed.
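The quintile analysis described above can be sketched in a few lines of code. This is a minimal illustration only: the ward names, populations, and Muslim-population shares below are hypothetical, whereas the working paper used real ward-level census data and actual police-station jurisdictions.

```python
# Illustrative sketch of the over-policing analysis.
# All ward names and figures are hypothetical.

DELHI_AVG_MUSLIM_SHARE = 12.86  # Delhi's city-wide Muslim population share (%)

# (ward, civilian population in the jurisdiction, Muslim share in %)
jurisdictions = [
    ("Ward A", 40_000, 30.0),
    ("Ward B", 250_000, 8.0),
    ("Ward C", 55_000, 20.0),
    ("Ward D", 300_000, 11.0),
    ("Ward E", 70_000, 5.0),
]

# A lower population per police-station jurisdiction means the area is
# relatively over-policed, so rank jurisdictions by population, ascending.
ranked = sorted(jurisdictions, key=lambda j: j[1])

# Take the most over-policed quintile (bottom 20% by population).
quintile_size = max(1, len(ranked) // 5)
over_policed = ranked[:quintile_size]

# Flag jurisdictions with a "significant Muslim presence", defined in the
# paper as a share above Delhi's city-wide average.
significant = [j for j in over_policed if j[2] > DELHI_AVG_MUSLIM_SHARE]

print("Over-policed:", [j[0] for j in over_policed])
print("With significant Muslim presence:", [j[0] for j in significant])
```

In this toy dataset the single most over-policed jurisdiction also exceeds the city-wide Muslim share, mirroring the paper's finding that over-policed areas disproportionately have a significant Muslim presence.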
CCTV cameras: We also attempted to map the presence of CCTV cameras across the city to identify any patterns in the distribution. However, the data we received from police agencies was inadequate and we could only discern that the distribution of CCTVs was not uniform. We were not able to say whether the uneven distribution would affect a particular group of people more than others.
Recommendations: Such disproportionate impacts would pose serious challenges to the right to equality, and we recommended that the use of FRT by the police in Delhi be halted until the questions of equality and over-policing are examined by the public and public representatives.
Further analysis along these lines would ideally examine policing biases on the basis of caste, class, sex work and homelessness, and understand if FRT would exacerbate these biases.
Working Paper 3: Procurement of FRT for law enforcement in India – Legal and social implications of the private sector’s involvement
This final paper examines how this technology is procured by public law enforcement. The role that the private sector plays in designing and developing FRT for law enforcement agencies has continued to be under examined, despite its significant impacts for society and the rule of law. There are four broad concerns which emanate from the clandestine manner in which private corporations have been involved in the design and procurement processes. These are privacy risk, delegation of state surveillance functions, opaque processes and decision making, and private profiteering steering critical policy.
Privacy risks: A crucial element of the right to privacy guaranteed by the Indian Constitution is informational autonomy. As discussed in Working Paper 1, FRT depends on large datasets to train and develop the underlying algorithms. However, do such datasets use sensitive biometric information (facial maps) with the consent of the data principal? And, in the absence of a formal data protection law, what protects such data from potential breaches or abusive use? The implications of involving the private sector in designing state surveillance technology (like FRT) are dangerous in their capacity to undermine individual privacy, and need to be adequately addressed.
Delegation of state surveillance functions: A second serious risk with the non-transparent role of the private sector is whether state surveillance functions are being delegated to the private sector. Surveillance is an exclusive function of the state, is subject to exceptional use, and is non-transferable. However, when private corporations design and develop FRT for law enforcement agencies in different Indian states, it is unclear whether their role ends once procurement is complete, or whether they continue to aid in the operationalisation of such tools. The latter would certainly amount to a delegation of the aforementioned state function. It further risks leaving a private entity in practical control of large-scale surveillance apparatuses, which is anathema to a democratic society.
Opaque process: The third issue with the involvement of the private sector is that this involvement is completely opaque. Usually, public procurement processes are at least nominally transparent and heavily scrutinised. In the case of FRT, the entire process is outside public scrutiny. The public is not aware of the procurement conditions, bidders, checks and balances, terms of contracts, safety procedures, and so on. Even crucial information, such as accuracy rates and the list of officers authorised to use the technology, is missing.
Profiteering motives: The fourth issue is the fact that with private provision of surveillance technology, there is a real risk of private incentives directing public policy. Ideally, any surveillance function of the state should be driven by public priorities and wishes; the involvement of the private sector, particularly under opaque conditions, means that the profit motive can drive surveillance policy. Venture capital funded FRT companies can and do discount the price of this technology and push its use in law enforcement contexts, without any public input or scrutiny.
Ameen Jauhar and Jai Vipra are co-authors of the working papers published by the Centre for Applied Law & Technology Research (ALTR). Ameen heads ALTR, and Jai is a Senior Resident Fellow with the team.