India: Use of Facial Recognition Technologies (FRTs) in Policing


By Jibran Khan, Copy Edited by Adam Rizvi, TIO: In recent years, India has increasingly relied on facial recognition technologies (FRTs) in policing, in line with a broader international shift towards the use of artificial intelligence (AI). According to a report by the Carnegie Endowment, at least 56 countries currently use FRTs in policing. As the use of AI grows, so does the risk of intensified policing of historically ostracised communities.


A significant issue with the deployment of such technology is that it is often presented as a progressive way of policing and a means of making the criminal justice system more efficient. In India, however, it was implemented without sufficient legislative discussion or feedback from potentially affected stakeholders. Because these policies exclude the very individuals they affect, they run counter to the democratic ethos on which the law itself rests.

The rapid implementation of such technology deepens fears of a surveillance state, especially among minority communities that have been on the receiving end of extrajudicial police action and police brutality in the past.



[Photo: Delhi police brutality]

AI is not impartial. The technology is trained on data sets shaped by machine learning and past trends in arrests. It is therefore not error-free, and issues such as mischaracterisation based on generalised traits are common. Police forces have historically been known to act in discriminatory ways against socio-economically weak minorities, so the data sets on which current AI systems rely are skewed against individuals of certain ethnicities, geographical regions, genders, and religions. When such technology is operated in regions densely populated by these minorities, using data sets that have not been corrected or reformed, individuals are likely to be singled out, leaving ample space for exploitation.


[Photo caption: Delhi, India. Yesha Renna, 22, a history student at Jamia Millia Islamia, and Ladeeda Sakhaloon, 22, a BA Arabic student, were among those seen being thrashed by policemen. They were part of the protests near NFC.]

The Vidhi Centre for Legal Policy, in a report, highlighted that the highest density of police stations in Delhi was in parts of the city with a high representation of Muslims, a minority community in India. This meant that the greatest amount of policing, and of data collection, was happening in these regions. The problem with this distribution lies in the data sets the technology gathers: because the police acted as biased actors in the past, before reforms and checks were implemented, those data sets are discriminatorily skewed against socio-economically weaker communities.


Since the technology operates without a regulatory framework accessible to all and is not open to public feedback, it also compounds the difficulty of locating responsibility when it errs. Possible solutions include explainable logs within the technology, i.e., a record of why the AI took a certain decision; the use of diverse data sets to mitigate bias; and sufficient checks and balances so that decisions taken by the technology can be questioned and appealed against.


The impending implementation of FRTs in India poses a threat to citizens, particularly in how policing practices, and their efficiency, will evolve in the country. It also raises the question of who will be held accountable if the determination of crime is made by technology rather than by individual people. This stems from a lack of deliberation and discussion in the public domain and the absence of a regulatory framework. It is essential to create checks and balances, especially for a technology with such far-reaching effects on how the lives of citizens will be governed. The legislative push for the same must come from a position that treats the citizen as the supreme stakeholder.


Jibran Khan is a fifth-year law student at Dr. Ram Manohar Lohiya National Law University, Lucknow, and is interested in Policy and Law, Arbitration, and Constitutional Law. This article was first published in Human Rights Pulse on 8 February 2022.

Curated and Compiled by Humra Kidwai



