NCRB's National Automated Facial Recognition System

Background

The NCRB has issued a Request for Proposals (RFP) inviting bids for the creation of a National Automated Facial Recognition System (AFRS), with an estimated budget of INR 308 crore, in order to create a national database of photographs. According to the RFP, this database is intended to enable the swift identification of criminals by gathering existing data from various other databases.

The NCRB first released the RFP calling for bids for the creation of AFRS on 28 June, 2019 (Document Reference: 02/001). The deadline for submission of bids was initially August 16, 2019, but was subsequently extended multiple times by the NCRB for administrative reasons.

On June 22, 2020, the NCRB recalled and cancelled the original RFP [Document Reference: 02/001] that it had issued on July 3, 2019. A revised RFP [Document Reference: 02/001 (Revised)] was issued in its place.

Who will be affected?

The project is being developed at a national level and will include information from central databases.

Scope of the Project

Functional Scope of the Project

One of the most important changes made in the revised RFP is the statement that the project "does not involve the installation of CCTV camera nor will it connect to any existing CCTV camera anywhere". This is a departure from the original RFP, in which CCTV integration was included as a functional requirement.

Functional Requirements

In the functional requirements for AFRS, the revised RFP introduces a requirement that the technology be able to carry out N:N combination searches. The original RFP only required the technology to carry out searches in the 1:1 (verification) and 1:N (identification) modes. The RFP provides no insight into the nature of the searches that will be carried out through N:N combinations. Since N:N combination searches have various definitions in the foundational literature, there is no clarity about the use case for which they will be utilised. Such vagueness of purpose is harmful, since it opens the door to function creep.
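The distinction between these search modes can be sketched abstractly. Everything below (the use of embedding vectors, the threshold value, the function and variable names, and the sample data) is invented for illustration and does not come from the RFP; real systems differ in scale and implementation:

```python
import math

# Hypothetical sketch: face matchers commonly compare fixed-length
# "embedding" vectors of faces. All values here are made up.

THRESHOLD = 0.5  # invented decision threshold

def distance(a, b):
    # Euclidean distance between two embedding vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(probe, claimed):
    # 1:1 (verification): compare the probe against ONE claimed identity
    return distance(probe, claimed) < THRESHOLD

def identify(probe, gallery):
    # 1:N (identification): search the WHOLE gallery for the closest match
    best = min(gallery, key=lambda name: distance(probe, gallery[name]))
    return best if distance(probe, gallery[best]) < THRESHOLD else None

def cross_match(gallery):
    # One common reading of N:N: match every record against every other,
    # e.g. to find duplicates across the entire database
    names = list(gallery)
    return [(a, b) for i, a in enumerate(names) for b in names[i + 1:]
            if distance(gallery[a], gallery[b]) < THRESHOLD]

gallery = {"A": [0.1, 0.2], "B": [0.9, 0.8]}
probe = [0.12, 0.21]

print(verify(probe, gallery["A"]))  # 1:1 check against a single record
print(identify(probe, gallery))     # 1:N search across all records
print(cross_match(gallery))         # N:N, on one possible interpretation
```

The sketch also illustrates why the ambiguity matters: a 1:1 check touches one record, a 1:N search touches every record once, but an N:N cross-match compares every record against every other, a qualitatively broader form of processing.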

Technical Requirements

Integration with existing crime analytics solutions

The revised RFP introduces a new technical requirement that the AFRS platform be able to integrate with existing crime analytics solutions used by the Police, for providing unique attributes on images/visuals. The term "crime analytics solution" is not defined in the RFP and could therefore be interpreted broadly to include CCTV footage, contradicting the RFP's earlier statement that the project will not connect to CCTV cameras. It would thus be useful if a clarification were issued defining "existing crime analytics solutions", with emphasis on their scope.

Integration with existing crime analytics solutions could also mean integration of AFRS with private vendor solutions, which raises questions about data access and sharing. A further question concerns the scope of these existing solutions. Since different states currently use different solutions, will integration mean automatically scaling up an entire solution for use at the national level, or will each solution first be vetted at the national level? Another interpretation could be that existing state-level solutions will only be integrated with AFRS within that particular state.

Dilution of technical requirements

The revised RFP makes no mention of the international standards that the original RFP required compliance with as a technical requirement. The reason for excluding these standards is unclear, and their removal raises the question of why necessary technical standards are being diluted.

Functional architecture

Scene of Crime (SOC) images/videos included as a data source

In the revised RFP, a new data source, "Scene of Crime images/videos", has been introduced as input into the AFRS database. This inclusion is at odds with the RFP's earlier assertion that integration with CCTVs will not take place. Moreover, the deletion of CCTV camera footage as a data source leaves a gap in the functional architecture of AFRS, and the RFP fails to satisfactorily account for its replacement. The RFP also fails to explain how the data and subsequent analysis obtained through AFRS will be presented and utilised in a court of law, i.e., the nature of the evidence obtained from AFRS and its admissibility, given questions about its reliability.

List of databases which would be integrated with AFRS changed

The original RFP provided a list of databases from which data was to be gathered to create the AFRS database. This list has now been removed from the revised RFP and replaced with the term "dynamic police databases". The lack of definitional clarity and the broad scope of this term set up the AFRS for function creep and an open-ended data sharing/mining endeavour, which is untenable in law.

As per the Hon'ble Supreme Court's decision in Justice K.S. Puttaswamy vs Union of India, (2017) 10 SCC 1, any justifiable intrusion by the State into people's right to privacy protected under Article 21 of the Constitution must conform to certain thresholds. These thresholds are:

Legality

Where the intrusion must take place under a defined regime of law, i.e., there must be an anchoring legislation with a clear set of provisions. As pointed out in our previous legal notice as well, there is no anchoring legislation which permits and regulates the use of AFRS. In the absence of such a framework and its attendant safeguards, the first requirement for a lawful restriction on the right to privacy is not met.

Necessity

Which requires that the restriction on people's privacy (in this case, data collection and sharing) be needed in a democratic society to fulfil a legitimate state aim. The RFP states that the need for AFRS arises because it will enable automatic identification and verification through criminal databases, which would help in the investigation of crime and the tracking and detection of criminals.

This characterisation rests on the faulty assumption that facial recognition technology is accurate and would provide speedy and correct results. However, ongoing research in the field shows that no completely accurate facial recognition technology has yet been developed. Use of such inaccurate technology, especially in criminal prosecution, could result in false positives, i.e., the misidentification of innocent individuals as suspects in a crime. AFRS thus fails to meet the requirement of necessity laid down by the Supreme Court in the Puttaswamy judgment.

Proportionality

Where the Government must show, among other things, that the measure being undertaken has a rational nexus with its objective. By collecting images and videos from the scene of a crime, the AFRS contemplates collecting sensitive and intimate personal information of all individuals present, in the absence of any reasonable suspicion. This could cast a presumption of criminality on a broad set of people. In K.S. Puttaswamy v. Union of India, (2019) 1 SCC 1 (the Aadhaar judgment), the Hon'ble Supreme Court held that:

"[u]nder the garb of prevention of money laundering or black money, there cannot be such a sweeping provision which targets every resident of the country as a suspicious person"

While this statement was made in the context of rejecting the mandatory linkage of Aadhaar with bank accounts to counter money laundering, it clearly shows that imposing such a restriction on the entire population, without any evidence of wrongdoing on their part, constitutes a disproportionate response. Similarly, collecting sensitive personal information of all individuals present at the scene of a crime creates a presumption of criminality that is disproportionate to the objective it aims to achieve.

Procedural safeguards

Where there is an appropriate independent institutional mechanism, with in-built procedural safeguards aligned with just, fair and reasonable standards of procedure established by law, to prevent abuse. At present, the RFP provides for no independent institutional mechanism to put procedural safeguards in place for the proposed project. In the absence of any checks and balances, function creep becomes an immediate problem, with AFRS liable to be used for functions beyond its stated purpose. Use of AFRS without safeguards could result in illegal state-sponsored mass surveillance, which would have a chilling effect on fundamental rights such as the right to freedom of expression, freedom of movement and freedom of association guaranteed in the Constitution. Fear of identification and retaliation by the state would deter individuals from exercising their fundamental right to protest, which is included in the freedom of speech and expression.

Conclusion

Use of this technology without legal safeguards in place could lead to harms such as discrimination and exclusion, which will be difficult to undo. Additionally, in the absence of a strong data protection law, use of this technology could easily lead to mass surveillance. A strong data protection law is needed to hold AFRS and the NCRB accountable for the collection, storage and usage of data, including the sharing of data across government agencies and with third parties, and to ensure proportionality in processing the data collected. Proportionality in the processing of personal data means that only personal data which is adequate and relevant for the purposes of the processing is collected and processed.

Published By
Anushka Jain


Associate Counsel (Surveillance & Transparency), IFF
