The ongoing protests and the response to them have led to a focus on face recognition. In multiple places, the police have photographed crowds using drones and then used facial-recognition software to identify individuals, including many minors. This raises concerns about all-pervasive state surveillance.
India may soon have one of the largest facial-recognition databases in the world. In June, the National Crime Records Bureau (NCRB) issued a tender for technical help to build a database that would let the police match “persons of interest” with pictures drawn from sources such as newspapers, social media and TV grabs.
The system should allow the police to carry out “fast and accurate face recognition” in a “live environment”, according to the tender. The NCRB says this would help to identify missing persons. The face-recognition database would not, according to the NCRB, be integrated with Aadhaar, which includes photographs and other biometric data.
The NCRB also said, in a written response to the Internet Freedom Foundation, that the system would not use images from CCTV in public places unless a crime was committed. The data would be hosted on a centralised application “made available for access to only police”.
The protests are likely to accelerate the adoption of such systems. It is anyone’s guess whether the NCRB will actually abide by its commitment not to link the system to other databases like Aadhaar. In addition to that database, CCTV cameras are up and running in both public spaces and closed environments. Your image is probably captured several times on a normal day, by multiple cameras, with the footage stored and processed by multiple organisations. The police already use CCTV images to levy traffic fines.
This ambitious surveillance programme would be considered overreach in most democratic countries. It involves the collection and processing of private personal data without consent. However, India’s proposed Personal Data Protection Bill, which has not yet been passed by Parliament, does not contain specific protections against this. It proposes to give all government agencies and organisations an open licence to collect and process data without consent.
The original draft contained a provision that government agencies should collect data without consent only when it was necessary and proportionate. That provision was deleted from the draft circulated in the Lok Sabha in early December. As a result, there are currently no checks and balances against state surveillance, and if the Bill is passed as is, there will continue to be none.
Digital photographs of people are among the easiest things to collect without consent or knowledge. In the European Union, where such images are considered private personal data, the “right to be forgotten” clause of the General Data Protection Regulation can be invoked to ask for their erasure. It is not clear whether such a request could be made under the proposed Personal Data Protection Bill, 2019, although it too contains a right-to-be-forgotten clause.
Face-recognition technology comes in many types, for many purposes. There are one-to-one matching systems, such as those on laptops or mobile devices, which store the digital picture of one individual and match the face of the user at login. There is one-versus-few matching, such as against a small database of employees in an organisation. In both these cases there is consent, which can be withdrawn.
One-to-many matching occurs when the police match a picture of a person of interest (or a corpse) against a large database. There are also many-to-many systems, in which the police take pictures of large crowds and match every face against other large databases. There may be neither consent nor knowledge in either of these cases. The sketch below illustrates the difference between these matching modes.
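The distinction matters in practice. As a rough illustration only, the Python sketch below compares a one-to-one verification check with a one-to-many database search over stand-in face embeddings; the cosine-similarity measure, the 0.8 threshold and the random vectors are assumptions chosen for clarity, not details of any deployed system.

```python
import numpy as np

def cosine_similarity(a, b):
    """Similarity between two face embeddings, in the range [-1, 1]."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe, enrolled, threshold=0.8):
    """One-to-one matching: is the probe the same person as the single
    enrolled template (e.g. a phone unlocking for its owner)?"""
    return cosine_similarity(probe, enrolled) >= threshold

def identify(probe, database, threshold=0.8):
    """One-to-many matching: search an entire database for the best match
    above the threshold, as a 'person of interest' search would."""
    best_name, best_score = None, threshold
    for name, template in database.items():
        score = cosine_similarity(probe, template)
        if score >= best_score:
            best_name, best_score = name, score
    return best_name

# Illustrative use with random stand-in embeddings (a real system would
# produce these vectors with a trained face-recognition model).
rng = np.random.default_rng(0)
db = {f"person_{i}": rng.normal(size=128) for i in range(1000)}
probe = db["person_42"] + rng.normal(scale=0.05, size=128)  # a noisy re-capture
print(verify(probe, db["person_42"]))  # 1:1 check against one known face
print(identify(probe, db))             # 1:N search across the whole database
```

A many-to-many system simply repeats the one-to-many search for every face captured in a crowd, which is why the scale of the database and the error rate per comparison matter so much.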
Fooling modern facial-recognition programs isn’t easy, though they do throw up both false positives and false negatives. An air-filter mask may not fool a modern face-recognition program. Make-up streaked in odd lines can work, but it is conspicuous. A scarf, earrings, or anything worn close to the face that carries images resembling a face may confuse the software. Special baseball caps with LED or infrared lights can work while remaining invisible to the human eye. So can privacy visors, which look like normal sunglasses but reflect light back in odd ways.
Face-recognition technology and its use in mass surveillance by sundry governments is deeply contentious. It has been banned in several jurisdictions. It also has technical flaws: false positives lead to innocent people being harassed, as the rough arithmetic below suggests. Courts and judges may not be sufficiently techno-savvy to understand these limitations, or the scope for framing and digital forgery if the technology is used in police work. However, it is now widely prevalent in India, and going by the data protection Bill, it will remain legal and all-pervasive.
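To make the false-positive concern concrete, here is a back-of-the-envelope calculation. The false-match rate, watchlist size and crowd size are made-up figures for illustration, not measurements of any actual system.

```python
# Back-of-the-envelope arithmetic on false positives at scale.
# All figures below are illustrative assumptions, not measured values.
false_match_rate = 1e-6   # chance one comparison wrongly matches an innocent face
watchlist_size = 10_000   # entries each captured face is compared against
crowd_size = 50_000       # faces captured at a large gathering

# Probability that a single innocent face matches at least one watchlist entry.
p_innocent_flagged = 1 - (1 - false_match_rate) ** watchlist_size
expected_false_hits = crowd_size * p_innocent_flagged

print(f"Chance an innocent person is flagged: {p_innocent_flagged:.2%}")
print(f"Expected wrongly flagged people in the crowd: {expected_false_hits:.0f}")
```

Even with an assumed error rate of one in a million per comparison, this works out to roughly a one percent chance that any given innocent person is flagged, or several hundred wrongly flagged people in a single large crowd.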