While data privacy is a contentious issue, face recognition programmes offer an opportunity to trace missing children. The tricky part is to guard against the leak of data related to ethnic profiling.
By Sujit Bhar
The Ministry of Electronics and Information Technology, Government of India, held a virtual global summit on Artificial Intelligence (AI) from October 5 to 9. The event, held in collaboration with NITI Aayog, was called RAISE 2020 (Responsible AI for Social Empowerment).
The government is going all out to encourage development and the incorporation of AI in various aspects of research and development in the country. It is also aware of the perils of the incredible quantity of raw data that can be processed by algorithms employed by the AI software.
Union Minister Ravi Shankar Prasad said in an article he wrote in Hindustan Times just before the event: “Data resources are going to play a vital role in AI’s development, but concerns regarding the misuse of data and breach of privacy of users must be addressed by AI systems.” To this end, the minister had introduced the Personal Data Protection Bill in the Lok Sabha on December 11, 2019. It had been referred to a Parliamentary Standing Committee the same day.
The objective of RAISE 2020 was explained by Prasad, who holds the law and justice, electronics and information technology, and communications portfolios. In that article he made two key observations. The first was about data privacy; the second was about the very objective of the algorithms applied, and that they should be free of any biases and prejudices.
While data privacy is a rather contentious issue worldwide, face recognition programmes, enhanced by algorithm-based AI, have been in the thick of debate ever since the technology came into the picture.
It is a programme of evil, say naysayers; an opportunity for doing good, say others. The divide has been growing. Wrote Prasad: “Algorithms that define the set of rules to operate AI systems must be free of any biases and prejudices. For example, face recognition systems must not display any racial or ethnic biases and news and social media systems must not be biased towards any particular political ideology.”
This statement has to be treated with care. On the face of it, it is benign, almost benevolent, except for the last part, where he says “social media systems must not be biased towards any particular political ideology.”
The obvious reference was to Cambridge Analytica and its involvement in trying to rig polls. But that is a different story. The idea of excluding “any racial or ethnic biases”, while it looks good on the face of it, can actually be the wrong approach.
The biggest use of such mass face recognition programmes in India would be to trace missing children. India faces a massive shortage of trained police and other investigative personnel, and most are, in any case, tied up in politically motivated cases, leaving little scope or time for more serious social problems; the use of such technology could therefore be highly beneficial. The police in India have only just started using it at railway and bus stations, busy marketplaces and other such locations.
However, if the algorithm that drives the AI system refuses to identify faces by race or ethnicity, there would be little hope of getting anywhere near the true identification of a child’s face and then matching it with a missing person’s record. There will not be much to trace the face back to its origin.
An NCRB study of 2018 says: “Hundreds of children go missing every day in the country. During the year 2016, a total of 63,407 children, during 2017, 63,349 children, and during 2018 a total of 67,134 children have been reported as missing.”
Those are large numbers. There is district-wise data on missing children. The reasons for such instances are many. The children could have been kidnapped and put into begging or held for ransom; they could be trafficked to other states or countries and forced into prostitution or camel-jockey work; or it could even be a case of revenge. Let’s face it: child trafficking is an issue India understands and has tried to address, but in which it has failed miserably.
This is where face recognition software comes in. This is a very big necessity and has to be balanced with the overarching fear of loss of privacy of the individual, as well as of ethnic and racial profiling possibilities. There can be no one-size-fits-all solution to this complex problem. The minister has addressed one side of the problem, and this is a good, socially mature move. But how does one address the issue of our nation’s children?
China has been at the frontline in the use of face recognition technologies that employ AI. Technologies are so advanced today that it takes very little time to spot, identify and match a single face even in a large crowd. This is the type of software that could come in handy for the police and other investigating agencies in spotting and identifying missing children, matching them in the blink of an eye against existing databases.
Of course, it is another matter that there are no existing proper databases in India that can be accessed by any form of online image-searching software from remote locations. That lacuna has to be addressed separately.
In the “grounds for processing personal data” section of the Personal Data Protection Bill, there is a provision by which data fiduciaries can bypass the consent restriction for data access. It says: “…in certain circumstances, personal data can be processed without consent. These include: (i) if required by the State for providing benefits to the individual, (ii) legal proceedings, (iii) to respond to a medical emergency.”
The first and the second would create operational possibilities for accessing children’s data from other sources, such as Aadhaar and similar online databases, without the consent of parents. Technically, this is a sound proposition, but Aadhaar has, from its inception, been riddled with erroneous and unreliable data. Moreover, with the process being cumbersome, most people do not bother to update their photographs and other details in the Aadhaar database. Without parental intervention such changes would anyway not be possible, and for the entire BPL section of the country, this is a luxury they cannot afford. The primary objective is to have the fingerprints recorded correctly, so that government aid can be accessed.
Yet this piece of biometric data, however well recorded, will not help in the recognition of a face in a large crowd. For that an exact photograph would be essential. This is possibly the least enriched part of an Aadhaar card. Hence, most Aadhaar cards carry old data.
Under these circumstances, the face recognition software would probably be at a loss to understand and negotiate such old data. In the overall data-sorting process, ethnic facial features have an important role. Every feature of a face and its bone structure is categorised, and recognition is made on the basis of those categories.
If new AI is designed to ignore such facial features and attendant data, the problems could multiply manifold, yielding wrong or misleading results. If time is lost in identifying the face of a child, the chances of finding him or her become slim. Therefore, it would be necessary to build into the algorithm a system able to identify such features while not applying any bias in the profiling. It is a sticky point that needs to be addressed.
Facial recognition software has recently been used by the police in India to track missing children, with some benefit. But it has already had its acid reflux moment. A report in Al Jazeera in December last year described how the police used the same facial recognition software to screen crowds at a political rally on December 22, the first time this was done in the country. Mass surveillance is possible through such software, and the bigger, nobler objective of finding missing children is quickly sacrificed.
The software referred to in the news item was the Automated Facial Recognition System (AFRS). How does it work? According to Norton, the computer anti-virus maker, “facial recognition software reads the geometry of your face. Key factors include the distance between your eyes and the distance from forehead to chin. The software identifies facial landmarks (one system identifies 68 of them) that are key to distinguishing your face. The result: your facial signature.” Thereafter, “Your facial signature (a mathematical formula) is compared to a database of known faces.”
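The matching step Norton describes can be sketched in code. The following is a minimal, entirely hypothetical Python illustration in which a “facial signature” is reduced to a short vector of numeric measurements (real systems derive dozens of landmark points and far richer representations), and matching is a nearest-neighbour search against a database of known records. The record names, signature values and threshold below are all invented for illustration.

```python
import math

def euclidean_distance(sig_a, sig_b):
    """Distance between two facial signatures; smaller means more similar."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(sig_a, sig_b)))

def match_face(query_signature, database, threshold=0.5):
    """Return the closest record in the database, or None if no candidate
    falls within the similarity threshold (i.e., no confident match)."""
    best_name, best_dist = None, float("inf")
    for name, signature in database.items():
        dist = euclidean_distance(query_signature, signature)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# Toy database of known signatures (made-up measurements such as
# eye spacing and forehead-to-chin length, normalised to face size).
known_faces = {
    "child_record_001": [0.62, 1.10, 0.45, 0.88],
    "child_record_002": [0.70, 1.25, 0.40, 0.92],
}

# Signature extracted from a face spotted in a crowd image.
query = [0.63, 1.12, 0.44, 0.87]
print(match_face(query, known_faces))  # prints "child_record_001"
```

The threshold matters: too loose and the system produces false matches (the racial-bias problem the minister warns about often shows up here, as error rates that differ across groups); too strict and a genuine match to a missing child’s record is missed.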
Hence every ethnic feature is important in narrowing down the search for a missing child. This could be his/her sole thread of survival. As the minister himself agrees, there can be no excuse for misuse of this critical progress of science.
—The writer is a senior journalist
Lead Picture: devteam.space