In 2009, I became extremely concerned with the concept of Unique Identity for various reasons, and I connected with many like-minded, highly educated people who shared those concerns.
On 18th May 2010, I started this blog to capture anything and everything I came across on the topic. This blog, with its million hits, is a testament to my concerns about the loss of privacy, the fear of the ID being misused, and the criminal activity it could lead to.
In 2017 the Supreme Court of India gave its verdict after one of the longest hearings on any issue. I did my bit and appealed to the Supreme Court judges too, through an online petition.
In 2019 the Aadhaar legislation was revised and passed by both houses of the Parliament of India, making it legal. I am no legal eagle, so my opinion carries no weight except with people opposed to the very concept.
In 2019, this blog now just captures, on a daily basis, a list of articles published on anything to do with Aadhaar, as obtained from daily Google searches, and nothing more. I cannot burn the midnight oil any longer.
"In Matters of Conscience, the Law of Majority has no place"- Mahatma Gandhi
Ram Krishnaswamy
Sydney, Australia.

Aadhaar

The UIDAI has taken two successive governments in India and the entire world for a ride. It identifies nothing. It is not unique. The entire UID data has never been verified and audited. The UID cannot be used for governance, financial databases or anything. Its use is the biggest threat to national security since independence. – Anupam Saraph 2018

When I opposed Aadhaar in 2010, I was called a BJP stooge. In 2016 I am still opposing Aadhaar for the same reasons, and I am told I am a Congress diehard. No one wants to see why I oppose Aadhaar, as it is too difficult. Plus, Aadhaar is FREE, so why not get one? – Ram Krishnaswamy

First they ignore you, then they laugh at you, then they fight you, then you win.-Mahatma Gandhi


“The invasion of privacy is of no consequence because privacy is not a fundamental right and has no meaning under Article 21. The right to privacy is not guaranteed under the Constitution, because privacy is not a fundamental right.” Article 21 of the Indian Constitution refers to the right to life and liberty. – Attorney General Mukul Rohatgi

“There is merit in the complaints. You are unwittingly allowing snooping, harassment and commercial exploitation. The information about an individual obtained by the UIDAI while issuing an Aadhaar card shall not be used for any other purpose, save as above, except as may be directed by a court for the purpose of criminal investigation.”-A three judge bench headed by Justice J Chelameswar said in an interim order.

Legal scholar Usha Ramanathan describes UID as an inverse of sunshine laws like the Right to Information. While the RTI makes the state transparent to the citizen, the UID does the inverse: it makes the citizen transparent to the state, she says.

Good idea gone bad
I have written earlier that UID/Aadhaar was a poorly designed, unreliable and expensive solution to the really good idea of providing national identification for over a billion Indians. My petition contends that UID in its current form violates the right to privacy of a citizen, guaranteed under Article 21 of the Constitution. This is because sensitive biometric and demographic information of citizens is with enrolment agencies, registrars and sub-registrars who have no legal liability for any misuse of this data. This petition has opened up the larger discussion on privacy rights for Indians. The current Article 21 interpretation by the Supreme Court was done decades ago, before the advent of the internet and today’s technology and all the new privacy challenges that have arisen as a consequence.

Rajeev Chandrasekhar, MP Rajya Sabha

“What is Aadhaar? There is enormous confusion. That Aadhaar will identify people who are entitled for subsidy. No. Aadhaar doesn’t determine who is eligible and who isn’t,” Jairam Ramesh

But Aadhaar has been mythologised during the previous government by its creators into some technology super force that will transform governance in a miraculous manner. I even read an article recently that compared Aadhaar to some revolution and quoted a 1930s historian, Will Durant. – Rajeev Chandrasekhar, Rajya Sabha MP

“I know you will say that it is not mandatory. But, it is compulsorily mandatorily voluntary,” Jairam Ramesh, Rajya Sabha, April 2017.

August 24, 2017: The nine-judge Constitution Bench rules that the right to privacy is “intrinsic to life and liberty” and is inherently protected under the various fundamental freedoms enshrined under Part III of the Indian Constitution.

"Never doubt that a small group of thoughtful, committed citizens can change the World; indeed it's the only thing that ever has"

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.” -Edward Snowden

In the Supreme Court, Meenakshi Arora, one of the senior counsel in the case, compared it to living under a general, perpetual, nation-wide criminal warrant.

I had never thought of it that way, but living in the Aadhaar universe is like living in a prison. All of us are treated like criminals, with barely any rights or recourse, and gatekeepers have absolute power over you and your life.

Announcing the launch of the #BreakAadhaarChains campaign, culminating with events in multiple cities on 12th Jan. This is the last opportunity to make your voice heard before the Supreme Court hearings start on 17th Jan 2018. In collaboration with @no2uid and @rozi_roti.

UIDAI's security seems to be founded on four time-tested pillars of security idiocy:

1) Denial

2) Issue fiats and point finger

3) Shoot messenger

4) Bury head in sand.

God Save India

Sunday, October 11, 2015

8909 - India: New computer system to reduce online data snooping - Deal Street Asia




October 10, 2015:  

There are no two opinions that Big Data analytics has the potential to reenergise businesses by inspecting mountains of data to provide insights into consumer behaviour. Researchers, too, have been mining massive online datasets for insights that can save lives, improve services and inform our understanding of the world.

On the flip side, though, governments and cyber criminals have access to these very datasets and have the capability to snoop on sensitive public data, especially that which may be generated by surfing the web, interacting with medical devices or from sensors.

Some data may be trivial, but in many cases, data are deeply personal, and can even influence our employers, tarnish our reputation, influence insurance premiums or even the price we pay for a product online.

Almost 15 years back, for instance, Latanya Sweeney—professor of government and technology in residence at Harvard University—led a team that uncovered the identities of patients, including the then Massachusetts governor William Weld, by correlating anonymized data with other publicly available data.

Using public anonymous data from the 1990 census, Sweeney found that 87% of the population in the US (then, 216 million of 248 million), could likely be uniquely identified by their five-digit ZIP code, combined with their gender and date of birth. A similar study was conducted by Philippe Golle of the Palo Alto Research Center.
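Sweeney's finding is easy to reproduce in miniature: count how many records share each (ZIP, gender, date of birth) combination, and any combination held by exactly one person singles that person out. A minimal sketch, using hypothetical toy records rather than real census data:

```python
from collections import Counter

# Hypothetical toy records: (zip_code, gender, date_of_birth).
records = [
    ("02138", "F", "1945-07-31"),
    ("02138", "M", "1945-07-31"),
    ("02139", "F", "1962-01-15"),
    ("02139", "F", "1962-01-15"),  # two people share this combination
    ("02141", "M", "1980-11-02"),
]

# Each combination shared by exactly one record is a unique identifier.
counts = Counter(records)
unique = sum(1 for r in records if counts[r] == 1)
print(f"{unique} of {len(records)} records are pinned down uniquely "
      "by (ZIP, gender, DOB)")  # 3 of 5 here; 87% in Sweeney's US data
```

Linking such a "unique" combination against a second dataset that carries names (such as a voter roll) is exactly how the anonymized medical records were re-identified.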

In 2006, graduate student Arvind Narayanan and professor Vitaly Shmatikov—both from the Department of Computer Sciences at the University of Texas at Austin—claimed to have identified individuals among the nearly half a million anonymized users whose movie ratings were released by online rental company Netflix that year. The company published the large database as part of its $1 million Netflix Prize.

In their 2008 paper, they refined their de-anonymization methodology on the Netflix Prize dataset, which contained anonymous movie ratings of 500,000 Netflix subscribers, to “demonstrate that an adversary who knows only a little bit about an individual subscriber can easily identify this subscriber’s record in the dataset”.
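At its core this is a linkage attack: score every anonymized record against the handful of (movie, rating) pairs the adversary already knows about a target, and pick the best match. A toy sketch with hypothetical data (the real attack also weights rare movies more heavily and tolerates approximate dates and ratings):

```python
# Anonymized ratings database: user id -> {movie: rating}. Hypothetical data.
anon_records = {
    "user_a": {"Movie1": 5, "Movie2": 1, "Movie3": 4},
    "user_b": {"Movie1": 5, "Movie4": 2},
    "user_c": {"Movie2": 1, "Movie5": 3},
}

# What the adversary knows about the target, e.g. from public reviews.
known_about_target = {"Movie1": 5, "Movie3": 4}

def match_score(record, known):
    """Count how many of the adversary's known ratings a record matches."""
    return sum(1 for movie, rating in known.items()
               if record.get(movie) == rating)

best = max(anon_records,
           key=lambda u: match_score(anon_records[u], known_about_target))
print(best)  # user_a — the only record matching both known ratings
```

With a sparse, high-dimensional dataset like movie ratings, even two or three known ratings are often enough to make one record stand out, which is the paper's central point.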

Such revelations are compounded by the fact that many prominent websites, social networking sites, and even governments have admitted to compromising the personal information, and thus the privacy, of individuals. Telecoms the world over are known to have colluded with governments to part with sensitive public data, ostensibly culled for security reasons.

In India, which is yet to get a law to protect privacy, the Supreme Court is yet to take a final decision on the plea of the government and some regulatory bodies that are seeking the setting up of a larger bench to modify an earlier order that restricts the voluntary use of the Aadhaar card to the public distribution system (PDS) and LPG schemes. The use of the Unique Identification Authority of India’s (UIDAI) Aadhaar number beyond the PDS and LPG schemes has been challenged in court over privacy concerns, since it uses biometric data such as fingerprints and iris scans.

To address such privacy concerns, researchers like Salil Vadhan—a professor of computer science at Harvard University and former director of the Center of Research on Computation and Society—are exploring an approach known as ‘Differential Privacy’ that allows one to investigate data without revealing confidential information about participants.

Differential privacy was initially introduced by Cynthia Dwork, Frank McSherry, Kobbi Nissim and Adam Smith, among others, in the mid-2000s, and researchers continue to develop the concept today and apply it to real-world problems.

As the lead researcher for the National Science Foundation (NSF) supported “Privacy Tools for Sharing Research Data,” Vadhan and his team at Harvard, according to a 7 October press statement, are developing a new computer system that acts as a trusted curator—and identity protector—of sensitive, valuable data.

The Sloan Foundation and Google, Inc. are providing the project with additional support.

Researchers ask the virtual curator questions based on the data. For instance, “What percentage of individuals who have Type B blood are also HIV positive?” The computer returns an answer that is approximately accurate, but that includes just enough “noise” that no matter how hard someone tries, they cannot find out anything specific to any individual in the database.

“Even if an adversary tries to target an individual in the dataset, the adversary should not be able to tell the difference between the world as it is and one where that individual’s data is entirely removed from the dataset,” Vadhan said. “Randomization turns out to be very powerful.”
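The classic construction behind such a curator is the Laplace mechanism from the Dwork–McSherry–Nissim–Smith line of work: a counting query changes by at most 1 when any one person is added to or removed from the data, so adding Laplace noise of scale 1/ε to the true count yields ε-differential privacy. A minimal sketch with hypothetical records (the article does not name the mechanism Vadhan's system uses):

```python
import math
import random

def laplace_noise(scale):
    """Sample Laplace(0, scale) by inverting its CDF at a uniform draw."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def noisy_count(records, predicate, epsilon):
    """Answer a counting query with epsilon-differential privacy.

    One person joining or leaving changes a count by at most 1 (the
    query's sensitivity), so Laplace noise of scale 1/epsilon hides
    any single record's presence or absence.
    """
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical records: (blood_type, hiv_positive). The true answer is 2.
people = [("B", True), ("B", False), ("A", False), ("B", True), ("O", False)]
answer = noisy_count(people, lambda r: r[0] == "B" and r[1], epsilon=0.5)
print(round(answer, 2))  # roughly 2, give or take the noise
```

Each individual answer is only approximately right, but averaged over many hypothetical queries the noise cancels out, which is why the approach is useful to researchers while still hiding any one person.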

If the system is implemented naively, the level of privacy degrades with multiple queries, so one could keep asking questions until identifying people in the database becomes possible. However, by judiciously increasing the amount of noise and carefully correlating it across queries, the system can maintain privacy protection even in the face of a very large number of questions, notes Vadhan.
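One common way a curator enforces this is a privacy budget: under basic sequential composition the ε values of successive queries simply add up, so the curator allots a total ε and refuses further queries once it is spent, rather than letting privacy erode indefinitely. A hypothetical sketch (real systems such as Vadhan's use tighter composition accounting than plain addition):

```python
class PrivacyBudget:
    """Curator-side ledger for basic sequential composition,
    where the epsilons of answered queries simply add up."""

    def __init__(self, total_epsilon):
        self.remaining = total_epsilon

    def spend(self, epsilon):
        # Small tolerance so floating-point dust cannot eat a query.
        if epsilon > self.remaining + 1e-9:
            raise RuntimeError("privacy budget exhausted")
        self.remaining -= epsilon

budget = PrivacyBudget(total_epsilon=1.0)
for i in range(12):
    try:
        budget.spend(0.1)  # each query costs epsilon = 0.1
        print(f"query {i}: answered")
    except RuntimeError:
        print(f"query {i}: refused")  # queries 10 and 11 are refused
```

Once the budget is exhausted, no further questions are answered against that dataset: this is the "carefully correlating noise across queries" trade-off made concrete in its simplest form.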

Differential privacy has become a hot topic in recent years. A 2015 Science magazine article referred to differential privacy as one of the most promising technical solutions for protecting the data of students enrolled in Massive Open Online Courses (MOOCs). Projects including OnTheMap, used for US census data, and RAPPOR, a new product from Google, apply forms of differential privacy for data sharing.

Harvard’s Institute for Quantitative Social Science, according to Vadhan, is planning to use differential privacy techniques to enable more researchers to share, retain control of, and get credit for their data contributions as part of the Dataverse Network – a project that guarantees the long-term preservation of critical datasets.

Dataverse is the largest public general-purpose research data repository in the world. However, the scientific community could gain access to far more datasets that are currently not publicly available if differential privacy’s promise is fulfilled, according to Gary King, Albert J. Weatherhead III University Professor at Harvard University and Director of the Institute for Quantitative Social Science.

“That’s why we’re so thrilled to be working on this project,” King said in the press statement. “The social sciences are finally getting to the point in human history where we have enough information to move from studying problems to actually solving them. As we make progress on the privacy problem, we will be able to unlock more and more of the potential of this new information.”

The differential privacy tool Vadhan and his team are developing will allow the inclusion of datasets that were previously withheld because the information was too sensitive and privacy was uncertain.

Currently, Dataverse is not equipped to handle datasets with privacy concerns associated with them, according to Vadhan.
Differential privacy also doesn’t work for every type of research question. Vadhan pointed to regression, machine learning, and social network analysis as areas with very promising theoretical results, but where challenges remain in making differential privacy work well in practice.

This article was first published on Livemint.com