In 2009, I became extremely concerned with the concept of Unique Identity for various reasons. I connected with many like-minded, highly educated people who shared these concerns.
On 18th May 2010, I started this blog to capture anything and everything I came across on the topic. This blog, with its million hits, is a testament to my concerns about loss of privacy, fear of the ID being misused, and the possible criminal activity it could lead to.
In 2017 the Supreme Court of India gave its verdict after one of the longest hearings on any issue. I did my bit and appealed to the Supreme Court judges too, through an online petition.
In 2019 the Aadhaar legislation was revised and passed by both houses of the Parliament of India, making it legal. I am no legal eagle, so my opinion carries no weight except with people opposed to the very concept.
In 2019, this blog now just captures, on a daily basis, a list of articles published on anything to do with Aadhaar, as obtained from daily Google searches, and nothing more. I cannot burn the midnight oil any longer.
"In Matters of Conscience, the Law of Majority has no place"- Mahatma Gandhi
Ram Krishnaswamy
Sydney, Australia.

Aadhaar

The UIDAI has taken two successive governments in India and the entire world for a ride. It identifies nothing. It is not unique. The entire UID data has never been verified and audited. The UID cannot be used for governance, financial databases or anything. Its use is the biggest threat to national security since independence. – Anupam Saraph, 2018

When I opposed Aadhaar in 2010, I was called a BJP stooge. In 2016 I am still opposing Aadhaar for the same reasons, and I am told I am a Congress die-hard. No one wants to see why I oppose Aadhaar, as it is too difficult. Plus, Aadhaar is free, so why not get one? – Ram Krishnaswamy

First they ignore you, then they laugh at you, then they fight you, then you win.-Mahatma Gandhi


“The invasion of privacy is of no consequence because privacy is not a fundamental right and has no meaning under Article 21. The right to privacy is not guaranteed under the constitution, because privacy is not a fundamental right.” Article 21 of the Indian constitution refers to the right to life and liberty. – Attorney General Mukul Rohatgi

“There is merit in the complaints. You are unwittingly allowing snooping, harassment and commercial exploitation. The information about an individual obtained by the UIDAI while issuing an Aadhaar card shall not be used for any other purpose, save as above, except as may be directed by a court for the purpose of criminal investigation.”-A three judge bench headed by Justice J Chelameswar said in an interim order.

Legal scholar Usha Ramanathan describes UID as an inverse of sunshine laws like the Right to Information. While the RTI makes the state transparent to the citizen, the UID does the inverse: it makes the citizen transparent to the state, she says.

Good idea gone bad
I have written earlier that UID/Aadhaar was a poorly designed, unreliable and expensive solution to the really good idea of providing national identification for over a billion Indians. My petition contends that UID in its current form violates the right to privacy of a citizen, guaranteed under Article 21 of the Constitution. This is because sensitive biometric and demographic information of citizens are with enrolment agencies, registrars and sub-registrars who have no legal liability for any misuse of this data. This petition has opened up the larger discussion on privacy rights for Indians. The current Article 21 interpretation by the Supreme Court was done decades ago, before the advent of internet and today’s technology and all the new privacy challenges that have arisen as a consequence.

Rajeev Chandrasekhar, MP Rajya Sabha

“What is Aadhaar? There is enormous confusion. That Aadhaar will identify people who are entitled for subsidy. No. Aadhaar doesn’t determine who is eligible and who isn’t,” Jairam Ramesh

But Aadhaar has been mythologised during the previous government by its creators into some technology super force that will transform governance in a miraculous manner. I even read an article recently that compared Aadhaar to some revolution and quoted a 1930s historian, Will Durant. – Rajeev Chandrasekhar, Rajya Sabha MP

“I know you will say that it is not mandatory. But, it is compulsorily mandatorily voluntary,” Jairam Ramesh, Rajya Sabha, April 2017.

August 24, 2017: The nine-judge Constitution Bench rules that the right to privacy is “intrinsic to life and liberty” and is inherently protected under the various fundamental freedoms enshrined under Part III of the Indian Constitution.

"Never doubt that a small group of thoughtful, committed citizens can change the World; indeed it's the only thing that ever has"

“Arguing that you don’t care about the right to privacy because you have nothing to hide is no different than saying you don’t care about free speech because you have nothing to say.” -Edward Snowden

In the Supreme Court, Meenakshi Arora, one of the senior counsel in the case, compared it to living under a general, perpetual, nation-wide criminal warrant.

I had never thought of it that way, but living in the Aadhaar universe is like living in a prison. All of us are treated like criminals, with barely any rights or recourse, and gatekeepers have absolute power over you and your life.

Announcing the launch of the #BreakAadhaarChains campaign, culminating with events in multiple cities on 12th Jan. This is the last opportunity to make your voice heard before the Supreme Court hearings start on 17th Jan 2018. In collaboration with @no2uid and @rozi_roti.

UIDAI's security seems to be founded on four time-tested pillars of security idiocy:

1) Denial

2) Issue fiats and point finger

3) Shoot messenger

4) Bury head in sand.

God Save India

Showing posts with label Data Privacy. Show all posts

Monday, June 25, 2018

13715 - Nashik contractor pulled up for handling Aadhaar data shoddily - TNN


Abhilash Botekar | TNN | Jun 18, 2018, 17:50 IST

Nashik: It was a tweet by college student Nilay Kulkarni that jolted district authorities out of their slumber and forced them to take action against the shoddy manner in which acknowledgment print-outs of several Aadhaar applicants were stocked.

Hours after Kulkarni’s tweet at 12.43 pm on Saturday, authorities swung into action. By Sunday afternoon, a show-cause notice had been slapped on the contractor entrusted with the task of safe custody of these papers, for alleged ‘dereliction of duties’.

During his visit to Nashik Municipal Corporation-managed Yashwantrao Chavan Planetarium on Saturday for a science exhibition, the 18-year-old student stumbled on print-outs at the Aadhaar enrolment centre functioning at the premises of the planetarium. There is no door at this centre, so trespassing is easy.

“There were about 1,000 such print-outs having personal information of the applicants like name, sex, date of birth, mobile number and email address,” Kulkarni told TOI.

He uploaded this information on Twitter with a photograph, saying, “Aadhar print-outs simply lying around at Planetarium. You don’t even need to hack databases, simply take home the papers and scan for a huge mail address list to be sold.” A 28-year-old Frenchman who follows Aadhaar closely and operates a Twitter handle under the name ‘Elliot Alderson’, with over 50,000 followers, was among over 100 people to retweet this.

On Twitter, people expressed shock at the easy access to private information. Some wondered how this information could be abused.

“My only concern was to ensure the papers were secured before going into the hands of miscreants,” Kulkarni said, thanking authorities for acting promptly to set things right.

Deputy collector Shashikant Mangrule told TOI that it was not true that there were thousands of print-outs. “There were about 100-odd print-outs of enrolment identification (EID) containing names and only such information as is needed for an Aadhaar update. We have decided to serve the contractor assigned to run that centre with a show-cause notice on Monday. These should have been in safe custody,” the deputy collector said.

Mangrule said Aadhaar processing kits are being kept in government premises, in accordance with the rules. The planetarium is guarded by security personnel of the civic body. He admitted that the manner in which EIDs were accessed by common people only highlights the fact that the contractor had failed to take adequate measures to protect these.


Some district officials blamed the security staff for allegedly failing to ensure that there were no trespassers in the area.

“Since Aadhaar processing is now carried out only online, no documents are expected to be retained by the contractor. No printouts were also expected. This norm was violated, and so a show-cause notice was served,” Mangrule said.

13711 - Privacy advocates seek stronger laws - The Hindu


NEW DELHI, JUNE 18, 2018 01:45 IST

A citizen advocacy group has put together a model Bill that focuses on user rights, data protection

The Cambridge Analytica scandal and the Aadhaar database security concerns have provoked a citizen advocacy group to launch a campaign to protect the privacy of individuals in India.
A set of lawyers and policy analysts have put together a model Bill — the Indian Privacy Code, 2018 — with an overriding effect over the Aadhaar Act. The initiative, backed by the Internet Freedom Foundation (IFF), is looking to garner public awareness and nudge the government into adopting a strong law focused on user rights.

The model Bill envisages a law that will prevent some of the fundamental features of the Aadhaar Act from allegedly operating against citizens. This, the advocacy group expects, will shift power from the Unique Identification Authority of India (UIDAI) to the people.

Advocate Apar Gupta, a co-founding member of IFF, said fundamental features of the Aadhaar Act make its use mandatory while being a universal digital ID not tied to a specific purpose.

Sensitive form of data
“It relies on biometrics, which are an incredibly sensitive form of data. It results in mass surveillance as a precondition to availing essential services. Due to its architecture, it makes people vulnerable to data breaches and identity theft,” Mr. Gupta said.
However, the model Bill seeks to give people the option of knowing how much of their data are collected, what information is parted with and what the consequences are. More importantly, it will clearly demarcate an option for people to refuse consent. This undercuts the Aadhaar Act, but more importantly the administrative practices which have resulted in making it mandatory, Mr. Gupta said.

IFF members were also part of the ‘Save the Internet’ campaign that was instrumental in pushing back Facebook’s Free Basics in India.

The advocacy body said its latest campaign, ‘Save Our Privacy’, was to make sure that India gets a privacy and data protection law that protects the fundamental right to privacy.

Privacy law
There is no separate law in India on privacy and data protection. While many drafting efforts have been made since 2010, little has come of them. In 2012, an Expert Group on Privacy, chaired by former Delhi High Court Chief Justice A.P. Shah, submitted a report to the Planning Commission. The report recommended passing a law that makes privacy safeguards technology-neutral and applicable to both government and private sectors.

During the Aadhaar hearing before the Supreme Court in mid-2017, the Centre had constituted a committee of experts under the chairmanship of former Supreme Court Justice B.N. Srikrishna. This committee had released a White Paper and is expected to recommend a draft law to the Ministry of Electronics and Information Technology.

A nine-judge Bench of the apex court had last year declared privacy as intrinsic to life and liberty, and an inherent right protected under the Constitution. This means that an ordinary citizen can now directly approach the court in case of violation of his/ her privacy. The verdict armed the common man against unreasonable State intrusions and protected informational privacy in a digital age.

During the hearing, the top court had expressed apprehensions against the State passing on personal data collected from citizens to private players.

Privacy commission
The model Bill is built on seven progressive privacy principles, including use and purpose limitation (personal data collected for specified purposes cannot be further processed for other purposes) in collection and processing of data. It said a strong and independent privacy commission was necessary to ensure that data protection rights are enforced. The model Bill provides the privacy commission wide powers of investigation, adjudication, rule-making and enforcement.

The model Bill said the government, its arms, bodies and programmes should be made compliant with the privacy protection principles through a data protection law. “We support the use of digital technologies for public benefit. However, it should not be privileged over fundamental rights,” the advocacy group said.

Mass surveillance
“The government is responsible for delivery of many essential services to the public. These services must not be withheld from an individual due to such individual not sharing data with the government...Withholding services on the pretext of requirement of collection of data effectively amounts to extortion of consent. Individuals cannot be forced to trade away data and citizenship at the altar of being permitted to use government services and access legal entitlements on welfare,” the advocacy group said.

The group said the data protection law will have to limit mass surveillance as it contravenes the principles of necessity, proportionality and purpose limitation. It said that evidence gathered illegally, such as telephone intercepts without valid tapping orders, is inadmissible as proof in legal proceedings. To ensure further accountability, all such orders need to be communicated to the person who was surveilled.

Collection of data
The model Bill seeks to ensure that no government or private entity collects sensitive personal data without consent from an individual. The individual will have the right to obtain information from the data controller. This information will include purposes of storage and processing; categories of personal data; recipients to whom personal data have been or will be disclosed; the right to lodge a complaint with a supervisory authority; and existence of automated decision-making.

More importantly, the individual will have a right to request erasure and destruction of data at any time, and data controllers and processors will have to comply with such requests within a fixed time frame.
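The access and erasure rights described above can be sketched as a toy "data controller". This is purely illustrative of the model Bill's described rights, not an implementation of it; the class and method names are invented for this sketch:

```python
from dataclasses import dataclass, field

@dataclass
class DataController:
    """Hypothetical controller honouring access and erasure requests."""
    records: dict = field(default_factory=dict)  # subject id -> personal data

    def access_request(self, subject_id: str) -> dict:
        # Tell the individual what categories of data are held,
        # for what purposes, and with whom they were shared.
        data = self.records.get(subject_id, {})
        return {
            "categories": sorted(data),
            "purposes": ["service delivery"],  # stated purpose(s), illustrative
            "recipients": [],                  # parties the data was disclosed to
        }

    def erasure_request(self, subject_id: str) -> bool:
        # Destroy the record; the Bill would require this to
        # complete within a fixed time frame.
        return self.records.pop(subject_id, None) is not None
```

Under the model Bill's principles, the same request would also have to disclose any automated decision-making and point the individual to a supervisory authority for complaints.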

Offences and penalties
The model Bill also seeks to provide punishment for those found illegally collecting, receiving, storing, processing, disclosing or otherwise handling any personal data. Punishment for this offence may include a fine of ₹1 crore and three years’ imprisonment. Illegal surveillance of another person will also be punishable, with a fine that may extend to ₹10 crore and a five-year jail term.


The foundation has sent an e-mail to the Srikrishna Committee, with a copy of the model Bill. It said this was a policy fix for recurring concerns and controversies, including issues such as Aadhaar, Cambridge Analytica, the social media communication Hub and Edward Snowden’s revelations on mass surveillance.

Saturday, May 20, 2017

11437 - Don’t panic, your Aadhaar is safe, writes RS Sharma

Amidst concerns regarding Aadhaar data leak, a look at how the unique identification system actually adheres to the principles of privacy

RS Sharma | May 19, 2017 | New Delhi

Illustration: Ashish Asthana

Privacy and data protection concerns have become serious in a digital world due to the ease of search and aggregation, with or without Aadhaar. Type any name in Google and it will throw up thousands of results, giving data/information publicly available. Hence, the responsibility falls on organisations which collect data from individuals to protect it. If a state government puts bank account details of MNREGA workers or PAN details on its portal, then one doesn’t need an Aadhaar number to find any information – simply the name will do. There may be multiple people with that name. But so what! You can always find the person with their other attributes, if available publicly. Hence, each data custodian must become conscious of their responsibility to protect their customers’ data.

Aadhaar has been designed as a digital identity platform, which is inclusive, unique and authenticable to participate in any digital transaction. This has transformed service delivery in our country, providing huge convenience to citizens and substantial reduction of leakages. Direct benefit transfer, subscription to various services and authentication at the point of service delivery are some benefits. 

The UID project has been aware of privacy and data protection issues since the very beginning and has taken every step, as per the best practices available in the world, to ensure they are not violated. The general law on privacy is beyond the ambit of the UIDAI. With the Aadhaar Act in place, let us discuss the provisions relating to privacy and data protection in the Act.

UIDAI’s strategy document

Unlike many countries, India does not have a law on privacy. The law relating to it has been evolved by the courts through various judicial pronouncements over the years. Interestingly, the former UIDAI chairman, Nandan Nilekani, had written to the PM as early as May 2010 suggesting the need for a privacy law. The government prepared a draft bill on the Right to Privacy, but it was not turned into a statute. Since then the law has been in the making.

Despite the absence of a formal legislation, UIDAI seems to have been aware of privacy concerns from the beginning and claims to have incorporated these in the design of Aadhaar. In its strategy document (‘UIDAI Strategy Overview: Creating a Unique Identity Number for Every Resident in India’, 2010), the authority states: “The UIDAI envisions a balance between ‘privacy and purpose’ when it comes to the information it collects on residents. The agencies may store the information of residents they enrol if they are authorised to do so, but they will not have access to the information in the UID database. The UIDAI will answer requests to authenticate identity only through a ‘Yes’ or ‘No’ response.”

The UIDAI recognised the potential risks in the area of privacy and put in place a mechanism to deal with them. Two such risks are:
Security and privacy of resident data: Aadhaar by design ensures the security and privacy of residents’ data collected through the enrolment process. Access to authentication services is given only to authorised ecosystem partners of UIDAI. Under no scenario is residents’ biometric data shared.

Risk to privacy and security of residents’ demographic and biometric data: UIDAI has deployed robust security infrastructure to prevent any unauthorised dissemination of demographic or biometric data of residents stored in central identities data repository (CIDR). Biometric data is never shared with any entity or individuals.

Thus, privacy and security of resident data seems to have been the focus of the UIDAI’s approach in designing the project. There have been many approaches to data protection and privacy. One of the most accepted approaches, which has become a kind of world standard, is called Privacy by Design (PbD).

PbD is an approach to systems engineering which takes privacy into account throughout the whole engineering process. It is an example of value-sensitive design, i.e., design that takes human values into account in a well-defined manner throughout the whole process, and may have been derived from that tradition.

The concept of PbD is related to the concept of Privacy-Enhancing Technologies, or PET. This term was used for the first time in the report ‘Privacy-Enhancing Technologies: The Path to Anonymity’, published in 1995 (Hustinx, 2010). Since 1995, the concept of PET has been fully accepted and has become a kind of standard. A number of countries have invested in creating better understanding and promotion of PET. One of the original proponents of PbD, Dr Ann Cavoukian, information & privacy commissioner, Ontario, Canada, has laid down seven foundational principles required to achieve the desired goal.

PbD advances the view that the future of privacy cannot be assured solely by compliance with regulatory frameworks; rather, privacy assurance must ideally become an organisation’s default mode of operation (Cavoukian, 2011).

The seven foundational principles are: proactive not reactive; preventative not remedial; privacy as the default setting; privacy embedded into design; full functionality – positive-sum, not zero-sum; end-to-end security – full lifecycle protection; visibility and transparency – keep it open; and respect for user privacy – keep it user-centric.

Proactive not reactive; preventative not remedial: The PbD approach is characterised by proactive rather than reactive measures. It is not an afterthought. It requires established methods to recognise poor privacy designs, anticipate poor privacy practices and outcomes, and correct any negative impacts, well before they occur in proactive, systematic and innovative ways.

Privacy as the default:
This principle is particularly informed by the following fair information practices (FIPs).

(i) Purpose specification: The purpose for which personal information is collected, used, retained and disclosed shall be communicated to the individual (data subject) at or before the time the information is collected.

(ii) Collection limitation: Collection of personal information must be fair, lawful and limited to that which is necessary for the specified purposes.

(iii) Data minimisation: Collection of personally identifiable information should be kept to a strict minimum.

(iv) Use, retention and disclosure limitation: The use, retention and disclosure of personal information shall be limited to the relevant purposes identified to the individual, for which he/she has given consent, except where otherwise required by law. Personal information shall be retained only as long as necessary to fulfil the stated purposes, and then securely destroyed.


Privacy embedded into design: Privacy must be embedded into technologies, operations and information architectures in a holistic, integrative and creative way. A systemic, principled approach to embedding privacy should be adopted − one that relies on accepted standards and frameworks, which are amenable to external reviews and audits.

Wherever possible, detailed privacy impact and risk assessments should be carried out and published, clearly documenting the privacy risks and measures taken to mitigate them, including consideration of alternatives and selection of metrics.

The privacy impacts of the resulting technology, operation or information architecture, and their uses, should be demonstrably minimised, and not easily degraded through use, misconfiguration or error.

Full functionality – positive-sum, not zero-sum: This seeks to accommodate all legitimate interests and objectives in a positive-sum “win-win” manner, not through a dated, zero-sum approach, where unnecessary trade-offs are made. PbD avoids the pretence of false dichotomies, such as privacy vs security, demonstrating that it is possible, and far more desirable, to have both.

End-to-end security – lifecycle protection: PbD, having been embedded into the system before the first element of information is collected, extends securely throughout the entire lifecycle of the data involved – strong security measures are essential to privacy, from start to finish. This ensures that all data are securely retained, and then securely destroyed at the end of the process, in a timely fashion. Thus, PbD ensures cradle-to-grave, secure lifecycle management of information.

Visibility and transparency: It assures all stakeholders that whatever the business practice or technology involved, it is in fact operating according to the stated promises and objectives, subject to independent verification. This PbD principle tracks well to fair information practices in their entirety, but for auditing purposes, special emphasis may be placed upon the following FIPs: accountability, openness and compliance.

Respect for user privacy: PbD requires architects and operators to keep the interests of the individual uppermost by offering such measures as strong privacy defaults, appropriate notice and empowering user-friendly options. Respect for user privacy is supported by the following FIPs:

(i) Consent: The individual’s free and specific consent is required for the collection, use or disclosure of personal information, except where otherwise permitted by law. Consent may be withdrawn at a later date.

(ii) Accuracy: Personal information shall be as accurate, complete, and up-to-date as is necessary to fulfil the specified purposes.

(iii) Access: Individuals shall be provided access to their personal information and informed of its uses and disclosures.

(iv) Compliance: Organisations must establish complaint and redress mechanisms, and communicate information about them to the public, including how to access the next level of appeal.

Application of PbD in Aadhaar

The second foundational principle, ‘privacy as the default’, lays down some basic principles relating to the collection and usage of personal information. These relate to the privacy of users’ personal data. The operating principles here are: purpose specification; collection limitation; data minimisation; and use, retention, and disclosure limitation.

Minimal data collection: You must justify the collection of every data element from the perspective of its need and usage. Ideally, you should begin with zero data, and then add each data element only as its necessity is established.

In Aadhaar’s case, a demographic data standards and verification procedure (DDSVP) committee, constituted by the UIDAI, recommended that only four aspects of demographic information should be collected. These are name, date of birth, gender and communication address.
On the issue of biometric data, the UIDAI collects a photograph and images of both irises and all ten fingerprints. The photograph is certainly essential data for establishing identity. The biometric data (iris and fingerprints) are also essential for ensuring uniqueness.
It optionally collects mobile number and email ID. This helps in communicating with the resident for any activity.

Thus, we find that there is no information which is unnecessary and unrelated to the purpose of this identity project. The principle of purpose specification is satisfied.
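The field list above can be summarised as a record sketch. The structure and field names below are illustrative only, not UIDAI's actual enrolment schema:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EnrolmentRecord:
    """Hypothetical record mirroring the DDSVP recommendation:
    four mandatory demographic fields, the biometric captures,
    and two optional contact fields."""
    name: str
    date_of_birth: str         # e.g. "1970-01-01"
    gender: str
    address: str               # communication address
    photograph: bytes          # facial photograph
    fingerprints: list         # images of all ten fingerprints
    iris_scans: list           # images of both irises
    mobile: Optional[str] = None   # optional, for communication
    email: Optional[str] = None    # optional, for communication
```

The point of the sketch is simply that every field maps to a stated need (identification, uniqueness, or communication); nothing else is collected.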

Data use limitation:
The UIDAI clarifies in its strategy document that the data collected will only be used for issuance of the Aadhaar number and later for providing the authentication service. In fact, the resident’s consent is taken on whether he/she would like to share their demographic data with a bank for opening an account.

Keeping the resident informed and the right to access their own data: This relates to informing the resident about data usage. The strategy document does not specify if the resident will be informed about it at the time of enrolment. However, the resident’s consent is taken relating to data usage.

The Aadhaar Act has the following provisions on keeping the resident informed about data usage and the right to access their own data (Section 3(2)).

Resident consent and information for authentication: While the authentication process, as defined in the strategy document and in law, implies that the owner of the data participates in the process of authentication, the Act makes an explicit provision making the entity requesting authentication responsible for informing the resident about it (Section 8(1), (2) & (3)).

Hence, the resident is always informed of the purpose of authentication, the nature of information to be shared during it, and the use of information received through it. All these responsibilities are cast upon the entity utilising the authentication facility of UIDAI through law, the violation of which is punishable (Section 40 of the Act).

Random numbers: The second design principle is to issue random numbers with no intelligence. The strategy document states: “Loading intelligence into identity numbers makes them susceptible to fraud and theft. The UID will be a random number.” This was done with a view to protecting privacy and preventing profiling of Aadhaar holders.

Further, Aadhaar is a 12-digit number with the 12th digit being the check digit. With 11 digits, you can construct about 100 billion numbers. Considering that there are 1.3 billion numbers required for India, the system will be using merely 1.3% of the available numbers. As these are randomly distributed over the entire space (of 100 billion possible numbers), it is not possible to even guess a number. It ensures compliance with the principle of data anonymisation.
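UIDAI material describes the 12th digit as a checksum computed with the Verhoeff algorithm, so the scheme above can be sketched as follows. The non-zero, non-one leading digit is an assumption based on UIDAI's published numbering convention; treat this as an illustrative sketch, not UIDAI's generation code:

```python
import random

# Standard Verhoeff tables: D is the dihedral-group multiplication
# table, P the position-dependent permutation table, INV the inverses.
D = [
    [0,1,2,3,4,5,6,7,8,9], [1,2,3,4,0,6,7,8,9,5], [2,3,4,0,1,7,8,9,5,6],
    [3,4,0,1,2,8,9,5,6,7], [4,0,1,2,3,9,5,6,7,8], [5,9,8,7,6,0,4,3,2,1],
    [6,5,9,8,7,1,0,4,3,2], [7,6,5,9,8,2,1,0,4,3], [8,7,6,5,9,3,2,1,0,4],
    [9,8,7,6,5,4,3,2,1,0],
]
P = [
    [0,1,2,3,4,5,6,7,8,9], [1,5,7,6,2,8,3,0,9,4], [5,8,0,3,7,9,6,1,4,2],
    [8,9,1,6,0,4,3,5,2,7], [9,4,5,3,1,2,6,8,7,0], [4,2,8,6,5,7,3,9,0,1],
    [2,7,9,3,8,0,6,4,1,5], [7,0,4,6,9,1,3,2,5,8],
]
INV = [0,4,3,2,1,5,6,7,8,9]

def verhoeff_check_digit(number: str) -> str:
    """Check digit to append so that the result validates."""
    c = 0
    for i, ch in enumerate(reversed(number)):
        c = D[c][P[(i + 1) % 8][int(ch)]]
    return str(INV[c])

def verhoeff_validate(number: str) -> bool:
    """True if the trailing digit is a correct Verhoeff checksum."""
    c = 0
    for i, ch in enumerate(reversed(number)):
        c = D[c][P[i % 8][int(ch)]]
    return c == 0

def random_aadhaar_like_number() -> str:
    # 11 random digits plus the check digit; leading digit 2-9 is an
    # assumption (Aadhaar numbers reportedly do not start with 0 or 1).
    body = str(random.randint(2, 9)) + "".join(
        str(random.randint(0, 9)) for _ in range(10))
    return body + verhoeff_check_digit(body)
```

Because the 11-digit body is drawn at random from the whole space, consecutive enrolments yield unrelated numbers, which is the anti-guessing property the paragraph above relies on.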

Data sharing policies: No data download is permitted, and search is not allowed on any attribute. For example, you cannot search the database using criteria such as a name or an Aadhaar number.

There are only a few ways in which one can interact with the Aadhaar database. One of these is authentication: a process wherein the Aadhaar number, along with other attributes (demographic/biometric/OTP), is submitted to UIDAI’s CIDR for verification. The CIDR responds with a “yes” or “no”. No personal identity information is given as part of the response. Only authentication user agencies can submit such requests.
However, data can be shared based on an explicit authorisation from the data owner, i.e., the concerned Aadhaar holder. This process is called electronic-KYC (eKYC).
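The yes/no contract described above can be illustrated with a toy model. This is not UIDAI's actual Auth API, which works very differently; the salt, the stored record, and all names below are invented for illustration:

```python
import hashlib
import hmac

# Toy in-memory "CIDR": maps an Aadhaar-like number to a keyed hash
# of a claimed attribute string (standing in for biometrics/OTP).
SALT = b"demo-salt-for-illustration-only"

def _digest(value: str) -> str:
    return hmac.new(SALT, value.encode(), hashlib.sha256).hexdigest()

CIDR = {"999912345678": _digest("Kamala Devi|1970-01-01")}  # fictitious record

def authenticate(aadhaar_number: str, claimed_attributes: str) -> str:
    """Return only 'yes' or 'no' -- never the stored record itself."""
    stored = CIDR.get(aadhaar_number)
    if stored is not None and hmac.compare_digest(stored, _digest(claimed_attributes)):
        return "yes"
    return "no"
```

The design point the sketch captures is that the responder holds only verifiers, and the caller learns a single bit: whether the claimed attributes match, and nothing else about the record.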

Biometrics are never to be shared, except in certain situations like national security or a competent court’s orders. The law lays down these processes in a detailed manner.

Legal provisions relating to data sharing: According to Section 29 (2), the identity information, other than core biometric information, collected or created under this Act may be shared only in accordance with the provisions of this Act and in such manner as may be specified by regulations.

Subsection (3) says that no identity information available with a requesting entity shall be:

(a) used for any purpose, other than that specified to the individual at the time of submitting any identity information for authentication; or

(b) disclosed further, except with the prior consent of the individual to whom such information relates.

Subsection (4) says that no Aadhaar number or core biometric information collected or created under the Act in respect of an Aadhaar number holder shall be published, displayed or posted publicly, except for the purposes as may be specified by regulations.

The Aadhaar Act also has stringent provisions relating to data sharing. Section 3(2)(b) specifies that the resident will be informed at the time of enrolment relating to data sharing: “(b) the nature of recipients with whom the information is intended to be shared during authentication”;

Protection of information: Chapter VI of the Act deals with protection of information. Section 28 casts the responsibility for data protection on the Authority, which must ensure the security and confidentiality of identity information and authentication records of individuals.
Biometric information is given a special treatment in the Act. Section 30 defines it as “sensitive personal information” within the meaning of the IT Act. As per Section 43A of IT Act, “sensitive personal data or information means such personal information as may be prescribed by the central government in consultation with such professional bodies or associations as it may deem fit.”

And there are stringent provisions for the body entrusted with handling sensitive personal data. If such a body “is negligent in implementing and maintaining reasonable security practices and procedures and thereby causes wrongful loss or wrongful gain to any person, such body corporate shall be liable to pay damages by way of compensation, not exceeding five crore rupees, to the person so affected.” [Section 43A of IT Act 2000]

Exceptions to data sharing provisions:
Adequate safeguards have been provided in the Act relating to the safety, security and protection of data. However, it makes exceptions to this general rule in two specific cases, listed in Section 33 of the Act and quoted below:

33 (1) “any disclosure of information, including identity information or authentication records, made pursuant to an order of a court not inferior to that of a District Judge”

33 (2) any disclosure of information, including identity information or authentication records, made in the interest of national security in pursuance of a direction of an officer not below the rank of joint secretary to the government of India specially authorised in this behalf by an order of the central government.


Even within these exceptions, safeguards are provided relating to review of cases by a high-powered oversight committee and limiting the duration of this sharing.

Federated data model: Apart from the minimal enrolment data, the UIDAI keeps nothing except the logs of authentications performed by a person. It knows only the date/time and the agency through which an authentication was done, not its purpose. The transaction details thus remain with the concerned agency, not the UIDAI. This is the best model for keeping data, where each data owner bears the responsibility for data confidentiality and security. The principle is articulated in Section 32(3) of the Act, which prevents aggregation of information about an individual.

Data protection technologies:
Aadhaar enrolment is done by the UIDAI registrars through enrolment agencies, which are private. This could pose a serious data-breach risk. The risk has been addressed by ensuring that enrolment is done through standardised software and that the data is encrypted at the time of enrolment itself with a key as strong as 2048 bits. Thereafter, the data is kept encrypted at all times, in transit and at the CIDR, and is opened for reading only momentarily during processing. This has ensured that there has not been a single case of data breach from the UID system. Even if an enrolment machine is stolen, the data cannot be misused, as it is encrypted. These practices provide end-to-end security and lifecycle protection of resident data.

Publication of Aadhaar number

Though Aadhaar numbers are random, Section 29(4) of the Act prohibits their publication except for purposes specified by regulations. The regulations reiterate this provision and provide that no entity shall make public any database or record containing Aadhaar numbers unless they have been “redacted or blacked out through appropriate means, both in print and electronic form”.
The rationale for these restrictions is that, while Aadhaar numbers themselves are not confidential, their publication in various public records would make it easy to collate information about persons. Collation of data, as explained before, has in any case become relatively easy in the digital world.

The recent controversy regarding certain websites publishing Aadhaar numbers and bank account details of beneficiaries of various government programmes appears to conflict with the prohibition in Section 29(4). The publication was done in compliance with the RTI Act, which makes it mandatory to publish the details of beneficiaries of the subsidy programmes executed by every public authority. Prima facie, therefore, the provisions of the RTI Act and the Aadhaar Act are in conflict: the RTI Act mandates transparency, while the Aadhaar Act prohibits publication of Aadhaar numbers. The best way to resolve this is to partially mask the Aadhaar numbers on websites, which would strike the best balance between transparency of public records and privacy of individuals.
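The partial-masking approach suggested above can be sketched in a few lines. The exact masked format shown here (hiding all but the last four digits) is an assumption for illustration, not a prescribed UIDAI format.

```python
# Illustrative sketch of partial masking of a 12-digit Aadhaar number
# before publication: only the last four digits remain visible.
# The output format is a hypothetical convention, not an official one.

def mask_aadhaar(number: str) -> str:
    """Redact all but the last four digits of a 12-digit number."""
    digits = number.replace(" ", "")
    if len(digits) != 12 or not digits.isdigit():
        raise ValueError("expected a 12-digit number")
    return "XXXX XXXX " + digits[-4:]

print(mask_aadhaar("1234 5678 9012"))  # XXXX XXXX 9012
```

A public record published this way still lets a beneficiary recognise their own entry while preventing bulk collation by number.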

Summing up
The UIDAI, since its inception, has kept in mind the basic principles needed to protect the privacy of individuals. While the Aadhaar Act cannot replace the omnibus national privacy law our country needs, it has certainly made that task easier by embracing data-privacy principles. The debates on privacy violation by the UID have largely concentrated on generalities: Aadhaar is put to random tests with the expectation of complying with anything and everything on privacy (the wishful thinking, whims and fancies of individuals included) and is then criticised for falling short. Nothing comes out better from fighting windmills, not even Aadhaar, the world’s largest next-generation digital identity platform.

Sharma is the chairman of TRAI. The views expressed are personal.

- See more at: http://www.governancenow.com/gov-next/egov/dont-panic-your-aadhaar-is-safe-writes-rs-sharma-uidai-privacy

Saturday, March 5, 2016

9372 - #dnaEdit: Aadhaar’s bad luck - DNA


Tue, 16 Feb 2016-06:40am , dna

The reported plan to introduce the Aadhaar legislation as a money bill does disservice to its potential for use in welfare schemes

The Centre’s plan to introduce legislation providing statutory backing to the disbursal of central subsidies through the Aadhaar Unique Identification (UID) project is welcome. However, it has been reported that the government plans to introduce the legislation as a money bill, so that the Rajya Sabha, where the Opposition commands a majority, cannot block it. This is ill-conceived and may not withstand judicial scrutiny. While there have been two recent instances of the black money and bankruptcy code legislations being introduced as money bills, the implications of the UID project are much bigger. A thorough discussion by both houses of Parliament will only help iron out the weaknesses in the Bill and ensure it passes legal scrutiny.

The Modi government has come up with the catchy acronym JAM (Jan Dhan, Aadhaar, Mobile Governance) to plug leakages and speed up the delivery of payments related to central welfare schemes worth nearly Rs3.5 lakh crore. Though over 90 crore persons have already been enrolled in the UID project, the failure to ensure statutory backing has hurt its rollout in cash transfer schemes. With concerns about privacy violations and its possible use for intrusive purposes beyond welfare schemes, Aadhaar has come under the Supreme Court’s scanner.

Currently, the Supreme Court’s interim order on Aadhaar usage restricts it to identifying beneficiaries of the public distribution system (PDS) and transferring subsidies on cooking gas and kerosene. As a result, ambitious plans to use Aadhaar for biometric attendance, Jan Dhan Yojana, pension payments, scholarships, MGNREGA wage disbursals, and operating payment banks, which were finalised, are being held in abeyance.

The legislation, which has reportedly been titled the Aadhaar (Delivery of Benefits, Subsidies and Services) Bill 2016, will have to ensure that the twin imperatives of privacy and data security are upheld. For the government, it is also an opportunity to show that it has not lost its reformist zeal. The Jan Dhan Yojana has met its financial-inclusion goal by roping more people into the banking system, but a large number of zero-balance accounts opened under JDY show no signs of activity, robbing the scheme of meaningful utility for the beneficiaries. By tying bank accounts to the Aadhaar UID number and the mobile phone number, however, the Centre is on the cusp of a revolutionary phase in which cash transfers become the norm.


The Aadhaar UID number will help in identifying the beneficiary, while the mobile number will help in alerting the beneficiary about deposits to the account. Currently, there is no way to account for leakages at the grass-roots level, where fake muster rolls and ration cards are used to divert MGNREGA funds and food-grain allocations. Similarly, payment banks can use Aadhaar and mobile numbers to validate payments, helping drive more financial transactions to digital platforms. Despite the benefits that Aadhaar entails, the central government has repeatedly made a mess of ensuring its orderly rollout. The Opposition is certain to see political motives in certifying the legislation as a money bill. The Centre should attempt to build political consensus over Aadhaar, a UPA-era project, rather than provoke the Opposition to mindlessly raise reservations.

Thursday, September 10, 2015

8675 - Biometric data and data protection law: the CJEU loses the plot

Friday, 17 April 2015


Steve Peers

Many people are increasingly concerned about adequate protection of their biometric data. To this end, the proposed EU data protection Regulation would classify that data as sensitive data, ensuring an extra degree of protection for it. But in the meantime, before that proposal is adopted, there are other EU measures which regulate the issue. Unfortunately, yesterday’s judgment of the CJEU in Willems and others does an inadequate job, with great respect, in applying the current EU rules to such data.

Background

The Willems judgment concerns biometric data collected for passports, as provided for in an EU Regulation of 2004, as amended in 2009. In fact, the CJEU has ruled on this Regulation several times before. In UK v Council, it (unconvincingly) ruled that the UK could not participate in the Regulation, since it was closely linked to the parts of Schengen rules (the abolition of internal border controls) in which the UK didn’t participate. In Schwarz, it ruled that the Regulation was valid from two different angles, as it was correctly adopted using the ‘legal base’ allowing the EU to adopt measures on external border control, and the interference which it entailed with the right to privacy was justified by the interest in ensuring the identity of passport holders and the validity of the passport. Finally, the Court recently ruled on the privacy aspects of displaying names in passports (as discussed here).

Building on these judgments, the national court in Willems had two questions. First of all, did the Regulation apply to some types of identity cards, given that they can in effect be used as passports for travel within the EU? Secondly, the national court asked the CJEU to interpret the data protection rules applicable to the further use of biometric data after it was collected for the purposes of passports. The latter question stemmed from the concern of the litigants in this case that their biometric data would be stored on a centralised database with inadequate security, which would be used for other purposes without a clear identification of who would have access to it.

More precisely, the national court’s second question was whether ‘Article 4(3) of [the passport Regulation, read] in light of Articles 7 and 8 of the Charter of Fundamental Rights of the [EU], Article 8(2) of the [ECHR] and Article 7(f) of [the current data protection Directive], read in conjunction with Article 6(1)(b) of that Directive’, required a guarantee that when collecting biometric data under the Regulation, Member States had to apply a ‘purpose limitation’ rule that such data  could only be used for the original purpose for which the passport was issued.

Judgment

On the first question, the CJEU looked at the wording of the Regulation, which specified that it did not apply to ‘identity cards issued to [Member States’] nationals or to temporary passports and travel documents having a validity of 12 months or less’. The Court ruled that the words ‘having a validity of 12 months or less’ only set out the scope of the Regulation as regards ‘temporary passports and travel documents’, meaning that such documents were within the scope of the Regulation if they were valid for more than 12 months. On the other hand, the words ‘having a validity of 12 months or less’ did not set out the scope of the Regulation as regards national identity cards. So no identity cards fall within the scope of the Regulation, regardless of the period of their validity.

On the second question, the CJEU ruled that the passport Regulation only governed the use of data for the purposes of that Regulation. Any further use of that data, as specified in the preamble, was regulated by national law. It followed that the Regulation did not apply a purpose limitation rule upon Member States as regards biometric passport data. Because the Regulation did not apply to such uses by Member States, the EU Charter did not apply either, although such further use of data might be restricted by national law or the ECHR. Finally, as for the data protection Directive, the CJEU stated that ‘the referring court was requesting the interpretation of [the passport Regulation] and only that Regulation’, so there was no need to examine whether the data protection Directive affected national law on the further storage and use of biometric data collected for passport purposes.

Comments

I won’t mince words: this judgment is appalling.  It’s sensible enough as regards the scope of the passports Regulation itself, which clearly wasn’t intended to apply to any national identity cards or to the creation of government databases using biometric data. But the Court’s fundamental flaw is its failure to confirm and elaborate upon the application of the Charter and the data protection Directive to such databases.

Let’s examine those two points in turn. As regards the Charter, of course it’s true, as the Court says, that it only applies when a dispute falls within the scope of EU law. But the Court made that point only as regards the scope of the passports Regulation, before (not) answering the question about the data protection Directive. Logically the Court cannot conclude that this dispute is not linked to EU law before it assesses also whether the data protection Directive applies.

Anyway, if we apply the Court’s own case law, the link to the passports Regulation alone brings this issue within the scope of the Charter. In NS, a key judgment on the scope of the Charter, the EU’s Dublin Regulation left an option to Member States to decide in their national law whether to consider asylum applications which fell within the responsibility of another Member State. But the Court ruled that the Charter applied to such national discretion. More relevantly, in a line of cases starting with Promusicae, the Court applied the Charter in detail to a national option to provide for the collection of personal data on use of the Internet set out in EU law. And in last year’s Digital Rights judgment, the Court invalidated the EU’s data retention Directive for the very reason that this Directive failed to effectively regulate the further national use of personal data collected pursuant to it.

As regards the question about the data protection Directive, the CJEU’s answer simply departs from reality. It is quite clearly not true that the national court was ‘only’ asking for an interpretation of the passport Regulation. As we can see from the text of the question excerpted above, it also asked the CJEU to interpret the data protection Directive. Admittedly, it only asked the CJEU to interpret the Directive in the context of the Regulation. But the CJEU does not make that distinction clear; and more importantly, that distinction just doesn’t matter.

Why? Because the CJEU has frequently rephrased questions by national courts in order to give a full reply to the EU law issues which they are actually having to address in the relevant litigation. The examples are legion, but the most relevant one is the judgment in Promusicae. In that case, which concerned mass interception of Internet users’ activity for the purposes of enforcing intellectual property rights, the national court only asked questions about EU intellectual property law and the e-commerce Directive. The CJEU quite rightly redrafted the questions in order to give an answer about the relevant data protection rules (in that case, the e-privacy Directive) as well. In Willems, the national court had already identified the relevance of the data protection Directive, so a comparatively minor redraft of its questions would have sufficed in order to ensure a reply that was fully relevant to the national litigation.

The Court’s ruling is also unsatisfactory in the broader context of the legislation and case law on similar issues. When it asserted that national law applied to databases of biometric data, the CJEU only selectively quoted from the preamble to the passports Regulation. Recital 4 of the preamble to the 2004 Regulation states that access to the data collected as regards biometric passports is ‘subject to any relevant provisions of [EU] law’. Moreover, the CJEU interpreted the data protection Directive as regards a comparable national database (a collection of information on foreign nationals) in the Huber judgment. I should note that the data protection Directive also applies where the passport Regulation does not: to biometric information collected as regards identity cards, and to passport biometric information collected in the Member States that are not bound by the Regulation (the UK and Ireland). Finally, the Court’s indifference to the fate of biometric data collected by Member States as regards passports seriously undercuts its own ruling in Schwarz, when it defended the validity of the passports Regulation on the basis of the limited scope of its interference with privacy rights (proportionality), and quoted the S and Marper judgment of the European Court of Human Rights to the effect that ‘the [EU] legislature must ensure that there are specific guarantees that the processing of such data will be effectively protected from misuse and abuse’.

At first sight, these criticisms of the ruling may seem legalistic. But my concerns are about much more than the deep flaws in the Court’s legal reasoning here. As we all know, the scope of databases and mass surveillance of individuals (‘big data’) have increased exponentially in recent years. This raises huge human rights issues and EU law has a significant role to play. Last year, in its judgments in Digital Rights and Google Spain, the CJEU genuinely tried to grapple with these issues. Many aspects of these judgments have been criticised, but the Court is at its best when it fully engages in these important legal debates. When it avoids them, with the specious legalism it spouts in Willems, it is at its worst.

Image credit: Dailyalternative.co.uk
Barnard & Peers: chapter 9, chapter 26

Posted by Steve Peers at 01:00



2 comments:


  • Douwe Korff21 April 2015 at 04:11
    Dear Steve - I fully agree with your view: this is indeed an appalling abdication of responsibility on the part of the Court. However, at least it was an act of (deliberate) omission: the refusal to look at crucial questions concerning biometric data, in particular the danger of secondary uses/linking of biometrics with other data(bases). The one halfway positive thing is that at least it did not simply ok such secondary uses or linkages. So future national and European (ECtHR) challenges on such matters are at least not pre-empted. Indeed, other national courts can still ask the full questions to the CJEU, in terms that the Luxembourg Court cannot avoid ... But that said, you are quite right to be angry about this ghastly, cowardly judgment. Douwe


  • Laura | Dutch law firm AMS Advocaten16 July 2015 at 04:36
    Data protection is more important than we can ever imagine, especially since more and more personal data is being extracted from our lives.

Saturday, February 7, 2015

7340 - Obama finds bipartisan support for first 'Big Data' privacy plan



Obama finds bipartisan support for first 'Big Data' privacy plan
BY ROBERTA RAMPTON


WASHINGTON Thu Feb 5, 2015 10:45am IST



U.S. President Barack Obama makes a point with his finger as he delivers remarks at the House Democratic Issues Conference in Pennsylvania, January 29, 2015.

CREDIT: REUTERS/LARRY DOWNING

(Reuters) - The White House is working with bipartisan sponsors on a bill to protect data collected from students through educational apps - the first of President Barack Obama's "Big Data" privacy plans to gain traction in the Republican-controlled Congress.

Obama has pushed to do more to protect privacy in an age when consumers leave a trail of digital footprints through smart phones, personal devices and social media - information that can be collected, analyzed and sold.

He has proposed action on a series of laws to address "Big Data" concerns, but most have yet to find momentum.
That could change, given public concerns over privacy and cybersecurity that have been amplified by high-profile hacking of credit card data at companies such as Target and Home Depot, said top Obama adviser John Podesta.

"I think there's much more pressure now to move legislation and we're certainly going to use all of the resources we have, including the president's time, to ensure that the Congress takes this up," Podesta told Reuters in an interview.

In the next couple of weeks, Indiana Congressman Luke Messer, the chairman of the House of Representatives Republican Policy Committee, and Democrat Jared Polis of Colorado, an Internet entrepreneur who founded a network of charter schools, will unveil a student privacy bill.

"Protecting America’s children from Big Data shouldn’t be a partisan issue," Messer said in a statement. "I’m glad to work across the aisle to find the appropriate balance between technology in the classroom and a parent’s right to protect their child’s privacy."

The lawmakers have long worked on the issue with privacy advocates and more than 100 companies including Microsoft, Google, and News Corp subsidiary Amplify to develop a privacy pledge to prevent misuse of data collected in classrooms.
The bill, still being finalized, will go a step further to ensure data collected from students is used only for educational and legitimate research purposes.

"Legislation is the best way to address parental concerns, while encouraging new developments in individualized learning," Polis said in a statement.

OTHER PRIVACY BILLS IN PLAY
A year ago, Obama assigned Podesta the task of making suggestions to beef up data privacy laws after former spy contractor Edward Snowden leaked classified information about government use of Big Data analytics for surveillance.

Obama proposed a new national standard to require companies to tell consumers within 30 days from the discovery of a data breach that their personal information had been compromised.
Lawmakers have struggled to come up with a way to replace a patchwork of differing state regulations, but Podesta said he is optimistic the issue could advance in the current Congress.
"We think there's real urgency with moving forward in legislation in this regard," he said.

Also stalled: a proposal to update the outdated Electronic Communications Privacy Act to protect email and other data stored in the cloud, Podesta noted.

Yet to be tested on Capitol Hill is draft legislation, to be unveiled this month, aimed at empowering consumers to have a say in how their online data is harvested and sold by companies.
Podesta also has sought to raise awareness of concerns that Big Data could be used to discriminate against people based on race or where they live for housing or jobs.

On Thursday, the White House will release a report on how companies use Big Data to offer different prices to different consumers.

Big Data techniques have "turbocharged" price discrimination, raising concerns about fairness, particularly when consumers do not control their own data or understand how companies are using it, Podesta said.

"The report concludes that increased consumer transparency and control can help prevent harmful discrimination in high-stakes transactions and urges policy makers to guard against such outcomes," Podesta said.


(Reporting by Roberta Rampton; Editing by Ken Wills)

Wednesday, December 31, 2014

7083 - AADHAAR ecosystem has provisions to ensure data security - Business Standard

Delhi  December 18, 2014 Last Updated at 16:22 IST

The architecture of the Aadhaar ecosystem has been designed to ensure data security, privacy, non-duplication, data integrity and other related aspects. The government is fully alive to the need to constantly upgrade the technology and infrastructure to maintain the highest level of data security and integrity. For this purpose, a well-designed and robust data security system is in place. Security is an integral part of the system from the initial design to the final stage, and security audits are conducted on a regular basis. A multi-layer approach is adopted, with multiple safeguards applied at different steps from the point of collection to the final stage. Security of data is monitored at all times, i.e., at rest, in transit and in storage. Security and privacy of personal data are fully ensured, without sacrificing the utility of the project. Various policies and procedures have been defined; these are reviewed and updated continually, thereby appropriately controlling and monitoring any movement of people, material and data in and out of UIDAI premises, particularly the data centres. Further strengthening of the security and privacy of data is an ever-evolving process, and all possible steps are taken to keep the data safe and protected. This information was given by the Minister of State (Independent Charge) for Planning, Shri Rao Inderjit Singh, in a written reply in the Rajya Sabha today.

The Minister said that a total amount of Rs. 5311.60 crores has been spent by UIDAI on the Aadhaar project, as on 30th November 2014. 

Monday, July 29, 2013

4437 - Lessons Learnt From UID Data Loss

by Robin Chatterjee 22nd July, 2013 in Security
        
In April 2013, when the Maharashtra government admitted to the loss of personal data of around 3 lakh applicants for the Aadhaar card, it highlighted just the tip of the potentially catastrophic iceberg we are sitting on. The possible misuse of citizen data containing Permanent Account Number (PAN) and biometric information has raised questions about trusting UIDAI’s IT infrastructure with the data of a billion people, and, more importantly, about the government’s empathy and understanding on the issue of data privacy. Three months down the line, we take a peek into the measures undertaken by the government machinery to avoid such incidents in the future, and bring to the table some suggestions from an expert.

The Facts

As per media reports, the data was lost while being uploaded from Mumbai to the UIDAI server in Bengaluru. “While the transmission was in progress, the hard disk containing data crashed. When the data was downloaded in Bangalore, it could not be decrypted,” the newspaper report said, quoting an official from the Maharashtra Information Technology (IT) department, which is overseeing the enrolment of citizens. According to Rajesh Aggarwal, Secretary, Information Technology, Government of Maharashtra, the number of individuals affected is expected to be less than 1 per cent of total enrolments.
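A standard safeguard against exactly this failure mode, corruption going unnoticed until decryption fails at the destination, is to ship a cryptographic digest alongside each data packet and verify it on receipt. The sketch below is a generic illustration of that practice, not a description of UIDAI's actual pipeline.

```python
# Illustrative integrity check for a transferred data packet: the sender
# publishes a SHA-256 digest, and the receiver recomputes it before
# attempting decryption, so corruption is caught early.
import hashlib

def sha256_hex(data: bytes) -> str:
    """Hex digest of a byte payload."""
    return hashlib.sha256(data).hexdigest()

# Sender side: compute the digest before transmission.
payload = b"encrypted enrolment packet"
sent_digest = sha256_hex(payload)

# Receiver side: recompute after download and compare.
received = b"encrypted enrolment packet"
assert sha256_hex(received) == sent_digest  # mismatch would mean corruption

# A corrupted copy fails the check instead of failing at decryption time.
corrupted = b"encrypted enrolment packet!!"
print(sha256_hex(corrupted) == sent_digest)  # False
```

Catching a bad transfer at the digest stage lets the sender simply retransmit from its retained copy, rather than discovering the loss only when decryption fails.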

Measures Undertaken

Many analysts term this incident a case of extreme irresponsibility. Several flaws were detected from the very first phase of enrolments. But the question is: has the government learnt from its past mistakes, and what is it doing to ensure that history does not repeat itself? According to Aggarwal, some fundamental changes have been made in Phase II to eliminate most of the irregular practices. For instance, the operator now has to authenticate himself/herself before starting the enrolment, so no unauthorised person can carry out an enrolment.

On the question of delayed sync with the national server, Aggarwal explains that the enrolment agency is supposed to sync the machine within 10 days of enrolment, failing which no further enrolment is possible on that machine. The data packets must be uploaded within 20 days, failing which there is a heavy penalty. The agency is now proactively uploading data packets quickly and within time, which significantly reduces the chances of hard disk failure, data loss, and the like.
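The two deadlines described above, sync within 10 days or the machine is blocked, upload within 20 days or a penalty applies, are simple enough to sketch. The function names and dates here are hypothetical; only the 10-day and 20-day limits come from the article.

```python
# Sketch of the two deadline rules reported for Phase II:
# a machine that has not synced within 10 days is blocked from further
# enrolment, and packets not uploaded within 20 days attract a penalty.
from datetime import date

SYNC_DEADLINE_DAYS = 10    # sync window after the last sync
UPLOAD_DEADLINE_DAYS = 20  # upload window after enrolment

def can_enrol(last_sync: date, today: date) -> bool:
    """Enrolment is blocked once the machine has not synced for 10 days."""
    return (today - last_sync).days <= SYNC_DEADLINE_DAYS

def upload_overdue(enrolled_on: date, today: date) -> bool:
    """Packets older than 20 days attract a penalty."""
    return (today - enrolled_on).days > UPLOAD_DEADLINE_DAYS

today = date(2013, 7, 22)
print(can_enrol(date(2013, 7, 15), today))       # True (7 days since sync)
print(upload_overdue(date(2013, 6, 25), today))  # True (27 days, overdue)
```

The point of such hard cut-offs is that stale data cannot silently accumulate on a field machine, which is where the risk of a disk crash causing permanent loss is highest.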

What More Can Be Done

With applicants getting added to the system by the thousands and lakhs on a continuous basis, scale is going to be critical. “Irrespective of what the agency believes, it seems that most of the IT infrastructure that UIDAI has was not meant for this scale. The agency should think of scaling up the existing infrastructure so that trivial things like a hard disk crash can be averted,” says HP Kincha, Former Secretary IT, Government of Karnataka and Chairman, Karnataka Innovation Council. 

He further adds that dependency on IT is critical to effectively managing a process of the Aadhaar project’s scale; hence, IT awareness among operators needs to be given due importance. The government also needs to figure out how to back up the data and reuse it in cases of urgency.

There isn’t an iota of doubt about the criticality of the data that is at potential risk in the entire Aadhar operation. But data loss due to trivial failures such as a hard disk crash only raises serious questions about the effectiveness of the government machinery. In the end, we expect our government to be pro-active in matters of data security and privacy.  But, we also recognise the fact that continuing to criticise the government alone is not the answer. Let the government take up this particular incident as a wake-up call to rectify existing flaws within the system and avoid such mishaps in the time to come.



Tuesday, February 21, 2012

2394 - An Indepth Look into Data Privacy - Dataquest

In the age of data creating and sharing, where a user is under constant surveillance, we need more privacy-enabled tools and rules to root out socio-psychological harms

Dr. Kamlesh Bajaj
Monday, February 06, 2012

Personal Information (PI) is generally defined as any information relating to an identified or identifiable natural person. It may be referred to as personal data, personal information, non-public personal information, etc. Examples include, but are not limited to: name, address, date of birth, telephone number, fax number, email address, government identifiers (eg, PAN number, PF account number, UID number), bank account number, credit card number, driving licence number, IP address, biometric identifiers, photographs or video identifiable to an individual, and any other unique identifying number, characteristic, or code. Privacy is all about protecting one's PI. Since the 1940s, privacy has been recognized as a fundamental civil liberty. The Universal Declaration of Human Rights (1948) contains a paragraph on privacy, and the 1950 European Convention on the Protection of Human Rights and Fundamental Freedoms includes a similar clause. The Supreme Court of India has upheld the right to privacy as part of the right to liberty under Article 21 of the Constitution of India.


Technology Killed Privacy

Is technology impacting the privacy of individuals? If yes, how, and what can be done about it? Is it possible to protect privacy through laws that are technology-neutral, laws that can anticipate threats from new technologies? It was Louis Brandeis who, along with Samuel Warren, defined privacy in 1890 as a 'right to be left alone', when a new technology, the printing press, was publishing stories about famous individuals. It was the print media that invaded the privacy of a few individuals at the end of the 19th century; computers in the 1960s, followed by networked computers in the 1980s, enabled the invasion of individuals' privacy by governments and businesses. In the first wave of information and communication technologies (ICT), there were large databases on central systems, almost a replica of large filing cabinets of paper files, in which individuals could be tracked for their PI in a single database. The second wave enabled an individual to be tracked across multiple databases with cross-referencing, leading to what is now known as 'profiling'. There was a need for privacy or data protection laws based on a set of privacy principles to ensure privacy protection; such laws were created in the 1980s. The European Union Data Protection Directive 95/46 was a far-reaching effort to harmonize privacy protection laws across the EU countries, mandating that they legislate and implement privacy laws based on the Directive. Have these laws helped achieve the objective of privacy protection, or have they been overwhelmed by technological developments?

The Age of Oversharing

Let's look at the next ICT wave since the dawn of the present century, which has transformed the individual from being a passive data subject to an active data creator, communicator, and sharer. E-commerce applications, email, chat, blogs, and social networks like Facebook, Orkut, and Twitter help persons become data creators. Alan Westin's definition of privacy as 'the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others' becomes more relevant, since the focus has shifted to a person's choice about what others may know about them. They want to control what they reveal about themselves to others. But can they really exercise that control?

Controlling Commercialization of Personal Information

PI has become a commodity with an economic value attached to it. Organizations correlate increasing amounts of data and convert it into forms that are useful to the data subject himself, and to many other businesses. People share data for a number of reasons, but it is those who aggregate data from social networks and correlate it with data obtained from other sources who have the potential to put privacy at risk. The real cost of trading in privacy is not known.
What is a reasonable expectation of privacy, especially with pervasive and invasive technologies such as cookies? Technologies enable data collection without the knowledge of users in an online environment. But consumers cannot be expected to be activists about their privacy, even though they want to control the dissemination of their PI. This is true in an offline environment too, when data is collected, especially by the government, in surveys, during enrollment in schemes, in enumeration, or in electoral rolls for issuing a voter card. Governments have to set an example by eschewing practices that abuse citizens' data by using it for purposes other than those for which it was collected. For example, data collected for a survey, or for assessing low-income groups, cannot be used by law-enforcement agencies. It cannot be sold by collecting agencies to businesses. Banks collecting the PI of account holders cannot sell it to telecom call centers. And so on. Nations have to have laws that promote privacy protection.
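To make the cookie mechanism concrete: a third-party tracking cookie is simply a long-lived identifier that the browser stores once and then returns, silently, on every subsequent request to the tracker's domain. The sketch below is a minimal illustration using Python's standard http.cookies module; the cookie name 'visitor_id' and the domain 'tracker.example' are hypothetical.

```python
from http.cookies import SimpleCookie
import uuid

# On a user's first visit, the tracker's server issues a long-lived unique
# identifier. The user typically never sees or approves this exchange.
issued = SimpleCookie()
issued["visitor_id"] = uuid.uuid4().hex                    # hypothetical cookie name
issued["visitor_id"]["max-age"] = 60 * 60 * 24 * 365 * 2   # persists for ~2 years
issued["visitor_id"]["domain"] = "tracker.example"         # hypothetical third-party domain
print(issued.output())  # the Set-Cookie header the browser silently stores

# On every later page that embeds content from that domain, the browser
# automatically sends the identifier back, letting the tracker link visits
# across unrelated sites into a single profile.
returned = SimpleCookie()
returned.load(f'visitor_id={issued["visitor_id"].value}')  # simulated Cookie header
print("Identifier seen again:", returned["visitor_id"].value)
```

The point is structural: nothing in this exchange requires the user's knowledge or participation, which is why 'consent' is so hard to locate in it.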

Living with a Stalker

There is a need to dissociate the availability of data from its use. Digital data generated by all kinds of sources is everywhere. An individual's primary purpose in going online is to engage in activities that include buying, reading, leisure, social networking, blogging, and chatting. They are burdened with a notice, choice, and consent regime, which does not seem to be working anymore. They are asked to worry about how their data is collected, for what purpose, what value their data has, and so on. They are tracked and linked by several organizations for different purposes. One can learn about oneself by doing a Google search, going to Facebook, and visiting various other online communities. But then this data is available to others too, and they can use it for any purpose, such as denying a job based on views expressed on a certain site. Worst of all, data is permanent-the internet does not let you forget anything. Does an individual have a right to oblivion? How do you empower individuals to control their data? That should be a key consideration in devising privacy principles for the new age. But let's first review the existing privacy principles and their limitations.

Privacy Principles and Laws

The European Union and the US have different approaches to privacy protection, resulting in different international instruments of privacy. Should countries have privacy laws that are consistent? Or should the objective be outcome-driven, based on globally accepted privacy principles and best practices, with industry self-regulation under an appropriate law, ie, co-regulation? Most countries are in agreement on the universality of a set of privacy principles, although the emergence of several new ICTs has put some of these principles at risk; some new principles are being debated. It was the US that came up with a set of privacy principles, in what is known as the Fair Information Privacy Practices (FIPPs) of 1974, that provided for protection of consumers' PI. The OECD Privacy Guidelines, on the other hand, released in 1980, were issued to ensure that privacy protection did not end up becoming a non-tariff barrier to international trade, in which global data flows were ever increasing. The privacy principles (PPs) are as follows: Collection limitation, data quality, purpose specification, use limitation, security safeguards, openness, individual participation, and accountability. As can be seen, these are similar to the FIPPs.

United States: The US has a history of self-regulation, especially in its safe-harbor program with the EU. It has defined 7 PPs: Notice, choice, onward transfer (to third parties), access, security, data integrity, and enforcement. Privacy is largely viewed as a consumer and an economic issue. 
Americans are comfortable with having businesses handle their information, but skeptical about putting data with the government. There are more laws restricting government collection and use of information than laws restricting corporate collection and use of information. With so many state and federal laws-SOX, GLBA, HIPAA, the California Data Breach Notification Law-and various agencies responsible for data protection, privacy protection is a veritable patchwork of laws. In fact, nearly 600 laws for privacy and data security exist in the US. Moreover, oversight is decentralized, and data protection is not a core mission of any government agency. A partial list of privacy and data protection laws in the US is as follows: Children's Online Privacy Protection Act 1998 (COPPA), Gramm-Leach-Bliley Act (1999), Federal Trade Commission Act, Freedom of Information Act 1966 (FOIA), Privacy Act of 1974, Health Insurance Portability and Accountability Act of 1996 (HIPAA), Family Education Rights and Privacy Act (1974), Privacy Protection Act of 1980, and Driver's Privacy Protection Act of 1994.
OECD Guidelines: In the OECD Guidelines, the 'what' to do and the 'how' to implement are in the Explanatory Memoranda, with the 'how' of implementation left to countries. It is this 'how' that differs between the US and the EU. The EU Directive seeks to standardize the 'how' part not only in EU countries but across the world. However, because of cultural differences between various countries, the 'how' part is, and will likely remain, different. Hence harmonization or consistency cannot mean that every country should follow a certain way of implementation; instead it means the same set of globally acceptable PPs. So, while the aspirational goals are common to all countries, implementations are going to be different, because the privacy instruments take into consideration the local culture, history, and legal traditions of each country.
EU Data Protection Directive: The EU Data Protection Directive, as noted above, mandates that the EU member states "shall protect the fundamental rights and freedoms of natural persons, in particular, their right to privacy with respect to the processing of personal data." The Directive stipulates the following privacy principles: Data must be processed fairly and lawfully, collected for specific and legitimate purposes, adequate and relevant, accurate and secure, and not kept longer than necessary; data subjects' rights must be protected, including access and correction; no transfer to third countries with inadequate protection; and restriction on automated decision-making. It also mandates that Data Protection Authorities (DPAs) shall be created with wide powers to oversee implementation of privacy protection. Article 25 mandates that transfer of data to third countries can take place only if "the third country in question ensures an adequate level of protection." It is the EU that determines whether a third country has 'adequate protection', and this is based on unclear criteria; an important element of the assessment is whether privacy law in a third country is similar to that expected by the Directive. The expectation thus is harmonization of laws in accordance with the EU Directive. Derogations are through the routes of Binding Corporate Rules (BCRs) for multinational corporations, and standard contractual clauses for contracts between data controllers and data processors in third countries that are deemed not to have adequate protection.

APEC Privacy Framework: This is a grouping of some 21 countries that has come up with the APEC Privacy Framework to promote e-commerce. Self-regulation is part of the APEC Privacy Program, which has taken the approach of accountability under which the data protection obligations flow along with data in trans-border data flows. 
In order to accommodate different privacy laws in various countries, APEC has placed emphasis on the practical aspects of data flows and on the manner of interface between various players, including companies, regulators, and governments. Cross-Border Privacy Rules (CBPRs), along with information sharing, investigation, and enforcement across borders among regulators, including self-regulatory organizations (SROs) will form an integral part of the APEC Privacy Framework. The CBPRs are akin to BCRs allowed to multinationals under the EU Directive. 

The privacy principles represent a conception of privacy, and there is a high degree of agreement among the various approaches-US, OECD, EU, APEC-around the world. There is thus a set of globally accepted privacy principles. Transparency, enforcement, and accountability are the cornerstones of privacy protection. Many countries do not have privacy laws; in some countries, such as the US, data protection is realized through consumer protection laws. As long as there are laws that can be used to punish violators, privacy can be protected. The EU Directive was based on the OECD privacy principles, which in turn were inspired by the FIPPs of the US. There is, therefore, a high degree of compatibility between the EU and the US. However, the similarity is at the level of privacy principles, not in the method of implementation. The APEC privacy principles are similar too, but they promote working with countries that may not have any privacy laws. The APEC Privacy Program recognizes the role of SROs; they can fulfill the role of regulators. The focus is on the accountability of data controllers and data processors.

Privacy Principles and New Technologies

During the last 30 years, since the OECD privacy principles were announced, the context in which these guidelines operate has changed-explosion in the volume and uses of PI triggered by technological advancements that help collect, store, process, aggregate, link, mine, analyze, and transfer large quantities of data. Moreover, the role of PI in the economy and society has expanded largely because of an easy access to fixed and mobile devices connected over the global internet.

The 1980 OECD Privacy Guidelines were framed to keep global data flows free and not to hinder international free trade. Today, people want data delivered to them on multiple platforms, and they want consumer empowerment too. Yet, innovation and new tools have to be encouraged for economic growth. For example, Facebook enables people to use many applications, which deliver value to them.

Many emerging technologies have stretched the limits of applicability of the privacy principles-in fact, some of the principles appear to be in trouble. Does 'consent' have any meaning with advanced cookies? Notice and choice do not have a central role, but they seem to occupy a major part of the global debate on privacy. In practice, these principles seem to cause endless frustration for consumers: although in online transactions such notices are sent to them, there is precious little choice available to them. The only choice is not to avail of services if one disagrees. Thus consent is neither informed nor voluntary. This is similar to the case of the government asking for information, failing which a service may not be delivered to a citizen or consumer. Since much of privacy has to do with 'fairness', many of the privacy principles that are in trouble-because of emerging technologies, social networking, and pervasive surveillance online and in the physical world through cameras, scanners, RFID tokens, mobile phones, GPS, etc-are under review. At the same time, principles like 'accountability' and 'privacy by design' are gaining acceptance.

Determining the Right

It has to be recognized that individuals have various roles-consumer, citizen, employee-in which their privacy concerns differ. They also have different attitudes towards privacy: privacy intensive, privacy pragmatist, or privacy insensitive. 
The pragmatists are governed by the following considerations: benefits, risks, legal protection, and trust in others. Many of them view data empowerment as the use of data to achieve fundamental rights-similar to the rights of freedom of assembly, association, and speech. The interplay of technologies and individuals has changed over the last three decades. In the 2010s, people use data to exercise their rights, as opposed to the 1980s, when only government and business organizations had data about people. Then, data was used for economic growth and innovation. Now it is about right versus right-which right is more fundamental? For example, data minimization versus data empowerment. In the 1980s, it was all about the individual's human/fundamental right vis-à-vis the data controller-data minimization was a privacy guideline. What is the status now, and how will it evolve? 

Global Privacy Protection Review Efforts

EU's Review: The EU launched a consultation on the legal framework for the fundamental right to the protection of personal data in July 2009. In a paper entitled 'A Comprehensive Approach on Personal Data Protection in the European Union', which the European Commission submitted to the European Parliament in November 2010, the key objective was to ensure that individuals have the right to enjoy effective control over their PI in the new digital age.

Recommendations of DSCI: DSCI submitted its response to the questionnaire circulated by the EU, in so far as it relates to outsourcing and global data flows. Prior to that, DSCI had submitted its suggestions on extending BCRs to service providers.

Measures of the US Bodies: In the US, on the other hand, the Federal Trade Commission (FTC) and the Department of Commerce have engaged people on privacy matters, and have come up with separate green papers through which they are seeking public comment. The FTC report, 'Protecting Consumer Privacy in an Era of Rapid Change: A Proposed Framework for Businesses and Policymakers', was released on December 1, 2010. Its focus is consumer privacy protection. It concludes that the existing privacy models, based on the notice-and-choice and harm-based approaches, are insufficient to address evolving privacy issues. Consumer consent is missing in the complicated online environment, while reputational and psychological harms are also not covered. It suggests a new framework with three core principles: privacy by design, simplification of consumer choice, and greater transparency. The report also suggests that a 'do not track' feature be developed in applications to enable consumers to prevent the tracking of their internet activities.
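Mechanically, 'do not track' was later realized as a one-bit HTTP request header (DNT: 1) that the browser sends with each request; honoring it is voluntary on the server side, which is precisely the enforcement gap the FTC report worries about. A minimal, hypothetical server-side check (the function name is illustrative) might look like this:

```python
def should_track(request_headers: dict) -> bool:
    """Return False when the client has expressed a Do Not Track preference.

    The DNT header carries "1" to opt out of tracking (per the W3C Tracking
    Preference Expression draft). Honoring it is voluntary for the server.
    """
    # HTTP header names are case-insensitive; normalize keys and values.
    normalized = {k.lower(): v.strip() for k, v in request_headers.items()}
    return normalized.get("dnt") != "1"

# A browser with the feature enabled sends {"DNT": "1"}:
print(should_track({"DNT": "1"}))         # False: tracking disallowed
print(should_track({"User-Agent": "x"}))  # True: no preference expressed
```

Because nothing compels a site to perform such a check, the mechanism only works alongside the report's other principles of accountability and transparency.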

The Department of Commerce Internet Policy Task Force privacy green paper, 'Commercial Data Privacy and Innovation in the Internet Economy: A Dynamic Policy Framework', focuses on reducing barriers to business development and innovation, and recommends minimal regulation using voluntary, enforceable codes of conduct that would be created by industry. It advocates a privacy framework based on revitalized FIPPs that would engender consumer trust while maintaining flexibility for business development and innovation. It also discusses the importance of global interoperability among diverse international privacy frameworks, and of nationally consistent breach notification rules.

Recommendations of DSCI
India is a vast country, where outreach can be through industry associations and other NGOs, and not through a single bureaucratic DPA. DSCI recommends that the proposed privacy law should take care of the following:

Light Weight Regulations: It should be based on global privacy principles that value economic benefits of data usage and flow, while guaranteeing privacy to citizens

Bureaucratic Structure: Avoid a bureaucratic structure that could hinder business interests and lose the spirit of the intent in operational implementation

Self-regulated Businesses: Rely on self-regulation by businesses that promotes good practices, keeping the privacy program relevant to technological advancements

Legal Recognition: Provide legal recognition to the role of self-regulatory bodies, promoted by industry associations, in enforcing codes of practice for privacy in the interest of citizens' rights

Associations: Notify and implement through self-regulatory organizations like industry associations

Ensuring Privacy of Customers: Allow businesses to self-declare the codes of practice that they have implemented to protect the privacy rights of their customers

Public Private Partnership: Establish a mechanism, in the form of a public-private partnership, to resolve the disputes and grievances of citizens

Self-regulation with a legal sanction, ie, co-regulation, should be the way forward. The self-regulatory organizations will define the processes and codes of practice, which are vetted and recognized by the government through the proposed privacy law. Co-regulation should be the guiding spirit.

What can society do to increase public awareness of privacy? Ethical responsibility is essential; merely sending a 'notice' is not adequate. How can data minimization be better implemented? The solution lies in improved practices. Cloud computing adds another dimension to the problem: an individual may be viewed as a citizen of a cloud database-what rights does one have? The cloud will have to share data back with the individual. The regulatory structure will be expected to create the right incentives for companies to engage in privacy protection, and to create tools that empower people, eg, for privacy impact assessment (PIA). Users should be empowered with self-audit tools that may be provided by online providers such as Google. Governments need to create more transparency, eg, by conducting PIAs of departments and making them public.

Consumer and privacy issues come together. The trust factor can come from regulators, which may have a certification role and an enforcement function too. SROs in various sectors can do the same. Privacy Seal type certification schemes can be used-these are being considered in the review of the EU Directive. NGOs have a role in watching privacy conformance. Citizens can be assured of privacy protection if the gatekeepers work according to the following rules: government should do minimum regulation, industry should engage in self-regulation, and users should be careful about putting out their personal information.

Striking a Balance between National Security and Privacy 
In India, the government has initiated a series of discussions on the establishment of a legal framework for a privacy law. This was triggered by the launch of the government's UID and NATGRID projects. While the objective of the first project is to provide a unique identity to residents to enable them to participate in the benefits of economic development through financial inclusion, the objective of the other is to target criminals for national security. In both these projects, it is the privacy of the individual that is at risk. A privacy law is proposed to be created which balances national security with the privacy requirements of individuals. DSCI submitted a consultation paper to the government on the subject, and also made a presentation to the Select Committee drafting the privacy law.

The views expressed here are the author's personal views.