
Protecting data privacy: Authorisation and access control

04 July, 2018

Subhashis Banerjee

Indian Institute of Technology Delhi

suban@cse.iitd.ac.in


Subodh V. Sharma

Indian Institute of Technology Delhi

svs@cse.iitd.ac.in

The B.N. Srikrishna Committee, which was constituted in August 2017 to draft India’s data protection laws, is expected to submit its final recommendations this month. In this post, Banerjee and Sharma discuss key features of the Committee’s White Paper, and advocate authorisation and access control as a viable framework for privacy-by-design.



The Committee of Experts on a Data Protection Framework, chaired by Justice B.N. Srikrishna, was constituted in August 2017 to examine issues related to data and privacy protection in digital applications, recommend methods to address them, and draft an effective data protection law. The Committee’s White Paper on a data protection framework (Srikrishna et al. 2017) is based on the broad principles of informed consent; enforcement of collection, storage, and purpose limitation1; accountability and penalties; uniform application across all sectors; and technology neutrality. It outlines the roles of data controllers and data processors, and their responsibilities and accountability, and recommends independent data regulators for enforcement, in either a command-and-control or a co-regulatory structure. While the White Paper also suggests privacy-by-design, it does not elaborate on possible methodologies for such an approach. In this post, we advocate authorisation and access control as a viable framework for privacy-by-design.

Detection vs. prevention

Privacy protection in digital databases has been less than effective anywhere, mainly because enforcement methods have been weak. In most cases, enforcement strategies have relied on post-facto punitive and corrective measures after violations are detected. Even the recently enacted European General Data Protection Regulation (GDPR) (The European Parliament and the Council of the European Union, 2016) does not clearly specify any standards for enforcement. We argue that an architectural solution that prevents privacy invasions in the first place is more likely to succeed than strategies based on detection of violations and subsequent punitive measures. Detection of privacy infringements will often be uncertain because the causal effects of invasions are usually hard to establish.

For example, it may turn out to be impossible to know for sure whether a person lost their job because their medical data was accessed without authorisation and used to discriminate against them, or whether some other reason given as the official explanation was the actual determining factor. Causal links of privacy violations arising from indiscriminate and unethical use of machine learning are also hard to establish, and the right to explanation proposed in the GDPR is unlikely to be an effective countermeasure. Hence, an ex-ante, rather than ex-post, approach (Raghavan 2018) ought to be preferred.

Privacy protection does not demand that personal data should not be collected, stored, or used, but that there should be provable guarantees that the data cannot be used for unauthorised purposes.

Rights-based approach

When user participation in a digital application is voluntary, informed consent has often been advocated as the foundational principle for privacy protection. However, information overload and choice limitation often make consent ineffective. Also, considering that a large fraction of India’s population may not have the cultural capital needed to deal with complex digital setups, a rights-based approach (Matthan 2017) that shifts a significant part of the accountability from the individual to the data controller will be more appropriate. This should not be in lieu of, but in addition to, individual consent. Any mandatory digitisation with personal identifiers, such as in income tax or welfare, needs to be backed by a just and proportional law, and must balance the potential loss of individual privacy with the expected public good. In either case, purpose limitation must be a fundamental operative principle for protecting the privacy rights of individuals.

Identifiers and privacy

Use of unique personal identifiers, or even phone numbers, in application databases may enable unauthorised profiling of individuals through correlation of identities across different application domains, and lead to identification without consent. Moreover, it is well known (Dwork and Roth 2014) that anonymisation with provable guarantees against re-identification attacks is a difficult task. Hence, it is imperative not only to use different virtual identifiers for each application domain, making linking across domains impossible (Agrawal et al. 2017), but also to architecturally prevent unauthorised access to information in siloed databases.
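One simple way to realise such domain-specific virtual identifiers is to derive them from a master identifier with a keyed one-way function, using a separate secret key per application domain, so that identifiers issued to different domains cannot be linked without the keys. The sketch below is only illustrative; the key names and the HMAC-based construction are our simplifying assumptions, not a prescription from the cited architecture.

```python
import hashlib
import hmac

# Hypothetical per-domain secret keys, held only by the identity provider.
DOMAIN_KEYS = {
    "health": b"secret-key-for-health-domain",
    "pds": b"secret-key-for-pds-domain",
}

def virtual_id(master_id: str, domain: str) -> str:
    """Derive a domain-specific virtual identifier from a master identifier.

    HMAC is a keyed one-way function, so identifiers issued to different
    domains can neither be linked to each other nor inverted without the keys.
    """
    return hmac.new(DOMAIN_KEYS[domain], master_id.encode(), hashlib.sha256).hexdigest()

# The same person receives two unrelated identifiers in the two domains.
print(virtual_id("1234-5678-9012", "health"))
print(virtual_id("1234-5678-9012", "pds"))
```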

Obligation of regulators

The main task of a data regulator should be to ensure that all data accesses are legitimate and that they do not violate consent, purpose limitation, or any other rights-based principle. It should be obligatory for all controllers and processors to present their data access and processing requirements to the data regulator for scrutiny. As part of this process, the associated computer programmes for accessing and processing the data must also be audited and pre-approved by the data regulator. Both the data regulator and the data controller should maintain independent and non-repudiable logs – perhaps in a public blockchain (a cryptographically secured, distributed ledger) – of all requests and approvals, and the data regulator should issue an authorisation token for each such access request. It should be incumbent on the data regulator to ensure that data accesses conform to the authorisations granted. The data regulator should also approve and authorise any purpose extension after verifying the legitimacy of the request, based on a rights-based or consent-renewal principle.
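As a rough illustration of how authorisation tokens and non-repudiable logs might fit together, the sketch below hash-chains each log entry to the previous one, so that past entries cannot be silently altered, and has the regulator sign a token for every approved access request. The names and the shared-key signing are hypothetical simplifications; a real deployment would use public-key signatures and replicated or blockchain-backed logs.

```python
import hashlib
import hmac
import json
import time

# Placeholder signing key; a real regulator would hold an asymmetric private key.
REGULATOR_SIGNING_KEY = b"regulator-signing-key"

class AppendOnlyLog:
    """A minimal hash-chained log: every entry commits to the previous one,
    so any tampering with history is detectable."""

    def __init__(self):
        self.entries = []
        self.last_hash = "0" * 64

    def append(self, record: dict) -> str:
        payload = json.dumps({"prev": self.last_hash, "record": record}, sort_keys=True)
        self.last_hash = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append({"hash": self.last_hash, "record": record})
        return self.last_hash

def issue_authorisation_token(request: dict, log: AppendOnlyLog) -> dict:
    """Regulator logs an approved, pre-audited access request and issues a signed token."""
    entry_hash = log.append({"event": "approval", "request": request, "time": time.time()})
    signature = hmac.new(REGULATOR_SIGNING_KEY, entry_hash.encode(), hashlib.sha256).hexdigest()
    return {"request": request, "log_entry": entry_hash, "signature": signature}

# Example: a processor asks to run a pre-approved analytics programme on the health database.
regulator_log = AppendOnlyLog()
token = issue_authorisation_token(
    {"processor": "stunting-study", "database": "health", "programme_hash": "ab12..."},
    regulator_log,
)
print(token)
```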

For example, if the data controller for the Ministry of Health wants to make parts of an individual’s electronic health record available to a requesting health professional, the regulator of the health record database must examine and pre-approve the entire protocol for consent generation and case-by-case verification, in addition to the architecture for data access. There ought to be a data regulator for every database maintained by a data controller, and a data processor may need to obtain approval from multiple regulators. If a longitudinal study to understand the causes of the high rate of stunting in India needs access to individuals’ electronic health records, and the PDS (public distribution system) purchase records and consumption data of their families, the computer programmes for data access and analytics must be scrutinised and pre-approved by the data regulators of the health, PDS, and other consumption-related databases. All data accesses must be monitored by the regulators. Neither the regulator nor the controller should access the data independently of the other, and both must maintain independent non-repudiable logs of all data accesses. The regulatory control should not be so lax that it is ineffective, nor so overbearing or paralysed by inertia that it stifles innovation.

The technology to support such regulatory functions exists (Agrawal et al. 2017), and it may be possible for the data regulators to digitally sign both the authorisations and the computer programmes responsible for data access and processing. In such situations, it should even be possible for the regulators to verify the authorisations and the authenticity of the accessing and processing programmes online, in real time, before granting access to the data. We believe that building such regulatory capacity, alongside a strong data protection law, will be key to effective data privacy protection in India.
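To make this concrete: a regulator could sign the hash of an audited programme, and the controller could refuse to run any programme whose signature does not verify at access time. The following is a minimal sketch under those assumptions; the function names are ours, and a real system would use asymmetric signatures (for example, Ed25519) rather than the shared HMAC key shown here.

```python
import hashlib
import hmac

# Illustrative shared key; real deployments would verify a public-key signature instead.
REGULATOR_KEY = b"regulator-verification-key"

def sign_programme(source_code: str) -> str:
    """Regulator signs the hash of an audited data-access programme."""
    digest = hashlib.sha256(source_code.encode()).hexdigest()
    return hmac.new(REGULATOR_KEY, digest.encode(), hashlib.sha256).hexdigest()

def verify_before_access(source_code: str, signature: str) -> bool:
    """Controller checks, at access time, that the programme being run is
    byte-for-byte the one the regulator audited and signed."""
    return hmac.compare_digest(sign_programme(source_code), signature)

audited_programme = "SELECT AVG(height_for_age) FROM health_records WHERE district = ?"
signature = sign_programme(audited_programme)

# Any modification after auditing invalidates the signature.
assert verify_before_access(audited_programme, signature)
assert not verify_before_access(audited_programme + " OR 1=1", signature)
```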

Note:

  1. This requires that data should be used only for the purpose for which it was collected.

