In August 2023, India’s Digital Personal Data Protection Act (Act) received presidential assent, establishing the country’s first comprehensive, union-wide data protection law. The Act differs in many respects from a preceding version that was withdrawn in 2022, and incorporates intensive discussion and revision since then. In its final form, the Act reflects the central government’s concern to preserve and enhance a pro-business, pro-innovation environment for the country.

With a population in excess of 1.4 billion, India’s healthcare and associated sectors, such as life sciences and insurance, are key elements both of the economy and of the central government’s social policy agenda. Although historically dominated by the private sector, India’s healthcare industry has seen increasing cooperation between government and private entities, including public-private partnership (PPP) arrangements and the Ayushman Bharat National Health Protection Mission, launched in 2017. Its Pradhan Mantri Jan Arogya Yojana (PM-JAY) component was launched in September 2018 to create the world’s largest health assurance scheme, aiming to provide health cover of ₹500,000 per family per year for secondary and tertiary care hospitalization to the bottom 40% of the Indian population. Given the huge potential to harness personal data, both for large-scale research and for the precise targeting of medical services, key provisions of the Act seek to balance the rights and freedoms of individuals (referred to in the Act as data principals) against the potential for startups and data-intensive businesses to flourish and grow.

However, sensitive healthcare and insurance data is an attractive target for cyberattacks, including ransomware and denial-of-service attacks. The Indian healthcare industry faced more than 1.9 million cyberattacks in 2022, including a November 2022 attack on the All India Institute of Medical Sciences (AIIMS) New Delhi that forced the institute to shut down many of its servers and switch to manual operations. The key challenge, therefore, is to ensure that innovation does not come at the cost of diluted data protection.

Consent and “Legitimate Uses”

The Act recognizes two main lawful grounds for processing personal data, namely:

  • Consent from data principals
  • Certain “legitimate uses”

The Act also provides that consent may be sought and obtained for any “lawful purpose”, meaning any purpose that is not expressly forbidden by law. There is, however, a test of “necessity”, under which a particular processing activity must be necessary to achieve the “lawful purpose” for which consent has been obtained. Much will therefore depend on the clarity with which healthcare organizations seek consent for the processing of personal data. Adopting a particularly useful feature of earlier Indian legislation, the Act includes a number of illustrations to aid its interpretation. The illustration used to demonstrate the limits of consent relates directly to healthcare:

“X, an individual, downloads Y, a telemedicine app. Y requests the consent of X for (i) the processing of her personal data for making available telemedicine services, and (ii) accessing her mobile phone contact list, and X signifies her consent to both. Since phone contact list is not necessary for making available telemedicine services, her consent shall be limited to the processing of her personal data for making available telemedicine services.”
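
For app developers, the practical effect is that consent needs to be recorded and enforced at the level of specific purposes, and processing that is not necessary for a consented purpose should be refused. The following is a minimal sketch, in Python, of one way a telemedicine app might apply that necessity test; the purpose names, activity names and mapping are hypothetical assumptions for illustration only and are not drawn from the Act.

```python
from dataclasses import dataclass, field
from typing import Set

# Hypothetical mapping of each consented purpose to the processing
# activities that are actually necessary to achieve it.
NECESSARY_ACTIVITIES = {
    "telemedicine_services": {"process_health_data", "schedule_consultation"},
}

@dataclass
class ConsentRecord:
    """Consent captured from a data principal, limited to named purposes."""
    data_principal_id: str
    consented_purposes: Set[str] = field(default_factory=set)

def may_process(consent: ConsentRecord, purpose: str, activity: str) -> bool:
    """Permit processing only if the purpose was consented to AND the
    activity is necessary for that purpose (the Act's necessity test)."""
    return (purpose in consent.consented_purposes
            and activity in NECESSARY_ACTIVITIES.get(purpose, set()))

# The Act's telemedicine illustration: even if X signified consent broadly,
# contact-list access is not necessary for telemedicine services.
consent = ConsentRecord(data_principal_id="X",
                        consented_purposes={"telemedicine_services"})
print(may_process(consent, "telemedicine_services", "process_health_data"))  # True
print(may_process(consent, "telemedicine_services", "read_contact_list"))    # False
```

The design choice here is simply that necessity is evaluated against a predefined mapping of activities to purposes, so a broadly worded consent cannot widen what the app actually processes.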

Consent is not required if a processing activity is within one of the “legitimate uses” recognized by Section 7 of the Act. The most general of those “legitimate uses” is set out at Section 7(a) and allows processing of personal data “for the specified purpose for which the Data Principal has voluntarily provided her personal data to the Data Fiduciary, and in respect of which she has not indicated to the Data Fiduciary that she does not consent to the use of her personal data”. The relevant illustration says:

“X, an individual, makes a purchase at Y, a pharmacy. She voluntarily provides Y her personal data and requests Y to acknowledge receipt of the payment made for the purchase by sending a message to her mobile phone. Y may process the personal data of X for the purpose of sending the receipt.”

While that general provision might provide some useful leeway, public or private sector organizations concerned with healthcare are perhaps more likely to rely on the more specific provisions within Section 7.

Section 7(b) applies where the state or any of its instrumentalities provides or issues to the data principal such subsidy, benefit, service, certificate, license or permit as may be prescribed, and either:

  • They have previously consented to the processing of their personal data by the state or any of its instrumentalities for any subsidy, benefit, service, certificate, license or permit
  • Such personal data is available in digital form in, or in non-digital form and digitized subsequently from, any database, register, book or other document that is maintained by the state or any of its instrumentalities and is notified by the central government

In either case, the standards followed for such processing must be in accordance with the policy issued by the central government or any law for the time being in force for the governance of personal data.

The relevant illustration reads:

“X, a pregnant woman, enrols herself on an app or website to avail of government’s maternity benefits programme, while consenting to provide her personal data for the purpose of availing of such benefits. Government may process the personal data of X to determine her eligibility to receive any other prescribed benefit from the government.”

“Health” is specifically referred to in Section 7(f) and (g). Personal data may be processed without consent:

“(f) for responding to a medical emergency involving a threat to the life or immediate threat to the health of the Data Principal or any other individual; or

(g) for taking measures to provide medical treatment or health services to any individual during an epidemic, outbreak of disease, or any other threat to public health.”

While Section 7(f) is similar in terms and limitations to the “vital interests” lawful basis for processing personal data at GDPR Article 6(1)(d), Section 7(g) reflects the experience of the COVID-19 pandemic and provides a basis for programs such as testing, contact tracing and vaccination.

Exemptions

While Section 7 sets out “legitimate uses” for which personal data may be processed without the need for consent, Section 17 creates a number of exemptions under which certain obligations of the data fiduciary and certain rights of the data principal do not apply. For healthcare organizations, the most likely to be useful are:

  • Section 17(2)(b), which applies where processing is necessary for research, archiving or statistical purposes if the personal data is not to be used to take any decision specific to a data principal and such processing is carried on in accordance with such standards as may be prescribed.
  • Section 17(3), which empowers the central government, having regard to the volume and nature of personal data processed, to exempt certain data fiduciaries or a class of data fiduciaries, including startups, from the provisions of Section 5, subsections (3) and (7) of Section 8 and sections 10 and 11. For the purposes of this subsection, the term “startup” means a private limited company or a partnership firm or a limited liability partnership incorporated in India, which is eligible to be and is recognized as such in accordance with the criteria and process notified by the department to which matters relating to startups are allocated in the central government.

Duties and Challenges

Under the Act, data fiduciaries are obliged to protect personal data in their possession or under their control by taking reasonable security safeguards to prevent a personal data breach. This obligation extends to situations where a data fiduciary engages a data processor to carry out processing of personal data on its behalf.
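
The Act does not prescribe particular technical measures, but one commonly adopted safeguard when sharing health records with a data processor is to pseudonymize direct identifiers so the processor never receives them in the clear. The sketch below, in Python using only the standard library, is purely illustrative; the field names, the sample identifier and the keyed-hash (HMAC) approach are assumptions, not requirements of the Act.

```python
import hashlib
import hmac
import secrets

# Pseudonymization key retained by the data fiduciary and never shared with
# the data processor. (Illustrative only; real deployments would use a
# managed secrets store and documented key rotation.)
PSEUDONYM_KEY = secrets.token_bytes(32)

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256)."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def prepare_for_processor(record: dict) -> dict:
    """Drop or pseudonymize direct identifiers before a health record is
    passed to a data processor; retain only the fields needed for the task."""
    return {
        "patient_ref": pseudonymize(record["patient_id"]),  # stable pseudonym
        "diagnosis_code": record["diagnosis_code"],
        "admission_date": record["admission_date"],
    }

# Hypothetical record; the identifier format is made up for illustration.
record = {
    "patient_id": "PAT-2023-004217",
    "name": "X",
    "diagnosis_code": "E11.9",
    "admission_date": "2023-11-02",
}
print(prepare_for_processor(record))
```

Keeping the pseudonymization key with the data fiduciary means the processor cannot reverse the mapping, while the fiduciary can still re-identify records if it needs to respond to a data principal’s request.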

The Act itself does not distinguish between “general” processing of personal data and the processing of sensitive, critical or (in GDPR terms) “special category” personal data. However, the interaction between the Act and other existing or prospective laws relating to information technology security remains to be fully articulated. Rule 3 of the Information Technology (Reasonable security practices and procedures and sensitive personal data or information) Rules, 2011 includes physical, physiological and mental health condition and medical records and history as sensitive personal data or information, with a specific requirement in Rule 8 to ensure that it is kept secure.

This may in practice add nothing to the data fiduciary’s obligations under the Act to take reasonable security safeguards to prevent a personal data breach and to implement appropriate technical and organizational measures to ensure effective observance of the Act’s provisions. However, more specific requirements could be introduced should the central government decide to proceed with the proposed Digital Information Security in Healthcare Act (DISHA). Under that proposed legislation, healthcare organizations would have specific duties and individuals would have a range of rights, including a right to compensation for any damage caused by a data breach. Again, though, it remains to be seen whether there is any perceived need to supplement the comprehensive, union-wide regime created by the Act with sector-specific rules that would arguably risk replicating the complex patchwork of laws that preceded the new legislation. That said, the Act makes clear, at least in the provisions on cross-border transfers, that in the event of any inconsistency between the Act’s provisions and stricter sectoral rules, the stricter sectoral rules will prevail.

Significant Data Fiduciaries

One key feature of the Act arguably mitigates or removes the need for separate, sector-specific laws. The central government is empowered to notify any data fiduciary or class of data fiduciaries as “Significant Data Fiduciaries” based on the following factors, the first two of which might be particularly relevant in the context of healthcare and life sciences:

  • The volume and sensitivity of personal data processed
  • Risk to the rights of the data principal
  • Potential impact on the sovereignty and integrity of India
  • Risk to electoral democracy
  • Security of the state
  • Public order

Once designated, significant data fiduciaries will be required to carry out periodic data protection impact assessments and independent audits, and to appoint a data protection officer, who must be an individual based in India and responsible to the company’s board of directors.

India as a Global Research Pool?

While India’s own healthcare will be the primary concern of its central government, the Act’s extraterritorial reach provisions differ from those of the GDPR in one key respect. Unlike the EU GDPR, the UK GDPR and even the earlier 2022 version of the bill, the Act’s extraterritorial provisions do not apply to processing in connection with profiling of individuals within India. That omission is potentially helpful to organizations outside India looking to, for instance, train artificial intelligence (AI) models on large datasets likely to include personal data relating to individuals within India. It potentially allows AI service providers to scrape publicly available personal data from the internet without consent and without being caught by other provisions of the Act. Given that AI has an increasingly important role to play in medical and pharmaceutical research, India’s Act arguably creates a particularly permissive and innovation-friendly environment for AI-driven research. Coupled with the explicit encouragement of startups, that feature of the Act might provide a huge boost to India’s status as a global center for AI development.