Understanding the Data Protection Bill Through the Lens of Child Rights
Updated: Aug 11
Ever since the Supreme Court of India recognised privacy as a fundamental right and the Justice Srikrishna Committee was constituted, the country has been waiting for a personal data protection law. With the Digital Personal Data Protection Bill (DPDPB), 2023, now passed by both the Lok Sabha and the Rajya Sabha, we have seen several analyses of it from various perspectives. The bill, however, also needs to be examined from the perspective of child protection. This piece is an attempt to fill that gap by reading it through the prism of child rights, data protection, and digital risks.
The bill is a huge leap forward in ensuring child protection by making verifiable parental consent mandatory for processing the personal data of persons under 18 years of age. We term it a giant leap because it introduces a step of human judgment between children and digital algorithms. By placing this decision in the hands of parents or guardians, the bill acknowledges the need for gatekeeping a child’s digital life and personal data.
The bill also moves beyond consent by expressly forbidding data processing likely to cause harm to a child and by restricting tracking and behavioural monitoring. These provisions resonate with a global movement towards greater ethical consideration in data handling, particularly for those unable to provide informed consent. The provision barring behavioural monitoring is a brave and crucial step, considering that social media and gaming giants routinely use behavioural studies to target and engage users. This tactic, lucrative for businesses, often leads to a form of digital entrapment in which online habits are analysed and content is tailored to sustain attention, sometimes resulting in serious problems such as gaming or social media addiction in children. The DPDPB, 2023, thus acknowledges that children must be protected from predatory marketing practices and algorithms, even where that defies powerful commercial interests.
One important aspect to consider is that, irrespective of who the Data Fiduciary is and what they declare as the purpose of collecting data, in the case of children the law needs to state that the Data Fiduciary shall not collect any personal details, device IDs, or other information relating to a child; that any limited information collected for necessary purposes, such as educational details, shall not be shared with any other entity; and that the parent or guardian of the child shall have the right to seek the removal or erasure of such data if required.
Yet, as they say, the devil is in the details. The operative word here is ‘verifiable’, and much depends on its implementation. Ensuring that this verification is robust, yet not so stringent that it hampers legitimate needs, will, in our view, be the linchpin of children’s digital safety. Another aspect we see as necessary is a clear definition of what constitutes harm to children. The considered opinions of experts, important stakeholders, and organisations working in the domain of child safety can prove a valuable asset to the government, and it must consult these groups to ensure the bill becomes a well-rounded instrument of child safety.
As an organisation working in the domain of child safety, we find ourselves considering several intricate scenarios raised by the legislation’s nuances. Will schools be allowed to continue collecting data beyond what is needed to secure admission for a child, for instance parents’ income, even when the parent is not seeking any EWS concessions? The bill declares that unnecessary data shall not be collected, so can a parent refuse to share their income without facing denial of admission for their child? If a school does deny admission for such a reason, what redressal mechanisms are in place to ensure the child does not suffer?
These probing questions demand precise answers, as they touch upon the real-life implications of the law. The Rules must therefore spell out in detail the roles and accountabilities of every institution, industry body, or other entity (in a nutshell, every Data Fiduciary) in ensuring data privacy. Continued room for interpretation will leave ample scope to breach the data protection of children without a prompt redressal mechanism alongside.
Another concern we have is regarding the exemptions. The Digital Personal Data Protection Bill, 2023, in its set of exemptions, offers considerable latitude to certain data fiduciaries, including startups. These exemptions appear to have been conceived with the intent of fostering innovation and easing the regulatory burden on fledgling businesses, and on the surface this makes sense: by reducing red tape, the government hopes to fuel the growth of a burgeoning industry. However, the exemptions give rise to significant concerns when viewed through the lens of child protection.
The exemptions permit these startups to bypass obligations that are otherwise essential for safeguarding individual privacy: notice before processing data, the necessity of a valid contract before engaging third-party data processors, the right of data principals (in this case, often children or their guardians) to information about their personal data, and several other crucial data protection provisions of the DPDPB, 2023.
Let's put this into perspective. Imagine a start-up that offers interactive learning modules for children. This startup is one of the data fiduciaries exempted from the essential provisions. Without the obligation to provide prior notice, this company could collect and process vast amounts of data on a child's learning patterns, preferences, and more.
Without the need for a valid contract with third-party data processors, there's no telling where this data might end up or how it could be used. Such information, in the wrong hands, could be used to target the family with tailored advertisements or educational products that may not be in the child's best interest. If this data were mishandled or breached, it could lead to further exploitation, impacting the child's educational journey or even leading to identity theft.
And if a concerned parent decides they no longer want their child's data with the company? The exemption means there's no onus on the start-up to erase this data upon withdrawal of consent.
Given the bill’s stated aim of prioritising child protection through data privacy, startups in the edtech and edutainment sectors may need a specific set of guidelines or an adjusted regulatory framework, one that reflects both their innovative spirit and their obligations to children.
Another important aspect to consider is the commonly accepted but unwritten rule, established by social media companies and various learning and edutainment platforms, that sets 13 as the minimum age for participation. How does the Data Protection Bill, 2023, interact with this standard? Does it now mandate that any Data Fiduciary in this category obtain parental consent for any data collected on a user under 18 years of age? Once again, the devil lies in the details. What will the bill, or the subsequent Act, permit? Will it merely require a set of terms and conditions presented by a Data Fiduciary and accepted on behalf of the child by the parents? Or is there a deeper layer of responsibility, particularly in the context of any child with internet access? The answers to these questions are critical to shaping the actual impact of the legislation on children’s digital safety.
As this bill is poised to become an act, we feel it is imperative that a child protection lens be applied to it. While we consider it a forward-looking bill, we offer the following recommendations for the rules as may be prescribed:
Define 'Harm': A clear and comprehensive definition of 'harm' must be established, considering the multifaceted nature of potential threats to children in the digital space.
Redressal Mechanism: A structured redressal mechanism is vital, wherein the National Commission for Protection of Child Rights (NCPCR) and the State Commissions for Protection of Child Rights (SCPCRs) should play a role in monitoring the implementation of this act concerning children, ensuring accountability and responsiveness.
Recognise Far-Reaching Consequences: Acknowledge that potential 'harm' to children in case of a data breach can have extensive consequences, including, but not limited to, abuse and exploitation.
Stricter Penalties for Child Data Breach: Accept that a data breach involving children, by virtue of them being a more vulnerable group, constitutes a more serious offence than a similar breach involving adults. Accordingly, such breaches should attract stricter penalties.
No Exemptions for Children's Data: Ensure that no data fiduciary or startup collecting children's data is exempted from any provision, thereby mandating robust safeguards regardless of the company's size or nature.
Fast-Tracked Redressal with CPCR Consultation: If a data breach involving children occurs, the relevant authorities such as NCPCR/SCPCR must be consulted, and a fast-tracked redressal mechanism should be implemented to mitigate and respond to the harm expeditiously.
In conclusion, the success of this legislative effort lies not only in its breadth but also in its depth. Embracing these specific recommendations will strengthen the framework, making it not just a law on paper but a substantial shield for children in a rapidly digitising world.
Written by: Chitra Iyer (CEO and Co-founder, Space2Grow) and Maitreyee Shukla (Research Specialist, Space2Grow)