UPSC Mains | General Studies Paper II | 2025 | 15 Marks | 250 Words
Q18.

The National Commission for Protection of Child Rights has to address the challenges faced by children in the digital era. Examine the existing policies and suggest measures the Commission can initiate to tackle the issue.

How to Approach

The answer should begin by briefly introducing the NCPCR and the increasing relevance of its role in the digital age. The body will be structured into two main parts: first, an examination of the existing legal and policy framework in India for child protection in the digital space, highlighting both strengths and limitations. Second, a detailed discussion of proactive measures the NCPCR can initiate to tackle these evolving challenges. The conclusion will synthesize these points, emphasizing a multi-stakeholder and child-centric approach.

Model Answer

Introduction

The digital era, while offering unprecedented opportunities for learning and connection, simultaneously exposes children to a myriad of online risks, ranging from cyberbullying and exposure to inappropriate content to online sexual exploitation and data privacy violations. The National Commission for Protection of Child Rights (NCPCR), established under the Commissions for Protection of Child Rights (CPCR) Act, 2005, is a statutory body mandated to safeguard the rights of children (0-18 years) in India. Its role has become critically important in extending its protective mandate into the virtual world, given the rapid digital adoption by children, accelerated by developments like the COVID-19 pandemic.

Challenges Faced by Children in the Digital Era

Children today are increasingly vulnerable to various online harms due to their pervasive presence in digital spaces. These challenges include:

  • Cyberbullying and Online Harassment: Children face trolling, harassment, and bullying on social media and gaming platforms, severely impacting their mental health.
  • Exposure to Inappropriate Content: Easy access to violent, explicit, or age-inappropriate content can harm a child's psychological and emotional development.
  • Online Child Sexual Exploitation and Abuse (OCSEA): This includes child sexual abuse material, online grooming, sexting coercion, and webcam sexual abuse, where perpetrators often exploit anonymity.
  • Data Privacy Violations and Identity Theft: Personal data of children is collected via apps and games, often without proper safeguards or verifiable parental consent, leading to privacy breaches.
  • Online Gaming Addiction and Financial Exploitation: Excessive gaming can lead to addiction, aggressive behavior, and financial exploitation through in-app purchases.
  • Disinformation and Misinformation: Children are exposed to false narratives and conspiracy theories, affecting their understanding and critical thinking.
  • Digital Divide: Disparities in access to devices and internet connectivity persist, excluding many children from digital opportunities and services.

Existing Policy Frameworks and Their Limitations

India has a multi-layered legal and policy framework aimed at protecting children, some of which extend to the digital realm. However, these often have limitations in addressing the dynamic nature of digital threats.

Key Existing Policies:

  • Protection of Children from Sexual Offences (POCSO) Act, 2012: This Act criminalizes sexual exploitation of minors and includes provisions for online offenses like child pornography (Section 14) and online grooming. The POCSO e-Box enables confidential digital complaints.
  • Information Technology (IT) Act, 2000 and IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021:
    • Section 67B of the IT Act punishes the publishing, transmission, or browsing of child sexual abuse material (CSAM), while intermediaries are obliged to remove such content under their due-diligence obligations.
    • The IT Rules, 2021, impose content moderation and platform accountability, requiring social media platforms to take down content harmful to minors upon notification and recommend parental control mechanisms.
    • These rules also strengthen accountability by requiring grievance redressal mechanisms and, for significant social media intermediaries, traceability of the first originator of unlawful messages.
  • Digital Personal Data Protection Act, 2023 (DPDP Act, 2023) and Draft DPDP Rules, 2025: This legislation is a significant step, explicitly recognizing children as a sensitive category.
    • It mandates verifiable parental consent for processing data of individuals under 18.
    • It prohibits tracking, behavioral monitoring, or targeted advertising directed at children.
    • The draft rules released in 2025 further operationalize consent checks and data retention policies.
  • Juvenile Justice (Care and Protection of Children) Act, 2015: This Act provides for the care, protection, and rehabilitation of children in conflict with law and children in need of care and protection, including those affected by online crimes.
  • National Cyber Crime Reporting Portal (cybercrime.gov.in) and National Cyber Crime Helpline (1930): These platforms provide mechanisms for reporting cybercrimes, including those against children.
  • CERT-In Advisories: The Indian Computer Emergency Response Team issues advisories on cyber threats and promotes cybersecurity awareness.

Limitations in Existing Policies:

  • Age-Verification Gap: Current systems often rely on self-declaration or generic IDs, making age verification easily circumvented, allowing minors to access age-restricted content and platforms.
  • Inadequate Platform Cooperation: While some platforms cooperate, there is inconsistent enforcement and reporting of CSAM and other harmful content, and uneven safety practices across global platforms.
  • Reactive vs. Proactive Approach: Many laws are applied reactively after harm has occurred, lacking a strong proactive and preventive focus for the online world.
  • Digital Literacy Gaps: Enforcement efforts are often hampered by low digital literacy among children, parents, and even law enforcement in understanding complex online threats.
  • Pace of Technological Change: The rapid evolution of technology and new forms of online exploitation outpaces legislative updates.
  • Jurisdictional Challenges: Cross-border cybercrimes against children pose significant challenges for enforcement.

Measures the NCPCR Can Initiate to Tackle the Issue

The NCPCR, as the apex child rights body, can take a proactive and multi-pronged approach:

Policy Advocacy and Regulatory Enhancement:

  • Advocate for a "Digital Safety Code for Children": Push for a comprehensive code compelling social media, gaming companies, and EdTech platforms to incorporate safety-by-design features, default privacy settings, robust age verification mechanisms, and simpler reporting tools.
  • Strengthen Age-Assurance Standards: Work with MeitY and industry stakeholders to develop and implement privacy-preserving age-assurance technologies that are difficult for children to bypass.
  • Review and Amend Laws: Advocate for regular review and amendment of existing laws like the IT Act and POCSO Act to keep pace with technological advancements and emerging forms of cybercrime, such as "sharenting" and cyberbegging.
  • Platform Accountability: Demand greater accountability from digital platforms, requiring them to publish annual transparency reports on child safety measures, compliance, and efforts to remove harmful content.

Awareness and Capacity Building:

  • Launch "Surakshit Bachpan" Digital Safety Campaigns: Conduct nationwide, multi-lingual campaigns targeting children, parents, educators, and community leaders on digital literacy, safe online behavior, cyberbullying prevention, and reporting mechanisms.
  • Integrate Digital Literacy into Curriculum: Collaborate with NCERT and state education boards to develop and mandate a comprehensive digital literacy curriculum in schools, covering responsible internet use, privacy, and online risks for different age groups.
  • Capacity Building for Stakeholders: Organize regular training programs for law enforcement agencies (especially cybercrime cells), Child Welfare Committees, judiciary, and counselors to enhance their capacity in handling online child abuse cases with sensitivity and technical expertise.

Strengthening Reporting and Redressal Mechanisms:

  • Develop a Child-Friendly "Report-Remove" Interface: Create a dedicated, confidential, and easy-to-use online portal or app, potentially integrating existing helplines (like Childline 1098) and cyber portals for swift reporting and removal of harmful content.
  • Collaborate with Tech Companies for AI-Based Monitoring: Foster partnerships with technology companies to deploy AI-driven tools (e.g., similar to Microsoft's PhotoDNA) for proactive detection and removal of CSAM and other illicit content, while ensuring privacy safeguards.
  • Enhance International Cooperation: Strengthen collaboration with international organizations like UNICEF, Interpol, and the National Center for Missing and Exploited Children (NCMEC) to combat cross-border cybercrimes against children.

Research and Monitoring:

  • Regular Research and Threat Assessment: Conduct continuous research to understand emerging online threats, trends in child digital usage, and the effectiveness of existing interventions.
  • Monitor Platform Compliance: Regularly monitor digital platforms' compliance with child safety norms, data protection regulations, and content moderation guidelines, and issue directives for non-compliance.

Conclusion

The digital landscape presents both immense potential and significant perils for children. The NCPCR, as the custodian of child rights, must transcend traditional boundaries and proactively address the evolving challenges of the digital era. By advocating for a robust digital safety code, enhancing age verification, fostering widespread digital literacy, strengthening reporting mechanisms, and promoting greater accountability from digital platforms, the Commission can ensure that technology remains an enabler for children's growth and empowerment, rather than a source of exploitation. A collaborative approach involving government, industry, civil society, parents, and children themselves is crucial to building a truly safe and inclusive digital environment for India's future generations.

Answer Length

This is a comprehensive model answer for learning purposes and may exceed the word limit. In the exam, always adhere to the prescribed word count.

Additional Resources

Key Definitions

Cyberbullying
Cyberbullying involves repeated, intentional harm inflicted through digital means, such as derogatory comments on social media, public shaming, exclusion from online groups, or impersonation via fake profiles. Its relentless and often anonymous nature makes it particularly damaging for victims.
Child Sexual Abuse Material (CSAM)
CSAM refers to any material, including images or videos, depicting children in sexually explicit acts. Its production, distribution, and consumption are serious crimes punishable under laws like the POCSO Act and IT Act.

Key Statistics

According to a 2023 survey, 1 in 3 children aged 10–17 in India has encountered cyberbullying or inappropriate content online.

Source: The Times of India (Jan 2025)

Cybercrimes against minors in India rose by 400% between 2019 and 2020, as per an Economic Times report (2021), highlighting the growing digital risks.

Source: Economic Times (2021) / ResearchGate (2025)

Examples

Online Gaming Addiction

Cases of children developing severe addiction to online games like Free Fire or BGMI are increasingly reported. This can lead to aggressive behavior, academic decline, and even financial exploitation through in-app purchases or 'gaming debts', causing distress to families.

Sharenting Concerns

Parents inadvertently share excessive personal information and images of their children on social media (termed 'sharenting'). While seemingly innocuous, this can create a permanent digital footprint for the child without their consent, exposing them to privacy risks or future exploitation.

Frequently Asked Questions

What is the role of parental consent under the Digital Personal Data Protection Act, 2023?

Under the DPDP Act, 2023, verifiable parental consent is mandatory for processing the personal data of children under 18 years. This aims to safeguard children's privacy by ensuring that their data is not collected, stored, or processed by online platforms without explicit approval from a parent or guardian.

Topics Covered

Social Justice, Child Rights, Technology, National Commission for Protection of Child Rights (NCPCR), Digital Era, Child Challenges, Existing Policies, Suggested Measures