Can We Really Verify Consent Online? India’s Age and Identity Problem, Explained

For years, the Indian internet has functioned on a convenient lie.

We have all seen the checkbox asking if we are 18 or older. Users click it without thinking, and platforms accept it without questioning. That arrangement enabled frictionless growth while maintaining a thin veneer of compliance.

The Digital Personal Data Protection (DPDP) Act changes that equation. What was once convenient is now a material liability.

This is not just a legal shift; it is a structural mismatch. Most global platforms were built for a world where 13 is the threshold for digital adulthood. In India, that bar now sits at 18. That five-year gap, representing millions of users between 13 and 18, creates a significant compliance blind spot.

Under the DPDP framework, these users are minors. This means consent cannot be assumed, cannot be self-declared, and must be verifiable. Most systems today are simply not built for that reality.

The real challenge here is not conceptual; it is operational. Every organisation now needs to answer a set of difficult questions. Is this user a minor, and if so, who is the guardian? Can that relationship be verified, and more importantly, can it be proven later if required?

At scale, none of this is straightforward. Self-declared age is unreliable, while KYC-style verification introduces friction and drop-offs that product teams are hesitant to accept. Guardian verification is even more complex, largely because there is no standardised infrastructure to establish and validate that one individual is legally authorised to consent for another.

This is not just a product problem. It is fundamentally an identity infrastructure gap.

At the same time, AI-driven personalisation is accelerating rapidly. Systems today continuously profile behaviour, predict preferences, and drive engagement decisions. But AI does not understand age; it understands patterns. If a 14 year old behaves like an adult user, most systems will treat them as one.

Under DPDP, that assumption creates real risk. The defence of not knowing that a user was a minor is no longer tenable. If systems are profiling a minor, serving targeted or sensitive content, or influencing behaviour without valid and verifiable consent, the issue moves beyond product design into serious regulatory exposure.

There is also a deeper structural issue that organisations often underestimate. Consent may be collected in one place, but risk exists everywhere else.

Data today is not static. It flows continuously across internal systems, cloud infrastructure, analytics layers, and third-party processors. This is where compliance begins to break down. When consent is withdrawn, organisations must be able to trace that data across systems, revoke it across all processors, and demonstrate that this has been done within regulatory timelines. For many organisations, whether they can actually do this remains unclear.
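A rough sketch of what withdrawal fan-out involves, under loudly stated assumptions: the registry mapping a consent to its downstream processors, the processor names, and the function itself are all hypothetical, and in practice keeping that mapping current as data flows through the stack is the hard part.

```python
from datetime import datetime, timezone

# Hypothetical registry: which downstream processors hold data tied to a
# given consent ID. Maintaining this mapping accurately is itself the
# core challenge the article identifies.
PROCESSOR_REGISTRY = {
    "consent-123": ["analytics-vendor", "crm", "ad-platform"],
}

def withdraw_consent(consent_id: str) -> dict:
    """Fan a withdrawal out to every known processor, keeping a log
    that can later demonstrate the action was taken and when.

    Notifying a processor is only the first step of actual erasure;
    a real system would store each processor's acknowledgement too.
    """
    receipt = {
        "consent_id": consent_id,
        "withdrawn_at": datetime.now(timezone.utc).isoformat(),
        "notified": [],
    }
    for processor in PROCESSOR_REGISTRY.get(consent_id, []):
        # In a real system: an API call with retries, plus a stored
        # acknowledgement for the audit trail.
        receipt["notified"].append(processor)
    return receipt

receipt = withdraw_consent("consent-123")
```

If the registry is incomplete, withdrawal silently fails for the missing processors, which is exactly the "no reliable mechanism to stop the flow" problem described above.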

We are effectively building digital ecosystems without a reliable mechanism to stop the flow of data once it has started.

For years, privacy has been treated as a policy exercise. Organisations updated notices, added consent banners, and maintained documentation. That model does not hold anymore because policies do not verify identity, track data flows, or enforce accountability across systems.

What is needed now is a shift towards digital trust infrastructure. This includes identity-linked consent rather than simple checkbox logs, real-time visibility into data lineage across systems and vendors, purpose-bound data flows, processor-level accountability, and verifiable audit trails.
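One simple way to make an audit trail "verifiable" rather than just a mutable table is to hash-chain it, so each entry commits to the one before it and any later edit to history is detectable. The sketch below uses only standard hashing; it is an illustration of the idea, not a description of any specific product's implementation.

```python
import hashlib
import json

def append_event(trail: list[dict], event: dict) -> list[dict]:
    """Append a consent event to a tamper-evident, hash-chained log."""
    prev_hash = trail[-1]["hash"] if trail else "genesis"
    payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    trail.append({
        "event": event,
        "prev": prev_hash,
        "hash": hashlib.sha256(payload.encode()).hexdigest(),
    })
    return trail

def verify_trail(trail: list[dict]) -> bool:
    """Recompute every hash in order; any tampering returns False."""
    prev_hash = "genesis"
    for entry in trail:
        payload = json.dumps({"event": entry["event"], "prev": prev_hash},
                             sort_keys=True)
        if entry["prev"] != prev_hash:
            return False
        if hashlib.sha256(payload.encode()).hexdigest() != entry["hash"]:
            return False
        prev_hash = entry["hash"]
    return True

trail: list[dict] = []
append_event(trail, {"type": "consent_granted", "user": "u-1042"})
append_event(trail, {"type": "consent_withdrawn", "user": "u-1042"})
```

Because each hash covers the previous entry's hash, rewriting any past event breaks every subsequent link, which is what lets the trail be demonstrated under scrutiny rather than merely asserted.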

Privacy can no longer sit at the interface layer. It has to be embedded into the core system architecture.

As enforcement becomes more outcome-driven, organisations need to move from intent to operational readiness. The real question to solve is how to address the identity infrastructure gap in order to reliably identify a minor and obtain verifiable parental consent.

From there, three critical questions follow. Do we know where minors' data exists and how it flows? This visibility must be continuous, not point-in-time. Can we control how that data is used across systems and partners? Control must extend beyond organisational boundaries. And finally, can we prove all of this under scrutiny and within tight timelines? Because if it cannot be demonstrated, it effectively does not exist.

The DPDP Act is not just tightening compliance; it is forcing a reset. The age of assumed identity is over, and the age of unverifiable consent is ending.

What comes next is more demanding, but necessary. A system where identity is verifiable, consent is real, and data flows are accountable.

Because in the end, this is not about whether consent was taken. It is about whether that consent can be trusted.

And in India’s next phase of digital growth, trust will not be declared. It will have to be engineered.

Malcolm Gomes
Chief Operating Officer
Privy by IDfy