The Year Ahead in Digital Policy: Proliferation of Data Protection Laws Continues, with Children’s Privacy Going Mainstream

In 2018, the EU’s General Data Protection Regulation (GDPR) came into force with the intention of protecting EU citizens’ fundamental privacy rights, defining legal obligations for data controllers and data processors, requiring adequacy for cross-border transfers, and imposing heavy fines for non-compliance, among other measures. Five years on, we have seen an abundance of privacy laws emerge globally, with each country adding its own flavour of how to protect citizens.

In 2023, we expect significant markets, including India, Nigeria, Pakistan and Tanzania, to move ahead with their own data protection laws, and new data-focused regulatory instruments, such as the EU Data Act, will emerge. In the US, five states will also likely see their consumer privacy laws take effect, following the recent amendments to the California Consumer Privacy Act (CCPA). In other news, advertisers will need to rethink their strategies as third-party cookies come to an end. Amid these developments, there will be new momentum for greater regulation of children’s privacy – a topic that will move from the sidelines into the mainstream.

What’s at stake?

With the emergence of online safety bills, such as those in Australia, New Zealand and the UK, the need for more harmonised children’s privacy regulation is also becoming evident. The online services covered by such laws are wide-ranging and include social media platforms, apps, games, connected devices and news services. If these services are likely to be accessed by children, even if not directly aimed at them, they will still fall within the scope of many of these regulations. This means that companies must make significant changes to how they design their services and process children’s personal data to ensure compliance.

That is why it is so important to get this new type of global standard in privacy right from the beginning. The lessons from the GDPR have taught us that stringent frameworks can raise costs for firms of all sizes around the world, and compliance can be a challenge amid an uncertain enforcement climate. Another risk is multiplying compliance and cross-border data transfer burdens, as each country puts its own spin on children’s privacy and arrives at subtly different standards for implementing general principles.

What are the policy considerations?

There is a fair amount of dispute among regulators about the appropriate age at which children should be able to provide their own consent for their data to be processed. In legal terms, in most countries, those under 18 are considered children. However, countries have different views on the appropriate cut-off point, which can vary from 13 to 18 years old. This means that, in theory, if strict compliance were followed, 16- and 17-year-olds could be excluded from many public spaces on the Internet. The potential for unintended consequences of age restrictions can be demonstrated by the following example: in the same year that (then) 15-year-old Swedish climate activist Greta Thunberg harnessed the power of social media to spearhead a global youth movement, the GDPR-related changes to the minimum age for using online services could, in theory, have prevented her from starting a simple online petition.

Children’s privacy, protection and participation rights must all be held in balance. Overly stringent protection measures risk limiting children’s participation in the rich opportunities afforded by connectivity; conversely, access to the internet without appropriate safeguards could make children vulnerable to threats. The ongoing debate around managing encryption encapsulates these tensions perfectly: on the one hand, the privacy of children – as with that of adults – is protected by encryption; on the other hand, those responsible for the most egregious child rights abuses facilitated by the internet – notably child sexual abuse and exploitation – are able to use encrypted services to avoid detection. Policymakers therefore need to balance these considerations in their decision-making. Appropriate age impact assessments for different online services can help, so that young people can navigate the internet safely according to their age and capacity.

What we expect to see in the year ahead

After the UK’s Age-Appropriate Design Code came fully into force in September 2021, we will likely see much stricter enforcement of its rules in 2023. On the other side of the Atlantic, the California Age-Appropriate Design Code Act was signed into law in September 2022 and will come into effect in July 2024. The Act places new legal obligations on companies concerning online products and services that are likely to be accessed by children (those under the age of 18). The two frameworks are purposefully very similar, supporting the emergence of a strong, consistent international standard for children’s privacy that we can expect to see replicated elsewhere in 2023 and beyond. Meanwhile, the EU will continue to address children’s digital protection, privacy and participation needs through the 2022 European strategy for a better internet for kids (BIK+), the digital pillar of the wider EU Strategy on the Rights of the Child.

Regulators and stakeholders across the world will start paying closer attention to the implementation and enforcement of these codes. Greater scrutiny will also fall on differences in outcomes between jurisdictions with formal regulation on age restrictions and those relying on self-regulation. In short, in 2023 the direction of travel around the world will be towards a sharper focus on children’s privacy.

Policy Good Practice: ASEAN Regional Plan of Action for the Protection of Children from All Forms of Online Exploitation and Abuse  

In October 2021, ASEAN adopted the Regional Plan of Action (RPA) for the Protection of Children from All Forms of Online Exploitation and Abuse. Among other commitments, the RPA promotes children’s access to the Internet and respects their rights to freedom of expression, privacy, and access to information, while recognising their rights to protection from all forms and risks of violence and exploitation online.  

Notably, to ensure that children’s voices had a considerable impact and that child participation played a core role in the planning stages of the RPA, consultations with children and young people took place across ASEAN Member States. The practice of incorporating children’s views continues to be a central pillar in the implementation stages, and in 2022, at the ASEAN ICT Forum on Child Online Protection, A Call to Action from Children and Young People to the Private Sector on Child Online Protection was published. This insightful document sets out four key areas that children and young people want the private sector to focus on to create a digital environment that centres on the best interests of every child.