UK mandates compliance with children’s privacy design code

App makers in the UK were given an additional 12 months to comply with a design code that protects children online. That grace period ends today: makers of digital services likely to be accessed by children must now ensure those services meet the code’s standards, which restrict how children can be tracked and profiled. The design code originally came into force on September 2 last year, but the Information Commissioner’s Office (ICO), which enforces the UK’s data protection rules, gave companies a transition year to bring their services into line with it.

Services covered by the code include connected toys, games, ed-tech, online retail, social media, and video-sharing platforms, all of which attract significant numbers of children. The code requires these services to apply a high level of privacy by default whenever the user is a child, including switching off geolocation and profiling. It also calls for parental controls, paired with age-appropriate information that tells children when such tools are in use.
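To make those defaults concrete, here is a minimal sketch of what “high privacy by default” for a child account might look like in practice. It is written in TypeScript with entirely hypothetical names (UserSettings, defaultSettingsFor); it does not reflect any real platform’s API, only the behavior the ICO’s code describes:

```typescript
// Hypothetical sketch of the code's "high privacy by default" standard.
// All names are illustrative; no real platform API is assumed.
interface UserSettings {
  geolocationEnabled: boolean;      // location tracking
  profilingEnabled: boolean;        // behavioral profiling / personalization
  parentalControlsEnabled: boolean; // monitoring tools for parents
}

function defaultSettingsFor(isChild: boolean): UserSettings {
  if (isChild) {
    // Per the design code: geolocation and profiling off by default,
    // parental controls on, with the child told such tools are active.
    return {
      geolocationEnabled: false,
      profilingEnabled: false,
      parentalControlsEnabled: true,
    };
  }
  // Adult defaults are whatever the service normally offers.
  return {
    geolocationEnabled: true,
    profilingEnabled: true,
    parentalControlsEnabled: false,
  };
}
```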

Another provision targets dark-pattern design, telling app developers not to use “nudge methods” to persuade minors to disclose personal data or to “weaken or turn off their privacy protections.”

The full code contains 15 standards, but it is not regulation in itself; rather, it is a set of design guidelines the ICO wants app developers to follow.

The risk for apps that violate the rules is that they will attract the watchdog’s attention, whether through a complaint or a proactive investigation, and could then face a wider ICO audit of their entire approach to privacy and data protection.

The regulator has also warned that noncompliance with the kids’ privacy code could be viewed as a black mark against (enforceable) UK data protection laws: if the code is not followed, it will be difficult for an app maker to argue that its processing is fair and complies with the GDPR [General Data Protection Regulation] or the PECR [Privacy and Electronic Communications Regulations].

Stephen Bonner, the ICO’s executive director of regulatory futures and innovation, wrote in a blog post that the regulator will be “proactive” in requiring social media platforms, video and music streaming sites, and the gaming industry to comply with the code. It will offer support where possible and, in some cases, order investigations or audits, he added. The ICO currently sees these platforms as carrying the highest risks: they collect children’s data to power personalized features and content, some of which may be inappropriate for kids to see, and the regulator is concerned about the harms this can create, which might be physical, emotional, psychological, or financial.

Bonner added that the code sets out how companies can use children’s data, and that the ICO wants to see organizations taking the initiative to protect children online through designs and services that follow it. The ICO’s enforcement powers are very broad, at least on paper: the GDPR, for example, allows it to fine infringers up to £17.5 million or 4% of their annual global turnover, whichever is higher. The watchdog can also issue orders prohibiting data processing or forcing changes to non-compliant services. Apps that neglect the code must therefore be prepared to face penalties.
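As a quick worked example of that “whichever is higher” ceiling (a hypothetical illustration, not legal guidance): the fixed £17.5 million cap only binds when annual global turnover is below £437.5 million; above that, the 4% figure takes over.

```typescript
// Illustrative only: the UK GDPR fine ceiling described above is the
// greater of a fixed £17.5M or 4% of annual global turnover.
function maxUkGdprFine(annualGlobalTurnoverGbp: number): number {
  const FIXED_CAP_GBP = 17_500_000;
  return Math.max(FIXED_CAP_GBP, 0.04 * annualGlobalTurnoverGbp);
}

console.log(maxUkGdprFine(100_000_000));   // 17500000 (fixed cap binds)
console.log(maxUkGdprFine(1_000_000_000)); // 40000000 (4% of turnover binds)
```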

Major companies such as Instagram, YouTube, and TikTok have all announced child safety updates in recent months, a sign that they have been preparing to abide by the code. Apple’s addition of child safety-focused features drew particular attention because of the controversies surrounding them.

While there has been increased attention in the United States to online child safety and the exploitative ways in which some apps handle children’s data, as well as a number of open investigations in Europe (such as a Commission investigation into TikTok based on complaints), the United Kingdom may be having an outsized impact here due to its concerted push to introduce age-focused design standards.

The code also aligns with upcoming UK legislation that will impose a “duty of care” on platforms, requiring them to take a broad, safety-first stance toward users, with a particular focus on children (and it will apply to all children, rather than just those under the age of 13, as COPPA does in the United States).

In his blog post, Bonner credited the code with prompting many of the recent child safety changes made by platforms such as Facebook, Google, Instagram, and TikTok. He added that Ireland’s Data Protection Commission is following in the ICO’s footsteps, and France’s data protection watchdog has adopted similar rules. The code is also fueling the growth of a domestic compliance services industry: the ICO recently announced a set of GDPR certification scheme criteria that gives prominence to the age-appropriate design code.

The ICO will formally set out its position on age assurance this autumn, according to Bonner’s blog post, giving organizations covered by the code more guidance on how to handle that tricky piece. It is still unclear how stringent a requirement the ICO will support, with Bonner suggesting it could involve “verifying ages or age estimation.”

Children’s online safety has been a major concern for UK authorities in recent years, even though the larger and long-awaited Online Safety (née Harms) Bill remains in draft form. An earlier attempt by UK lawmakers to bring in mandatory age checks to prevent children from accessing adult content websites was dropped in 2019 after widespread criticism that it would pose a significant privacy risk to adult users of porn sites. The government has since continued to look for ways to regulate children’s safety online, and now, through this design code, organizations are being pushed to adopt stricter age verification methods.

At the same time, the government’s intensifying push on online safety risks colliding with some of the laudable goals of the ICO’s non-binding children’s privacy design code.

While the code includes the (welcome) recommendation that digital services collect as little information about minors as possible, the UK government signaled the opposite earlier this summer: ahead of the anticipated Online Safety legislation, lawmakers issued guidance to social media companies and messaging services advising them to prevent youngsters from using end-to-end encryption.

So the official UK government message to app developers is that the law will soon require commercial services to access more of children’s information, not less, in the name of child safety, in direct opposition to the design code’s data minimization push.

The danger is that the growing focus on children’s privacy will be muddled and undermined by ill-thought-out laws that force platforms to surveil children in order to deliver “protection” against a variety of online dangers, such as adult content, pro-suicide content, cyberbullying, and CSAM. Legislation that pushes platforms to show their workings in order to demonstrate compliance might lead to ever-closer tracking of children’s activities, data retention, and possibly profiling and age verification checks (which could even end up being applied to all users; think sledgehammer to crack a nut). In a nutshell, it’s a privacy dystopia.

Such mixed messages and disjointed policymaking appear set to increase confusion and conflicting requirements for digital services operating in the United Kingdom, making tech companies legally responsible for divining clarity amid the policy mess, with the risk of huge fines if they get it wrong.

As a result, adhering to the ICO’s design criteria may be the easiest part. 
