This Saturday, 25 May, will mark the first anniversary of Europe’s General Data Protection Regulation (GDPR) coming into force. Alex Cooney, CEO of CyberSafeIreland, a non-profit working to empower children, parents and teachers to navigate the online world in a safe and responsible manner, discusses the impact of the regulation on children, particularly the GDPR’s requirement for a digital age of consent.
Children’s right to data privacy has come into sharper focus since the introduction of the General Data Protection Regulation across Europe in 2018, which for the first time specifically addressed data protection from a child’s perspective. Article 8 determined that each member state would set its own ‘Digital Age of Consent’: the age at which young people may sign up for online services such as social media without needing the explicit consent of a parent or guardian. This age had to fall somewhere in the range of 13 to 16 years. Following much debate on the issue in Ireland – some of it constructive, some of it less so and some of it frankly misguided – 16 was eventually determined to be the “right” age for Irish children.
One year on, we felt it was worth examining what has changed in practice since this decision on 25 May 2018. As it turns out, most online service providers, apart from WhatsApp, stuck with 13 as the stated minimum age for using their services. This was effectively the de facto age of consent before the GDPR came into force, as a result of the earlier US Children’s Online Privacy Protection Act (COPPA).
As we regularly chat to children aged 8–13 and survey them as part of our schools education programme, we were able to easily identify the 10 most popular social media and messaging apps amongst this age cohort. These are listed below in order of popularity, with the relevant minimum age restriction for ease of reference:
| In order of popularity | App | Minimum age restriction according to T&Cs |
| --- | --- | --- |
A significant majority of the preteen children we speak to are already using social media and messaging apps. The data we published at the end of the last school year (2017/18), based on responses from more than 5,000 children, showed just how high the figure is: 70% of 8–13-year-olds were on social media, and usage increases with age, from 47% of 8-year-olds to 91% of 13-year-olds. We haven’t yet crunched the data for this year, but early indications are that the number will remain high.
This raises the question: how easy is it for underage children to sign up to these services? Pretty easy, as it turns out, with or without parental consent. The new regulation states that online service providers must make a “reasonable effort” to verify that parental consent has been given (whatever that means!).
With the support of Liliana Pasquale, Assistant Professor at the School of Computer Science at University College Dublin, and Paola Zippo, a master’s student visiting UCD from Politecnico di Bari in Italy, we took a closer look at the terms and conditions (T&Cs) and sign-up processes of these online services to determine what measures they had put in place to make clear their minimum age restriction, to verify parental consent and to deter underage usage. As it turns out, there weren’t many. There is certainly no age verification: if an underage child is prepared to lie and enter 16 as their age, they can access any of these services without restrictions, user limitations or warning boxes.
Whilst the minimum age is clear at the point of sign-up on some of these apps, including Snapchat, Instagram and TikTok, on others such as WhatsApp, Messenger and Discord it is not. The latter group do not ask for an age at sign-up at all, so you’d need a magnifying glass to scroll through the largely unintelligible (particularly from the point of view of a child) T&Cs to work it out.
In fairness, those services that do ask for a birth date will refuse to let you sign up if you enter an age of less than 13. (A little warning box will appear.) But you can simply try again with an older age and then you’re in.
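The birth-date gate described above amounts to logic like the following (an illustrative sketch, not any provider’s actual code): the service computes an age from whatever date the user types in and rejects under-13s, but nothing ties that date to reality, so a second attempt with an earlier birth date sails through.

```python
from datetime import date

MIN_AGE = 13  # the de facto COPPA minimum used by most services


def age_from_birth_date(birth_date: date, today: date) -> int:
    """Compute age in whole years from a self-declared birth date."""
    years = today.year - birth_date.year
    # Subtract one if this year's birthday hasn't happened yet
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years


def sign_up_allowed(birth_date: date, today: date) -> bool:
    """A naive age gate: it trusts whatever date the user enters."""
    return age_from_birth_date(birth_date, today) >= MIN_AGE


today = date(2019, 5, 25)
# A truthful 10-year-old is blocked (the "little warning box")...
print(sign_up_allowed(date(2009, 1, 1), today))  # False
# ...but the same child can simply retry with an earlier date.
print(sign_up_allowed(date(2000, 1, 1), today))  # True
```

Because the check runs only on the self-declared date, it deters honesty rather than underage use, which is exactly the gap our review found.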
In a nod to the GDPR, some services have additional conditions for children aged 13–16:
- Snapchat notes that it may limit how they store and use data for this age group.
- Instagram suggests asking for parental authorisation. A child can skip this step, however, and can also enter a false date, because there is no age verification. The age they enter appears to affect which ads are shown to 13–16-year-olds.
- Viber allows children of this age to use its services, but states that it does so with greater control and protection.
- Skype asks for an email address of a parent if a child enters an age less than 16.
- Any teenager over the age of 13 can register for and use Facebook, but to view any content it is necessary to obtain parental authorisation, either via their Facebook page or by email.
Of course, there is nothing to stop a child setting up an email account for the purpose of bypassing the requirement for parental consent on either of the latter two apps. Or they can simply enter an age of 16 and bypass the need for parental consent completely.
So going back to the point at which we started, has anything changed since the introduction of the GDPR and specifically Article 8? Not really. Underage children can still easily sign up to these services, with or without parental consent. Even where attempts to gain parental consent exist, they can be easily bypassed. And where this results in a child lying about their age, there will be no protection of their data.
All that hot debate about 13 or 16 has proved largely meaningless: whilst the regulation was clearly well intentioned, it hasn’t yet had the intended impact. Online service providers continue to look the other way whilst underage kids use their services (and share their data) in large numbers.
This post originally appeared on the LSE Media Policy Project Blog and is reproduced with permission and thanks.