Data protection and child protection

The Secretary of State for Health, Jeremy Hunt, recently wrote to the largest social media companies following up on a meeting he had with them before Christmas 2017. He indicated his discontent with the slow progress they were making in areas such as age verification, screen time limits and cyber bullying. He gave them until 30 April to respond with details of the steps they had taken on those points and their intentions regarding 'healthy screen time' for young users. Possible legislation was threatened if the voluntary joint approach continued to prove unsatisfactory. The details of their responses have yet to be made public.

Just what possible future legislation could look like is also not clear at the moment. On 25 May 2018 the General Data Protection Regulation (GDPR) will take direct effect across EU Member States, and domestically a new Data Protection Act will also come into force to complement it and repeal the existing 1998 Act. The Regulation and the Act specifically recognise the particular vulnerabilities of children and that they can be less able to appreciate the risks, consequences and safeguards involved in the collection and processing of their personal data. The approach of the legislation is to provide enhanced protection for children through increased obligations on data controllers and processors, particularly where their personal data is to be used for marketing or for creating personality or user profiles.

The lawful bases for processing children's personal data will be the same as those for adults, set out in Article 6 of the GDPR. However, particular care has to be taken when relying on consent as the basis for processing. In the UK, those offering information society services (e.g. online shopping, banking, search engines and social media) directly to children can only rely on consent if the child is aged 13 or over. The GDPR sets the default age at 16 but allows Member States to derogate down to 13, so the age will be 16 in some other Member States. Seven other Member States, including Ireland, Spain and Denmark, have also adopted age 13. If the child is below the applicable age, consent will have to be given by the person holding parental responsibility for the child.

The data controller will also be required to make reasonable efforts to verify in such cases that consent is given or authorised by the holder of parental responsibility over the child, taking into consideration available technology.

However, the consent of the holder of parental responsibility will not be necessary in the context of preventative or counselling services offered directly to a child. Competence is as important a concept under the GDPR as it was under the Data Protection Act 1998.

Children will have the same rights as adults over their personal data. An individual's right to erasure is particularly relevant where they gave their consent to processing as a child, were not fully aware of the risks involved in the processing, and later want to remove the personal data, particularly from the internet.

Clause 124(1) of the draft Data Protection Bill requires the Information Commissioner to prepare a code of practice containing such guidance as she considers appropriate on standards of age-appropriate design for relevant information society services likely to be accessed by children. It will have the force of statutory guidance. The code has not yet been published but is to be prepared within 18 months of Royal Assent.

It is clear that, for controllers and processors, data protection by design and by default is going to be central to anything relevant to children.

An earlier attempt to regulate an aspect of the internet for child protection purposes has just come into effect. The Digital Economy Act 2017 received Royal Assent last year, but its requirement that providers of online pornography verify that their customers are aged 18 or over is only now beginning to take effect. On 21 February 2018 the British Board of Film Classification was confirmed as the age verification regulator. It will be responsible for identifying and notifying non-compliant providers of online commercial pornography. It will also notify ancillary service providers and payment-services providers, and can direct internet service providers to block access to non-compliant pornography services. At present there are no figures available against which to benchmark this work; that will take several years yet.

Technology has to a great extent been a source of freedom and awareness for many children, but it, and the legislation that seeks to control it, have not kept ahead of the misuse of the internet as a means to groom, exploit and abuse. Major challenges no doubt lie ahead, but they need to be addressed with haste; otherwise, by the time any new measures take effect, technology will have moved even further ahead.


Posted by Sarah Firth, an associate with BLM
