07.08.2024

GDPR and facial recognition in schools and colleges: six steps to avoid being caught out

A school in Essex, Chelmer Valley High School, has been reprimanded by the Information Commissioner's Office (ICO) after it introduced Facial Recognition Technology (FRT) to manage its cashless catering system for students.

FRT is software that identifies or verifies a person by analysing their facial features. Uses include face detection and faceprint matching. Facial data processed in this way is biometric personal data under UK GDPR and is special category data, which means it is subject to enhanced protection under data protection legislation.

Background

In March 2023, the school switched from fingerprint recognition, which had been in place since 2016, to facial recognition. Because the personal data used was biometric personal data, and the individuals were children (who are regarded as vulnerable data subjects), a Data Protection Impact Assessment (DPIA) was needed to understand the risk and the data protection issues involved. This wasn't done until nine months after the school had implemented the new technology.

The school didn’t consult with its data protection officer or parents before implementation. The parents simply received a letter with a slip to return if they wanted to opt their child out of the FRT arrangements, but no opt-in consent was obtained.

Unfortunately for the school, once the DPIA was undertaken, it identified the technology as being high risk and it couldn't identify any ways to reduce that risk. Organisations must undertake a DPIA where there is “likely to be a high risk” to the individual. If the DPIA identifies that the use of personal data is high risk and the risks cannot be reduced or mitigated, organisations can't proceed until they have consulted the ICO.

In January 2024, the school submitted the DPIA to the ICO for further consultation.

What did the ICO say?

The ICO formally reprimanded the school. It found that:

  • The school should have undertaken a DPIA before rolling out the technology. The personal data concerned was special category data and the individuals were vulnerable data subjects.
  • Opt-out consent was not enough. As the personal data being processed was special category data, any consent given had to be explicit opt-in consent. This wasn't obtained until after the ICO got involved.
  • The consent of the students should have been obtained as it was likely that the students, aged 11 to 18, were competent to provide their own consent. This was later obtained from the students.
  • The school should have consulted their data protection officer. The ICO took the view that if the school had raised this with their DPO before implementation then many of the compliance issues would have been caught and addressed.

The ICO did acknowledge the school’s eventual compliance by completing a DPIA and later obtaining explicit opt-in consent from students.

What lessons can other schools and colleges take from this?

1. Understand when a DPIA is needed

Schools and colleges often avoid doing a DPIA, but DPIAs should be seen as a positive step in assessing and understanding data protection risks. They should be completed before a new use of personal data is rolled out, not several months later as happened in this case. The ICO has prepared guidance that you can use to assess whether a DPIA is needed.

Schools and colleges should regularly carry out DPIAs as students are regarded as vulnerable data subjects.

2. Use your Data Protection Officers (DPOs)

Senior leaders should engage with their DPO regularly, particularly in the early stages of deploying new technology or rolling out a new use of personal data. A DPO can help catch compliance issues before they become problems, and their expertise is invaluable in navigating complex data protection requirements.

Under the DfE guidance on data protection in schools, senior leaders are accountable for deciding on uses of technology, understanding data protection laws and getting advice from their DPO.

3. Understand what the legal basis is for the school’s use of personal data

All uses of personal data need to be lawful, and UK GDPR sets out defined lawful bases. If you are relying on explicit consent (as was the case here, because facial recognition data is special category data), you should understand what that requires. UK GDPR-level consent is not easy to obtain, and opt-out consent is not explicit consent.

4. Who do you communicate with - the pupil or their parent?

Once a child is able to understand what is happening with their personal data, and what a request such as a Data Subject Access Request (DSAR) means, you should communicate with them directly about the use of their personal data. There is no hard and fast rule under UK GDPR about when you need to communicate with a student; it comes down to their level of understanding, but you need to think about who to communicate with in any given situation.

5. Provide clear communication

Provide a clear privacy notice explaining how you are using personal data of your pupils and students.

6. Take proactive measures

Taking proactive steps, such as refreshing consent and completing necessary assessments, can prevent regulatory action and protect the reputation of your school or college.

We can help

Our data protection team, led by Joanne Bone, can help your school or college adopt AI technologies in a legally compliant way. Joanne is down to earth, provides practical advice and takes a particular interest in the use of AI in the education sector.

Our newsletters

We publish monthly employment and education newsletters. If you'd like to be added to the mailing list, please let me know.


“Handling people’s information correctly in a school canteen environment is as important as the handling of the food itself. We expect all organisations to carry out the necessary assessments when deploying a new technology to mitigate any data protection risks and ensure their compliance with data protection laws.

“We’ve taken action against this school to show introducing measures such as FRT should not be taken lightly, particularly when it involves children.

“We don’t want this to deter other schools from embracing new technologies. But this must be done correctly with data protection at the forefront, championing trust, protecting children’s privacy and safeguarding their rights.”

- Lynn Currie, ICO Head of Privacy