
Don't press too hard: Use of behavioural biometrics brings privacy issues under GDPR

24 October 2018

The way a user interacts with their phone screen, keyboard or mouse contains identifying features that can be as accurate as fingerprint mapping, iris scans or facial recognition. By using screen sensors in devices and identification algorithms on websites, companies can track thousands of data points, analysing the angle at which people hold their devices, which fingers they use to scroll, the pressure applied, and the rhythm of keystroke and cursor movements, to build a unique digital profile assigned to a particular user, in a method known as behavioural biometrics. This profile is then measured against every new interaction, analysing thousands of elements to calculate a probability-based determination of whether the user is who they say they are when they log in to an online account, device or application.
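The enrolment-and-matching process described above can be sketched in miniature. The snippet below is a hypothetical, heavily simplified illustration, not a real vendor's algorithm: it builds a per-feature baseline (mean and standard deviation) from past sessions, then squashes the average deviation of a new session into a 0-to-1 similarity score. The feature names and thresholds are invented for illustration; production systems weigh thousands of signals with far more sophisticated statistical models.

```python
from statistics import mean, stdev

def build_profile(sessions):
    """Build a baseline from enrolment sessions.

    Each session is a dict mapping a behavioural feature (e.g. typing
    rhythm, touch pressure) to a measured value; the profile stores the
    mean and standard deviation of each feature across sessions.
    """
    features = sessions[0].keys()
    return {f: (mean(s[f] for s in sessions), stdev(s[f] for s in sessions))
            for f in features}

def match_score(profile, session):
    """Score a new session against the profile on a 0..1 scale.

    Computes the average per-feature deviation (in standard deviations)
    and squashes it, so small deviations give a score near 1.  This is a
    similarity heuristic, not a calibrated probability.
    """
    deviations = []
    for feature, (mu, sigma) in profile.items():
        sigma = sigma or 1e-9  # guard against zero variance
        deviations.append(abs(session[feature] - mu) / sigma)
    avg_dev = sum(deviations) / len(deviations)
    return 1.0 / (1.0 + avg_dev)

# Hypothetical enrolment data from a user's past logins.
enrolment = [
    {"keystroke_interval_ms": 120, "touch_pressure": 0.62, "hold_angle_deg": 38},
    {"keystroke_interval_ms": 115, "touch_pressure": 0.60, "hold_angle_deg": 40},
    {"keystroke_interval_ms": 125, "touch_pressure": 0.64, "hold_angle_deg": 37},
]
profile = build_profile(enrolment)

genuine = {"keystroke_interval_ms": 118, "touch_pressure": 0.61, "hold_angle_deg": 39}
imposter = {"keystroke_interval_ms": 260, "touch_pressure": 0.15, "hold_angle_deg": 75}

print(round(match_score(profile, genuine), 3))   # high: behaviour matches the profile
print(round(match_score(profile, imposter), 3))  # low: behaviour deviates sharply
```

Even this toy version makes the privacy point concrete: the `enrolment` list is exactly the library of biometric personal data the article describes, and it must be collected and retained before any matching can happen.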

With cyber-theft and data breaches becoming increasingly common, the need to rapidly and accurately identify fraud has driven the development of behavioural biometrics as a means to target automated attacks and suspicious transactions. However, for the technology to function, companies must amass libraries of biometric personal data to construct profiles based on how users touch, hold and tap their devices. Given the scale and quantity of personal data potentially being collected, as well as the multitude of potential purposes outside of fraud detection, concerns are emerging that behavioural biometrics may not be compatible with European Union privacy rules, in particular the conditions for processing under Article 9 of the General Data Protection Regulation (GDPR).

Many companies, from start-ups to large tech multinationals, have begun to build behavioural biometrics into their security software as the technology is increasingly recognised in the cybersecurity market as a powerful safeguard. Given its use in combating fraud, most companies are reluctant to reveal the details of their biometric processing beyond the mandatory information required in privacy notices.

When the GDPR came into force in May 2018 it introduced new rules around biometric data, recognising it as a 'special category of personal data' that requires both a special legal basis under Article 9 and an accompanying data protection impact assessment (which requires identifying the privacy risks and the measures that must be implemented to mitigate them).

Furthermore, the GDPR contains a very broad definition of biometric data and allows Member States to impose additional conditions and limitations on a national basis.

These factors, combined with the relative immaturity of the technology, mean that businesses contemplating deploying behavioural biometrics will need to ensure their processing is in line with the developing rules for the technology under the GDPR. Finally, given the scale of the personal data being collected, the potential for misuse and the ongoing question of whether the effectiveness of behavioural biometrics can be replicated by less invasive methods, companies can expect to be scrutinised by regulators across Europe, who have been tasked with examining whether the technology's benefits can outweigh the impact on the privacy of individual citizens.