“Underlying confidential information” and its role in an embarrassing data breach for the City’s watchdog
The City’s watchdog, the Financial Conduct Authority (FCA), has admitted to accidentally publishing the personal data of some 1,600 individuals on its website. The data were contained in its online response, published in November last year, to a freedom of information (FOI) request.
The individuals affected were all complainants to the regulator. More than half had only their names published, while the remainder had details such as their address and/or phone number exposed.
The FCA has stated that no financial, payment card, passport or other identity information was included in its inadvertent data release. However, the personal data disclosed is nonetheless covered by the General Data Protection Regulation (GDPR), notably the ‘integrity and confidentiality’ principle in Article 5(1)(f). This requires organisations to use appropriate technical and/or organisational measures to protect against the accidental loss or disclosure of personal data.
While the FCA has not published details of the failure that led to the breach, the regulator states that the issue concerned “underlying confidential information”. At its simplest, this suggests the FCA published a document in response to the FOI request that visibly contained the personal data, and that this simply went unnoticed by the report’s creator prior to publication.
However, the personal data could also have been buried more deeply, for example in “hidden” worksheets used to calculate any statistics the FCA included in its FOI response. The copying and pasting of automatically calculated data from applications such as Microsoft Excel into other reports is commonplace in business. However, while the data underlying the calculations may not be visible, that doesn’t necessarily mean it isn’t still there and accessible to those with enquiring minds.
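One way to catch this class of leak is to inspect the file itself before publication. An .xlsx workbook is simply a ZIP archive, and each sheet’s visibility flag is recorded in xl/workbook.xml, so hidden worksheets can be detected with nothing more than the Python standard library. The sketch below is purely illustrative; the file name and the minimal demo workbook are assumptions for the example, not a real FOI document.

```python
import zipfile
import xml.etree.ElementTree as ET

# XML namespace used by xl/workbook.xml inside an .xlsx archive.
NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def find_hidden_sheets(xlsx_path):
    """Return the names of worksheets whose state is 'hidden' or 'veryHidden'.

    An .xlsx file is a ZIP archive; the sheet list and its visibility
    flags live in xl/workbook.xml, so no spreadsheet library is needed.
    """
    with zipfile.ZipFile(xlsx_path) as zf:
        root = ET.fromstring(zf.read("xl/workbook.xml"))
    return [
        sheet.get("name")
        for sheet in root.iter(f"{NS}sheet")
        if sheet.get("state", "visible") != "visible"
    ]

# Demo: build a minimal, illustrative workbook with one hidden sheet.
demo_xml = (
    '<workbook xmlns="http://schemas.openxmlformats.org/spreadsheetml/2006/main">'
    '<sheets>'
    '<sheet name="Summary" sheetId="1"/>'
    '<sheet name="RawComplainantData" sheetId="2" state="hidden"/>'
    '</sheets></workbook>'
)
with zipfile.ZipFile("foi_response.xlsx", "w") as zf:
    zf.writestr("xl/workbook.xml", demo_xml)

print(find_hidden_sheets("foi_response.xlsx"))  # ['RawComplainantData']
```

A check like this, run as a routine pre-publication step, would surface a hidden worksheet even when the visible report looks clean.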
Where information is visible, manual review procedures could assist in identifying potential data breaches. However, such procedures need to be properly actionable and auditable, and informed by a real understanding of what ‘personal data’ is. The GDPR defines it as ‘any information relating to an identified or identifiable natural person’. This wide-reaching definition means that seemingly innocuous data, such as an identification number that relates to a customer, can come within its ambit and therefore be subject to regulation. Reviewers therefore need to know what to look for.
Even when armed with such knowledge, an understanding of the overlaps between freedom of information and data protection laws is also needed for the procedures to be effective. Otherwise exemptions that allow personal data to be excluded or redacted from FOI responses may not be appropriately employed.
To address the issue of “hidden” personal data, technological checks may also be necessary. Automated tools can check documents for hidden data and remove it, prior to publication or dissemination.
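For visible text, even a crude automated pass can act as a final safety net before publication. As a minimal sketch, the following flags strings that look like email addresses or UK phone numbers; the patterns are deliberately simplistic assumptions for illustration, and real redaction tooling relies on far more robust detection of names, addresses and identification numbers.

```python
import re

# Illustrative patterns only: real tools cover many more formats
# of personal data than these two.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_phone": re.compile(r"\b(?:0|\+44)\d{9,10}\b"),
}

def flag_personal_data(text):
    """Return (kind, match) pairs for strings that look like personal data."""
    return [
        (kind, m.group())
        for kind, pattern in PATTERNS.items()
        for m in pattern.finditer(text)
    ]

# Hypothetical sample line from a document awaiting publication.
sample = "Complainant: jane.doe@example.com, tel. 07700900123"
for kind, value in flag_personal_data(sample):
    print(f"{kind}: {value}")
```

Any hits would then be routed to a human reviewer, who decides whether an FOI exemption permits the material to be redacted or withheld.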
The FCA’s breach this week is undoubtedly embarrassing for the regulator, not least because of its role in policing data breaches. It imposed a £16.4m fine on Tesco Bank in 2018 for its failure to protect customer information, for example. However, the FCA’s breach also underscores the need for companies to be more vigilant in their efforts to ensure the ‘integrity and confidentiality of stored or transmitted personal data’ under the GDPR, and to develop a greater awareness of their obligations and responsibilities as data controllers and processors.