Technology

Data Breaches: Human Error or a Series of Avoidable Mistakes?

A recent surge in data breaches, including the one involving the Police Service of Northern Ireland (PSNI), raises a question: who is at fault? It’s easy to blame human error, but is that fair? Mistakes happen, but they’re only one part of the equation. Poor training, flawed systems, and weak procedures often share the blame. In the PSNI case, any process that lets a response to a Freedom of Information (FOI) request go out with unredacted, safety-critical data is broken. Systems must be designed to prevent failure at every step.

The Information Commissioner’s Office says 80% of data breaches are due to human error. Common mistakes include:

  • Sending an email containing personal data to the wrong recipients.
  • Forwarding attachments containing personal data in error.
  • Failing to notice hidden tabs or rows in a spreadsheet containing personal data (one cause cited in the PSNI case).
  • Posting sensitive mail to the wrong address. (Yes, a classic data breach!)
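
The hidden-tab risk can be checked in software. As a rough sketch, an .xlsx file is a zip archive whose xl/workbook.xml part records each sheet’s visibility, so hidden sheets can be flagged before a spreadsheet is released. The function names here are my own, not any standard tool’s:

```python
import zipfile
import xml.etree.ElementTree as ET

# XML namespace used in the workbook part of an .xlsx file.
NS = "{http://schemas.openxmlformats.org/spreadsheetml/2006/main}"

def hidden_sheet_names(workbook_xml: bytes) -> list[str]:
    """Return names of sheets marked 'hidden' or 'veryHidden'
    in the xl/workbook.xml part of a spreadsheet."""
    root = ET.fromstring(workbook_xml)
    return [
        sheet.get("name")
        for sheet in root.iter(f"{NS}sheet")
        if sheet.get("state") in ("hidden", "veryHidden")
    ]

def check_xlsx(path: str) -> list[str]:
    """Open the spreadsheet as a zip archive and flag hidden
    sheets before the file is released externally."""
    with zipfile.ZipFile(path) as zf:
        return hidden_sheet_names(zf.read("xl/workbook.xml"))
```

A release procedure could refuse to send any file for which this list is non-empty until a person has reviewed the hidden sheets.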

Yet when I hear of a breach attributed to human error, I don’t think, “How did someone accidentally do that?” Instead, I wonder…

  • Why didn’t anyone spot the risk of having ALL those records in an unprotected spreadsheet?
  • Why wasn’t there a way to prevent forgetting to blind-copy email recipients?
  • Are responses to Data Subject Access Requests or FOI requests being properly reviewed? What level of supervision and quality assurance exists in the organisation?
  • Why is it still considered acceptable for confidential documents to leave the office?
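
On the blind-copy question, a pre-send guard is not hard to sketch. The example below blocks a message that exposes too many recipients in To or Cc; the threshold and function names are illustrative assumptions, not any particular mail system’s API:

```python
from email.message import EmailMessage
from email.utils import getaddresses

# Assumed policy threshold: beyond this, recipients should be in Bcc.
MAX_VISIBLE_RECIPIENTS = 5

def visible_recipients(msg: EmailMessage) -> list[str]:
    """All addresses every recipient of the message will be able to see."""
    pairs = getaddresses(msg.get_all("To", []) + msg.get_all("Cc", []))
    return [addr for _, addr in pairs if addr]

def may_send(msg: EmailMessage) -> bool:
    """Allow sending only if recipient addresses are not over-exposed."""
    return len(visible_recipients(msg)) <= MAX_VISIBLE_RECIPIENTS
```

Hooked into the send pipeline, a check like this turns “forgot to Bcc” from a breach into a blocked draft.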

I could go on.

Technical and Organisational Measures (TOMs)

Should we instead blame a lack of proper measures to protect personal data, rather than human error? Such measures aren’t optional extras; they are a fundamental data protection requirement.

We all know robust procedures and security measures can mitigate the risk of human error. A simple example: I know of employees who get an alert if they try to send an attachment containing personal data without password protection.
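
A check like that alert can be sketched in a few lines. The patterns and the password-protection flag below are illustrative assumptions; real data-loss-prevention tools are far more sophisticated:

```python
import re

# Illustrative patterns only; a real tool would use many more,
# plus document classification and context.
PERSONAL_DATA_PATTERNS = {
    "email address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "UK National Insurance number": re.compile(r"\b[A-Z]{2}\d{6}[A-D]\b"),
}

def personal_data_found(text: str) -> list[str]:
    """Labels of personal-data patterns detected in the text."""
    return [label for label, pat in PERSONAL_DATA_PATTERNS.items()
            if pat.search(text)]

def should_alert(attachment_text: str, is_password_protected: bool) -> bool:
    """Alert when an attachment appears to contain personal data
    but is not password-protected."""
    return bool(personal_data_found(attachment_text)) and not is_password_protected
```

The point is not the regexes; it is that the system, not the employee’s memory, carries the safeguard.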

Effective data protection training is essential and shouldn’t be a ‘check-the-box’ task. Completing an annual online module, with no further action required, isn’t enough. Training must be relevant and effective, and it must deliver the key lessons. Awareness campaigns are vital: they reinforce training and keep key messages front of mind. Using mistakes (big or small) as case studies is a good way to keep people alert to the risks, and post-incident investigations provide valuable lessons to prevent future mistakes.

Instead of a liability, people can be our greatest asset. If we give them enough knowledge, they can help prevent data breaches.

Lessons from High-Profile Data Breaches: Heathrow Airport and the Cabinet Office

In one case, Heathrow Airport lost a memory stick containing unencrypted personal data. The ICO found the airport had failed to secure its data and train its staff. On the surface it looked like human error, yet the employee didn’t even know the stick needed to be encrypted.

In another case, the Cabinet Office published people’s home addresses in the New Year’s Honours list. The mistake happened because a rushed IT system was set up poorly. Flawed procedures and tight deadlines led to shortcuts. Both cases highlight systemic failures, not individual errors.

The lesson here? Data breaches aren’t always solely the fault of the person pressing the ‘send’ button. Too often, the systems and procedures around them have already failed. Data protection is a mindset, a culture, not an add-on. The PSNI case serves as a tragic reminder of the severe consequences of data breaches.

Breaches can have a profound impact on employees, customers, crime victims, and patients alike. Exposure of their information often causes great anxiety and distress.

Data protection laws don’t provide a detailed roadmap for businesses to follow. To identify data protection risks, we must know what personal data we hold and how it is used. We then need to assess and manage those risks, and any protective measures implemented must be proportionate to the level of risk involved.
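
As a toy illustration of matching measures to risk, one could classify each data asset and map its risk level to a set of controls. The categories, thresholds, and controls below are assumptions for illustration, not legal guidance:

```python
from dataclasses import dataclass

# Illustrative control sets per risk level (assumed, not prescriptive).
CONTROLS = {
    "low": ["access logging"],
    "medium": ["access logging", "encryption at rest"],
    "high": ["access logging", "encryption at rest",
             "redaction review", "restricted access"],
}

@dataclass
class DataAsset:
    name: str
    contains_special_category_data: bool
    record_count: int

def risk_level(asset: DataAsset) -> str:
    """Crude classification: special-category data is always high risk;
    otherwise risk scales with the number of records held."""
    if asset.contains_special_category_data:
        return "high"
    return "medium" if asset.record_count > 1000 else "low"

def required_controls(asset: DataAsset) -> list[str]:
    return CONTROLS[risk_level(asset)]
```

Even a crude inventory like this forces the question the PSNI case raises: which assets would be catastrophic to expose, and are the controls around them proportionate?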