Did I Do That? On Security Errors, Fear, and Shame
Security errors are common. No, really common.
In 2015, human error ranked as the single largest factor behind security breaches. A 2021 report by Verizon found that 82% of data breaches involved human error, while a 2014 study by IBM revealed that over 95% of the security incidents they investigated had “human error” as a contributing factor.
Security errors are also costly. A 2018 study by the International Monetary Fund shows that cyber threats cost financial institutions between 10% and 30% of their net income — and most of those threats result from employee actions and mistakes.
Yet, despite being both common and costly, mistakes at work are widely underreported. A recent study from ISACA, for instance, showed that employees underreport security incidents even when reporting is required.
So, why don’t employees report security problems and mistakes when they happen? And what can companies do to create a better culture — from better transparency to more resilient systems — around those mistakes? We’ll investigate below.
The psychology of making mistakes
We all know the feeling of making a mistake — but we may underestimate just how much that feeling can affect our responses to mistakes at work. Unpleasant emotions like fear and shame can be major motivators, in some cases leading employees to conceal evidence of serious errors.
Indeed, the fear of failure can become so strong that it leads to a broad range of emotional and psychological problems, including shame, depression, anxiety, and low self-esteem. In extreme cases, it can actually be diagnosed as a medical condition known as atychiphobia — a testament to how truly unpleasant it can be to make a mistake.
As a 2021 study in Frontiers in Psychology notes, shame and fear of failure in the workplace can “inhibit employees’ motivation to correct errors,” “reduce their confidence in making subsequent improvements,” and cause them “to be evasive about failure events and avoid public attention, which prevents people from communicating about” and learning from failure.
Similarly, fear of mistakes can paralyze people — and can paradoxically make mistakes more likely. As the Harvard Business Review notes: “When we’re scared of making a mistake, our thinking can narrow around that particular scenario. Imagine you’re out walking at night. You’re worried about tripping, so you keep looking down at your feet. Next thing you know, you’ve walked into a lamp post.”
At least part of this reluctance may be generational. According to research by Tessian, younger employees were five times more likely to admit to cybersecurity errors, with 50% of 18-to-30-year-olds owning up to mistakes versus just 10% of workers over 51. Some of that may be due to experience — less-experienced workers are more likely to make mistakes in the first place — but the researchers suggested that younger workers are actually more aware that they have made a mistake and are more willing to admit their errors and “lose face,” opening up more potential for improvement.
Adopting a mistake-tolerant culture
So what can be done? How can organizations better learn from past mistakes and anticipate future ones?
Turns out, it’s all about the culture. Analyzing mistakes and failures requires openness, patience, and even tolerance for ambiguity — but companies often value decisiveness, efficiency, and action over thoughtfulness and reflection. Without the right company culture, employees won’t be willing to own up to mistakes, and supervisors won’t be able to help the organization learn from them.
Similarly, CISA notes in its 2020 Insider Threat Mitigation Guide that a good response “should not be focused on catching people doing things wrong. Rather, it should be grounded in the notions of helping people avoid mistakes, providing a safe environment, and preventing insider incidents from occurring while mitigating the potential risks.”
To encourage this kind of open, nonjudgmental work culture, one Harvard Business Review article suggests a top-down approach to learning from security mistakes. They note that leaders should avoid blaming and instead make people feel comfortable pointing out their own mistakes.
Meanwhile, SHRM suggests curiosity as a tool to avoid shame and learn from mistakes. Remaining curious and open to explanations during conversations about security errors can lead to significantly better communication, improved learning, and growth — as opposed to criticism, which “triggers a combination of defensiveness, dejection, resistance and antipathy.”
Let microsharding cover your mistakes
Company culture aside, you may be wondering what you can do to protect against human error in your security systems. We’ve got you covered.
ShardSecure’s technology helps companies minimize the fallout of human error and maintain business continuity. Its three-step microsharding process desensitizes sensitive data before it reaches the cloud, rendering it unintelligible and of no value to unauthorized users. This supports data privacy and confidentiality even when buckets are misconfigured or storage locations are accidentally left exposed.
Meanwhile, if a team member accidentally clicks a link in a phishing email and introduces ransomware to a system, our self-healing data can help. By automatically reconstructing, in real time, any microsharded data that fails our data integrity checks, we can keep operations running in the face of mistakes.
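The general idea of checking shards for integrity and healing corrupted ones can be illustrated with a toy sketch. Everything below — the shard size, the checksum scheme, and the use of simple backup copies rather than real erasure coding or distributed cloud storage — is a hypothetical simplification for illustration, not ShardSecure’s actual implementation.

```python
import hashlib

SHARD_SIZE = 4  # tiny shard size, chosen only for illustration

def shard(data: bytes):
    """Split data into fixed-size shards and record each shard's checksum."""
    shards = [data[i:i + SHARD_SIZE] for i in range(0, len(data), SHARD_SIZE)]
    checksums = [hashlib.sha256(s).hexdigest() for s in shards]
    return shards, checksums

def reassemble(shards, backups, checksums):
    """Verify each shard; if a checksum fails, 'heal' from the redundant copy."""
    healed = []
    for s, b, c in zip(shards, backups, checksums):
        if hashlib.sha256(s).hexdigest() != c:
            s = b  # integrity check failed: restore from the backup copy
        healed.append(s)
    return b"".join(healed)

data = b"confidential records"
shards, checksums = shard(data)
backups = [bytes(s) for s in shards]  # redundant copies in a second location
shards[1] = b"XXXX"                   # simulate ransomware corrupting one shard
print(reassemble(shards, backups, checksums) == data)  # prints True
```

In this sketch, the corrupted shard fails its checksum and is transparently replaced from a backup copy during reassembly — the same broad principle (detect failed integrity checks, then reconstruct automatically) that makes self-healing data resilient to accidental or malicious tampering.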
To learn more, take a look at our solution briefs, FAQs, and white papers. We’ve got a wealth of information on how ShardSecure helps your data remain confidential, available, and protected in the cloud — so your future mistakes can involve less fear and shame and more opportunities for growth.