Cybercrime cost the world economy about $445 billion in 2014, and the 2015 numbers will be even higher. The cost of data breaches will reach $2.1 trillion globally by 2019. Worldwide spending on information security is estimated to reach $77 billion in 2015. In the midst of these astounding numbers, the role of the “human factor” has gotten lost. This is a frightening fact. Why? Because “they will click.” A breach is just one click away – a single person can and will overcome any technological safeguard. This is an unassailable reality, but one that gets mostly lip service from companies.
The human factor is the most important element of a company’s cyber defense profile. Every feature of a company’s cyber defense is driven by human thinking and behavior – from the baseline understanding of a company’s vulnerabilities to the design and implementation of defenses, interaction with cybersecurity vendors, and response to threats and breaches.
Major hacks have a common element – the “human factor.” Yet we rarely focus on the defining role human beings played at every step in the breach lifecycle. Viewing humans as a crucial element of cybersecurity conflicts with the common perception that cybersecurity is a “tech” problem that has a “tech” solution and the equally common and utterly false belief that “regular” people just can’t be taught this stuff.
Hackers are now so sophisticated that relying on even the best firewalls and anti-virus protection is not remotely enough. No technology will suffice if employees do not practice good cyber hygiene – and that means everyone, from line employees who use computers and mobile devices to the corporate leaders, including directors, who are responsible for information security policies. Any individual with access to a company’s system represents a threat and must be part of that company’s cyber defense strategy.
IBM’s “2014 Cyber Security Intelligence Index” reported that 95% of all security incidents involve human error. Many of these are attacks by external actors who prey on human weakness to lure insiders to unwittingly give them access to sensitive information. Insiders – current and former employees, in particular – have become the most-cited culprits of cybercrime.
Many attacks involve social engineering techniques, like “malvertising,” to lure targeted individuals into making mistakes. Ninety-five percent of advanced and targeted attacks involved spear-phishing scams – emails containing malicious attachments that download malware onto the user’s computer or device.
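The mechanics described above – a plausible-looking message carrying a dangerous attachment, sent from an address that impersonates someone the victim trusts – are simple enough that even crude automated checks can catch the obvious cases, which is precisely why the residual risk is a human one. As a purely illustrative sketch (the domain names, extension list, and rules here are assumptions for the example, not any vendor’s actual product):

```python
# Illustrative only: a naive heuristic of the kind a mail gateway might
# apply before a human ever sees the message. Real spear-phishing is
# crafted to slip past rules like these, leaving the final "click" to a
# person -- the point of the article.

RISKY_EXTENSIONS = {".exe", ".scr", ".js", ".vbs", ".docm", ".xlsm"}

def flag_email(sender_domain, display_name, attachments,
               org_domain="example.com"):
    """Return human-readable warnings for an inbound message."""
    warnings = []
    # Executable or macro-enabled attachments are a classic malware vector.
    for name in attachments:
        ext = "." + name.rsplit(".", 1)[-1].lower() if "." in name else ""
        if ext in RISKY_EXTENSIONS:
            warnings.append("risky attachment type: " + name)
    # A display name that mimics the organization while the mail actually
    # comes from elsewhere is a common spear-phishing tell.
    if org_domain.split(".")[0] in display_name.lower() \
            and sender_domain != org_domain:
        warnings.append("display name mimics " + org_domain +
                        " but sender is " + sender_domain)
    return warnings

print(flag_email("mailer.biz", "Example Corp IT Support",
                 ["invoice.pdf", "update.exe"]))
```

Rules like these raise the floor, but a well-researched lure with a clean-looking attachment sails past them – which is why the trained employee, not the filter, is the last line of defense.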
Failure to account for the human factor undermines cybersecurity technology. Richard Henderson, a security specialist for Fortinet’s FortiGuard Security Lab, said in a Bloomberg report that “It doesn’t matter how much money a company spends on infrastructure or technology, until you close the human gap in the equation, you are always vulnerable to attacks.” Tim Bolt, director of performance management at Lloyd’s of London notes that “Cyber attacks are often treated as a problem of technology, but they originate with human actors who employ imagination and surprise to defeat the security in place…The evidence of major attacks during 2014 suggests that attackers were often able to exploit vulnerabilities faster than defenders could remedy them.”
Organizations are not oblivious to the “human factor.” Cyber Edge Group’s “2015 Cyberthreat Defense Report (North America and Europe)” states that for the second year in a row “low security awareness among employees” was cited as the top inhibitor to an organization’s ability to defend itself against cyberthreats. With statistics like these, employee awareness and cyber hygiene should be at the forefront of corporate cybersecurity priorities. Not so.
Virtually all cyber defense systems and technologies have been focused on the concept of “fortification” — the creation of technological barriers to attack. But fortification is an archaic, “physical world” concept that has no place in the intangible and virtual world of cyberspace.
Technology has not reached the point where it can stop any and all “zero-day” exploits and other novel threats before they wreak havoc. The utility of humans in the overall cybersecurity effort has been undervalued and overlooked. If intruders are banking on human error, a logical next step would be to equip humans to practice better cyber hygiene. Doing so requires an effort to which most organizations have only given lip service.
Few organizations have undertaken effective programs to make employees part of the overall cybersecurity effort. Surveys show that many corporate directors are still in the dark about cybersecurity. Major breaches show that even high-level officers and corporate boards could profit from cybersecurity training, even if only to confirm that big spending on cybersecurity technology does not create effective cybersecurity. The recent Sony breach had its roots in a spear-phishing attack aimed at a corporate officer. The federal Office of Personnel Management cyber debacle shows what happens when an organization’s leadership does not grasp the multivariate nature of cybersecurity.
Old and flawed perceptions must give way to the reality that humans are an essential part of a company’s cybersecurity strategy. The objective of every organization should be the creation of a “human firewall.” A recent article on the warontherocks.com website declares that when it comes to cybersecurity, there is “no patch for incompetence.” Technology alone is a losing defense strategy. Truly competent cybersecurity must account for the “human factor.”