Professional attackers don't target company firewalls; they target company employees. That is why the human factor needs to be a firm part of any cybersecurity concept, but generally it isn't. Changing that is a task for management, because it costs money and because, in most cases, the corporate culture needs to change accordingly.
For managers, what is at stake is not just their personal liability in the event of data breaches; the EU General Data Protection Regulation (GDPR) is looming on the horizon too. Data breaches also play a vital role in relation to corporate culture. It starts with the importance attached to the subject of cybersecurity: if the management tier regards cybersecurity as a brake on operations and a cost driver, then security loopholes are bound to arise. Effective protection measures are adopted only if decision-makers recognise the subject as a key pillar of commercial success and then communicate that across the company.
It does little to advance these objectives if management appoints an employee with insufficient qualifications in the area as Chief Information Security Officer (CISO) or Data Protection Officer. On the one hand, this fails to send a message of confidence to the staff; on the other, it weakens cybersecurity at the heart of the company.
A suitable candidate will be a good communicator with an understanding of business processes, and does not have to be a hard-core tech enthusiast. For many cybersecurity experts, it goes without saying that the CISO should not be part of the IT organisation and should not report to the CIO (Chief Information Officer). Such a reporting line often leaves the CISO feeling inhibited about speaking out when tracking down vulnerabilities in the IT infrastructure; after all, no one likes to be the one criticising their own team. The smarter solution is to have the person responsible for cybersecurity report directly to the head of the company.
For the CISO, and with them the entire organisation, to be successful, decision-makers need to acknowledge their own gaps in knowledge and understanding. Otherwise they run the risk of making wrong or ineffective decisions. If they set a good example, however, the threshold is lowered for staff members to admit their own lack of knowledge as well.
One of the most important aspects therefore lies in creating a positive error culture. Only if employees need not fear blame, warnings or even dismissal after inadvertently clicking a poisoned e-mail attachment or disclosing their log-in details on a phishing page will they report such incidents to in-house cybersecurity colleagues. And only if those colleagues find out about the click can they protect the user account in question, and with it the entire corporate network. Conversely, if a climate of fear instigated from the top prevails, the CISO and senior management learn about that disastrous click only after the digital crown jewels have long since been copied.
For employees to fulfil their role as multipliers, they have to be put in a position to do so: through training and awareness programmes. Both are measures that may, and indeed must, cost money. Which brings us full circle: if cybersecurity is not filed under "unpleasant duty" but is considered a basis for corporate success like, say, marketing or product development, then decision-makers will also find a budget for training.
The impact of corporate culture on cybersecurity was one of the subjects of the "Human Factor" theme area at Command Control 2018.