In this chapter, we will explore cybersecurity awareness-raising in more detail, particularly focusing on techniques to raise awareness in a positive and effective way. This chapter will therefore introduce you to some ideas to implement in your organisation. We will also address the issue of fear, uncertainty and doubt (FUD). Unfortunately, the cybersecurity industry has relied on FUD to try to communicate cybersecurity messages for too long, often ineffectively. The way many use FUD can even backfire, leading to greater resistance to awareness messages rather than greater engagement. There are many other, more positive ways we can talk about cybersecurity. However, we must also recognise that raising awareness of cybersecurity almost inevitably means discussing the threat, which can mean that the people we are communicating with feel fear. With this in mind, this chapter explores how we can talk about something scary in the most constructive, positive way.
Before we move on, however, we should tackle a fundamental question underpinning all of this: Is awareness-raising, in and of itself, always a good thing? Over the last few years, I have watched the cybersecurity industry change. In 2011, when I was beginning my career, I would often have to explain to people within the industry what I actually meant by ‘the human side’ of cybersecurity, and what it was that I actually did. That is no longer the case, as the human side of cybersecurity has massively grown in prominence.
With the growth in prominence of the human side of cybersecurity, there has been an associated sweeping narrative that ‘people are the weakest link’. This is an unhelpful and, frankly, lazy narrative that some of us within the industry have fought hard to challenge, attempting to demonstrate that people can be the strongest link if given the correct support, awareness-raising and tools. The UK’s National Cyber Security Centre (NCSC) has joined this call, with an ethos from its inception that promotes a positive and people-centric approach to security.46
With the rise in acceptance of people as an important factor in cybersecurity, there has been a prevalent assumption that, to solve the ‘people problem’, an organisation should carry out awareness-raising activities. However, awareness-raising comes in many forms and not all of those forms are equal. When you raise awareness of an issue, you may elicit unintended consequences. It is important to understand what some of those unintended consequences may be, how to limit their chances of emerging and how to deal with them effectively if they arise. We will explore this issue further as this chapter progresses.
Awareness-raising that is engaging, effective and empowering – and is conducted as part of a wider strategy focused on behaviour and culture – is immensely valuable to an organisation. So, how can we raise awareness of cybersecurity in a way that is engaging, effective and empowering?
People are more likely to engage in recommended behaviour if they understand why. I have seen this first-hand during my career delivering cybersecurity awareness-raising sessions in organisations of all sectors around the world. Cybersecurity can seem intangible: without deep technical knowledge and understanding, people often simply do not understand, for example, the importance of having unique, long and strong passwords or how clicking a malicious link in an email can allow a cybercriminal to access their machine, including all of the data on it, their microphone and webcam.
In the words of Simon Sinek, we need to start an awareness programme with ‘why?’ (Sinek, 2011). Telling people to have good passwords, to be careful of what they click on and to set up two-factor authentication is a list of commands. Why should they simply follow what you tell them, especially if it gets in the way of their primary purpose online, whether that’s getting on with their work or catching up on social media?
Telling people why something is important is one thing, but showing them takes your awareness-raising to a whole other level. You can tell people to have unique and strong passwords for all of their online accounts, but that just sounds like an unnecessary burden unless you help them understand why it is so important. When people see a password-cracking demonstration, they understand that millions of dictionary words can be cracked in less than a second with pre-built tools that are freely available for anyone to use. In this way, individuals experience a cyberattack in a safe environment, optimising their learning without the pain of a real incident. As Meier (2000, p. 28) said:
If you seek information, read words
If you seek understanding, have experiences.
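To give a feel for what sits behind a password-cracking demonstration, the short sketch below shows the basic idea: hash every word in a dictionary and compare it with a leaked password hash. This is a simplified illustration rather than the tooling used in a real demonstration; the wordlist path is a placeholder, and the freely available tools attackers actually use are far faster and more capable than this.

```python
# Illustrative sketch only: hash each word in a dictionary file and compare
# it with a leaked password hash. Even in plain Python, an ordinary laptop
# can work through millions of words in seconds.
# 'dictionary.txt' is a placeholder path for any common wordlist.
import hashlib

def crack_unsalted_md5(target_hash, wordlist_path="dictionary.txt"):
    with open(wordlist_path, encoding="utf-8", errors="ignore") as wordlist:
        for candidate in wordlist:
            candidate = candidate.strip()
            if hashlib.md5(candidate.encode()).hexdigest() == target_hash:
                return candidate  # the password was a dictionary word
    return None  # the password was not in the wordlist

# Example: if 'sunshine' appears in the wordlist, it is recovered almost instantly
leaked_hash = hashlib.md5(b"sunshine").hexdigest()
print(crack_unsalted_md5(leaked_hash))
```

Even this naive version makes the point: a password that appears in a wordlist offers almost no protection, and watching it fall in a safe environment is far more persuasive than being told so.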
Cybersecurity is a complicated subject. Richard Feynman, the Nobel prize-winning physicist, is renowned for being able to make the most complicated of subjects more accessible. We can adapt what has become known as the Feynman technique for our cybersecurity awareness-raising initiatives. The Feynman technique47 can be broken down as follows:
Step 1: In a notebook or on a blank piece of paper, write the name of the concept or subject you are studying.
Step 2: Imagine you are teaching the subject to a child. Underneath the title, write an explanation of the concept or subject, using only plain language (avoiding jargon where possible, and defining it simply where it cannot be avoided). This will highlight what you know and, most importantly, what you don’t fully understand.
Step 3: Focus on the areas that you have identified you do not understand. Go back to the source material, reread and relearn. Repeat Step 2 until there are no gaps in your knowledge.
Step 4: Look over your notes and simplify any complicated language. If you find you have simply lifted wording from the source material anywhere, reword it in your own terms. Use straightforward analogies and stories where they are helpful in bringing the subject to life.
I find the Feynman technique to be the best way of learning about new concepts myself. As someone who focuses on the human side of cybersecurity, I pride myself on having strong technical understanding of the subject and on being able to translate technical messages in the awareness-raising activities and communications I deliver to clients. As cybersecurity is a constantly evolving field, I therefore have to learn about new technologies, vulnerabilities and attacks on an extremely regular basis, and I have to fully understand them because I may be asked to explain any aspect of them during awareness-raising for a client or indeed in a media appearance.
I have adapted the Feynman technique for awareness-raising workshops with clients. Here’s how I use it for learning in a group:
Step 1: Give everyone in the room a new notebook or some blank paper. Split the room into two and assign a different cybersecurity topic to each half. As I explain in Chapter 2, the topics you cover in awareness-raising should be defined by the behaviours that you want to influence. Choose two of the topics that you have identified as things you want to raise awareness of within the organisation or group; for example, I will often assign CEO fraud to one group and ransomware to the other, or it could be password managers to one group and two-factor authentication to the other.
Step 2: Allow the individuals in each group a set amount of time (this will vary depending on how long you have with the group, but 20–30 minutes is good) to learn about their subject, using internet access or resources you have provided. Advise them to write notes using only plain language (avoiding jargon where possible and defining it simply where it cannot be avoided). Tell them to pretend they are teaching it to a child. Go around the room and check how people are doing, giving them support and assistance, without doing it for them.
Step 3: Have people pair up with someone from the other group and take turns explaining their subject to one another. Again, they should use only simple language, defining any jargon and using stories, examples and analogies. In explaining their subject to their partner and trying to answer any questions their partner may have, individuals will identify what they understood and where they had gaps in knowledge or questions; ask them to note these down.
Step 4: Ideally, Steps 2 and 3 should be repeated until there are no gaps in anybody’s knowledge, but in a workshop scenario that often is not practical, so Step 4 will probably need to be modified depending on the group and the time you have with them. If time allows, repeat Step 2 to enable people to plug the identified holes in their knowledge. If there is not time for that, work out what is best for the situation; for example, you can have people read out their questions and answer them yourself, or collate all of the questions from the group and circulate answers later.
The most important thing to remember about the Feynman technique is that people don’t remember what they are taught as well as what they have learned and explained to somebody else. So, the subject that they revised and communicated to their workshop partner will be the subject that they understand in the most depth. Of course, at the same time, you are raising awareness of both of the subjects that you focus on in a meaningful and interactive way. There are lots of ways you can follow up on the workshop, too. You could capture key points from the individuals as they are sharing their learning and put these together in a handout that you send around the group after the session, to prompt their memory of the learning and reinforce key messages. You may find that a few people in the workshop are especially keen to build on their learning, so they may even want to help develop the handout or you may want to engage these individuals as security champions (as discussed in Chapter 2).
Part of the beauty of the modified Feynman technique that I outline above is that it is interactive. You are not standing in front of people as an expert and expecting them to listen to you communicate about cybersecurity. It is not ‘death by PowerPoint’; rather, it is empowering and engaging. This is always the aim for cybersecurity awareness-raising: to be as empowering and engaging as possible. You want as many people as possible to be truly engaged in learning about the subject, feeling confident with a subject that is too often presented in an intimidating and elitist manner.
The modified Feynman technique also sits nicely with another great approach to awareness-raising, and that is Accelerated Learning.
I first heard about Accelerated Learning at the SANS European Security Awareness Summit in London in 2016. I go to hundreds of conferences and the SANS summit is one of my absolute highlights of the year, because it is focused on my passion (the human side of cybersecurity), and I’m in a room full of people who share that passion.
In 2016 Martine van de Merwe gave a talk showing methods to appeal to different kinds of learners, to keep participation high and to improve training results.48 Although the term ‘accelerated learning’ was new to me, I discovered that I had in fact been using the approach in awareness-raising for a long time, without knowing the technical term for it.
Accelerated Learning is rooted in what we know about how the brain works, about getting and keeping people’s attention, about motivating people and about the ways in which different people learn. An important part of Accelerated Learning is creating an atmosphere in which people feel confident and comfortable. The language you use to speak to those participating in training is really important, as we know from decades of psychological research. When you expect people to perform poorly and to be the ‘weakest link in cybersecurity’, they will be less likely to engage in the positive behaviours you are asking of them (in psychology, this is known as the Golem effect; Babad et al., 1982). When you empower and enable people, when you develop their confidence and believe in them, they will be more likely to engage in the positive cybersecurity behaviours that you are seeking to engender (known as the Pygmalion effect; Babad et al., 1982).
Accelerated Learning also promotes the use of different methods to engage people in training, because different people learn in different ways. Meier (2000) describes four such methods: somatic learning (moving and doing), auditory learning (talking and hearing), visual learning (observing and picturing) and intellectual learning (problem-solving and reflecting).
It can be argued that learning is optimised when we combine all four methods because it is not simply the case that different people learn in different ways, but rather that we can all benefit from multiple styles of learning (Meier, 2000, p. 42). If you can integrate the different learning styles in your awareness campaigns, you will engage more people, more deeply: ‘Effective awareness requires consistent communication through several channels’ (Beyer et al., 2015, p. 6).
Many clients have approached me in the past and asked me to design or deliver awareness-raising in their organisation with the goal of ‘scaring people into behaving’.
I can understand the frustration behind this request. To an individual with a great deal of cybersecurity knowledge (and, perhaps, little knowledge of fields such as psychology, sociology, neuroscience or behavioural economics), it is frustrating and bewildering to see people be careless with passwords, hold the door open for anyone who wants to walk into their office, leave their workstations unlocked while they go out for lunch or click on links just because an email prompts them to do so. From the perspective of the average chief information security officer or security manager, such behaviour must surely be because those people do not understand the threat. If they understood the threat was real, then surely they would comply with all of the security rules?
When you communicate information to people with the intent of scaring them into a certain behaviour, you are invoking what is known as a ‘fear appeal’. We see fear appeals all of the time, from television adverts warning of the dangers of drink-driving to graphic images of smoking-related diseases on cigarette packets. Fear appeals have been used in society in various ways for at least six decades, and we can look to psychological research to understand why some work and some do not, and therefore how best to talk about a scary subject if you want to prompt behavioural change.
The Extended Parallel Process Model is an extremely helpful framework, developed by Kim Witte, a professor of speech communication, that outlines how people react when confronted with a fear appeal (Witte, 1992). People process fear appeals in a certain way, without really being aware of it. First, they will assess whether they believe that the threat is real. I believe awareness-raising has succeeded in convincing most people that, yes, the cybersecurity threat is real. So, next they will assess whether they, as individuals, are susceptible to it. If we do manage to convince people that they are susceptible to the threat, they will then go on to assess the responses we are recommending: do they understand the responses? Do they believe that the responses will be effective in mitigating the threat? Only if they answer yes to those questions will they then consider whether they are personally capable of engaging in the responses.
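To make that appraisal sequence concrete, here is a small sketch that models it in code. This is my own simplified illustration of the model described above, not Witte’s formal specification, and the attribute names are invented purely for readability.

```python
# A simplified, illustrative model of the appraisal sequence described by
# the Extended Parallel Process Model. The names are invented for
# readability; they are not part of Witte's formal model.
from dataclasses import dataclass

@dataclass
class Appraisal:
    threat_is_real: bool     # do I believe the threat exists?
    i_am_susceptible: bool   # could it happen to me personally?
    response_works: bool     # will the recommended action mitigate it?
    i_am_capable: bool       # am I able to carry out that action?

def likely_reaction(a: Appraisal) -> str:
    if not (a.threat_is_real and a.i_am_susceptible):
        return "no response: the threat is dismissed as irrelevant"
    if a.response_works and a.i_am_capable:
        return "danger control: the recommended behaviour is adopted"
    # Fear has been aroused but no credible, achievable way out was offered
    return "fear control: avoidance, denial or defensive resistance"

# A strong fear appeal paired with a weak efficacy message backfires:
print(likely_reaction(Appraisal(True, True, True, False)))
```

The final line captures the situation we most often create by accident: the threat is made vivid, but the person is left without a response they believe they can carry out, so the fear itself becomes the thing they manage.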
The Extended Parallel Process Model is based on Protection Motivation Theory (Rogers, 1975) and the work of Howard Leventhal, a psychology professor (Leventhal et al., 1965). In 1965, Leventhal conducted a very influential study looking at the relationship between fear arousal and action. In the study, participants were provided with a high or low fear-arousing message about tetanus and were advised to have a vaccination. Half of the participants were also provided with a map of the local area, highlighting the location of the hospital, and were prompted to plan their day in order to visit the hospital and have the injection. Results showed that those provided with the high-fear messaging had more positive intentions to have the vaccination (compared with those who received the low-fear messaging), but that these intentions did not translate into action (they intended to have the injection, but they did not follow through with actually having it). However, the study also found that those who were provided with the map and encouraged to plan when they would have the injection were far more likely to follow through with the behaviour: 30 per cent of those who received the action instructions got the vaccination, whereas in the group that did not receive the action instructions, only 3 per cent did (Leventhal et al., 1965).
In terms of cybersecurity, fear appeals often take the form of pictures of ‘hackers’ in hoodies and balaclavas in front of a green screen of 1s and 0s, accompanied by statistics of how many companies and individuals get hacked every year and the associated cost. People respond to such fear appeals with doubts such as ‘that would never happen to me’, ‘why would a cybercriminal be interested in someone like me?’ and ‘there’s nothing I could do to stop them anyway’.
If at any point in this awareness-raising process we fail to convince our audience that cybercriminals exist, that they personally are susceptible to a data loss or compromise, that the mitigations we recommend do work and that they are able to engage in those mitigations, then we are likely to provoke a defensive response from them: they will revert to acceptance or avoidance. This can take the shape of ‘the problem is so big and I am not capable of all of those defences, so why bother? If it’s that bad then it’s inevitable’, or ‘this threat is overblown, IT are just trying to get us to do what they want, why bother?’, or ‘I’m too busy for that, I can’t remember all of those passwords, I’ll worry about it another day.’
These defensive reactions of reluctance, avoidance and denial are completely normal and are to be expected when we clumsily wield fear as a tool for awareness-raising, with no awareness ourselves of the psychological implications of how we communicate such scary messages.
Fear appeals should be used cautiously, since they may backfire if audiences do not believe they are able to effectively avert a threat.
(Witte and Allen, 2000)
When people are scared about something but do not understand how they are susceptible to the threat or, most importantly, how they can better protect themselves, they will engage not with the true danger (cybercrime) but rather with the emotional response: the fear. This is why they avoid the reality of the situation or choose to believe that you are exaggerating the threat. So, we need to be very careful when we use fear appeals. As an industry, we must move away from simply trying to scare people into behaving more securely, and we must communicate responsibly when talking about cyber threats to ensure that people engage with the actual danger rather than just the emotional response aroused by hearing about it.
Psychological research suggests that the weaker the efficacy message (the communication that the individual can engage in behaviours to minimise the threat), the greater the fear control response (the defensive mechanisms of avoidance or denial):
If fear appeals are disseminated without efficacy messages, or with a one-line recommendation, they run the risk of backfiring, since they may produce defensive responses in people with low-efficacy perceptions.
(Witte and Allen, 2000)
When discussing the cybersecurity threat, you are inevitably discussing something scary. In fact, it can be argued that just saying the word ‘cyberspace’ will cause fear in some listeners as it is a vague and unknown word (Bada et al., 2015). I believe that tackling fear, uncertainty and doubt is central to awareness-raising, which is why I delivered a keynote on the topic at the RSA conference in San Francisco in 2020. So, how can we talk about this scary subject in a way that prompts positive, rather than negative, behavioural change?
Let’s refer back to the Extended Parallel Process Model and consider how we should focus our awareness-raising messages if we want to be effective.
Step 1: Show people that the threat is real and that they are susceptible to it, for example by using references to real-world attacks and anecdotes that relate to their industry sector and job role as closely as possible. For example, rather than using general statistics on cybercrime, use statistics from your own company and explain that many cyberattacks are not targeted but you can become a victim nevertheless (the NHS WannaCry case study I discussed in Chapter 2 is an example of this). Showing hacking demonstrations is extremely effective here, as it opens people’s eyes to the reality of how cybercrime is actually carried out. For example, with a password cracking demonstration, you can highlight that attackers do not target individuals and try to manually guess their password, but rather use a script and tools to compromise passwords from breached lists.
When people understand that the threat is real and that they are susceptible to it, they will be scared, and this must be handled responsibly. Do not leave them in this state. Instead, it is very important that you implement Step 2.
Step 2: This is where we focus on self-efficacy. Let’s say you are running an awareness-raising session on passwords and you have delivered a password-cracking demonstration. Now you need to explain to people what they can do to mitigate the risks, and ensure that they have the tools and techniques available to them and the understanding and confidence to use those tools and techniques. It is not good enough to simply recommend that people have a unique and complicated password for each of their accounts (which possibly still expire on a regular basis, despite the latest NCSC and NIST guidance50) and to insist that they must not write passwords down but simply remember them. This is not realistic. You can train people in using passphrases, but expecting a unique passphrase for every account still places a heavy cognitive load on them.
So, let’s unpick Step 2 a bit: what are you going to ask them to do? The security burden of passwords cannot continue to fall so heavily on people; we need to provide people with the tools to manage unique, complicated passwords or they will continue to reuse weak ones. If there is a corporate password manager, show it to them and give a demonstration of it, too. After giving the password-cracking demonstration, run password management workshops where you work with people step by step to get them set up and running with the password manager. Provide well-designed, visually appealing handouts that explain, in concise and simple terms, how to set up and use a password manager, ideally with contact details included so people can get in touch if they need more support.
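If it helps to make the benefit tangible in a workshop, the sketch below illustrates the kind of work a password manager does on the user’s behalf: generating a long, random, unique password for every account. It is a simplified illustration only, not how any particular product is implemented, and the account names are examples.

```python
# Illustrative sketch of what a password manager does for its user:
# generate a long, random, unique password per account, so that nothing
# has to be memorised or reused. This is not any specific product's code.
import secrets
import string

def generate_password(length: int = 20) -> str:
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per account; the account names are examples only
for account in ("email", "banking", "online shopping"):
    print(account, generate_password())
```

The point to land with your audience is not the code but the burden it removes: the tool generates and remembers, so people do not have to.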
If you want positive behavioural change to result from your awareness-raising, then it is imperative that people understand the action and why it will better protect them, and that they have tools in place that they feel confident using. As Witte and Allen commented:
... strong fear appeals and high-efficacy messages produce the greatest behavior change, whereas strong fear appeals with low-efficacy messages produce the greatest levels of defensive responses.
(Witte and Allen, 2000)
At Cygenta, we regularly run hacking demonstrations as part of the awareness-raising programmes that we deliver. When people see a live demonstration of a spear-phishing attack, showing the criminal and the victim sides and just what the criminal is able to do from the click of a malicious link, this is a strong fear appeal and it must be handled carefully. It is vital, at this point, to communicate the relevance of this to the people in the room and to provide a strong self-efficacy message.
As another example, it is not enough to say to the participants ‘be wary of the links you click on in emails’. This advice does not empower or enable the participant in any way. They will leave either terrified of clicking on any links or believing that the situation is so hopeless, or so exaggerated, that they might as well just carry on regardless. It is so much more empowering and enabling to show them what to do if they receive an email that they suspect may be phishing. Hopefully, your organisation has a ‘report a phish’ button in emails or an email address to which people can forward suspected phishing messages. If so, this is what you want to show people after you have raised their awareness about phishing, because this is the crucial high-efficacy message that you need to produce the greatest behavioural change. This fits perfectly with the NIST definition of awareness that we have referred to in Chapters 1 and 2: focusing individuals’ attention on IT security concerns so that they can respond accordingly.
This approach is backed up by Ruiter’s analysis of six decades of research into fear appeals:
… the elements of fear appeals most likely to motivate risk reduction behaviors are: (a) strengthening self-efficacy (i.e., suggesting that the person can successfully perform the recommended protective actions); (b) promotion of response efficacy (i.e. suggesting that the recommended action will avoid the danger); (c) awareness of susceptibility (i.e., suggesting that the threat is personally relevant); and not, (d) messages suggesting in an emotional way that the threat is severe.
(Ruiter, 2014)
Self-efficacy is an important concept in cybersecurity. It is a person’s belief in their ability to succeed in a specific situation or accomplish a specific task. Raising self-efficacy in awareness-raising activities is important because it fights against another common pitfall of awareness-raising: security fatigue.
It is both a blessing and a curse that cybersecurity now has such a high profile. The sheer number of cyberattacks, data breaches and vulnerabilities makes ours a challenging industry to work in. The fact that so many of these cyberattacks, data breaches and vulnerabilities are now public can be a good driver for engaging board members and colleagues in the subject: awareness of cybersecurity, in a very general sense, has never been higher. However, the other side of this is that people can feel overwhelmed by security and online threats.
In a study exploring online activities, researchers at NIST found that more than half of the research participants referred to feelings of security fatigue. The researchers were not looking for fatigue, but they found it, with people expressing a sense of resignation, lack of control, fatalism, risk minimisation and decision avoidance. What is unsurprising, but very important, is that the research also found that this sense of security fatigue led to individuals engaging in less secure behaviours (Stanton et al., 2016). This clearly shows that awareness of cybersecurity can actually undermine security behaviours, which in turn will of course be reflected in a less mature cybersecurity culture.
Awareness, in and of itself, should not be accepted as automatically good: when it comes to awareness-raising, it is not so much what you do that matters, but rather how you do it. The study identified three evidenced ways in which we can ease security fatigue and help people have more secure behaviours: limit the number of security decisions people need to make; make it simple for them to choose the right security action; and design for consistent decision making whenever possible.
These recommendations highlight that, when considering the human side of cybersecurity, we need to look at system design and opportunities for using technology to support individuals, as much as we consider how to raise awareness. Security fatigue can be in response to poor security system design and cognitive overload. Minimising security fatigue relies on us making security easier, and on better communicating security messages.
When discussing cybersecurity awareness, a question that often arises is how frequently an organisation should deliver awareness-raising. If awareness-raising happens too often, you risk adding to security fatigue and turning your messages into noise, but if it is too infrequent then you risk being ineffective. Like everything in cybersecurity, there is no silver bullet for this; tailoring your security awareness, behaviour and culture strategy and activities to your organisational culture is going to be more successful than trying to ‘lift and shift’ a one-size-fits-all approach.
However, when mapping out a security awareness programme, there are some fundamental elements to consider. Annual training is common in lots of organisations, with many capitalising on Cybersecurity Awareness Month in October to have a big push on awareness and make an impact on the organisation as a whole with engaging, innovative and informative activities. Beyond this, there are some other core awareness-raising opportunities, particularly at induction training for new employees and targeted training for high-risk groups and those who need cybersecurity training as a core requirement of their position. Having on-demand, bite-sized content is a fantastic way of supporting your colleagues, so they can get answers to security questions as they arise; cybersecurity champions, discussed in the previous chapter, can also really help with this.
When planning your awareness programme, think about it from the perspective of the colleagues you are trying to support with this programme. Consult your colleagues throughout the organisation, for example via your champions or via focus groups with a varied cross-section of areas and levels of the organisation. What content would help them and when would it be most effective to deliver that content? For example, you may want to deliver awareness-raising that coincides with holidays and celebrations, relevant to people’s personal and family lives as well as their role in the workplace.
The importance of understanding your audience and making awareness-raising activities relevant to them has already been covered in the ‘Enabling Self-Efficacy’ section. Knowing your audience means making the content relevant to the people in the room, using examples that will resonate with them, and it also means speaking their language. Cybersecurity is rife with technical jargon. When we communicate with people using terms that they do not understand, we are failing to make the messages relevant to them and we are likely to alienate them. Define the terms that you are using, such as ‘password manager’, ‘two-factor authentication’, ‘spear-phishing’ and ‘ransomware’; perhaps provide a glossary of terms.
Also consider how you can speak the language of your audience. For example, if you are delivering awareness-raising to the board, focus on the business implications of cybersecurity risks rather than the technical issues. Board members are generally more attuned to business and financial risk than to cybersecurity risk, so by discussing cybersecurity in terms of potential financial and reputational damage, you are speaking to board members on a level that is more comfortable and relevant to them. Highlight the business aspects of cybersecurity as opposed to the technical ones; for example, when discussing incident response, you can focus on the impact an incident would have on customers, on reputation and on different parts of the organisation, such as PR and communications, human resources (HR) and the legal teams. Rather than considering what you want to say, consider your messages in light of what your audience needs to hear and how you can most effectively shape your communications to achieve that.
How we communicate cybersecurity messages is a fundamental factor in the extent to which we successfully raise awareness of cybersecurity. By success, I mean the extent to which people listen to our messaging, understand what we have to say and, as a result, pursue more secure behaviours, in turn contributing to the development of a more positive cybersecurity culture.
Awareness-raising does not exist in a vacuum and the end-goal of awareness-raising activities should not simply be that people are more aware of cybersecurity. The end-goal of raising awareness should be that people have a better understanding of cybersecurity and that they are therefore engaging in behaviours that will better protect themselves and the organisation, supporting a positive and mature cybersecurity culture:
Knowledge and awareness is a prerequisite to change behaviour but not necessarily sufficient … It is very important to embed positive cybersecurity behaviours, which can result to thinking becoming a habit, and a part of an organisation’s cybersecurity culture.
(Bada et al., 2015)
We will move on to explore cybersecurity behaviour and culture in the rest of this book.
Let’s take a look at some next steps that you can consider in terms of cybersecurity awareness in your organisation: