While the public sector’s tech talent shortage woes are hardly new, solutions to patch this demand gap have been few and far between. For the more niche cybersecurity discipline, this shortfall is even more pronounced, increasing the potential firepower gap between cyber adversaries and defenders. So, to meet their personnel demands, do governments need to recalibrate their recruitment efforts or look to more out-of-the-box solutions?
We’ve taken a snapshot of the Cyber Panel discussion featured at the FST Government WA 2022 event, with cyber and information security experts from WA’s public sector weighing in on the skills and talent gap. They cover a re-think on hiring practices and the potential of recruiting healthcare specialists into the cyber discipline, why ‘off-the-shelf’ cybersecurity awareness training will never cut the mustard in government, and overcoming the social engineering hacks that are still tripping up public sector staff.
- Richard Asch, Head of Cyber Security, Western Power
- Julia Burns, Executive Director, South Metropolitan TAFE
- Vito Forte, Chief Information Officer, Edith Cowan University
- Brett Winterford, Senior Director Cybersecurity, Okta
Moderated by Christian Rasmussen, Executive Director, IT, St John Ambulance WA.
Moderator (Rasmussen): This financial year, the ACSC received more than 67,500 cybercrime reports, an increase of nearly 13 per cent from last year. This translates to one cyber-attack report every eight minutes, up from one every 10 minutes the previous year. In the face of rising threats, it’s clear agencies must remain vigilant and take steps to strengthen their cybersecurity posture. This, crucially, includes having the right talent to support their cybersecurity needs.
So, in the face of persistent talent shortages, how can we bridge the skills gap and attract, retain and develop cyber talent?
Burns (S-Metro TAFE): This is one of the really big questions for government. I can’t think of anyone I’ve dealt with in the private or public sector who hasn’t faced the problem of recruiting and retaining the right people. It keeps you awake at night!
Since before Covid, AustCyber’s yearly jobs report has consistently shown the shortfall we have in the workforce; it estimates around 17,000 additional jobs will be required by 2026. That number is not going down. The universities and TAFEs are providing that workforce, but they’re only producing around 500 graduates a year. And with Covid restricting our skilled migration, we’re facing an even bigger problem.
One of the other major problems for us in government is that we don’t have the ability to financially compete with the salaries offered by the private sector.
We really need to look at this problem from a different angle. In the past, we’d simply advertise, there’d be a lot of people applying, we’d pick off the best, and then we’d look at how we can retain and keep them excited about being in the job. We just don’t have those applicants anymore. We’ve found, as a result, that we really need to start investing in our own people.
We also need to look at providing opportunities for students to work in our organisations. Universities and TAFEs are all looking for placements for their students; the best way we can train students is to combine the theory with practical experience.
We’ve all got SOCs, but that’s not enough. They need to be working in a real-life situation.
Asch (Western Power): For me, attracting talent is also about rethinking hiring practices. If you look at a sample of ten job advertisements for cyber capabilities, a good chunk of them will have huge laundry lists of very [specific] experiences requiring every single cyber certification under the sun. In the constrained market we have today, you’re not going to find those people – or, if you do, you’re really going to be competing at that top end of the talent pool.
It’s really about looking for those skill sets that can transfer over from other disciplines; ones that you might not ordinarily think are good fits for cybersecurity personnel. In a technical and an operational sense, there’s a whole bunch of IT operations folks that have an interest in cybersecurity. They deal with triaging incidents and high-pressure situations, so they could be trained up and coached in those cyber areas.
I heard recently of recruitment campaigns to get nurses involved in cybersecurity.
If you strip out all the jargon, what does a nurse do? They triage, they do shift work, and they deal in high-pressure situations; much is the same in a big SOC (security operations centre) environment. We really need to think differently about hiring practices and stop putting a huge laundry list of experience on there; it’s not practical, nor will you attract the right talent, in my opinion.
Forte (Edith Cowan Uni): You’re right, we have vendors that come to the University and they’ll take – I think the last one took 50 – students through an internship program before they graduate. The biggest issue we have is they actually leave before they graduate; they end up just going through the certification process internally.
On a slightly different point, in the higher education sector, we already collaborate at a cyber level. We’ve got Communities of Practice; we’ve got a SOC across the whole sector that AARNet – itself owned by the universities and the CSIRO – runs. And we’re in the process of onboarding at the moment.
The question did arise internally of whether we, as a university, run our own infrastructure, and then run into resource cost and other issues, or do we collaborate at a sector level? We recognised that you gain strength from that collaboration and that consolidation. And, yes, there’s a cost associated with it, but I can’t build or run it alone for the money that I’d be charged. And, what’s more, everyone learns from everyone else.
Who in this room has got enough capacity to deal with a state actor? No one. So why even go there?
Coming together is the most important thing, and that’s where you’re going to get your economies. I don’t know about you, but if you try to buy a SIEM (security information and event management) nowadays, you need a small state [budget’s worth] of money just to buy it in order to capture events. You can’t just do that.
Winterford (Okta): Given the constraints that we’ve discussed, it’s really about creating a very supportive and engaging work environment once you do have the talent on board. That requires giving a security team a really clear articulation of their purpose and making sure that they’re heard and seen in the organisation. It’s also about providing learning opportunities around their chosen domains of interest.
Often the reason why really strong talent chooses one employer over another, in my experience, is because there’s a specific individual or a specific team of individuals they want to work with. If you don’t have that environment today, if you don’t have the scale, what kind of training or mentorship opportunities can you provide that are going to help compensate for that?
Finally, they need to see some evidence of career progression, particularly inside something like a government agency.
My advice, based on similar organisations I’ve worked for in the past, is to try not to shoehorn technically gifted people into management roles once they’ve progressed.
If they show no interest in those roles, it might mean having a conversation with HR around creating technical lead roles where individual contributors can still progress without necessarily having to take on managerial responsibilities; that’s often the point where very skilled people leave organisations for financial services or technology companies because they feel that, as a technical professional, there’s no further they can progress.
Moderator (Rasmussen): That’s actually a very good point, Brett [Winterford]. I know in government they have special classifications for lawyers and such because they can’t retain them at the levels that they’d normally be employed.
Moving on now to cyber awareness, what are some ways of building effective, ongoing cybersecurity awareness and training programs for staff?
Forte (Edith Cowan Uni): People’s inability either to understand the risks or to recognise how their own behaviours contribute – and I know it’s difficult at times – is generally the biggest issue with cyber.
Education is so important, and I know we’ve got active mechanisms currently to provide education, but I’m continually amazed at how many people say, ‘I don’t need to know anything about this because I already know everything!’. To me, that’s the biggest issue that we have. And if you go out into the general public, there’s even a lower level of awareness, generally – which is scary, to say the least.
Moderator (Rasmussen): That’s a very good point. I saw a stat recently that showed around 90 per cent of cybersecurity breaches are due to user error.
Burns (S-Metro TAFE): The secret for me is helping laypeople, that is non-IT people, in the organisation get over this idea that there’s some cool IT dude back in the office who’ll take care of it all and that they’re in some safe little bubble.
We need to get everybody to understand that this is not just about cyber – the key word here is ‘security’.
When people think about what that means in terms of their own personal security, their ears prick up. It’s the security of your information, your identity; people get really fired up when they understand that someone can take their whole identity.
That’s why it’s so critical to have those translators or communicators in your organisation who can take that technical information and put it into real-life terms that someone who’s not in that world can understand. That’s when you get buy-in.
Looking also at the social engineering aspects of it, most breaches are the result of someone doing something really silly. No matter how many technical barriers you put in or how brilliant your team is, it won’t help if you’ve got people on the inside making mistakes because they don’t understand the [risks].
Forte (Edith Cowan Uni): I’m happy to talk about a particular incident we had about two weeks ago where someone clicked on an email, which then led to a phone call that our staff member made to a scammer, who then proceeded to extract $500 worth of Apple iTunes cards, paid out of the staff member’s own pocket – and they did it willingly! It’s not even a technical thing; apart from the initial email, it wasn’t like they were doing something nefarious within the network per se. It was the social engineering around it that was scary.
Winterford (Okta): I’ve done security awareness at three organisations now and feel quite passionately about it, particularly so about not putting too much on the user and also coming up with different ways to protect them.
For example, in my current organisation, we’ve created a Slack bot backed by detections we’ve written: when a user engages in a known pattern of activity that social engineers exploit, the bot sends them a message on Slack that says, ‘Did you mean to do X?’. Quite often staff will say, ‘Oh, no, I didn’t mean to do that!’.
We’re always trying to get ahead of what social engineers do and understand that humans are frail. We make errors. It doesn’t mean we’re stupid.
Your learning experiences really need to empower your users. Your advice should be positively framed. Tell them what they should do and what is the correct process, instead of just telling them what they shouldn’t do and what they shouldn’t fall for.
Your learning experiences should be actionable. Be wary of vague statements. Tell them exactly how your processes work and how your tools can protect them. Learning experiences should also be engaging and experiential.
I’ve just built a new learning experience for MFA (multi-factor authentication) push fatigue. This involves sending random MFA pushes to our staff to teach them what attackers will do if they obtain legitimate credentials against a push-based MFA factor – it’s about making sure that staff experience something in the moment rather than just once a year through training.
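A push-fatigue drill of this kind can be sketched in a few lines. The function and field names below are assumptions for illustration only – the panel doesn’t describe the internal tooling – but the idea is to pick a random subset of staff to receive a benign, unexpected push, then grade whether they approved it.

```python
# Minimal sketch of an MFA push-fatigue drill. Names and thresholds
# are invented for illustration, not a description of any real tool.
import random

def select_drill_targets(staff: list[str], fraction: float, seed: int) -> list[str]:
    """Pick a random subset of staff to receive an unexpected (benign) MFA push."""
    rng = random.Random(seed)  # seeded for reproducible drills
    count = max(1, int(len(staff) * fraction))
    return rng.sample(staff, count)

def grade_response(approved: bool) -> str:
    """An unexpected push should be denied and reported, never approved."""
    if approved:
        return "At risk: approved a push they did not initiate - follow up with coaching"
    return "Correct: denied the unexpected push"

targets = select_drill_targets(["alice", "bob", "carol", "dan"], fraction=0.5, seed=42)
print(targets, grade_response(False))
```

The point of the in-the-moment design is that the lesson arrives attached to a real push notification, not a slide in an annual training deck.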
Also, for learning experiences, you need a platform that’s dynamic and responsive.
Buying security awareness materials off the shelf is rubbish. They are always outdated.
It seems like the easy path, but at the end of the day, you’ve got to contextualise the risks to your organisation. Vito [Forte] mentioned a recent incident at his organisation. Every time I’ve run security awareness training, my first 30 minutes are always about recent events that we can now ‘declassify’ for staff to show them that this is how security incidents happen. And that’s how you get buy-in from people.
Asch (Western Power): I came from the resources sector previously, and there’s no question about sharing the details of a safety incident in order to improve overall safety and safety culture. If someone hurts themselves, obviously all the sensitivities are stripped away from it, but we talk about what happened, what went wrong, and what we’re going to do about improving the conditions and the controls to prevent that from happening again.
We need to get into that same mindset about security: people make mistakes, stuff goes wrong, things go bang. When it’s declassified and you can talk about it, you should – that makes it real for people.
It’s not this ethereal risk that happens to companies in the news; you can show that it’s happened in your backyard, this was the outcome, and this is what we’re going to do to prevent that going forward. ■
This is an edited extract from the Cyber Panel featured at the FST Government Western Australia 2022 conference.