Eight isn’t enough: why cybersec teams should rethink their approach to the ACSC’s ‘Essential Eight’ – Highlights from the FST Gov Australia panel


This is an extract from the Cybersecurity Panel at FST Government Australia 2019.

Featured speakers on the Cybersecurity panel:

Jamie Norton, Chief Information Security Officer, Australian Taxation Office
Mike Webb, Chief Information Officer, The Treasury
Scott MacLeod, First Assistant Director-General, Protect, Assure and Enable, Australian Signals Directorate

Moderated by Paul Cooper.

—————————————————–

Cooper (Moderator): Firstly, if you could introduce yourself very quickly and perhaps offer a little something that you think the audience should know about cyber.

Webb (Treasury): I’m Mike Webb, the CIO at Federal Treasury. From my perspective, cyber is a really difficult challenge for any organisation. The goalposts are constantly moving, and the threat actors are getting smarter and cleverer. You’re getting hit from almost every direction. And as technology iterates, it’s opening up more opportunity to transform and to provide value to the business, but it’s also creating a bunch of additional vectors that could lead to compromise. That’s the naysayer’s view on cyber.

MacLeod (ASD): Scott MacLeod. I work in the Australian Cyber Security Centre (ACSC). I’m one of the two ‘Band 2s’ in there. I look after protection, assurance and enablement, while my peer looks after operations, intelligence and incident response; he’s very busy, which tells you how well I do. If there’s anything that people need to get their heads around with cyber, it’s that it’s everyone’s responsibility, and no one person or one organisation is going to be able to deal with it. One of the things we’ve done very poorly is involve industry in cyber. We’ve got to open our eyes and get everyone to the table, to start having more honest conversations about how we do cyber better. Right now, our adversaries are lean and efficient and we’re not.

Norton (ATO): Good afternoon everyone. Jamie Norton from the Australian Taxation Office (ATO). My observation would be – and I’ll hit this one pretty much front on – that the vast expanse of threats we see in the environment these days doesn’t lend itself very well to a regulatory framework. And the regulations that are currently imposed upon us are distracting in the extreme and aren’t helping to address the real risks that we face, both at the ATO and, I’m sure, more broadly. We’re missing the mark. And, in fact, if we dealt with those regulations exclusively, we’d be very insecure.

Cooper (Moderator): Fantastic. We’ve had some questions come through fast and furious from the audience. We’ll start with patching; it accounts for two of the Essential Eight and covers the entire technology stack. How do your teams keep up and manage patching against exploits of unknown vulnerabilities? What is best practice, and how are you managing it?

Norton (ATO): It’s quite a complex issue. On the surface, at some levels, people just look quizzically and go, ‘Why aren’t you just doing this? It should be something that’s just done.’ But at the scale of the ATO, it is quite a complex issue. We’ll obviously assess a patch based on risk and determine a timeframe that it has to be delivered by, and we’ll incorporate feedback from the ACSC and others to determine timelines. But the actual act of patching can be quite tricky, because we’ve got to maintain the resilience of the applications and services we provide to the Australian public, and we can’t be offline for a week, or even 24 hours, while we patch. In terms of the mechanisms, it’s a case-by-case basis, because there is such a vast array of systems that we don’t have one consolidated patch engine. It’d be nice to click a button and say ‘Roll that patch out to everything’ but, unfortunately, that’s not the way it is in reality. So, we do it based on risk, based on the criticality of the patch and how we apply that to our environment.

Cooper (Moderator): I know of plenty of examples of vendors that have introduced more vulnerabilities through the patches they’ve released, so there’s always that aspect to it, isn’t there?

Norton (ATO): Yeah, definitely. It’s always that balance of resilience versus security. If it’s something that’s absolutely going to take us apart in 24 hours, then we will patch within 24 hours. But if the exploit has not yet been released in the wild, then we’ve got a little bit more time and it’s a case of how we test that appropriately, so that you don’t lose your ability to lodge a tax return while we’re patching.
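
To make the risk-based triage Norton describes a little more concrete, the sketch below shows one simplified way a team might map a patch’s criticality and exploit status to a patching deadline. It is purely illustrative: the field names, thresholds and deadlines are assumptions for the example, not the ATO’s actual process or ACSC guidance.

```python
# A minimal, hypothetical sketch of risk-based patch triage.
# All categories, thresholds and deadlines are illustrative only.

from dataclasses import dataclass

@dataclass
class PatchAssessment:
    identifier: str          # e.g. a CVE reference (placeholder value below)
    severity: str            # "critical", "high", "medium" or "low"
    exploit_in_wild: bool    # is a working exploit already circulating?
    public_facing: bool      # does the affected system back a public service?

def patch_deadline_hours(p: PatchAssessment) -> int:
    """Return a hypothetical deadline, in hours, for applying the patch."""
    if p.exploit_in_wild and p.severity == "critical":
        return 24                                  # patch immediately, accept the outage risk
    if p.severity in ("critical", "high"):
        return 72 if p.public_facing else 168      # allow a testing window first
    return 720                                     # fold into the routine patch cycle

# Example: a critical vulnerability on a public-facing service, no exploit in the wild yet
assessment = PatchAssessment("CVE-XXXX-YYYY", "critical", False, True)
print(patch_deadline_hours(assessment))            # prints 72
```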

Cooper (Moderator): A question for Scott. What is the role of traditional, often slow accreditation in a world of constantly changing services and technologies?

MacLeod (ASD): That’s a good question. It’s fair to say the idea of accreditation as a point-in-time value – and this goes a lot to patching, if you like – is not useful. One of the areas that we want to try to focus on – and I know from an ACSC point of view, we’re really starting to drive this message now – is getting people out of this idea that on 1 July we have accreditation done and everything is hunky dory. It doesn’t work like that.

Let’s swap the message around a bit and say, ‘This is really about cyber hygiene; it’s something we do continually and something we can’t take our eye off at any stage’. We see this when people or organisations report compliance with the Top Four or Essential Eight. We’ve never asked you to be compliant with them; what we say is that it’s a risk management conversation. The idea of compliance really gives you a false sense of security… Even when you’re patching, there are other things that you can do.

And I like the way that we humans have chosen the ‘Top Four’ and the ‘Essential Eight’ – we’re really good at that – but we actually give out 37 mitigation strategies, not four and not eight. For me, any CIO that is focused on the Four or the Eight is just taking the easy journey. There are 37 of them there; you’ve got to read them and work out, from a business point of view, what works for you. The other part… is we need to work with industry so that we can automate a lot of this because, in a hygiene sense, it’s something that’s got to happen behind the scenes. If we’re relying on people to do this work, we will fail, because none of us has a workforce of that size. The Cyber Security Centre is about 350 people and we look after the whole of Australia. If we thought that we could do that with 350 people without some form of automation, then we’re going to fail. For me, that’s one of the great challenges: we’ve got to change the message from accreditation and start moving towards things like hygiene. And it’s about understanding your business and managing risk.

Cooper (Moderator): On the point of hygiene, how can we make the cultural shift required for ongoing cyber hygiene?

MacLeod (ASD): Put the human front and centre. I used to say cybersecurity would be great if we got rid of the humans, but we can’t do that. The reality is, we have yet to communicate cybersecurity in a way that end users understand. It is still too technical; it’s still too complex for people to understand – they don’t get it. And that’s because most of our industry has been built around techs.

Cybersecurity is many different things, and multi-disciplinary teams are a really good idea when you’re working on the cybersecurity side of the house. We do have psychologists and people like that who understand how we work with humans and how we get that end-to-end model operating for us. The really big challenge is that technology is never going to be the answer on its own; there is always going to be a human aspect, and we have to ask how we educate people.

We’ve seen this a lot in the university sector with the guidelines that have just been produced. If you read them, you see they rely a lot on culture. The foreign interference problem is going to be a cultural problem that the universities are going to have to work through. Actually, the technology side of it – and everyone’s going to get really angry here – is pretty easy in this day and age. Cybersecurity from a technology point of view is a lot easier now than it was 15 to 20 years ago: patching is very much automated, you can buy cloud services, and there are things that you can do, so it’s not as hard on the technology side. The human side, that’s the complexity problem. And we are yet to come up with a good message that our end users understand about how we make them part of our cybersecurity hygiene process.

Webb (Treasury): It depends on the organisation. If we’re talking about compliance, that binary ‘yes/no’ compliance needs to shift to an active risk management approach; that’s the way IT manages cyber risk. But, as an organisation, we escalate that to the executive, so that we’re articulating cyber risk around our important assets and the threat actors that are likely to be attacking our networks, our systems and our people, and then spending a lot of time on security awareness with end users.

I’m quite lucky at Treasury in that we have a reasonably strong security culture. There are a few sensitive bits of data we deal with very regularly, so the baseline is relatively high compared to other parts of the APS I’ve been in. But we have mandatory security awareness training annually, where the cyber guys dumb it down from a technical perspective and say, ‘Look, this is what phishing looks like. This is what you need to do with your own personal cyber hygiene at home’. It’s a bit of guidance to put it in end-user terms: what to be wary of and what to look out for. And we’ve seen a shift in the last 18 months to two years at Treasury, where we see a lot of people coming to us saying, ‘Hey, I think I’ve got a phishing email’, or ‘I think I’ve got this or that’. So that helps. But it’s a smallish organisation as well, compared to where Jamie’s at, and there’s probably a lot less turnover.

Norton (ATO): Just reinforcing what Scott and Mike said, cybersecurity is still not an exact science, and it’s got a while to go before we reach a point where we have a very defined way of doing things. That comes down to the skills and expertise of the individuals; we need to train more people and bring them into the industry and into government. But, from there, we need to trust those individuals to identify what needs to be done for an organisation, what the business imperatives are, and what the risks look like for that organisation, and try not to create a one-size-fits-all model that everyone has to adopt, because we certainly face risks at the ATO that aren’t in any regulation. And I’m sure every other agency does as well, so we need to make sure that we trust these individuals and use regulation as guidance, not try to do it the other way around.

Cooper (Moderator): The other thing, if I may share from personal experience: I am working on a sessional basis with Deakin University in Victoria, and they do continual cyber awareness training, anti-phishing exercises and so on. And, although some of that is effective, there’s also a little bit of fatigue from the continuous messaging. I get why it’s required, but that’s probably a human factor – we may be getting a little overexposed to it, despite recognising its importance.

Moving on to the next question from the audience. We recently saw senior officials state that ISM (Information Security Manual) changes happen so frequently, it’s hard to keep up. What does the future of security policy look like in government?

MacLeod (ASD): In the context of the ISM, we used to have an annual release, and many agencies, particularly smaller ones, found that quite difficult. We’ve since gone to monthly releases. We’re looking at whether there is a point in time every year where we say that this is the definitive version of it. The disadvantage of that for us is that, right now, the world is changing so quickly that coming out once a year – and people having such a reliance on it – is really challenging for us.

You will have seen in the last two years that there has definitely been a move away from that compliance model – the ISM did get into a bit of compliance over the last few years – and we’re now getting back into that risk framework. We have to balance how much information we can get in there, how fast we can get it out to you, and making it digestible. That’s been one of our greatest challenges. If you’re a large organisation like the ATO, doing the ISM once a year is probably not too bad if you’ve got a big IT security side of the house. But if you’re a small agency relying on two or three people, that can be really difficult. So, there is a need for a smaller consumption cycle. I don’t want to mandate when you should be doing things; if you want to do your annual check and run it annually, that’s up to you.

The other part is, from my side, I’ve only got a limited workforce as well. The ISM is produced by one person – they have a lot of support from the rest of my workforce, but it’s basically one person – and it helps us to keep that focus if we can do those monthly releases. Will we change the policy? Are there other versions out there? Sure, there are lots of different versions out there right now. But, for me, the ISM hits the mark. It’s a very, very good policy. I’ve looked at others around the world, and I think we’re in a good space. Interestingly, what’s happened in Australia is we’ve seen many companies outside of government using it in preference to many of the international standards, because it’s a bit more digestible than many of the big ones. If you want to put the ISM up against the NIST (Cybersecurity Framework), trust me, the ISM is a bit more digestible than what NIST would normally produce.

I like it, but if people think we should do it annually, I’m happy to hear from you. If you’d prefer monthly, I’m also happy to hear from you… So feel free, we’ll take any feedback we can, but right now, monthly suits me just because of the size of the workforce I have and the number of changes I have to do within it.

Cooper (Moderator): And Scott, while you’ve got the talking stick: you mentioned a couple of times your direct workforce of 350 and that you can’t do everything with that. But isn’t it also the case that you’ve got a very large indirect workforce through all the other security organisations in banks and large government departments? I know there’s a great deal of effort to link all these up in ways that give us some amplification. Would you like to elaborate on how that’s going, and whether there are ways we can get more bang for the buck, so to speak?

MacLeod (ASD): For those that don’t know much about the Australian Cyber Security Centre (ACSC), it’s actually a multi-agency centre. It’s made up of members of ASD; the Australian Federal Police sit inside there, as does the Defence Intelligence Organisation. We also have the ACIC (Australian Criminal Intelligence Commission). And, interestingly, we have Home Affairs in there, and Home Affairs is our strategy and policy agency. With them sitting inside, our operational people don’t really like it because… every time we say something, they say we can’t really do that. But then the policy people sit next to the techs, so when they say something, we say ‘actually, we can’t do that either’, so it’s a really good balance.

We also have a number of private sector companies inside the ACSC. We don’t advertise it, but they are in there, and we’re bringing on a few more. We are members of a number of different communities within the financial sector and across critical national infrastructure – you would have seen last week we ran GridEx, which is a big Australian energy sector exercise. We reach into many organisations. To make that a bit easier, we have Joint Cyber Security Centres in every state in Australia except for the Northern Territory and Tasmania, and they’re on the cards at some point in time. And we bring industry together in there as well. So we have quite a big reach, and we listen to all those different groups and try to manage that. As I said at the start, we can’t do this alone. Industry is very important to us, and the different levels of maturity among those industry players is a big factor as well. But we have 600 to 1,000 different relationships within Australia that we work with as we do cybersecurity.

Cooper (Moderator): The next question that’s come through: what E8 (Essential Eight) maturity levels should government organisations be aiming for?

Norton (ATO): My answer to that would be: whatever’s most appropriate for the agency. Without taking an onerous compliance mindset, it comes down to where the risks are for that agency and what we’re trying to address. If you take multi-factor as an example, there are lots of areas where multi-factor may address a risk for the agency, whether it’s administrative access, remote access and so on. But it’s up to the agency to determine what’s important from a risk perspective and then apply those controls. We shouldn’t just go: ‘We should all be at maturity Level 3 or 4 across the board’. That’s going to divert resourcing and dollars when there are bigger risks that the agency could be addressing.

Webb (Treasury): We did that, too. We had a Treasury and ANAO audit on the Top Four in early 2018. We were found compliant with the Top Four, and then a month later one of the SaaS services that we were using for HR and recruitment got hacked. So, to Jamie’s point, compliance is quite possibly a necessary evil and a benchmark with which to measure yourself. But it is definitely a ‘point in time’ thing, and it doesn’t put into context the agency, the organisation, the data, the risk profile and what matters to that organisation. I would wholeheartedly agree with Jamie on that point: you should aim to be as high as practically possible, but manage well beyond that, because there’s a lot that we’ve been focusing on in those other 29 and even beyond that.
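
As a rough illustration of the risk-based view both panellists take, the sketch below shows one hypothetical way an agency might record per-control risk ratings and derive a target maturity level from them, rather than aiming for a uniform level across the board. The control names are the Essential Eight; the ratings, thresholds and derived levels are invented for the example and are not ACSC guidance.

```python
# A hypothetical sketch: deriving per-control target maturity from an agency's
# own risk ratings. Ratings and thresholds below are illustrative only.

ESSENTIAL_EIGHT = [
    "application control",
    "patch applications",
    "configure Microsoft Office macro settings",
    "user application hardening",
    "restrict administrative privileges",
    "patch operating systems",
    "multi-factor authentication",
    "regular backups",
]

# Example risk ratings for an imagined agency (1 = low exposure, 5 = high).
risk_rating = {
    "multi-factor authentication": 5,   # e.g. heavy remote and administrative access
    "patch operating systems": 4,
    "regular backups": 3,
    # ... remaining controls would be assessed the same way
}

def target_maturity(control: str) -> int:
    """Map a control's risk rating to a target maturity level (illustrative thresholds)."""
    risk = risk_rating.get(control, 2)  # default to a modest rating if not yet assessed
    if risk >= 4:
        return 3
    if risk >= 3:
        return 2
    return 1

for control in ESSENTIAL_EIGHT:
    print(f"{control}: target maturity level {target_maturity(control)}")
```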

Cooper (Moderator): In our pre-panel discussion, we talked about AI as a cyber threat and about using fire to fight fire, so to speak. Scott, you mentioned that it’s not really a technology issue but a human or cultural concern; in terms of artificial intelligence, how can we get better software development practices that protect us against these sorts of very sophisticated AI intrusions?

MacLeod (ASD): One, we don’t know enough about AI right now to understand it from a cybersecurity point of view and from a defensive or an offensive point of view. We’re still developing new technologies and these are very early days in terms of how we work in the AI space. The uses of AI and the ethics around it are still very grey. We’re starting to see a lot of Western countries put into place some ethical boundaries around AI and I think there will be some benefits as we do that. But for me, it’s still very very early days in determining how it’s going to affect us. I’m not sure that we quite understand what it means to us.

It does worry me, if I think of using AI from an offensive point of view. But on the defensive side, I don’t think we’ve really got our heads around it yet. For me, this is still very early days and the really smart people… are yet to nail this one very well for us.

Norton (ATO): On the last point you made on software development, this is the emerging issue. It’s one of the largest emerging issues we face and it’s often overlooked, but the provenance of our code, and making sure that it’s clean and secure, is one of the biggest challenges we face. And it’s not in any of the regulations or any of the textbooks for cyber. But it’s going to become a massive issue. It’s already a bigger issue than we think.

—————————————————–