Technology lifecycle management: The silver bullet for endpoint security?

Tanium Digital Security Panel (top, L–R): M. Pancino; G. Blair; A. Dacal; B. Turner (moderator)
IT teams’ neglect of patching, upgrading, or switching off legacy systems is leaving financial firms exposed to “unnecessarily large attack vectors”, warned former CBA technology chief Matt Pancino, alongside other ex-Big Four banking stalwarts.

Pancino, speaking on FST Media’s recent Security Digital Discussions, encouraged CISOs and IT teams to renew their focus on managing hardware and software lifecycles to increase oversight of digital perimeters – a critical enabler of a proactive security posture, he argued, particularly as post-Covid digitisation ramps up across the industry.

Indeed, while technology lifecycle management should be a priority for technology professionals, the practice has apparently been overlooked, with many financial institutions dismissing it as an unnecessary expense.

Yet, for Pancino, the challenges are inevitable, because “the day a system goes into production, it’s legacy”.

“For too long we’ve built new things, thrown them into production and [considered it] someone else’s problem,” he said.

As a result, today, financial firms’ IT shops harbour “long tails of legacy” applications that must be ‘cocooned’, or shielded, while the rest of the enterprise modernises. Sometimes, Pancino said, they are “not even known about”.

Lifecycle management – that is, monitoring when a system goes from being vendor-supported or maintained to legacy – involves tracking assets so that organisations can securely carry out system hardening, patching, and decommissioning.
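In practice, the starting point is an asset register that records when each system falls out of vendor support. The minimal Python sketch below is purely illustrative – the asset names and dates are hypothetical, and a real estate would feed this from an endpoint management platform or CMDB rather than a hard-coded list.

from dataclasses import dataclass
from datetime import date

@dataclass
class Asset:
    name: str
    support_ends: date  # date vendor support or maintenance lapses

# Hypothetical register entries, for illustration only.
register = [
    Asset("core-banking-api", date(2026, 1, 1)),
    Asset("branch-report-server", date(2021, 6, 30)),
]

def legacy_assets(assets, today=None):
    # Anything past vendor support is legacy: a candidate for
    # hardening, cocooning, or decommissioning.
    today = today or date.today()
    return [a for a in assets if a.support_ends < today]

for asset in legacy_assets(register):
    print(f"LEGACY: {asset.name} (support ended {asset.support_ends})")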

Echoing Pancino’s sentiments, fellow panellist Armando Dacal, Tanium’s vice president and general manager for Asia Pacific, recounted how his enterprise clients typically manage to patch only around 80 per cent of their systems, rather than achieving a full network upgrade.

“You ask why 80 per cent is just a magic number – and that’s to Matt’s point, that last 10 to 20 per cent is just hard to find,” Dacal said.

However, in Dacal’s observation, legacy applications that are poorly managed raise difficult questions for technology practitioners.

“[Clients ask], do we continue to work towards 100 per cent [of patching], or do we move on to another project, because that last mile is likely going to consume probably 10 times more time than the initial 80 per cent?” Dacal explained.
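To put that arithmetic in concrete terms, the short sketch below computes patch coverage across a hypothetical endpoint inventory and surfaces the outstanding ‘last mile’ of hosts; the hostnames are invented, and in a real environment this data would come from an endpoint management platform rather than a hard-coded dictionary.

# Hypothetical endpoint records: hostname -> whether the latest patch is applied.
endpoints = {
    "wks-0001": True,
    "wks-0002": True,
    "wks-0003": True,
    "atm-gw-07": False,    # hard-to-reach legacy endpoint
    "branch-db-3": False,
}

patched = sum(endpoints.values())
coverage = patched / len(endpoints) * 100
print(f"Patch coverage: {coverage:.0f}%")

# The 'last mile' Dacal describes: endpoints still outstanding.
outstanding = [host for host, ok in endpoints.items() if not ok]
print("Unpatched endpoints:", ", ".join(outstanding))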

To him, network visibility is key to securing endpoints.

“Ideally, you want to be [in] a position where, if an issue happens today, you’re concerned about it. You can look across your network, identify [the issue] very quickly, and jump into action immediately. When a crisis happens, that’s not the time to debate the data. That’s the time to jump into it,” Dacal said.

Panellist Gary Blair, a former CISO at NAB, Westpac, and CBA, on the other hand regards securing endpoints on legacy systems as a business problem rather than a technology one.

“The fundamental thing is, you can choose how to pay for [the problem]. You can pay for it upfront and proactively, and it’s going to cost you less, or you can ignore it and put it to one side,” Blair said.


Keeping on the front foot with cyber risks and controls

Blair also encouraged CISOs and IT teams to take a ‘risk-based’ rather than a ‘compliance-based’ approach to managing budgets and enterprise initiatives; in his experience, projects initiated to address compliance issues raised by regulators or auditors are typically of “lower quality from a security perspective”.

“Their objective [from a compliance angle] is to get the issue off the table, not necessarily address the risk,” Blair said.

Drawing on decades of industry experience, he urged CISOs to instead aspire to be in control of projects that address risks as they emerge, rather than “responding to somebody with a big stick”.

On the subject of regulators, all three industry experts recommended greater transparency and open communication when engaging with Australia’s prudential regulator, APRA, particularly as technology environments grow in complexity.

Pancino favours a constructive and open approach, noting not only that there is much to learn from regulators, but also that a strong relationship helps ensure financial firms’ technology projects do not get bogged down and can progress at a reasonable pace.

However, Pancino stressed that the discussion around enterprise technology must be “controls-based”, rather than “IT-based” – that is, avoiding an overly technical focus that obscures bad processes.

“When you start thinking about your standardised controls around availability, security, disaster recovery, and when you demonstrate those controls, and who can operate those controls, if you can demonstrate that you can manage those environments, then you can pretty much use the technologies that will deliver most value to your business,” he said.

Pancino also praised the Banking Executive Accountability Regime (BEAR) and its mandate to increase accountability within financial firms – particularly helpful, in his opinion, in providing clarity on who is responsible for protecting data and critical assets.

The BEAR framework – which commenced in July 2018 for large Authorised Deposit-taking Institutions (ADIs), and the following year for small and medium ADIs – is a legislative reform, administered by APRA, intended to transform governance, risk culture, remuneration and accountability within the financial services sector.

Blair similarly commended the BEAR regime for supporting risk-based thinking and, by extension, promoting a security-focused culture and mindset within financial institutions from the top down.

To Blair, it is vital that there is a clear, enterprise-wide understanding of an organisation’s risk appetite.

“Fortunately, in financial services, that’s now supported by APRA through the BEAR framework, where accountability is actually set,” he said.

“I think that’s an excellent way of actually getting real clarity around who’s responsible for what element of risk.”