As the public sector shifts to a new operating model it has to take a new posture on cyber security, writes Gregor Davidson, cyber specialist at Nutanix
Hybrid cloud is becoming the operating model of choice for many public sector organisations, and while it provides a range of desirable features and business benefits it also raises new questions around their cyber security postures.
Cyber attacks such as ransomware often hit through an unwitting employee clicking on a malicious email attachment, download link or HTTP payload, and organisations need to take steps to prevent, detect and remediate them.
A key element of this is microsegmentation: dividing data into distinct security segments, down to the level of individual workloads, which requires a new perspective in the move to hybrid cloud.
Traditionally organisations have taken a network-centric approach, making provision to isolate specific networks and controlling access to each to prevent the spread of malware. But this is less well suited to a hybrid cloud model, especially one that provides the capacity to shift data and applications between a number of cloud platforms.
The focus has to shift towards identifying and categorising the applications in use, then arranging for specific ones to have access to each other as the business processes require. This is accompanied by giving people who need it the appropriate access to individual applications and enabling them to go from one into another if necessary.
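As an illustrative sketch only (this is not Nutanix Flow's actual API, and the application names are hypothetical), an application-centric segmentation policy can be modelled as a default-deny allow-list of app-to-app flows:

```python
# Hypothetical sketch of application-level microsegmentation:
# traffic between applications is denied by default and only
# passes where a flow has been explicitly allowed.

ALLOWED_FLOWS = {
    ("web-frontend", "payments-api"),   # example app pairs, not real systems
    ("payments-api", "records-db"),
}

def flow_permitted(source_app: str, dest_app: str) -> bool:
    """Default-deny: allow traffic only for explicitly listed app pairs."""
    return (source_app, dest_app) in ALLOWED_FLOWS
```

The point of the default-deny design is that a flow nobody thought to allow, such as the web front end talking directly to a database, is blocked automatically rather than requiring someone to anticipate and forbid it.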
This means that any employee falling foul of a malware attack will infect fewer systems, and it provides stronger protection than opening up the whole network to employees and creating a larger attack surface.
It is possible to create a microsegmentation plan and security policy through software defined networking, with the capacity to monitor activity through dashboards, visualise network traffic, run automated audits that flag any signs of suspicious activity, and automate compliance checks. This makes it possible to deal with potential vulnerabilities before they are targeted.
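A minimal sketch of the automated audit idea, comparing observed network flows against the segmentation policy to surface violations before they are exploited (the flows and policy here are invented for illustration):

```python
def audit_flows(observed_flows, allowed_flows):
    """Return every observed app-to-app flow that the policy does not permit."""
    return [flow for flow in observed_flows if flow not in allowed_flows]

# Example: one compliant flow and one policy violation
allowed = {("web-frontend", "payments-api")}
observed = [("web-frontend", "payments-api"), ("web-frontend", "legacy-db")]

# Any violation returned here would feed an alert or a compliance report
violations = audit_flows(observed, allowed)
```

In practice the "observed" side would come from flow logs or traffic visualisation, but the comparison itself is this simple: anything seen on the wire that is not in the policy is a finding.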
There are a number of factors that come into play. One is being able to see where data resides and where applications run across multiple clouds, and how those clouds are operating at any time. This can be more difficult with legacy systems and it needs work to identify what they hold to feed into a plan.
This relates to the different approaches to storage. File storage is in wide use and creates a need to identify different types of files and the associated risks, categorise them and install alert functions to block traffic into them if necessary. This can be supported by a dashboard capability that makes it possible to see any unusual activity among the retained files.
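One way to sketch the categorise-and-alert idea for file storage (the extension lists are illustrative examples, not a complete risk taxonomy; a real deployment would also use content inspection and behavioural signals):

```python
# Hypothetical risk categories by file extension, for illustration only.
HIGH_RISK = {".exe", ".js", ".vbs", ".scr"}
MEDIUM_RISK = {".docm", ".xlsm", ".zip"}

def categorise(filename: str) -> str:
    """Assign a coarse risk category that downstream controls can act on."""
    suffix = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if suffix in HIGH_RISK:
        return "high"      # candidate for blocking inbound traffic
    if suffix in MEDIUM_RISK:
        return "medium"    # candidate for an alert or dashboard flag
    return "low"
```

The categories then drive the controls the article describes: high-risk files can have traffic into them blocked, while medium-risk activity surfaces on a dashboard for review.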
Object storage – in which each object comprises the data itself, metadata and a unique identifier – is now the ‘go to’ approach for data archiving, large datasets and data lakes. It is often targeted by ransomware, which seizes objects and threatens to delete them permanently unless a ransom is paid.
The risk of this can be reduced by, where possible, using immutable storage, in which it is possible to designate data that cannot be modified or removed with a ‘write once read many’ (WORM) approach. This is becoming an important option within managed cloud services, and provides an extra level of security, with the service provider taking responsibility for protecting the data.
Policies can be designed for immutable storage, enabling WORM at bucket level and ensuring it cannot be disabled, and extending (but not reducing) how long it can be applied to data.
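The WORM semantics described above can be sketched in a few lines. This models the behaviour only and is not any vendor's storage API:

```python
class WormBucket:
    """Toy model of WORM object storage: write once, never modify or delete,
    with retention that can be extended but never reduced."""

    def __init__(self, retention_days: int):
        self.retention_days = retention_days
        self._objects = {}

    def put(self, key: str, data: bytes) -> None:
        if key in self._objects:
            raise PermissionError("WORM: object already written, cannot modify")
        self._objects[key] = data

    def delete(self, key: str) -> None:
        raise PermissionError("WORM: objects cannot be deleted")

    def set_retention(self, days: int) -> None:
        if days < self.retention_days:
            raise ValueError("WORM: retention can be extended, not reduced")
        self.retention_days = days
```

This is why WORM storage blunts ransomware: even an attacker holding valid credentials cannot overwrite or delete the protected objects, or shorten the window during which they are locked.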
There is also a need for a robust disaster recovery (DR) plan to ensure that if there is an incident the applications and data will transfer to a secure cloud, and back-up data is stored in an isolated cloud, segmented from the rest of the network. This is something Nutanix makes possible through natively integrated cloud DR, in which a primary site is backed up to the Nutanix Xi Leap Service.
Some of these factors were highlighted by Kevin Mortimer, head of operations for the University of Reading, at the recent UKAuthority Resilience and Cyber4Good conference. He referred to how it had built a segregated network using Nutanix Flow, a software defined network service that can apply policy management to virtual environments. This enables it to manage the ‘east to west’ traffic to minimise the damage from any ransomware attack.
He emphasised the importance of developing and testing a good DR plan, saying that when the university did so it identified a number of issues in the flow of processes between applications that needed fixing.
He also spoke about the need for a measured approach in moving data and applications to cloud solutions, rather than “shovelling it in”, as there is a need to understand what data is being held, how long it is retained and the regulatory demands that apply.
The most important step, he added, is to adopt the concept of ‘least privilege’, with a default approach of restricting people’s access rights to systems and data, and reducing the number of users with admin rights. This comes with looking at how people are authenticated to access network systems and which ports are open to connections.
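The ‘least privilege’ idea can be illustrated as a default-deny grants table. The users, systems and actions here are made up for the sketch:

```python
# Hypothetical grants table: anything not listed is denied by default,
# and each entry names only the actions the role genuinely needs.
GRANTS = {
    ("asmith", "case-management"): {"read"},
    ("bjones", "case-management"): {"read", "write"},
}

def can_access(user: str, system: str, action: str) -> bool:
    """Least privilege: deny unless the (user, system) pair grants the action."""
    return action in GRANTS.get((user, system), set())
```

As with the segmentation policy, the value is in the default: an unknown user, an unlisted system or an ungranted action is refused without anyone having to write a rule for it.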
It all demonstrated the importance of defence in depth, comprising a series of measures that recognise the need to fully understand, segment and closely monitor all the elements of a hybrid cloud estate.
Nutanix can support the approach through its secure private cloud platform along with multi-cloud services delivering key capabilities, including:
- securing the platform with automation;
- automatically detecting and remediating any security configuration errors;
- protecting data with native data-at-rest encryption;
- network segmentation and application policy controls;
- auditing and reporting on regulatory compliance.
UKAuthority has worked on a number of reports and briefing notes from our virtual round tables with Nutanix on the theme of transformation. Catch up with the research here
Image from iStock, ipopba