Interview: Cohesity ANZ director on data management in a COVID-19 world
Tue, 14th Apr 2020

The proliferation of data has changed the landscape of almost every enterprise, affecting IT teams at every level.

Now that most IT teams have cloud strategies in place, issues and questions have cropped up around the swathes of data in circulation: where is it stored? What are the backup solutions? What are the security implications?

TechDay caught up with Cohesity ANZ managing director Steve Coad, who explains the importance of data backups, management, and security, and how the onset of a global pandemic has changed the industry.

What are the misconceptions around data storage in the cloud and what can you do about it?

The big misconception about data storage in the cloud is that it is cheap, but this all depends on your needs and what you're putting in the cloud.

The chief misconception that emerged from recent Vanson Bourne research was that simply moving an on-premises workload to the public cloud would simplify operations, increase agility, reduce costs and provide greater insight into data. That's not the case, as over 90% of respondents admitted.

The survey showed that 42% are using three to four point products to manage their data across public clouds today, while nearly a fifth are using as many as five to six separate solutions.

Respondents said this was an area of grave concern, especially if those products do not integrate. 

How can companies move data back and forth in the cloud and how can you simplify this?

While providing many needed benefits, the public cloud also greatly exacerbates mass data fragmentation - the spread of data copies into silos across a business.

We believe this is a key reason why the Vanson Bourne survey found that 38% of respondents say their IT teams are spending between 30% and 70% of their time managing data and apps in public cloud environments today.

With Cohesity, customers can tier file data to the cloud, leaving a reference access stub behind, and so save space on primary storage.
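To make the tiering-with-a-stub idea concrete, here is a minimal sketch of how cold files can be pushed to object storage and replaced locally with a small pointer. It is illustrative only, not Cohesity's implementation; the boto3 client, bucket name and paths are assumptions.

```python
# Illustrative sketch of file tiering with a local stub (general concept,
# not Cohesity's implementation). Assumes boto3 and a hypothetical bucket.
import json
import os
import boto3

s3 = boto3.client("s3")
BUCKET = "example-cold-tier"  # hypothetical bucket name


def tier_to_cloud(path: str) -> None:
    """Upload a cold file to object storage and replace it with a small stub."""
    key = path.lstrip("/")
    size = os.path.getsize(path)
    s3.upload_file(path, BUCKET, key)
    stub = {"tiered": True, "bucket": BUCKET, "key": key, "size": size}
    with open(path, "w") as f:          # overwrite the original with a tiny stub
        json.dump(stub, f)


def recall_from_cloud(path: str) -> None:
    """Read the stub and pull the full file back from object storage."""
    with open(path) as f:
        stub = json.load(f)
    s3.download_file(stub["bucket"], stub["key"], path)
```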

They can use cloud storage scalability to handle spikes in storage demand without the need for cloud gateways and disparate point solutions to connect to the cloud.

This all helps the transition from large capex investments to a pay-as-you-go operational budget model with the ability to leverage other services and applications available in the public cloud.

People are using backup as an insurance policy to deal with ransomware. Why is this no longer optimal and what can you do about it?
 

Put simply, you can't rely on a conventional backup to counter ransomware, because cybercriminals are evolving their threats faster than many backup vendors can develop counter-measures.

According to Forrester Research, only 21% of surveyed organisations confirmed they have contingency plans to recover from ransomware attacks, and only 11% of respondents said they were confident they could recover their data within three days of an attack.

What's ultimately needed is a modern backup solution with a multi-layered defence approach.

This enables you to defend against sophisticated ransomware attacks. At the same time, such an approach should include technologies such as an Immutable File System, which keeps backup jobs in time-based immutable snapshots.

The original backup job is kept in an immutable state and is never made accessible, which prevents it from being mounted by an external system.

The only way to mount the backup in read-write mode is to clone that original backup, which is done automatically by the system.

Although ransomware may be able to delete files in the mounted (read-write) backup, it cannot affect the immutable snapshot.
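As a rough illustration of this clone-before-mount idea (a conceptual sketch, not Cohesity's actual file system), the example below keeps snapshots immutable and only ever hands out deep copies for read-write access, so tampering with the clone cannot touch the original.

```python
# Conceptual sketch of "immutable snapshot, clone before mount" (illustrative
# only). A snapshot is never exposed read-write; callers get a clone, so
# deleting files in the clone leaves the snapshot untouched.
import copy
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Snapshot:
    """Time-based, immutable view of a backup job."""
    timestamp: str
    files: dict = field(default_factory=dict)  # path -> content


class BackupStore:
    def __init__(self):
        self._snapshots: list[Snapshot] = []

    def add_snapshot(self, timestamp: str, files: dict) -> None:
        self._snapshots.append(Snapshot(timestamp, dict(files)))

    def mount_read_write(self, index: int) -> dict:
        # The only way to get a writable view is a clone of the original.
        return copy.deepcopy(self._snapshots[index].files)


store = BackupStore()
store.add_snapshot("2020-04-14T02:00", {"/finance/q1.xlsx": b"..."})
clone = store.mount_read_write(0)
clone.clear()                      # "ransomware" wipes the writable clone...
assert store._snapshots[0].files   # ...but the immutable snapshot keeps the data
```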

The most obvious thing you can do to counter ransomware is to replicate your mission-critical data to another immutable cluster or site, which adds another layer of protection against ransomware attacks and effectively 'air gaps' the backups.

How will data management be impacted by 5G?
 

5G sits at a lofty but still untested peak right now. It is not yet mainstream, but expectations are huge, with analyst houses predicting stratospheric revenue opportunities.

While 5G might be the next big thing for wireless networks and telecommunications firms, for enterprise IT leaders it's more about providing a complete data management solution from the core to the cloud and edge.

According to Vanson Bourne research, over 90% of businesses believe the promise of public cloud hasn't been realised because their organisations are weighed down by mass data fragmentation issues.

Coming to grips with managing distributed data before you unleash 5G and roll out edge computing across your business will be key.

Not addressing these fragmentation issues first could lead to skyrocketing storage costs, security challenges, reduced agility, and greater risk around compliance and data recovery.

How has the COVID-19 pandemic changed the significance of backups? Has increased vulnerability as people work remotely affected this?
 

The current situation has expanded the IT landscape and, as a result, the backup surface.

Some organisations have staff using devices not issued by the corporate IT team, some are using their own personal devices, and more broadly there's an influx of formerly office-based workers now outside the corporate firewall, some without a VPN.

Each case and need will differ depending on the organisation's setup, but it isn't unrealistic to say organisations are having to ask employees to take greater responsibility for their own backups and file recovery.

If you're going to give your staff the ability to restore their computer in the event of issues, it's critical they understand the importance of backups, how to ensure their data gets backed up properly, and what to do in the event of a problem.

Likewise for data management – are there any shifts in what people need to be concerned about in a COVID-19 world?

Keeping productivity as high as possible and eliminating IT-led outages or issues is going to be your modus operandi for the next few months.

It's critical that you not only keep the lights on and keep data accessible, but that you proactively keep corporate data as secure as if all employees were on-site within the corporate firewall.

Undoubtedly the biggest threat will be cybercrime, which has already adapted with new targeted campaigns to obtain information.

The fast-moving landscape we've endured because of the virus has also created an environment where hackers, scammers and spammers are thriving.

Without doubt, an influx of home workers raises the threat profile for your organisation as attack surfaces increase.

Re-evaluate your IT policies and update them to support a remote workforce. Since February, security researchers have seen a spike in attacks, with the peak happening over the past fortnight.

To counter these attacks, use the tools already at your disposal to set up alarms and alerts that flag unusual activity such as permission changes, volume increases on storage, and large volumes of data being moved.
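As a hedged example of what such an alert might look like, the sketch below flags an unusually large data transfer against a recent baseline; the threshold and the notify() hook are assumptions for illustration, not a specific vendor feature.

```python
# Simple volume-based alert: flag when data moved in an interval far exceeds
# the recent baseline. Thresholds and notify() are illustrative assumptions.
from statistics import mean, stdev


def notify(message: str) -> None:
    # Placeholder: wire this to email, Slack, or your monitoring tool of choice.
    print(f"ALERT: {message}")


def check_data_moved(history_gb: list, latest_gb: float, sigma: float = 3.0) -> bool:
    """Return True (and raise an alert) if the latest transfer volume is an outlier."""
    if len(history_gb) < 5:
        return False                      # not enough baseline data yet
    baseline, spread = mean(history_gb), stdev(history_gb)
    if latest_gb > baseline + sigma * spread:
        notify(f"Unusual data movement: {latest_gb:.1f} GB vs baseline {baseline:.1f} GB")
        return True
    return False


# Example: nightly transfer volumes in GB, followed by a suspicious spike
check_data_moved([12.0, 14.5, 13.2, 12.8, 15.0, 13.9], 240.0)
```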

Finally, utilise any mobile apps and automated alert reporting from your vendors to make it easy to spot issues before they escalate.