It is the day before April Fool’s Day, so it is perhaps appropriate that it is also World Backup Day. Despite its name, it is, in fact, a real observance. Originally created to remind consumers to back up personal files like photos, music and movies, it has also taken on a resonance in the enterprise that cannot be avoided. Two recent pieces of research show just how important backing up enterprise data is. The first is from Veeam, whose Data Protection Report 2021 shows that keeping and securing data is still a major problem for enterprises.
Veeam is a Swiss-based developer of backup, disaster recovery and intelligent data management software. The research, based on a survey of more than 3,000 IT decision makers at global enterprises, indicates that data protection challenges are undermining organizations’ abilities to execute digital transformation (DX) initiatives globally.
It found that 58% of backups fail, leaving data unprotected. Against the backdrop of COVID-19 and ensuing economic uncertainty, which 40% of CXOs cite as the biggest threat to their organization’s DX in the next 12 months, inadequate data protection and the challenges to business continuity are creating the biggest headaches for enterprises.
Specifically, the impediments to transformation are multi-faceted, including IT teams being too focused on maintaining operations during the pandemic (53%), a dependency on legacy IT systems (51%) and a lack of IT staff skills to implement new technology (49%).
And if you think that is a problem, further research by Ottawa-based Rewind confirms that backup and data loss are inextricably linked. Overall, the Rewind research found that while more than half (53 percent) of respondents cited using SaaS tools on the job, and some (43 percent) used four or more, many (45 percent) were still unaware of the Shared Responsibility Model (SRM).
Under the SRM, the organization’s security team keeps some responsibility for security as applications, data, containers and workloads move to the cloud, while the provider takes on some responsibility, but not all of it. This means that while SaaS providers actively back up their own cloud infrastructure, they do not make the account-level, business-critical information stored in their apps available to users in the event of data loss.
This is even more problematic when you consider that most users said the data in their SaaS applications was either “somewhat” (47 percent) or “very” (42 percent) critical to their work. In fact, depending on cloud providers for backup is increasingly common. Often the first foray for enterprise companies moving to the cloud is to consider using public cloud for backup and disaster recovery. The need for a scalable, agile and cost-efficient approach to backup and high availability for sensitive enterprise data is important, Vinay Mathur, chief strategy officer at Toronto-based Next Pathway, told us. The legacy approach of relying on on-premises servers for backup and redundancy is not cutting it, as it does not scale and is cost prohibitive.
A good example we see in the market today is disaster recovery strategies for large on-premises Hadoop clusters. Especially at large companies, legacy Hadoop clusters store massive amounts of data (often at petabyte scale), and an on-premises backup and disaster recovery approach would come with an enormous price tag. “Instead, companies are looking to the hyperscale cloud providers to solve this challenge,” he said. “And while doing so, enterprise companies, some of whom have been reticent to consider a move to the public cloud, start to dabble in unfamiliar territory with massive upside for their IT operations.”
Next to employees, data is the most valuable asset of any commercial business, nonprofit or government agency. Protecting both has therefore become a top priority for organizations large and small around the world. This past year, however, saw both come under attack as a direct result of the COVID-19 pandemic. From a data protection standpoint, the rush to accommodate new and necessary ways to work, shop and live opened the door to cybercriminals, according to Surya Varanasi, CTO of Nexsan, a subsidiary of StorCentric. Consequently, we saw a dramatic increase in ransomware attacks and high-profile data breaches that further cemented the importance of backup, he shared.
“However, the past year also taught us the criticality of ‘unbreakable backup.’ Certainly, the overall objective of backup is the ability to recover from any failure or data loss within a specified period,” he said. “The process of backing up, especially to disk, has become highly automated after initial setup across applications, platforms and virtual environments.”
Now, however, he added that as ransomware and other malware attacks continue to increase in severity and sophistication, we understand the need to protect backed up data by making it immutable and by eliminating any way that data can be deleted or corrupted.
Unbreakable backup does just that, creating an immutable, secure format that stores the admin keys in a separate location for added protection. Those seeking an unbreakable backup solution for their environment should look for one that delivers data integrity through policy-driven, scheduled integrity checks that can scrub the data for faults and auto-heal without any user intervention.
Next, they should seek a solution that provides high availability, with dual controllers and RAID-based protection that can guarantee data access in the event of component failure. Recovery of your data will also be faster because RAID-protected disk arrays can read faster than they can write. With an unbreakable backup solution with these capabilities in hand, users can stop worrying about their ability to recover and redirect their time and attention to activities that more directly impact the organization’s bottom-line objectives.
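The scheduled integrity checks described above boil down to recording a checksum for every backed-up file and periodically re-verifying it. A minimal sketch in Python, assuming a simple JSON manifest (the `record_manifest` and `scrub` helpers are illustrative, not any vendor's product):

```python
import hashlib
import json
import os

def checksum(path, algo="sha256", chunk=1 << 20):
    """Stream a file through a hash so large backups never load fully into memory."""
    h = hashlib.new(algo)
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def record_manifest(backup_dir, manifest_path):
    """Record a checksum for every file in the backup set."""
    manifest = {}
    for root, _, files in os.walk(backup_dir):
        for name in files:
            path = os.path.join(root, name)
            manifest[os.path.relpath(path, backup_dir)] = checksum(path)
    with open(manifest_path, "w") as f:
        json.dump(manifest, f, indent=2)

def scrub(backup_dir, manifest_path):
    """Return the files whose current checksum no longer matches the manifest."""
    with open(manifest_path) as f:
        manifest = json.load(f)
    return [rel for rel, digest in manifest.items()
            if checksum(os.path.join(backup_dir, rel)) != digest]
```

A real product would run the scrub on a policy-driven schedule and repair flagged files from a redundant copy; this sketch only shows the detection half.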
The bottom line is that there is little room any more for a legacy approach to data and organizations will have to find a way that works for them. “I think the impact of COVID on the nature and distribution of work has clearly underscored the challenge with legacy approaches to data protection,” said Mike Plante, chief marketing officer at eFileCabinet and a long-time veteran of data backup.
“Distributed workforces, rapidly evolving workflows, and strained IT departments mandate a new approach,” he said. “We believe that for most enterprises, natively embedding cloud-based content management systems into their employees’ everyday workflows is a better, more natural way to consistently protect data, one that delivers a materially higher likelihood of protecting critical assets versus legacy data protection approaches.”
He added that DX initiatives will fail if data protection does not evolve from file-based to content-based, and from point-in-time to full lifecycle protection.
Remote Work And Security
The recent move to remote working as the pandemic took hold has also increased the risk to data. Across private and public sectors, there was an onslaught of phishing, malware distribution, false domain names and other attacks on IT infrastructure as teams quickly pivoted to remote working, JG Heithcock, general manager of Retrospect, told us.
“While we continue to navigate the uncertainties of the pandemic this year and beyond, it is also important to reiterate simple steps to avoid or minimize attacks on businesses, such as identify suspicious senders, exercise caution before clicking on links or opening attachments, and instill a backup strategy that utilizes the 3-2-1 rule.”
A strong 3-2-1 backup plan includes having at least three copies of data across multiple locations: the original, a first backup stored onsite, and a second backup located offsite. In the current environment, where ransomware attacks are commonplace, if all organizational backups are on a single disk that is connected to a main computer, those backups can be encrypted at the same time as source data, rendering them useless. “With three copies of data — on the computer, on local storage and on offsite storage — rapid recovery from threats such as ransomware becomes much more practical,” he added.
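The 3-2-1 rule above can be sketched in a few lines. In this minimal Python illustration, a second directory stands in for offsite storage (in practice that copy would go to a remote data center or cloud bucket); the function name and layout are hypothetical:

```python
import datetime
import os
import shutil

def three_two_one_backup(source, local_dir, offsite_dir):
    """Maintain three copies of the data: the original, a date-stamped
    copy on local storage, and a date-stamped copy offsite (here a
    second directory stands in for remote storage)."""
    stamp = datetime.date.today().isoformat()
    name = f"{stamp}-{os.path.basename(source)}"
    os.makedirs(local_dir, exist_ok=True)
    os.makedirs(offsite_dir, exist_ok=True)
    local_copy = os.path.join(local_dir, name)
    offsite_copy = os.path.join(offsite_dir, name)
    shutil.copy2(source, local_copy)    # second copy, onsite
    shutil.copy2(source, offsite_copy)  # third copy, offsite
    return local_copy, offsite_copy
```

The point of the separation is exactly the ransomware scenario described above: an attacker who encrypts the source and the local copy still cannot reach the offsite one.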
Backing up your data is vital for every organization, big or small. There are many variables that can impact an organization and cause data loss, including loss of connectivity, natural disasters and cyberattacks, to name a few, Joe Shields, president of Urbandale, Iowa-based IP Pathways, said.
The most expensive part of a data loss is getting back online. In the US, a data breach costs a company on average $8.19 million, up from $7.91 million in 2018 and more than twice the global average. It is imperative, then, for organizations to have a plan in place to back up their data. Why is this so hard for organizations?
IT is complex, and data backup is hard for organizations because they do not know where to start and have a lot of questions. As IT solution providers, the common questions we hear about data backup include:
- How often should I back up my data?
- How long do I need to keep my backups?
- Do I need to pay for real-time replication?
- Should I be mirroring my data across different data centers?
- What is the risk if I only back up daily?
- What solution should I implement?
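The frequency and retention questions in the list above are usually answered with a tiered pruning policy. A minimal sketch in Python of a grandfather-father-son scheme (the tier windows here are illustrative defaults, not vendor guidance):

```python
import datetime

def retention_plan(dates, today, keep_daily=7, keep_weekly=4, keep_monthly=12):
    """Given the dates of existing backups, return the set to keep:
    every backup from the last week, Sunday backups for the last month,
    and first-of-month backups for roughly the last year. All windows
    are hypothetical defaults for illustration."""
    keep = set()
    for d in dates:
        age = (today - d).days
        if age < keep_daily:
            keep.add(d)                                   # daily tier
        elif d.weekday() == 6 and age < keep_weekly * 7:
            keep.add(d)                                   # weekly tier (Sundays)
        elif d.day == 1 and age < keep_monthly * 30:
            keep.add(d)                                   # monthly tier
    return keep
```

Everything outside the returned set can be pruned, which keeps storage costs bounded while still answering “how long do I need to keep my backups?” with a concrete policy.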
“Oftentimes data backup isn’t enough; it doesn’t provide a complete solution. To ensure the protection of your data, you need to find a solution that includes anti-virus and malware protection,” he said. “Your IT team has to perform regular patch management. You need a firewall solution to prevent malicious traffic from entering. IT also needs the capability to perform image and file backup at a system level to recover data on an individual machine.”