5 Numbers You Should Know About Data Protection

Data Protection and Disaster Recovery

You may have a good sense of where your own IT team stands on data protection, but what about the business next door, or your main competitor? What are the trends in the industry at large?

Recently, an independent research team undertook the largest known industry survey on the topic of data protection. Our friends at Veeam published the results, and there is a lot to take in. What nuggets of knowledge can we glean, and what is their significance? Keep reading as we dig into the research and unearth five figures that will help you keep a finger on the pulse of data protection across the board.

50/50 – On-Premises and Cloud-Hosted Servers

This figure represents the industry standard for splitting IT between on-premises and cloud-hosted servers. For three consecutive years, the most common arrangement has been, broadly speaking, an even split between machines in an organization's own data center (both physical servers and virtual machines) and virtual machines hosted elsewhere or through a managed service provider. The hybrid model appears to be here to stay, so those in charge of data protection tactics should make sure their plans cover physical, virtual, and multi-cloud environments alike.

4 Years & Disaster Recovery 

The number of companies choosing to store secondary copies of their data with an outside cloud provider as part of their business continuity strategy is expected to grow by 30% in just four years (from 2020 to 2024). Cloud-powered disaster recovery is on the rise for two primary reasons. First, leaning on a provider's secondary infrastructure adds much-needed flexibility to a self-managed data center. Second, outsourcing some disaster recovery services provides access to technical know-how without the need to hire in-house experts.

The Cost of Downtime – $1,467 

That number is the average cost of downtime per minute, which works out to roughly $88,000 per hour. Considering that the average outage lasts 78 minutes and 40% of servers suffer at least one outage per year, it adds up fast. Through all of this, the magic number seems to be 60 minutes; that is where data loss tolerance hits its breaking point. For 55% of high-priority data and 49% of standard-priority data, the tolerance for loss is up to one hour. Notice that high-priority workloads and everyday IT workloads barely differ here. It's all important.
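
To see how quickly these figures compound, here is a minimal back-of-the-envelope sketch in Python. The inputs come from the paragraph above; the expected-annual-cost step is our own illustrative assumption (treating the 40% figure as exactly one outage per affected server), not a result reported by the survey.

```python
# Back-of-the-envelope downtime cost math using the survey figures above.
# The expected-annual-cost formula is an illustrative assumption of ours,
# not something reported by the survey.

COST_PER_MINUTE = 1_467      # average cost of downtime, USD per minute
AVG_OUTAGE_MINUTES = 78      # average outage duration, minutes
ANNUAL_OUTAGE_RATE = 0.40    # share of servers with at least one outage per year

cost_per_hour = COST_PER_MINUTE * 60                      # ~ $88,020
cost_per_outage = COST_PER_MINUTE * AVG_OUTAGE_MINUTES    # ~ $114,426
# Lower bound: treats "at least one outage" as exactly one outage per server.
expected_annual_cost = cost_per_outage * ANNUAL_OUTAGE_RATE

print(f"Cost per hour of downtime: ${cost_per_hour:,}")
print(f"Cost of an average outage: ${cost_per_outage:,}")
print(f"Expected annual cost per server: ${expected_annual_cost:,.0f}")
```

Even with that conservative assumption, a single average outage costs over $114,000, and its 78-minute duration already exceeds the one-hour tolerance most respondents report.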

Ransomware Attack – 3 out of 4

Three in every four businesses have experienced a ransomware attack in the last year, and most encountered two or more. Whether the entry point is an insider threat, a spam email, infected software, a malicious link, or a compromised credential, ransomware is a serious threat even to organizations that believe their cybersecurity plans fully support business continuity. Considering that about a third of data is never recovered after a ransomware attack, the stakes for this particularly potent type of cyberthreat are high.

90% – The Availability Gap

This is the percentage of organizations that self-report a mismatch between what their SLAs promise and how quickly their IT systems can actually be brought back to normal. Called an availability gap, it is a growing concern for leaders trying to hit productivity goals despite lagging application recovery times. The prevalence of availability gaps across the industry shows that almost every IT team is losing the race to improve recovery time as workloads grow ever more critical.
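
To make the term concrete, here is a minimal hypothetical sketch of how an availability gap can be spotted for a given workload. The workload names and minute values are invented examples for illustration, not figures from the survey.

```python
# Hypothetical illustration of an availability gap: the SLA sets a
# recovery target, but actual recovery takes longer.
# All workload names and numbers below are made-up examples, not survey data.

workloads = {
    # workload: (SLA recovery target in minutes, measured recovery in minutes)
    "order-processing": (60, 95),
    "internal-wiki": (240, 180),
}

for name, (sla_minutes, actual_minutes) in workloads.items():
    gap = actual_minutes - sla_minutes
    if gap > 0:
        print(f"{name}: availability gap of {gap} min "
              f"(SLA {sla_minutes}, actual {actual_minutes})")
    else:
        print(f"{name}: within SLA (SLA {sla_minutes}, actual {actual_minutes})")
```

The survey's 90% figure suggests that, for most organizations, at least some workloads fall into the first branch.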

When it comes to data protection, there is a lot to know. If you are looking to strengthen your business continuity plan, check out our helpful guides on what to include in your DR plan and how to choose a DR provider.