FINAL WORD
Dave Russell, Senior Vice President, Head of Strategy, Veeam
full data resilience, many for the first time, revealing a number of previously unknown blind spots.
But no matter how they realised their gaps, organisations did not fall behind overnight. For many, it has happened incrementally, with data resilience standards failing to keep pace as new technologies and applications were adopted. With most organisations implementing AI at will to stay ahead of the competition and optimise business processes, the impact on their data profiles has gone largely unnoticed.

The sheer amount of data needed and generated by these applications has resulted in sprawling data profiles that fall far outside existing data resilience measures.

Pair this with an underdeveloped understanding of modern data resilience, and you have a recipe for disaster. It is often a case of not knowing what you do not know. As a result, many organisations have been benchmarking themselves against the wrong yardsticks.

Take the standard tabletop exercise: it is better than nothing, but data resilience cannot be measured on paper. In theory, an organisation's processes might work; in reality, it is a whole other story.

So, what is next? Rather than waiting for an incident to put them to the test, organisations need to get comfortable with being uncomfortable. That means proactively uncovering and addressing gaps, however uneasy it might make them feel.

The first step for any organisation with below-par data resilience is to build a clear picture of its data profile: what you have, where it is stored, and why you do or do not need it.

With this, you can reduce at least some of your data sprawl by filtering out obsolete, redundant, or trivial data, focusing on the data you actually need. Then, get to work securing it.

But the work does not stop there. Once your new data resilience measures are in place, it is time to stress test them. And not just once. Data resilience measures need to be consistently and comprehensively tested, pushed to their very limits, much like in the real thing: cyber-attackers will not stop when your systems start to creak a little, and they will not wait for the perfect time.

Go through scenarios where key stakeholders are on annual leave, or where security teams are occupied with something else entirely, to expose every potential gap in your measures. It might seem excessive, but otherwise the first you will hear of these vulnerabilities will be during, or after, a real attack.
It is a significant piece of work to undertake, but data resilience is worth every penny. According to the Veeam report, produced in collaboration with McKinsey, companies with advanced data resilience capabilities see 10% higher annual revenue growth than those lagging behind.
That is not to say that improved data resilience will magically boost these figures for you, but bringing up your data resilience standards will inevitably have a knock-on effect on processes across the board.
At the very least, you can be sure that cyberthreats will only grow more complex, and that data footprints will not be getting smaller any time soon.
This is an issue every organisation will have to face, so jump in the deep end now before you get pushed beyond your limits by a cyber-attack.
INTELLIGENTCIO MIDDLE EAST, www.intelligentcio.com