By Nikhil Korgaonkar, Regional Director, Arcserve India & SAARC
Major carriers around the world are starting to roll out 5G capabilities. Most organizations are looking forward to the productivity boost that will come with 5G’s increased speed and bandwidth. But they’re also wondering how they’ll handle the flood of data that 5G will bring.
Indeed, 5G will usher in an era of explosive data growth. That’s because the increased speed and lower latency of 5G will power many next-generation applications, which will mean much higher data use. According to an Ericsson report, average data traffic per smartphone in the India region is the second-highest globally and is projected to grow to around 40GB per month in 2026.
And here’s the trillion-dollar question: where will all that data be stored, and how will it be protected? Just as important, what architectures can organizations create to successfully capture and process the new data and use it to their advantage?
New policies are needed
With so much more data generated in the 5G era, organizations will need governance policies to determine what data is mission-critical and needs to be saved and protected, versus what data is less important and temporary and can be quickly discarded. Not all data is equal, and companies need a system to decide what data is valuable and what is not.
Many companies have no such system. The truth is that traditional data backup and disaster recovery teams tend to err on the side of caution. They operate on the principle that you can’t go wrong by backing up everything. So, they take a one-size-fits-all approach to their data and back up all of it. This works, after a fashion, when data volumes grow linearly and storage expenses remain manageable, but it won’t work in the 5G era.
A one-size-fits-all policy is not feasible in an environment where the amount of data you need to store and protect increases exponentially. If organizations treat all their data the same, the sheer cost of storing it all will drive some businesses into bankruptcy.
Think about it. What business can afford to allocate an ever-expanding chunk of its budget to data storage alone? It’s just not a sustainable approach from a cost perspective.
New data management solutions will help
So, what’s the answer? As noted, organizations need to create policies that enable them to store their data in the most price-optimal way. For instance, they can use more expensive tier-one storage for mission-critical data that needs to be accessed frequently and recovered quickly. For less important data, they can rely on more cost-effective alternatives. The trick is to quickly determine which pieces of data are more important than others.
Fortunately, there is a new generation of data management solutions that come packed with intelligence to help you understand which data is worth saving and which is expendable. Based on how frequently you’re going to use the data and how critical it is, you can then decide whether to put it in primary storage or lower-cost secondary storage.
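To make this tiering idea concrete, here is a minimal sketch of what such a classification policy might look like in code. The tier names, thresholds, and the `DataAsset`/`assign_tier` helpers are all hypothetical, invented for illustration; a real data management product would derive these rules from observed access patterns and governance policy rather than hard-coded values.

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    name: str
    mission_critical: bool    # flagged by governance policy
    accesses_per_day: float   # observed access frequency
    retention_days: int       # how long policy says to keep it

def assign_tier(asset: DataAsset) -> str:
    """Map a data asset to a storage tier using simple, illustrative rules."""
    if asset.mission_critical:
        return "tier-1"       # fast, expensive primary storage
    if asset.accesses_per_day >= 1:
        return "tier-2"       # cheaper secondary storage
    if asset.retention_days > 0:
        return "archive"      # lowest-cost cold storage
    return "discard"          # temporary data, safe to delete

# Example classification
logs = DataAsset("session-logs", mission_critical=False,
                 accesses_per_day=0.1, retention_days=30)
orders = DataAsset("order-db", mission_critical=True,
                   accesses_per_day=500, retention_days=3650)
print(assign_tier(logs))    # archive
print(assign_tier(orders))  # tier-1
```

The point of even a toy policy like this is that it is explicit and auditable: every asset gets a documented reason for where it lives, which is exactly what a one-size-fits-all "back up everything" approach lacks.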
By moving to such an intelligent and dynamic data management and protection model, organizations can keep their storage costs under control in the 5G era. What’s more, as companies roll out various use cases for 5G, whether immersive technologies like virtual reality or the real-time monitoring and remote control of connected equipment, they will have the comfort of knowing that their data storage capabilities are up to the task.
Companies that act now will win
With 5G arriving soon, it’s time for every organization to ask itself whether it is truly ready for the explosion of data that will come with 5G. Those that now use a one-size-fits-all approach to data storage and management are probably not ready. Those that use an intelligent, multitier strategy for categorizing and prioritizing their data will be in a good position to unleash the tremendous potential of 5G and gain a critical edge over their competitors in the years ahead.