Why Are You Spending So Much on Storage?

Steven Zolman
Oct. 15, 2011


Should we be surprised we’ve reached this point? After all, data processing has been around at least since the advent of movable type, and we’ve been challenged with the proper and cost-effective storage of data ever since. Most IT organizations face unprecedented pressure to serve up rapidly and continuously growing information sets faster and more reliably than ever before, even as they tighten their IT budgets. That leaves the cold reality of how to pay for this ever-growing demand for more storage, because there is a high probability you’re not going to constrain the need for information in your organization. At the same time, the functional demands on storage are increasing: business users want their information now, in real time, and as a result storage media and technologies are increasingly driven toward higher levels of availability, reliability and performance.

While you may be ahead of the curve in planning your next storage acquisition to address anticipated growth, there is a good chance your storage projections will be obsolete before you hit send on your email to procurement. But that is likely not the real problem here. There is a more important imperative to take a step back and ask: do I really know what I’m trying to store, and for what purpose? Do I really know what information the business users want and how best to get it to them? How long should I keep this data? Where and how should it be stored? Who owns it? How should I secure and protect it? What level of business criticality does this information have, and how should it therefore be backed up and potentially restored? Without clear answers to questions like these (and many more), backed by a comprehensive and clear data management strategy, you are likely spending way too much on storage. Most clients that engage in a holistic review of storage policies, technologies, data retention plans, and backup and recovery objectives, followed by an optimization effort, achieve savings in excess of 30%.

Data management, along with a data protection strategy and a plan for the lifecycle management of data, provides the foundation for introducing some coherence into what has become an increasingly complex array of channels through which data passes from the point of inception to archival. Data creates information, information creates more data for understanding and context, and the lifecycle continues to evolve. A methodical and deliberate understanding of this data determines how it should be stored and protected, which translates into strategies for tiered storage, backup and recovery, archiving, and recovery point (and time) objectives for disaster recovery.

All too often, storage technology is piled on over time: a SAN for this, NAS for that, Windows file services for something else, with no clear tiered storage strategy that puts data where it belongs based on its requirements for read/write performance, longevity or security. Instead, as is often the case, everything becomes quasi enterprise-class, including data that is infrequently touched or application files that are rarely refreshed. A tiered storage strategy would place less volatile and less mission-critical data on lower-cost technology alternatives, or perhaps cloud storage for active archive requirements. Without this consideration, you are likely paying way too much for storage.
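To make the idea concrete, here is a minimal sketch of a tier-recommendation survey. The tier names and last-access thresholds are illustrative assumptions, not a prescription; a real policy would come out of your data management strategy and data classification work, not file timestamps alone.

```python
import os
import time

# Hypothetical tiers and age thresholds (days since last access).
# The cut-offs here are illustrative; yours would come from policy.
TIER_RULES = [
    (30,   "tier1-san"),      # hot data: high-performance SAN
    (180,  "tier2-nas"),      # warm data: mid-range NAS
    (None, "tier3-archive"),  # cold data: low-cost or cloud archive
]

def recommend_tier(path):
    """Suggest a tier based on days since last access.
    Note: st_atime may be unreliable on volumes mounted noatime."""
    age_days = (time.time() - os.stat(path).st_atime) / 86400
    for max_age, tier in TIER_RULES:
        if max_age is None or age_days <= max_age:
            return tier

def survey(root):
    """Walk a directory tree and tally bytes per recommended tier."""
    totals = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            try:
                tier = recommend_tier(full)
                totals[tier] = totals.get(tier, 0) + os.path.getsize(full)
            except OSError:
                continue  # skip unreadable files
    return totals

if __name__ == "__main__":
    for tier, size in sorted(survey("/data").items()):
        print(f"{tier}: {size / 1e9:.1f} GB")
```

Even a crude survey like this tends to show how much of your most expensive tier is occupied by data nobody has touched in months.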

Is data deduplication the cure-all for your data storage constraints? It is if you believe what your backup and storage vendors are telling you. But data deduplication, typically bundled with your backup and recovery technology, is often only a band-aid over a larger problem: a poor data management strategy. Deduplication is great technology and a long-overdue, pragmatic way to reduce storage requirements for backups and archives, but there is no sense in spending hundreds of thousands of dollars on a deduplication product to store data that doesn’t need to be there in the first place.
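For readers unfamiliar with what deduplication actually does, here is a minimal sketch of the basic idea, fixed-size block hashing, used here only to estimate a file’s dedup ratio. Commercial products typically use variable-size (content-defined) chunking and far more robust indexing; this is an illustration of the technique, not how any particular product is implemented.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed blocks; real products often chunk by content

def dedupe_stats(path, block_size=BLOCK_SIZE):
    """Estimate a file's dedup ratio via fixed-size block hashing."""
    seen = set()
    total_blocks = 0
    with open(path, "rb") as f:
        while True:
            block = f.read(block_size)
            if not block:
                break
            total_blocks += 1
            seen.add(hashlib.sha256(block).digest())  # identical blocks collide
    unique = len(seen)
    ratio = total_blocks / unique if unique else 1.0
    return total_blocks, unique, ratio

if __name__ == "__main__":
    total, unique, ratio = dedupe_stats("backup.img")
    print(f"{total} blocks, {unique} unique, {ratio:.2f}:1 dedup ratio")
```

The point of the sketch is the limitation it exposes: deduplication only collapses data that is redundant, it does nothing about data that should never have been retained at all.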

Storage equipment, software and services suppliers have long argued that the cost of managing data is significantly larger than the incremental cost of just buying more. In our clients’ experience, however, that is absolutely not the case. Clients that manage these environments carefully spend only $1 for every $3 they save, while the incremental cost of adding more and more storage is clearly a net increase in costs. Even in the worst case, a carefully managed storage strategy will help virtually any company save considerably, and it is a far better bet than just buying more.

This has implications as well for your recovery time and recovery point objectives in disaster recovery planning. Getting your most mission-critical data from point A (the primary production environment) to point B (the backup DR environment) quickly enough to meet your recovery point objective is probably one of your greatest disaster recovery planning challenges. Approaching this with your data management strategy in mind first will lead to a coherent, pragmatic and economical set of priorities for recovering data in the event of a disaster. How should I move my data to the DR data center? Tape to disk? Block-level synchronization? Snapshots? Each of these scenarios carries its own costs for moving and storing data, and if you approach them in a narrow, disaster-recovery-only manner, you will likely spend way too much on storage and storage-related technologies.
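As a back-of-the-envelope check, here is a small sketch of the kind of arithmetic involved: whether a WAN link can ship the data changed during one RPO window within that same window. The change rate, bandwidth and RPO figures in the example are made up purely for illustration.

```python
def rpo_feasible(changed_gb_per_day, wan_mbps, rpo_minutes):
    """Check whether a WAN link can replicate one RPO window's worth
    of changed data within that same window."""
    window_gb = changed_gb_per_day * rpo_minutes / (24 * 60)
    transfer_minutes = (window_gb * 8000) / wan_mbps / 60  # GB -> megabits
    return transfer_minutes <= rpo_minutes, transfer_minutes

# Illustrative figures only: 500 GB of daily change over a 100 Mbps link.
# Each 15-minute window produces ~5.2 GB, which ships in ~7 minutes,
# so a 15-minute recovery point objective is feasible on this link.
ok, minutes = rpo_feasible(changed_gb_per_day=500, wan_mbps=100, rpo_minutes=15)
print(ok, round(minutes, 1))  # True 6.9
```

Run the same numbers for all of your data indiscriminately and the link (or the budget) breaks; run them only for the data your strategy says is truly mission-critical and the problem usually becomes tractable.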

And speaking of data protection, or more specifically backup and recovery, this is another area that is likely putting tremendous pressure and unnecessary storage cost on the organization. Whether you’re struggling to complete backups within a reasonable overnight window or lugging vast volumes of tape media to offsite storage facilities, it is worth knowing that backup and recovery products have evolved significantly in the past several years. They now integrate archiving, data deduplication and storage virtualization in ways that can have a significant impact on the efficiency and cost of your data protection and storage strategy. If you haven’t started revamping your backup and recovery strategy and how it integrates with your disaster recovery plans, you should start today.
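The overnight-window problem is, at bottom, simple arithmetic, and a quick model shows why deduplication and incremental backups matter so much. The data size, throughput and reduction ratio below are illustrative assumptions only.

```python
def backup_fits_window(data_tb, throughput_mb_s, window_hours, reduction=1.0):
    """Estimate whether a backup job completes inside its window.
    `reduction` models dedup/incremental savings: 10.0 means only a
    tenth of the raw data actually has to move."""
    effective_mb = data_tb * 1_000_000 / reduction  # TB -> MB (decimal)
    hours_needed = effective_mb / throughput_mb_s / 3600
    return hours_needed <= window_hours, hours_needed

# Illustrative figures: 40 TB at 400 MB/s is ~27.8 hours of raw transfer,
# hopeless for an 8-hour window; a 10:1 reduction brings it to ~2.8 hours.
print(backup_fits_window(40, 400, 8))                  # (False, ~27.8)
print(backup_fits_window(40, 400, 8, reduction=10.0))  # (True, ~2.8)
```

The same arithmetic cuts the other way, too: if the data never needed to be backed up at full fidelity in the first place, no reduction ratio will make it free.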

Solid storage management and monitoring of storage utilization are also critical to keeping costs in check, because what you can’t see you can’t fix. This includes visibility into storage allocation and reclaiming unused storage as part of a proactive capacity planning approach, one that anticipates storage and performance requirements before they become issues requiring hasty and expensive storage acquisitions.
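Even a basic script can provide the visibility described above. Here is a minimal sketch using only Python’s standard library; the mount points and high/low watermarks are hypothetical and would be tuned to your environment, and a real deployment would lean on your storage vendor’s management APIs or an SRM tool rather than a cron job.

```python
import shutil

# Hypothetical mount points and thresholds; adjust to your environment.
VOLUMES = ["/data", "/home", "/var"]
HIGH_WATER = 0.85  # flag for proactive capacity planning
LOW_WATER = 0.20   # candidate for reclamation or consolidation

for vol in VOLUMES:
    usage = shutil.disk_usage(vol)  # returns (total, used, free) in bytes
    pct = usage.used / usage.total
    if pct >= HIGH_WATER:
        print(f"{vol}: {pct:.0%} used - plan additional capacity now")
    elif pct <= LOW_WATER:
        print(f"{vol}: {pct:.0%} used - candidate to reclaim or consolidate")
```

The point is not the tooling but the habit: utilization reviewed on a schedule, with thresholds that trigger action before a crisis forces an unplanned purchase.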

You will not be fighting a losing battle to contain costs and maximize the value of your storage investments if you take a deliberate, ongoing and proactive approach to your storage and data management requirements. In the end, satisfaction comes from knowing you’re leading rather than following this unending trend of information growth.

