Storing all your data in a public cloud is a mistake
The monthly cost of storage can be far from affordable
One of the tenets of public cloud storage has been how cheap it is to use. After all, a few pennies per gigabyte per month sounds affordable, so why not make use of it?
True, the initial cost of storing data with public cloud providers like AWS, Google or Microsoft is low, and it is very easy to get started. That said, few organizations consider the long-term implications of keeping significant amounts of their data in a public cloud. So what is there to worry about?
Using the public cloud for data storage is like having a basement you never have to clean out. You can just keep putting more data in the public cloud, and it never fills up. What started as a few terabytes stored in the public cloud can grow to tens and then hundreds of terabytes over a few years. Over time, that pennies-per-gigabyte monthly charge for storing your data in the public cloud becomes a significant monthly expense.
The reality is you will have this monthly expense for as long as you park your data in a public cloud. How can you prevent this from happening?
Knowing what data to keep in a public cloud and what data to store in your own private cloud can help you avoid accumulating the wrong types of data in the public cloud.
What type of data is best stored in a public cloud?
Hot or transactional data is the type of data you want to keep in the public cloud, assuming your applications are also running there. Data that is warm, cold or archival is the type of data you want to keep on premises in a private storage cloud.
Why? Your hot or transactional data is what enables your organization or business to function, and it needs to be stored where your applications are running. Your warm, cold or archive data can be stored in a more versatile and cost-efficient manner using a private storage cloud.
Doesn’t this run counter to what everyone thinks about storing their warm, cold and archive data in the public cloud? Yes it does. If you adopt the conventional wisdom and use the public cloud for all your data storage, by the time you have several hundred terabytes of data in the public cloud, you will understand the flaw in this approach. You will have placed a large amount of static data in a public cloud built for running applications that use your hot or transactional data.
Public cloud storage providers know that data storage is “sticky” and you are not going to touch your warm, cold or archive data very often. So they are happy to keep charging you for every month you keep it in their storage cloud. And when you need to touch your warm, cold or archive data, you will incur additional charges depending on how much data you touch during the month.
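To make those retrieval charges concrete, here is a quick back-of-the-envelope sketch in Python. The $0.09-per-gigabyte egress rate is an illustrative assumption for this sketch, not a quoted price; actual rates vary by provider, region and storage tier.

```python
# Back-of-the-envelope egress cost for pulling archive data back out of a
# public cloud. ASSUMPTION: $0.09 per GB egress -- an illustrative figure
# for this sketch, not an actual published rate.
EGRESS_RATE_PER_GB = 0.09

def egress_cost(terabytes_retrieved: float) -> float:
    """Return the one-time charge for retrieving the given amount of data."""
    gigabytes = terabytes_retrieved * 1000  # decimal TB -> GB, as providers bill
    return gigabytes * EGRESS_RATE_PER_GB

# Touching just 10 TB of "cold" data in a month:
print(f"${egress_cost(10):,.2f}")  # $900.00
```

Even a modest restore of cold data adds a noticeable line item on top of the recurring storage charge, which is the point of the "sticky" business model.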
The solution is to build your own private storage cloud on premises to avoid having your warm, cold and archive data held hostage in the public cloud. If you haven’t already gone down the road to using public cloud storage for all of your data, this should give you something to think about. However, if you have a non-trivial amount of data stored in a public cloud, you should make plans to get it back and keep it in your own private storage cloud. You will have to pay a price to do it, but you will avoid a monthly expense that never ends.
Let’s ballpark the cost of storing 300 terabytes in the public cloud compared to storing 300 terabytes in a private cloud over five years.
Using AWS S3 (Simple Storage Service) standard storage with a reasonable amount of activity, you could expect to be charged $600,000 in operating expenses over the course of five years, or $10,000 per month. Using a private AWS S3-compliant storage cloud, you could expect capital and operating expenses of $270,000 over the course of five years, or $4,500 per month. Which check would you prefer to write each month?
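The five-year comparison above can be reproduced with simple arithmetic. The monthly figures below are the article's own ballpark estimates; treat this as a sketch, not a pricing calculator.

```python
# Five-year cost comparison for storing 300 TB, using the article's
# ballpark monthly figures (not quoted prices).
MONTHS = 5 * 12  # five-year horizon

public_monthly = 10_000   # ballpark: AWS S3 standard, 300 TB with activity
private_monthly = 4_500   # ballpark: private S3-compatible cloud (capex + opex)

public_total = public_monthly * MONTHS    # 600,000
private_total = private_monthly * MONTHS  # 270,000
savings = public_total - private_total    # 330,000

print(f"Public cloud, 5 years:  ${public_total:,}")
print(f"Private cloud, 5 years: ${private_total:,}")
print(f"Difference:             ${savings:,}")
```

On these assumptions, the private cloud comes out roughly $330,000 ahead over five years, and the gap widens the longer the data sits.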
OK, a private storage cloud looks good financially, but won’t building a private storage cloud be too complicated for most organizations? Not necessarily, because you can start with a handful of commodity storage servers plus software, and scale your storage as you need the capacity. You can also make use of storage appliances, which can be deployed to quickly build your private storage cloud.
The name for this storage architecture is Software-Defined Storage (SDS). The benefits of SDS include:
• No need to buy storage capacity that you won’t be using for years
• No vendor “end-of-life” obsolescence forcing you to engage in time-consuming data migrations every three to five years
• No proprietary firmware or hardware needed in the storage servers
• No need to buy all your storage servers from the same vendor, since SDS is hardware-agnostic
• No need for each storage server to have the same amount of storage capacity
The benefits of having your own private storage cloud are real, and it can mean substantial savings on data storage costs in your organization or business.
Tim Wessels, founder of West Rindge-based MonadCloud, can be reached at 603-899-5530 or firstname.lastname@example.org.