Vendors need to be cognizant of the economics around using their tools. This is not the typical total cost of ownership discussion. Instead, it is a discussion of the external costs of using products designed to interact with any number of clouds. Using a cloud such as Amazon or Azure can be cost prohibitive or quite cost effective, depending entirely on how you use it. Keeping a handle on costs is currently a time-consuming job, prone to inaccuracies as new vendor products are added. So, if you have a product that interacts with a cloud, please add cost analytics.
There is a question I ask nearly everyone who places data into the cloud, whether they work in data protection, desktop as a service, or software-defined storage: if data ends up somewhere in a cloud, is cost analytics performed? I am looking not just for today's costs but also for future potential costs. Consider data protection. Many packages will place data within Amazon S3. Placing data in Amazon S3 is fairly inexpensive; however, pulling data back out of Amazon S3 can cost three or four times as much as putting it in. If there is a lot of data, that adds up over time.
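The kind of arithmetic involved can be sketched in a few lines. The rates below are illustrative assumptions, not current AWS prices (real S3 pricing varies by region, storage class, and request type), but they show how storage cost and retrieval cost diverge:

```python
# Hypothetical per-GB rates -- assumptions for illustration, not a price list.
STORAGE_PER_GB_MONTH = 0.023   # assumed standard-storage rate, USD
EGRESS_PER_GB = 0.09           # assumed data-transfer-out rate, USD

def monthly_storage_cost(gb: float) -> float:
    """Cost of simply keeping the data in the cloud for one month."""
    return gb * STORAGE_PER_GB_MONTH

def retrieval_cost(gb: float) -> float:
    """Cost of pulling the same data back out (egress)."""
    return gb * EGRESS_PER_GB

data_gb = 50_000  # 50 TB of backup data, a plausible protection footprint
print(f"store for a month: ${monthly_storage_cost(data_gb):,.2f}")
print(f"pull it all back:  ${retrieval_cost(data_gb):,.2f}")
```

Under these assumed rates, a full pull-back costs roughly four times a month of storage, which is the gap the question about cost analytics is meant to surface.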
In essence, I need to know at any moment not just the cost of using the cloud to house my data, but also the ever-changing value of the cost of pulling my data back from the cloud or even copying that data to a different zone of the cloud. The potential costs of gaining access to the data are very important. They become heartbreaking during a disaster, when the last thing you want to worry about is costs. Yet, with the cloud, you do. Knowing ahead of time that during a disaster you will need a cool million dollars extra means you have to budget for that disaster—and to keep money around as a form of insurance!
This is the problem with the cloud: if you cannot pay, you cannot play. Your budgets need to account for disasters in the case of data protection, and for movement of data into and out of the cloud in the case of other tools. These costs add up. You may say there are already a number of tools that do costing for your cloud, and indeed there are. Yet they do not know how you use the data, so they cannot predict what will happen when you have an emergency or need to add even more data.
You can use those cloud cost analytics engines, such as Cloudability, CloudHealth, and others. However, they are generalized rather than specific to any one application. You can run what-if scenarios with them, but the answers are not definitive. Each individual tool, by contrast, can be definitive: it knows the answers to certain questions. It can even tell you exactly what you need to pull from the cloud to restore your entire environment. For data protection, that will not be everything, just the last bit added. If you had a million objects in S3, you might need to retrieve only a thousand of them. That would be a huge cost savings; the generalized tools would not know those details.
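The savings can be made concrete. This is a minimal sketch using assumed rates and an assumed average object size (all three constants are hypothetical, chosen only to show the shape of the comparison between a full restore and an incremental one):

```python
# Hypothetical rates and sizes -- illustrative assumptions, not real pricing.
EGRESS_PER_GB = 0.09     # assumed data-transfer-out rate, USD
GET_PER_1000 = 0.0004    # assumed GET-request rate per 1,000 objects, USD
AVG_OBJECT_GB = 0.05     # assumed average object size (50 MB)

def restore_cost(num_objects: int) -> float:
    """Estimated cost to pull num_objects back out: egress plus requests."""
    egress = num_objects * AVG_OBJECT_GB * EGRESS_PER_GB
    requests = (num_objects / 1000) * GET_PER_1000
    return egress + requests

full = restore_cost(1_000_000)      # pull every object back
incremental = restore_cost(1_000)   # only the most recently added objects
print(f"full restore:        ${full:,.2f}")
print(f"incremental restore: ${incremental:,.2f}")
```

A generalized costing engine sees only the million objects in the bucket; only the backup tool itself knows that the thousand-object restore is sufficient, which is why the analytics belong inside the tool.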
We need cost analytics baked into many tools. Those analytics could be achieved using APIs to the most popular costing tools already in use, or built directly into the tool in question. It is the unexpected costs that we need to plan for, budget for, and otherwise understand.
In the data center, we have a very good understanding of costs. In the cloud, those costs are only partially understood: we know how much we are spending, but not necessarily how much we could spend. We need better accounting within the cloud, so our tools also need to give us better accounting within the cloud. Those cost analytics help us understand the economics of using the cloud and moving to the cloud, as well as when not to put something in the cloud at all. It becomes part of our planning.
Cloud expenses are under OpEx these days. Perhaps it is time to move them to CapEx instead? Do your tools give you some form of cloud or multicloud cost analytics? What would you like to see?