During a briefing on Quest’s new data protection announcements, I started to think about the future of data protection. Quest recently announced that NetVault will now work with ExaGrid devices and that there is now a Capacity Edition targeting SMBs and SMEs. These changes add more capabilities to an existing product suite. While these announcements do not necessarily merge with virtualization backup, the combination of Quest’s tools and partnerships forms an impressive view of the future with respect to Data Protection. But is this future here now, or even achievable?
To get more information on the Quest announcements, please visit Quest Software Unveils NetVault Backup Capacity Edition for SMBs and Quest and ExaGrid Team to Simplify Virtual Data Protection. These announcements, plus discussions with Quest, brought to life a new vision for the future of data protection with respect to the cloud.
Emerging Data Protection Strategies
- Application Aware Backup and Restoration, which was discussed in our article entitled Application Aware Virtualization Backup.
- Cloud Aware Backup and Restoration is where we treat a Cloud Tenant as if it were the Application. This implies we need to be able to separate one tenant’s configuration and application data within the cloud management stack from another tenant’s. We want the glue that binds the tenant together as well as the true tenant data (see the sketch after this list).
- Multi-Tenant Backup and Restore is important as we move to the cloud. Backup and restoration tools must be driven by the tenant, not require a phone call to the cloud provider or even the local administrator to force a backup; they should be part of any cloud portal.
- Backup to Anywhere becomes more important within the cloud, as I may need to back up from one cloud to another, to a storage cloud provider, back to my existing enterprise, or any combination of these.
- Restore to and From Anywhere becomes important within the cloud, as I may need to restore across clouds of like type but also across types, from cloud-based storage, from tape (or virtual tape devices), from the cloud back to my enterprise, or a combination of these.
- Automatic Testing of Protected Data becomes more important as we move toward the hybrid cloud, as we need to know that, in a disaster, our data can be restored quickly and easily.
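To make the tenant-separation and tenant-driven ideas above more concrete, here is a minimal sketch in Python of how a multi-tenant backup catalog might keep one tenant’s configuration (“glue”) and application data apart from another’s. Every name here (TenantBackupSet, BackupCatalog, and so on) is hypothetical and not taken from any shipping product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TenantBackupSet:
    """Everything needed to rebuild one tenant: the 'glue' plus the data."""
    tenant_id: str
    config_blobs: dict = field(default_factory=dict)   # portal/network/tenant config ("glue")
    app_data_refs: list = field(default_factory=list)  # pointers to workload backup images
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class BackupCatalog:
    """Keeps each tenant's backup sets strictly separated by tenant_id."""
    def __init__(self):
        self._sets = {}

    def record(self, backup: TenantBackupSet) -> None:
        self._sets.setdefault(backup.tenant_id, []).append(backup)

    def list_for(self, tenant_id: str) -> list:
        # A tenant can only ever see its own sets -- no cross-tenant leakage.
        return list(self._sets.get(tenant_id, []))

# A tenant triggers and inspects its own backups from the portal -- no phone call required.
catalog = BackupCatalog()
catalog.record(TenantBackupSet("tenant-a", {"vapp": "...", "firewall": "..."}, ["img-001"]))
print(len(catalog.list_for("tenant-a")))  # 1
print(len(catalog.list_for("tenant-b")))  # 0 -- separation preserved
```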
How to Future Proof Data Protection?
How does one future proof data protection today? By applying good architecture to your data protection while living within the bounds of existing software. At the moment, there are no true multi-tenant backup and restore tools, or even ones that are truly application aware, but we are heading in that direction. This implies we need a good architecture. What does that architecture entail?
Simply put: the use of in-tenant backup solutions, with proxies to bridge the data between multiple cloud entities.
However great this sounds, it is unfeasible, as the performance and the backup and restoration times required by cloud tenants require backups to be performed at a much lower level within a cloud, where the backup and restoration tools can take advantage of the underlying hypervisor and storage environments. To back up cloud tenants we should NOT revert to agent-full backup tools, but insist on agent-less backups down at the hypervisor and storage layers, BUT also insist that any new tools work in a multi-tenant environment.
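As a rough illustration of what “agent-less at the hypervisor layer, but multi-tenant aware” could look like, the sketch below snapshots a tenant’s virtual machines through a generic hypervisor interface rather than through agents inside each guest. The HypervisorAPI calls are placeholders invented for this sketch, not a real vendor SDK.

```python
# Hypothetical, vendor-neutral sketch: agent-less backup at the hypervisor
# layer, constrained to a single tenant's VMs. No real SDK is used here.

class HypervisorAPI:
    """Placeholder for a real hypervisor SDK (snapshot/export calls)."""
    def list_vms(self, tag: str) -> list: ...
    def snapshot(self, vm: str) -> str: ...
    def export_snapshot(self, snap_id: str, target_uri: str) -> None: ...
    def delete_snapshot(self, snap_id: str) -> None: ...

def backup_tenant(hv: HypervisorAPI, tenant_id: str, target_uri: str) -> None:
    # Tenant scoping happens below the guest OS: we select VMs by tenant tag,
    # so no agent runs inside any workload.
    for vm in hv.list_vms(tag=f"tenant:{tenant_id}"):
        snap = hv.snapshot(vm)               # crash-consistent point in time
        try:
            hv.export_snapshot(snap, f"{target_uri}/{tenant_id}/{vm}")
        finally:
            hv.delete_snapshot(snap)         # never leave snapshots behind
```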
So ‘in-tenant’ backup solutions could be the following:
- Cloud Portal Controls and Reporting on backup that is multi-tenant
- Use of Virtualization Backup tools such as Quest’s vRangerPro, Veeam Backup, PhD Virtual Backup, etc.
- A way to backup the ‘connections’ between applications as well as tenant configurations.
- Write data locally, to tape, or bridge to the cloud:
- Use of a storage cloud gateway such as Riverbed Whitewater, StorSimple, Nasuni, or TwinStrata.
- Use of Cloud Gateways to bridge clouds such as tools from AFORE and VMware.
- A tool that automatically (with a little scripting help from the tenant) tests my backups, in all locations they are stored, to ensure they can be read and the applications actually work prior to need (a sketch follows this list).
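The automated-testing item above could start as little more than a tenant-supplied script run against each restore target. The sketch below shows the shape of it; the restore_to_sandbox() helper is an assumption that would have to be provided by the backup tool or cloud provider, and the health check itself comes from the tenant.

```python
import subprocess

def restore_to_sandbox(backup_ref: str, location: str) -> str:
    """Assumed helper: restores the backup into an isolated sandbox and
    returns an address the tenant's health check can probe."""
    raise NotImplementedError("provided by the backup tool / cloud provider")

def verify_backup(backup_ref: str, locations: list, health_check_cmd: str) -> dict:
    """Restore the same backup in every location it is stored and run the
    tenant's own health check against each restored copy."""
    results = {}
    for loc in locations:
        endpoint = restore_to_sandbox(backup_ref, loc)
        # The tenant supplies the check -- only they know what 'working' means.
        proc = subprocess.run([health_check_cmd, endpoint], capture_output=True)
        results[loc] = (proc.returncode == 0)
    return results
```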
So ‘in-tenant’ restore solutions could include the following:
- Local Restore controls and reporting (as well as cloud-based multi-tenant controls). Why local? For Disaster Recovery, where all I have is a laptop and the software to do the restore (see the sketch after this list).
- A way to pull data from any cloud to another or to bridge back to your local datacenter
- A simple way to restore the data
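For the “laptop and software” disaster-recovery case, the restore side could be as small as a script that pulls a named backup down from cloud object storage to local disk. The bucket layout and the ObjectStoreClient interface below are assumptions for illustration, not any specific provider’s API.

```python
import os

class ObjectStoreClient:
    """Placeholder for any S3-style object storage client."""
    def list_objects(self, bucket: str, prefix: str) -> list: ...
    def get_object(self, bucket: str, key: str, dest_path: str) -> None: ...

def local_restore(client: ObjectStoreClient, bucket: str, tenant_id: str,
                  backup_id: str, dest_dir: str) -> None:
    """Pull every object of one tenant backup down to a local machine --
    all a DR operator needs is this script and network access."""
    os.makedirs(dest_dir, exist_ok=True)
    prefix = f"{tenant_id}/{backup_id}/"
    for key in client.list_objects(bucket, prefix):
        local_path = os.path.join(dest_dir, key[len(prefix):])
        os.makedirs(os.path.dirname(local_path) or dest_dir, exist_ok=True)
        client.get_object(bucket, key, local_path)
```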
It all sounds simple, but we are missing a few things:
- Integration into the cloud portals that exist; at the moment, unless the cloud provider has done a lot of work making their own, portals such as VMware vCloud Director have no backup or restore capabilities.
- Integration with the clouds that exist to pull not only workload backups, but also backups of connection and tenant configuration data.
- Integration with the existing clouds to provide per tenant test sandboxes that can be used to test restorations and functionality of a backup.
- Integration with the existing clouds to provide backup targets.
- Creation of tools that can run locally to pull data down from any cloud to a local data center, which could be one way of getting your backup data out of a cloud on a regular basis (see the sketch after this list).
- Move away from the Central Backup Server architecture to a distributed architecture that spans clouds.
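The regular “pull my backups out of the cloud” tool from the list above could begin life as a scheduled incremental sync. In the sketch below, the CloudSource interface and the manifest format are invented for illustration; any real implementation would sit on top of whatever export API a given cloud exposes.

```python
import hashlib
import json
import os

class CloudSource:
    """Placeholder interface over any cloud that exposes backup images."""
    def list_backups(self, tenant_id: str) -> list: ...  # [{"id": ..., "sha256": ...}]
    def fetch(self, backup_id: str, dest_path: str) -> None: ...

def sync_to_datacenter(cloud: CloudSource, tenant_id: str, local_dir: str) -> None:
    """Incrementally mirror a tenant's cloud backups to local storage,
    skipping anything we already hold an identical copy of."""
    os.makedirs(local_dir, exist_ok=True)
    manifest_path = os.path.join(local_dir, "manifest.json")
    have = {}
    if os.path.exists(manifest_path):
        with open(manifest_path) as fh:
            have = json.load(fh)
    for backup in cloud.list_backups(tenant_id):
        if have.get(backup["id"]) == backup["sha256"]:
            continue  # already mirrored and unchanged
        dest = os.path.join(local_dir, backup["id"])
        cloud.fetch(backup["id"], dest)
        with open(dest, "rb") as fh:
            digest = hashlib.sha256(fh.read()).hexdigest()
        if digest != backup["sha256"]:
            raise IOError(f"corrupt download for {backup['id']} -- retry")
        have[backup["id"]] = digest
    with open(manifest_path, "w") as fh:
        json.dump(have, fh)
```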
So why did Quest’s tools make me think of future proofing Data Protection? Because their suite has nearly everything needed; it just requires some more integration:
- Quest vRangerPro can back up the virtualized data not only to storage cloud providers but also to NetVault.
- Quest NetVault can then run the data to tape (for storage at Iron Mountain), to disk, and to clouds as necessary.
- Quest LiteSpeed could be used to back up the per-tenant connection and configuration data within the cloud.
A combination of these tools, or something like them, would be a clear win for the future of cloud-based data protection. All we need now is the glue to pull them all together and future proof data protection. Even with this glue, the key component will be the capability to restore from anywhere to anywhere, even from my local device, whatever that may be.
Are we on our way? Of course.
-NetApp’er-
Great read! I wrote something similar (but different) back in September.
Reading this, I’m curious if your take on the future of Cloud, and the success/failure of Cloud in general, is going to be based on the success/failure of the ISVs to create viable backup solutions?
Wouldn’t it be smarter, as I stated in my post, to innovate ways to avoid even having to back up at all?
-Nick
Hello Nick,
I think the ultimate success of the Cloud will depend on what we can get out of it regarding data protection. If Clouds hold my data hostage for any reason, then that is a problem, and one that is going through various courts now (at RSA I heard of an EU court case specifically about this; unfortunately, I have not heard the result yet). So some form of Data Protection will always be needed.
In addition, due to compliance and current corporate data protection policies, I think there will always need to be a backup and disaster recovery mechanism available, at least for the foreseeable future. If you have only a local private ‘cloud’, what happens when that cloud has a massive disaster? Not everyone will put their clouds into the public spaces. Given this, is there a way to architect a cloud so that you do not need Backup, Restore, or Replication? Perhaps, but whatever you do will end up being a form of Data Protection.
The vision I see is the ability to move or place my data anywhere and still use it, regardless of whether it is a backup or a replica. Eventually, we will be able to cross cloud boundaries… A lofty goal now, but one that could be crucial in the future.
Best regards,
Edward L. Haletky