Workload Portability and Data Archiving – New World Meets Old World
By Andrew Martin, Director of Asia Pacific & Japan
When I joined Zerto, I knew that first and foremost I was joining a Business Continuity and Disaster Recovery (BC/DR) company. Albeit with an incredibly disruptive approach, the focus was, and still is, BC/DR.
However, the world around the data center is changing rapidly, and even though BC/DR is the major driver for the Zerto technology, there is another trend that is becoming very real. IDC refers to this as workload migration; I prefer to call it enterprise workload portability.
When you think about what a BC/DR product does, it moves an application or workload, along with all its associated data, from point A to point B. Then, when the need arises, such as during an unplanned outage, the application can be “failed over”, essentially meaning the workload will start to run from the point B location.
The trend in the data center is towards software-defined infrastructure and a hybrid cloud approach. This can mean many things, but one promise of the software-defined era is portability of enterprise workloads. As the physical infrastructure becomes abstracted and eventually irrelevant to the IT department, what will become absolutely critical is the “workload”. The ability to move the workload at will is also going to become a must.
IT departments are already demanding the capability to move workloads across data centers, private clouds and public clouds. I can even imagine a future where IT departments may want the ability to move workloads between public cloud providers in real time based on minute-by-minute pricing analysis. We are not there yet, but it’s moving that way.
Zerto specialise in replicating virtualised workloads. The “spin-off” of this technology and functionality is that we are already well down the road to granting IT departments this enterprise workload portability. Many of the people who adopt our technology do so in the knowledge that, in addition to DR, we provide flexibility and portability.
In essence, the changing nature of the data center means that our technology can be used to enhance and complement the shift to software-defined and cloud-based IT.
So where is the link with data archive software?
The concept of archive software is old; it goes back to mainframe times, and over the years it has always struggled against the cost of primary storage to justify itself. The biggest block to investing in archiving software has always been that, as the price of disk comes down, IT managers tend to throw more disk at expanding data rather than use archiving software to remove old or stale data from primary storage. Archive can be difficult to cost-justify when sold purely for storage efficiency.
So why is archive suddenly more relevant today and why would an old world technology like archive matter to a new world company like Zerto?
The answer comes in the word “portability”. When I refer to enterprise workload portability, the biggest problem in moving a workload around is the amount of data that may be attached to that workload. We run into bandwidth and migration-time issues, meaning the smooth movement of a workload from A to B and on to C can be slowed down by the amount of data that needs to shift with it.
Archive serves a whole new purpose in a world where people want the flexibility of enterprise workload mobility. Moving stale, static data that is rarely if ever accessed out of an application’s data store, and replacing it with pointers back to an archive, significantly reduces the size of the live workload itself. In turn, portability, especially onto new platforms, becomes quicker and smoother.
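To make the idea concrete, here is a minimal sketch of the stub-and-pointer pattern described above. It is a hypothetical illustration, not Zerto's or any vendor's implementation: the function name `archive_stale`, the `.stub` pointer-file convention, and the 90-day staleness threshold are all assumptions made for this example.

```python
import shutil
import time
from pathlib import Path

STALE_AFTER = 90 * 24 * 3600  # assumed threshold: untouched for 90 days = stale


def archive_stale(live_dir: Path, archive_dir: Path, now=None) -> int:
    """Move stale files out of the live store, leaving pointer stubs behind.

    Each stub is a tiny text file containing the archive location, so the
    live data set shrinks while the cold data remains retrievable.
    Returns the number of files archived.
    """
    now = time.time() if now is None else now
    archive_dir.mkdir(parents=True, exist_ok=True)
    moved = 0
    for f in list(live_dir.iterdir()):
        if not f.is_file() or f.suffix == ".stub":
            continue  # skip directories and existing pointer stubs
        if now - f.stat().st_mtime > STALE_AFTER:
            dest = archive_dir / f.name
            shutil.move(str(f), str(dest))  # shift the cold data out
            # leave a pointer back to the archived copy
            f.with_suffix(f.suffix + ".stub").write_text(str(dest))
            moved += 1
    return moved
```

The point of the sketch is the trade-off in the text: after a run, only the hot data plus a handful of byte-sized stubs need to travel when the workload moves, while the archive can stay put or be transferred separately.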
We are not there yet, but we are seeing signs that archive is going to become part of the “new world”. In the last few months I have seen companies like EMC and Barracuda invest in acquisitions of archive companies, I believe for this very reason.