Proprietary Cloud Archives Are Like Roach Motels
You Can Check In, But Can’t Check Out
Your company owns a great deal of data. Not just “records” but huge amounts of unstructured content, proprietary data such as intellectual property, and sensitive customer and employee personal information (PI).
Over the last several years there has been a sizable movement to cloud-based computing, storage, and information management. Yet some corporate CSOs/CTOs/CIOs have been leery of storing proprietary or sensitive data in a 3rd-party, proprietary cloud due to fear of vendor lock-in and perceptions of low security (e.g., hackers and cloud vendor access to your data). With the advent of large public cloud platforms with state-of-the-art security, C-level management has another chance to consider cloud storage for their sensitive data.
Cloud Archiving - Full speed ahead
Many are now changing their tune about uploading and storing proprietary and sensitive data in the cloud based on the reputation and success of big public cloud vendors like Microsoft (Azure), Amazon (AWS), and Google (Cloud). These public cloud vendors have put sophisticated, worldwide infrastructures in place that enable companies to designate the geographies in which their data is stored, decide whether to include geo-replication for fail-over, and take advantage of next-gen security, including using your own encryption keys. In fact, healthcare, one of the most conservative industries and one that pushed back on cloud storage of sensitive patient data for many years, has accepted the cloud as a cost-effective and highly secure archiving repository.
Other 3rd-party cloud offerings provide capabilities beyond storage, such as information management, regulatory compliance, eDiscovery functionality, and a level of data analytics that can add value to stored data. However, most 3rd-party cloud archiving vendors use a SaaS (Software as a Service) model, which forces companies to adopt a standardized application that cannot be customized to the company's specific needs. In reality, the SaaS model is a one-size-fits-all solution, which means it's rarely a perfect fit for any company.
Another issue with proprietary cloud platforms is infrastructure. Third-party cloud archiving platforms usually have a limited geographic footprint, with a single cloud repository in one location and perhaps a second repository acting as a disaster recovery or fail-over site. For multinationals with offices in countries whose regulations require that all data be retained within the country, most 3rd-party SaaS cloud archiving solutions will not meet the requirement.
Not all 3rd-party cloud archives are the same
The bigger issue for companies adopting SaaS cloud archiving is that of vendor lock-in. These platforms make it easy to move your data into their platform but make it extremely difficult to move it back out. It’s like the old Black Flag Roach Motel slogan: "They check in, but they can't check out." In many cases, 3rd-party cloud archiving solutions will throw up roadblocks to customers wishing to move data out of their proprietary cloud archive.
Many 3rd-party archives will convert your data into a proprietary format under the guise of saving storage space, but will then charge huge amounts to reconvert it back into its original format so you can actually use it after the move. In certain cases, customers have told me that the "re-conversion" charge is so high it can equal the annual contract fee. In reality, vendors do this so that moving away from their platform becomes cost-prohibitive. In other cases, cloud archiving vendors will throttle your data export speeds so that it takes months or years to completely move your data out, again making it impractical to leave.
Imagine your company has 1 PB of archived data in a proprietary cloud archive, you've become dissatisfied with the service, and you want to move to another cloud provider. Further, imagine the current cloud vendor limits your exports to 1 TB per day. At that rate, completely moving away from the current cloud archive would take 1,000 days, nearly three years, all the while you are still paying for the cloud archive you no longer want.
Figure 1: The time required to move archived data out of a 3rd-party archive with various data throttling speeds
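The math behind this kind of estimate is simple to reproduce. The short sketch below uses hypothetical archive sizes and throttle rates (decimal units, where 1 PB = 1,000 TB) to show how export caps translate into migration timelines:

```python
# Estimate how long a throttled data export takes.
# Assumes decimal units (1 PB = 1,000 TB) and a constant daily
# export cap; real-world throughput would of course vary.

ARCHIVE_SIZE_TB = 1_000  # a 1 PB archive

for throttle_tb_per_day in (1, 5, 10, 25):
    days = ARCHIVE_SIZE_TB / throttle_tb_per_day
    print(f"{throttle_tb_per_day:>2} TB/day -> {days:,.0f} days "
          f"(~{days / 365:.1f} years)")
```

At 1 TB/day the move takes 1,000 days (about 2.7 years); even at 25 TB/day it still takes 40 days of continuous export, during which you are paying for both platforms.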
A major advantage of the large public clouds is that the customer completely controls and manages the data in their own cloud tenancy. For example, Microsoft's Azure platform is designed to allow customized applications to run within Azure, so the customer is not stuck with a single application offering that does not fully meet their needs, the one-size-fits-all scenario.
This capability is offered through the Azure PaaS (Platform as a Service) model, in which customers provide and manage their own native Azure applications, such as an information management or archiving application.
For the security-conscious CTO/CIO/CSO, a public cloud PaaS solution allows them to assign and manage their own encryption keys while also relying on the built-in, next-gen security infrastructure of the public cloud.
About Bill Tolson
Bill is the Vice President of Global Compliance for Archive360. Bill brings more than 29 years of experience with multinational corporations and technology start-ups, including 19-plus years in the archiving, information governance, and eDiscovery markets. Bill is a frequent speaker at legal and information governance industry events and has authored numerous eBooks, articles and blogs.