Local appliance addresses the biggest objection to cloud storage: loss of control
i365 has added an on-premises backup and recovery appliance to its EVault online backup and recovery services.
One of the biggest inhibitors to Cloud Storage for backups has been that businesses don’t want to lose control of their data. Drunken Data mounted the soapbox on this topic Monday. No matter what assurances a Cloud service makes, it’s hard not to feel safer with data on-site.
The poster child for this reluctance is Amazon S3, which promises 99.95% uptime yet has a history of falling short of that level.
The EVault Express Recovery Appliance stages backups locally, then transfers them to the Cloud over time. The incremental cost of the appliance is small compared with conventional 100% on-site backup. Near-term recovery is quicker, but perhaps more important is the emotional benefit of keeping recent backups within company walls (locally or at a remote facility).
This pragmatic tweak to the Cloud Storage model could open up the business market for SaaS in a big way. What do you think?
99% of data recovered from a drive in the tragic crash
It’s amazing and somehow very sad that engineers were able to recover the data from a Seagate disk drive found in the wreckage of the shuttle Columbia.
When things get tough at work, a colleague of mine is fond of saying “they’re just disk drives.” This kind of brings that home.
More on the story from Engadget.
Update: more on this story from Blocks and Files. Dave Reinsel at IDC has written a detailed report as well, accessible to subscribers for a fee. Intriguing perspectives on the persistence of data on disk drives, for good and ill.
Persistent data can overwhelm archives if not treated differently
Heidi Biggar hit a nerve with her stories on a growing problem in business backup and recovery: backing up the same information over and over. Backup technology and practices have improved greatly over the past few years, while data volumes have continued to expand every year. So more companies are backing up, and they have more to back up.
Problem: too much of the same data on all of those backup drives:
ESG estimates that 60% to 80% or more of the data on primary storage systems today is static (or persistent). In other words, this data has not been accessed at all in the 90 days or more since its creation.
Is persistent data a large-scale problem? Heidi's solution is basically tiered storage, but I'm not convinced that is being done widely today. I'd like to hear from readers who are successfully (or unsuccessfully) addressing this issue.