Persistent data can overwhelm archives if not treated differently
Heidi Biggar hit a nerve with her stories on a growing problem in backup and recovery for businesses: backing up the same information over and over. Backup technology and practices have improved greatly over the past few years, while data volumes have continued to grow every year. So more companies are backing up, and they have more to back up.
The problem: too much of the same data sitting on all of those backup drives. As Heidi notes:
ESG estimates that 60% to 80% or more of the data on primary storage systems today is static (or persistent). In other words, this data has not been accessed at all in the 90 or more days since it was created.
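To make that definition concrete, here is a minimal sketch of how an admin might measure it on their own storage: walk a directory tree and flag files whose last-access time is 90 or more days old. This is illustrative only (the function name and threshold are my own, not ESG's methodology), and note that access times can be unreliable on volumes mounted with `noatime`.

```python
import os
import time

STALE_AFTER_DAYS = 90  # threshold for "persistent" data, per the ESG definition above

def find_persistent_files(root, now=None):
    """Return paths under `root` not accessed in STALE_AFTER_DAYS or more days."""
    now = now if now is not None else time.time()
    cutoff = now - STALE_AFTER_DAYS * 24 * 3600
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                if os.stat(path).st_atime < cutoff:
                    stale.append(path)
            except OSError:
                continue  # file vanished or unreadable; skip it
    return stale
```

Comparing the total size of the flagged files against the volume's total usage gives a rough local version of ESG's 60% to 80% figure.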
Is persistent data a large-scale problem? Heidi's solution is essentially tiered storage, but I'm not convinced that's being done widely today. I'd like to hear from people who are successfully (or unsuccessfully) addressing this issue.