The Storage Admin Backup Struggle: PBs to Back Up and No Time to Do It
March 27, 2022
Topics: Cloud Backup, Data Protection, Backup and Archive | Advanced | 7 minute read
If you’re working as a storage admin, the last few years have seen an explosion of the data estate that has made the job of backing up data much more challenging. If you’re not a storage admin, you may be completely unaware of the pressures now routinely placed on backup systems, some of which can have serious business consequences. Managing the on-prem or cloud backup system for this massive data estate can be a nightmare.
In this blog we’re going to introduce you to Anne, a storage admin at an enterprise-scale financial company, and show you what some of her day-to-day backup responsibilities look like. We’ll also see what Anne can do about these challenges and how those same solutions can help backup operations in your IT department.
Use the links below to jump down to the sections on:
- Storage Admins Are Getting Overburdened with Backup
- What Bothers Anne About Backup
- The Solution Anne Is Looking For
Storage Admins Are Getting Overburdened with Backup
Consider what the average storage admin’s backup workload looks like. Let’s introduce you to Anne. Anne is a storage admin at a large American financial services company, where she is responsible for the backup (and recovery) of 10 PB of business-critical file share data. The organization has been using network-attached storage (NAS) as its storage architecture since NAS was first introduced almost 25 years ago. The NAS backups are streamed to tape media over the company’s IP network using the Network Data Management Protocol (NDMP). Anne can also take advantage of secondary ONTAP systems to store her backups by transferring the data via SnapMirror®.
To meet the organization’s RPO requirements, company policy dictates that Anne carry out one full backup per week. Incremental backups (covering only files that have changed) are performed twice each work day. Anne spends almost the entirety of her workday overseeing these backup operations, testing the backups, and ensuring that recovery procedures meet the company’s RTO requirements. Because so much of her time is invested in these backups, other storage-related tasks that need her attention can wind up neglected.
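To put that schedule in perspective, here’s a rough back-of-the-envelope sketch of Anne’s weekly backup traffic. The figures (full backup size from the article; the daily change rate is an illustrative assumption, not a NetApp number) show how quickly the volume of data in motion adds up:

```python
# Rough estimate of weekly backup traffic under Anne's schedule.
# The 2% daily change rate is an assumption for illustration only;
# real change rates vary widely by workload.

FULL_SIZE_PB = 10.0         # weekly full backup (from the article)
DAILY_CHANGE_RATE = 0.02    # assumed fraction of data changed per work day
INCREMENTALS_PER_WEEK = 10  # two per work day, five work days

# Each incremental captures roughly half a day's worth of changes.
incremental_pb = FULL_SIZE_PB * DAILY_CHANGE_RATE / 2
weekly_total_pb = FULL_SIZE_PB + INCREMENTALS_PER_WEEK * incremental_pb

print(f"Each incremental: ~{incremental_pb:.1f} PB")
print(f"Weekly backup traffic: ~{weekly_total_pb:.1f} PB")
```

Even with a modest change rate, the weekly full backup dominates the traffic: the ten incrementals together move about 1 PB, while the full moves 10 PB all over again.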
What Bothers Anne About Backup
Backup creation is one of Anne’s most crucial tasks. With the growth of the data estate, it’s now making up more of her workload than ever, and it’s becoming harder to accomplish. Here are some of the questions that concern Anne on a daily basis about her backups:
How long is this backup going to take?
The most serious challenge Anne faces using her NDMP-based backup solution to back up a data set that is now in the petabytes is that creating the backup takes an extremely long time. The process can take anywhere from a few days to several weeks.
The reason for longer backup windows can be traced back to a few issues. Network congestion and volumes that continue to grow as backups are taking place can increase the backup time. Also, since NDMP can’t preserve ONTAP storage efficiencies, Anne’s data has to move in its full size. In the end, Anne may have a viable backup, but it’s a process that takes far too long.
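The arithmetic behind those long windows is simple. As a hedged illustration (assuming a 100 Gbps aggregate link at 70% effective utilization; real NDMP throughput is often lower), moving 10 PB at full size takes on the order of two weeks:

```python
# Back-of-the-envelope: transfer time for a full-size 10 PB backup,
# i.e., with no deduplication or compression preserved in transit.
# Link speed and efficiency are illustrative assumptions.

DATA_PB = 10
LINK_GBPS = 100     # assumed aggregate network bandwidth, gigabits/s
EFFICIENCY = 0.7    # assumed usable fraction after protocol overhead

data_bits = DATA_PB * 10**15 * 8
seconds = data_bits / (LINK_GBPS * 10**9 * EFFICIENCY)
days = seconds / 86400
print(f"~{days:.0f} days")  # roughly two weeks
```

Halve the bandwidth, or let the volume keep growing mid-backup, and the window stretches further still, which is exactly the failure mode described next.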
Did we miss the backup window?
Anne’s backup creation process takes so long that sometimes she isn’t able to finish creating one backup before the backup window closes. This means there will be significant differences between her primary data set and its most recent backup.
Missing a backup window means that Anne can’t guarantee the organization’s business-critical file share can be restored if something goes wrong with the primary. The system is not properly protected.
Was my backup successful?
Since Anne is using an NDMP-based file-level backup solution, it can take days to create the up-to-date copy. Once that backup is created, Anne needs to verify that it actually worked. How can she know that the backup is viable for recovery?
Anne may next have to test the backup in a dummy environment to see whether it is actually dependable. This is a step she can’t skip: relying on an untested backup may mean the company’s data is unprotected. And if the test proves the backup isn’t good, Anne can look forward to a few more days dedicated to backup creation.
When is the next maintenance window?
It’s important for Anne to be aware of when her next maintenance window is scheduled. If there’s any way that she can leverage that scheduled maintenance to help close the backup and recovery gaps, she should make use of it.
Of course, in the long run, this isn’t a cost-effective way to make sure backups work properly since it doesn’t scale effectively over time. Continuing this way will require new purchases for storage dedicated solely for backup on a regular basis.
Did the backup cause problems with other workloads?
Anne isn’t just responsible for the backups, but for fixing the problems that the backups themselves create. For instance, two weeks ago Anne created a backup that for some reason affected the production environment. What went wrong? Anne needs to figure this out before the next backup is created to avoid creating even more problems.
How do I integrate a new workload?
Anne’s company is constantly evolving its services, and that means dealing with changing system requirements and different demands for backup. Will any of these new workloads present issues for Anne, both in integrating with the existing environment and when it comes to the backup regimen? Anne also needs to determine if adding anything to the existing system is going to cause issues with future backups.
How long will it take me to restore a single file?
When it comes to restoring single files, Anne’s job can be extremely inefficient. To replace a single file, Anne has to restore the entire volume from the backup, since her NDMP-based solution doesn’t have any single-file restore capability. Full volume restores take a lot of time, time that Anne doesn’t exactly have to spare.
Luckily, there are other tools that Anne can take advantage of to make sure that she has the right answers to all of these questions.
The Solution Anne Is Looking For
Anne has more options today thanks to new technologies that are making it easier, faster, and less expensive to back up her massive data set.
- Cloud-based object storage offers unlimited scalability at the lowest rates
- Incremental forever means you spend less time creating full backups
- Block-level backup makes updating faster and restores more granular
- Retained storage efficiencies keep the backup storage footprint optimized
- Fully integrated with the NetApp ecosystem
- Direct backups without an intermediary save time and costs
- Set-and-forget scheduling helps meet the most demanding recovery objectives
- SaaS or software-only options mean sensitive backups can be stored air gapped, with no network connectivity
There is one solution that bundles all of these beneficial technologies into one package: NetApp Cloud Backup.
Cloud Backup is a backup-as-a-service from NetApp delivered through the convenient Cloud Manager management console. It automatically creates block-level backups that are stored in cost-effective object storage, either in AWS, Azure, or Google Cloud, or on-prem in a StorageGRID appliance.
Since Cloud Backup is designed specifically for ONTAP and Cloud Volumes ONTAP, it works seamlessly with existing ONTAP systems like Anne’s. And as a NetApp-native technology, it ensures that all of Anne’s primary system’s storage efficiencies are preserved in the backup.
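The block-level idea can be sketched in a few lines of Python. This is a toy illustration of the general technique (comparing content hashes over fixed-size blocks and shipping only what changed), not how Cloud Backup is implemented internally, which operates at the ONTAP block layer:

```python
# Toy sketch of block-level incremental backup: split a volume into
# fixed-size blocks and ship only blocks whose content hash changed
# since the last backup. Illustrative only.
import hashlib

BLOCK_SIZE = 4096  # bytes per block

def block_hashes(data: bytes) -> list[str]:
    """Hash each fixed-size block of the volume."""
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def changed_blocks(old: bytes, new: bytes) -> list[int]:
    """Return indices of blocks that differ (or are new) since `old`."""
    old_h, new_h = block_hashes(old), block_hashes(new)
    return [i for i, h in enumerate(new_h)
            if i >= len(old_h) or h != old_h[i]]

# A 4-block volume where only the second block changes between backups.
volume_v1 = b"a" * BLOCK_SIZE * 4
volume_v2 = volume_v1[:BLOCK_SIZE] + b"b" * BLOCK_SIZE + volume_v1[2 * BLOCK_SIZE:]
print(changed_blocks(volume_v1, volume_v2))  # prints [1]: only block 1 changed
```

In this example an incremental update would move one 4 KB block instead of the full 16 KB volume; at petabyte scale, that difference is what keeps backup windows from being missed.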
The benefits Anne gets with Cloud Backup are clear:
- Incremental, block-level backup saves time and makes sure her backup windows are never missed, unlike with NDMP solutions
- Preserve storage efficiencies to make sure her cloud costs are optimized
- Full integration with SnapCenter technology means she can create application-aware backups
- Support for Kubernetes lets Anne back up persistent volumes effectively
- Archival tiering provides cost-effective object storage tiering for infrequently used archive data
- Restore individual files with the indexed catalog feature
- Set-and-forget controls make operation a breeze. See for yourself with the product simulator.
Thanks to Cloud Backup, Anne has a new way to make backup possible at her company—one that’s fast, cost-effective, and scales with the growth of data no matter how large it gets.
To read more about Anne and her journey to find a better way to back up the data set she’s in charge of, download our new guidebook on PB-scale data backup in just hours.
If you’re ready to get started now, sign up for a free trial of Cloud Backup.