The volume of data that must be backed up has always been a problem, even before mobile, the cloud and the Internet of Things. However, the explosive growth of data and the need to constantly back up data have led to major challenges with backup performance and capacity. The nightly backup is no longer adequate, and most organizations are struggling to back up large volumes of data in near real time. Organizations have to do more than make sure data, applications and systems are protected. They have to make sure recovery is fast enough to meet business requirements and minimize disruption.
Backing up virtual machines (VMs) has also been difficult. Typically, organizations would deploy a backup agent in each VM, but this approach is inefficient and can create a management nightmare. Not all VMs are alike, yet host-level backups require each VM to run a supported operating system and to maintain compatibility between the backup application and other applications on the VM. Although newer backup solutions are virtualization-aware, many organizations still struggle to adapt from physical to virtual server backups.
Data backup has become even more complex with the rise of the bring-your-own-device model and the geographically distributed enterprise. Increasingly, valuable organizational data is held on devices outside of IT’s control. That data needs to be protected, but organizations have to worry about bandwidth limitations, data charges, the lack of consistency across devices and the impact on user productivity.
On top of all this, designing and implementing a backup solution using best-of-breed components is a long, complicated process. “Backup: it’s something you have to do, but not something you love to do. Anyone who has dealt with legacy backup applications can tell you, they are not easy to manage,” says Ray Hall, Senior Systems Engineer at Cerium.
Organizations must choose the right technology, install hardware and software, and configure the solution to meet workload requirements. Configuration errors and improper tuning of the backup software can cause backups to fail. Backups must be scheduled properly, and the environment monitored and managed. In many organizations, siloed data protection platforms create management headaches.
Integrated backup appliances were developed specifically to overcome these challenges. Deployment of an integrated, preconfigured appliance is far faster and simpler than traditional solutions. Integrated backup appliances can also be pre-tuned and optimized to perform data protection tasks, and support is simpler because there are fewer moving parts.
Dell EMC introduced its Integrated Data Protection Appliance (IDPA) in 2017, combining its data protection software with Dell EMC PowerEdge servers in a turnkey solution. The Dell EMC IDPA performs up to 10 times faster than a traditional solution built from scratch and delivers 20 percent faster performance than competing solutions. “Dell EMC took its well-known converged approach of engineering the hardware AND the software into an easy-to-upgrade, easy-to-manage and easy-to-deploy appliance. The IDPA pairs the leading purpose-built backup hardware of Data Domain with Avamar backup software in a form factor as small as 2U,” says Ray.
The Dell EMC IDPA overcomes complexity challenges with a single management interface and support for a wide range of physical and virtual workloads, applications and hypervisors. It also includes integrated search and analytics features. A 55:1 deduplication ratio conserves storage capacity, while native cloud tiering enables organizations to extend coverage to the cloud for long-term data retention.
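To put a deduplication ratio in concrete terms, here is a minimal illustrative calculation. The function and the 550 TB workload are hypothetical examples, not Dell EMC figures; real-world ratios vary with data type, change rate and retention policy.

```python
def physical_capacity_needed(logical_tb: float, dedup_ratio: float) -> float:
    """Return the physical storage (in TB) required to hold `logical_tb`
    of cumulative backup data at the given deduplication ratio."""
    return logical_tb / dedup_ratio

# Hypothetical example: 550 TB of cumulative backup data at a 55:1 ratio
# would need only 10 TB of physical capacity.
print(physical_capacity_needed(550, 55))
```

In other words, at 55:1, each terabyte of physical storage can represent roughly 55 terabytes of protected backup data.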
Integrated backup appliances are a viable alternative to legacy backup processes that cannot keep pace with the high data volumes and complexity of modern backup environments. The Dell EMC IDPA can ensure that data from all sources is efficiently backed up and protected without affecting network performance or user productivity.