
Data Protection - An Old Problem Just Gets Bigger

Page 2 of 2

Second, inline deduplication cannot process multiplexed, multistreamed database data efficiently. As a result, it forces data center managers to make difficult trade-offs between backup performance and capacity reduction. In other words, to back up your Oracle, SQL Server, or DB2 databases fast enough to meet your backup windows, you have to turn off deduplication and rely on multistreaming and multiplexing alone.
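Why multiplexing defeats inline deduplication can be sketched with a toy fixed-block example (hypothetical Python, not any vendor's actual algorithm): interleaving blocks from two streams shifts every chunk boundary, so the chunk hashes no longer match those of a sequential copy and nothing deduplicates.

```python
import hashlib

def chunk_hashes(data: bytes, size: int = 8) -> set:
    """Hash fixed-size chunks, as a simple inline deduplicator would."""
    return {hashlib.sha256(data[i:i + size]).hexdigest()
            for i in range(0, len(data), size)}

# Two identical database streams (e.g., the same redo logs backed up twice).
stream_a = b"ORACLE-REDO-LOG-SEGMENT-0001" * 4
stream_b = bytes(stream_a)

# Sequential backup: the second copy deduplicates completely.
assert chunk_hashes(stream_b) == chunk_hashes(stream_a)

# Multiplexed backup: 4-byte blocks from each stream interleave on the way
# to the backup target, shifting every chunk boundary.
interleaved = b"".join(stream_a[i:i + 4] + stream_b[i:i + 4]
                       for i in range(0, len(stream_a), 4))
shared = chunk_hashes(interleaved) & chunk_hashes(stream_a)
print(len(shared))  # 0 -- no chunk matches, so nothing deduplicates
```

The same content thus consumes twice the capacity once multiplexed, which is the trade-off described above.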

Smart hybrid deduplication is a more efficient solution. It analyzes the content of data as it is backed up and applies highly efficient deduplication that does not jeopardize backup windows. It deduplicates all data types, including multiplexed, multistreamed databases, and it can scale across multiple processing nodes. After the backup completes, it analyzes the data again at the byte level for further capacity reduction. Smart hybrid deduplication provides an effective way to control data growth and to back up massive data volumes without adding complexity or missing backup windows.

Consolidate for Cost Savings

Grid scalability is a critical part of implementing an orchestrated backup environment. Many backup technologies cannot scale, partly because of their inline deduplication functionality, so they force you to divide backup volumes across multiple systems, a recipe for data center sprawl and inefficiency. A more efficient approach is a grid-scalable backup system that lets you back up, deduplicate, replicate, and manage data in a single system.

Grid scalability technology allows you to start with a system sized for your current needs and then add capacity and/or processing nodes independently over time. You can protect tens of petabytes of data in a single system and improve system utilization, getting the most value from your investment.
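One way the scaling idea can be illustrated (a minimal sketch under assumed design choices, not Sepaton's implementation; all names are hypothetical) is a consistent-hash ring that routes backup chunks to processing nodes: adding a node migrates only a fraction of the chunks to it, instead of forcing you to re-divide whole backup volumes across separate systems.

```python
import bisect
import hashlib

class DedupGrid:
    """Toy consistent-hash ring that routes chunks to processing nodes."""
    def __init__(self, nodes, vnodes=64):
        self.ring = []          # sorted list of (hash point, node name)
        for node in nodes:
            self.add_node(node, vnodes)

    @staticmethod
    def _point(label: str) -> int:
        return int(hashlib.sha256(label.encode()).hexdigest(), 16)

    def add_node(self, node, vnodes=64):
        # Each node owns many points on the ring for even distribution.
        for v in range(vnodes):
            bisect.insort(self.ring, (self._point(f"{node}#{v}"), node))

    def node_for(self, chunk: bytes) -> str:
        h = int(hashlib.sha256(chunk).hexdigest(), 16)
        i = bisect.bisect(self.ring, (h, "")) % len(self.ring)
        return self.ring[i][1]

grid = DedupGrid(["node-1", "node-2"])
chunks = [f"chunk-{i}".encode() for i in range(1000)]
before = {c: grid.node_for(c) for c in chunks}

grid.add_node("node-3")  # scale out: add a processing node in place
moved = sum(1 for c in chunks if grid.node_for(c) != before[c])
print(moved)  # roughly a third of the chunks migrate; the rest stay put
```

The point of the sketch is that capacity and processing grow incrementally within one system, rather than by standing up another backup silo.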

To Manage Enterprise Data Volumes, Move Them Efficiently

To manage enterprise data volumes, you need the ability to move data quickly and efficiently. For example, you need to move terabytes of data to and from backup systems and disaster recovery sites and to implement an effective data tiering strategy. It is therefore important to use a backup solution that can access data through a variety of methods, including NFS and CIFS, and that is optimized for large sequential workloads. It should also integrate smart deduplication with replication to minimize bandwidth requirements and speed transmission times.
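The bandwidth saving from pairing deduplication with replication can be sketched as follows (a hypothetical Python model, not any specific product's protocol): the source sends a chunk over the WAN only when the disaster recovery site does not already hold its hash.

```python
import hashlib

CHUNK = 16  # fixed chunk size for the sketch

def chunks(data: bytes):
    return [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]

def replicate(data: bytes, remote_store: dict) -> int:
    """Replicate to a DR site; return how many bytes crossed the WAN."""
    sent = 0
    for c in chunks(data):
        h = hashlib.sha256(c).hexdigest()
        if h not in remote_store:     # does the DR site have this chunk?
            remote_store[h] = c       # no: send it
            sent += len(c)
    return sent

remote = {}
day1 = b"customer-table-page-" * 50            # 1,000-byte backup image
sent_day1 = replicate(day1, remote)

day2 = day1 + b"one-new-row-appended"          # next day: mostly unchanged
sent_day2 = replicate(day2, remote)
print(sent_day1, sent_day2)  # 88 28 -- a fraction of the raw volume
```

Repetitive backup images shrink to their unique chunks before transmission, which is why deduplication-aware replication cuts both bandwidth and transfer time.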

Advanced Management Capabilities

To succeed in orchestrating your backup environment, you need a management console and reporting capability that gives you the complete view. This level of reporting is far less detailed, and in some cases impossible, with multiple disparate systems, resulting in unpredictable, inaccurate, or last-minute requests for more systems or capacity. An enterprise-level system should provide simple, immediate reporting on backup performance, deduplication and replication efficiency, and capacity usage. Systems that can report details down to individual clients and their backup jobs enable more efficient capacity planning.

Don’t underestimate the importance of being able to manage your backup systems from remote devices such as PCs, laptops, or tablets. IT should be able to monitor and control all protected data enterprise-wide through backup, deduplication, replication, and restore processes.

Encryption of Backup Data (Data at Rest)

Many data centers are not encrypting backup data (data at rest) because they are concerned that enterprise key management (EKM) will add another layer of complexity in an already over-burdened environment.

With more and more highly confidential data in the backup stream, however, enterprise data centers need to be able to encrypt data at rest efficiently. The solution is to use a backup technology that integrates with enterprise key managers that are compliant with the industry-standard OASIS Key Management Interoperability Protocol (KMIP). These systems avoid added complexity by enabling IT staff to use the same EKM in their backup environment that they may already be using in other parts of the IT infrastructure.
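The integration pattern can be sketched in a few lines (hypothetical Python: the KeyManager class stands in for a KMIP-compliant EKM, and the SHA-256 counter-mode keystream stands in for the AES-256 a real appliance would use). The essential point is that the backup stores only a key ID alongside the encrypted data and fetches key material from the EKM at restore time.

```python
import hashlib
import itertools
import os

class KeyManager:
    """Stand-in for a KMIP-compliant enterprise key manager (EKM).
    Key material lives only here; backups reference keys by ID."""
    def __init__(self):
        self._keys = {}

    def create_key(self) -> str:
        key_id = os.urandom(8).hex()
        self._keys[key_id] = os.urandom(32)
        return key_id

    def get_key(self, key_id: str) -> bytes:
        return self._keys[key_id]

def keystream(key: bytes, length: int) -> bytes:
    # Toy SHA-256 counter-mode keystream; a real system would use AES-256.
    out = b""
    for counter in itertools.count():
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        if len(out) >= length:
            return out[:length]

def xor(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

def encrypt_at_rest(plaintext: bytes, ekm: KeyManager):
    key_id = ekm.create_key()
    ct = xor(plaintext, keystream(ekm.get_key(key_id), len(plaintext)))
    return key_id, ct           # the backup stores the key ID, never the key

def restore(key_id: str, ciphertext: bytes, ekm: KeyManager) -> bytes:
    return xor(ciphertext, keystream(ekm.get_key(key_id), len(ciphertext)))

ekm = KeyManager()
key_id, blob = encrypt_at_rest(b"confidential backup image", ekm)
assert blob != b"confidential backup image"    # data at rest is opaque
assert restore(key_id, blob, ekm) == b"confidential backup image"
```

Because the same EKM can serve other parts of the infrastructure, the backup system adds no new key management silo.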

Flexibility to Integrate New Technology

Data center managers need to ensure their data protection approach does not lock them out of newer, more efficient technologies or become obsolete before delivering a return on investment. To support an orchestrated approach to data management, consider data protection systems that integrate with, and provide value-added management for, diverse emerging technologies, applications, protocols, and data types.

Data centers are facing unprecedented data growth rates and costly, time-consuming data center sprawl. As a result, there is an increasing need for data protection solutions designed to handle these massive data volumes more efficiently. Specifically, data centers need enterprise-class data protection systems with the performance and scalability to consolidate backup targets onto a single unified system. They also need more efficient deduplication of databases and other data, and the ability to encrypt data at rest without slowing performance or adding complexity.

About the Author:

Peter Quirk is director of product management at Sepaton. He has spent most of his career working for vendors in systems engineering, product marketing, product management and project management roles, with responsibilities in operating systems, databases, languages, hardware platforms, storage, and social media.

