When Adversity Strikes, Will Your Data Continue to Flow?


What does it take to be an always-on organization? How can data remain available when needed, regardless of any outage or downtime that may befall a data center? It is tempting to assume that expensive redundancy technology, such as a secondary data center, guarantees that everything will be available when an adverse event strikes. However, while an impressive array of technology exists to deliver data on a real-time, continuous basis, even organizations with multiple redundancies built into their systems and networks struggle to keep up.

A new survey of 331 data managers and professionals, conducted by Unisphere Research, a division of Information Today, Inc., finds that traditional, hardware-centric approaches to the challenge don't necessarily deliver seamless availability. The survey was conducted among members of the Independent Oracle Users Group (IOUG) and sponsored by EMC, with respondents drawn from organizations of all sizes and across various industries. At least 43% of respondents indicate that the majority of their enterprise data currently requires high availability.

Over the past year, unplanned downtime has cost businesses both productivity and customer confidence. On average, businesses forfeit one business day per year to downtime, and one in seven reports severe data losses.

These incidents, no matter how minor, carry a high cost, and that cost is felt by the business almost immediately. Loss of employee productivity, whether from interrupted work or from waiting for systems to come back online, is the leading effect: a majority of enterprises (54%) say they have suffered productivity losses as a result of downtime. Downtime also shapes the way customers perceive a business. One-third of respondents say their organizations suffered a loss of customer confidence and loyalty in the past year when their systems went down. Fully 28% report direct loss of revenue, and a similar percentage note that innovation suffered as downtime cut into product or service development.


Business Impact of Data Loss/Downtime Over Past 12 Months

Loss of employee productivity - 54%

Loss of customer confidence/loyalty - 33%

Loss of revenue - 28%

Delay in product/service development - 28%

Delay in getting products/services to market - 14%

Loss of customers - 10%

Loss of a new business opportunity - 10%

There have been no business consequences - 16%


The survey finds that most enterprises operate one or two data centers, while 30% have three or more. Most secondary data centers are configured for hot standby and handle multiple database types. Other leading availability approaches include standby servers at a remote site, snapshots, and storage virtualization. Interestingly, 40% of enterprises still rely on tape backup and tapes sent offsite.


Availability Solutions and Strategies

Standby servers on a remote site - 46%

Snapshots - 46%

Storage virtualization - 46%

Asynchronous replication - 40%

Tape backup and tapes sent offsite - 40%

Cloning - 35%

Backup appliance - 35%

Synchronous replication - 34%

Virtual servers (including cloud) with restart capabilities - 32%

Replication of both applications and data (such as virtual machine images) - 31%


But this may not be enough. One-third of respondents report low levels of satisfaction with their current enterprise data availability strategies. Businesses are demanding more uptime in their service-level agreements (SLAs), but IT departments are struggling to deliver it.

More than three in five data managers are not fully confident they can meet these SLAs in the event of a system disruption, according to the survey, and 28% of respondents report that, at best, they only "sometimes" meet their agreed-upon SLAs for downtime. The problem is compounded when multiple data centers are involved, since there are more moving parts that could go down.

Confidence in the ability to meet downtime SLAs doesn’t necessarily increase with the number of data centers deployed. Among respondents in multiple data center environments, only 36% express extreme confidence—compared to 42% of respondents with only one data center.

Data availability is an area in need of rethinking. While these approaches function, the survey finds they are not inspiring a great deal of confidence among data managers and professionals. The leading roadblock to data availability, cited by 60% of respondents, is a lack of organizational support. Still, a majority state that their preferred action for addressing data availability issues over the past three years has been to purchase or upgrade systems and hardware.

Many respondents are turning to cloud and virtualization. Close to half recognize the need to move data more flexibly across environments and have adopted data virtualization. About 25% of respondents also indicate they are adopting cloud services to help ensure the availability of their data. Private and hybrid clouds are the forms of cloud-based data protection most likely to be leveraged within the next 12 months, the survey finds; close to one-third report using these environments. In addition, close to half of respondents (45%) report they have virtualized their Oracle environments.

