Storage requirements are growing by leaps and bounds, and more organizations are turning to cloud computing to manage the load. However, the cloud isn't necessarily seen as the best approach to handling surging data storage demands—rather, it's mainly being used for backup and for hosting development and test environments.
That’s the gist of the results of a new survey, which finds enterprises are straining under big data loads. The survey, covering 361 data managers and professionals and conducted by Unisphere Research, a division of Information Today, Inc., finds that organizations are employing a range of new strategies and approaches to improve the speed of data delivery and integration. Conducted among members of the Independent Oracle Users Group (IOUG), the survey included respondents from organizations of all sizes and across various industries.
The research confirms that the volume of data to be stored continues to rise into the petabyte range for many organizations. Thirty-one percent of respondents now report they have more than 100TB of Oracle Database data in storage, including all clones, snapshots, replicas, backups, and archives. This is up from 28% with that amount of data in storage 2 years ago. Significantly, 15% of respondents now have more than a petabyte of Oracle data within their enterprises, up from 10% in a previous survey.
Enterprise Data Volume    Previous Survey    Current Survey
100TB to 1PB              18%                16%
1PB or more               10%                15%
What’s behind this ongoing data growth? Increasing business—particularly digital business—is creating an ongoing deluge of information, the survey confirms. Close to half of data executives said they are experiencing greater business transaction volume.