Enterprises today face a shortage of database talent that is only going to grow more acute. This new study indicates that 41% of database professionals intend to leave the data management field within the next 10 years. About half of this group will be retiring, while others will be seeking other opportunities in management or self-employment. In an era in which data analytics is seen as the most effective competitive tool, there will be a shrinking pool of qualified professionals to help manage, secure and store data. At the same time, there has never been a more exciting time to be a database professional. Most entering the field tend to be quite satisfied with their jobs, and they recognize the crucial role they are now playing in today’s analytics-driven organization.
With the game-changing potential of big data come several challenges that must be addressed by organizations big and small. The information that makes up big data comes from both inside and outside a single organization and may be used by many different stakeholders. As a result, data governance continues to grow in importance. To better understand the impact of new data sources on data governance practices, in the first quarter of 2014, IBM commissioned Unisphere Research, a division of Information Today, Inc., to survey IT and business stakeholders at organizations across North America.
Providing the right content, at the right time, to the right customer enables organizations to rise above the numerous sites and digital pitches now competing for customer attention. However, at this time, marketing executives are struggling with ways to create more content, deliver it with greater frequency, and streamline its delivery. In addition, few organizations are measuring the results of their content marketing across emerging social media channels.
In 2014 the spotlight is on data management departments as they lead the effort to deliver competitive advantage from Big Data analytics. Decision makers seek information from a growing range of data sources and sophisticated toolsets. Managing the integration of myriad networks, data systems, and applications to deliver reliable information is a greater challenge than ever before.
At a time when data is the fuel which drives business growth, the onus is on enterprises to protect that data, while at the same time assuring its accessibility. Over the years, there has been growing awareness among enterprise executives and managers about the potential threats to enterprise data security—not only from outside hackers and thieves, but also from people inside organizations, often those with privileged access. Enterprises are making greater and more frequent efforts to monitor and audit data for evidence of security events.
The objective of this survey was to provide an accurate and consistent assessment of current enterprise database management practices, solutions and tools in use, and the important challenges facing database administrators and their organizations. The data trended early and remained consistent with a preliminary analysis conducted before the survey was completed. Respondents from a wide range of industries participated in this research, with a good distribution of small, medium, and large businesses.
Many organizations today face a significant challenge that is negatively impacting their operational efficiency, revenue potential, and ability to provide a trusted IT environment: the prevalence of mission-critical application downtime. To combat this problem, infrastructure owners and DBAs have introduced strategies to increase resiliency; however, both unplanned and planned downtime remain pervasive despite these efforts. As a result, this study discovered that 46% of survey respondents are less than satisfied with their current availability strategy for their mission-critical Oracle-based applications.
Organizations need to prepare for rising levels of big data streaming into their organizations. The ability to manage and assure 24x7x365 database performance, regardless of workloads and user demands, is key to agility and growth. Faster delivery of databases requires automated packaging and deployment processes, both at the initial instantiation and throughout the entire life cycle of the databases in question, including decommissioning. As data environments grow larger and consumers increasingly expect "on-demand" access, the ever-increasing complexity in governance requirements can potentially slow the rollout of database services. Database administrators face an uphill battle to address these challenges in today's multi-layered and globally diverse data centers. Those who succeed are able to adapt and evolve by translating database services into critical services the business can depend on.
It’s no secret that today’s organizations are awash with data. Data is streaming into transaction systems, appliances, and devices from a wide variety of applications and new sources, including social media. Proponents of Big Data state that data contains veins rich with information for decision makers and the business, and many organizations have made it a priority to capture and use this data. However, what many organizations are also discovering is that managing and storing all this data has a cost. While there is a drive across the industry to introduce new and more digitally compact forms of data storage, as well as cloud storage, these solutions do not get to the heart of the problem for enterprises—data needs to be managed more effectively, and tied closer to the business, from the start.
Cloud computing is no longer a novel concept being experimented with at the edge of the enterprise. It is now a mainstream business technology strategy that is delivering the agility and flexibility that businesses require to move forward. A new survey finds that cloud computing continues gaining converts within the enterprise, and is pushing down deep roots within companies that have deployed the approach.
Thomas Davenport, co-author of the watershed book Competing on Analytics and visiting professor at Harvard University, has famously referred to the role of the data scientist as the “sexiest job of the 21st century.” And it’s no wonder—data scientists are being cast as the visionaries who will help guide their organizations into the future, by scooping up information from all corners of the enterprise and beyond, and figuring out ways to make that data tell compelling stories.
Application servers are proliferating, the number of end users is expanding rapidly, and data volumes and types are growing impressively as part of today’s increasingly digital organizations.
However, the performance of database systems is struggling to keep pace—in fact, databases are hampered by reliance on disk-based storage, a technology that has been in place for more than two decades. Even with the addition of memory caches and solid-state drives, the model of relying on repeated access to the permanent information storage devices is still a bottleneck in capitalizing on today’s “Big Data.”
Data keeps growing, and along with it, opportunities for unprecedented insights into customers, sales, markets, and processes. With information now being generated from all corners of the enterprise, executives, managers, and professionals can ask and get answers to questions they have never been able to consider. For companies that are able to offer business decision makers rapid and easy access to business intelligence (BI) or analytic data from which they can assemble their own interfaces and reports, this means competitive advantage. However, today’s BI systems still present obstacles to realizing this vision.
As more data becomes available from an abundance of sources both within and outside the organization, enterprises are seeking to use those resources to increase innovation, retain customers, and improve operational efficiency. At the same time, organizations are challenged by their end users, who are demanding greater capability and integration to mine and analyze burgeoning new sources of information.
Thanks to relentless global competition and an unforgiving economy, organizations have been under nonstop pressure to deliver products and services. For many, finding new opportunities has depended on mining the data assets being gathered from all corners of their enterprise and beyond—transactions, customer data, employee input, and information about market conditions.
To compete in today’s hyper-competitive economy, organizations need the right information, at the right time, at the push of a button. However, there is no single source for relevant data; the typical enterprise today has a vast array of data types and formats pulsing through its veins, flowing in from its own systems and databases, as well as streaming in from external sources. For today’s data manager or professional, the challenge is providing end users access to actionable information in as close to real time as possible. Today’s data systems—many of which were built and designed for the legacy environments of the past decade, supporting far less data—are not up to the task.
produced by Unisphere Research, a division of Information Today, Inc., and sponsored by MarkLogic
conducted in partnership with the SHARE users group and Guide SHARE Europe, produced by Unisphere Research, a division of Information Today, Inc., and sponsored by IBM and Marist College
produced by Unisphere Research, a division of Information Today, Inc., and sponsored by Application Security, Inc.
produced by Unisphere Research, a division of Information Today, Inc., and sponsored by IBM
produced by Unisphere Research, a division of Information Today, Inc., and sponsored by Oracle
produced by Unisphere Research, a division of Information Today, Inc., and sponsored by Quest Software
produced by Unisphere Research, a division of Information Today, Inc., and sponsored by Oracle