June 2013 - UPDATE


Trends and Applications

Are we attempting to view 21st-century organizations through a 1990s window? Decision makers' ability to understand what customers are thinking, and to deliver service that dazzles and amazes them, depends on their organization's ability to sift through all the available data. However, organizations are not doing a very good job of competing on analytics. Efforts to introduce more analytic power across enterprises and to build analytics-driven cultures appear lukewarm at best. What is needed is a way to embed analytics deeper into day-to-day business operations.

Oracle has expanded its partnerships with key cloud vendors. Microsoft Corp. and Oracle Corp. have formed a partnership that will enable customers to run Oracle software on Windows Server Hyper-V and in Windows Azure, Microsoft's cloud platform. Customers will be able to deploy Oracle software — including Java, Oracle Database and Oracle WebLogic Server — on Windows Server Hyper-V or in Windows Azure and receive full support from Oracle. In addition, Salesforce.com and Oracle announced a comprehensive nine-year partnership encompassing all three tiers of cloud computing: applications, platform and infrastructure.

If money were not an issue, we wouldn't be having this conversation. But money is front and center in every major database roll-out and optimization project, and even more so in the age of server virtualization and consolidation. It often forces us to settle for good enough, when we first aspired to swift and non-stop. The financial tradeoffs have never been more apparent than with the arrival of lightning-fast solid state technologies.

DBTA and SAP will present a webcast on how extreme transaction processing works and how it is helping data-intensive enterprises overcome their data growth and complexity challenges. Across the public sector, financial services, healthcare, and other industries, organizations are being challenged to manage more data, even as more users demand access to that data with less time and fewer resources. This trend was underscored in a study conducted among Database Trends and Applications readers in April, which found that faster query performance was the number-one technology challenge that respondents are trying to get their arms around.

IBM recently laid out a set of new initiatives to further support and speed up the adoption of the Linux operating system across the enterprise. These include two new Power Systems Linux Centers, as well as plans to extend support for Kernel-based Virtual Machine (KVM) technology to its Power Systems portfolio of server products.

RainStor, a provider of an enterprise database for managing and analyzing historical data, says it has combined the latest data security technologies in a comprehensive product update that has the potential to rapidly increase adoption of Apache Hadoop for banks, communications providers and government agencies.

The amount of data being generated, captured and analyzed worldwide is increasing at a rate that was inconceivable a few years ago. Exciting new technologies and methodologies are evolving to address this phenomenon of science and culture, creating huge new opportunities. These new technologies are also fundamentally changing the way we look at and use data. The rush to monetize "big data" makes the appeal of various "solutions" undeniable.

The objective of "old school" capacity management was to ensure that each server had enough capacity to avoid performance issues, typically using trend-and-threshold techniques to accomplish this. But the rapid adoption of the cloud, and now OpenStack, means that supply and demand are much more fluid, and this form of capacity management is now obsolete. Unfortunately, many are ignoring this new reality and continuing to rely on what have quickly become the bad habits of capacity management. These old school methods and measures not only perpetuate an antiquated thought process, but also lead to, among other problems, low utilization and density. In a world of tightening IT budgets, these are problems that can't be ignored.
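To make the "trend-and-threshold" idea concrete, here is a minimal sketch of the per-server technique the blurb describes: fit a straight line through recent utilization samples and project when the trend crosses an alert threshold. The function name, sample data, and threshold are illustrative assumptions, not anything from the article or a specific product.

```python
def days_until_threshold(samples, threshold):
    """Least-squares linear trend over daily utilization samples (0-100%).

    Returns the projected number of days until the trend line crosses
    `threshold`, or None if utilization is flat or falling.
    """
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    # Slope of the least-squares fit: covariance(x, y) / variance(x).
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    if slope <= 0:
        return None  # no growth trend, so the threshold is never crossed
    intercept = mean_y - slope * mean_x
    # Solve intercept + slope * t = threshold, measured from today.
    t_cross = (threshold - intercept) / slope
    return max(0.0, t_cross - (n - 1))

# Example: CPU utilization growing ~2 points per day, alert at 80%.
usage = [50, 52, 54, 56, 58, 60]
print(days_until_threshold(usage, 80))  # 10.0
```

The blurb's critique is visible even in this sketch: the model assumes one server with its own fixed capacity and a steady trend, assumptions that break down when cloud or OpenStack workloads move fluidly between hosts.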