In-Memory Databases Speed Up Their Foray into the Enterprise

In their many years of existence, in-memory databases and technologies have meant one thing to most observers: running applications and serving up data at lightning-fast speeds. Now, as data increasingly evolves into a strategic enterprise asset, in-memory databases have implications beyond blazing bits and bytes. They are opening new opportunities for business innovation and growth.

In-memory technologies have been available for well over a decade, but they have typically been niche solutions, employed in high-speed transaction servers within financial services or as part of embedded systems. Now, in-memory is reaching a tipping point of widespread adoption within enterprise data environments. This is borne out by recent announcements from leading enterprise systems vendors, which position in-memory as a quick-to-deploy solution for the bulk of their customer bases and open up in-memory capabilities to organizations of all sizes, including those that may not have had the budgets for in-memory-capable processors.

A recent DBTA survey of IT and data managers finds that about one-third of enterprises (32%) run in-memory databases in some capacity, and three-fourths expect to expand their use of the technology over the next 3 years. Functions being run on in-memory technologies include analytics/business intelligence (58%); core business functions such as finance and production (42%); and IT operational data, such as logs or systems monitoring (25%). The advantages cited for running in-memory databases include faster response times/reduced latency (88%); greater flexibility (25%); and more rapid deployment of applications (21%).

“Typically, the use of in-memory databases is associated with low-latency, high-volume systems such as telecommunications networks, high-speed trading applications, and embedded systems,” observed Dr. Elliot King, research fellow at the Lattanze Center for Information Value at Loyola University Maryland. In-memory database use, he added, has expanded from its traditional base of financial and trading applications to the broader mainstream of business, including analytics, web-based transactions, and billing and provisioning, according to the center’s study of 237 enterprises.


For most enterprises, in-memory technology represents a new approach to the age-old challenge of boosting systems performance. For years, accelerating data delivery and application performance to reduce latency meant adding more disks and beefing up hardware: when performance slowed, enterprises either increased disk capacity or bought faster disk drives and arrays. Dramatic increases in memory capacity, the rise of 64-bit processing, and plummeting memory prices have opened up a new frontier for performance. Now, in-memory is seen as a technology and methodology for navigating through, and pulling actionable insights from, the deluge of big data flooding in from internal systems, devices, the cloud, and social media.

In-memory provides the following advantages to enterprises:

Run applications faster: Raw speed is the original and most pronounced value of in-memory. The ability to load data into a machine's random access memory means a quantum leap in application speed. In enterprises with hundreds of terabytes of data associated with each core application, the round trips data makes between the disk arrays and CPUs really add up. Data held in memory can move at speeds up to 1,000 times faster than the traditional CPU-to-disk round trip. Plus, in-memory database systems require only a single movement of data, adding to the speed equation.
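To make the disk-versus-memory gap concrete, here is a minimal sketch using Python's built-in sqlite3 module (not any of the enterprise products discussed here) that times the same insert workload against an in-memory database and an on-disk file. The file name "ondisk.db" is invented for the example, and the measured ratio will vary widely with hardware, caching, and transaction batching.

    import os
    import sqlite3
    import time

    def time_inserts(conn, n=100_000):
        """Insert n rows in a single transaction; return elapsed seconds."""
        conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val TEXT)")
        start = time.perf_counter()
        with conn:  # one transaction for the whole batch
            conn.executemany(
                "INSERT INTO t (val) VALUES (?)",
                (("row-%d" % i,) for i in range(n)),
            )
        return time.perf_counter() - start

    # ":memory:" keeps the whole database in RAM; "ondisk.db" hits the filesystem.
    for target in (":memory:", "ondisk.db"):
        if target != ":memory:" and os.path.exists(target):
            os.remove(target)  # start each run from a clean file
        conn = sqlite3.connect(target)
        print("%-12s %.3fs" % (target, time_inserts(conn)))
        conn.close()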

Reduce staff time spent on manual administrative tasks: In-memory technology helps reduce the amount of staff time required to manage database environments. Essentially, running an in-memory database takes out much of the complexity associated with traditional databases that must be optimized around disk access. Typically, transactional databases require the creation of indexes and summary tables maintained by triggers, which call for manual scripting and upkeep by DBAs. Putting the transactional database in memory eliminates or reduces much of that work, freeing data professionals for higher-level activities such as working more closely with business managers to formulate data strategies.
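As an illustration of that scaffolding, the sqlite3 sketch below (table and column names are invented for the example) hand-builds the kind of trigger-maintained summary table a DBA would otherwise script and babysit so that reports do not scan the transaction table; an in-memory engine that aggregates on the fly can make this layer unnecessary.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL);

        -- The hand-maintained rollup that keeps reports off the
        -- transaction table.
        CREATE TABLE sales_by_region (region TEXT PRIMARY KEY, total REAL);

        CREATE TRIGGER orders_rollup AFTER INSERT ON orders
        BEGIN
            INSERT OR IGNORE INTO sales_by_region VALUES (NEW.region, 0);
            UPDATE sales_by_region SET total = total + NEW.amount
             WHERE region = NEW.region;
        END;
    """)

    conn.executemany("INSERT INTO orders (region, amount) VALUES (?, ?)",
                     [("east", 100.0), ("west", 250.0), ("east", 50.0)])
    print(conn.execute("SELECT * FROM sales_by_region").fetchall())
    # [('east', 150.0), ('west', 250.0)]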

Run both transactional and analytical data in the same space: Recent innovations in the in-memory field include the introduction of in-memory capabilities within relational databases, creating a single in-memory database that can support both row-based transactional datasets and column-based analytical data. This may relieve organizations of the requirement to maintain separate data environments for transactions and analytics, and help fuse the two.
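A toy sketch of the row-versus-column idea, in plain Python rather than any vendor's engine: the same three orders are stored row-wise for transactional point lookups and column-wise for analytical scans. In a combined in-memory system, both layouts live in RAM, so one database can serve both access patterns.

    # Row store: one record per order, convenient for OLTP point lookups.
    rows = [
        {"id": 1, "region": "east", "amount": 100.0},
        {"id": 2, "region": "west", "amount": 250.0},
        {"id": 3, "region": "east", "amount": 50.0},
    ]

    # Column store: one array per attribute, convenient for analytics.
    columns = {
        "id": [1, 2, 3],
        "region": ["east", "west", "east"],
        "amount": [100.0, 250.0, 50.0],
    }

    # Transactional access: fetch a single order by key from the row store.
    order = next(r for r in rows if r["id"] == 2)

    # Analytical access: aggregate one attribute by scanning a single
    # contiguous array instead of touching every field of every row.
    total = sum(columns["amount"])
    print(order, total)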
