Delivering Information Faster: In-Memory Technology Reboots the Big Data Analytics World

In-memory technology, in which entire data sets are pre-loaded into a computer's random access memory so that data need not shuttle between disk storage and memory every time a query is run, has been around for a number of years. However, with the onset of big data and an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.

The appeal of in-memory technology is growing as organizations face the challenge of big data, in which decision makers seek to harvest insights from the terabytes’ and petabytes’ worth of structured, semi-structured and unstructured data that is flowing into their enterprises. A recent survey of 323 data managers and professionals finds that while organizations are still in the early stages of in-memory adoption, most data executives and professionals are expressing considerable interest in adopting this new technology.

In the survey, fielded among members of the Independent Oracle Users Group and conducted by Unisphere Research, a division of Information Today, Inc., close to one-third of organizations already have in-memory databases and tools deployed within their enterprises, and report advantages such as real-time operational reporting, accelerated data warehouse environments, and better management and handling of unstructured data. Another one-third are considering in-memory technologies. The report, “Accelerating Enterprise Insights: 2013 IOUG In-Memory Strategies Survey” (January 2013), was underwritten by SAP.

There are compelling technical advantages to an in-memory database, but the business benefits can be far-reaching. From a technical standpoint, data analysis jobs performed entirely in memory can potentially run up to 1,000 times faster than similar jobs employing traditional disk-to-processor transfers.

Access the full version of this article in the DBTA Thought Leadership special section on In-Memory Databases/In-Memory Analytics. A short registration form is necessary to access the special section.

In-memory databases provide a large, high-capacity memory space in which entire datasets, potentially millions of records, can be loaded all at once for rapid access and processing, thereby eliminating the lag involved in disk-to-memory data transfers. Without that capability, many applications relying on data analytics have been able to deliver only limited reports built on smaller chunks of data. In addition, a new generation of tools, featuring visual and highly interactive interfaces, helps bring data to life for decision makers. Today's hardware also makes in-memory processing a practical reality. Multi-core processors are now the norm, and memory keeps getting cheaper and cheaper. Moore's Law, which posits that processor power and capacity will double every 18 months, continues to hold true. In 1965, a company paid for memory at a rate of $512 billion per gigabyte of RAM. Now, it's a mere $1 per gigabyte. In a few years, the cost per gigabyte of RAM will plunge even further, to a few cents.
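The difference between querying data resident in RAM and data read from disk can be illustrated with a small sketch. The snippet below (not from the survey; a hypothetical comparison using Python's built-in sqlite3 module, which supports both an in-memory database via ":memory:" and an ordinary file-backed one) loads the same dataset into each and times an identical full-table scan. On a warm operating-system page cache the gap may be modest; the dramatic speedups cited above come from keeping large working sets in memory at all times rather than paging them in per query.

```python
import os
import sqlite3
import tempfile
import time

N_ROWS = 100_000  # hypothetical dataset size for the comparison

def build_db(conn):
    """Create and populate a simple table of numeric readings."""
    conn.execute("CREATE TABLE readings (id INTEGER PRIMARY KEY, value REAL)")
    conn.executemany(
        "INSERT INTO readings (value) VALUES (?)",
        ((float(i % 1000),) for i in range(N_ROWS)),
    )
    conn.commit()

def timed_scan(conn):
    """Run a full-table aggregate and return (result, elapsed_seconds)."""
    start = time.perf_counter()
    total = conn.execute("SELECT SUM(value) FROM readings").fetchone()[0]
    return total, time.perf_counter() - start

# In-memory database: the entire dataset lives in RAM.
mem = sqlite3.connect(":memory:")
build_db(mem)
mem_total, mem_secs = timed_scan(mem)

# Disk-backed database: the same data, read through the file system.
fd, path = tempfile.mkstemp(suffix=".db")
os.close(fd)
try:
    disk = sqlite3.connect(path)
    build_db(disk)
    disk_total, disk_secs = timed_scan(disk)
    disk.close()
finally:
    os.remove(path)

print(f"in-memory scan: {mem_secs:.4f}s, on-disk scan: {disk_secs:.4f}s")
```

Production in-memory databases go well beyond this sketch, with columnar layouts, compression, and memory-optimized indexes, but the underlying idea is the same: keep the full dataset where the processor can reach it without a disk round trip.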

Google-Like BI

From a business standpoint, in-memory capabilities bring organizations ever closer to the holy grail of big data: the ability to compete smartly on analytics, so that a data-driven business can closely engage with customers and markets and anticipate their needs. Overall, nearly 75% of respondents in the IOUG-Unisphere Research survey believe in-memory technology will be important to keeping their organizations competitive in the future. Close to half (47%) of the survey respondents whose companies are making extensive or widespread use of in-memory see the greatest opportunity for future use in providing real-time operational reporting. One-third of those respondents say in-memory is playing a role in delivering new types of applications not possible within their current data environments.

In today's hypercompetitive business climate, end users expect information at their fingertips, at a moment's notice. They want a Google-like experience in working with enterprise data: the ability to ask any question and receive a set of potential responses in a fraction of a second. By bringing on-the-spot analysis of complete datasets close to the user, in-memory analytics opens up data-driven decision making to individuals at all levels of the organization, beyond the relatively small handful of business analysts, statisticians, and quants who have traditionally been the users of BI and analytics tools. Customer service representatives, for example, can make on-the-spot decisions based on the profitability of customers, with data from CRM, transactional, and data warehouse systems immediately available. Operations executives can prioritize production orders, taking into account readily available scheduling and forecasting data.

In-memory analytics enables more business end users to build their own queries and dashboards, without relying on IT to unlock data sources or to build and deliver the reports. With dramatically enhanced processing speed, and the availability of entire data sets, in-memory processing opens the way for more sophisticated market analysis, what-if analysis, data mining, and predictive analytics.