10 Ways In-Memory Databases are Helping Enterprises Get Ahead


  2. Greater ability to leverage the real-time Internet of Things. Businesses are increasingly relying on real-time data streams from devices or sensors embedded in products, deployed across their supply chains, or carried by employees and contractors. Real-time responses to events require analytics performed at blazing speeds, and in-memory databases enable such rapid response times.
  3. Faster and more distributed decision making. The increased ability of decision makers to build their own queries and dashboards with little or no involvement from IT will shift analytics initiatives to the business side. End users will be able to conduct more sophisticated segmentation and market analysis because entire datasets will be available for immediate processing. Large volumes of data can be quickly crunched and analyzed, often reducing wait times for reports from hours to a matter of minutes, if not seconds.
  4. More effective reporting. In-memory databases can support many more queries and run much faster than traditional analytic environments, so they also can more readily support highly graphical, intuitive dashboards—which usually require processing power on the back end. Visually enhanced dashboards mean more intuitive interfaces, and, therefore, reduced requirements to hire Ph.D.s in statistics to decipher rows and columns of numbers.
  5. Greater business agility. Because in-memory databases have enormous capacity and can hold entire datasets for querying, any end user at any level of the organization can quickly identify and isolate subsets of enterprise data for further investigation. Previous BI environments required that data be extracted in chunks, slowing down analysis as new data was swapped between the BI platform and disk arrays. In-memory data stores can also simply be cleared from the system when a job is complete, making room in memory for new jobs. The increased speed and flexibility offered through in-memory processing transforms the way organizations approach decision making. End users no longer need to send requests to the IT department to pull the information or reports they need. More ad hoc analysis is possible, since end users can access data almost instantaneously and conduct analysis on the spot, asking any questions that come to mind (a minimal sketch of this kind of ad hoc, in-memory query follows this list).
  6. Greater end user adoption of data analysis services and applications. Non-technical end users may be more likely to adopt analytical applications, especially with the increased access and ease of use enabled through in-memory databases. In a world in which many end users are accustomed to the instantaneous responsiveness of online sites such as Google, the slow response times of many enterprise query engines are no longer acceptable.
  7. Offloading processing jobs from over-taxed transactional databases. Running analytics against a transactional system can severely tax its performance; launching analytic applications against a separate, synchronized in-memory database staves off that performance penalty (this offload pattern is sketched after the list).
  8. Faster application development. Developers are no longer consumed with programming disk-performance features into applications. As a result, applications can be turned around and put into production faster, while delivering better performance.
  9. Scaling with almost no limits. In-memory configurations can sit on groups of processors—symmetric multiprocessing—that support very large memory capacities. The capacity of such systems can already rival that of large disk-based databases. Potentially, all the data stored in a RAID array could be moved right into machine memory—which could even scale into the exabyte range in some server farms.
  10. Reduced infrastructure and infrastructure maintenance requirements. Rather than linking a database system to disk arrays and burdening database servers with analytic workloads, all of this can be handled within in-memory environments. Many typical database infrastructure features may even slow down in-memory performance. “When data access becomes faster, every source of overhead that does not matter in traditional disk-based systems, may degrade the overall performance significantly,” according to the IEEE researchers. “The shifting prompts a rethinking of the design of traditional systems, especially for databases, in the aspect of data layouts, indexes, parallelism, concurrency control, query processing, and fault-tolerance.”
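
To make the on-the-spot analysis described in point 5 more concrete, here is a minimal sketch that uses Python's built-in sqlite3 module in its in-memory mode as a stand-in for a commercial in-memory database; the sales table and its columns are invented for the example. The entire dataset lives in RAM, an ad hoc aggregation runs against it directly, and the store is simply discarded when the connection is closed.

```python
import sqlite3
import random

# Build an in-memory database -- nothing is written to disk, and the
# whole store disappears when the connection is closed.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

# Load a sample dataset (in practice, the full enterprise dataset).
rows = [
    (random.choice(["NA", "EMEA", "APAC"]),
     random.choice(["widget", "gadget"]),
     random.uniform(10, 500))
    for _ in range(100_000)
]
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# An ad hoc question, answered on the spot -- no extract, no IT ticket.
for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"):
    print(region, round(total, 2))

conn.close()  # the in-memory store is cleared, freeing RAM for the next job
```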
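
Point 7's offload pattern can be sketched in the same spirit: copy data out of the operational (transactional) database into an in-memory replica and run the heavy analytic query against the replica, so the transactional system never carries the analytic load. The file name, table, and one-shot backup below are illustrative assumptions; a production deployment would keep the replica synchronized through replication or change data capture rather than a single copy.

```python
import sqlite3

# Hypothetical operational (transactional) database on disk.
oltp = sqlite3.connect("orders.db")
oltp.execute("CREATE TABLE IF NOT EXISTS orders "
             "(id INTEGER PRIMARY KEY, customer TEXT, total REAL)")
oltp.executemany("INSERT INTO orders (customer, total) VALUES (?, ?)",
                 [("acme", 120.00), ("globex", 75.50), ("acme", 310.25)])
oltp.commit()

# Snapshot the operational data into an in-memory replica.
analytics = sqlite3.connect(":memory:")
oltp.backup(analytics)  # one-shot copy; real systems replicate continuously

# The heavy analytic query runs against the replica, not the OLTP system.
query = "SELECT customer, COUNT(*), SUM(total) FROM orders GROUP BY customer"
for customer, order_count, revenue in analytics.execute(query):
    print(customer, order_count, revenue)

analytics.close()
oltp.close()
```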

IN-MEMORY DISRUPTION

The impact of in-memory databases extends well beyond the data center—in-memory technology is a potentially disruptive approach to information management. Decision makers will have increased latitude to explore new areas of opportunity for the business, or to address challenges in new ways. Database operations will work much more closely with the business, delivering value at the moment it’s needed.


For more articles on the In-Memory Revolution, download a special DBTA Thought Leadership Section here
