Best Practices Special Section on NoSQL, NewSQL, and Hadoop: Bringing Big Data into the Enterprise Fold


Are organizations’ systems and data environments ready for the big data surge? Far from it, a new survey shows. The survey of 298 members of the Independent Oracle Users Group (IOUG), conducted by Unisphere Research and sponsored by Oracle Corp., finds that fewer than one in five data managers and executives are confident their IT infrastructure will be capable of handling the surge of big data. And big data is already here: more than one in 10 survey respondents report having in excess of a petabyte of data within their organizations, and a majority report that their levels of unstructured data are growing. (“Big Data, Big Challenges, Big Opportunities: 2012 IOUG Big Data Strategies Survey.”)

Since big data incorporates so many different data types, in varying volumes and from many different sources, it would make life easier for both data managers and end users if it all could be brought into a single, comprehensive framework that could be easily managed and accessed. This, in fact, has long been the holy grail of the IT and database industries, a vision that, unfortunately, has yet to be realized. “If it was possible to easily manage structured and unstructured data in the same frameworks, the legacy vendors would have already solved all these problems and new innovative technologies would never have taken root,” Sanjay Mehta, vice president of product marketing for Splunk, points out.


The challenge is that existing database environments — especially relational databases — “were not designed for the tsunami of data that organizations are being asked to absorb in today’s world,” adds Russ Kennedy, vice president of product strategy for Cleversafe. “They do well with small, fast transactions but were not designed to handle this changing landscape.”

For many organizations, then, big data requires newer technology strategies, especially platforms built on the open-source Hadoop framework and NoSQL databases, which store data in flexible, often hierarchical structures rather than rigid relational tables. The IOUG-Unisphere survey, for example, found that adoption of these technologies is poised to surge over the coming year: Hadoop adoption is expected to more than double, from 7% to 16%, while NoSQL adoption is expected to grow from 11% to 15%.
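
To make the “hierarchical structures” point concrete, here is a minimal sketch of how a document-oriented NoSQL store keeps nested data in a single record. It assumes MongoDB accessed through the pymongo driver as a representative example; the database, collection, and field names are all hypothetical.

```python
# Minimal sketch: storing a hierarchical (nested) record in a document
# store. Assumes MongoDB via the pymongo driver; the "shop" database and
# "orders" collection are hypothetical names for illustration.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["shop"]  # hypothetical database

# One customer order kept as a single nested document. Line items and
# shipping details live inside the record rather than in joined tables.
order = {
    "order_id": 1001,
    "customer": {"name": "A. Buyer", "email": "a.buyer@example.com"},
    "items": [
        {"sku": "X-42", "qty": 2, "price": 19.99},
        {"sku": "Y-7", "qty": 1, "price": 5.49},
    ],
    "shipping": {"city": "Austin", "carrier": "ground"},
}
db.orders.insert_one(order)

# Querying into the hierarchy uses dotted paths instead of joins.
print(db.orders.find_one({"items.sku": "X-42"}, {"customer.name": 1}))
```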

“Collecting click streams from a website will have very different uses than recording of customer purchases,” advises David Champagne, chief technology officer for Revolution Analytics. “Hadoop may be optimal for storing massive streams of data. A traditional data warehouse is likely the best choice for customer transactions where low latency access to the information is critical.”
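
As an illustration of the split Champagne describes, the sketch below routes clickstream events to flat, newline-delimited JSON files of the sort a Hadoop pipeline ingests in batches, while purchases go into a relational table that answers low-latency SQL queries. The file, table, and column names are hypothetical, and Python’s built-in sqlite3 merely stands in for a data warehouse.

```python
# Minimal sketch of routing data by workload: clickstream events are
# appended to newline-delimited JSON files (the kind of batch files a
# Hadoop pipeline would later bulk-load), while purchases land in a
# relational table for low-latency lookups. All names are hypothetical.
import json
import sqlite3
import time

def log_click(event: dict, path: str = "clicks-0001.jsonl") -> None:
    """Append one clickstream event to a flat file destined for
    offline, Hadoop-style batch analysis."""
    event["ts"] = time.time()
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")

# Warehouse side: purchases go into a queryable relational table.
conn = sqlite3.connect("warehouse.db")
conn.execute(
    "CREATE TABLE IF NOT EXISTS purchases"
    " (customer_id INTEGER, sku TEXT, amount REAL)"
)

log_click({"session": "abc123", "page": "/product/X-42"})
conn.execute("INSERT INTO purchases VALUES (?, ?, ?)", (42, "X-42", 19.99))
conn.commit()

# Low-latency lookup on the transactional side.
print(conn.execute(
    "SELECT sku, amount FROM purchases WHERE customer_id = ?", (42,)
).fetchall())
```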

Plus, big data often requires different skill sets, as well as different philosophical approaches, than traditional environments demand. A realignment of resources and technologies is taking place within many organizations looking to manage these dual data environments. “Traditional data warehouse and BI applications are getting redefined and call for adopting new skills and technologies that have yet to mature but are already being actively used by businesses,” says Prasanna Venkatesan, practice director for big data and analytics with HCL Technologies. Such new skill sets include Hadoop, big data storage, in-memory analytics, visualization, and reporting.

Many of these skills and technologies will require the ability to transition between these data environments. “It’s critical that effective tools become available to connect and translate between the old relational world and the new nonrelational one,” Darin Bartik of Quest Software (now a part of Dell) tells DBTA.
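
What such a translation tool does at its core can be sketched briefly: rows from joined relational tables are folded into a single nested document of the kind a NoSQL store would hold. This is a minimal illustration, not any vendor’s actual product; the schema is hypothetical, and sqlite3 again stands in for the legacy relational system.

```python
# Minimal sketch of one direction of relational-to-nonrelational
# translation: rows from two joined tables are folded into one nested
# document. Table and column names are hypothetical.
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'A. Buyer');
    INSERT INTO orders VALUES (1001, 1, 45.47), (1002, 1, 5.49);
""")

def customer_as_document(customer_id: int) -> dict:
    """Read customer and order rows, then nest the orders inside the
    customer record the way a document store would hold them."""
    name, = conn.execute(
        "SELECT name FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()
    orders = [
        {"order_id": oid, "total": total}
        for oid, total in conn.execute(
            "SELECT id, total FROM orders WHERE customer_id = ?",
            (customer_id,),
        )
    ]
    return {"customer_id": customer_id, "name": name, "orders": orders}

print(json.dumps(customer_as_document(1), indent=2))
```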


Access the full version of this article in the Best Practices Section on NoSQL, NewSQL and Hadoop: New Technologies for the Era of Big Data. A short registration form is necessary to access the special section.

