Big Data Management: The Age of Big Data Spells the End of Enterprise IT Silos


There is one good rule of thumb to assess whether you have a big data problem: if you are not using new data sources, you likely don’t have a big data problem; if you are consuming information from new data sources, you might.

What’s Ahead in Data Management

There are a few areas in which we can certainly expect significant innovation over the next few years.

Demand for real-time analytics on massive data volumes continues to grow. While there are many in-memory database technologies, including many proprietary solutions, I believe the future lies with the Hadoop ecosystem and open standards. That said, proprietary solutions such as SAP HANA or the just-announced Oracle In-Memory Database are very credible alternatives.

Graph databases will see significant uptake. As more and more of the data we generate is based on dynamic relations between entities, graph theory becomes a very convenient way to model it. There are several graph databases and libraries available, but each has unique weaknesses when it comes to scalability, availability, in-memory requirements, data size, modification consistency, and plain stability. Thus, the graph database space is bound to evolve at a fast pace.
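To illustrate why graphs are such a natural fit for dynamic relations between entities, here is a minimal sketch (the entity names are hypothetical, and a real graph database adds persistence, indexing, and a query language on top of this idea): relations are simply edges, and questions like "is this customer connected to this product?" become traversals.

```python
from collections import defaultdict, deque

class Graph:
    """A toy in-memory graph: entities are nodes, relations are edges."""

    def __init__(self):
        # Adjacency sets: node -> set of directly related nodes.
        self.edges = defaultdict(set)

    def relate(self, a, b):
        """Record a relation between two entities (undirected here)."""
        self.edges[a].add(b)
        self.edges[b].add(a)

    def connected(self, start, goal):
        """Breadth-first search: is there any chain of relations from start to goal?"""
        seen, queue = {start}, deque([start])
        while queue:
            node = queue.popleft()
            if node == goal:
                return True
            for nxt in self.edges[node] - seen:
                seen.add(nxt)
                queue.append(nxt)
        return False

g = Graph()
g.relate("alice", "order-42")       # a customer placed an order
g.relate("order-42", "product-7")   # the order contains a product
print(g.connected("alice", "product-7"))   # True: related through the order
print(g.connected("alice", "warehouse-1")) # False: no chain of relations
```

New relations can be added at any time without a schema change, which is exactly the flexibility that makes graph models attractive for this kind of data.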

Continuously increasing security demands are a general trend in many industries, yet most modern data processing technologies have weak security capabilities out of the box. This is where established relational databases, with very strong security models and the ability to integrate easily with central security controls, have a strong edge. While it is possible to deploy a Hadoop-based solution with encryption of data in transit and at rest, strong authentication, granular access controls, and access auditing, it takes significantly more effort than deploying mature database technologies. It is especially difficult to satisfy strict security compliance standards with newer technologies, as there are no widely accepted or certified secure deployment blueprints.

The future of the database professional—One of the challenges holding companies back from adopting new data processing technologies is the lack of skilled people to implement and maintain them. Those of us with a strong background in traditional database technologies are already in high demand, and in even higher demand when it comes to bleeding-edge, not-yet-proven databases. If you want to stay ahead of the industry, look for opportunities to invest in learning one of the new database technologies, and do not be afraid that it might be one of those technologies that disappears in a couple of years. What you learn will take you to the next level in your professional career and make it much easier to adapt to the quickly changing database landscape.

About the Author:

Alex Gorbachev, chief technology officer at Pythian, has architected and designed numerous successful database solutions to address challenging business requirements. He is a respected figure in the database world and a sought-after leader and speaker at conferences. Gorbachev is an Oracle ACE director, a Cloudera Champion of Big Data, and a member of OakTable Network. He serves as director of communities for the Independent Oracle User Group (IOUG). In recognition of his industry leadership, business achievements, and community contributions, he received the 2013 Forty Under 40 award from the Ottawa Business Journal and the Ottawa Chamber of Commerce.

