Capturing and Analyzing Big Data with the Right Tools

Hadoop and NoSQL databases have emerged as leading choices to capture and analyze big data by bringing new capabilities to the field of data management and analysis.

At the same time, the relational database, firmly entrenched in most enterprises, continues to advance in features and varieties to address new challenges.

DBTA recently held a roundtable webinar featuring Brian Bulkowski, co-founder and CTO of Aerospike; Ashish Sahu, VP of product marketing at Cask Data, Inc.; and John Leach, chief technology officer at Splice Machine, to discuss the key differences between Hadoop, NoSQL, and RDBMS today, as well as best practices for successful implementations.

Bulkowski explained that the keys to successful data approaches include re-imagining data to fit your analytics, using modern programmatic languages and tools, building for larger "front edge" data sizes, and building a hybrid transaction/analytics processing architecture.

Hybrid memory architectures enable digital transformation in fundamentally different ways. They deliver simplicity, faster time to market, business agility, competitive advantage, and the lowest total cost of ownership, Bulkowski explained.

Sahu described big data as the next frontier for innovation, competition, and productivity.

Hadoop systems, with their highly distributed storage and compute, change the game, Sahu explained.

The platform is composed of open source software, scales out indefinitely without bottlenecks, makes it easy to ingest big data, provides agile data access, and is affordable.

With a multitude of options available, companies need to pick the right tool for the right purpose. Hadoop, with its support for multiple workloads, is a solid candidate for a single, reliable platform for distributed data processing and high-volume data storage, according to Sahu.

A unified integration platform for big data will help companies realize rapid time to value while simplifying their IT landscape. Companies can also limit shadow IT by empowering business analysts with self-service.

Leach recommended Splice Machine as a potential tool of choice for analyzing and handling big data.

Splice Machine provides a hybrid, cloud-capable platform that gives users the ability to run transactional workloads at scale with full ACID compliance, analytical workloads with in-memory processing, and full ANSI-SQL compatibility to power existing applications.
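As a rough illustration of the kind of workload mix described here, the sketch below runs an atomic (ACID) multi-row insert followed by an analytical aggregate over the same data using ANSI SQL. It uses Python's standard sqlite3 module purely as a stand-in for any ANSI-SQL engine; Splice Machine itself is accessed through its own drivers, and the table and values are invented for the example.

```python
import sqlite3

# In-memory database standing in for any ANSI-SQL engine (hypothetical schema).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL)")

# Transactional workload: the multi-row insert commits atomically,
# or rolls back entirely on failure (the "A" and "C" in ACID).
try:
    cur.executemany("INSERT INTO orders (amount) VALUES (?)",
                    [(10.0,), (25.5,), (7.25,)])
    conn.commit()
except sqlite3.Error:
    conn.rollback()  # all-or-nothing on failure
    raise

# Analytical workload: an aggregate query over the same table.
cur.execute("SELECT COUNT(*), SUM(amount) FROM orders")
count, total = cur.fetchone()
print(count, total)  # 3 42.75
conn.close()
```

The point of the sketch is that both the transactional write path and the analytical read path speak the same standard SQL, which is what lets existing SQL applications move onto such a platform without rewrites.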

An archived on-demand replay of this webinar is available here.