Bridging the Gap Between Traditional and Big Data with Data Modeling

As more enterprises adopt new technologies such as Hadoop and NoSQL, along with new strategies such as data lakes, to manage fast-growing volumes of highly variable and dynamic data, they need comprehensive plans to “bridge the gap” between platforms and mesh traditional enterprise data with big data.

DBTA recently held a roundtable webcast with experts Debi Patnaik, director of the global product management office at Hitachi Data Systems, and Jordan Martz, director of technology solutions at Attunity, who discussed how to successfully blend these tools together.

“If we are trying to utilize all that data in systems it is near-impossible in terms of return on investment,” Patnaik said.

The true problem statement, Patnaik said, is to derive value from data, and then answer questions that include: What value can it add to the business? What is the quantifiable end goal? What is the focal point – customers, operations, or innovation? What are the short-term versus long-term goals? Does the ROI justify the investment?

According to Patnaik, users should follow three steps: getting the right data, orchestrating a data lake, and using a platform that brings everything together.

The Hitachi platform supports all of these steps, Patnaik explained, and is cloud-ready for users who prefer not to work on premises.

Martz explained that growing business demands now include new and fast-changing analytics needs and an expanding array of data sources, even as users face shorter delivery times.

Attunity’s solution can help, Martz said, as the platform offers an intuitive, guided user interface and helps users load and sync data with ease.

An archived on-demand replay of this webinar will be available here.