Bridging the Data Divide: Getting the Most Value From Data With Integration


Accordingly, the Unisphere-IOUG survey finds that decision makers spend, on average, 25% of their time locating data—from discovering it to moving it into an appropriate place to be analyzed. These challenges reflect back on the data warehouse-dominated data delivery strategies most organizations have in place and the outdated mechanisms associated with them. It’s time for a new approach to data delivery that is aligned with today’s digital realities.

Complexity is an issue that stymies many data integration efforts in this new environment. “Precisely because there are so many tools, and big data technologies evolve so rapidly, the burden to make everything work together, while keeping up with the latest frameworks and releases, is putting a real strain on organizations,” said Tendü Yogurtçu, general manager of big data at Syncsort. “We’re seeing a strong desire to have a single software environment to manage both streaming and batch processing, while protecting them from having to redesign their data integration jobs when a new compute framework, such as Spark, comes along.”
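
Yogurtçu’s point about insulating integration jobs from framework churn can be illustrated with a minimal sketch. The Engine, BatchEngine, StreamingEngine, and IntegrationJob names below are hypothetical assumptions, not any vendor’s product; the sketch only shows how a job defined once could run on either a batch or a streaming backend, so adopting a new framework means adding an engine rather than redesigning the job.

```python
# Hypothetical sketch: the integration job is described once, and pluggable
# "engines" execute it in batch or streaming mode.
from abc import ABC, abstractmethod
from typing import Any, Callable, Iterable


class Engine(ABC):
    """An execution backend (e.g., a batch runner or a streaming runner)."""

    @abstractmethod
    def run(self, source: Iterable[Any], steps: list) -> list:
        ...


class BatchEngine(Engine):
    def run(self, source, steps):
        records = list(source)              # materialize the whole dataset first
        for step in steps:
            records = [step(r) for r in records]
        return records


class StreamingEngine(Engine):
    def run(self, source, steps):
        results = []
        for record in source:               # process each record as it arrives
            for step in steps:
                record = step(record)
            results.append(record)
        return results


class IntegrationJob:
    """Framework-agnostic job definition: only the transformation logic lives here."""

    def __init__(self, steps: list):
        self.steps = steps

    def execute(self, source: Iterable[Any], engine: Engine) -> list:
        return engine.run(source, self.steps)


# The same job runs unchanged on either engine; supporting a new compute
# framework means adding an Engine subclass, not rewriting the job.
job = IntegrationJob(steps=[str.strip, str.lower])
print(job.execute(["  Alice ", " BOB"], BatchEngine()))      # ['alice', 'bob']
print(job.execute(["  Alice ", " BOB"], StreamingEngine()))  # ['alice', 'bob']
```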

Industry observers make the following recommendations for bringing data integration into the new enterprise realm:

This Is Not Your Father’s Data Integration

It’s time to rethink the definition of the term “data integration,” which continues to evolve. “Integration in the past was connecting two systems, sharing and moving data,” said Chris McNabb, general manager of Dell Boomi. “Today, as you integrate, you get into a much broader definition. Now that integration can be done more easily, you have to make sure that everyone who is asking for data is actually authorized to see the data they are requesting. When you think about an integration platform 3 or 5 years from now, it will have management, governance, and analytic capabilities.”
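
McNabb’s point that an integration platform should confirm that whoever asks for data is actually authorized to see it can be sketched in a few lines. The policy table, roles, and function names below are illustrative assumptions only, not a real governance API; a production platform would back the check with its own catalog and identity system.

```python
# Hypothetical sketch: the integration layer checks authorization before moving data.
from dataclasses import dataclass


@dataclass
class DataRequest:
    user: str
    dataset: str


# Illustrative policy and role tables (assumptions for this sketch).
ACCESS_POLICY = {
    "customer_pii": {"compliance_team"},
    "sales_orders": {"compliance_team", "analytics_team"},
}

USER_ROLES = {
    "alice": {"analytics_team"},
    "bob": {"compliance_team"},
}


def authorize(request: DataRequest) -> bool:
    """Return True only if the user holds a role allowed to see the dataset."""
    allowed_roles = ACCESS_POLICY.get(request.dataset, set())
    return bool(USER_ROLES.get(request.user, set()) & allowed_roles)


def integrate(request: DataRequest) -> str:
    if not authorize(request):
        raise PermissionError(f"{request.user} is not authorized for {request.dataset}")
    # ...the actual extract/move/transform work would happen here...
    return f"delivering {request.dataset} to {request.user}"


print(integrate(DataRequest("alice", "sales_orders")))   # allowed
# integrate(DataRequest("alice", "customer_pii"))        # would raise PermissionError
```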


While industry experts encourage greater automation and machine learning to rapidly assemble information, it’s also critical that end users be able to assemble insights quickly and easily.


Accordingly, the “era of the old-school client/server model supported by consultants who manually code in SQL is over,” said Ran Sarig, CEO and co-founder of Datorama. “It’s being replaced by cloud-based solutions, which utilize machine learning to clean, analyze, and present data.” Many of the existing time-consuming, manual drag-and-drop methods for working with and reporting on data in a self-service fashion cannot keep pace with newer information dynamics that require instantaneous adaptation, he continued.

Essentially, data integration has become much more than the label implies. As Chatelain put it, “The value of data integration can continue to increase when an organization looks beyond just the action of integrating data. Data integration should be seen as an element of a complete end-to-end data supply chain—that begins when data is created, imported, or integrated with other data. It then moves through the organization and ends when data is analyzed to create actionable, valuable business insights—such as ideas for a new product or service, or improved consumer engagement or new globalization strategies.”

Rethink Your Management Structure and Workflows

Ultimately, the success of enterprise data integration from this point forward relies more on the sense of ownership promoted by management than on any technology solutions that get dropped in place. The biggest trend that will hit data “isn’t going to be technology-related, but instead will focus on the process and people sides of the holy triumvirate,” said James Quin, senior director of content and C-suite communities for CDN Media. “We’re already starting to see the early phases of this change now, but what will take hold in pretty much every shop next year is the fragmentation, or segmentation, if you prefer, of responsibilities when it comes to enterprise data. While initially titles such as chief analytics officer and chief data officer were kind of interchangeable, we’re really seeing that analytics is very much becoming a business role and a business responsibility, and governance or management is becoming an IT responsibility.”

Think Real Time, Not Batch

Real time has become a vital part of the data integration equation. Fifty-seven percent of managers and professionals in the Unisphere-IOUG survey state that there is now strong demand for delivery of real-time information within their organizations. Fewer than one-third, however, say they are currently capable of delivering most of their data in real time. “The old batch, schedule-driven approaches cannot cope with demands from the business for ever more current and complete information,” said Steve Wilkes, CTO and co-founder at Striim. “The value of data and, by implication, data integration, is directly related to the timeliness and context associated with the data.”
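
The contrast Wilkes draws between schedule-driven batch delivery and real-time delivery can be sketched as follows. The capture_changes, deliver_realtime, and deliver_batch functions are hypothetical stand-ins for this illustration, not Striim’s API: the real-time path hands each change to the target as soon as it is captured, while the batch path stages everything and waits for the next scheduled window.

```python
# Schematic contrast between real-time and schedule-driven batch delivery.
import time
from datetime import datetime


def capture_changes():
    """Stand-in for a change feed from a source system; changes trickle in over time."""
    for change in ({"order_id": 1, "status": "paid"},
                   {"order_id": 2, "status": "shipped"}):
        yield {**change, "captured_at": datetime.now().isoformat(timespec="seconds")}
        time.sleep(0.1)


def deliver_realtime(changes):
    """Real-time path: apply each change the moment it arrives."""
    for change in changes:
        print(f"[stream] applied {change} immediately")


def deliver_batch(changes, window_seconds=0.5):
    """Batch path: stage everything, then apply it at the scheduled window."""
    staged = list(changes)          # accumulate until the window opens
    time.sleep(window_seconds)      # wait for the scheduled run
    for change in staged:
        print(f"[batch]  applied {change} after the window")


deliver_realtime(capture_changes())
deliver_batch(capture_changes())
```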

