Overcoming Big Data Integration Challenges

With the variety of big data comes complexity, and enterprises face the challenge of extracting and combining that data to form meaningful insights.

The challenges involved with data integration are growing and, in a recent DBTA webinar, Kevin Petrie, senior director and technology evangelist at Attunity, and Sreevatsan Raman, head of engineering at Cask Data, highlighted new technologies and techniques for overcoming data silos and enabling the delivery of information across the enterprise.

Challenges include the complexity and cost of moving data across platforms, the lengthy time to value in analytics caused by manual processes, and the lack of insight needed to manage data effectively, according to Petrie.

Petrie said the Attunity platform can effectively address these issues by accelerating data delivery, empowering rapid utilization of data, and continuously improving data management.

Attunity Replicate gives users centralized control, targets a wide range of data sources and types, and performs high-speed data transfer.

Kafka is an emerging technology that can also help, he added. The platform can broaden a user’s ecosystem, move high data volumes from many sources through message brokers, and enable massive data consolidation.
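The publish/subscribe pattern described above can be sketched in a few lines. This is a minimal illustration using Python's standard-library queue as a stand-in for a message broker; the topic name, record shapes, and function names are illustrative assumptions, not Kafka's or Attunity's actual API.

```python
# Sketch of the publish/subscribe pattern that underpins message brokers
# like Kafka: many producers publish to a topic, one consumer consolidates.
# An in-memory queue stands in for the broker; names here are illustrative.
import queue

broker = {"orders": queue.Queue()}  # one "topic" on an in-memory broker


def produce(topic, record):
    """Each source system publishes records to a topic on the broker."""
    broker[topic].put(record)


def consume_all(topic):
    """A downstream consumer drains the topic, seeing data from all sources."""
    records = []
    q = broker[topic]
    while not q.empty():
        records.append(q.get())
    return records


# Two independent sources publish to the same topic...
produce("orders", {"source": "crm", "id": 1})
produce("orders", {"source": "erp", "id": 2})

# ...and a single consumer reads the consolidated stream.
consolidated = consume_all("orders")
print(consolidated)
```

A real Kafka deployment adds durability, partitioning, and replay, which is what makes the consolidation of high data volumes practical at enterprise scale.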

Foundations for other effective data integration solutions include support for data variety, customizable cleansing, multiple modes of delivery, timely access to data, the ability to handle current and emerging use cases, and tight data governance, Raman explained.

To watch a replay of this webinar, go here.

