Data Integration: From Dark Art to Enterprise Architecture


Why is data integration so difficult? For many organizations, it has long been treated as a dark art, implemented behind the scenes with ad hoc scripts, extract, transform, and load (ETL) jobs, connectors, manual coding, and patching. Front-end applications for getting at needed data are often built and deployed one at a time, consuming considerable IT staff time and leaving business decision makers waiting.
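To make the pattern concrete, the following is a minimal, hypothetical sketch of the kind of one-off, hand-coded ETL script just described. It is written in Python against illustrative file, column, and table names rather than any real system; every new source or report request typically means writing another script much like it.

    # A hypothetical one-off ETL script: extract rows from a CSV export,
    # apply hard-coded transformations, and load them into a local reporting
    # table. File, column, and table names are illustrative assumptions.
    import csv
    import sqlite3

    SOURCE_FILE = "sales_export.csv"   # assumed nightly export from a source system
    TARGET_DB = "reporting.db"         # assumed local reporting database

    def extract(path):
        """Read raw rows from the CSV export."""
        with open(path, newline="") as f:
            return list(csv.DictReader(f))

    def transform(rows):
        """Apply cleanup rules wired to this one source's column names."""
        return [
            {"region": row["Region"].strip().upper(),
             "amount": round(float(row["Amount"]), 2)}
            for row in rows
        ]

    def load(rows, db_path):
        """Insert the transformed rows into the reporting table."""
        con = sqlite3.connect(db_path)
        con.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
        con.executemany(
            "INSERT INTO sales (region, amount) VALUES (:region, :amount)", rows)
        con.commit()
        con.close()

    if __name__ == "__main__":
        load(transform(extract(SOURCE_FILE)), TARGET_DB)

Scripts like this work until the export changes shape or the author moves on; nothing about them is reusable across sources, which is precisely the brittleness the rest of this article addresses.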

This one-off, manual approach to data integration will not work in today's competitive global economy. Decision makers need timely, consistent information at a moment's notice, yet they are held back by their organizations' outdated data integration systems and methods. Requested reports may be delayed for weeks, if not months, by the time it takes to develop hand-coded scripts to deliver them. Adding new data sources, whether at users' request or as the result of a merger or acquisition, can take just as long.

To address this challenge, there is a renewed push across the industry to elevate data integration from a series of one-off projects shrouded in mystery to the core of a multidisciplinary enterprise architecture, one that incorporates new and existing approaches such as master data management, data virtualization, and data integration automation. By bringing enterprise architectural sensibilities to data integration, organizations can turn it into a process for innovation and the delivery of productive change. The key is to bake data integration into the enterprise as a repeatable, sustainable process rather than a set of bolted-on projects.


For more articles on this topic, access the special section "DBTA Best Practices: Data Integration, Master Data Management, and Data Virtualization."


Today’s 24/7 enterprises require a well-designed, next-generation data integration architecture capable of delivering trusted data from any and all sources, regardless of format, context, or file size, to decision makers’ interfaces. Such an architecture must accommodate changes to either the back-end data sources or the front-end interfaces.

Most organizations rely on traditional relational databases, data warehouses, and business intelligence tools to store, manage, access, and decipher analytical data. However, a survey of 338 data managers conducted by Unisphere Research, a division of Information Today, Inc., and sponsored by Attunity, finds that new strategies are emerging as well. For example, high-volume data sites are increasingly adopting next-generation solutions such as cloud-based storage, appliances, and data virtualization to move data more rapidly across the enterprise to the decision makers who need it (“Moving Data: Charting the Journey from Batch to Blazing,” May 2012).

The journey to achieving next-generation data integration is a long-term one, through which organizations gradually build on their people, processes, and technologies to achieve a highly functioning enterprise architecture. At the start of the journey, organizations may be hampered by the absence of any data integration architecture at all. Instead, what prevails is a project-based data culture, devoid of systematic, repeatable, automated processes for data integration.

The risk is that only a few data managers or administrators, or even a single individual, carry all the knowledge of how things work in their heads. Once these individuals leave the company, or are simply away on vacation, that working knowledge goes with them, leaving the enterprise stranded.
