
DBTA Best Practices: Data Integration, Master Data Management, and Data Virtualization

Today’s 24/7 enterprises require a well-designed, next-generation data integration architecture.
Why is data integration so difficult? For many organizations, data integration has been treated as a dark art over the years, implemented behind the scenes with ad hoc scripts, extract, transform, and load (ETL) operations, connectors, manual coding, and patching. Front-end applications for accessing needed data are often built and deployed one at a time, consuming considerable IT staff time and leaving business decision makers waiting.
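To make the pattern concrete, the following is a minimal sketch of the kind of ad hoc, hand-coded ETL script the passage describes. All names here (the order data, the reporting table) are hypothetical illustrations, not taken from any particular system:

```python
# Sketch of a one-off ETL script: extract rows from a raw CSV export,
# transform them by hand, and load them into a local reporting database.
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows from a source system export."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: coerce types and filter incomplete records."""
    records = []
    for row in rows:
        if not row.get("amount"):
            continue  # silently dropping bad rows -- typical of ad hoc scripts
        records.append((row["order_id"], float(row["amount"])))
    return records

def load(conn, records):
    """Load: write transformed records into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?)", records)
    conn.commit()

# Hypothetical source export: one row is missing its amount.
raw = "order_id,amount\nA1,19.99\nA2,\nA3,5.00\n"
conn = sqlite3.connect(":memory:")
load(conn, transform(extract(raw)))
row_count, total = conn.execute(
    "SELECT COUNT(*), SUM(amount) FROM orders"
).fetchone()
```

Every new source or report means another script like this, which is exactly the maintenance burden and delay the article goes on to describe.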

This one-off, manual approach to data integration will not work in today's competitive global economy. Decision makers need information, at a moment's notice, that is timely and consistent. However, they are challenged by their organizations' outdated data integration systems and methods. Information may be delayed for weeks, if not months, by the time it takes to develop hand-coded scripts that deliver the requested reports. The process of adding new data sources, whether by user request or due to activity such as a merger or acquisition, could take just as long.


This Best Practice Series is sponsored by Oracle, Attunity, Hit Software and TransLattice
