For years, data warehouses and extract, transform and load (ETL) have been the primary methods of accessing and archiving multiple data sources across enterprises. Now, an emerging approach – data virtualization – promises to advance the concept of the federated data warehouse to deliver more timely and easier-to-access enterprise data.
Those were among the observations made at Composite Software's third annual Data Virtualization Day, held in New York City. This year's gathering was the largest ever, with nearly 250 customers and practitioners in attendance, Composite reports.
Composite CEO Jim Green and CTO David Besemer opened and wrapped up the event with roadmaps of the vendor's data virtualization vision. “Data virtualization revives the original goals of data warehousing,” Besemer pointed out. “The idea that I can access all of my data without having to rip and replace all the physical systems we've built all these years.”
Data virtualization represents the answer to the “new normal” that is now seen across many enterprise IT and data environments – with big data, new client devices, predictive analytics, and self-service business intelligence at the forefront of corporate plans, Besemer said. “Data virtualization is an alternative to ETL, but that's only the tip of the iceberg.”
Rick van der Lans of R20/Consultancy provided an overview of the role the technology – typically delivered as a data virtualization server – plays in enterprises, noting that it supplements rather than replaces data warehouses. “Most production systems don’t track history,” he said. “You need historical data for analysis. Data virtualization is not about throwing away data warehouses.” In addition, he added, “data warehouses make an excellent, cleansed data source for the data virtualization layer.”
To illustrate the role of a data virtualization layer, van der Lans compared the technology to the service window in a restaurant – customers place their specific orders, but do not have to be concerned about what goes on on the other side of the window in the kitchen.
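That service-window idea can be sketched in a few lines of code: a virtual view joins live sources at query time instead of copying their data into a new store. The sketch below is purely illustrative – the source names and the view function are hypothetical, and it is not a representation of Composite's product or any vendor's actual API.

```python
# Illustrative sketch of a data virtualization layer: a virtual view
# federates live sources at query time instead of copying data via ETL.
# All names here are hypothetical.

crm_system = [  # a "production" source: current state only, no history
    {"customer": "Acme", "region": "East"},
    {"customer": "Globex", "region": "West"},
]

warehouse = [  # a cleansed, historical source feeding the virtual layer
    {"customer": "Acme", "year": 2011, "revenue": 120},
    {"customer": "Acme", "year": 2012, "revenue": 150},
    {"customer": "Globex", "year": 2012, "revenue": 90},
]

def revenue_by_region():
    """A virtual view: joins the sources on demand; nothing is materialized."""
    regions = {row["customer"]: row["region"] for row in crm_system}
    totals = {}
    for row in warehouse:
        region = regions[row["customer"]]
        totals[region] = totals.get(region, 0) + row["revenue"]
    return totals

print(revenue_by_region())  # → {'East': 270, 'West': 90}
```

Like the restaurant customer, the consumer of `revenue_by_region` sees only the combined result; where the rows live and how they are joined stays behind the window.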
Additional speakers included industry analysts Wayne Eckerson and Claudia Imhoff, as well as end-users Martin Brubeck, CTO at Pearson, Peter Armstrong, Vice President at Comcast Business Intelligence, and Michael McNabb, SVP of Global Business Services at Franklin Templeton.
The chief benefit of data virtualization is faster user access to reports and insights. “Data models have been way too abstract for end-users,” van der Lans pointed out. “And they don’t ask for data warehouses, and don’t care about data structures. Virtualization helps show users the data itself, and makes it more practical, more down-to-earth for them.”
For more information, visit the Composite Software website.