Why Big Data is Not a Death Sentence for Data Warehousing


Despite the rise of big data, data warehousing is far from dead. While traditional, static data warehouses may have indeed seen their day, an agile data warehouse — one that can map to the needs of the business and change as the business changes — is quickly on the rise.

Many of today's conversations about big data revolve around volume, and while that is certainly valid, the issue is also about understanding data in context to make valuable business decisions. Do you really understand why a consumer takes action to buy? How do their purchases relate? When will they do it again? Big data on its own is limited when it comes to answering these questions. An agile approach — one that gives even big data a life beyond its initial purpose — is the value data warehousing brings to bear, and it is critical to long-term business success.

For all its hype, big data is still just that — data. Its value will be tied to how well and how quickly your business exploits it. As many companies know all too well, integrating new sources of data for business analytics and reporting still takes too long. In fact, our surveys of data warehouse users and operators show that almost 40% of organizations take one to three months to integrate those sources, and nearly 30% take more than 90 days.

As data volume, variety, and velocity — three defining elements of big data — increase, common sense tells us those numbers will not get better. This is especially critical given that 90% of all data in existence was created in the past two years. Most companies have yet to make the critical transition to treating data as a shared enterprise asset. Those that have made it understand one important thing: any asset has a value, and in most cases that value depreciates over time.

You already know that if your enterprise can't use information effectively and efficiently, you are at a disadvantage. Improving this means getting your data warehouse and master data in order; ensuring both can be updated quickly is a key enabler for extracting full value from big data. Ignoring this means you start at a competitive disadvantage that only worsens over time. Insights from big data are only useful if your organization can directly relate them to sales performance, customer satisfaction, on-time delivery, financial performance, and other key business outcomes.

With the deluge of data coming, it's a certainty that not all of it will be worth retaining or managing. Much of the industry conversation has focused on putting business and IT professionals at the same table to determine which data is relevant. Mining the value of big data is a sustained, iterative process between business and IT; it is the only way to make informed trade-off decisions about where to invest resources. But by their very nature, the opportunities to exploit business value in big data demand speed of response, and that in turn requires a more agile way of working and an agile data foundation.

Consider this: in today's big data world, many organizations rely on facts supplied by hundreds, if not thousands, of their own suppliers, customers, and other business partners. With so many different data sources, it is difficult to maintain the consistency and accuracy of the facts in the data warehouse. Traditional approaches will just not cut it. Instead, organizations in this situation are realizing that they need to govern data with greater agility.

The first step toward that foundation is designing a model that provides context for introducing relevant big data into core business processes and metrics. The next is ensuring the model is flexible and can be extended to incorporate new, unforeseen sources that deepen insight. And finally, it means the model can be modified rapidly, while new data is at its freshest and most relevant.
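To make that idea a little more concrete, here is a minimal sketch in Python (an illustration only, not any vendor's actual implementation) of what such an extensible model can look like: each fact carries its own attribute and source metadata, so a new, unforeseen feed can be loaded and registered on the fly without restructuring anything that already exists.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import Any

    @dataclass
    class Fact:
        """One observation about a business entity, with its source and load time."""
        entity: str                     # e.g. "customer:1042"
        attribute: str                  # e.g. "lifetime_value"
        value: Any
        source: str                     # which supplier, customer, or partner provided it
        loaded_at: datetime = field(default_factory=datetime.utcnow)

    class AgileModel:
        """Toy extensible model: new attributes and sources are metadata additions,
        not structural rebuilds, so existing queries keep working as the model grows."""

        def __init__(self) -> None:
            self.attributes: dict[str, str] = {}    # attribute name -> description
            self.facts: list[Fact] = []

        def load(self, fact: Fact) -> None:
            # Unknown attributes are registered on the fly rather than rejected.
            self.attributes.setdefault(fact.attribute, f"auto-registered from {fact.source}")
            self.facts.append(fact)

        def latest(self, entity: str, attribute: str) -> Any:
            # Reports ask only for the attributes they know; new ones never break them.
            hits = [f for f in self.facts if f.entity == entity and f.attribute == attribute]
            return max(hits, key=lambda f: f.loaded_at).value if hits else None

    # A brand-new partner feed introduces an attribute the model has never seen.
    model = AgileModel()
    model.load(Fact("customer:1042", "lifetime_value", 1830.50, source="crm"))
    model.load(Fact("customer:1042", "social_sentiment", 0.72, source="partner_feed"))
    print(model.latest("customer:1042", "social_sentiment"))   # 0.72

The names here (Fact, AgileModel, and the sample feeds) are invented for the example; the point is simply that extending the model is a metadata operation rather than a schema rebuild.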

Big data has put a spotlight on the challenges organizations face when it comes to data. Data warehousing gets a bad rap from its legacy (pun intended) and, to be fair, it is well deserved. Traditional data warehouses bring to mind words such as “inflexible,” “brittle,” and “prone to break when changes need to be made.” When technologies no longer fulfill their promise, the gut reaction is to declare them “dead” and move on to the next big thing. However, to make big data a big success, organizations will need to depend on their data warehouse to glean business insights and make better decisions. Agile data warehouses provide the mechanism to ensure the right information is available for decision making at the right time. Without them, big data may not survive.


About the author:

Darren Pierce is CTO of Kalido, a provider of agile information management software.
