Business Intelligence Needs a New Dimension to Deliver the Value It Promises



There is no doubt that the progress of business intelligence (BI) and analytics, predictive modeling, and data science as a whole has been remarkable over the last decade. Tableau, TIBCO Spotfire, IBM Cognos, and SAS, to name just a few, have all made tremendous strides. Jaspersoft, along with Apache projects in the open source arena, has democratized BI applications and models, opening up huge innovation opportunities.

There have also been major advances and amazing growth in visualization and human factors, especially in the exploding field of mobile endpoints. The iPad, its competitors, and their derivatives will be the future of executive interaction and decision making in the enterprise. BI has a totally cool future ahead of it. At the same time, we can probably all agree that the hype around the role of the “data scientist” is overblown and that the promises of big data technologies exceed what the majority of enterprise business problems require.

But immersive BI and decision making are the future. Without them, true business innovation and agility will be unachievable.

There are some real hurdles to overcome to achieve this value. As analytics, models, intelligence creation, and the representation of results have evolved, they have become part of the solution to major business problems. The analysis itself is a small component that is easy to manage and control, which is why, over the last decade, we have seen a revolution in analytical acceleration, complexity, and access. But the way we deploy this value has not changed in nearly three decades. It assumes that everything is neatly packaged and available, so that complex calibration, data, behavioral, and opportunity analysis can be brought to bear. This assumption is the genesis of why so many enterprise BI projects fail. A Standish Group report recently reviewed enterprise projects valued at more than $20 million from 2003 to 2012 and found a success rate of only 6%. Those results prompted McKinsey to conduct a similar review, which indicated that more than half of all such projects fail.

An Antiquated Integration and Implementation Model 

Why is this? We are stuck with an antiquated two-dimensional integration and implementation model that assumes that before value can be realized, we need to get the components of that value into a more stable and normalized condition.

Let’s use an analogy: prior to 2000, BI was one-dimensional, like a train, able to move backwards and forwards between data source and destination. Second- and third-generation analytics added another dimension, helping to calibrate different sources and support more computationally intense models. Compare these to a car: two-dimensional motion adds lateral movement to the train’s one-dimensional tracks. Yet both modes of transport require pre-defined “paths,” designed by someone else in the hope of satisfying travelers’ needs. The same deployment requirement forces analysts to wait for the data to be laid out before analytical tools can be applied effectively.

Therein lies the challenge. This two-dimensional approach demands a huge amount of data preparation, infrastructure, and basic physical and organizational facilities before these tools can operate effectively. Yet the elements that make analytics and BI effective are increasingly distributed, and they continue to expand as the Internet of Things increases complexity, size, distribution, and heterogeneity. Under the existing deployment approach we are stuck, figuratively, building more roads so that our analytics can reach the data and value sources that keep becoming available. This is not a big data issue. It is an issue of increasingly distributed data and applications, and of our ability to effectively access, process, and distribute valuable results across a complex environment.

A third dimension is critical for analytics and BI to stay effective against the growth of data and activities at the edge. Returning to the transportation analogy, what is needed is something like an aircraft, able to fly directly to wherever data and sources of value are located. Backwards, forwards, left and right are now augmented by up and down.

Imagine analytics in three dimensions: complete, unfettered navigation around the enterprise, with the flexibility to take the most efficient route to one’s destination. This can only be achieved by decoupling analytics from systems and data integration prerequisites, freeing them from the need for prebuilt roads or tracks. Only by unlocking analytics from the deployment model that has held back its delivery of value can BI finally realize the incredible promise it offers.

The data supporting most enterprises is indeed “big,” though not in the sense that some wish us to see it. Social media, digital retail, and telecommunications volumes have certainly gone through the roof. Outside those businesses, and excluding the rapidly maturing healthcare market, most enterprise volumes are growing at much lower rates. But data is still creating “big business problems.” These dilemmas arise not from volume but from the distribution of, and access to, that data: it is all over the place, increasing in complexity, and resident in disparate technologies.

Most enterprises do not have a Big Data problem. Instead they have a Distributed Data problem, one accentuated by a legacy value creation model that forces us to shoehorn all these distributed components into a common persistence structure. I’ve been quoted saying that 70 cents of every project dollar is spent identifying, accessing, aggregating, normalizing, and optimizing data before a single penny of value can be created, and that 55 of those 70 cents exactly repeat at least five other projects the enterprise has undertaken in the last three years. On a $10 million project, that means roughly $7 million of preparation before any value is created, $5.5 million of which duplicates work already done.

Within this legacy value creation model, data must be moved, organized, and optimized before analytics can create value. The two-dimensional model will become increasingly untenable, because time-to-value grows as a multiple of the number and complexity of data sources, which continue to expand outwards. So yes, data is big in its access complexity: it is monolithic, difficult to access, and difficult to move and centralize. Analytics, and even applications, are not. They are relatively simple to manage and move, so why do we keep trying to move the big stuff to the small stuff?
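
To make the contrast concrete, here is a minimal sketch, in Python, of moving the analytic to the data rather than the data to the analytic. The node names and in-memory data are hypothetical stand-ins for remote systems; a real deployment would ship the same small function across the network and combine only the partial results.

```python
# A sketch of moving the analytic to the data: each node runs a small
# aggregation locally and returns only a partial result, so no raw data
# is centralized before value is created. The nodes are simulated in
# memory here; in practice they would be remote, heterogeneous systems.

from typing import Dict, List

# Hypothetical distributed sources, each holding its own order amounts.
NODES: Dict[str, List[float]] = {
    "retail_eu": [120.0, 75.5, 310.0],
    "retail_us": [89.9, 240.0],
    "web_orders": [15.0, 42.5, 63.0, 8.0],
}

def local_aggregate(values: List[float]) -> Dict[str, float]:
    """The analytic that travels to the data: tiny and easy to move."""
    return {"count": len(values), "total": sum(values)}

def federated_average() -> float:
    """Combine partial results instead of combining raw data."""
    partials = [local_aggregate(values) for values in NODES.values()]
    count = sum(p["count"] for p in partials)
    total = sum(p["total"] for p in partials)
    return total / count

if __name__ == "__main__":
    print(f"Global average order value: {federated_average():.2f}")
```

Note what never happens in this sketch: the order records themselves are never copied to a central store. Only the counts and totals move, which is the point of adding the third dimension.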

Assume for a moment that a large share of business problems are not volume-burdened, and that “big business problems” are instead a function of the distribution of, and access to, the appropriate data and assets. A fair estimate is that this category covers around 60%-70% of current business challenges, including customer understanding and management, risk management, regulatory reporting, HR management, workflow, and operational optimization.

One Solution Is Needed

To avoid repeating the failures of the past we need one solution: analytics must be made mobile and aligned with the way data is proliferating across the expanding ecosystem. Distributed analytics need to move to the data and orchestrate the creation of intelligence quickly and simply. Analytics that are deployment- and technology-agnostic must deliver value without the prerequisite of large systems and data integration projects. This runs completely counter to the way analytics have been deployed in the past, and there are huge vested interests determined to keep those inefficiencies in place.
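
As an illustration of what deployment- and technology-agnostic could mean in practice, here is a minimal sketch, again in Python, in which one analytic is written against a plain stream of records and thin adapters expose each source. The table name, column names, and the in-memory list standing in for an API response are all hypothetical.

```python
# A sketch of a technology-agnostic analytic: the computation is written
# once against an iterator of (customer, amount) records, and small
# adapters expose each source without a prior integration project.

import sqlite3
from typing import Dict, Iterable, Iterator, Tuple

Record = Tuple[str, float]

def sql_source(conn: sqlite3.Connection) -> Iterator[Record]:
    # Adapter for a relational store (hypothetical 'orders' table).
    yield from conn.execute("SELECT customer, amount FROM orders")

def api_source(payload: Iterable[dict]) -> Iterator[Record]:
    # Adapter for a remote service; the payload stands in for its response.
    for row in payload:
        yield (row["customer"], row["amount"])

def spend_by_customer(records: Iterable[Record]) -> Dict[str, float]:
    # The analytic itself: small, movable, and indifferent to the source.
    totals: Dict[str, float] = {}
    for customer, amount in records:
        totals[customer] = totals.get(customer, 0.0) + amount
    return totals

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)",
                     [("acme", 120.0), ("globex", 75.5)])
    api_payload = [{"customer": "acme", "amount": 42.5}]

    for source in (sql_source(conn), api_source(api_payload)):
        print(spend_by_customer(source))
```

The design choice here is that the adapters, not the analytic, absorb the heterogeneity: adding a new source means writing one small generator, not running an integration project.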

Enterprise problems, and the components needed to solve them, are increasingly distributed. They are big problems, but not necessarily "big data" problems, and they sit increasingly far from the traditional methods of understanding and solving them. Centralized data warehousing and analytics architectures are under attack by the very trends that made them a household name. In this new normal, the exponential expansion of edge-based compute and data generation will far outstrip the linear expansion of centralized capabilities. For BI to keep up, we must add a new dimension of mobility, data access, and intelligence creation, one that removes the constraints of traditional demands for infrastructure, integration, and deployment. It’s time to let BI fly.

