In the 1958 IBM Journal article that is generally acknowledged as the first usage of the term “business intelligence,” author Hans Peter Luhn described the challenges and goals of the BI community in terms that are profoundly resonant nearly a half-century later.
Luhn wrote: “Information is now being generated and utilized at an ever-increasing rate. At the same time the growth of organizations and increased specialization and divisionalization have created new barriers to the flow of information. There is also a growing need for more prompt decisions at levels of responsibility far below those customary in the past. Undoubtedly the most formidable communications problem is the sheer bulk of information that has to be dealt with.”
Sound familiar? It may come as small comfort to today’s business owners to know that, long before personal computers and the Internet, the BI community felt overwhelmed by the need to collect, organize, and distribute a seemingly infinite array of data. And while BI developers have made great strides in the sophistication with which they deliver solutions to their clients, we have a long way to go to bring our tool sets up to the standards enabled by today’s increasingly visual and interactive Web.
The essential question we need to ask ourselves is: “Now that we can get unlimited amounts of data up to the user’s screen, how do we get it from the screen to the brain?” Think of this as the last 18 inches: the distance between the computer monitor and the user.
Traditionally, BI vendors have bridged this divide by developing systems that reflect a set of data that resides in a database. While this is inherently the goal of any BI system, we need to remember that data does not start out as data. The lifecycle of any set of data begins as something more tangible: a fact, an idea, a plan.
Once a group of people agree that they need to store a particular set of data, they tend to strip it of its context and turn it into digital information, much like breaking a sentence into letters. Most data management systems simply reflect these component pieces back through static cells. What we need to focus on instead is putting the story back into the data.
A New Generation
The most effective way to achieve this is by employing a new generation of business intelligence capabilities known as “interactive visual analysis.” Leveraging the human ability to recognize patterns, interactive visual analysis uses time-based Gantt charts, bubble charts, heat maps, and newer visual metaphors to aid users in understanding information.
While graphical elements such as line and pie charts might be included in a static report, they display data in only one or two dimensions and cannot show relationships with data in other reports. More sophisticated analysis requires viewing additional reports or even requesting custom ones. By providing a fluid interaction model to drill down for explanation, zoom out for context, and overlay pertinent dimensions as if the data were a map, interactive visual analysis enables the user to quickly and easily create meaningful data perspectives without calling on IT staff.
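The drill-down and zoom-out operations described above can be sketched in a few lines of code. The sketch below is purely illustrative: the record fields, dimension names, and sales figures are invented for the example, and a real interactive tool would attach these operations to mouse clicks rather than function calls.

```python
from collections import defaultdict

# Hypothetical transaction records (illustrative data, not from the article).
records = [
    {"region": "East", "product": "A", "sales": 120},
    {"region": "East", "product": "B", "sales": 80},
    {"region": "West", "product": "A", "sales": 200},
    {"region": "West", "product": "B", "sales": 50},
]

def rollup(rows, dimension):
    """Zoom out: aggregate the sales measure over one dimension for context."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[dimension]] += row["sales"]
    return dict(totals)

def drill_down(rows, dimension, value, sub_dimension):
    """Drill down: filter to one value, then re-aggregate at a finer grain."""
    subset = [r for r in rows if r[dimension] == value]
    return rollup(subset, sub_dimension)

overview = rollup(records, "region")                        # high-level view
detail = drill_down(records, "region", "East", "product")   # one region, by product
```

The point of the sketch is that each user gesture maps to a small, composable query; the same two primitives can be chained to overlay or pivot any dimension without a custom report.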
Attention to visualization gives software a more coherent, put-together “look,” but the goal is not to come up with the most attractive interface. This is not about lipstick or cosmetic surgery. Rather, visualization actually provides the link between the data and its relevance. In doing so, it gives the user a framework through which to access, interpret, and re-purpose complex data. The data has a narrative, and good visualization helps to convey that narrative to the end-user.
I often draw an analogy between BI software and architecture, which is the field in which I was trained. Typically, a BI team lays a foundation for a data management system, and only after this foundation is in place does the team start thinking about what to do with the data they’ve captured. We need to change this approach. In order to build a robust and useful BI solution, we need to know what the building will look like before we set the foundation.
We also need to take a holistic view of how data functions within an organization. People have come to realize that data warehouses are living, breathing things. They need to evolve as organizations and businesses evolve. Accordingly, the solutions that provide a window into the data need to scale with the needs of individual users and enhance productivity across functional teams. In order to do this, we need to have a clear sense of the total data schema and understand how it will show up in the views of users with different functions, deadlines, and priorities. If a user sees something red, the next logical questions might be: “What makes up that redness?” “Exactly how red is it?” “How long has it been red?”
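Those three “red” questions can each be answered directly against the underlying data. The following is a minimal sketch, in which the metric names, threshold, component values, and history are all hypothetical, chosen only to make the questions concrete.

```python
# Hypothetical KPI for one warehouse metric; all names and numbers are assumptions.
THRESHOLD = 100  # a value above this threshold renders the cell red

components = {"returns": 60, "shipping_delays": 45, "defects": 10}
history = [80, 95, 110, 120, 115]  # prior daily readings, oldest to newest

# "What makes up that redness?" -> the component breakdown behind the total.
current = sum(components.values())

# "Exactly how red is it?" -> the distance above the threshold.
severity = current - THRESHOLD

# "How long has it been red?" -> the trailing run of readings above threshold.
days_red = 0
for value in reversed(history + [current]):
    if value <= THRESHOLD:
        break
    days_red += 1
```

A visualization layer that exposes these quantities on click, rather than burying them in a batch report, is exactly the tightened loop between brain and database argued for below.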
From “Reporting” to Analysis
Another key to advancing the state of the art in business intelligence is moving away from a “reporting” mindset and toward an “analysis” mindset. At many companies, analysis happens far from the presentation of data. A request for analysis typically produces a “report,” which is often a bulky document or printout that can take anywhere from days to weeks to produce, and it usually only addresses an isolated part of the problem at hand.
The legacy of report-based methodologies begs for a more dynamic and fluid alternative, and the combination of structurally sound software and superior visualization tools can deliver it.
Along with a more visual approach to business intelligence solutions, we also need to emphasize transparency in the calculations that take place inside the so-called “black box.” Rather than deliver a solution based on complex calculations that take place out of sight, we should put open-ended analytical tools in the hands of knowledge workers so that they can “see” the solution. By tightening this interactive loop between the human brain and the database, we actually reach a point in which the visuals become the analytics.
At big companies and small companies alike, knowledge workers are the biggest assets. Successful businesses need to empower this group of people - who, after all, have direct expertise and domain knowledge - as opposed to relying primarily on technicians to run analyses. The people who know what to do with the data are often too far from the data to make use of it. There are often many levels of infrastructure between them, and that makes for an imperfect translation layer. It’s like playing telephone.
On the other hand, interactive visual analysis extends an organization’s reporting and decision-making capabilities. By combining data into elegant, interactive pictures that can equate to hundreds of static reports, interactive visual analysis enables users to make more informed, more confident decisions faster. Organizations that embrace these new paradigms of data analysis will gain a significant competitive advantage in their markets, as users become more efficient and fewer questions go unanswered.