Establishing a data warehousing or business intelligence environment initiates a process that works its way through the operational applications and data sources across an enterprise. This process focuses not only on identifying the important data elements the business lives and breathes, but also on explaining those elements rationally to business intelligence users. These explanations live in the text of data dictionaries and in the context of the data structures used for answering queries. As new queries are executed and information is dispersed, aspects of the inner workings that may never have been exposed before are revealed. Such exposure increases transparency within the organization.
Usually an organization is thirsting for this new flow of information. The business intelligence machine is sent on the trail of filling in the known gaps, and as those previously missing pieces become available, its success raises everyone's comfort levels. Under other circumstances there may be bumps in the road, such as the unexpected shock when a long-established and trusted report is discovered to contain errors. It may start simply: a special and complex report is migrated into the business intelligence area, and the numbers from the new version vary from those of the original. The differences may be drastic or only slight, but they are significant, significant enough to alter decisions made from the details. The original report's code may be so complex that no one left in the organization truly understands everything going on within it. As the new logic is reviewed, along with all the extraction logic bringing together the source data for the new business intelligence solution, every step can be verified and agreed upon. It can happen that no errors are found in the new process. The organization must then accept that previous decisions were based on information now shown to have been incorrect.
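The comparison between a legacy report and its migrated version can be sketched as a simple reconciliation pass. This is a minimal, hypothetical illustration; the report keys, values, and tolerance below are invented for the example, not drawn from any particular system.

```python
# Hypothetical sketch: reconciling a migrated report against the legacy
# original. Region names, figures, and the tolerance are illustrative.

LEGACY = {"north": 1052.00, "south": 987.50, "west": 2210.75}
MIGRATED = {"north": 1052.00, "south": 991.25, "west": 2210.75}

def reconcile(legacy, migrated, tolerance=0.01):
    """Return rows whose values differ by more than the tolerance,
    or that appear in only one of the two reports."""
    diffs = {}
    for key in sorted(set(legacy) | set(migrated)):
        old = legacy.get(key)
        new = migrated.get(key)
        if old is None or new is None or abs(old - new) > tolerance:
            diffs[key] = (old, new)
    return diffs

print(reconcile(LEGACY, MIGRATED))  # flags the "south" row for investigation
```

Even a slight difference surfaced this way must then be traced through both code paths; as the text notes, the finding may well be that the new process is correct and the old one was not.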
More often than not, an organization knows its own weaknesses. For example, gaps between isolated accounting and order-fulfillment systems, usually patched together with many manual steps, are the kind of everyday terror of which an enterprise may be fully conscious. Should discrepancies arise as fallout from joining data between these systems, it is likely that no one would be overly surprised. However, even when the issues are known, the data warehouse builder needs to remind people that the data warehouse only exposes these issues and brings the actual details to light. Exposure is not the same as the ability to apply a quick fix. While the data warehouse processing may apply business-defined rules to help manage such issues, source system problems and discrepancies can only truly be fixed within the source system.
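The kind of exposure described above often amounts to joining the two feeds on a shared key and reporting what fails to match. A minimal sketch, with entirely invented order ids and amounts, might look like this:

```python
# Hypothetical sketch: exposing gaps between an order-fulfillment feed and
# an accounting feed by comparing on order id. All ids/amounts are invented.

fulfillment = {"A100": 250.00, "A101": 75.00, "A102": 310.00}
accounting = {"A100": 250.00, "A102": 305.00}

# Orders fulfilled but never booked in accounting (an anti-join).
missing_from_accounting = sorted(set(fulfillment) - set(accounting))

# Orders present in both feeds whose amounts disagree.
amount_mismatches = sorted(
    oid for oid in set(fulfillment) & set(accounting)
    if fulfillment[oid] != accounting[oid]
)

print(missing_from_accounting)  # ['A101']
print(amount_mismatches)        # ['A102']
```

Note that this code only reports the discrepancies; correcting the A101 and A102 records, in this illustration, would still have to happen inside the source systems themselves.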
As one scales the rock face of creating a new data mart, data warehouse, or other variation, one must be prepared to handle these circumstances and continue the ascent. One must document the business rules used in transforming data from source to target all along the process flow. These documents may need to be reviewed to reaffirm that, regardless of what an old process may or may not have done, the new process is truly sound. Additionally, it is important to have testing affirm data quality as it passes through every step. A process that both works and has a pedigree that earns everyone's trust creates a solid foundation for growth. Lastly, one must be prepared to continually remind people that the value of the business intelligence area lies in providing transparency into organizational data. Transparency is simply that: exposure, the ability to examine. Transparency is not a quick fix, nor a bandage that hides the reality.
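The idea of testing data quality at every step can be sketched as a checkpoint function threaded between pipeline stages. This is an illustrative assumption of how such checks might be wired in; the field names and rules below are hypothetical stand-ins for an organization's documented business rules.

```python
# Hypothetical sketch of step-level data-quality checkpoints in an
# extract/transform flow; field names and rules are illustrative only.

def check(rows, step):
    """Raise if a documented business rule is violated at this step."""
    assert all(r.get("order_id") for r in rows), f"{step}: missing order_id"
    assert all(r.get("amount", 0) >= 0 for r in rows), f"{step}: negative amount"
    return rows  # pass the data through unchanged so checkpoints chain

# Each stage hands its output through a named checkpoint.
extracted = check([{"order_id": "A100", "amount": 250.0}], "extract")
transformed = check(
    [dict(r, amount=round(r["amount"], 2)) for r in extracted], "transform"
)
print(len(transformed))  # rows that survived both checkpoints
```

Because each checkpoint carries the step name, a failure points to exactly where in the flow the data first violated a rule, which is the pedigree that earns trust in the process.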