How to Manage the Next Generation of Big Data Solutions

Page 2 of 2

Evolving into the analytical enterprise is not an overnight process; rather, it is a step-by-step journey, marked by adopting and refining the best available data tools and methodologies.

Here are six key guidelines for aligning your data management people and platforms to successfully complete this transition:

  1. The analytical organization is about the business—no more, no less. The business needs to drive the process, and business and technology teams need to collaborate to move the enterprise forward. Remember, many business leaders now want to compete on analytics, but often don’t understand how best to make this a reality. Management teams and centers of excellence provide a way of elevating the potential of a data-driven culture while transcending organizational inertia or turf battles.
  2. Measure everything. In an analytical enterprise, it’s all about employing data to measure business progress. Important outputs within the analytical enterprise are key performance indicators (KPIs) that track the impact of systems and decisions on critical business requirements, such as revenue growth, customer churn, product returns, and employee turnover.
  3. Trust is everything. All data flowing through the analytical enterprise must be trustworthy. This is accomplished by open lines of communication, as well as redoubled efforts to achieve data quality and security. Within the ETL and data warehousing world, well-established processes have long been employed to identify, deduplicate, and clean data before it reaches decision makers. Part of this collection of best practices is master data management, which helps ensure that there is a single “gold copy” of data being accessed across the enterprise.
  4. Open up data management processes. An analytical enterprise depends on many existing solutions as well as new types of technologies that will enhance its capabilities. For example, open interfaces can accommodate both proprietary and open source approaches, as well as various data types. By standardizing data and data access across the enterprise, it will become easier to identify important data sources, as well as how data can be converted to information that has business value. Such standards should encompass everything from data management tools and platforms to data models, integration processes, and security protocols.
  5. Move from standalone data silos to Data as a Service. A rapidly emerging trend is data virtualization, in which both data and data management functions are abstracted into a standardized service layer that is accessible from across the enterprise. Decision makers (both in the business and IT departments) shouldn’t be caught up in trying to wrest data out of silos, or attempting to learn the ins and outs of complicated database systems. In addition, since business users now come into the enterprise with a range of devices—from smartphones to tablets to PCs—it’s critical that available information be delivered through a well-designed and consistent architecture that separates the client interface layer from the underlying back-end systems.
  6. Build for growth, in the most economical way. The analytical enterprise is a constantly growing organism, and as its power and capabilities catch on, more parts of the business will want to participate. Thus, the volumes and varieties of data generated and consumed will grow as well. Database managers need to be able to quickly and dynamically provision systems to accommodate fast-changing workloads, as well as keep things running smoothly on a 24x7 basis. Often, new resources coming online offer more economical ways to manage big data workloads. Analytical enterprises also are more likely to be taking advantage of easily swappable commodity components, such as PC-class processor servers or blade servers. While it may be expensive to attempt to bring log files, machine-generated data, or test data into a traditional data warehouse and ETL environment, open source frameworks such as Hadoop can accommodate these files at a much lower cost and employ clustered processing to scale as much as needed.
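The KPIs described in guideline 2 are, at bottom, simple calculations over business data. As an illustrative sketch only (the `Customer` record and its fields are hypothetical, not from any particular system), here is how a churn-rate KPI might be computed:

```python
from dataclasses import dataclass

@dataclass
class Customer:
    id: str
    active_at_start: bool  # active at the start of the measurement period
    active_at_end: bool    # still active at the end of the period

def churn_rate(customers):
    """Fraction of customers active at the start of the period
    who were no longer active at the end."""
    at_start = [c for c in customers if c.active_at_start]
    if not at_start:
        return 0.0
    churned = [c for c in at_start if not c.active_at_end]
    return len(churned) / len(at_start)

customers = [
    Customer("a", True, True),
    Customer("b", True, False),  # churned
    Customer("c", True, False),  # churned
    Customer("d", False, True),  # joined mid-period; excluded from the churn base
]
print(churn_rate(customers))  # prints 0.6666666666666666
```

The same pattern generalizes to the other KPIs mentioned above (product returns, employee turnover): define the population, define the event, and report the ratio on a regular cadence.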
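Guideline 3's "gold copy" idea from master data management can be sketched in a few lines. This is a minimal illustration, assuming duplicate records keyed by a hypothetical `customer_id` field with an `updated_at` timestamp; real MDM tools apply far richer matching and survivorship rules:

```python
def gold_copy(records, key="customer_id", ts="updated_at"):
    """Collapse duplicate records into a single 'gold copy' per key,
    keeping the most recently updated version of each record."""
    best = {}
    for rec in records:
        k = rec[key]
        # ISO-8601 date strings compare correctly as plain strings
        if k not in best or rec[ts] > best[k][ts]:
            best[k] = rec
    return list(best.values())

records = [
    {"customer_id": 1, "name": "ACME Corp",  "updated_at": "2013-06-01"},
    {"customer_id": 1, "name": "Acme Corp.", "updated_at": "2014-01-15"},  # newer duplicate
    {"customer_id": 2, "name": "Globex",     "updated_at": "2013-11-20"},
]
print(gold_copy(records))  # two records survive; customer 1 keeps the newer name
```

The survivorship rule here ("latest timestamp wins") is only one option; production deduplication typically also matches on fuzzy name similarity and source-system priority.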
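The Data-as-a-Service layer described in guideline 5 amounts to putting one standard interface in front of many silos. The sketch below is purely illustrative (the class names and routing scheme are invented for this example, not drawn from any product): consumers ask a catalog for a dataset by name and never touch the underlying silo directly.

```python
from abc import ABC, abstractmethod

class DataService(ABC):
    """Standardized service layer: consumers request datasets by name,
    never against a specific silo or database dialect."""
    @abstractmethod
    def fetch(self, dataset):
        ...

class WarehouseBackend(DataService):
    def fetch(self, dataset):
        # In practice this would issue SQL against the data warehouse.
        return [{"source": "warehouse", "dataset": dataset}]

class HadoopBackend(DataService):
    def fetch(self, dataset):
        # In practice this would submit a job to the cluster.
        return [{"source": "hadoop", "dataset": dataset}]

class VirtualizedCatalog(DataService):
    """Routes each named dataset to whichever silo actually holds it,
    hiding the silo boundary from decision makers."""
    def __init__(self, routes):
        self.routes = routes

    def fetch(self, dataset):
        return self.routes[dataset].fetch(dataset)

catalog = VirtualizedCatalog({
    "sales": WarehouseBackend(),
    "clickstream": HadoopBackend(),
})
print(catalog.fetch("clickstream")[0]["source"])  # prints hadoop
```

Because every client (smartphone, tablet, or PC front end) talks only to the catalog, silos can be migrated or replaced behind it without changing the client interface layer.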

No Single Approach is Correct

Every analytical enterprise will be different, and there is no single right solution or approach in terms of technology to get there. Fully one-third of 298 data managers surveyed by Unisphere Research said they have already put approaches into place to preprocess massive data stores to load into existing data warehouses for analysis. (“Big Data, Big Challenges, Big Opportunities,” sponsored by Oracle and produced by Unisphere Research, a division of Information Today, Inc., in partnership with the Independent Oracle Users Group) It’s highly likely these data managers recognize the advantages in integrating new data types with their relational data environments. Hadoop is currently the popular choice among data managers and analysts for ingesting and processing big data, but other solutions could come to the fore within the next few years.

Rather than resting on any single methodology or technology, the analytical enterprise relies on multiple sources and multiple strategies, including data virtualization, data federation, cloud, master data management, and data warehousing. Decision makers and technologists need to be able to change interfaces and data structures as frequently as the business requires. A well-designed analytical enterprise architecture is built on a strategy that employs the best of all data solutions, both old and new.

