April 2013


Trends and Applications

In business, the rear-view mirror is always clearer than the windshield, said the sage of Omaha. That is particularly true of business intelligence, which is composed almost entirely of such retrospectives. Consider this: business intelligence proffers neatly organized historical data as a potential source of hindsight. Of course, there are also dashboards of happenings in the "now," but precious little in the way of prompts to timely action. The culprit is often the time required to traverse the path from data to insight to intelligence to ideas to implementation to results. It's nowhere near quick enough, especially for businesses like banking, telecommunications, and healthcare that set great store by the time value of information and the money value of time.

Cloud computing has become a mainstream business technology strategy that is delivering the agility and flexibility that businesses need to move forward. To meet the requirements that cloud computing brings to enterprises, new breeds of databases are emerging—either running in the cloud, or designed to optimize enterprise cloud computing.

Big data has unceremoniously ended the era of the "all-purpose database." The days of sticking uniform data into a single database and running all your business applications off it are gone. Business data today comes in a variety of formats, from countless sources, in huge volumes and at fantastic speeds. Some data is incredibly valuable the instant it arrives; other data is only valuable when combined with large amounts of additional data and analyzed over time.

Columns - Notes on NoSQL

The term "NoSQL" is widely acknowledged as an unfortunate and inaccurate tag for the non-relational databases that have emerged in the past five years. The databases associated with the NoSQL label have a wide variety of characteristics, but most reject the strict transactions and stringent relational model that are explicitly part of the relational design. The ACID (Atomicity, Consistency, Isolation, Durability) transactions of the relational model make it virtually impossible to scale across data centers while maintaining high availability, and the fixed schemas defined by the relational model are often inappropriate in today's world of unstructured and rapidly mutating data.
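The schema-flexibility point can be illustrated with a minimal sketch in plain Python (hypothetical data, no real NoSQL client): a document collection happily holds records whose fields differ and mutate over time, so queries must tolerate missing attributes rather than lean on a fixed schema.

```python
# A schemaless "collection" modeled as a plain list of dicts.
collection = []

def insert(doc):
    """Append a document; no fixed schema is enforced on its fields."""
    collection.append(doc)

# Two "user" documents with different shapes coexist in one collection.
insert({"id": 1, "name": "Ada", "email": "ada@example.com"})
insert({"id": 2, "name": "Lin", "tags": ["admin"], "last_login": "2013-04-01"})

# Queries must handle absent fields (dict.get) instead of assuming them.
emails = [doc.get("email") for doc in collection]  # [..., None]
```

The trade-off is that the burden of interpreting each record's shape moves from the database schema into application code.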

Columns - Database Elaborations

Dimensions are the workhorses of a multidimensional design. They provide the context through which the numeric content is analyzed. It is through the use of dimensions that the metrics can be sliced, diced, drilled down, filtered and sorted. Many people relate to dimensions by thinking of them as reference tables. Such thoughts aren't exactly accurate. A dimension groups together the textual/descriptor columns within a rationalized business category. Therefore, much of the content coming from relational tables may be sourced from reference tables, but the relationship between each source reference table and the targeted dimension is unlikely to be one-for-one. These grouped-format dimensions often contain one or more hierarchies of related data items used within the OLAP queries supported by the structures.
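The slicing and hierarchy ideas above can be sketched in a few lines of plain Python (hypothetical store/sales data): a dimension table carries the descriptive attributes, including a region-to-city hierarchy, and the facts roll up through it at whatever level the query asks for.

```python
from collections import defaultdict

# Dimension table: surrogate key -> descriptive attributes,
# including a simple hierarchy (region above city).
store_dim = {
    1: {"city": "Austin", "region": "South"},
    2: {"city": "Boston", "region": "North"},
    3: {"city": "Dallas", "region": "South"},
}

# Fact table: (store surrogate key, sales amount).
sales_facts = [(1, 100.0), (2, 250.0), (3, 75.0), (1, 40.0)]

def rollup(level):
    """Aggregate the metric up the dimension hierarchy to `level`."""
    totals = defaultdict(float)
    for store_key, amount in sales_facts:
        totals[store_dim[store_key][level]] += amount
    return dict(totals)

by_region = rollup("region")  # drill up
by_city = rollup("city")      # drill down
```

The same facts answer both queries; only the dimension attribute chosen for grouping changes, which is exactly the slice/dice/drill behavior the column describes.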

Columns - DBA Corner

"Big data" and the impact of analytics on large quantities of data are a persistent meme in today's Information Technology market. One of the big questions looming in IT departments about big data is what, exactly, it means in terms of management and administration. Will traditional data management concepts such as data modeling, database administration, data quality, data governance, and data stewardship apply in the new age of big data? According to analysts at Wikibon, big data refers to datasets whose size, type and speed of creation make it impractical to process and analyze with traditional tools. So, given that definition, it would seem that traditional concepts are at the very least "impractical," right?

Columns - SQL Server Drill Down

The best database benchmarks are those that accurately and reliably reflect the applications and configuration of your own database infrastructure. On the other hand, the amount of work that goes into extracting your own transactional workload can be immense. An easier route is to learn the TPC benchmarks, use one of the free tools to run them, and then extrapolate the TPC test results for your environments. In light of the past several articles in this column about the TPC benchmarks, you're probably wondering how you can do your own TPC benchmark test. First, a caveat: a "true" TPC benchmark must go through a rigorous and expensive auditing process. So when I say "run your own TPC benchmark," what I really mean is running a "TPC-like" benchmark that contains all of the activities of a regular TPC benchmark, but without the auditing.
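The core of any such "TPC-like" run is simple: drive a transaction repeatedly for a fixed interval and report throughput. A minimal sketch in Python follows; `do_transaction` is a hypothetical stand-in for your real workload (the actual TPC suites define specific transaction mixes, schemas, and reporting rules that this does not attempt to reproduce).

```python
import time

def do_transaction():
    """Placeholder unit of work; swap in real database calls here."""
    sum(range(1000))

def run_benchmark(seconds=1.0):
    """Drive the workload for roughly `seconds` and return throughput
    in transactions per second."""
    completed = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        do_transaction()
        completed += 1
    elapsed = time.perf_counter() - start
    return completed / elapsed

tps = run_benchmark(seconds=0.2)
```

Because nothing here is audited, such numbers are only meaningful for comparing configurations against each other under the same harness, not for publication as TPC results.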

MV Community

Revelation Software plans to include support for OpenID authentication as part of OpenInsight for Web (O4W), a web development toolkit that makes it possible for OpenInsight developers with limited or no HTML, XML or JavaScript experience to develop feature-rich web pages. OpenID allows people to use an existing account to sign in to multiple websites without the need to create new passwords. The new capability will be included in O4W as part of the OpenInsight 10.0 release, which is currently under development, Robert Catalano, director of sales at Revelation, tells DBTA.