DBTA E-EDITION
February 2012



Trends and Applications

While email has been "the smoking gun" in many recent court cases, the new big wave in what is "discoverable" is structured (database) data. Accessing structured data is simpler and much faster than accessing unstructured data, so an e-discovery response that can come from a structured data format is usually much faster than the alternatives and can mitigate the risk of steep fines for delayed response. A combination of stringent regulations and new technology is giving judges and litigators more muscle to subpoena more data. Structured data in any application and database, no matter how old or obsolete, can be used in court as evidence, and increasingly it is being asked for.

Today's organizations must capture, track, analyze and store more information than ever before - everything from mass quantities of transactional, online and mobile data, to growing amounts of "machine-generated data" such as call detail records, gaming data or sensor readings. And just as volumes are expanding into the tens of terabytes, and even the petabyte range and beyond, IT departments are facing increasing demands for real-time analytics. In this era of "big data," the challenges are as varied as the solutions available to address them. How can businesses store all their data? How can they mitigate the impact of data overload on application performance, speed and reliability? How can they manage and analyze large data sets both efficiently and cost effectively?


Columns - Notes on NoSQL

In years to come, we might remember October 2011 as the month the big database vendors gave in to the dark side and embraced Hadoop. That month, both Microsoft and Oracle announced product offerings that incorporate Hadoop as the enabler of their "big data" solutions. The remaining member of the big three database vendors, IBM, had already embraced Hadoop back in 2010.


Columns - Database Elaborations

Lectures on master data bring forth all sorts of taxonomies intended to help clarify master data and its place within an organization. Sliding scales may be presented: at the top, not master data; at the bottom, very much master data; and in the middle, increasing degrees of "master data-ness." For the longest time everyone thought metadata was confusing enough ... oops, we've done it again. And we have established this master data semantic monster in quite grand fashion.


Columns - DBA Corner

Many types of data change over time, and different users and applications need to access data as it existed at different points in time. A traditional DBMS stores data that is implicitly valid at the current point in time; it does not track the past or future states of the data. For some users, the current, up-to-date values are sufficient. For others, access to earlier versions of the data is needed. Temporal support makes it possible to store different database states and to query the data "as of" those different states.
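As a rough illustration of the "as of" idea the column describes, here is a minimal Python sketch of a versioned table that keeps valid-from/valid-to intervals for each key instead of overwriting values in place. The TemporalTable class and its methods are hypothetical and are not any vendor's API; real temporal support is built into the DBMS itself.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Dict, List, Optional

@dataclass
class Version:
    """One historical state of a row, valid over a half-open time interval."""
    value: dict
    valid_from: datetime
    valid_to: Optional[datetime]  # None means "still current"

class TemporalTable:
    """Keeps every version of each key rather than updating in place."""

    def __init__(self) -> None:
        self._history: Dict[str, List[Version]] = {}

    def upsert(self, key: str, value: dict, at: datetime) -> None:
        """Close the current version (if any) and open a new one at time `at`."""
        versions = self._history.setdefault(key, [])
        if versions and versions[-1].valid_to is None:
            versions[-1].valid_to = at
        versions.append(Version(value=value, valid_from=at, valid_to=None))

    def as_of(self, key: str, at: datetime) -> Optional[dict]:
        """Return the value that was valid at time `at`, or None if there was none."""
        for v in self._history.get(key, []):
            if v.valid_from <= at and (v.valid_to is None or at < v.valid_to):
                return v.value
        return None

# Example: a customer's credit limit changes mid-year.
table = TemporalTable()
table.upsert("cust-42", {"credit_limit": 1000}, datetime(2011, 1, 1))
table.upsert("cust-42", {"credit_limit": 5000}, datetime(2011, 7, 1))

print(table.as_of("cust-42", datetime(2011, 3, 15)))  # {'credit_limit': 1000}
print(table.as_of("cust-42", datetime(2011, 12, 1)))  # {'credit_limit': 5000}
```

The point of the sketch is simply that an "as of" query is a lookup against a time interval rather than against a single current value, which is what distinguishes a temporal store from a traditional one.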


Columns - SQL Server Drill Down

Looking back on 2011, I'm struck by two larger trends in the overall database marketplace. First, most of the energy and excitement (but not much forward motion) seems to be coming from the NoSQL space. And second, the major relational database platforms are generating what little energy they can outside of their core RDBMS technologies. If you kept up with one or more of the better general IT-industry news sources, you probably saw dozens of stories in a single month about various NoSQL vendors, spin-offs, and technologies, compared with perhaps one or two stories in the same period covering a traditional RDBMS platform such as Oracle, Microsoft's SQL Server, or MySQL.


MV Community

Entrinsik's 2012 Informer conference (ICON) will be held March 4-6 at the Raleigh Marriott City Center in Raleigh, NC. This year's agenda includes breakout sessions with members of the Informer product development team, panel discussions on best practices and methodologies, and customer-led use cases demonstrating how Informer is being deployed in the field.

InterSystems Corporation, a provider of advanced database, integration and analytics technologies, announced that it has become ISO 9001:2008 certified. ISO 9001:2008 is a quality management standard; for InterSystems, the certification covers all product and service creation processes associated with the InterSystems CACHÉ high-performance database and the InterSystems Ensemble integration and development platform that are performed or managed from the company's Cambridge headquarters.
