October 2009

Trends and Applications

High-profile data breaches at major corporations and the usual assortment of state government agencies and educational institutions have highlighted the value of encrypting data. Yet breach numbers continue to spike, and big losses are becoming more common: according to Verizon's 2009 Data Breach Investigations Report, which looks only at breaches that resulted in stolen data being used in a crime, the total number of records breached in Verizon's 2008 caseload—more than 285 million—exceeded the combined total from 2004 to 2007. Apparently the market is now so saturated with stolen data that the price of each record has dropped from a high of $16 in 2007 to less than 50 cents today. But the rising number of successful attacks isn't the most distressing part of the breach reports: the Identity Theft Resource Center reports that only 2.4% of the companies involved in all reported breaches used encryption.

In art, the expression "less is more" implies that simplicity of line and composition allows a viewer to better appreciate the individual elements of a piece and their relationship to each other as a whole. In engineering, "less is more" when you accomplish the same work with fewer moving parts. And when dining out, "less is more" when the portions may be smaller but the food is that much better and more satisfying. In IT, the adage is more accurately stated today as "less does more." As IT increases in complexity, mainframe organizations are being asked to handle greater workloads, bigger databases, more applications, more system resources, and new initiatives. All this without adding—and sometimes while cutting—staff. In addition, IT is undergoing a serious "mainframe brain drain," as the most experienced technicians retire, taking with them their skills and detailed knowledge of the mainframes' idiosyncrasies.

The rising popularity of Web 2.0 and cloud computing services has prompted a reexamination of the infrastructure that supports them. More and more people are using web-based communities, hosted services, and applications such as social networking sites, video sharing sites, wikis, and blogs. And the number of businesses adopting cloud computing applications such as software as a service and hosted services is climbing swiftly. With all this growth, internet data centers are struggling to handle unprecedented workloads, spiraling power costs, and the limitations of the legacy architectures that support these services. The industry has responded by moving toward a "data center 2.0" model, in which new approaches to data management, scaling, and power consumption enable data center infrastructures to support this growth.

The Sarbanes-Oxley Act of 2002 (SOX) can be considered the most significant compliance standard of our time. Since the legislation passed seven years ago, companies have had to rethink the way they use technology to store company data. This transformation has been anything but an easy ride, and it has significantly impacted the role of the CIO within the organization.

Columns - Applications Insight

The idea of "virtual" reality—immersive computer simulations almost indistinguishable from reality—has been a mainstay of modern "cyberpunk" science fiction since the early 1980s, popularized in movies such as The Thirteenth Floor and The Matrix. Typically, a virtual reality environment produces computer-simulated sensory inputs that include at least sight and sound and, perhaps, touch, taste, and smell. These inputs are presented to the user through goggles, earphones, and gloves or—in true cyberpunk sci-fi—via direct brain interfaces.

Columns - Database Elaborations

The art of building software solutions comprises many moving parts. There are project managers coordinating tasks; business analysts gathering requirements; architects establishing standards and designs; developers assembling processes; DBAs building database structures; quality assurance staff testing components for compliance with requirements; and an array of supporting roles providing functional environments, infrastructure, security, and so on. One task everyone must perform is estimating the effort necessary to deliver results. For simple or repetitive tasks there is no need for recurrent estimating, since applicable values can be based on known past metrics. For example, creating the third or fourth version of the same database within the same environment should allow a DBA to use previously incurred costs as a guide, and unless something unusual occurs, such estimates should be on target. However, for creative tasks, such as designing new structures or building new kinds of processes, there are no previously documented events to refer to. Faced with these circumstances, individuals usually are not allowed to shrug, say "it will take as long as it takes," and be left alone.
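When no historical metrics exist, one common way to put a defensible number on a creative task is a three-point (PERT) estimate, which blends optimistic, most-likely, and pessimistic guesses. The formula below is the standard PERT weighting; the task names and hour figures are hypothetical illustrations, not drawn from the column:

```python
# Three-point (PERT) estimation: a fallback for novel tasks with no
# past metrics. Expected effort weights the most-likely guess 4x;
# the spread (pessimistic - optimistic) / 6 approximates one std dev.

def pert_estimate(optimistic, most_likely, pessimistic):
    """Return (expected effort, standard deviation) in the input units."""
    expected = (optimistic + 4 * most_likely + pessimistic) / 6
    std_dev = (pessimistic - optimistic) / 6
    return expected, std_dev

# Hypothetical creative tasks with (optimistic, most-likely, pessimistic)
# estimates in hours.
tasks = {
    "design new schema": (8, 16, 40),
    "build new load process": (16, 24, 80),
}

total = 0.0
for name, (o, m, p) in tasks.items():
    e, s = pert_estimate(o, m, p)
    total += e
    print(f"{name}: {e:.1f}h expected (+/- {s:.1f}h)")
print(f"total expected effort: {total:.1f}h")
```

The heavy weight on the most-likely value keeps one wild pessimistic guess from dominating the estimate, while the spread gives a way to state a range rather than a single number.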

Columns - DBA Corner

As my regular readers know, I am an avid reader, especially of technology books. And every now and then I review some of the more interesting database-related books in the DBA Corner column.

Columns - SQL Server Drill Down

In this season of recession and financial meltdowns, a common question seems to be, "How big is 'too big to fail'?" Titans of the financial industry made big bets with lots of risk and, when those bets didn't pan out, American society as a whole had to pay the price. But, that aside, the sheer scale of our financial system, by just about every metric, has reached amazing heights, be it the number of financial transactions per second, the number of traders, the number of funds traded, or the amount of money changing hands—you name it. This might seem like a tangent to the subject of databases in general and SQL Server in particular, but in my mind there are actually quite a few similarities.

MV Community

Robert Catalano, Director of Sales, Revelation Software, Discusses New Features in OpenInsight 9.1

Ashwood Computer, Inc., a full-service VAR and preferred systems integrator for companies utilizing MultiValue database technology, has announced version 3.2 of FastBac DR, its disk-based backup tool.

BlueFinity International has introduced the latest enhancement to its .NET development toolset for MultiValue databases: mv.NET with Solution Objects. mv.NET provides a 100% native .NET interface to all major MultiValue databases, allowing .NET developers to access all aspects of their MultiValue systems. The new Solution Objects component set for mv.NET builds upon this existing infrastructure to provide strongly typed, class-based access to MultiValue databases via the generation of an advanced data abstraction layer.

IBM recently completed the sale of its worldwide U2 database and tools assets to Rocket Software, a global software development firm based in Newton, Mass. "The entire U2 team worldwide is moving from IBM to Rocket," Susie Siegesmund, director, IBM U2 Data Servers and Tools, IBM Information Management Software, tells DBTA. Her comments reiterate sentiments expressed in a letter to customers when the sale was announced, in which she assured them that the change of ownership will benefit the U2 product line, its customers, and its business partners, and will not affect relationships with the U2 business.

MITS, developer of advanced reporting and business intelligence software for the MultiValue database market, has introduced version 7.1 of MITS Discover, the company's OLAP business intelligence system. This latest release includes new capabilities for increased ease of use and enhanced user productivity, as well as new database support.