Technology Trends in the DBMS Market—Hype vs. Reality

As technology professionals, one of the most important aspects of our jobs is to advise our organizations on the use of new technologies. However, the challenge is identifying the right technologies, which revolves around three very specific risks:

  1. Adopting technology too early when it is immature, unstable, or does not have the needed ecosystem of tools, available talent, and industry support
  2. Adopting technology that is not right for the particular challenges an organization faces
  3. Investing in a technology that is not going to survive long term

These very real risks are why Gartner publishes its Hype Cycle research. In it, Gartner expresses its opinion about the maturity of particular technologies, including how long until they reach “mainstream adoption.”

IoT platforms, for example, are entering the “peak of inflated expectations” stage, after which they will pass through the “trough of disillusionment” on their way to the “slope of enlightenment” before finally reaching the “plateau of productivity” in 5–10 years.

It’s an interesting model, but one I have always thought has two major flaws. First, it assumes that all technologies make it through, when we know many good, useful technologies crash and disappear into what Geoffrey Moore calls “the chasm.” A crash-and-burn trajectory describes the path of technologies such as OS/2, IrDA interfaces, and “social enterprise platforms” that many projected would become commonplace. Second, most successful technologies are eventually replaced by newer, arguably better ones after some years in the “plateau of productivity.” Anyone who has been around a while probably feels nostalgic about their BlackBerry, dBase (or FoxPro), 3.5" floppy disks, or Commodore computers.

It’s in this context that I want to make a few observations about the DBMS market. This is especially relevant right now, as there is a line of thinking that SQL databases are going away and DBAs will no longer be needed. This philosophy suggests the future belongs to NoSQL, NewSQL, and big data.

And yet my doctor still uses a fax machine, pretty much every financial transaction goes through a couple thousand lines of COBOL, and IBM released a new mainframe last year. It seems like a paradox, but the truth is—as someone put it so eloquently years ago—the world runs on legacy.

In fact, many analytic business problems used as case studies to prove the value of big data could just as easily be solved using a SQL database and so-called legacy analytics tools with faster implementation and at a lower cost.
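To make that claim concrete, here is a minimal sketch of the kind of analytic question often featured in big data case studies (top customers by total spend), answered with a single aggregate query against an ordinary SQL database. The schema, table name, and data are hypothetical, and SQLite stands in for any relational engine:

```python
import sqlite3

# Hypothetical example: a common "big data" case study question --
# which customers spend the most? -- answered with plain SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("acme", 120.0), ("acme", 80.0), ("globex", 50.0), ("initech", 300.0)],
)

# One aggregate query answers the business question directly,
# with no distributed processing framework involved.
top = conn.execute(
    "SELECT customer, SUM(amount) AS total "
    "FROM orders GROUP BY customer ORDER BY total DESC LIMIT 2"
).fetchall()
print(top)  # [('initech', 300.0), ('acme', 200.0)]
```

At real-world scale the table would live in a production RDBMS rather than an in-memory SQLite file, but the query, and the skills needed to write it, are the same.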

Earlier this year, Gartner published market share numbers for the DBMS market, and some very interesting points jumped out at me:

1. Oracle, IBM, and Microsoft hold more than 77% of the $36 billion DBMS market, which is growing at a healthy 8.7%—almost $3 billion in annual growth.

2. The combined revenue (not growth) for all NoSQL and big data vendors is $687 million—still just a fraction of the overall DBMS market (less than 2%, to be precise).

3. Based on my estimates, SQL Server grew $800 million annually, more than the total revenue for all NoSQL and big data technologies combined.

4. Amazon AWS’s database business most likely exceeds a billion dollars; Microsoft Azure and other cloud DBaaS offerings are also growing very fast.

5. Because these numbers are based on revenue, not usage, they omit viable open source technologies such as MySQL (and its variants MariaDB and Percona Server), PostgreSQL, Cassandra, and CouchDB.

Looking at this data, we can conclude a few things:

  • SQL databases are going to be here for a long, long time.
  • NoSQL and big data technologies will most likely find a niche and be relevant, but they are not the entire future of databases.
  • The cloud is a very real and viable alternative, as evidenced by the growing adoption of hybrid IT confirmed in recent research from SolarWinds.

An additional conclusion we can draw, and the most important, is that DBAs are, and will continue to be, critical in helping organizations make the right decisions: choosing technologies for the problems at hand based on each technology’s maturity, its suitability, and its ability to meet the specific requirements of the business and the technical requirements of the application.