Readers' Choice Awards 2015



The IT industry continues to expand at a brisk pace with a steady influx of innovative products and technologies to help organizations extract value from data, integrate it with new and traditional sources, as well as ensure quality and security.

An array of database technologies, including NoSQL, NewSQL, in-memory databases, and cloud, or database-as-a-service, approaches, is increasingly being adopted to address the new requirements created by the explosive growth in data volume and variety. Social media, the Internet of Things, the need for mobile access, and the demand for real-time insights are just some of the new factors putting pressure on organizations.

The past few years have brought radical changes to the world of IT. After more than 40 years as the undisputed leader in enterprise data management, for the first time, the relational database is facing some worthy challengers.

As organizations seek to leverage more data in more forms, some are finding that cloud-based, or cloud-friendly, databases provide advantages in terms of faster deployment, flexibility, reduced upfront capital commitment, and easy maintenance.

NoSQL database technology is increasingly gaining ground in enterprises. It is true that the relational database is still the undisputed leader when it comes to enterprise data management. But today, newer technologies, and in particular, NoSQL database technologies, are making forays into the enterprise.

Sometimes called the fifth NoSQL database technology, the MultiValue database dates back to the mid-1960s, with Don Nelson and Dick Pick widely credited as the founding fathers of the technology. Also referred to as Pick or MultiDimensional, MultiValue offers a key advantage in its database structure's use of attributes that can hold multiple values, rather than a single value as in relational technology.
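To illustrate the contrast described above, here is a minimal Python sketch, using hypothetical customer data, of a multivalued attribute alongside its normalized relational equivalent:

```python
# Hypothetical "customer" record in MultiValue style: one attribute
# (here, "phones") holds several values inside a single record.
mv_record = {
    "id": "CUST100",
    "name": "Acme Corp",
    "phones": ["555-0100", "555-0101", "555-0102"],  # multivalued attribute
}

# The relational equivalent normalizes the repeating values into a
# separate table, one row per value, linked by a key.
customers = [("CUST100", "Acme Corp")]
customer_phones = [
    ("CUST100", "555-0100"),
    ("CUST100", "555-0101"),
    ("CUST100", "555-0102"),
]

# Reassembling the MultiValue view from relational tables requires a join.
joined = {
    cid: [p for c, p in customer_phones if c == cid] for cid, _ in customers
}
assert joined["CUST100"] == mv_record["phones"]
```

The point of the sketch is that a MultiValue record keeps related values together, whereas the relational model must join them back from a child table.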

Putting data to work by extracting useful information for competitive advantage is the goal for many companies. Why else amass huge stores of data if not to use it to better understand customers' needs, foresee new opportunities, and ward off impending threats?

In-memory databases and grids have entered the enterprise mainstream. Today, new offerings are emerging in many forms—from extensions of relational database management systems to NoSQL databases to cloud-hosted NoSQL databases.

While new and bigger data environments may seem complex, the objective is always simply to exploit data for decision making. Column-oriented relational databases, which store data in columns rather than in rows, were developed in the 1990s.
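As a rough illustration of that distinction, the following Python sketch, using made-up sales records, contrasts row-oriented and column-oriented layouts of the same data; an analytic query that scans a single column needs to touch only that column's values:

```python
# Row-oriented layout: each record is stored together.
rows = [
    {"id": 1, "region": "East", "sales": 120},
    {"id": 2, "region": "West", "sales": 340},
    {"id": 3, "region": "East", "sales": 205},
]

# Column-oriented layout: each attribute is stored as its own
# contiguous sequence.
columns = {
    "id": [1, 2, 3],
    "region": ["East", "West", "East"],
    "sales": [120, 340, 205],
}

# An aggregate over "sales" reads one column, not whole records.
total = sum(columns["sales"])
assert total == sum(r["sales"] for r in rows)
```

This locality is why column stores tend to excel at analytic scans and aggregations, while row stores favor transactional workloads that read or write whole records.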

Perhaps no technology is more aligned with the new world of big data than Hadoop, which allows for the distributed processing of large datasets across compute clusters.

Maintaining and optimizing database management systems has never been as important or as complex. Today, it involves far more than simply ensuring the high performance of a set of homogenous databases.

Developing new database software quickly and collaboratively, among globally distributed team members with varying skill levels, for deployment in the cloud or on premises, while avoiding the potential pitfalls that create downtime and delays, is a goal for many organizations. To deal effectively with today's rapid business cycles, what's needed are nimble development solutions.

Four years ago in The Wall Street Journal, Marc Andreessen wrote that software is eating the world. Those words continue to resonate. But with the heavy focus on software, there is pressure on the underlying database systems to remain up and running, always available and functioning at lightning speed.

For data-driven enterprises, no activity is more vital than keeping the database systems up and running. Still, unplanned downtime does happen for a variety of reasons. To be prepared in the event of system failures, infrastructure owners and DBAs have developed strategies to increase resiliency and assure availability of data.

Today, data is under assault from within organizations and from outside, and the risks are endless. Breaches can be caused by any number of potential threats: outside hackers, accidental staff mistakes, abuse by rogue employees, or data shared with outside partners. And the damage can be costly, harming both finances and reputations.

Data is flowing into organizations in greater volumes, faster, from more sources, and in more formats than ever before. Along with this deluge, there is also a greater appreciation for the fact that data can be leveraged for faster and better decision making, connecting more appropriately with customers, creating better products, and avoiding risk.

As organizations place more value on data than ever before, the adage "garbage in, garbage out" has taken on more significance. Sloppy data hygiene is reflected in subpar business outcomes.

Effective data governance improves the usability, integrity, and security of enterprise data. As organizations gather, store, and access increasing volumes of data, strong data governance allows them to have confidence in the quality of that data for a variety of tasks as well as to adhere to security and privacy standards.

Data integration is critical to many organizational initiatives such as business intelligence, sales and marketing, customer service, R&D, and engineering. These days, however, data integration is getting more complicated. There are more types of data than ever before, as well as more sources and more target platforms on which data can be stored.

Data replication advances a number of enterprise goals, supporting scenarios such as distribution of information as part of business intelligence and reporting initiatives, facilitating high availability and disaster recovery, and as part of a no-downtime migration initiative.

A key component of data integration best practices, change data capture (CDC) is based on the identification, capture, and delivery of the changes made to enterprise data sources. CDC helps minimize access to both source and target systems, supports the ability to keep a record of changes for compliance, and is a key component of many data processes.
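As a simplified illustration of the identify-and-capture idea, here is a Python sketch that diffs two keyed snapshots of a table into insert, update, and delete change records. This snapshot-comparison approach is only one naive strategy; production CDC tools typically read the database's transaction log instead:

```python
def capture_changes(old, new):
    """Naive snapshot-diff CDC: compare two keyed snapshots of a table
    and emit (operation, key, row) change records. Real CDC products
    usually mine the transaction log rather than diffing snapshots."""
    changes = []
    for key, row in new.items():
        if key not in old:
            changes.append(("insert", key, row))
        elif old[key] != row:
            changes.append(("update", key, row))
    for key, row in old.items():
        if key not in new:
            changes.append(("delete", key, row))
    return changes

# Hypothetical before/after snapshots of a small table.
before = {1: {"name": "Ann"}, 2: {"name": "Bob"}}
after = {1: {"name": "Anne"}, 3: {"name": "Cal"}}

# Emits an update for key 1, an insert for key 3, and a delete for key 2.
changes = capture_changes(before, after)
```

Delivering only these change records, rather than reloading full tables, is what lets CDC keep source and target systems in sync with minimal load on each.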

The overall goal of data integration is to achieve a complete view of enterprise data across disparate data environments. One approach that has come to the fore to overcome data silos and revive the original goal of the data warehouse is data virtualization.

The cloud has enabled a range of capabilities such as big data analytics, unlimited capacity for storage, and a range of other on-demand services.

In a fast-paced world, data-driven decisions must be made more quickly than ever before. When organizations need to analyze fast-developing business events as they happen, in real time, with the ability to evaluate threats, risks, and opportunities, a streaming data solution is required to improve usability and speed time to insight.

Business intelligence encompasses a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop, and run queries against the data, and create reports, dashboards and data visualizations.

Organizations are continuing to see data as a treasure trove from which they can extract valuable insights. Data mining solutions provide the tools that enable them to view those hidden gems and facilitate better understanding of new business opportunities, competitive situations, and complex challenges.

Query and reporting solutions are part of a comprehensive business intelligence approach. When you have questions, these solutions provide the answers. For as long as enterprises have been gathering data, BI groups have been utilizing query and report programs as the primary applications that produce output from information systems.

With more companies utilizing a variety of tools to back up their data, enterprises are looking to offload some of that information into a space that is offsite and cost-effective. Cloud could be the answer as it has transformed computing, making it less expensive, easier and faster to create, deploy, and run applications as well as store enormous quantities of data.

Efficient, secure storage plays a critical role in the enterprise. Tech companies are beginning to offer smarter storage solutions, enabling greater efficiency through data compression, information lifecycle management, and tiered storage strategies.

In the age of big data, organizations require solutions that can help manage the three Vs: volume, velocity, and variety. Addressing the three Vs are the converging forces of open source, with its rapid crowdsourced innovation; cloud, with its unlimited capacity and on-demand deployment options; and NoSQL database technologies, with their ability to handle unstructured, or schema-less, data.