Readers' Choice Awards 2017

Trends and Applications

Who makes the best relational database? What is the best NoSQL database? Which company has the best Hadoop platform? To find out, Database Trends and Applications magazine went straight to the experts. Each year, DBTA invites subscribers to vote in the DBTA Readers' Choice Awards. Unlike the other awards programs conducted by DBTA, this one is unique in that the nominees are submitted, and the winners chosen, by DBTA readers.

Coined over a decade ago, the term "polyglot persistence" has become shorthand in the database world for using the best tool for the job, or, in other words, the right database for the data storage need at hand. Today, that might mean a NoSQL, NewSQL, in-memory, or cloud database—also known as database as a service—approach.

The rise of new data types is leading to new ways of thinking about data, and newer data storage and management technologies, such as Hadoop, Spark, and NoSQL, are disruptive forces that may eventually result in what has been described as a post-structured world. But while Hadoop and NoSQL are undoubtedly growing in use, and a wider array of database management technologies in general is being adopted, the reality is that, at least for now, the relational database management system still reigns supreme and is responsible for storing the great majority of enterprise data.

Cloud databases, also referred to as database as a service (DBaaS), can be SQL or NoSQL, open source or relational, allowing customers to choose the environment best suited to their particular use cases—whatever those may be, including support for streaming data pipelines or data warehousing for analytics. Cloud is increasingly being used by organizations to support a range of users, including data managers, developers, and testers, who require almost instantaneous access to a database environment without going through their respective IT, finance, or top managers, and without acquiring new servers or storage arrays to support their efforts.

Addressing the need to store and manage increasingly large amounts of data that does not fit neatly into rows and columns, NoSQL databases run on commodity hardware and support the unstructured, non-relational data flowing into organizations from a proliferation of new sources. Available in a variety of structures, they open up new types of data sources and provide ways to tap into the institutional knowledge locked in PCs and departmental silos.
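
To make the schema-less idea concrete, here is a minimal sketch in Python that treats the standard library's sqlite3 module as a toy document store; the table layout and sample documents are invented for illustration, and real NoSQL systems layer distribution, indexing, and rich querying on top of the same basic idea.

    import json
    import sqlite3

    # Toy document store: each record is arbitrary JSON, so documents need
    # not share a fixed set of columns -- the core flexibility NoSQL offers.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")

    docs = [
        {"type": "tweet", "user": "ana", "text": "hello world"},
        {"type": "sensor", "device": "pump-7", "temp_f": 71.3, "tags": ["iot"]},
    ]
    db.executemany("INSERT INTO docs (body) VALUES (?)",
                   [(json.dumps(d),) for d in docs])

    # Each document comes back in whatever shape it went in with.
    for (body,) in db.execute("SELECT body FROM docs"):
        print(json.loads(body))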

MultiValue database technology maintains a strong following of loyal supporters. Often described as NoSQL before the term existed, the technology continues to have advocates and is entrenched in many industry verticals, including retail, travel, oil and gas, healthcare, government, banking, and education.

As data stores grow larger and more diverse, and greater focus is placed on competing on analytics, processing data faster is becoming a critical requirement. In-memory technology has become a relied-upon part of the data world, now available from most major database vendors. In-memory platforms can process workloads up to 100 times faster than disk-based configurations, enabling business at the speed of thought.
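
As a rough, back-of-the-envelope illustration of why avoiding disk pays off, the following Python sketch times the same batch of inserts against an in-memory and a file-backed SQLite database. The speedup observed will vary widely with hardware and workload; the 100x figure above refers to purpose-built in-memory platforms, not this toy comparison.

    import sqlite3
    import time

    def timed_inserts(conn, n=50_000):
        """Create a table (if needed) and time a batch of inserts."""
        conn.execute("CREATE TABLE IF NOT EXISTS t (id INTEGER, val TEXT)")
        start = time.perf_counter()
        conn.executemany("INSERT INTO t VALUES (?, ?)",
                         ((i, f"row-{i}") for i in range(n)))
        conn.commit()
        return time.perf_counter() - start

    print("in-memory:", timed_inserts(sqlite3.connect(":memory:")), "sec")
    print("on disk:  ", timed_inserts(sqlite3.connect("demo_ondisk.db")), "sec")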

In the new world of big data and the data-driven enterprise, data has been likened to the new oil, to a company's crown jewels, and, in its transformative effect, to the advent of electricity. Whatever the comparison, the message is clear: enterprise data is of high value. And, after more than 10 years, there is no technology more closely aligned with the advent of big data than Hadoop.

In today's highly complex data environments, with multiple data platforms spanning physical data centers and the cloud, managing systems and processes manually is no longer sufficient. What is needed is the ability to manage and monitor business-critical assets with automated precision. Indeed, with so many technologies involved in enterprise data management today, the key words in database administration are "comprehensive" and "automated."

Geographically spread-out development teams with varied skill levels and different areas of proficiency, ever-faster software development cycles, multiple database platforms—all this translates to a challenging development environment that database development solutions providers seek to streamline.

If data is the new oil, then getting that resource pumped out to more users faster, with fewer bottlenecks, is a capability that must be achieved. The issue of under-performing database systems cannot be seen as simply an IT problem. Remaining up and running, always available, and functioning at lightning speed is today a business necessity.

In today's always-on economy, database downtime can inflict a mortal wound on the life of a business. That's why having a trusted backup solution that can ensure databases are back up and running quickly in the event of an outage is critical. Today, close to one-fourth of organizations have SLAs of four nines of availability or greater, meaning they can tolerate no more than about 53 minutes of downtime per year, according to a recent Unisphere Research survey.
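
The downtime arithmetic behind "nines" is easy to verify; this short Python sketch derives the annual downtime budget for each availability level (four nines works out to about 52.6 minutes per year).

    # Annual downtime budget implied by each availability level ("nines").
    MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

    for nines in (2, 3, 4, 5):
        availability = 1 - 10 ** -nines
        budget = MINUTES_PER_YEAR * (1 - availability)
        print(f"{nines} nines ({availability:.{nines}%}) -> {budget:,.2f} min/yr")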

The unending stream of bad news about data breaches shows no sign of abating, with a raft of breaches far and wide involving organizations such as health insurers, private businesses, government agencies, and the military. The average consolidated cost of a data breach grew to $4 million in 2016, while the average cost for each lost or stolen record containing sensitive and confidential information reached $158, according to IBM's 2016 Cost of a Data Breach study.

With more data streaming in from more sources, in more varieties, and being used more broadly than ever by more constituents, ensuring high data quality is becoming an enterprise imperative. In fact, as data is increasingly appreciated as the most valuable asset a company can have, says DBTA columnist Craig S. Mullins, data integrity is not just an important thing; it's the only thing. If the data is wrong, then there is no reason to even keep it, says Mullins.
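
What integrity checking looks like at the record level can be sketched in a few lines of Python; the rules and field names below are invented for illustration, whereas real data quality tools apply profiling and shared rule libraries at far greater scale.

    def validate(record):
        """Apply simple integrity rules; return a list of violations."""
        errors = []
        if not record.get("customer_id"):
            errors.append("missing customer_id")
        age = record.get("age")
        if age is not None and not 0 <= age <= 130:
            errors.append("age out of range")
        email = record.get("email")
        if email and "@" not in email:
            errors.append("malformed email")
        return errors

    rows = [{"customer_id": 17, "age": 42, "email": "ana@example.com"},
            {"customer_id": None, "age": 200, "email": "bad-address"}]
    for row in rows:
        print(row, "->", validate(row) or "ok")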

Strong data governance solutions support data quality, protect sensitive data, promote effective sharing of information, and help manage information through its lifecycle, from creation to deletion, with adherence to regulatory and governmental requirements. As organizations gather, store, and access increasing volumes of data, strong data governance allows them to have confidence in the quality of that data for a variety of tasks, as well as to adhere to security and privacy standards.

Data integration is critical to many organizational initiatives, such as business intelligence, sales and marketing, customer service, R&D, and engineering. However, big data integration involves projects, not products, and there is often a lack of expertise available. Data integration frees employees to concentrate on analysis and forecasting, tasks that require a human touch. It also vastly reduces the chances for errors to be introduced during the data translation process.
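
At its core, integration means mapping differently shaped sources onto one common model, as in this Python sketch; the two feed layouts and field names are hypothetical.

    # Two hypothetical feeds describe the same customer with different
    # field names and types; normalize both, then merge on the key.
    crm_rows = [{"CustID": "17", "FullName": "Ana Diaz"}]
    billing_rows = [{"customer_id": 17, "balance_usd": "120.50"}]

    def normalize_crm(row):
        return {"customer_id": int(row["CustID"]), "name": row["FullName"]}

    def normalize_billing(row):
        return {"customer_id": row["customer_id"],
                "balance_usd": float(row["balance_usd"])}

    merged = {}
    for rec in map(normalize_crm, crm_rows):
        merged.setdefault(rec["customer_id"], {}).update(rec)
    for rec in map(normalize_billing, billing_rows):
        merged.setdefault(rec["customer_id"], {}).update(rec)

    print(merged)  # one record per customer, fields from both feeds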

Robust data replication capabilities deliver high volumes of (big) data with very low latency, making data replication ideal for multi-site workload distribution and continuous availability.

Change data capture (CDC) is a set of database software functions that identify and track changes made to source data, and then deliver those changes to one or more targets, keeping source and target in sync without repeatedly copying entire datasets.
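
A minimal polling-based sketch of the idea in Python follows, using a version column on the source table; the schema is invented for illustration, and most commercial CDC tools instead read the database transaction log, which avoids touching source tables at all.

    import sqlite3

    source = sqlite3.connect(":memory:")
    target = sqlite3.connect(":memory:")
    source.execute("CREATE TABLE src (id INTEGER PRIMARY KEY, payload TEXT, version INTEGER)")
    target.execute("CREATE TABLE dst (id INTEGER PRIMARY KEY, payload TEXT)")

    def capture_changes(last_version):
        """Apply every source change newer than last_version to the target."""
        rows = source.execute(
            "SELECT id, payload, version FROM src "
            "WHERE version > ? ORDER BY version", (last_version,)).fetchall()
        for row_id, payload, version in rows:
            target.execute(
                "INSERT INTO dst (id, payload) VALUES (?, ?) "
                "ON CONFLICT(id) DO UPDATE SET payload = excluded.payload",
                (row_id, payload))
            last_version = version
        target.commit()
        return last_version

    source.execute("INSERT INTO src VALUES (1, 'alpha', 1)")
    source.execute("INSERT INTO src VALUES (2, 'beta', 2)")
    cursor = capture_changes(0)   # first sync picks up both rows
    print(target.execute("SELECT * FROM dst").fetchall())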

Data virtualization gives the business and IT sides of an organization the ability to work together more closely and in a much more agile fashion, and it helps reduce the complexity of accessing data spread across many underlying systems.

The Internet of Things (IoT) is the inter-networking of physical devices, vehicles, buildings, and other items (also referred to as "connected devices" and "smart devices") embedded with electronics, software, sensors, actuators, and network connectivity, which enables these objects to collect and exchange data.

Streaming data solutions can evaluate data quickly and enable predictive analytics based on sources of rapidly changing data, including social media, sensor, and financial services data, to head off problems such as machine failures and natural disasters, and to take advantage of unfolding opportunities.
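
A toy version of the pattern in Python: consume an unbounded stream of events and flag readings that drift far from the recent norm. The sensor feed here is simulated, and real streaming platforms add windowing, state management, and fault tolerance.

    import random

    def sensor_stream(n=200):
        """Simulated temperature feed -- a stand-in for a real event source."""
        for seq in range(n):
            yield {"sensor": "pump-7", "seq": seq, "reading": random.gauss(70, 5)}

    def detect_anomalies(stream, window=20, threshold=3.0):
        """Flag readings far from the rolling mean -- a toy predictive check."""
        recent = []
        for event in stream:
            if len(recent) == window:
                mean = sum(recent) / window
                sd = (sum((x - mean) ** 2 for x in recent) / window) ** 0.5
                if sd and abs(event["reading"] - mean) > threshold * sd:
                    print("alert:", event)
                recent.pop(0)
            recent.append(event["reading"])

    detect_anomalies(sensor_stream())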

Business intelligence incorporates a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.
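
At its simplest, the query-to-report step looks like the following Python/SQL sketch over invented sales data; BI suites wrap the same idea in visual query builders, scheduling, and dashboards.

    import sqlite3

    # Aggregate raw transactions into a small report -- the kernel of BI.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (region TEXT, amount REAL)")
    db.executemany("INSERT INTO sales VALUES (?, ?)",
                   [("East", 120.0), ("East", 80.0), ("West", 200.0)])

    report = db.execute(
        "SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue "
        "FROM sales GROUP BY region ORDER BY revenue DESC").fetchall()
    for region, orders, revenue in report:
        print(f"{region}: {orders} orders, ${revenue:,.2f}")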

Big data analytics examines large amounts of data to uncover hidden patterns, correlations, and other insights. Organizations face growing concerns around big data: the sheer volume of information being created and saved every day, as well as a lack of understanding of what is in their data stores and how best to leverage it.

Data visualization allows enterprises to create and study visual representations of data, empowering decision makers to "see" trends and patterns. A primary goal of data visualization is to communicate information clearly and efficiently via statistical graphics, plots, and information graphics.
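
A minimal example using Python's matplotlib library over hypothetical monthly figures (matplotlib is one common charting choice, not a tool named by this category):

    import matplotlib.pyplot as plt

    # Hypothetical monthly sales figures -- any tabular source would do.
    months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
    sales = [120, 135, 128, 160, 172, 190]

    fig, ax = plt.subplots()
    ax.plot(months, sales, marker="o")
    ax.set_xlabel("Month")
    ax.set_ylabel("Units sold")
    ax.set_title("Upward trend visible at a glance")
    plt.show()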

The cloud continues to transform computing, making it less expensive, easier, and faster to create, deploy, and run applications, as well as to store enormous quantities of data. The cloud offers services of all types, including software as a service, database as a service, infrastructure as a service, and platform as a service, putting a range of technologies within users' easy reach, when they need it and where they need it, with the ability to scale elastically.

Storage solutions support a wide range of use cases, including backup and archiving, content storage and distribution, big data analytics, static website hosting, cloud-native application data, and disaster recovery.

At the core of big data lie the three Vs: volume, velocity, and variety. What's required are solutions that can extract valuable insights from new sources such as social networks, email, sensors, connected devices, the web, and smartphones. Addressing the three Vs are the combined forces of open source, with its rapid crowd-sourced innovation; cloud, with its unlimited capacity and on-demand deployment options; and NoSQL database technologies, with their ability to handle unstructured, or schema-less, data.

Data modeling tools help organizations create high-quality data models that enable them to shape, organize, and standardize data infrastructure, change structures, and produce detailed documentation. Moreover, data modeling solutions can also help companies visualize and manage business data across platforms and extend that data to users with varying job roles and skill levels across geographies and time zones.
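
As a rough stand-in for the artifacts a modeling tool generates, the Python sketch below declares two typed, documented entities and the relationship between them; the entity and attribute names are invented.

    from dataclasses import dataclass, field
    from datetime import date

    # Hand-written stand-in for tool-generated model artifacts: named
    # entities, typed attributes, and a documented relationship.
    @dataclass
    class Customer:
        customer_id: int
        name: str
        signup_date: date

    @dataclass
    class Order:
        order_id: int
        customer_id: int              # foreign key -> Customer.customer_id
        total_usd: float
        line_items: list = field(default_factory=list)

    ana = Customer(17, "Ana Diaz", date(2017, 3, 1))
    print(Order(1001, ana.customer_id, 249.99, ["widget", "gadget"]))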

