
Trends and Applications



Coined over a decade ago, the term "polyglot persistence" has come to be seen as a shorthand phrase in the database world for using the best tool for the job, or in other words, the right database for the data storage need at hand. Today, that might mean NoSQL, NewSQL, in-memory databases, and cloud databases—also known as database as a service—approaches.

Posted August 02, 2017

The rise of new data types is leading to new ways of thinking about data, and newer data storage and management technologies, such as Hadoop, Spark, and NoSQL, are disruptive forces that may eventually result in what has been described as a post-structured world. But while Hadoop and NoSQL are undoubtedly growing in use, and a wider array of database management technologies in general is being adopted, the reality is that, at least for now, the relational database management system still reigns supreme and is responsible for storing the great majority of enterprise data.

Posted August 02, 2017

Best Cloud Database

Posted August 02, 2017

Addressing the need to store and manage increasingly large amounts of data that does not fit neatly in rows and columns, NoSQL databases can run on commodity hardware, support the unstructured, non-relational data flowing into organizations from the proliferation of new sources, and are available in a variety of structures that open up new types of data sources, providing ways to tap into the institutional knowledge locked in PCs and departmental silos.

Posted August 02, 2017

Best MultiValue Database

Posted August 02, 2017

As data stores grow larger and more diverse, and greater focus is placed on competing on analytics, processing data faster is becoming a critical requirement. In-memory technology has become a relied-upon part of the data world, now available through most major database vendors. In-memory processing can run workloads up to 100 times faster than disk-based configurations, enabling business at the speed of thought.

Posted August 02, 2017

In the new world of big data and the data-driven enterprise, data has been likened to the new oil and to a company's crown jewels, and its impact compared to the transformative effect of the advent of electricity. Whatever you liken it to, the message is clear: enterprise data is of high value. And, after more than 10 years, there is no technology more closely aligned with the advent of big data than Hadoop.

Posted August 02, 2017

Best Database Administration Solution

Posted August 02, 2017

Geographically spread-out development teams with varied skill levels and different areas of proficiency, ever-faster software development cycles, multiple database platforms—all this translates to a challenging development environment that database development solutions providers seek to streamline.

Posted August 02, 2017

If data is the new oil, then getting that resource pumped out to more users faster with fewer bottlenecks is a capability that must be achieved. The issue of under-performing database systems cannot be seen as simply an IT problem. Remaining up and running, always available and functioning with lightning speed today is a business necessity.

Posted August 02, 2017

In today's always-on economy, database downtime can inflict a mortal wound on the life of a business. That's why having a trusted backup solution that can ensure that databases can be back up and running quickly in the event of an outage is critical. Today, close to one-fourth of organizations have SLAs of four nines of availability or greater, meaning they require less than 52 minutes of downtime per year, according to a recent Unisphere Research survey.
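The "four nines" figure above can be checked with a quick downtime-budget calculation. This is a generic sketch of the arithmetic, not tied to any particular backup or monitoring product:

```python
# Downtime budget per year implied by an availability SLA.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes in a non-leap year

def downtime_minutes_per_year(availability: float) -> float:
    """Return the maximum minutes of downtime per year an SLA allows."""
    return MINUTES_PER_YEAR * (1 - availability)

for label, sla in [("three nines", 0.999),
                   ("four nines", 0.9999),
                   ("five nines", 0.99999)]:
    print(f"{label} ({sla}): {downtime_minutes_per_year(sla):.1f} min/year")
```

Four nines works out to roughly 52.6 minutes of allowable downtime per year, matching the figure cited in the survey.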

Posted August 02, 2017

The unending stream of bad news about data breaches shows no sign of abating, with a raft of breaches far and wide involving organizations such as health insurance companies, businesses, government agencies, and the military. The average consolidated cost of a data breach grew to $4 million in 2016, while the average cost for each lost or stolen record containing sensitive and confidential information reached $158, according to IBM's 2016 annual cost of a data breach study.

Posted August 02, 2017

With more data streaming in from more sources, in more varieties, and being used more broadly than ever by more constituents, ensuring high data quality is becoming an enterprise imperative. In fact, as data is increasingly appreciated as the most valuable asset a company can have, says DBTA columnist Craig S. Mullins, data integrity is not just an important thing; it's the only thing. If the data is wrong, then there is no reason to even keep it, says Mullins.

Posted August 02, 2017

Best Data Governance Solution

Posted August 02, 2017

Best Data Integration Solution (Overall)

Posted August 02, 2017

Best Data Replication Solution

Posted August 02, 2017

Change data capture (CDC) is a function within database management software that keeps data consistent across a database environment. CDC identifies and tracks changes to source data as they occur, and then applies those changes to the corresponding target data so that source and target remain in sync.
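The identify-track-apply cycle can be sketched in a few lines of plain Python. All names here are illustrative, not the API of any actual CDC product:

```python
# Minimal sketch of change data capture: change events captured at the
# source are replayed against a target store to keep it in sync.
source_change_log = [
    {"op": "insert", "key": 1, "value": {"name": "Ada"}},
    {"op": "update", "key": 1, "value": {"name": "Ada Lovelace"}},
    {"op": "insert", "key": 2, "value": {"name": "Grace"}},
    {"op": "delete", "key": 2, "value": None},
]

def apply_changes(target: dict, change_log: list) -> dict:
    """Replay captured change events against a target data store."""
    for event in change_log:
        if event["op"] in ("insert", "update"):
            target[event["key"]] = event["value"]
        elif event["op"] == "delete":
            target.pop(event["key"], None)
    return target

target_store = apply_changes({}, source_change_log)
print(target_store)  # {1: {'name': 'Ada Lovelace'}}
```

Real CDC implementations typically read these events from the database's transaction log rather than a hand-built list, but the replay logic is the same idea.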

Posted August 02, 2017

Best Data Virtualization Solution

Posted August 02, 2017

Best IoT Solution

Posted August 02, 2017

Streaming data solutions can evaluate data quickly and enable predictive analytics based on sources of rapidly changing data—including social media, sensors, and financial services feeds—to head off problems such as machine failures and natural disasters, and to take advantage of unfolding opportunities.

Posted August 02, 2017

Business intelligence incorporates a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.

Posted August 02, 2017

Big data analytics examines large amounts of data to uncover hidden patterns, correlations, and other insights. Organizations have an increasing number of concerns today with respect to big data. There is the challenge posed by the sheer volume of information that is being created and saved every day as well as the lack of understanding about the information in data stores and how best to leverage it.

Posted August 02, 2017

Data visualization allows enterprises to create and study the visual representation of data, empowering organizations to use a tool to "see" trends and patterns in data. A primary goal of data visualization is to communicate information clearly and efficiently via statistical graphics, plots and information graphics.

Posted August 02, 2017

Best Cloud Solution

Posted August 02, 2017

Best Data Storage Solution

Posted August 02, 2017

At the core of big data lie the three Vs: volume, velocity, and variety. What's required are solutions that can extract valuable insights from new sources such as social networks, email, sensors, connected devices, the web, and smartphones. Addressing the three Vs are the combined forces of open source, with its rapid crowd-sourced innovation; cloud, with its unlimited capacity and on-demand deployment options; and NoSQL database technologies, with their ability to handle unstructured, or schema-less, data.

Posted August 02, 2017

If there is a "bottom line" to measuring the effectiveness of your big-data applications, it's arguably performance, or how quickly those apps can finish the jobs they run. Let's consider Spark. Spark is designed for in-memory processing in a vast range of data processing scenarios. Data scientists use Spark to build and verify models. Data engineers use Spark to build data pipelines. In both of these scenarios, Spark achieves performance gains by caching the results of operations that repeat over and over again, then discards these caches once it is done with the computation.
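The compute-once, reuse-many-times pattern described above can be illustrated outside Spark itself. The following is a plain-Python sketch of the caching principle, not Spark's actual `cache()`/`unpersist()` API:

```python
# The caching idea behind Spark's performance gains, in miniature:
# compute an expensive result once, reuse it across repeated
# operations, then discard the cache when the computation is done.
call_count = 0

def expensive_transform(data):
    """Stand-in for a costly operation (e.g., a parse or aggregation)."""
    global call_count
    call_count += 1
    return [x * x for x in data]

cache = {}

def cached_transform(data):
    key = tuple(data)
    if key not in cache:          # compute only on first use
        cache[key] = expensive_transform(data)
    return cache[key]

data = [1, 2, 3]
for _ in range(100):              # repeated pipeline stages hit the cache
    result = cached_transform(data)

cache.clear()                     # analogous to discarding caches when done
print(call_count)  # 1 -- the expensive work ran only once
```

In Spark the same effect comes from marking a DataFrame or RDD as cached so that repeated actions reuse the materialized result instead of recomputing the lineage.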

Posted July 14, 2017

Microsoft's SQL Server platform has been on a roll lately, with its smooth and feature-rich SQL Server 2016 release gaining accolades across the industry, including the DBMS of the Year award from db-engines.com. The company's increased focus on cloud, strong BI tool set, impressive functionality, and emerging cross-platform capability with SQL Server on Linux have made this RDBMS a staple of database shops around the world.

Posted July 14, 2017

Software audits are becoming a major risk to organizations. Microsoft, Oracle, SAP and other leading software vendors keep close tabs on their customers for potential license violations and true-up costs. A common occurrence is deploying more copies of software than the license agreement allows. But taking software inventory is a time-consuming and laborious process. A better approach is an integrated ITAM and asset information source.

Posted July 14, 2017

DevOps—the close working alliance between development and operations teams—is catching hold in enterprises dependent on continuously and frequently delivering new versions of software, whether for internal consumption or external services.

Posted July 05, 2017

It's been long acknowledged that data is the most precious commodity of the 21st-century business, and that all efforts and resources need to be dedicated to the acquisition and care of this resource. Lately, however, executives have become enamored with the vision of transforming their organizations into "data-driven" enterprises, which move forward into the future on data-supported insights. So, what, exactly, does the ideal "data-driven enterprise" look like?

Posted July 05, 2017

Big data has been in vogue for years, but many businesses are having a lot of difficulty harnessing value and gaining insights from the voluminous amounts of data they collect. However, there is an often-ignored set of data in the enterprise that is truly actionable, data that can be described as "active" data.

Posted July 05, 2017

Gone are the days of 2-year software development projects. Now organizations are striving for code updates that drop monthly, weekly, even daily. A developer will say, "I need a workable copy of Oracle today." And IT responds, "OK, we'll have that in about 2 weeks." What's a developer to do? Sometimes this leads to the use of synthetic datasets (one might also refer to this simply as "fake data"). These never work as well as real data, and the result is more bugs and slower software development. Other times the Dev/Test side of the house will expense spinning up a bunch of systems in the cloud, so-called "shadow IT."
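The synthetic-dataset approach mentioned above can be as simple as a few lines of code. This is a minimal, hypothetical sketch; the field names and value ranges are invented for illustration:

```python
import random

# Generate a reproducible synthetic ("fake") customer dataset for
# dev/test use when a copy of production data is not available.
random.seed(42)  # fixed seed so test data is reproducible

FIRST_NAMES = ["Alice", "Bob", "Carol", "Dave"]

def fake_customers(n: int) -> list:
    """Return n synthetic customer rows with invented fields."""
    return [
        {
            "id": i,
            "name": random.choice(FIRST_NAMES),
            "balance": round(random.uniform(0, 10_000), 2),
        }
        for i in range(n)
    ]

rows = fake_customers(1000)
print(len(rows), rows[0]["id"])
```

As the text notes, data generated this way rarely reproduces the skew, nulls, and edge cases of real production data, which is exactly why it tends to let more bugs through.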

Posted June 16, 2017

"Platforms" are all the rage in software positioning and messaging. And recently, a new platform has become the "platform du jour" - driven by the urgency felt by enterprises as they struggle to manage an increasing amount of data and an increasing number of data formats all generated from an increasingly number of applications on an increasingly diverse mix of infrastructure - the "data platform."

Posted June 16, 2017

New and emerging vendors offer fresh ways of dealing with data management and analytics challenges in areas such as data as a service, security as a service, cloud in a box, and data visualization. Here, DBTA looks at the 10 companies whose approaches we think are worth watching.

Posted June 16, 2017

DBTA 100 2017 - The Companies That Matter Most in Data

Posted June 15, 2017

There are two types of businesses in the world today: those that run on data and those that will run on data. Data security now sits at the top of nearly every organization's priority list. But with such a high volume of data coming into most businesses every day, how can information security professionals quickly identify which data is the highest priority for protection? After all, security costs time and money, and not all types of data are as sensitive or vulnerable as others.

Posted June 01, 2017

The days of looking at your data in the rearview mirror are coming to an end. Most organizations now realize that if they want to make better decisions faster, they need to understand and respond to what is happening in real time. Such an ability to analyze data at the "speed of thought" requires figuring out how to build a high-performance and effective streaming pipeline that is both affordable and scalable, and is also easy to implement and manage.

Posted June 01, 2017

As data centers evolve and become more advanced, so too should the tools that are built for them. The traditional approach of monitoring data centers results in reactive problem solving and more alerts than anyone can handle. As we move from reactively monitoring to proactively managing data centers, there will be fewer disruptions, faster resolution, and higher efficiencies. But in order to achieve meaningful gains in these metrics, the new approach and tools must support a critical functionality: automation.

Posted May 05, 2017

One of the significant changes enterprises should expect to gain speed this year is the expanding role of business users in IT. Fences around the IT landscape are slowly coming down and business users are playing an active part in the digital transformation. It's uncharted territory, driving an increased demand for information governance initiatives to help companies use data to navigate the transformation. Here are trends to watch as data and business users' roles evolve within IT and across the enterprise.

Posted April 18, 2017

As mobile has pushed deeper into enterprises, there is a growing recognition that it may be possible to run significant parts of businesses from relatively small devices. While mobile devices may not be ready to run entire enterprises, in many cases, they certainly can run more limited functions.

Posted April 18, 2017

Today's enterprise database environments are growing in size and complexity, fueled by rising data volumes and new business demands. Many databases that have been at the heart of existing enterprises to power mission-critical applications are now being positioned to support new Digital Native businesses. As a result, 24x7 high availability is no longer a luxury for select applications; it's a necessity for the bulk of the business—many organizations can no longer afford downtime in their data environments—even for a minute.

Posted April 18, 2017

Over the past 5 years, the volume of data that businesses use to stay relevant and operational has exponentially increased, and shows no signs of slowing. Within this vast quantity of data, the answers to critical business questions lie dormant, waiting to be accessed. Who are the most valuable customers? What are they buying? How can I get them to purchase more from me, and how can I find more like them?

Posted April 07, 2017

While not the most media-hyped technology, databases are certainly one of the most crucial when it comes to our always-online, always-connected society. Databases power not just the applications and websites we use every day, but the businesses that generate revenue and fuel the economy. The internet relies on functioning and well-performing databases to operate.

Posted April 07, 2017

"Temporality" is a term that database managers know well, but it may be a new one for business managers. That has to change, as the temporality your database supports­—or, how it handles time—could be the difference between whether or not the business will increase revenue, pay a fine, or identify new opportunities. Especially important in this regard is "bitemporality," which is the ability to examine data across different points in time.

Posted April 07, 2017

COLLABORATE 17 Starts April 2 at the Mandalay Bay Resort and Casino

Posted March 17, 2017

The Independent Oracle Users Group (IOUG) has represented the voice of data technologists and professionals for more than 20 years, and we are excited about how our community continues to grow and focus on peer-to-peer education and know-how. With that focus, we look forward to our premier yearly event: COLLABORATE 17 - IOUG Forum.

Posted March 17, 2017

COLLABORATE 17: Technology & Applications Forum for the Oracle Community is set for April 2-6 in Las Vegas and, for Oracle users, there is simply nothing like being there. Packed with user-focused education and networking, COLLABORATE creates a welcoming atmosphere where participants can candidly share their ideas, challenges and questions.

Posted March 17, 2017

The Simple Storage Service (S3) outage that took place on Feb. 28 prompted observations and reflections from industry experts about the need for proactive cloud services monitoring, the requirement to diversify with multi-cloud strategies, and even the possibility of "too-big-to-fail" safeguards for large cloud services providers.

Posted March 02, 2017

Apache Spark offers a solid foundation for machine learning. There are other tools and packages to help you dive into deep learning, but Spark offers a consistent approach to data access and, therefore, makes machine learning on Spark easier as you need less plumbing.

Posted February 21, 2017
