Trends and Applications



We've heard the term "digital transformation" used almost to the point of exhaustion in the past couple of years - but it's not just a lot of hot air: It's the future. Organizations must update their legacy architecture to remain current in the new enterprise landscape, and mainframe rehosting offers 10 key advantages.

Posted August 18, 2017

There are plenty of pronouncements about artificial intelligence—both in terms of the miracles it can produce and the threat it poses to humanity. But according to Ali Ghodsi, co-founder and CEO of Databricks, there is a "1% problem": only a handful of companies, such as Google and Amazon, are actually accomplishing their goals with it. AI has vast potential, but some of the claims, as well as the fears, are overstated and a little premature right now, he contends.

Posted August 18, 2017

At the SHARE summer 2017 event in Providence, RI, Harry Williams, SHARE's president, reflected on the changes taking place in the enterprise technology ecosystem, and what's ahead for the IBM users group and the industry overall.

Posted August 18, 2017

We're still very much in the early days of artificial intelligence (AI). However, money is pouring into AI initiatives at astounding rates, and enterprises need to move at a deliberate speed to adopt and leverage AI across their systems, applications, and data.

Posted August 09, 2017

The future value of hybrid cloud computing is to empower customers to embrace a cloud strategy of their own, rather than having one dictated by a vendor. A hybrid cloud environment is defined by the customer—a hybrid cloud solution should not dictate where or which cloud the customer must use with their on-premises installation. Although this may seem obvious, large vendors often ignore this critical point, as they dictate choices based on their (lack of) capabilities.

Posted August 09, 2017

Hackers are rarely far from the news these days, whether they're perpetrating cyber-intrusions into political campaigns or take-downs of major retail websites, social media sites, movie studios, or entertainment conglomerates. But some of the "hacking" headlines can be deceiving. In fact, a significant number of cybersecurity breaches around the digital world actually represent a kind of all-too-familiar crime that is as old as the abacus.

Posted August 09, 2017

Data modeling tools can help organizations create high-quality data models that enable them to shape, organize, and standardize data infrastructure, change structures, and produce detailed documentation. Moreover, data modeling solutions can also help companies visualize and manage business data across platforms and extend that data to users with varying job roles and skill levels across geographies and time zones.

Posted August 02, 2017

Who makes the best relational database? What is the best NoSQL database? Which company has the best Hadoop platform? To find out, Database Trends and Applications magazine went straight to the experts. Each year, DBTA allows subscribers to vote for the DBTA Readers' Choice Awards. Unlike other awards programs conducted by DBTA, this one is unique in that the nominees are submitted and the winners are chosen by DBTA readers.

Posted August 02, 2017

Coined over a decade ago, the term "polyglot persistence" has come to be seen as shorthand in the database world for using the best tool for the job—in other words, the right database for the data storage need at hand. Today, that might mean NoSQL, NewSQL, in-memory database, or cloud database (also known as database as a service) approaches.

Posted August 02, 2017

The rise of new data types is leading to new ways of thinking about data, and newer data storage and management technologies, such as Hadoop, Spark, and NoSQL, are disruptive forces that may eventually result in what has been described as a post-structured world. But while Hadoop and NoSQL are undoubtedly growing in use, and a wider array of database management technologies in general is being adopted, the reality is that, at least for now, the relational database management system still reigns supreme and is responsible for storing the great majority of enterprise data.

Posted August 02, 2017

Cloud databases, also referred to as database as a service (DBaaS), can be SQL or NoSQL, open source or relational, and allow customers to choose the environment that is best suited for their particular use cases—whatever those may be, including support for streaming data pipelines or data warehousing for analytics. Cloud is increasingly being used by organizations to support a range of users, including data managers, developers, and testers, who require almost instantaneous access to a database environment without going through their respective IT, finance, or top managers or needing to acquire new servers or storage arrays to support their efforts.

Posted August 02, 2017

Addressing the need to store and manage increasingly large amounts of data that does not fit neatly in rows and columns, NoSQL databases can run on commodity hardware, support the unstructured, non-relational data flowing into organizations from the proliferation of new sources, and are available in a variety of structures that open up new types of data sources, providing ways to tap into the institutional knowledge locked in PCs and departmental silos.

Posted August 02, 2017

MultiValue database technology maintains a strong following of loyal supporters. The NoSQL database technology continues to have advocates and is entrenched in many industry verticals, including retail, travel, oil and gas, healthcare, government, banking, and education.

Posted August 02, 2017

As data stores grow larger and more diverse, and greater focus is placed on competing on analytics, processing data faster is becoming a critical requirement. In-memory technology has become a relied-upon part of the data world, now available through most major database vendors. In-memory processing can handle workloads up to 100 times faster than disk-based configurations, enabling business at the speed of thought.

Posted August 02, 2017

In the new world of big data and the data-driven enterprise, data has been likened to the new oil, a company's crown jewels, and even the advent of electricity in its transformative effect. Whatever you liken it to, the message is clear: enterprise data is of high value. And, after more than 10 years, there is no technology more aligned with the advent of big data than Hadoop.

Posted August 02, 2017

In today's highly complex data environments, with multiple data platforms across physical data centers and the cloud, managing systems and processes manually is no longer sufficient. What is needed is the ability to manage and monitor business-critical assets with automated precision. Indeed, with so many technologies involved in enterprise data management today, the key words in database administration are "comprehensive" and "automated."

Posted August 02, 2017

Geographically spread-out development teams with varied skill levels and different areas of proficiency, ever-faster software development cycles, multiple database platforms—all this translates to a challenging development environment that database development solutions providers seek to streamline.

Posted August 02, 2017

If data is the new oil, then getting that resource pumped out to more users faster with fewer bottlenecks is a capability that must be achieved. The issue of under-performing database systems cannot be seen as simply an IT problem. Remaining up and running, always available, and functioning with lightning speed is today a business necessity.

Posted August 02, 2017

In today's always-on economy, database downtime can inflict a mortal wound on the life of a business. That's why having a trusted backup solution that can ensure databases are back up and running quickly in the event of an outage is critical. Today, close to one-fourth of organizations have SLAs of four nines of availability or greater, meaning they can tolerate no more than roughly 52 minutes of downtime per year, according to a recent Unisphere Research survey.
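
As a quick check of that arithmetic, here is a minimal Python sketch (the availability tiers shown are illustrative) that converts an availability SLA into an annual downtime budget:

```python
# Convert an availability SLA into an annual downtime budget.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_budget_minutes(availability: float) -> float:
    """Maximum minutes of downtime per year allowed by the SLA."""
    return MINUTES_PER_YEAR * (1 - availability)

for label, availability in [("three nines", 0.999),
                            ("four nines", 0.9999),
                            ("five nines", 0.99999)]:
    print(f"{label}: {downtime_budget_minutes(availability):.1f} minutes/year")

# four nines works out to roughly 52.6 minutes of downtime per year
```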

Posted August 02, 2017

The unending stream of bad news about data breaches shows no sign of abating, with a raft of breaches far and wide involving organizations such as health insurance companies, businesses, government agencies, and the military. The average consolidated cost of a data breach grew to $4 million in 2016, while the average cost for each lost or stolen record containing sensitive and confidential information reached $158, according to IBM's 2016 annual cost of a data breach study.

Posted August 02, 2017

With more data streaming in from more sources, in more varieties, and being used more broadly than ever by more constituents, ensuring high data quality is becoming an enterprise imperative. In fact, as data is increasingly appreciated as the most valuable asset a company can have, says DBTA columnist Craig S. Mullins, data integrity is not just an important thing; it's the only thing. If the data is wrong, then there is no reason to even keep it, says Mullins.

Posted August 02, 2017

Strong data governance solutions support data quality, protect sensitive data, promote effective sharing of information, and help manage information through its lifecycle from creation to deletion with adherence to regulatory and government requirements. As organizations gather, store, and access increasing volumes of data, strong data governance allows them to have confidence in the quality of that data for a variety of tasks as well as adhere to security and privacy standards.

Posted August 02, 2017

Data integration is critical to many organizational initiatives such as business intelligence, sales and marketing, customer service, R&D, and engineering. However, big data integration involves projects, not products, and there is often a lack of expertise available. Data integration frees employees to concentrate on analysis and forecasting - tasks that require a human touch. It also vastly reduces the chances for errors to be introduced during the data translation process.

Posted August 02, 2017

Robust data replication capabilities deliver high volumes of (big) data with very low latency, making data replication ideal for multi-site workload distribution and continuous availability.

Posted August 02, 2017

Change data capture (CDC) is a function within database management software that keeps data consistent throughout a database environment. CDC identifies and tracks changes to source data anywhere in the database and then applies those changes to the target data in the rest of the database.
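
As a rough sketch of the idea (not any particular vendor's implementation), the example below uses a simple query-based approach: it polls a hypothetical source table for rows modified since the last captured watermark and applies them to a target table.

```python
import sqlite3

# Minimal query-based change data capture sketch. The table and column names
# (orders_source, orders_target, last_modified) are hypothetical.
def apply_changes(conn: sqlite3.Connection, last_watermark: str) -> str:
    # Identify and track rows in the source that changed since the watermark.
    changes = conn.execute(
        "SELECT id, status, last_modified FROM orders_source "
        "WHERE last_modified > ? ORDER BY last_modified",
        (last_watermark,),
    ).fetchall()

    # Apply each captured change to the target copy of the data.
    for row_id, status, modified in changes:
        conn.execute(
            "INSERT OR REPLACE INTO orders_target (id, status, last_modified) "
            "VALUES (?, ?, ?)",
            (row_id, status, modified),
        )
    conn.commit()

    # Advance the watermark so the next poll captures only newer changes.
    return changes[-1][2] if changes else last_watermark
```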

Posted August 02, 2017

Data virtualization allows the business and IT sides of an organization to work more closely together in a much more agile fashion, and it helps to reduce complexity.

Posted August 02, 2017

The Internet of Things (IoT) is the inter-networking of physical devices, vehicles, buildings, and other items (also referred to as "connected devices" and "smart devices") embedded with electronics, software, sensors, actuators, and network connectivity, which enables these objects to collect and exchange data.

Posted August 02, 2017

Streaming data solutions can evaluate data quickly and enable predictive analytics based on sources of rapidly changing data, including social media, sensors, and financial services data, to head off problems such as machine failures and natural disasters, and to take advantage of unfolding opportunities.
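
As a purely illustrative sketch of evaluating data as it arrives (the sensor readings below are simulated), a rolling average over a stream can flag readings that may signal a looming machine failure:

```python
import random
from collections import deque

WINDOW_SIZE = 20
window = deque(maxlen=WINDOW_SIZE)  # keep only the most recent readings

def sensor_stream(n=500):
    """Simulated source of rapidly changing sensor data."""
    for _ in range(n):
        yield random.gauss(50.0, 2.0)

for reading in sensor_stream():
    window.append(reading)
    rolling_avg = sum(window) / len(window)
    # Flag readings that deviate sharply from recent behavior.
    if abs(reading - rolling_avg) > 6.0:
        print(f"anomaly: reading={reading:.2f}, rolling avg={rolling_avg:.2f}")
```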

Posted August 02, 2017

Business intelligence incorporates a variety of tools that enable organizations to collect data from internal systems and external sources, prepare it for analysis, develop and run queries against the data, and create reports, dashboards, and data visualizations.

Posted August 02, 2017

Big data analytics examines large amounts of data to uncover hidden patterns, correlations, and other insights. Organizations have an increasing number of concerns today with respect to big data. There is the challenge posed by the sheer volume of information that is being created and saved every day as well as the lack of understanding about the information in data stores and how best to leverage it.

Posted August 02, 2017

Data visualization allows enterprises to create and study visual representations of data, empowering them to "see" trends and patterns. A primary goal of data visualization is to communicate information clearly and efficiently via statistical graphics, plots, and information graphics.
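
As a minimal illustration (the monthly figures are invented), a few lines of Python with matplotlib can turn a table of numbers into a trend that is easy to see at a glance:

```python
import matplotlib.pyplot as plt

# Invented monthly sales figures, purely for illustration.
months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
sales = [120, 135, 128, 150, 171, 198]

plt.plot(months, sales, marker="o")
plt.title("Monthly Sales Trend")
plt.xlabel("Month")
plt.ylabel("Sales (units)")
plt.tight_layout()
plt.savefig("sales_trend.png")  # write the chart to an image file
```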

Posted August 02, 2017

The cloud continues to transform computing, making it less expensive, easier, and faster to create, deploy, and run applications as well as store enormous quantities of data. The cloud offers services of all types, including software as a service, database as a service, infrastructure as a service, and platform as a service, and it puts a range of technologies within users' easy reach—when they need it, where they need it, with the ability to scale elastically.

Posted August 02, 2017

Storage solutions run the gamut, supporting a variety of use cases including backup and archiving, content storage and distribution, big data analytics, static website hosting, cloud-native application data, and disaster recovery.

Posted August 02, 2017

At the core of big data lie the three Vs - volume, velocity, and variety. What's required are solutions that can extract valuable insights from new sources such as social networks, email, sensors, connected devices, the web, and smartphones. Addressing the three Vs are the combined forces of open source, with its rapid crowd-sourced innovation; cloud, with its unlimited capacity and on-demand deployment options; and NoSQL database technologies, with their ability to handle unstructured, or schema-less, data.

Posted August 02, 2017

If there is a "bottom line" to measuring the effectiveness of your big-data applications, it's arguably performance, or how quickly those apps can finish the jobs they run. Let's consider Spark. Spark is designed for in-memory processing in a vast range of data processing scenarios. Data scientists use Spark to build and verify models. Data engineers use Spark to build data pipelines. In both of these scenarios, Spark achieves performance gains by caching the results of operations that repeat over and over again, then discards these caches once it is done with the computation.

Posted July 14, 2017

Microsoft's SQL Server platform has been on a roll lately, with its smooth and feature-rich SQL Server 2016 release gaining accolades across the industry, including the DBMS of the Year award from db-engines.com. The company's increased focus on cloud, strong BI tool set, impressive functionality, and emerging cross-platform capability with SQL Server on Linux have made this RDBMS a staple of database shops around the world.

Posted July 14, 2017

Software audits are becoming a major risk to organizations. Microsoft, Oracle, SAP, and other leading software vendors keep close tabs on their customers for potential license violations and true-up costs. A common occurrence is deploying more copies of software than the license agreement allows. But taking software inventory is a time-consuming and laborious process. A better approach is an integrated IT asset management (ITAM) and asset information source.

Posted July 14, 2017

Data modelers are responsible for creating and maintaining the conceptual, logical, and physical data models for an organization. These data models are used to define the data requirements that support the business goals, but business stakeholders often have difficulty understanding how these technical documents correlate to their business processes.

Posted July 14, 2017

DevOps—the close working alliance between development and operations teams—is catching hold in enterprises dependent on continuously and frequently delivering new versions of software, whether for internal consumption or external services.

Posted July 05, 2017

It has long been acknowledged that data is the most precious commodity of the 21st-century business, and that all efforts and resources need to be dedicated to the acquisition and care of this resource. Lately, however, executives have become enamored with the vision of transforming their organizations into "data-driven" enterprises, which move forward into the future on data-supported insights. So, what, exactly, does the ideal "data-driven enterprise" look like?

Posted July 05, 2017

Big data has been in vogue for years, but many businesses are having a lot of difficulty harnessing value and gaining insights from the voluminous amounts of data they collect. However, there is an often-ignored set of data in the enterprise that is truly actionable, data that can be described as "active" data.

Posted July 05, 2017

Gone are the days of 2-year software development projects. Now organizations are striving for code updates that drop monthly, weekly, even daily. A developer will say, "I need a workable copy of Oracle today." And IT responds, "OK, we'll have that in about 2 weeks." What's a developer to do? Sometimes this leads to the use of synthetic datasets (one might also refer to this simply as "fake data"). These never work as well as real data, and the result is more bugs and slower software development. Other times the Dev/Test side of the house will expense spinning up a bunch of systems in the cloud, so-called "shadow IT."
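
For illustration only, here is the kind of synthetic dataset a developer might generate as a stopgap (the field names and value ranges are invented); it is easy to produce but lacks the skew, nulls, and edge cases of real production data:

```python
import csv
import random
import uuid

# Generate a small synthetic ("fake") customer dataset for dev/test use.
STATUSES = ["active", "inactive", "suspended"]

with open("fake_customers.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["customer_id", "status", "lifetime_value"])
    for _ in range(1000):
        writer.writerow([
            uuid.uuid4().hex,            # random identifier
            random.choice(STATUSES),     # uniformly chosen status
            round(random.uniform(0, 5000), 2),  # uniform spend, unlike real data
        ])
```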

Posted June 16, 2017

"Platforms" are all the rage in software positioning and messaging. And recently, a new platform has become the "platform du jour" - driven by the urgency felt by enterprises as they struggle to manage an increasing amount of data and an increasing number of data formats all generated from an increasingly number of applications on an increasingly diverse mix of infrastructure - the "data platform."

Posted June 16, 2017

New and emerging vendors offer fresh ways of dealing with data management and analytics challenges in areas such as data as a service, security as a service, cloud in a box, and data visualization. Here, DBTA looks at the 10 companies whose approaches we think are worth watching.

Posted June 16, 2017

A new era of computing is unfolding, with big data, cloud, and cognitive all converging at once. This confluence will transform how we do business, and it is impacting all industries.

Posted June 16, 2017

The world of data management is constantly changing. Each year, the DBTA 100 spotlights the companies that are dealing with evolving market demands through innovation in software, services, and hardware.

Posted June 15, 2017

There are two types of businesses in the world today: those that run on data and those that will run on data. Data security now sits at the top of nearly every organization's priority list. But with such a high volume of data coming into most businesses every day, how can information security professionals quickly identify which data is the highest priority for protection? After all, security costs time and money, and not all types of data are as sensitive or vulnerable as others.

Posted June 01, 2017

The days of looking at your data in the rearview mirror are coming to an end. Most organizations now realize that if they want to make better decisions faster, they need to understand and respond to what is happening in real time. Such an ability to analyze data at the "speed of thought" requires figuring out how to build a high-performance and effective streaming pipeline that is both affordable and scalable, and is also easy to implement and manage.

Posted June 01, 2017

With data increasingly recognized as a highly valuable enterprise asset, data protection is understandably becoming a higher priority. To explore the issues surrounding data protection, including the role of people, processes, and technology in creating a proactive security stance, Database Trends and Applications is introducing the Cybersecurity Sourcebook.

Posted June 01, 2017

Data management is going through a paradigm shift—again, said Joe Caserta, president of Caserta Concepts, who delivered a keynote titled, "Architecting Data for the Modern Enterprise," at Data Summit 2017 in NYC.

Posted May 25, 2017
