Trends and Applications



This year, DBTA's list of trend-setting products in data and information management includes newer approaches leveraging artificial intelligence, machine learning, and automation as well as products in more established categories such as relational and NoSQL database management, MultiValue, performance management, analytics, and data governance.

Posted December 05, 2018

Increasingly, DBAs are seeing artificial intelligence (AI) and machine learning applied to database management and optimization, taking self-healing and self-tuning to the next level. These solutions, from both database and third-party vendors, allow DBAs to spend less time searching for bottlenecks and more time doing productive and creative work in support of strategic business goals.

Posted December 04, 2018

Everything changes, especially when we seek to automate tasks. Automation is delivering remarkable consumer benefits, from vehicles to clothing to voice-activated devices, and making our lives better. In the workplace, as automation is applied to repetitive tasks currently handled manually, people become concerned about their livelihoods. Will they lose their jobs? How will they provide for themselves and their families? How will automation affect them personally?

Posted December 04, 2018

The year just ending has been an interesting one for data managers. Artificial intelligence (AI) and machine learning took center stage, which also meant an increasingly glaring spotlight on data sourcing, management, and viability. The continued rise of the Internet of Things (IoT) also meant no letting up on demands for data environments to deliver requirements fast and furiously. The year ahead will bring more of the same—as well as a continuation of the transformation of information management.

Posted December 04, 2018

How fast and how far can databases grow, and how can such growth be sustained? That's the question facing many data managers these days as they deal with growing demands from their businesses for real-time analytical capabilities incorporating data-driven initiatives such as the Internet of Things and artificial intelligence. They are responding and keeping up with these requirements through a combination of cloud resources and automation.

Posted December 04, 2018

The call for speakers is officially open for the sixth annual Data Summit conference, to be held in Boston, May 21-22, 2019. Whether you are an IT practitioner or a business stakeholder, we encourage you to consider participating in this exciting conference. The deadline for submitting proposals is December 14, 2018.

Posted December 04, 2018

In this new world where data is the coin of the realm, the winners will be enterprises that can quickly and effectively harness data, the right data, to reduce risk, improve business outcomes, and create new value for their customers, employees, and other stakeholders. As they work to realize this goal, enterprises are increasingly finding that activating copy, backup, archived, and other data located on secondary storage can be just as useful for driving digital transformation as the original production data on their primary storage, if not more so.

Posted November 01, 2018

While AI and machine learning cannot (yet) turn back time, cognitive technologies can analyze data in ways that were previously unattainable. Manual modeling of past failure patterns by data scientists is nothing new, but data analysis performed by AI-powered platforms builds cognitive models that not only learn from past failure patterns but, more importantly, learn to detect issues never known or seen before.

Posted November 01, 2018

There's no easy answer to the dilemma of infrastructure decline. But the obvious response, to attack the problems in piecemeal fashion, clearly doesn't work. That's what organizations have been doing all along: bringing in faster processors, adding databases, deploying more clouds. These may offer temporary relief, but eventually they'll just add to data complexity and congestion.

Posted November 01, 2018

There's a renaissance happening in organizations today: Process automation is 'in vogue' again. There's no doubt that robotic process automation and artificial intelligence are driving this renaissance, helping to transform the enterprise. But what that transformation looks like is another question entirely.

Posted October 10, 2018

Data lakes have helped organizations deal with the massive amounts of data generated daily. They are intended to serve as a central repository for raw data, a treasure trove for data scientists to analyze and gain actionable insight. They also serve as the foundation for many "self-service" analytics initiatives. While getting data into a lake is simple, getting insight and value from all of that data has proven challenging for many organizations. A recent Forrester report found that 60%-73% of all enterprise data goes unused for analytics, a statistic that exposes some of the harsh realities of data lakes.

Posted October 10, 2018

Shortly after contracting with a cloud service provider, a bill arrives that causes sticker shock. There are unexpected and seemingly excessive charges, and those responsible seem unable to explain how this could have happened. The situation is urgent because the amount threatens to bust the IT budget unless cost-saving changes are made immediately. This cloud services sticker shock is often caused by mission-critical database applications, which tend to be the most costly for a variety of reasons.

Posted October 10, 2018

To stay competitive in today's digitally driven market, the modern enterprise must keep pace with end users' expectations. Customers and employees alike want access to information anytime, anywhere, giving them the flexibility to work, shop, bank, and live on their own terms. While most forward-looking organizations are making strides to deliver this remote access—largely by moving to a hybrid IT model—many are still limited by the state of their back-end infrastructures.

Posted October 10, 2018

These days, clouds are everywhere, providing today's database managers with an impressive range of options to choose from, including public cloud, private cloud, and, for most, something in between in the hybrid realm. Multiple variations may exist within a single organization, and these distinct hybrid environments are constantly evolving as well. Whether a hybrid environment is "intentional" or "accidental," variety is the watchword for many hybrid projects.

Posted October 10, 2018

There is no shortage of content flowing through today's enterprises, including data, documents, graphics, videos, and more. This is raw material that provides a wealth of opportunities, many of which are untapped, to businesses. The catch is that this data and content are scattered across various systems inside and outside of enterprises. Plus, there is not enough understanding of who views which content and what motivates them to consume it. As more organizations seek to embrace digital transformation, they are turning to content automation as a way to deliver information quickly and effectively to their customers.

Posted October 10, 2018

The Oracle Applications Users Group (OAUG) is committed to the education, networking, and advocacy of Oracle Applications users. Each year during Oracle OpenWorld, the group hosts special interest group (SIG) meetings as well as additional opportunities for networking and education. 

Posted September 20, 2018

Quest Oracle Community (formerly Quest International Users Group) just rebranded and launched a fully redesigned digital experience to better meet customer needs amidst the rapidly changing landscape of business technology and Oracle solutions. Quest offers Oracle users a wealth of resources through webinars, blogs, customer stories and more, along with support from a community of users with similar interests.

Posted September 20, 2018

2018 is an exciting year for Oracle OpenWorld, with as much change in the event format as there is in the technology. The Independent Oracle Users Group (IOUG) will be well represented in the Content Catalog, as over 20 members are scheduled to present on the hot topics for data professionals.

Posted September 20, 2018

Every year, thousands of data experts and professionals from over 100 different countries converge at Oracle OpenWorld in San Francisco to discover the newest updates to Oracle's ecosystem of technologies. Designed for attendees who want to connect, learn, explore and be inspired, Oracle OpenWorld offers more than 2,200 educational sessions led by more than 2,000 customers and partners sharing their experiences, first hand.

Posted September 20, 2018

The Open Mainframe Project has announced Zowe, an open source software framework that bridges the divide between modern applications and the mainframe, intended to provide easier interoperability and scalability among products and solutions from multiple vendors. Zowe is the first open source project based on z/OS.

Posted September 04, 2018

As the industry embraces artificial intelligence and marches toward autonomous systems, some enterprises will still need to rely on traditional databases that must be managed, tuned, and secured to ensure the best performance. And while not all organizations will be able to make the autonomous transition quickly due to internal processes, there are a few tricks for optimizing an older database holding terabytes of data at the speeds needed for everything from e-commerce to IoT to music streaming services on a global scale.

Posted September 04, 2018

Data integration aims to provide a unified and consistent view of all enterprise-wide data. The goal is to logically (and sometimes also physically) unify different data sources or data silos into a single view that is as correct, complete, and consistent as possible.
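The idea of a logically unified view can be sketched in a few lines. Below is a hypothetical Python illustration (the source records and field names are invented) that merges a CRM source and a billing source identifying the same customer under different keys:

```python
# Hypothetical sketch: building one logical view over two data sources
# that name the same customer key differently. Field names and records
# are invented for illustration only.

def unified_view(crm_rows, billing_rows):
    """Merge two sources into a single customer-keyed view."""
    view = {}
    for row in crm_rows:
        # CRM calls the key "cust_id" and the name "full_name"
        view.setdefault(row["cust_id"], {})["name"] = row["full_name"]
    for row in billing_rows:
        # Billing calls the same key "customer"
        view.setdefault(row["customer"], {})["balance"] = row["balance"]
    return view

crm = [{"cust_id": 1, "full_name": "Ann Lee"}]
billing = [{"customer": 1, "balance": 42.0}, {"customer": 2, "balance": 7.5}]
print(unified_view(crm, billing))
# {1: {'name': 'Ann Lee', 'balance': 42.0}, 2: {'balance': 7.5}}
```

A real integration layer would add schema mapping rules, conflict resolution, and data quality checks on top of this basic merge.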

Posted September 04, 2018

There's been a lot of discussion recently about autonomous databases, which offer the promise of keeping database systems stable and performant while alleviating much of the tuning and configuration drudgery that DBAs must typically slog through. It is important to note that an autonomous database needs to strike a balance between giving administrators too much and too little control. Too much control takes the form of endless configuration and tuning parameters that place the burden of performance and stability on the operator. Too little control takes the form of automated behaviors that reduce the predictability of the system to an unacceptable level.

Posted September 04, 2018

One of the most important shifts in data warehousing in recent times has been the emergence of the cloud data warehouse. Previously, setting up a data warehouse required a huge investment in IT resources to build and manage a specially designed on-premises data center. Now, several cloud computing vendors offer data warehousing functions as a service (DWaaS), accessible via an Internet connection. This model negates the costly capital expenditure and management required for an on-premises data warehouse.

Posted September 04, 2018

Everyone wants to be part of a data-driven enterprise, and for good reason. Data analytics, when applied in a meaningful way, provides an enormous competitive advantage. There's a catch to this though that frequently gets overlooked amidst the glowing analyst projections and keynote speeches about a limitless future in which systems and machines do all the heavy lifting and thinking for businesses. Data—the right kind, in the right sequence, in the right context—doesn't just magically drop out of the cloud. It needs to be discovered, identified, transformed, and brought together for analysis, management, and eventual storage.

Posted September 04, 2018

Data lakes are a convenient and cost-effective way to store a lot of data with completely diverse structures so that you don't have to build the model ahead of time, explained Paul Sonderegger, senior data strategist at Oracle, during Data Summit 2018.

Posted September 04, 2018

Costly data breaches are on the rise—2017 was a record-setting year in terms of cybersecurity incidents, with more than 14.5 billion malware-infected emails sent and 1.9 billion data records stolen in the first 6 months of the year.

Posted August 08, 2018

This year is an expansive one for the database ecosystems that have evolved around the major platforms. Artificial intelligence (AI), machine learning, the Internet of Things (IoT), and cloud computing are now mainstream offerings seen within the constellations of database vendors, partners, and integrators.

Posted August 08, 2018

This may seem contradictory at first glance: Fresh data from the database user community finds that data lakes continue to increase within the enterprise space as big data flows get even bigger. Yet, at the same time, enterprises appear to have pulled back on Hadoop implementations.

Posted August 08, 2018

The concepts of Agile methodology and continuous delivery have become popular in software development, yet they are somewhat less mature among DBAs and database developers. Shay Shmeltzer, director of product management for Oracle Cloud Development Tools, discussed how database administrators (DBAs) and SQL developers can take advantage of newer development approaches while also dealing with the unique challenges that exist in the world of database development.

Posted August 08, 2018

Experts may disagree on the precise definitions of artificial intelligence (AI), cognitive computing, machine learning (ML), or natural language processing. However, there is no debate about whether the proliferation of sensors and mobile devices, the rapid increase in data volume, and the heightened need for rapid decision making is fueling a demand for smarter solutions and greater automation.

Posted August 08, 2018

The votes have been counted and the results are in. Now, it's time to offer congratulations as Database Trends and Applications magazine unveils the 2018 Readers' Choice Awards winners. Many of the vendors and products are well-known with market-leading positions established over many years.  However, there are also newer names in the mix, representing the rapidly evolving nature of information technology solutions and services.

Posted August 08, 2018

Today, there is a wide range of big data technologies helping organizations reap the benefits of the massive data volumes available to their companies. Until recently, most data was generated for a single objective and difficult to repurpose. Much of it even ended up stored on tapes that were rarely if ever accessed. The bottom line is that it was difficult for decision makers to get to all the information they needed. But today, new tools, platforms, and technologies are changing all that.

Posted August 08, 2018

The demand to become a data-driven business with a competitive edge in the digital economy is greater now than ever. BI and analytics are recognized as the cornerstone of success today. As the value of data becomes better appreciated, organizations are deploying a multi-pronged approach: making data access and insights available to a broader swath of users rather than limiting them to small groups of analysts or executives, offering compelling data visualizations, and making data available more quickly for decision making.

Posted August 08, 2018

As a key component of data integration best practices, change data capture (CDC) is based on the identification, capture, and delivery of the changes made to enterprise data. CDC helps minimize access to both source and target systems. It also supports the ability to keep a record of changes for compliance and is a key component of many data processes.
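The identify-capture-deliver pattern can be illustrated with a minimal snapshot-diff sketch. This is a hypothetical Python illustration; production CDC tools typically read the database's transaction log rather than comparing full snapshots:

```python
# Minimal sketch of change data capture by snapshot comparison.
# Each snapshot maps a primary key to a row; the function identifies
# the inserts, updates, and deletes between the two snapshots.

def capture_changes(old_snapshot, new_snapshot):
    """Return a list of (operation, key, row) change records."""
    changes = []
    for key, row in new_snapshot.items():
        if key not in old_snapshot:
            changes.append(("insert", key, row))
        elif row != old_snapshot[key]:
            changes.append(("update", key, row))
    for key, row in old_snapshot.items():
        if key not in new_snapshot:
            changes.append(("delete", key, row))
    return changes

before = {1: {"name": "Ann"}, 2: {"name": "Bob"}}
after = {1: {"name": "Anne"}, 3: {"name": "Cal"}}
print(capture_changes(before, after))
# [('update', 1, {'name': 'Anne'}), ('insert', 3, {'name': 'Cal'}),
#  ('delete', 2, {'name': 'Bob'})]
```

Delivering only this change list, rather than re-copying whole tables, is what lets CDC keep source and target systems in sync with minimal load, and the change records themselves form the compliance audit trail the blurb mentions.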

Posted August 08, 2018

After many years of relying mainly on relational database management systems in on-premises data centers, organizations are finding viable additional options in the form of cloud computing deployments and newer NoSQL offerings, a new study finds. Cloud computing has made it possible to host database-centric applications that typically would have been hosted in an on-premises data center.

Posted August 08, 2018

Business agility and reduced cost are leading reasons that companies are adopting cloud technologies and hybrid cloud approaches. Although security was initially a key obstacle standing in the way, that concern is increasingly dissipating as advantages in agility and cost reduction become the leading drivers for the move to the cloud.

Posted August 08, 2018

Whether it is cognitive computing, machine learning, intelligent automation, augmented reality, or artificial intelligence, smart technologies are gaining ground with use cases spanning health services, analytics, customer service, manufacturing, logistics, and a range of other fields.

Posted August 08, 2018

The big focus in analytics today is on access for all, and the ability to not only see what happened in the past but what is going on now or about to take place. A recent survey by Forbes Insights and Dun & Bradstreet of more than 300 senior executives across a broad range of industries confirmed that the goal of many organizations is to develop a data-driven culture, but also finds there is still plenty of work to be done to make that a reality.

Posted August 08, 2018

The data governance market is expected to grow from $1.31 billion in 2018 to $3.53 billion by 2023, a compound annual growth rate (CAGR) of 22%, according to a recent ResearchandMarkets.com report. What is driving that growth? It is a combination of factors, the research shows, including rapidly increasing data volumes, new regulatory and compliance mandates, and the need to enhance strategic risk management and decision making, as well as greater business collaboration.

Posted August 08, 2018

Today, data is both the output and the fuel of companies. For many, however, the process of becoming a data-driven organization is hindered by inflexible systems created years or even decades earlier.

Posted August 08, 2018

Top data modeling solutions enable organizations to discover, design, visualize, standardize, and deploy high-quality data assets through an intuitive GUI. With the ability to view "any data" from "anywhere" for consistency, clarity, and artifact reuse across large-scale data integration, master data management, big data, and business intelligence/analytics initiatives, data modeling is today a critical component of many initiatives.

Posted August 08, 2018

Today, data quality solutions are available in the cloud as software as a service as well as on-premises, and support the necessary data integrity for critical systems such as customer relationship management, master data management, data governance initiatives, and database management, as well as for regulatory compliance initiatives, including the E.U.'s new General Data Protection Regulation (GDPR).

Posted August 08, 2018

While the data growth rate, the number of database instances, and the number of platforms that each DBA must support have not changed radically in the last few years, the database infrastructure has become more complicated. Two key factors are at play in this increasing complexity.

Posted August 08, 2018

Cyberattacks are becoming the number-one risk to businesses, brands, operations, and financials, according to the recent "SonicWall Cyber Threat Report" (March 2018). There were 9.32 billion malware attacks in total in 2017, an 18.4% increase over 2016. Meanwhile, Verizon's Data Breach Investigations Report (DBIR) shows that more than a quarter of data breaches across the world originated from an organization's "insiders." But, the report notes, malicious employees aren't the only insider threat: Errors were at the heart of almost one in five (17%) breaches.

Posted August 08, 2018

Storage solutions provide critical services for backup and archiving, content storage and distribution, big data analytics, and disaster recovery. Increasingly, there are also smarter storage solutions that enable greater efficiency and cost reduction through data compression, information lifecycle management, and tiered storage strategies.

Posted August 08, 2018

With data virtualization, the business and IT sides of organizations can work more closely together in a much more agile fashion, reducing complexity and boosting productivity. Data virtualization helps customers find and analyze the data they need in hours or days, rather than months, so that they can quickly discover insights and take insight-driven action, said Mark Palmer, senior vice president of analytics at TIBCO.

Posted August 08, 2018

Today's data visualization tools go beyond the standard charts and graphs used in Excel spreadsheets, displaying data in more sophisticated ways such as infographics, dials and gauges, geographic maps, sparklines, heat maps, and detailed bar, pie and fever charts. The images may include interactive capabilities, enabling users to manipulate them or drill into the data for querying and analysis.

Posted August 08, 2018

Companies are increasingly looking for the right database for the data storage need at hand. That might mean NoSQL, NewSQL, in-memory, or cloud database (also known as database as a service) approaches.

Posted August 08, 2018
