Newsletters

Database Trends and Applications
Voting has opened for the annual Database Trends and Applications Readers' Choice Awards Program in which the winning information management solutions, products, and services are chosen by you—the people who actually use them. The competition will end on Wednesday, May 8, so be sure to cast your votes for your favorite products now.

Posted March 05, 2020

One of the primary objectives of any high-availability architecture is to eliminate single points of failure, such as cluster nodes connecting to a single SAN. If you are running SAP in the cloud, you can take advantage of your cloud provider's availability zones, which may exist in different geographic regions. Although a high-availability cluster can be deployed within a single zone, the zone itself is a single point of failure. If the zone becomes unavailable, end users may lose access to the entire cluster.

Posted March 05, 2020

Organizations have progressively pushed more infrastructure from on-premise data centers into cloud data centers. Some of this is driven by the desire to reduce costs, but more often than not, organizations are realizing that the infrastructure required for the level of connectivity, data growth, and analytics needed for success in a modern organization is well beyond the reach of homegrown infrastructure.

Posted March 05, 2020

Today, organizations must deliver new applications, as well as application updates and patches, at a faster rate than ever before. DevOps principles address this challenge by focusing on a combination of tools, processes, and collaboration among development and operations teams to enable more agile and integrated workflows for development, test, and deployment. It sounds great. The problem is that this streamlined approach often does not incorporate the database.

Posted March 02, 2020

Zions Bancorporation wanted to ensure continuous software delivery in a fully automated fashion, encompassing all aspects and layers of the product stack. A key goal was to enable delivery teams to be self-sufficient, without handoffs or silos in other organizations or disciplines.

Posted March 02, 2020

PASS was confident in its ability to enhance and improve the events, content, and learning opportunities it offers, but it had two big concerns with the database. First, making changes to the database was a cumbersome process. Second, the implementation of the EU's GDPR required new protocols.

Posted March 02, 2020

Like many healthcare organizations, SelectHealth's IT team struggled to move quickly while building innovative, high-quality experiences for customers and bringing existing services to scale. To accelerate application development, quickly onboard global partners, and revamp its organizational, regulatory, and compliance policies, SelectHealth pursued a platform-based approach.

Posted March 02, 2020

While organizations are adopting cloud for some areas of their database estate, very few are migrating totally. DBAs now need to manage hybrid estates that combine on-premise and cloud deployments. No wonder that 23% of respondents see migrating to the cloud and integrating with the cloud as their biggest challenge over the next 12 months. Ensuring they have the ability to monitor their entire estate from a single tool is therefore vital if DBAs are to remain on top of their workloads.

Posted February 10, 2020

For data managers, AI and machine learning not only offer new ways of delivering rapid insights to business users but also the promise of improving and adding intelligence to their own operations. While many AI and machine learning efforts are still works in progress, the technologies hold the potential to deliver enhanced analytic capabilities throughout enterprises.

Posted February 10, 2020

For years, if not decades, database managers have been struggling with the challenges of providing as much access as possible to corporate information assets while at the same time protecting these crown jewels. All of this work, of course, has had to take place within the confines of relatively tight budget and resource constraints. Now, a new generation of solutions and platforms holds great promise in releasing professionals from the more mundane aspects of their jobs to devote more time to the activities that matter to their businesses. However, even with database automation and cloud resources abundantly available on the market, many database managers still spend inordinate amounts of time on low-level tasks.

Posted February 10, 2020

Registration is open for the seventh annual Data Summit conference, to be held in Boston, May 19-20, 2020, with pre-conference workshops on May 18. Data Summit focuses on both the business and technical aspects of data management and analysis and how it is transforming the business world. Early bird pricing is available now.

Posted February 10, 2020

One of the great challenges with catastrophic events is that they can come from any number of sources, so business owners must make sure they're prepared for all types of disasters. Preparation is the only way to limit the damage and ensure your operations will continue without significant disruption.

Posted January 02, 2020

Data is the lifeblood of all organizations in today's era of digital business transformation. As a result, the corporate spotlight is especially bright on DBAs, who are tasked with ensuring users and applications always have access to the data on which they depend. Considering the hit to a company's bottom line, and potentially to its reputation in the market, when infrastructure and information become unavailable, it's easy to see how important it is for these DBAs to have access to the highest-quality expert technical support professionals when help is needed.

Posted January 02, 2020

Key data management trends have emerged that are shaping the capabilities of IT products and services for 2020 and beyond. To help showcase innovative products and services each year, Database Trends and Applications magazine looks for offerings that promise to help organizations derive greater benefit from their data, make decisions faster, and do so with higher levels of security.

Posted December 04, 2019

Think about the last time you filled out a paper form and contrast that with how many times you've filled out forms online. We live in an era where everything is digital. Forms are online, every click is captured, and even personal lives are documented on social media. The first wave of digitization led to more BI and better data-driven decisions. But, as we head into 2020, the focus has shifted from BI to operational analytics. Traditional BI was focused on enabling executives to make decisions using historical data. It was accelerated by technologies such as Hadoop, which were built for scale but could not deliver results in real time.

Posted December 01, 2019

As we stand at the start of a new year and on the precipice of a new decade, the 2020s, DBTA reached out to industry leaders for their perspectives on not only what's ahead in the year 2020 but also what they see developing as the next decade unfolds.

Posted December 01, 2019

Clouds, autonomous databases, and fast-growing data environments dominated the results of surveys by Unisphere Research, a division of Information Today, Inc., throughout 2019. Data keeps expanding beyond the bounds of traditional corporate on-premise systems, and the demands on data managers are growing.

Posted December 01, 2019

The IT industry is going through a major shift from centralized data centers to dispersed deployments across a variety of cloud and on-premise platforms. At the same time, availability is becoming more critical. Recently, Dave Bermingham, technical evangelist of SIOS Technology, shared his views on the current state of high availability in the cloud, and what organizations need to do to ensure continuity of service. "When moving to the cloud, the first thing you will discover is that the traditional SAN-based failover cluster for HA is no longer an option," he noted.

Posted December 01, 2019

Whether you are reading the news, going to the store, dealing with customer service or sending a package, it has become apparent that AI is becoming part of our daily lives. We can see this on more of a macro level with the automotive industry and its adoption of AI to improve the overall driving experience, as well as the healthcare industry as it uses the technology to automate the process of identifying and ultimately diagnosing high-risk patient groups. Even the agriculture industry is taking advantage of AI to improve operating efficiency and assist with the automation of essential farming processes.

Posted December 01, 2019

Businesses have a great deal of experience developing and implementing data protection strategies that allow them to recover from attacks on their on-premise IT environments. However, increasingly, enterprises need to begin considering a new threat to their IT environments. This threat is malicious actors using "zero-day" vulnerabilities—vulnerabilities that are so new, they cannot be patched before they are exploited—to attack and bring down the major cloud providers that organizations are increasingly relying on to host critical applications and data.

Posted October 31, 2019

Database security has always been important, and with the compliance requirements of new regulations such as GDPR and the California Consumer Privacy Act (CCPA), it's an issue that reaches across the organization, into the boardroom, and to customers. This attention is putting new pressure on DBAs to secure production data as well as development and testing databases. Here are some relatively simple database security best practices and security checks that are easily executed and can help organizations better understand and strengthen their defensive security posture.

Posted October 31, 2019

Today, data is critical to every organization and every department within every organization. Yet, all the disparate systems for handling it are creating new challenges. Joe Caserta, founder and president of Caserta, a technology consulting and implementation firm focused on data and analytics strategies and solutions, recently discussed the current state of data integration and what is needed to overcome today's problems.

Posted October 31, 2019

Semantically enabled machine reasoning is an efficient form of AI that can help with fundamentals such as data quality and completeness, and that can scale to provide automated pattern recognition for decision support in mission-critical applications. AI delivered by semantic technologies opens a wealth of opportunity to improve efficiency in all types of enterprise business applications. In this powerful new era, errors are reduced, data insights are more sophisticated and quickly gleaned, and staff is freed to focus on excellent service, new product development, and overall business growth.

Posted October 31, 2019

What if the reason the BI implementation was failing was not the users or their willingness to work together, but that they were using the wrong analytics platform?

Posted October 31, 2019

On June 11, 2019, the National Institute of Standards and Technology (NIST) released an updated white paper, detailing several action plans (https://csrc.nist.gov) for reducing software vulnerabilities and cyber-risk. In the paper, titled "Mitigating the Risk of Software Vulnerabilities by Adopting a Secure Software Development Framework (SSDF)," NIST provided organizations with solid guidelines to avoid the nasty—not to mention expensive—consequences of a data breach.

Posted October 31, 2019

The data warehouse and data lake each solve different business problems and impose their own unique challenges. Organizations shouldn't write off data warehouses—as they evolve, they are taking on new roles in digital enterprises. Data lakes may add a great deal of flexibility to an enterprise data strategy, but they are supported by fast-breaking technologies that require constant vigilance.

Posted October 31, 2019

There is a sea change underway in enterprise architecture. Just a few years ago, enterprise administrators were fearful of the security implications of trusting an outside provider to protect their data assets. Although security is still a cloud concern—one which predominates at the time of cloud migration, and even grows stronger post-implementation—the use of cloud platforms has gained widespread acceptance.

Posted October 31, 2019

As more companies continue to rely on interconnected networks, virtualized cloud services, and IoT technologies, the potential for downtime and its costs will only rise. By achieving true network resilience, companies can focus on maintaining their services, removing single points of failure and having a plan to bring the network back up to continue normal operations—before it costs them.

Posted October 01, 2019

Data continues to grow in volume, variety, and velocity, resulting in new data management technologies. Recently, Deepti Srivastava, product manager for Cloud Spanner at Google Cloud, discussed how database requirements are evolving and how Google's Cloud Spanner is advancing a relational-NoSQL-convergence approach by giving customers the combined benefits of relational database structure with non-relational horizontal scale.

Posted October 01, 2019

Digital transformation. Infrastructure modernization. Global data center demands. All these forces and more are driving enterprises around the world to seek out next-generation, cloud-based technologies for a wide range of applications—even those most critical to their business. In reality, however, migrating to the cloud or any other modern architecture is not as easy as it sounds.

Posted October 01, 2019

DevOps is now widely accepted in application development because, by introducing a culture of collaboration and cooperation between development and operations teams, it enables features to be released faster to end users. As DevOps grows, there is a corresponding need to ensure the database is included so that the entire development process is seamless and free of bottlenecks.

Posted October 01, 2019

Oracle has identified a need for "augmented" analytics, leveraging machine learning and AI throughout the analytics process to help drive up the impact and value of data, and enable knowledge workers to uncover more insights. Recently, Bruno Aziza, group VP, Oracle Analytics, described this new phase in analytics, the role that cloud plays in making it possible, and what the capabilities will enable for customers.

Posted October 01, 2019

Radiant Advisors' John O'Brien identified four focus areas for data analytics—understanding customer behavior, understanding product usage, increasing operational efficiency, and business model innovation—in his closing keynote at Data Summit 2019.

Posted September 03, 2019

Nearly every week of 2018 featured headlines of a new cyberattack on companies that people trust with their data, such as Marriott and Facebook. In years past, the biggest concern for companies was being hit with hefty fines, but now, they risk reputation damage if they breach compliance mandates and regulations when they are attacked.

Posted August 15, 2019

Streaming platforms allow individuals to see data in real time, enabling businesses to analyze data in motion, simplify the development of applications, and extend the value of existing systems by integrating with already-implemented applications, while supporting both structured and unstructured data.
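As a rough illustration (not tied to any particular streaming product), analyzing data in motion can be sketched in a few lines of Python: events are consumed in small time-ordered chunks as they arrive, and each chunk is analyzed immediately rather than after the fact. The sensor readings and chunk size here are invented for the example.

```python
# Toy sketch of stream-style processing: analyze events in small
# chunks as they arrive instead of in one after-the-fact batch.
# The event shape and chunk size are invented for illustration.

events = [("sensor-1", 20.5), ("sensor-2", 21.0),
          ("sensor-1", 22.5), ("sensor-2", 19.0)]

def chunks(stream, size):
    """Yield successive fixed-size chunks from the event stream."""
    for i in range(0, len(stream), size):
        yield stream[i:i + size]

# Compute the average reading of each chunk as it "arrives".
averages = [sum(v for _, v in c) / len(c) for c in chunks(events, 2)]
print(averages)  # [20.75, 20.75]
```

A real streaming platform would add continuous ingestion, windowing by time rather than count, and fault tolerance; the shape of the computation is the same.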

Posted August 14, 2019

A relational database is a set of formally described tables from which data can be accessed or reassembled in many different ways without having to reorganize the database tables. The standard user and application programming interface (API) of a relational database is the Structured Query Language (SQL). SQL statements are used both for interactive queries for information from a relational database and for gathering data for reports.
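The reassemble-without-reorganizing idea can be shown with a minimal, hypothetical schema using Python's built-in sqlite3 module; the table and column names are invented for illustration.

```python
import sqlite3

# Two formally described tables; the data in them can be reassembled
# in many ways via SQL without reorganizing the tables themselves.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "customer_id INTEGER, total REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?)",
                [(1, "Acme"), (2, "Globex")])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1, 250.0), (2, 1, 100.0), (3, 2, 75.0)])

# One SQL statement joins and aggregates across both tables,
# e.g., to gather data for a per-customer sales report.
cur.execute("""
    SELECT c.name, SUM(o.total)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""")
rows = cur.fetchall()
print(rows)  # [('Acme', 350.0), ('Globex', 75.0)]
```

The same two tables could just as easily serve an interactive lookup of a single order, which is the flexibility the definition above describes.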

Posted August 14, 2019

Look to NoSQL for fast, highly scalable access to free-form data. This comes at a cost: giving up consistent reads and other safeguards common to SQL databases. But for many applications, those safeguards may well be worth trading for what NoSQL offers.

Posted August 14, 2019

Highly customized, mission-critical applications have been built on MultiValue database technology, sometimes called the fifth NoSQL database technology, for many years now. The MultiValue database dates back to the mid-1960s, with Don Nelson and Dick Pick widely credited as the founding fathers of the technology. Also referred to as Pick or MultiDimensional, MultiValue has a key advantage in its database structure's use of attributes that can hold multiple values, rather than one single value as with relational technology.
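The difference can be illustrated with a hypothetical customer record in Python: a MultiValue-style attribute holds several values at once, whereas single-valued (first-normal-form) relational modeling pushes those values into a child table, one row each. The record layout is invented for the example.

```python
# MultiValue-style record: the "phone" attribute itself holds
# multiple values (layout invented for illustration).
mv_record = {
    "id": "CUST001",
    "name": "Acme",
    "phone": ["555-0100", "555-0101"],  # one attribute, two values
}

# Relational (single-valued) modeling of the same data requires a
# separate child table with one row per phone number.
relational_rows = [
    {"customer_id": mv_record["id"], "phone": p}
    for p in mv_record["phone"]
]
print(len(relational_rows))  # 2
```

Keeping the related values inside the record is what lets MultiValue applications read a whole business entity in one access.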

Posted August 14, 2019

The Internet of Things (IoT) is the inter-networking of physical devices, vehicles, buildings, and other items (also referred to as "connected devices" and "smart devices") embedded with electronics, software, sensors, actuators, and network connectivity, which enable these objects to collect and exchange data.

Posted August 14, 2019

In-memory databases and technologies enable decision makers to get to the information they are seeking more rapidly and readily. While in-memory technology has been on the market for many years, today's demand for intelligent, interactive experiences requires back-end systems and applications that operate at high performance and move and deliver data faster than ever before.

Posted August 14, 2019

After more than 15 years, there is still probably no technology more aligned with the advent of big data than Hadoop. The Apache Hadoop framework allows for the distributed processing of large datasets across compute clusters, enabling scaling up from single commodity servers to thousands of machines for local computing and storage. Designed to detect and handle failures at the application layer, the framework supports high availability.
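The map/shuffle/reduce pattern that Hadoop distributes across a cluster can be illustrated with the classic word-count task, run here in a single Python process; nothing Hadoop-specific is used, and the input documents are invented.

```python
from collections import defaultdict

documents = ["big data needs big clusters",
             "hadoop processes big data"]

# Map: emit (word, 1) pairs from each input split.
mapped = [(word, 1) for doc in documents for word in doc.split()]

# Shuffle: group the pairs by key, as the framework does
# between the map and reduce phases.
grouped = defaultdict(list)
for word, one in mapped:
    grouped[word].append(one)

# Reduce: sum the grouped counts for each word.
counts = {word: sum(ones) for word, ones in grouped.items()}
print(counts["big"], counts["data"])  # 3 2
```

In Hadoop, the map and reduce steps run in parallel on many machines and the shuffle moves data between them; the logic of the computation is the same.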

Posted August 14, 2019

Today's database administration solutions help to improve DBA productivity while simplifying repetitive administrative tasks, helping to locate and alleviate performance bottlenecks, and optimizing code.

Posted August 14, 2019

With enterprises juggling more than ever—from massive data volumes and multiple database platforms to DevOps and the cloud—database monitoring is more important than ever. Companies can't afford any database downtime.

Posted August 14, 2019

Social media, the Internet of Things, demands for mobile access, and real-time insights are just some of the factors that have increased the pressure on organizations to change how data is managed. And as a result, there have never been so many data management choices for dealing with it all.

Posted August 14, 2019

For database development teams, maximizing competence, performance, adaptability, and readiness will help simplify development and allow automation to achieve repeatable processes, all while avoiding potential risks that create downtime. Companies want to generate queries and reports, perform SQL development and optimization, detect and diagnose database problems, automate administration tasks, and more.

Posted August 14, 2019

With long downtimes simply unacceptable today, organizations seek solutions with capabilities such as the ability to manage and monitor backups seamlessly, ensure data integrity, scale efficiently, restore quickly to any point in time, and provide security features to stay in compliance with local geographic and industry mandates.

Posted August 14, 2019

Data visualization tools have evolved beyond the standard charts and graphs used in Excel spreadsheets, displaying data in more sophisticated ways such as infographics, dials and gauges, geographic maps, sparklines, heat maps, and detailed bar, pie and fever charts. The images may include interactive capabilities, enabling users to manipulate them or drill into the data for querying and analysis.

Posted August 14, 2019

With data virtualization, the business and IT sides of organizations can work more closely together in a much more agile fashion, reducing complexity and boosting productivity. Data virtualization enables artificial intelligence/machine learning and data science initiatives by delivering all the available data to algorithms in real time, said Ravi Shankar, chief marketing officer, Denodo.

Posted August 14, 2019

According to an IDC report, the Global Datasphere will grow from 33 zettabytes (ZB) in 2018 to 175 ZB by 2025. "To keep up with the storage demands stemming from all this data creation, IDC forecasts that over 22ZB of storage capacity must ship across all media types from 2018 to 2025."

Posted August 14, 2019

Increasingly stringent data privacy regulations, along with a generally lower tolerance for data mishandling, are making companies even more concerned about improving their data security postures and thwarting cyber risk.

Posted August 14, 2019
