Trends and Applications



Data center downtime and outages are undoubtedly costly—with 40% costing between $100,000 and $1 million each, according to a 2020 Uptime Institute report. Ironically, data center management hasn't always been driven by much data. The good news is that data center failures are largely avoidable. The bad news is that most operators aren't employing the data center analytics solutions needed to proactively predict and prevent downtime.

Posted March 11, 2022

In the world of DataOps, the dual mandate of data democratization and privacy, security, and compliance is pushed front and center. The speed and simplicity of cloud analytics adoption are opening a new world of risk for corporate data governance teams mandated with ensuring organizations remain compliant in an increasingly complex world of global regulations. In short, managing data and access governance in a traditional top-down, IT-controlled manner, while managing data, analytical, and visualization technologies via a decentralized, agile DataOps framework, is a recipe for disaster.

Posted March 11, 2022

Anyone investing in an SAP ERP solution running on Oracle Linux is clearly serious about ensuring that their critical line of business systems run smoothly, predictably, and efficiently. It should go without saying that those operations will be consistently accessible and run without unexpected interruption. Saying nothing, though, would be a mistake, for there's nothing magical about Oracle Linux that inherently guarantees high availability (HA) for an SAP landscape. Without proper planning and an infrastructure configured for HA, that big investment is vulnerable.

Posted March 11, 2022

Junk data is any data that is not governed. It starts to accumulate when individuals copy data from a larger dataset for a particular use case, make changes to it, and then do not integrate those changes back into the larger set. By investing in data integrity upfront, a company can ensure that its data assets are high-quality, secure, and appropriately available to the business, ultimately saving time and money.

Posted March 11, 2022

Major technology trends are reshaping the DBA role at many organizations. The size and complexity of database environments continue to grow with higher data volumes, more workloads, and an increasing rate of database deployments that need to be managed. To help IT decision makers and database professionals tackle these changes, challenges, and opportunities, DBTA recently held a special roundtable webinar with Kathryn Sizemore, senior MySQL/MariaDB database administrator at Datavail, and Devin Gallagher, senior sales engineer at IDERA.

Posted March 11, 2022

IBM Security has released its annual X-Force Threat Intelligence Index unveiling how ransomware and vulnerability exploitations together were able to hurt businesses in 2021, further burdening global supply chains, with manufacturing emerging as the most targeted industry. IBM Security X-Force said that while phishing was the most common cause of cyberattacks in general in the past year, it observed a 33% increase in attacks caused by vulnerability exploitation of unpatched software.

Posted March 11, 2022

Each year, Data Summit features industry-leading experts covering the topics that matter most for data professionals who want to stay on top of the latest technologies and strategies. The conference program is now available for review, and a variety of pass options are being offered, including special pricing for attendees who register early.

Posted March 11, 2022

The costs of downtime—even for a minute—are simply too steep for today's digitally evolving enterprises to tolerate. As part of their efforts to keep expensive downtime at bay—and ensure the continued viability and availability of data—data managers are increasingly turning to strategies such as automation and cloud services. Still, they continue to have difficulties and acknowledge that keeping their data environments up-to-date is holding them back from delivering more capabilities to their organizations.

Posted February 08, 2022

Data management has never been so unfettered—and yet so complicated at the same time. An emerging generation of tools and platforms is helping enterprises to get more value from their data than ever. These solutions now support and automate a large swath of structural activities, from data ingestion to storage, and also enhance business-focused operations such as advanced analytics, AI, machine learning, and continuous real-time intelligence.

Posted February 08, 2022

As business has become more digital, data has become the most valuable asset of many organizations. But protecting that data has also become much more complicated as organizations increasingly migrate it to a mix of public and private cloud infrastructures, such as Microsoft Azure, Amazon Web Services, and Google Cloud. With most businesses today operating in a multi-cloud environment, it's no longer possible to simply lock up precious data in the proverbial vault and guard the perimeter.

Posted February 08, 2022

As an industry, we've been talking about the promise of data lakes for more than a decade. It's a fantastic concept—to put an end to data silos with a single repository for big data analytics. Imagine having a singular place to house all your data for analytics to support product-led growth and business insight.

Posted February 08, 2022

Last year picked up where 2020 left off. Though pandemic restrictions have eased, 2022 looks to be another uncertain year with the rise of the omicron variant. As we begin 2022, DBTA presents the annual MultiValue Special Report and asks MV executives to address several questions.

Posted February 08, 2022

The internet and IoT benefit from AI, and AI is quickly shaping the world around us and becoming increasingly important within business operations. In fact, research by Deloitte shows that 73% of IT and line-of-business executives view AI as an indispensable part of their current business. It's clear that there is great potential for AI in virtually all areas of our lives. However, AI systems can only ever be as powerful as the information they are built on. Huge quantities of very specific data are needed to effectively train systems in the right way. Here, we'll explore the key points behind the data required and how it is being sourced.

Posted January 03, 2022

The pandemic of the past 2 years has proven to be a significant disruptor, bringing to light the many flaws in cybercrime preparedness across several industries. However, opportunity exists for government and business leaders to repair the cracks and prepare for the future. Underpinning good automation with best practices will ensure that organizations are set up for success against future attacks.

Posted January 03, 2022

Today, businesses want data to help them get deep, dynamic, and even real-time insights into their customers, markets, and supply chains, often via machine learning models. They want to make better predictions both of day-to-day realities and longer-term trends. They want data to help them create better, unique experiences for their customers. And they need fast access to data to support the rapid innovation that will let them thrive in the new hyper-competitive environment. Businesses can no longer afford to wait for the old data mills' wheels to turn.

Posted January 03, 2022

Database management system (DBMS) configuration tuning is an essential aspect of any data-intensive application effort. But it is historically difficult because DBMSs have hundreds of configuration "knobs" that control everything in the system, such as the amount of memory to use for caches and how often the DBMS writes data to storage.
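As a concrete illustration of the knob-tuning problem, here is a minimal sketch of one common heuristic: sizing an InnoDB-style buffer-pool cache as a fraction of physical RAM. The 0.7 ratio and the `innodb_buffer_pool_size` knob name mirror common MySQL guidance, but treat both as illustrative assumptions rather than a substitute for workload-driven tuning.

```python
# Sketch: derive a value for one DBMS cache knob from host RAM,
# assuming a dedicated database server with headroom left for the
# OS and client connections. The 0.7 ratio is an assumed default.

def suggest_buffer_pool_bytes(total_ram_bytes: int, ratio: float = 0.7) -> int:
    """Suggest a cache size as `ratio` of RAM, rounded down to 1 MiB."""
    if not 0 < ratio < 1:
        raise ValueError("ratio must be between 0 and 1")
    mib = 1024 * 1024
    return (int(total_ram_bytes * ratio) // mib) * mib

# Example: a host with 16 GiB of RAM
suggestion = suggest_buffer_pool_bytes(16 * 1024**3)
print(f"innodb_buffer_pool_size = {suggestion // (1024**2)} MiB")
```

Multiplying a heuristic like this across hundreds of interdependent knobs is exactly why automated configuration tuning has become an active area of work.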

Posted January 03, 2022

In the database world, as well as elsewhere, high availability (HA) and disaster recovery (DR) are sometimes confused—or even considered to be the same thing. HA is the ability of a database and its associated services to operate continuously without failing and to deliver an agreed-upon level of operational uptime, per its service-level agreement (SLA), whereas DR is the ability to recover data/databases and maintain/regain services after an outage event, or a natural or manmade catastrophe. Ensuring just one or the other does not equate to cyber-resilience. You really must have both.
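The uptime side of an SLA is simple arithmetic: an availability percentage implies a downtime budget. The helper below sketches that conversion; the 365.25-day year is an assumption, since real SLAs define their own measurement period.

```python
# Sketch: convert an uptime SLA percentage into the downtime it
# permits per period -- the arithmetic behind "nines" of availability.

def max_downtime_seconds(availability_pct: float, period_days: float = 365.25) -> float:
    """Seconds of allowed downtime per period at a given availability %."""
    if not 0 <= availability_pct <= 100:
        raise ValueError("availability must be between 0 and 100")
    period_seconds = period_days * 24 * 60 * 60
    return period_seconds * (1 - availability_pct / 100)

# "Three nines" allows nearly 9 hours of downtime a year;
# "five nines" allows only about 5 minutes.
for pct in (99.9, 99.99, 99.999):
    print(f"{pct}% -> {max_downtime_seconds(pct) / 60:.1f} min/year")
```

DR, by contrast, is measured in recovery-time and recovery-point objectives after the budget is already blown, which is why the two disciplines complement rather than replace each other.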

Posted January 03, 2022

Hazelcast provides a streaming and memory-first application platform for stateful, data-intensive workloads on-prem, at the edge, or as a fully managed cloud service. The company recently announced version 5.0 of the Hazelcast Platform, in which the Hazelcast IMDG and Hazelcast Jet products have been merged into a single product, not only to simplify the development of applications using Hazelcast but also to put more focus on developing real-time applications that will engage with end users in new ways. Hazelcast CEO Kelly Herrell shared more information about the new release as well as the company's future road map.

Posted January 03, 2022

Many tools and platforms have evolved to solve niche problem areas, but they remain fairly disconnected, making it harder for data leaders to decide which tools and platforms are needed to support end-to-end business needs. In simple terms, XOps can be broken down into "X," relating to data, infrastructure, business intelligence (BI), and machine learning (ML) models, and "Ops," the automation via code. The individual components have existed for years, but the difference now is that they are interconnected to drive agility and innovation by removing silos.

Posted December 08, 2021

You've decided to go with ELK to centralize and manage your logs. Wise decision. But before you go ahead and install Elasticsearch, Logstash, Kibana, and the different Beats, there is one crucial question that you need to answer: Are you going to run the stack on your own, or are you going to opt for a cloud-hosted solution?

Posted December 08, 2021

The past 2 years have been defining ones for enterprises seeking to become data-driven. There have been changes wrought by COVID-19, of course, but, even before the pandemic, companies were already on a path to better leverage the data that was streaming in from all corners of their organizations. With this heightened focus, new roles have been emerging for the caretakers of data, including database administrators, data engineers, data analysts, data scientists, and developers.

Posted December 08, 2021

Becoming a data-driven enterprise isn't just a lot of analyst hyperbole. It means the ability to deliver tangible results, from successfully launching new products to achieving increased productivity. A recent study of 1,250 executives, conducted by the Enterprise Strategy Group and Splunk, reveals that data leaders—those organizations that are excelling at data classification, aggregation, quality measures, investigation skills, and monitoring—are seeing results in their bottom lines and market positions. At the same time, the survey shows, all organizations still lag in moving forward with data aggregation, classification, and monitoring.

Posted December 08, 2021

The importance of leveraging data quickly and effectively is a message that has come through loud and clear in recent years—and with increasing intensity since the onset of the COVID-19 pandemic. Whether it is anticipating supply chain problems, addressing customer concerns with agility, or identifying new opportunities and pouncing quickly, the ability to achieve a comprehensive view of all available information for real-time decision making has become a strong theme. To help make the process of identifying useful products and services easier, here, DBTA presents a list of Trend-Setting Products for 2022.

Posted December 08, 2021

The value of analytics is fairly easy to realize when implemented correctly. As businesses start to see increased operational efficiency, more successful marketing campaigns, improved customer retention, and many other benefits, adoption will continue to improve and analytics will soon be driving the decision making of the entire organization.

Posted November 01, 2021

We can count on seeing more cyberattacks, and there are two reasons for that. First, as AI becomes more ubiquitous and embedded in our everyday lives, it presents yet another threat surface hackers will seek to exploit to wreak havoc at scale. More AI, more attacks. Second, the Silicon Valley "move fast and break things" model has meant that, historically, teams have pushed a lot of code into production without putting security guard rails in place. The same thing is now happening with AI. We're so focused on getting novel capabilities into production that we prioritize first-to-market movements over security, leaving the door open for exploitation.

Posted November 01, 2021

Cloud migration benefits center primarily around increased efficiency. The ideal results of a successful cloud migration yield a more intelligent allocation of resources that allows companies to focus on their core competencies. Within that, successful migrations reduce IT costs, improve the end-user experience and increase scalability and security. Once adopted, cloud computing users can reduce capital expenditures and decrease maintenance and operational costs—ensuring maximum results with minimal expenses. However, not every migration is successful, and the process can be complex, leading to mistakes that can hinder the journey and incur negative ramifications.

Posted November 01, 2021

Data governance used to be relatively simple to define; it originally required handling data quality, metadata management, discovery, and classification. But traditional data governance has its limits—it doesn't take security into account and often leaves companies, customers, and data at risk. Companies struggle to protect access to sensitive data—in fact, this is the leading cause of cyberattacks almost every year.

Posted November 01, 2021

Organizations focus a majority of their database migration efforts on a single task: synchronizing data from production to their new target database. The migration goal is to have a perfect copy of the production data in the replacement database so that the cutover can occur within as small a maintenance window as possible. While data migration is a critical step in the overall migration project plan, it shouldn't consume the majority of an organization's resources. There are four key areas to consider when planning a migration from MongoDB to Amazon DocumentDB (with MongoDB compatibility).

Posted October 05, 2021

October is "National Cyber Security Awareness Month," putting the spotlight on the need to keep data safe. In particular, as businesses continue to invest in SaaS-based solutions, they must rethink their risk management strategies to prioritize protecting one of their most important assets: SaaS app data. The transition to cloud and the adoption of SaaS-based applications is not a new phenomenon, but the pandemic clearly accelerated the shift. Notably, cloud spending increased 37% to $29 billion during the first quarter of 2020 alone, despite an expected 8% decline in overall IT spending. With hybrid and remote working models now becoming the norm, this reliance on cloud and SaaS will surely continue as organizations look for scalable and cost-effective ways to provide employees with anytime, anywhere access to information.

Posted October 05, 2021

What types of platforms are most viable for modern data analytics requirements? These days, there are a wide variety of choices available to enterprises, including data lakes, warehouses, lakehouses, and other options—resident within an on-site data center or accessed via the cloud. The options are boundless. It's a matter of finding the best fit for the business task at hand.

Posted October 05, 2021

AI, machine learning, and edge computing may be all around us, and these technology endeavors all have one important thing in common: Their success depends on the quality of the data fed into them. Data managers recognize that data quality efforts must be improved to meet these new demands, and they are concerned about the quality of the data moving through their enterprises. Eight in 10 organizations' data quality efforts are lagging or problematic. These are among the findings of a new survey of 238 data managers conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Melissa.

Posted October 05, 2021

Today, a data warehouse is used to do more than just integrate data from multiple sources for better, more accurate analysis and reporting. A data warehouse must also be reliable, traceable, secure, and efficient at the same time. It needs to offer these advantages to differentiate itself, especially in business intelligence. This is where good data warehouse governance becomes very important. There are several enterprise data warehouse best practices and governance tips to keep in mind, along with key principles to implement.

Posted October 05, 2021

When it comes to the debate concerning the pros and cons of both all-in-one platforms and best-of-breed systems, there was some contention in the past over which solution reigned supreme. However, in today's modern technological landscape where businesses across industries have completely revamped the way they structure and utilize their IT products and needs, this is no longer the case.

Posted September 16, 2021

Last year, organizations around the world, across all industries, were forced to leverage new technologies on multiple fronts to accommodate a new normal. The adoption of AI and machine learning saw exponential growth to bring about the changes needed to keep up with the shift to remote working. AI and ML technologies found their way into everything from advanced medical diagnostic systems to quantum computing systems, and from virtual assistants to smart homes. According to Algorithmia's 2021 Enterprise Trends in Machine Learning report, 50% of enterprises plan to spend more on AI and ML in 2021, with 20% saying they will be significantly increasing their budgets.

Posted September 16, 2021

Document-oriented databases are one of the fastest growing categories of NoSQL databases, and the primary reason is the flexibility of schema or logic design. Document databases make it easier for developers to store and query data in a database by using the same document-model format they use in their application code. The flexible, semi-structured, and hierarchical nature of documents and document databases allows them to evolve with applications' needs.
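The "same document-model format as application code" idea can be shown with a toy sketch: application code stores plain dicts and retrieves them with a query-by-example dict, so the storage model matches the code's own data model. This is an illustrative in-memory stand-in, not a real database; products such as MongoDB add indexing, persistence, and far richer query operators on top of this pattern.

```python
# Toy sketch of the document-database pattern: documents are plain
# dicts, and queries are dicts matched field-by-field (query by example).

class DocumentStore:
    def __init__(self):
        self._docs = []

    def insert(self, doc: dict) -> None:
        self._docs.append(doc)

    def find(self, query: dict) -> list:
        """Return documents whose fields equal every key/value in `query`."""
        return [d for d in self._docs
                if all(d.get(k) == v for k, v in query.items())]

store = DocumentStore()
store.insert({"name": "Ada", "role": "engineer", "skills": ["python", "sql"]})
store.insert({"name": "Grace", "role": "admiral"})

print(store.find({"role": "engineer"}))  # matches the Ada document
```

Note how the second document simply omits the `skills` field: that schema flexibility, trivial here, is the property that lets document databases evolve with an application's needs.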

Posted September 16, 2021

A solid data protection strategy is required for optimum knowledge management that safeguards critical company content. Recognizing that today more than ever, an organization's data can always be threatened by cybercrime, lost by accident, or damaged through natural disasters, off-site data protection becomes an indispensable component of every enterprise's data protection strategy.

Posted September 16, 2021

A multi-cloud approach can provide a number of benefits including avoiding vendor lock-in, optimizing cost performance, and increasing reliability by distributing resources in the event of an IT disaster. These advantages have many organizations racing to adopt multi-cloud as the solution to their infrastructure needs. However, an unsuccessful transition can cost businesses significant time, resources, and money. Moving to multi-cloud requires deep planning and knowledge of each ecosystem to ensure a smooth transition.

Posted September 16, 2021

To meet the needs of the digital economy of the 2020s, data architecture has evolved into a different animal than it was 10, or even 5, years ago. Most notably, there are three trends that have changed the way enterprises look at and design their data architectures.

Posted September 07, 2021

The past 18 months have put the need for data-driven insight and agility into sharp focus for many organizations. With more data than ever being created, making the right choices among the myriad options for data management and analytics is a top priority for many organizations. To help them progress along their data-driven journeys, each year, DBTA presents the Readers' Choice Awards, which provide the opportunity to recognize companies whose products have been selected by the experts—our readers.

Posted August 11, 2021

We all know that in uncertain times, a forecast underlies a company's success or failure. Forecasts keep prices low by optimizing business operations—including cash flow, production, staff, and financial management—while increasing knowledge of the market. Business forecasting gives you an essential tool for adapting to change and fostering competitive advantage. But relevant forecasting isn't easy.

Posted August 02, 2021


One of the challenges of working with Hadoop environments has been maintaining the infrastructure for big data projects. That's where cloud makes things easier and, increasingly, has served as the underlying infrastructure platform of choice for Hadoop initiatives. At the same time, not everything has moved to the cloud just yet for big data environments. Many IT managers expect to live in a hybrid environment. They are planning for multi-cloud data management to deliver business value and are also still relying on old-school approaches and manual tools to support their data environments.

Posted August 02, 2021

Over the last few years it has been fascinating to see how organizations evolve their data management and application development environments. Traditionally, companies depended on monolithic architectures that initially served them well; however, today's on-demand business environment calls for a model that can support a more flexible, microservices-driven approach and facilitate the pace of innovation. Why is this trend toward microservices and a DevOps approach becoming so pervasive? The answer relates to a higher-level trend: the push towards on-demand IT, as the focus has shifted to prioritize the developer experience and the specific technologies developers are using.

Posted August 02, 2021

Let's start by admitting that the title of this article is a tease. It's a valid question and one that thinking people ask all the time. But in truth it's not the first question you should be asking. More importantly, the answer to the question really depends on how you answer the questions that you should be asking first. Here are the questions to ask.

Posted August 02, 2021

Latency is the new outage: It's a phrase that's being thrown around the networking industry—a mantra if you will. It used to be that experience was measured in uptime and downtime. Is Google down right now? Is AWS experiencing an outage in Asia? Can anybody log on to Salesforce? Those days are over. Being up and running is not enough. Applications need to be fast, and the expectation is that every retailer—from Target to the mom-and-pop store down the street—needs to have a digital omni-channel presence with experiences that rival Amazon—a nearly half-a-trillion-dollar global brand.

Posted July 15, 2021

The business uses for cognitive computing and related technologies, including AI, machine learning, natural language processing, and robotic process automation, are becoming more widespread than ever. To help readers gain a greater understanding about this emerging area of information technology, the solutions available, and their role in handling real-world challenges, DBTA and Big Data Quarterly present the list of Cool Companies in Cognitive Computing.

Posted July 15, 2021

Today, data is driving business success. Downtime is a killer and even slow response time can be hazardous to business health. But how do organizations achieve the performance and speed that is required today amid growing complexity and rapid data growth?

Posted July 15, 2021

By taking a proactive stance on the importance of rapid decision making, flexibility, and scalability today, you can future-proof your applications—and operations—by building on top of a data platform that's designed for the modern world and capable of data ingestion to action in under 10 milliseconds. That's the new demand and the new reality.

Posted July 15, 2021

We're still at the start of the 2020s, and already, things look very different from the preceding decade. For data executives and professionals, the years ahead may mean change on a scale never seen before in the IT industry. Promising new technologies—as well as redesigned and repurposed older ones—are reshaping the data center and analytics shops in new and exciting ways. We asked industry leaders for their views on what is enhancing the ability of enterprises to compete on data.

Posted June 10, 2021

With cloud-based enterprise software becoming the norm, set against a backdrop of businesses having to adapt their processes and operations in a highly volatile economic environment, Kubernetes-supported software deployments are a perfect fit. Those C-level executives responsible for business change and development should not ignore the power of Kubernetes and its ability to ensure their business has the digital foundation to adapt in a safe, secure, and efficient manner for years to come.

Posted June 10, 2021

