Trends and Applications



Data fabric is an emerging technology: a modern distributed data framework encompassing architecture as well as data management and integration software, designed to help organizations manage their data. Data fabrics can help get more—and more relevant—data into workflows faster for business intelligence and reporting. Much the way businesses use IT architectures to manage and maintain IT assets, data fabrics help businesses manage and get value from data.

Posted June 02, 2022

Broadcom Inc., a global technology provider that designs, develops, and supplies semiconductor and infrastructure software solutions, announced it is acquiring VMware, Inc., an innovator in enterprise software. Broadcom will acquire all of the outstanding shares of VMware in a cash-and-stock transaction that values VMware at approximately $61 billion, based on the closing price of Broadcom common stock on May 25, 2022. In addition, Broadcom will assume $8 billion of VMware net debt.

Posted June 02, 2022

Companies now collect more data than ever before, but challenges remain in accessing and analyzing it. David Armlin, VP, solution architect and customer success, ChaosSearch, discussed "Learn, Unlearn, Relearn: Embracing the Future of Cloud Analytics" during his Data Summit 2022 session.

Posted June 02, 2022

There are so many new buzzwords lately, including the data lakehouse, data mesh, and data fabric, just to name a few. But what do all these terms mean, and how do they compare to a data warehouse? This presentation covers all of them in detail and explains the pros and cons of each, with suggested use cases so attendees can see what approach will really work best for their big data needs.

Posted June 02, 2022

Data is one of the most valuable assets your company owns. Database management, however, is undifferentiated heavy lifting that incurs high operational costs. Database management tasks, such as applying patches and handling backups and failovers, are not where you provide value to your customers.

Posted May 04, 2022

The volume, velocity, and veracity of today's data deluge have put immense pressure on underlying data platforms and organizations' abilities to manage them effectively. And the pandemic has only exacerbated the problem. According to a 2021 survey, nearly half of digital architects are under high or extremely high pressure to deliver digital projects, but 61% blame legacy technology for making it difficult to complete modernization efforts. As a result, databases of all types—SQL, NoSQL, or NewSQL—be they on-prem, cloud, hybrid, or edge, are struggling to navigate this new reality.

Posted May 04, 2022

No other subject seems to capture the attention of IT leaders right now like database migrations. If there were an IT theme for 2022, it would be: Enterprises migrate from legacy data warehouses to the cloud. And it is no longer just the "early adopters" but the entire customer base that is looking to make the move to cloud-based systems. Let's examine the three most common problems that hamper the execution of migration projects and what can be done to avert migration disasters.

Posted May 04, 2022

PostgreSQL compatibility is a no-brainer for any modern database offering a unique feature set. Every modern database vendor is now offering some level of compatibility to reach developers and increase adoption. Recently, Google Spanner announced compatibility, saying, "PostgreSQL has emerged as the ‘API' for operational databases." But, even if a database claims compatibility, it can be difficult to decipher what that means in relation to another database, as not all "compatibility" is created equal.

Posted May 04, 2022

Your database environment is getting more complex. If that sounds like old hat, that's because it is. For years, articles like this have warned of the dangers of increasingly convoluted database environments. They've discussed how database administrators (DBAs) are now tasked with managing thousands of databases and supporting technologies. They've lamented increasing fragmentation and workflow-stymying bottlenecks, which have, among other things, significantly hampered developer velocity. And they've explained that opting for specialized databases and tools for every need can quickly spiral into vendor lock-in, swelling support contracts, and surrendered control over your data.

Posted May 04, 2022

The 9th annual Data Summit conference will be held May 17-18, 2022, at the Hyatt Regency Boston. Pre-conference workshops will take place on May 16, 2022. The program is available for review and a variety of pass options are available to suit individual requirements.

Posted May 04, 2022

It's no secret that businesses have become more data-driven in recent years, with powerful big data analytics increasingly delivering on the promise of adding business value. As a result, the quality and enrichment of data have become top priorities for business leaders on a global scale. In fact, according to a recent survey of chief data officers, 88% have started building automation into their data management processes to help manage data quality.

Posted April 07, 2022

It is well known that a database is the fundamental building block for any data-based initiative. Databases are used when collecting, storing, processing, and analyzing data. A database is the silent component that drives business decisions and operational improvements or simply keeps track of inventory. As much as the database should be the almost invisible part of these processes, it is crucial to make the right choice. While it might look easy to select a suitable database, there are a few things to evaluate when making a decision.

Posted April 07, 2022

As everyone pushes for real-time analytics, more responsive online services, and more protection against cybercrime, data resiliency has moved front and center. Put simply, data must be available at all times. This requires a shift in conventional thinking toward data resiliency strategies in recognition of the fact that it is no longer a technical issue; it's a business issue.

Posted April 07, 2022

Enterprises are just scratching the surface of data-driven opportunities, and many simply aren't ready to leverage their data assets to lead their markets. There are concerns about the security of sharing data between organizations, as well as identifying and building platforms to accomplish a data-driven infrastructure. These concerns may abate as data-driven partner ecosystems and benefits develop.

Posted April 07, 2022

Anyone investing in an SAP ERP solution running on Oracle Linux is clearly serious about ensuring that their critical line of business systems run smoothly, predictably, and efficiently. It should go without saying that those operations will be consistently accessible and run without unexpected interruption. Saying nothing, though, would be a mistake, for there's nothing magical about Oracle Linux that inherently guarantees high availability (HA) for an SAP landscape. Without proper planning and an infrastructure configured for HA, that big investment is vulnerable.

Posted March 16, 2022

Data center downtime and outages are undoubtedly costly—with 40% costing between $100,000 and $1 million each according to a 2020 Uptime Institute report. Ironically, data center management hasn't always been driven by much data. The good news is data center failures are largely avoidable. The bad news is most operators aren't employing the data center analytics solutions needed to proactively predict and prevent downtime. 

Posted March 11, 2022

In the world of DataOps, the dual mandate of data democratization and the need for privacy, security, and compliance are pushed front and center. The speed and simplicity of cloud analytics adoption is opening a new world of risk for corporate data governance teams mandated with ensuring organizations remain compliant in an increasingly complex world of global regulations. In short, managing data and access governance in a traditional top-down, IT-controlled manner, while managing data, analytical, and visualization technologies via a decentralized, agile DataOps framework, is a recipe for disaster.

Posted March 11, 2022

Junk data is any data that is not governed. Junk data starts to accumulate when individuals make copies of data from a larger dataset for a particular use case, make changes to it, and then do not integrate those changes into the larger set. By taking the time to invest in data integrity upfront, a company can ensure that its data assets are high-quality, secure, and appropriately available to the business, ultimately saving time and money.

Posted March 11, 2022

Major technology trends are reshaping the DBA role at many organizations. The size and complexity of database environments continues to grow with higher data volumes, more workloads, and an increasing rate of database deployments that need to be managed. To help IT decision makers and database professionals tackle these changes, challenges, and opportunities, DBTA recently held a special roundtable webinar with Kathryn Sizemore, senior MySQL/MariaDB database administrator, Datavail and Devin Gallagher, senior sales engineer, IDERA.

Posted March 11, 2022

IBM Security has released its annual X-Force Threat Intelligence Index, unveiling how ransomware and vulnerability exploitations together were able to hurt businesses in 2021, further burdening global supply chains, with manufacturing emerging as the most targeted industry. IBM Security X-Force said that while phishing was the most common cause of cyberattacks in general in the past year, it observed a 33% increase in attacks caused by vulnerability exploitation of unpatched software.

Posted March 11, 2022

Each year, Data Summit features industry-leading experts covering the topics that matter most for data professionals who want to stay on top of the latest technologies and strategies. The conference program is now available for review, and a variety of pass options are being offered, including special pricing for attendees who register early.

Posted March 11, 2022

The costs of downtime—even for a minute—are simply too steep for today's digitally evolving enterprises to tolerate. As part of their efforts to keep expensive downtime at bay—and ensure the continued viability and availability of data—data managers are increasingly turning to strategies such as automation and cloud services. Still, they continue to have difficulties and acknowledge that keeping their data environments up-to-date is holding them back from delivering more capabilities to their organizations.

Posted February 08, 2022

Data management has never been so unfettered—and yet so complicated at the same time. An emerging generation of tools and platforms is helping enterprises to get more value from their data than ever. These solutions now support and automate a large swath of structural activities, from data ingestion to storage, and also enhance business-focused operations such as advanced analytics, AI, machine learning, and continuous real-time intelligence.

Posted February 08, 2022

As business has become more digital, data has become the most valuable asset of many organizations. But protecting that data has also become much more complicated as organizations increasingly migrate it to a mix of public and private cloud infrastructures, such as Microsoft Azure, Amazon Web Services, and Google Cloud. With most businesses today operating in a multi-cloud environment, it's no longer possible to simply lock up precious data in the proverbial vault and guard the perimeter.

Posted February 08, 2022

As an industry, we've been talking about the promise of data lakes for more than a decade. It's a fantastic concept—to put an end to data silos with a single repository for big data analytics. Imagine having a singular place to house all your data for analytics to support product-led growth and business insight.

Posted February 08, 2022

Last year picked up where 2020 left off. Though pandemic restrictions have eased, 2022 looks to be another uncertain year with the rise of the Omicron variant. As we begin 2022, DBTA presents the annual MultiValue Special Report and asks MV executives to address several questions.

Posted February 08, 2022

The internet and IoT benefit from AI, and AI is quickly shaping the world around us and becoming increasingly important within business operations. In fact, research by Deloitte shows that 73% of IT and line-of-business executives view AI as an indispensable part of their current business. Clearly, there is great potential for AI in virtually all areas of our lives. However, AI systems can only ever be as powerful as the information they are built on. Huge quantities of very specific data are needed to effectively train systems in the right way. Here, we'll explore the key points behind the data required and how it is being sourced.

Posted January 03, 2022

The pandemic of the past 2 years has proven to be a significant disruptor, bringing to light the many flaws in cybercrime preparedness across several industries. However, opportunity exists for government and business leaders to repair the cracks and prepare for the future. Underpinning good automation with best practices will ensure that organizations are set up for success against future attacks.

Posted January 03, 2022

Today, businesses want data to help them get deep, dynamic, and even real-time insights into their customers, markets, and supply chains, often via machine learning models. They want to make better predictions both of day-to-day realities and longer-term trends. They want data to help them create better, unique experiences for their customers. And they need fast access to data to support the rapid innovation that will let them thrive in the new hyper-competitive environment. Businesses can no longer afford to wait for the old data mills' wheels to turn.

Posted January 03, 2022

Database management system (DBMS) configuration tuning is an essential aspect of any data-intensive application effort. But it is historically difficult because DBMSs have hundreds of configuration "knobs" that control everything in the system, such as the amount of memory to use for caches and how often the DBMS writes data to storage.
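The scale of the tuning problem described above can be sketched with a few of the best-known knobs. The knob names below follow PostgreSQL conventions, and the percentage heuristics are common rules of thumb offered purely as illustrative assumptions, not recommendations from the article:

```python
# Illustrative sketch: deriving a few common PostgreSQL-style configuration
# knob values from total system memory. The 25% / 5% / 75% heuristics are
# widely cited starting points only; real tuning depends on the workload.

def suggest_knobs(total_ram_mb: int) -> dict:
    """Return suggested configuration values, all in megabytes."""
    return {
        "shared_buffers": total_ram_mb // 4,               # cache ~25% of RAM
        "maintenance_work_mem": total_ram_mb // 20,        # ~5% for maintenance tasks
        "effective_cache_size": (total_ram_mb * 3) // 4,   # planner hint, ~75% of RAM
    }

knobs = suggest_knobs(16384)  # a machine with 16 GB of RAM
print(knobs)  # → {'shared_buffers': 4096, 'maintenance_work_mem': 819, 'effective_cache_size': 12288}
```

With hundreds of such interdependent knobs per system, it is easy to see why static formulas like these quickly fall short and automated tuning becomes attractive.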

Posted January 03, 2022

In the database world, as well as elsewhere, high availability (HA) and disaster recovery (DR) are sometimes confused—or even considered to be the same thing. HA is the ability of a database and its associated services to operate continuously without failing and to meet an agreed-upon service-level agreement (SLA) for operational uptime, whereas DR is the ability to recover data and databases and maintain or regain services after an outage event, or a natural or man-made catastrophe. Ensuring just one or the other does not equate to cyber-resilience. You really must have both.

Posted January 03, 2022

Hazelcast provides a streaming and memory-first application platform for stateful, data-intensive workloads on-prem, at the edge, or as a fully managed cloud service. The company recently announced version 5.0 of the Hazelcast Platform, in which the Hazelcast IMDG and Hazelcast Jet products have been merged into a single product to not only simplify the development of applications using Hazelcast but also put more focus on developing real-time applications that will engage with end users in new ways. Hazelcast CEO Kelly Herrell shared more information about the new release as well as the company's future road map.

Posted January 03, 2022

As many tools and platforms have evolved to solve niche problem areas, they remain fairly disconnected, creating more confusion for data leaders deciding which tools and platforms are needed to support end-to-end business needs. In simple terms, XOps can be broken down into "X," which relates to data, infrastructure, business intelligence (BI), and machine learning (ML) models, and "Ops," the automation via code. The individual components have existed for years, but the difference now is that they are interconnected to drive agility and innovation by removing silos.

Posted December 08, 2021

You've decided to go with ELK to centralize and manage your logs. Wise decision. But before you go ahead and install Elasticsearch, Logstash, Kibana, and the different Beats, there is one crucial question that you need to answer: Are you going to run the stack on your own, or are you going to opt for a cloud-hosted solution?

Posted December 08, 2021

The past 2 years have been defining ones for enterprises seeking to become data-driven. There have been changes wrought by COVID-19, of course, but, even before the pandemic, companies were already on a path to better leverage the data that was streaming in from all corners of their organizations. With this heightened focus, new roles have been emerging for the caretakers of data, including database administrators, data engineers, data analysts, data scientists, and developers.

Posted December 08, 2021

Becoming a data-driven enterprise isn't just a lot of analyst hyperbole. It is the ability to deliver tangible results, from successfully launching new products to achieving increased productivity. A recent study of 1,250 executives, conducted by the Enterprise Strategy Group and Splunk, reveals that data leaders—those organizations that are excelling at data classification, aggregation, quality measures, investigation skills, and monitoring—are seeing results in their bottom lines and market positions. At the same time, the survey shows, all organizations still lag in moving forward with data aggregation, classification, and monitoring.

Posted December 08, 2021

The importance of leveraging data quickly and effectively is a message that has come through loud and clear in recent years—and with increasing intensity since the onset of the COVID-19 pandemic. Whether it is anticipating supply chain problems, addressing customer concerns with agility, or identifying new opportunities and pouncing quickly, the ability to achieve a comprehensive view of all available information for real-time decision making has become a strong theme. To help make the process of identifying useful products and services easier, here, DBTA presents a list of Trend-Setting Products for 2022.

Posted December 08, 2021

The value of analytics is fairly easy to realize when implemented correctly. As businesses start to see increased operational efficiency, more successful marketing campaigns, improved customer retention, and many other benefits, adoption will continue to improve and analytics will soon be driving the decision making of the entire organization.

Posted November 01, 2021

We can count on seeing more cyberattacks, and there are two reasons for that. First, as AI becomes more ubiquitous and embedded in our everyday lives, it presents yet another threat surface hackers will seek to exploit to wreak havoc at scale. More AI, more attacks. Second, the Silicon Valley "move fast and break things" model has meant that historically teams have pushed a lot of code into production without putting security guardrails in place. The same thing is now happening with AI. We're so focused on getting novel capabilities into production that we prioritize first-to-market movements over security, leaving the door open for exploitation.

Posted November 01, 2021

Cloud migration benefits center primarily around increased efficiency. The ideal results of a successful cloud migration yield a more intelligent allocation of resources that allows companies to focus on their core competencies. Within that, successful migrations reduce IT costs, improve the end-user experience and increase scalability and security. Once adopted, cloud computing users can reduce capital expenditures and decrease maintenance and operational costs—ensuring maximum results with minimal expenses. However, not every migration is successful, and the process can be complex, leading to mistakes that can hinder the journey and incur negative ramifications.

Posted November 01, 2021

Data governance used to be relatively simple to define; it originally required handling data quality, metadata management, discovery, and classification. But traditional data governance has its limits—it doesn't take security into account and often leaves companies, customers and data at risk. Companies struggle to protect access to sensitive data—in fact, this is the leading cause of cyberattacks almost every year.

Posted November 01, 2021

Organizations focus a majority of their database migration efforts on a single task: synchronizing data from production to their new target database. The migration goal is to have a perfect copy of the production data in the replacement database so that the cutover can happen in as small a maintenance window as possible. While data migration is a critical step in the overall migration project plan, it shouldn't consume the majority of an organization's resources. There are four key areas to consider when planning a migration from MongoDB to Amazon DocumentDB (with MongoDB compatibility).

Posted October 05, 2021

October is "National Cyber Security Awareness Month," putting the spotlight on the need to keep data safe. In particular, as businesses continue to invest in SaaS-based solutions, they must rethink their risk management strategies to prioritize protecting one of their most important assets: SaaS app data. The transition to cloud and the adoption of SaaS-based applications is not a new phenomenon, but the pandemic clearly accelerated the shift. Notably, cloud spending increased 37% to $29 billion during the first quarter of 2020 alone, despite an expected 8% decline in overall IT spending. With hybrid and remote working models now becoming the norm, this reliance on cloud and SaaS will surely continue as organizations look for scalable and cost-effective ways to provide employees with anytime, anywhere access to information.

Posted October 05, 2021

What types of platforms are most viable for modern data analytics requirements? These days, there are a wide variety of choices available to enterprises, including data lakes, warehouses, lakehouses, and other options—resident within an on-site data center or accessed via the cloud. The options are boundless. It's a matter of finding the best fit for the business task at hand.

Posted October 05, 2021

AI, machine learning, and edge computing may be all around us, and these technology endeavors all have one important thing in common: Their success depends on the quality of the data fed into them. Data managers recognize that data quality efforts must be improved to meet these new demands and they are concerned about the quality of the data moving through their enterprises. Eight in 10 organizations' data quality efforts are lagging or problematic. These are among the findings of a new survey of 238 data managers conducted by Unisphere Research, a division of Information Today, Inc., in partnership with Melissa.

Posted October 05, 2021

Today, a data warehouse is used to do more than just integrate data from multiple sources for better, more accurate analysis and reporting. A data warehouse must also be reliable, traceable, secure, and efficient at the same time. It needs to offer these advantages to differentiate itself, especially in business intelligence. This is where good data warehouse governance becomes very important. There are several enterprise data warehouse best practices and governance tips to keep in mind, along with key principles to implement.

Posted October 05, 2021

When it comes to the debate concerning the pros and cons of both all-in-one platforms and best-of-breed systems, there was some contention in the past over which solution reigned supreme. However, in today's modern technological landscape where businesses across industries have completely revamped the way they structure and utilize their IT products and needs, this is no longer the case.

Posted September 16, 2021

Last year, organizations around the world, across all industries, were forced to leverage new technologies on multiple fronts to accommodate a new normal. The adoption of AI and machine learning saw exponential growth to bring about the changes needed to keep up with the shift to remote working. AI and ML technologies found their way into everything from advanced medical diagnostic systems to quantum computing systems, and from virtual assistants to smart homes. According to Algorithmia's 2021 Enterprise Trends in Machine Learning report, 50% of enterprises plan to spend more on AI and ML in 2021, with 20% saying they will be significantly increasing their budgets.

Posted September 16, 2021

Document-oriented databases are one of the fastest-growing categories of NoSQL databases, and the primary reason is the flexibility of schema or logic design. Document databases make it easier for developers to store and query data by using the same document-model format they use in their application code. The flexible, semi-structured, and hierarchical nature of documents and document databases allows them to evolve with applications' needs.
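The document-model idea above can be sketched in a few lines. The tiny in-memory "collection" below is a stand-in for a real document database, not any particular product's API; the point is that the same dict shape the application works with is what gets stored and queried:

```python
# Minimal sketch of the document model: application objects (dicts) are
# stored and queried as-is, with nested structure traveling along.

orders: list[dict] = []  # our stand-in for a document collection

def insert(doc: dict) -> None:
    """Store a document exactly as the application built it."""
    orders.append(doc)

def find(**criteria) -> list[dict]:
    """Return documents whose top-level fields match all criteria."""
    return [d for d in orders if all(d.get(k) == v for k, v in criteria.items())]

insert({"id": 1, "customer": "acme", "items": [{"sku": "A1", "qty": 2}]})
insert({"id": 2, "customer": "globex", "items": [{"sku": "B7", "qty": 1}]})

print(find(customer="acme"))  # nested line items come back with the document
```

No table schema or join was needed to keep the line items with their order, which is the flexibility the category is known for.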

Posted September 16, 2021
