Information Management News

Linux Becomes a Player in the SQL Server World

If you're looking for proof of the hybrid, multi-platform nature of today's data environments, look no further than many SQL Server sites. Linux, not long ago seen as a rival to all things Microsoft, has become a platform of choice for many SQL Server environments. Read More

DataKitchen’s Chris Bergh Reveals the Steps for Enterprise DataOps Success at Data Summit Connect 2021

Today, most data and analytics leaders recognize that DataOps provides the foundation for analytics excellence. Drawing on his experience leading DataOps transformations at enterprise organizations, Chris Bergh, DataKitchen's head chef and CEO, shared steps for success in an enterprise DataOps journey at Data Summit Connect 2021. Read More

The Techniques and Technologies Bringing Agility to Enterprise Data

Today's data environments are becoming more flexible and portable than their more rigid, siloed predecessors, thanks to emerging approaches. Strategies that are the building blocks of portability, adaptability, and rapid delivery—containers, microservices, DevOps, and DataOps—are transforming the way data is managed. Read More

Vertica’s Paige Roberts Redefines the Data Warehouse at Data Summit Connect 2021

The modern data warehouse needs to support advanced analytics on multiple types of data, from semi-structured to streaming. Driving BI dashboards is still at the heart of a data warehouse, but even those now demand that data be current. Old-school batch ETL won't cut it. Paige Roberts, open source relations manager, Vertica, explained why our old definitions no longer apply, what changed, and what drove those changes during her Data Summit Connect 2021 presentation, "Your Definition of a Data Warehouse Is Probably Wrong." Read More


Columnists

Todd Schraml

Database Elaborations

  • New Data (Almost) Always Rings Twice Anything worth doing is worth doing again and again. Right? When building out and standardizing new subject areas or new sources for one's data warehouse, hub, or other analytics area, a task often overlooked at the start is the logic for bringing in the data. Obviously, everybody knows that the new data must be processed. What many ignore is that establishing the process to bring the data in often must be done twice, or more.
Recent articles by Todd Schraml
Craig S. Mullins

DBA Corner

  • What Do You Mean by Database Performance? Even in today's modern IT environment, performance management is often conducted reactively instead of proactively. You know the drill. A client calls with a response time problem. A table space maxes out on extents. A program runs without taking commits, causing all kinds of locking problems. Somebody changed an application package without checking the new access paths, and transactions are slower than before. And then somebody submitted that "query from hell" again that just won't stop running. Sound familiar?
Recent articles by Craig S. Mullins
Kevin Kline

SQL Server Drill Down

  • Azure SQL Database Gets More Secure with Secure Enclaves An important new set of features for security and confidential computing was announced at the recent Microsoft Ignite virtual conference and is now in public preview. Although the name doesn't exactly trip off your tongue, Always Encrypted with Secure Enclaves offers important new capabilities for organizations that need greater control and security over their data while also enjoying the agility, scalability, and productivity gains of the public cloud.
Recent articles by Kevin Kline
Guy Harrison

MongoDB Matters

  • MongoDB, meet Kubernetes Setting up a distributed MongoDB cluster by hand is a complex and error-prone process. However, MongoDB provides a Kubernetes operator, which allows such a deployment to be established within a Kubernetes cluster amazingly easily. The "operator" is a controller program that runs within the Kubernetes cluster and contains the MongoDB-specific logic for establishing MongoDB cluster topologies. One need only supply the operator with a configuration file, and the operator will do the rest—creating and configuring MongoDB nodes, setting up best-practice security, and handling the connectivity between nodes.
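The configuration file the column mentions is a Kubernetes custom resource that the operator watches. As a rough sketch, a minimal replica-set definition for the MongoDB Community operator looks something like the following; the resource name and version are illustrative, and a real deployment would also define users and credential secrets:

```yaml
# Illustrative sketch of a MongoDBCommunity custom resource.
# The operator reads this and creates/configures the cluster nodes.
apiVersion: mongodbcommunity.mongodb.com/v1
kind: MongoDBCommunity
metadata:
  name: example-mongodb        # hypothetical deployment name
spec:
  type: ReplicaSet
  members: 3                   # three-node replica set
  version: "4.4.0"             # MongoDB server version to run
  security:
    authentication:
      modes: ["SCRAM"]         # SCRAM authentication between clients and nodes
```

Applying this resource with `kubectl apply -f` hands the rest of the work (node creation, security setup, inter-node connectivity) to the operator, as described above.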
Recent articles by Guy Harrison
  • Cleaning Dirty Data Remember, all data is dirty—you won't be able to make all of it perfect. Your focus should be on making it good enough to pass along to the next person. The first step is to examine the data and ask yourself, "Does this data make sense?" Data should tell a story or answer a question. Make sure your data does, too. Then, before you do anything else, make a copy—or backup—of your data before you make the smallest change.
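The workflow described above—examine the data, back it up before the smallest change, then clean—can be sketched in Python. This is a minimal illustration, not a prescribed method; the file layout and the particular "good enough" checks are assumptions for the example:

```python
import csv
import shutil
from pathlib import Path

def clean_safely(path: str) -> list[dict]:
    """Inspect a CSV file, back it up, then apply a minimal cleaning pass.

    The checks here are illustrative; "good enough" means whatever the
    next person in the pipeline actually needs.
    """
    src = Path(path)

    # Step 1: examine the data -- does it make sense?
    with src.open(newline="") as f:
        rows = list(csv.DictReader(f))
    if not rows:
        raise ValueError(f"{src} is empty; nothing to clean")

    # Step 2: make a copy of the data before changing anything.
    backup = src.with_suffix(src.suffix + ".bak")
    shutil.copy2(src, backup)

    # Step 3: a minimal cleaning pass -- trim whitespace, drop blank rows.
    cleaned = [
        {key: value.strip() for key, value in row.items()}
        for row in rows
        if any(value.strip() for value in row.values())
    ]
    return cleaned
```

Note that the original file is never modified in place; the cleaned rows are returned for the next step, and the `.bak` copy preserves the raw input.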

Trends and Applications