Operational databases continue to expand, with database sizes growing in most organizations. In terms of performance, the more data in the operational database, the less efficient transactions running against that database tend to be. Growth also increases database administration complexity, lengthening the processing time and outages required to perform traditional DBA tasks. But as important as operational performance and administration issues are, they are frequently secondary to the regulatory issue of preserving authentic data over time.
Posted May 04, 2022
Data privacy regulations—coupled with the desire to protect sensitive data—impose requirements on organizations to protect production data. Since many organizations rely on production data as a source for test data, techniques are needed to mask sensitive data elements from unauthorized viewing.
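One common masking technique is deterministic tokenization: replacing a sensitive value with a stable, non-reversible token so test data remains usable for joins and lookups. A minimal sketch in Python, using hypothetical column names; production masking tools also preserve data formats and referential integrity across tables:

```python
import hashlib

def mask_email(email: str) -> str:
    """Replace the local part of an email with a deterministic, non-reversible token."""
    local, _, domain = email.partition("@")
    token = hashlib.sha256(local.encode()).hexdigest()[:8]
    return f"user_{token}@{domain}"

def mask_row(row: dict, sensitive: set) -> dict:
    """Return a copy of the row with the sensitive columns masked."""
    return {k: (mask_email(v) if k in sensitive else v) for k, v in row.items()}

# Illustrative usage with a made-up row
row = {"id": 42, "email": "jane.doe@example.com", "region": "EMEA"}
masked = mask_row(row, sensitive={"email"})
```

Because the token is derived from the original value, the same input always masks to the same output, so equality-based test logic still works even though the original value cannot be viewed.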
Posted April 07, 2022
Standard SQL defines four isolation levels that can be set using the SET TRANSACTION ISOLATION LEVEL statement: Serializable, Repeatable Read, Read Committed, and Read Uncommitted. Although all of the popular relational DBMSs support isolation levels, the exact naming and implementation vary from product to product.
Posted March 11, 2022
It can get complicated trying to decipher what is actually offered by DBMSs that market themselves as multi-model. Which models? Delivered how? But using a DBMS that supports more than one model can provide a simpler, easier-to-support environment.
Posted February 08, 2022
The DBA should be an advisor to the business in terms of the types of database security that can be enabled. At a high level, this boils down to being able to answer four questions.
Posted January 03, 2022
Being a successful database administrator requires more than just deep technical knowledge of database systems. Gaining the technical know-how is important, yes, but there are many sources that offer technical guidance for DBAs. The non-technical aspects of database administration are just as challenging, yet there are fewer guiding resources to help. With this in mind, let's look at the 10 "rules of thumb" for DBAs to follow.
Posted December 08, 2021
Once data has been created, it gets moved around to support many different purposes. There are many techniques that can be used to facilitate data movement. One of the simplest ways for the DBA to move data from one place to another is to use the LOAD and UNLOAD utilities that come with the DBMS. The LOAD utility is used to populate tables with new data (or to add to existing data), and the UNLOAD utility is used to read data from a table and put it into a data file. Each DBMS may call the actual utilities by different names, but the functionality is the same or similar from product to product.
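The LOAD/UNLOAD pattern can be sketched in miniature with Python's sqlite3 and csv modules; the table and file names are illustrative, and real DBMS utilities add bulk I/O paths, format options, logging control, and restartability:

```python
import csv
import sqlite3

def unload(conn, table, path):
    """UNLOAD: read all rows from a table and write them to a delimited data file."""
    cur = conn.execute(f"SELECT * FROM {table}")
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([d[0] for d in cur.description])  # header row of column names
        writer.writerows(cur)

def load(conn, table, path):
    """LOAD: populate a table with rows read from a delimited data file."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        cols = next(reader)  # header row
        placeholders = ",".join("?" * len(cols))
        conn.executemany(
            f"INSERT INTO {table} ({','.join(cols)}) VALUES ({placeholders})",
            reader,
        )
    conn.commit()
```

Unloading from one database and loading into another is the simplest form of DBA-driven data movement, which is exactly how these utilities are typically paired in practice.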
Posted November 01, 2021
You might think the title of this article is somewhat controversial, but you should wait to render judgment until you've read to the end. There are several important shifts impacting data management and database administration that cause manual practices and procedures to be ineffective. Let's examine several of these trends.
Posted October 05, 2021
Designing and implementing a proper database for application development is a complex and time-consuming task. The choices made during database and application design will impact the usefulness of the final, delivered application. One of the most important design considerations is the impact of database locking on concurrent processes.
Posted September 16, 2021
If you've been paying attention at all, I'm sure you've heard the term "phishing." It is used to describe the fraudulent practice of sending emails purporting to be from a reputable source to induce individuals to reveal personal information, such as passwords and credit card numbers. But maybe you haven't heard the term "vishing" yet.
Posted August 02, 2021
Picture this scene: It is a little after 5 p.m. on a Friday and a chat message pops up from my "favorite" application programmer. Something isn't working properly. Yes, that is the message. "Something" isn't working properly. That's all. This is no exaggeration of the types of situations that routinely pop up. Sure, these may not be the exact words, but I'd wager that most DBAs have been on the receiving end of this type of request. So, what should be done in this type of situation?
Posted July 15, 2021
With new "data models" and types of database systems purported to be better than relational, it seems that the industry is all too ready to stick a fork in relational. So, is relational technology outdated? There are all sorts of pretenders to the throne for data management including Hadoop, XML, and the various types of NoSQL database products. But why? Are there truly any shortcomings in the relational way that require its wholesale replacement?
Posted June 10, 2021
Even in today's modern IT environment, performance management is often conducted reactively instead of proactively. You know the drill. A client calls with a response time problem. A table space maxes out on extents. A program is running without taking commits causing all kinds of locking problems. Somebody changed an application package without checking on the new access paths and transactions are slower than before. And then somebody submitted that "query from hell" again that just won't stop running. Sound familiar?
Posted April 29, 2021
You would think that, given the towering importance of data in today's modern organization, data modeling would be viewed as extremely important by management and IT professionals. So it is somewhat ironic that the age of big data has coincided with a long-term slide in data administration and modeling at many organizations. This is not a situation that should continue to be tolerated.
Posted April 06, 2021
There are many new and intriguing possibilities for analytics that require an investment in learning and new technology. But the return on the investment can be sizable in terms of gaining insight into your business, and in better servicing your customers.
Posted March 01, 2021
If you happen to be a fan of Monty Python's Flying Circus, then you probably recognize the line "None shall pass" in the title of this month's column. It is the unchanging exclamation of the Black Knight in the movie Monty Python and the Holy Grail. He just stands there trying to block everyone who attempts to go past him—even after a better swordsman has cut off his arms and legs. I'm sure that some of the application developers reading this can relate to this story, replacing the knight with their "favorite" DBA.
Posted February 10, 2021
Today, more than ever, it is important to break down the cultural barriers and foster an environment of teamwork and communication for the entire IT organization, and indeed, for the entire company. Perhaps one of the most important cultural hurdles to overcome these days is the need for speed. A study done by Forrester Research showed that Dev teams are accustomed to new releases on a quarterly basis or faster, and out on the edge there are teams that deploy multiple times a day! Then we have Ops teams, where the expectation is to have new releases twice a year, or even less frequently.
Posted January 07, 2021
With the advent of client/server computing in the 1990s, followed by internet- and web-driven development, and the more recent growth of NoSQL databases, we are now managing more types of data on more platforms than ever before. That means managing data is a lot more complex. It also means that DBAs need to possess expertise in cross-platform management and administration.
Posted December 10, 2020
Data gravity is the concept that data remains where it is and applications and services (and even more data) are attracted to and use that data. Moving data can be a tricky, time-consuming, and complex process, even on-prem. Now add in the movement of large quantities of enterprise data through the internet to a cloud service provider. Anyone who has managed or used big data, or even moderately large data, will immediately understand the problem.
Posted November 04, 2020
Database administration includes responsibility for ensuring the ongoing operational functionality and efficiency of an organization's databases and the applications that access those databases. Traditionally, for the DBA, this has meant becoming an expert in "all things" related to a particular database management system (DBMS), such as Microsoft SQL Server, Oracle, IBM Db2, and others, with the DBMS installed on-premises on physical servers.
Posted October 08, 2020
Good data quality starts with metadata—and the importance of quality data cannot be overstated. Poor data quality costs the typical company between 10% and 20% of their revenue. In addition, high quality data is crucial for complying with regulations.
Posted September 09, 2020
The database world is in tumult these days. There are new requirements and new capabilities that organizations are adopting and integrating into their data persistence infrastructure all the time. The world is no longer relational/SQL-only. Organizations are adopting NoSQL database systems to support specific use cases and types of workloads. This is increasing the complexity of how data is managed. But it is not just NoSQL that is driving organizations to run multiple DBMSs. Many organizations have more than one relational DBMS. They may run Db2 on the mainframe and Linux, Oracle on UNIX, and SQL Server on Windows, and perhaps have a few MySQL instances, too. And DBAs are managing a lot of different database instances.
Posted August 11, 2020
Both development and production database administration are required to support database applications. However, it is not usually necessary to have different DBA staffs to perform the different roles. Indeed, intimate knowledge of how a database application was developed can make it easier to support that application once it becomes operational in the production world. But the bottom line is this: You will need to define, plan for, and staff both development and production DBA roles in order to create useful database applications.
Posted July 01, 2020
When non-DBAs think about what it is that a DBA does, performance monitoring and tuning are usually the first tasks that come to mind. This should not be surprising. Almost anyone who has come in contact with a computer has experienced some type of performance problem. Moreover, relational database systems have a notorious reputation (much of it undeserved) for poor performance.
Posted June 10, 2020
As a database consultant, one of the things I am most frequently asked about is dealing with database locking issues. Every DBMS has different settings and parameters to control locking at the system and database levels. And, yes, it is important to set these up correctly, but they are rarely the core cause of locking-related performance problems. So let's focus on the universal issue that is almost always the culprit: poor coding techniques and inadequate oversight before code is moved to production.
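A classic example of the coding flaw in question is a batch program that updates thousands of rows without ever committing, holding its locks for the entire run. A common fix is a commit interval; a minimal sketch with sqlite3, using a hypothetical emp table:

```python
import sqlite3

def apply_updates(conn, updates, commit_interval=100):
    """Apply a stream of (salary, id) updates, committing every
    commit_interval rows so locks are held briefly rather than
    for the duration of the whole batch."""
    for n, (new_salary, emp_id) in enumerate(updates, start=1):
        conn.execute("UPDATE emp SET salary = ? WHERE id = ?", (new_salary, emp_id))
        if n % commit_interval == 0:
            conn.commit()  # release locks; concurrent transactions can proceed
    conn.commit()          # commit the final partial batch
```

The right interval is a tuning decision: committing too often adds logging overhead, while committing too rarely recreates the lock-contention problem (and lengthens rollback and restart time after a failure).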
Posted May 13, 2020
Modern software development teams have adopted a continuous delivery approach based upon DevOps and agile development techniques. The small and frequent code changes that result from such methodologies can deliver significant benefit in terms of reduced lead time for changes, a lower failure rate, and a reduced mean time to recovery when errors are encountered. Indeed, today's developers migrate more frequent changes into the production environment than ever before.
Posted April 08, 2020
Ensuring the integrity of the organization's databases is a key component of the DBA's job. A database is of little use if the data it contains is inaccurate or if it cannot be accessed due to integrity problems. The DBA has many tools at their disposal to ensure data integrity. At a high level, there are two aspects to data integrity.
Posted March 05, 2020
The mainframe has a long history of applications and uses, which stretches back more than 50 years. This means that some mainframe applications require modernization. This is often referred to as "mainframe modernization," but a better term for this is "mainframe application modernization" because the mainframe already is a modern platform.
Posted February 10, 2020
Application developers typically do not spend a lot of time on optimizing performance. Their focus is on writing code that matches the project specifications and delivers the expected results. So, it makes sense that SQL performance testing should be conducted on all programs before they are migrated to a production environment; otherwise, you will likely experience performance problems.
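One lightweight form of pre-production SQL checking is to examine each statement's access path before migration. A sketch using SQLite's EXPLAIN QUERY PLAN to flag full table scans; plan output formats differ across DBMSs and versions, so treat this as illustrative of the approach rather than a portable tool:

```python
import sqlite3

def avoids_table_scan(conn, sql):
    """Return True if the statement's access path contains no full table scan.
    SQLite reports index-driven access as SEARCH and full scans as SCAN."""
    plan = conn.execute("EXPLAIN QUERY PLAN " + sql).fetchall()
    return not any("SCAN" in row[-1] for row in plan)
```

A check like this can be wired into the migration process so that statements resolving to a table scan are reviewed (and, where appropriate, indexed) before they reach production.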
Posted January 02, 2020
If you have been around the IT industry for as long as I have, you have seen technologies and ideas come and go—and sometimes even come back again. This is surely the case with the "new" products that call themselves data catalogs.
Posted December 01, 2019
Truly, the speed and performance of your production database systems encompass a wide range of parameters and decisions that are made well before implementation. DBAs need to understand the options available and the factors that impact performance and development with each DBMS option, and they must work to keep the IT organization up to speed and educated on all of the available choices.
Posted October 31, 2019
A "best practice" is a professional procedure or tactic that is accepted or prescribed as being correct or most effective. Take a moment to think about the things that you do, or want to do, on a daily basis to manage your database infrastructure. What are the main categories of "things" that you want to tackle with best practices?
Posted October 01, 2019
Although adopting advanced analytics is on the radar for most organizations these days, it is important to understand some of the problems that can occur as you implement analytics projects. Perhaps the most important obstacle to overcome is ensuring buy-in from your organization's leaders.
Posted September 03, 2019
Unless you've been living in a cave somewhere, you've certainly heard news about stolen and hacked data, commonly referred to as data breaches. They occur with great regularity, and there is no indication that the frequency of data breaches is slowing down. Organizations must take more aggressive actions to prevent the sensitive information in their care from being stolen and abused.
Posted August 07, 2019
The latest release of Db2, version 11.5, unleashed on the world in June 2019, is being marketed by IBM as "The AI Database." AI promises to be one of the biggest technological game-changers of this century as it enables computing devices and systems to take on more activities that heretofore had to rely on a human. But, what does IBM mean by an AI database?
Posted July 18, 2019
What are the practices and procedures that you have found to be most helpful to automate in administering your databases? Yes, I know that automation has been a standard claim for most DBMS vendors, as well as third-party DBA tool vendors, for many years. But are you really anywhere closer to an "on demand," "lights-out," "24/7" database environment yet?
Posted June 10, 2019
It can be challenging for IT architects and executives to keep up with today's modern IT infrastructure. Homogeneous systems, common in the early days of computing, are almost non-existent today in the age of heterogeneous systems. It is de rigueur for Linux, Unix and Windows servers to be deployed throughout a modern IT infrastructure. And for larger shops, add in mainframes, too.
Posted May 01, 2019
One of the most important and rapidly changing areas of database management is security and protection. The major DBMS vendors have been adding security features and improving the way you can protect your precious corporate data. But it can be difficult to keep up with these features, so let's take a brief look at some of the more interesting and useful database security options.
Posted April 09, 2019
Scalability and elasticity are related, though they are different aspects of database availability. Both scalability and elasticity help to improve availability and performance when demand is changing, especially when changes are unpredictable.
Posted March 04, 2019
Every organization that manages data using a DBMS requires a database administration group to ensure the effective use and deployment of the company's databases. And since most modern organizations rely on database systems, they also need DBAs. That said, the discipline of database administration is not well-understood, nor is it universally practiced in a coherent and easily replicated manner.
Posted February 08, 2019
Everybody knows that a database management system, or DBMS, is the system software used to store, manage, and access enterprise data. But what is a data analytics platform? Well, it can be a lot of things, so make sure that you examine any "platform" your organization is considering very carefully. A data platform might comprise a DBMS, and it might not. It might be a tool to help you collect and analyze large data sets from various sources. It might be a series of analytics tools and interfaces for accessing data. Or it might be any one, or set of, offerings that manage, virtualize, index, secure, or otherwise work with your data.
Posted January 02, 2019
Managing the performance of database systems and applications is a significant job responsibility for DBAs. From a database perspective, there are three basic performance components that must be performed.
Posted December 04, 2018
Before the end of the decade, the number of connected objects is projected to expand greatly. According to several different analysts, the number of connected objects by 2020 could be as low as 26 billion or as high as 50 billion. But even the low end of that range is quite large. Indeed, connectedness is becoming commonplace and accepted across a wide spectrum of services and applications.
Posted November 01, 2018
When you mention big data and analytics, the first thing most people think of is Hadoop, Spark, and NoSQL. But are these newer technologies required for big data projects? What about the mainframe? Mainframes are not often mentioned in big data articles and advertising. But they should be.
Posted October 10, 2018
The current trend for software development teams is to adopt a continuous delivery approach based upon DevOps and agile development techniques. DevOps is a relatively new term, coined in 2009, describing software engineering practices that combine software development (Dev) and software operations (Ops). The general idea is not really all that new, but the adoption of agile techniques and modern tooling to automate software delivery is. The goal of DevOps is for developers and operations personnel to collaborate throughout the entire service lifecycle, from design through development and into production.
Posted September 04, 2018
We are living in the age of polyglot persistence, which really just means that it makes sense to store data using the technology that best matches the way the data will be used by applications. The age of trying to force everything into a relational DBMS is over, and we now have NoSQL, NewSQL, in-memory, and Hadoop-based offerings that are being used to store data. But you really should also be looking at the algorithmic approach offered by Ancelus Database.
Posted August 08, 2018
A Worldwide Data Ethics Council is something we greatly need. The council would focus on debating, crafting, and proposing clearer regulations that dictate what is—and is not—ethical in terms of data collection, retention, and usage. Furthermore, it would communicate the message of being skeptical of everything and using caution before sharing anything with anyone. The council could also work on forms of data ethics education for schools and universities, as well as educate the press and government officials. Let's face it: after watching those U.S. Congressional hearings with Mark Zuckerberg, I don't think any techies believe that government officials are prepared for the Information Age.
Posted July 02, 2018
With all of the data breaches and accusations of improper data usage in the news these days, the question of who owns data looms large. Understanding who owns which data is a complex question that can't be answered quickly or easily.
Posted June 01, 2018
The relational optimizer is a very complex component of the RDBMS that we too often take for granted. The optimization techniques of the major RDBMS products continue to be improved with every new release, too. Relational optimization has saved countless hours of work and, as long as we use it properly and code our applications with knowledge of what optimization can do, the RDBMS can be used for a wide variety of requirements and use cases. Don't lose sight of that as you wend your way through the hype out there regarding new types of database systems.
Posted May 08, 2018
Data lake is a newer IT term created for a new category of data store. But just what is a data lake? According to IBM, "a data lake is a storage repository that holds an enormous amount of raw or refined data in native format until it is accessed." That makes sense. I think the most important aspect of this definition is that data is stored in its "native format." The data is not manipulated or transformed in any meaningful way; it is simply stored and cataloged for future use.
Posted April 12, 2018