Craig S. Mullins

Craig S. Mullins is president and principal consultant with Mullins Consulting, Inc. He has more than three decades of experience in all facets of data management and database systems development.

Mullins is the author of three best-selling books, DB2 Developer’s Guide, Database Administration: The Complete Guide to DBA Practices & Procedures, and A Guide to Db2 Application Performance for Developers.

Mullins is also an IBM Champion for Analytics, a DB2 Gold Consultant, and a member of the IDUG Volunteer Hall of Fame.

You can reach him via his website at www.MullinsConsulting.com.

Articles by Craig S. Mullins

Every DBA knows that a test platform is required to properly enable application development. Indeed, testing has always been crucial, but it has become more complicated with the advent of new technologies, platforms, and devices, not to mention the impact of regulatory compliance. One of the biggest challenges faced by DBAs and developers is managing test data.

Posted April 11, 2024

Database administration is a very technical discipline, but it is also a discipline in which the practitioner is very visible politically within the organization. DBAs should be armed with a proper attitude and sufficient technical knowledge before attempting to practice database administration. This column usually covers the many diverse technology issues with which DBAs must deal. But for this month, we are going to take a short diversion and discuss some high-level guidance that DBAs can follow to make their jobs easier.

Posted March 14, 2024

Before talking about its impact, it makes sense to first define what is meant by the term "regulatory compliance." There are two components: regulatory and compliance. Regulatory refers to regulations: the governmental and business rules and laws that may exist for specific industries, countries, jurisdictions, and practices. Compliance refers to following the directives of those regulations as they apply to your business operations.

Posted February 08, 2024

Earlier in 2023, we discussed The Importance of Metadata here in the DBA Corner column. And, indeed, metadata is more important than ever before because it helps us to understand our data. Data without metadata lacks meaning, making it difficult to use. Metadata gives data its form and makes it usable by your organization.

Posted January 11, 2024

Database performance and scalability are both important aspects of managing and optimizing databases, but they refer to different characteristics and considerations. Database performance refers to how efficiently a database system can respond to and process queries, transactions, and other operations. It is focused on the speed, responsiveness, and overall efficiency of database operations.

Posted December 14, 2023

An age-old question when it comes to staffing a DBA group is how to determine the right number of DBAs required to keep your organization's databases online and operating efficiently. Many organizations try to operate with the minimal number of DBAs that they can. The general idea here is that the fewer staff members you hire, the lower the overall cost will be. However, that assumption is probably not entirely true. An overworked DBA staff can make mistakes that cause downtime and operational problems far in excess of the salary requirements of an additional DBA.

Posted November 09, 2023

Managing database performance can be a complex and difficult task, but there are some high-level maxims that can help to simplify optimizing the performance of your databases and applications. Let's look at some of these guiding rules of thumb. You should always keep in mind the business objectives of the databases and applications you manage. It is wise to manage performance based on the expectations and budget of the business users.

Posted October 12, 2023

The role of the database administrator (DBA) has always been crucial to ensuring that databases are properly organized, secured, and optimized. But recent years have seen many significant changes in the field of database administration. Before talking about how it is changing, let's start by defining it. Database administration (or DBA) refers to the set of tasks and processes involved in managing and maintaining database systems (and the applications and users that access them). This includes everything from designing and creating databases to deploying, monitoring, and securing data across the enterprise.

Posted September 14, 2023

Cloud computing has revolutionized the way businesses operate and has fundamentally changed the role of database administrators (DBAs). Traditionally, DBAs have been responsible for the installation, maintenance, and management of on-prem databases. However, in the age of cloud computing, DBAs are required to adapt to new technologies, methodologies, and tools.

Posted August 10, 2023

As a database administrator, you know that the field is constantly evolving. New technologies are emerging all the time, and it can be difficult to keep up with the latest trends. And almost every new technical trend and advancement tends to interact with data. So, DBAs need to be "in the know" and regularly update their knowledge of what's new. That's where industry conferences come in.

Posted July 13, 2023

Artificial Intelligence, or AI, is having a significant impact on most industries and job roles these days, and it will only increase as AI techniques and algorithms improve over time. I'm sure you've heard about things like natural language processing, machine learning, and large language models such as those used by ChatGPT and Google Bard. These are all forms of AI, and they are influencing and impacting the way we work.

Posted June 08, 2023

The end goal of database design is to be able to transform a logical data model into an actual physical database. A logical data model is required before you can even begin to design a physical database. Assuming that the logical data model is complete, though, what must be done to implement a physical database?

Posted May 11, 2023

Organizations often force the DBA to take on the job of data modeling. That does not mean that DBAs are well-trained in data modeling, nor does it mean that DBAs are best suited to take on this task. The data administration (DA) team is best suited for modeling data. This is because data modeling requires knowledge of the business aspects of data resource management.

Posted April 13, 2023

This month, I want to take a higher-level view of database performance concepts and introduce a few themes that will likely follow you throughout your career working with database systems and applications. These themes should be viewed as guiding principles to keep in mind when confronted with database performance issues.

Posted March 09, 2023

Today, nobody argues about the importance of data. It is a given that data must be collected, managed, and analyzed to conduct business successfully in the modern era. But all too often the meaning of the data is not being preserved. This can result in large stores of data that are difficult to interpret and use in any meaningful manner. Situations that create unusable data can be minimized with appropriate metadata management. But this requires effort and time, and therefore metadata is frequently ignored (much like documentation).

Posted February 09, 2023

Most database implementations are anything but static. Once deployed, databases are queried, updated, loaded, unloaded, and reorganized, and data is inserted into and deleted from them on an ongoing basis. Data is also moved around and replicated all over the place. All of these activities change the database and cause storage requirements to change as well.

Posted January 12, 2023

Readers of this column sometimes ask me questions about databases and database administration, which I welcome. And at times I will take the opportunity to answer particularly intriguing questions in print. One intriguing question I have been asked more than once is: "What metrics and measurements are useful for managing how effective your DBA group is?"

Posted December 08, 2022

Last month we looked at various types of database recovery, how they work, and how DBAs need to prepare for recovery scenarios. This month, let's delve a little deeper into the issues and decisions that DBAs need to be prepared to address as they work on database recovery. The first thing that DBAs need to be aware of is the recovery time objectives, or RTOs, for the database objects in question. In an ideal world, RTOs would have been established for each object, and backup procedures would be in place to enable recovery within those objectives.

Posted November 10, 2022

Things break and the DBA must be prepared for situations where a failure impacts the availability, integrity, or usability of a database. Reacting to failures and service disruptions is a key component of the DBA's job. This means ensuring that appropriate backups are taken for all your critical database objects. Assuming backups exist, when an inevitable database recovery is required, the next decision is how to get the data you need back as quickly as possible. But there are several different types and methods of performing recovery.

Posted October 06, 2022

Establishing agreed-upon service levels for database applications is of the utmost importance for assuring that performance meets required criteria. Without pre-determined, negotiated service level agreements in place, database and application performance can become a never-ending game of blind man's bluff, where DBAs blindly and constantly seek an unspoken goal. Instead, active service level management should be the standard.

Posted September 08, 2022

Assuring data quality should be one of the most important goals for IT professionals. This is truer today than it has ever been as organizations rely on data to power their AI and machine learning algorithms. Accurate, quality data is required if you want to make accurate decisions. And that is true, whether the data is being analyzed by a human or a computer algorithm.

Posted August 11, 2022

Optimizing the performance of operational databases and the applications that access them is a constant battle for DBAs. Of course, writing efficient SQL is the most important aspect of ensuring optimal database performance, but no amount of SQL-tweaking or system-tuning can optimize the performance of queries run against a poorly designed or disorganized database. Therefore, time must be spent on tuning and optimizing the design, parameters, and physical construction of database objects, specifically tables and indexes, and the files in which their data is stored. The actual composition and structure of database objects must be monitored continually and changed accordingly if the database becomes inefficient.

Posted July 07, 2022

Data archiving is an important aspect of data governance and data management. Not only does archiving help to reduce hardware and storage costs, but it is also an important aspect of long-term data retention and a key participant in regulatory compliance efforts. When long-term data retention is imposed on your data—anything more than a couple of years—then archiving it can be the optimal solution.

Posted June 02, 2022

Operational databases continue to expand, with database sizes growing in most organizations. In terms of performance, the more data in the operational database, the less efficient transactions running against that database tend to be. The other impact, database administration complexity, causes longer processing time and outages to perform traditional DBA tasks. But as important as operational performance and administration issues are, frequently they are ancillary to the regulatory issue of preserving authentic data over time.

Posted May 04, 2022

Data privacy regulations—coupled with the desire to protect sensitive data—impose requirements on organizations to protect production data. Since many organizations rely on production data as a source for test data, techniques are needed to mask sensitive data elements from unauthorized viewing.

Posted April 07, 2022

Standard SQL defines four isolation levels that can be set using the SET TRANSACTION ISOLATION LEVEL statement: Serializable, Repeatable Read, Read Committed, and Read Uncommitted. Although all of the popular relational DBMSs support isolation levels, the exact naming and implementation vary.
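
To make the syntax concrete, here is a minimal sketch using a hypothetical account table; the exact statement, the default level, and the locking behavior behind each level vary by DBMS:

-- Request the Read Committed isolation level for the current transaction
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;

-- Statements in this transaction now see only committed data
SELECT account_id, balance
FROM   account
WHERE  balance < 0;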

Posted March 11, 2022

It can get complicated trying to decipher what is actually offered by DBMSs that market themselves as multi-model. Which models? Delivered how? But using a DBMS that supports more than one model can provide a simpler, easier-to-support environment.

Posted February 08, 2022

The DBA should be an advisor to the business in terms of the types of database security that can be enabled. At a high level, this boils down to being able to answer four questions.

Posted January 03, 2022

The amount of data we use and manage doubles annually, and performing analytics on that data can uncover heretofore unknown insights that lead to competitive advantage.

Posted December 15, 2021

Being a successful database administrator requires more than just deep technical knowledge of database systems. Gaining the technical know-how is important, yes, but there are many sources that offer technical guidance for DBAs. The non-technical aspects of database administration are just as challenging, yet there are fewer guiding resources to help. With this in mind, let's look at the 10 "rules of thumb" for DBAs to follow.

Posted December 08, 2021

Once data has been created, it gets moved around to support many different purposes. There are many techniques that can be used to facilitate data movement. One of the simplest ways for the DBA to move data from one place to another is to use the LOAD and UNLOAD utilities that come with the DBMS. The LOAD utility is used to populate tables with new data (or to add to existing data), and the UNLOAD utility is used to read data from a table and put it into a data file. Each DBMS may call the actual utilities by different names, but the functionality is the same or similar from product to product.
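
As a rough illustration of the unload/load pattern, here is a minimal sketch using PostgreSQL's COPY command; the utility names, control statements, and file formats differ from product to product, and the table and file names are hypothetical:

-- "Unload": write the contents of a table to a flat file
COPY employee TO '/tmp/employee.csv' WITH (FORMAT csv, HEADER true);

-- "Load": populate another table from that flat file
COPY employee_archive FROM '/tmp/employee.csv' WITH (FORMAT csv, HEADER true);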

Posted November 01, 2021

You might think the title of this article is some­what controversial, but you should wait to render judgment until you've read to the end. There are several important shifts impacting data management and database administration that cause manual practices and procedures to be ineffective. Let's examine several of these trends.

Posted October 05, 2021

Designing and implementing a proper database for application development is a complex and time-consuming task. The choices made during database and application design will impact the usefulness of the final, delivered application. One of the most important design considerations is the impact of database locking on concurrent processes.

Posted September 16, 2021

If you've been paying attention at all, I'm sure you've heard the term "phishing." It is used to describe the fraudulent practice of sending emails purporting to be from a reputable source to induce individuals to reveal personal information, such as passwords and credit card numbers. But maybe you haven't heard the term "vishing" yet.

Posted August 02, 2021

Picture this scene: It is a little after 5 p.m. on a Friday and a chat message pops up from my "favorite" application programmer. Something isn't working properly. Yes, that is the message. "Something" isn't working properly. That's all. This is no exaggeration of the types of situations that routinely pop up. Sure, these may not be the exact words, but I'd wager that most DBAs have been on the receiving end of this type of request. So, what should be done in this type of situation?

Posted July 15, 2021

With new "data models" and types of database systems purported to be better than relational, it seems that the industry is all too ready to stick a fork in relational. So, is relational technology outdated? There are all sorts of pretenders to the throne for data management including Hadoop, XML, and the various types of NoSQL database products. But why? Are there truly any shortcomings in the relational way that require its wholesale replacement?

Posted June 10, 2021

Even in today's modern IT environment, performance management is often conducted reactively instead of proactively. You know the drill. A client calls with a response time problem. A table space maxes out on extents. A program is running without taking commits, causing all kinds of locking problems. Somebody changed an application package without checking on the new access paths and transactions are slower than before. And then somebody submitted that "query from hell" again that just won't stop running. Sound familiar?

Posted April 29, 2021

Given the towering importance of data in today's modern organization, you would think that data modeling would be viewed as extremely important by management and IT professionals. So it is somewhat ironic that the age of big data has coincided with a long-term slide in data administration and modeling in many organizations. This is not a situation that should continue to be tolerated.

Posted April 06, 2021

There are many new and intriguing possibilities for analytics that require an investment in learning and new technology. But the return on the investment can be sizable in terms of gaining insight into your business, and in better servicing your customers.

Posted March 01, 2021

If you happen to be a fan of Monty Python's Flying Circus, then you probably recognize the line "None shall pass" in the title of this month's column. It is the unchanging exclamation of the Black Knight in the movie Monty Python and the Holy Grail. He just stands there trying to block everyone who attempts to go past him—even after a better swordsman has cut off his arms and legs. I'm sure that some of the application developers reading this can relate to this story, replacing the knight with their "favorite" DBA.

Posted February 10, 2021

Today, more than ever, it is important to break down the cultural barriers and foster an environment of teamwork and communication for the entire IT organization, and indeed, for the entire company. Perhaps one of the most important cultural hurdles to overcome these days is the need for speed. A study done by Forrester Research showed that Dev teams are accustomed to new releases on a quarterly basis or faster, and out on the edge there are teams that deploy multiple times a day! Then we have Ops teams, where the expectation is to have new releases twice a year or even slower.

Posted January 07, 2021

The primary role of database "custodian" continues to be the main emphasis of the DBA's job, but increasingly, DBAs are expected to take on additional responsibilities.

Posted December 14, 2020

Due to the advent of client/server computing in the 1990s, then the internet and web connectivity driving development, and the more recent growth of NoSQL databases, we are now managing more types of data on more platforms than we ever have before. And that means that managing data is a lot more complex. This also means that DBAs need to possess expertise in cross-platform management and administration.

Posted December 10, 2020

Data gravity is the concept that data remains where it is and applications and services (and even more data) are attracted to and use that data. Moving data can be a tricky, time-consuming, and complex process, even on-prem. Now add in the movement of large quantities of enterprise data through the internet to a cloud service provider. Anyone who has managed or used big data, or even moderately large data, will immediately understand the problem.

Posted November 04, 2020

Database administration includes responsibility for ensuring the ongoing operational functionality and efficiency of an organization's databases and the applications that access those databases. Traditionally, for the DBA, this has meant becoming an expert about "all things" related to a particular database management system (DBMS), such as Microsoft SQL Server, Oracle, IBM Db2, and others, and that the DBMS was installed on-premise on physical servers.

Posted October 08, 2020

Good data quality starts with metadata—and the importance of quality data cannot be overstated. Poor data quality costs the typical company between 10% and 20% of its revenue. In addition, high quality data is crucial for complying with regulations.

Posted September 09, 2020

The database world is in tumult these days. There are new requirements and new capabilities that organizations are adopting and integrating into their data persistence infrastructure all the time. The world is no longer relational/SQL-only. Organizations are adopting NoSQL database systems to support specific use cases and types of workloads. This is increasing the complexity of how data is managed. But it is not just NoSQL that is driving organizations to run multiple DBMSs. Many organizations have more than one relational DBMS. They may run Db2 on the mainframe and Linux, Oracle on UNIX, and SQL Server on Windows, and perhaps have a few MySQL instances, too. And DBAs are managing a lot of different database instances.

Posted August 11, 2020

Both development and production database administration are required to support database applications. However, it is not usually necessary to have different DBA staffs to perform the different roles. Indeed, intimate knowledge of how a database application was developed can make it easier to support that application once it becomes operational in the production world. But the bottom line is this: You will need to define, plan for, and staff both development and production DBA roles in order to create useful database applications.

Posted July 01, 2020

When non-DBAs think about what it is that a DBA does, performance monitoring and tuning are usually the first tasks that come to mind. This should not be surprising. Almost anyone who has come in contact with a computer has experienced some type of performance problem. Moreover, relational database systems have a notorious reputation (much of it undeserved) for poor performance.

Posted June 10, 2020

As a database consultant, one of the things I am most frequently asked about is dealing with database locking issues. Every DBMS has different settings and parameters to control locking at the system and database levels. And, yes, it is important to set these up correctly, but they are rarely the core cause of performance problems caused by locking. So let's focus on the universal issue that is almost always the culprit: poor coding techniques and inappropriate oversight before code is moved to production.

Posted May 13, 2020

Modern software development teams have adopted a continuous delivery approach based upon DevOps and agile development techniques. The small and frequent code changes that result from such methodologies can deliver significant benefit in terms of reduced lead time for changes, a lower failure rate, and a reduced mean time to recovery when errors are encountered. Indeed, today's developers migrate more frequent changes into the production environment than ever before.

Posted April 08, 2020

Ensuring the integrity of the organization's databases is a key component of the DBA's job. A database is of little use if the data it contains is inaccurate or if it cannot be accessed due to integrity problems. The DBA has many tools at his disposal to ensure data integrity. At a high-level, there are two aspects to data integrity.

Posted March 05, 2020

The mainframe has a long history of applications and uses, which stretches back more than 50 years. This means that some mainframe applications require modernization. This is often referred to as "mainframe modernization," but a better term for this is "mainframe application modernization" because the mainframe already is a modern platform.

Posted February 10, 2020

Application developers typically do not spend a lot of time on optimizing performance. Their focus is on writing code that matches the project specifications and delivers the expected results. So, it makes sense that SQL performance testing should be conducted on all programs before they are migrated to a production environment, or you will likely experience performance problems.

Posted January 02, 2020

If you have been around the IT industry for as long as I have, you have seen technologies and ideas come and go—and sometimes even come back again. This is surely the case with the "new" products that call themselves data catalogs.

Posted December 01, 2019

Truly, the speed and performance of your production database systems encompass a wide range of parameters and decisions that are made well before implementation. DBAs need to understand the options available and the factors that impact performance and development with each DBMS option, and work to keep the IT organization up-to-speed and educated on all of the available choices.

Posted October 31, 2019

A "best practice" is a professional procedure or tactic that is accepted or prescribed as being correct or most effective. Take a moment to think about the things that you do, or want to do, on a daily basis to manage your database infrastructure. What are the main categories of "things" that you want to tackle with best practices?

Posted October 01, 2019

Although adopting advanced analytics is on the radar for most organizations these days, it is important to understand some of the problems that can occur as you implement analytics projects. Perhaps the most important obstacle to overcome is ensuring buy-in from your organization's leaders.

Posted September 03, 2019

Unless you've been living in a cave somewhere, you've certainly heard news about stolen and hacked data, commonly referred to as data breaches. They occur with great regularity, and there is no indication that the frequency of data breaches is slowing down. Organizations must take more aggressive actions to prevent the sensitive information in their care from being stolen and abused.

Posted August 07, 2019

The latest release of Db2, version 11.5, unleashed on the world in June 2019, is being marketed by IBM as "The AI Database." AI promises to be one of the biggest technological game-changers of this century as it enables computing devices and systems to take on more activities that heretofore had to rely on a human. But, what does IBM mean by an AI database?

Posted July 18, 2019

What are the practices and procedures that you have found to be most helpful to automate in administering your databases? Yes, I know that automation has been a standard claim for most DBMS vendors, as well as third-party DBA tool vendors, for many years. But are you really anywhere closer to an "on demand," "lights-out," "24/7" database environment yet?

Posted June 10, 2019

It can be challenging for IT architects and executives to keep up with today's modern IT infrastructure. Homogeneous systems, common in the early days of computing, are almost non-existent today in the age of heterogeneous systems. It is de rigueur for Linux, Unix and Windows servers to be deployed throughout a modern IT infrastructure. And for larger shops, add in mainframes, too.

Posted May 01, 2019

One of the most important and rapidly changing areas of database management is security and protection. The major DBMS vendors have been adding security features and improving the way you can protect your precious corporate data. But it can be difficult to keep up with these features, so let's take a brief look at some of the more interesting and useful database security options.

Posted April 09, 2019

Every organization that manages data using a DBMS requires a database administration group to ensure the effective use and deployment of the company's databases. And since most modern organizations rely on database systems, they also need DBAs. That said, the discipline of database administration is not well-understood, nor is it universally practiced in a coherent and easily replicated manner.

Posted February 08, 2019

Everybody knows that a database management system, or DBMS, is the system software used to store, manage, and access enterprise data. But what is a data analytics platform? Well, it can be a lot of things, so make sure that you examine any "platform" your organization is considering very carefully. A data platform might comprise a DBMS, and it might not. It might be a tool to help you collect and analyze large data sets from various sources. It might be a series of analytics tools and interfaces for accessing data. Or it might be any one, or set of, offerings that manage, virtualize, index, secure, or otherwise work with your data.

Posted January 02, 2019

Managing the performance of database systems and applications is a significant job responsibility for DBAs. From a database perspective, there are three basic components of performance management that must be addressed.

Posted December 04, 2018

Before the end of the decade, the number of connected objects is projected to expand greatly. According to several different analysts, the number of connected objects by 2020 could be as low as 26 billion or as high as 50 billion. But even the low end of that range is quite large. Indeed, connectedness is becoming commonplace and accepted across a wide spectrum of services and applications.

Posted November 01, 2018

When you mention big data and analytics, the first thing most people think of is Hadoop, Spark, and NoSQL. But are these newer technologies required for big data projects? What about the mainframe? Mainframes are not often mentioned in big data articles and advertising. But they should be.

Posted October 10, 2018

The current trend for software development teams is to adopt a continuous delivery approach based upon DevOps and agile development techniques. DevOps is a relatively new term, coined in 2009, defining software engineering practices that combine software development (Dev) and software operations (Ops). The general idea is not really all that new, but the adoption of agile techniques and modern tooling to automate software delivery is. The goal of DevOps is for developers and operations personnel to collaborate throughout the entire service lifecycle, from design through development and into production.

Posted September 04, 2018

We are living in the age of polyglot persistence, which really just means that it makes sense to store data using the technology that best matches the way the data will be used by applications. The age of trying to force everything into a relational DBMS is over, and we now have NoSQL, NewSQL, in-memory, and Hadoop-based offerings that are being used to store data. But you really should also be looking at the algorithmic approach offered by Ancelus Database.

Posted August 08, 2018

A Worldwide Data Ethics Council is something we greatly need. The council would focus on debating, crafting and proposing clearer regulations that dictate what is—and is not—ethical in terms of data collection, retention, and usage. Furthermore, it would communicate the message of being skeptical of everything and using caution before sharing anything with anyone. The council could also work on forms of data ethics education for schools and universities, as well as to educate the press and government officials. I mean, let's face it, after watching those U.S. Congressional hearings with Mark Zuckerberg I don't think any techies believe that government officials are prepared for the Information Age.

Posted July 02, 2018

With all of the data breaches and accusations of improper data usage in the news these days, the question of who owns data looms large. Understanding who owns which data is a complex question that can't be answered quickly or easily.

Posted June 01, 2018

Architecting the Modern Enterprise: 10 Key Technologies for a Strong Foundation

Posted May 11, 2018

The relational optimizer is a very complex component of the RDBMS that we too often take for granted. The optimization techniques of the major RDBMS products continue to be improved with every new release, too. Relational optimization has saved countless hours of work and, as long as we use it properly and code our applications with knowledge of what optimization can do, the RDBMS can be used for a wide variety of requirements and use cases. Don't lose sight of that as you wend your way through the hype out there regarding new types of database systems.

Posted May 08, 2018

Data lake is a newer IT term created for a new category of data store. But just what is a data lake? According to IBM, "a data lake is a storage repository that holds an enormous amount of raw or refined data in native format until it is accessed." That makes sense. I think the most important aspect of this definition is that data is stored in its "native format." The data is not manipulated or transformed in any meaningful way; it is simply stored and cataloged for future use.

Posted April 12, 2018

DBAs spend a great deal of time monitoring and managing the performance of database systems and applications. But it doesn't have to be such a large percentage of their time. Yes, changing data patterns, requirements, and time will always conspire together to create performance problems. But more can be done to avoid the up-front poor performance of most database applications.

Posted March 07, 2018

The term "big data" has been bandied about for a number of years now, to the point where it has been used so much that it is a part of IT culture. Hard to specifically define, yet everyone seems to have a good idea what is meant by it, big data is here to stay. And that is a good thing!

Posted February 01, 2018

Blockchain is a distributed, shared, permissioned ledger for recording transactions with consensus, provenance, immutability and finality. It is the technology that drives virtual currencies like Bitcoin. But its potential spans many more industries and use cases than just virtual currencies.

Posted January 02, 2018

The importance of data to today's modern world becomes more and more clear every day. Organizations are creating, storing, gathering, and managing more data than ever before. If you are reading this article, chances are, you will agree with this statement: "You are managing more data this year than you did last year … and your organization is planning to manage even more data next year."

Posted December 01, 2017

Moving Beyond Relational With Data Integration

Posted November 28, 2017

One of the prevailing conversations around database systems lately revolves around ACID support. But not everybody knows what is meant by the term ACID. Oh, perhaps they know that it involves how data integrity is maintained or that it impacts locking. And, at a high level many folks know that relational/SQL systems support ACID whereas that is not always the case for NoSQL database systems.

Posted November 01, 2017

What Type of DBA Are You?

Posted October 18, 2017

DBAs spend a lot of time on tuning application code and SQL statements to boost efficiency and optimize access. But SQL is only one aspect of database systems performance. It is also important for DBAs to devote time to tuning and optimizing the design, parameters, and physical construction of database objects, specifically tables and indexes, and the files in which their data is stored.

Posted September 07, 2017

Referential integrity (RI) is a method for ensuring the "correctness" of data within a DBMS. People tend to oversimplify RI, stating that it is merely the identification of relationships between relational tables. It is actually much more than this. RI embodies the integrity and usability of a relationship by establishing rules that govern that relationship.
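
As a minimal sketch of such a rule, the hypothetical tables below declare that every order must reference an existing customer and that a customer with orders cannot be deleted; the referential actions supported (RESTRICT, CASCADE, SET NULL, and so on) vary by DBMS:

CREATE TABLE customer (
  customer_id  INTEGER      NOT NULL PRIMARY KEY,
  cust_name    VARCHAR(100) NOT NULL
);

CREATE TABLE customer_order (
  order_id     INTEGER NOT NULL PRIMARY KEY,
  customer_id  INTEGER NOT NULL,
  -- The rule governing the relationship between orders and customers
  CONSTRAINT fk_order_customer FOREIGN KEY (customer_id)
    REFERENCES customer (customer_id)
    ON DELETE RESTRICT
);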

Posted August 09, 2017

As every DBA should know, DBMS data is typically persisted using disk storage. So the data is stored on disk, and when it is later read or modified it has to be accessed and changed on disk. To optimize these processes, most DBMSs use a cache, or buffer pool, to stage data in memory when it is accessed. By moving the data to memory, subsequent accesses to the same data can be more efficient because disk I/O can be bypassed.

Posted July 05, 2017

When a failure occurs, the DBA must ascertain whether recovery is required. It is possible, though not very likely for active databases, that a failure does not impact the integrity of your data. Assuming that recovery is required, the DBA will need to determine what resources (backup copies and log files) are available and how best to perform the needed database recovery.

Posted June 01, 2017

As DBAs, we can get mired in the depths of performance tuning parameters and scripts, sometimes getting lost in all the details. It is a good idea to always have a set of goals and philosophies that you can lean on to keep you focused and working on the appropriate things. That is what I want to talk about in this month's DBA Corner column: some high-level rules of thumb for achieving your DBMS-related performance goals and maintaining your sanity.

Posted May 05, 2017

Every good DBA understands that backing up their database data is a non-optional part of assuring data availability and integrity. As a DBA, you need to know the difference between a full image copy backup and an incremental image copy backup and implement the proper image copy backup strategy based on application needs and database activity.

Posted April 07, 2017

The heart of any relational database management system is the system catalog that documents the database objects and system settings being used. The system catalog offers a wealth of information about your DBMS. You can think of it as the knowledge base of every piece of data known to the system. For this reason, it is important that DBAs understand what is in the system catalog, as well as how to access and manipulate the information it contains.
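
Catalog table names differ by product (Db2, for example, uses the SYSIBM tables), but many DBMSs also expose the standard information_schema views; here is a minimal sketch of a catalog query against a hypothetical employee table:

-- List the columns of a table, with data types and nullability, from the catalog views
SELECT column_name, data_type, is_nullable
FROM   information_schema.columns
WHERE  table_name = 'employee'
ORDER BY ordinal_position;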

Posted March 02, 2017

The role of the DBA is growing and becoming more complicated in the age of digital transformation. As the amount and type of information we store expands, DBAs must become versed in administering not just one type of DBMS (e.g., relational), but multiple types (document, key/value, wide column stores, and graph) and even non-DBMS data platforms (e.g., Hadoop). Furthermore, cloud computing can change the manner in which existing applications and databases operate.

Posted February 08, 2017

The industry is changing, the way that DBAs work is changing, and database systems are changing. We all need to come to grips with the fact that the way we worked in the past is no longer the way we work with today's modern database environment.

Posted January 03, 2017

The clear trend these days is to automate and enable computerized tasks to streamline and optimize administrative and maintenance tasks. Many database management tasks that today require oversight and handholding by DBAs can, over time, be turned over to intelligently automated software to manage. But automation is just the first step.

Posted December 01, 2016

Navicat: A Helpful Toolkit for DBAs

Posted November 02, 2016

Typically, most applications consist of both batch and online workloads. This is true even today, when most of our attention has turned to online and web-based interaction. Sure, online activities are the most obvious, in-your-face component of countless applications, but batch processing still drives many actions behind the scenes. This can include applying updates, processing reports, integrating input from multiple sources and locations, data extraction, database utility processing, and more.

Posted October 07, 2016

The current driving force for many IT projects is big data and analytics. So how will the job of DBA be impacted as their companies deploy big data analytics systems? The answer is: quite a bit. But don't forget everything you already know!

Posted September 02, 2016

Regulatory compliance is a critical aspect of the IT landscape these days, and the ability to audit database activities showing who did what to which data when is a specific requirement of many industry and governmental regulations. There are six primary methods that can be used to accomplish database auditing.

Posted August 04, 2016

Database administration is undergoing some significant changes these days. The DBA, traditionally, is the technician responsible for ensuring the ongoing operational functionality and efficiency of an organization's databases and the applications that access that data. But modern DBAs are relied upon to do far more than just stoke the fires to keep database systems performing.

Posted July 12, 2016

Keeping your DBMS software up-to-date can be a significant job. The typical release cycle for DBMS software is every 18 to 36 months for major releases, with constant bug fixes and maintenance updates delivered in between those major releases.

Posted June 09, 2016

Being able to assess the effectiveness and performance of your database systems and applications is one of the most important things that a DBA must be able to do. This can include online transaction response time evaluation, sizing of the batch window and determining whether it is sufficient for the workload, end-to-end response time management of distributed workload, and more. But in order to accurately gauge the effectiveness of your current environment and setup, service level agreements, or SLAs, are needed.

Posted May 04, 2016

When users require access to multiple databases on multiple servers distributed across different physical locations, database security administration can become quite complicated. The commands must be repeated for each database, and there is no central repository for easily modifying and deleting user security settings on multiple databases simultaneously. At a high level, database security boils down to answering four questions.

Posted March 31, 2016

Every DBA should take advantage of the mechanisms provided by the DBMS to ensure data integrity. When DBMS-provided methods are used, fewer data integrity problems are likely to be found. Fewer data integrity problems mean higher quality databases and more proficient end users.

Posted March 03, 2016

One of the biggest challenges facing organizations today is making sure that the right information gets to the right people. It requires attention, diligence, and planning to ensure that data is used appropriately and accurately. Unfortunately, few organizations treat data as the corporate asset it truly is.

Posted February 10, 2016

If you are a working DBA, the actual work you do these days is probably significantly different than it was when you first began work as a DBA. So is the term DBA really accurate any longer? Or has the job grown into something more?

Posted January 07, 2016

As a DBA, establishing a reasonable backup schedule for your databases can be a challenging project. It requires you to balance two competing demands: the need to take image copy backups frequently enough to assure reasonable recovery time, and the need to not interrupt daily business. The DBA must be capable of balancing these two objectives based on usage criteria and the capabilities of the DBMS.

Posted December 02, 2015

There are many different ways to look at database administration. It can be done by task, by discipline, by DBMS, by server, and so on. But one useful way to look at database administration is in terms of the type of support being delivered to applications. You can paint a broad brush stroke across the duties of the DBA and divide them into two categories: those that support development work and those that support the production systems.

Posted November 09, 2015

Too little emphasis overall is placed on the integrity and recoverability of the data—and too much is placed on performance. Yes, performance is probably the most visible aspect of database systems, at least from the perspective of the end user. But the underlying assumption of the end user is always that they want to access accurate and, usually, up-to-date data. But what good does it do to quickly access the wrong data? Anybody can provide rapid access to the wrong data!

Posted October 07, 2015

Whenever I get into a discussion about database standards I invariably bring up one of my favorite quotes on the topic: "The best thing about standards is that there are so many to choose from." It shouldn't be true, but it is.

Posted September 09, 2015

As a data professional, you have heard the term "unstructured data." And you probably know what is meant by that term, as well. For those who do not, unstructured data is a general term used to define data that is not numbers, letters, and dates stored or viewed as rows and columns. But it is a horrible term. In fact, unstructured data is a lie. Let me tell you why.

Posted August 10, 2015

You'd be surprised at the variety of sages that have uttered pithy pieces of wisdom that prove useful in some way to DBAs. So with that in mind, let's review some of the could-have-been-DBAs through history by reviewing their own words!

Posted July 08, 2015

The landscape for database management systems is changing more rapidly these days than it has since the earliest days of the relational DBMS. Not only do we have an onslaught of NoSQL database systems of various different forms (column, document, key/value, and graph databases), but we also see a burgeoning market for in-memory database management, where the DBMS relies on main memory instead of disk for data storage, management, and manipulation. But there is another "category" of DBMS evolving that is known as "NewSQL."

Posted June 09, 2015

Every now and then, somebody will raise the age-old question "How can I measure the effectiveness and quality of my DBA staff?" This can be a difficult question to answer. And it almost always hides the actual question that is begging to be asked, which is "How many DBAs do we need?"

Posted May 14, 2015

When databases are built from a well-designed data model, the resulting structures provide increased value to the organization. The value derived from the data model exhibits itself in the form of minimized redundancy, maximized data integrity, increased stability, better data sharing, increased consistency, more timely access to data, and better usability.

Posted April 06, 2015

One of the trickiest aspects of relational database management can be dealing with missing information. The standard method of representing missing information is to set the "value" to null.
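
A minimal sketch of why nulls need special handling, using a hypothetical employee table with an optional middle_name column:

-- The IS NULL predicate finds rows where the value is missing;
-- an ordinary comparison can never match a null, because the result is unknown
SELECT emp_id, middle_name
FROM   employee
WHERE  middle_name IS NULL;

-- COALESCE supplies a substitute value when the column is null
SELECT emp_id, COALESCE(middle_name, '(none)') AS middle_name
FROM   employee;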

Posted March 12, 2015

When the data requirements of an organization change, the databases used to store the data must also change. Unfortunately, today's database systems do not make managing database change particularly easy.

Posted February 11, 2015

Key Developments in Big Data, Data Management, and New Tech in 2014

Posted January 29, 2015

Lock contention issues can be frustrating problems to investigate and debug. Before blaming the database system (as is the usual response), there are a few questions to consider that can help identify the cause of the locking problems.

Posted December 03, 2014

Designing an appropriate set of indexes can be one of the more troubling aspects of developing efficient relational database applications. Perhaps the most important thing you can do to assure optimal application performance when accessing data in a relational/SQL database is to create the correct indexes for your tables based on the queries your applications use. Of course, this is easier said than done. With the caveat that appropriate index creation can be complicated, let's look at the top 10 steps to building the right indexes on your tables.
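
For instance, here is a minimal sketch pairing a hypothetical query with an index built to support it; the right column order and index options would depend on the actual workload:

-- A frequently run query filtering on customer and date range, ordered by date
-- (the ? markers stand in for host variables or parameters)
SELECT order_id, order_date, total_amount
FROM   customer_order
WHERE  customer_id = ?
  AND  order_date >= ?
ORDER BY order_date;

-- A composite index that supports both the filtering and the ordering
CREATE INDEX ix_order_cust_date
  ON customer_order (customer_id, order_date);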

Posted November 12, 2014

Check constraints are supported in all of the major DBMS products, including DB2, Oracle, and SQL Server. But they are too often ignored and not implemented in production databases. Properly defined, check constraints can provide enhanced data integrity without requiring procedural logic.
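
A minimal sketch of the idea, using hypothetical tables and rules; the DBMS checks the constraint automatically on every INSERT and UPDATE:

-- Add a check constraint to an existing table: salaries may never be negative
ALTER TABLE employee
  ADD CONSTRAINT check_salary CHECK (salary >= 0);

-- Check constraints can also be declared inline when a table is created
CREATE TABLE employee_rating (
  emp_id  INTEGER  NOT NULL,
  rating  SMALLINT NOT NULL CHECK (rating BETWEEN 1 AND 5)
);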

Posted October 08, 2014

Recounting the evolution of IBM's relational database technology DB2, Craig S. Mullins goes back a full three decades in his latest column, starting with the wild and woolly 1980s and ending with DB2 10 and 11 for z/OS as the currently supported versions of DB2, which have added important new functionality.

Posted September 10, 2014

Programs that read database data can access numerous rows and are therefore susceptible to concurrency problems. To get around this issue, most major RDBMS products support read-through locks, also known as "dirty read" or "uncommitted read," to help overcome concurrency problems. When might you want to consider using dirty reads in your applications?
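
The syntax for requesting a dirty read differs by product; as a minimal sketch, a Db2-style statement-level clause is shown below against a hypothetical account table (SQL Server, for example, uses the READ UNCOMMITTED isolation level or the NOLOCK table hint for the same effect):

-- Read rows without waiting for, or taking, read locks; uncommitted changes may be returned
SELECT account_id, balance
FROM   account
WITH UR;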

Posted August 05, 2014

Lists help us to remember, impose order upon tasks, and provide guidance. Some lists have taken on great importance in the world and in our lives. The Bill of Rights is such a list. So is the Ten Commandments. But although we use lists to administer, program, and maintain our databases, there really is no overarching list of "things you should (or should not) do" to ensure an effective, accurate, well-designed database. So here is the list of Ten Database Commandments.

Posted July 03, 2014

Good data quality starts with metadata. Accurate data definitions are required in order to apply the controls for compliance to the correct data. Without proper metadata definitions, it is impossible to apply regulatory compliance to data.

Posted June 11, 2014

Recursive SQL can be very elegant and efficient. However, because of the difficulty developers can have understanding recursion, it is sometimes thought of as "too inefficient to use frequently." But, if you have a business need to walk or explode hierarchies in your database, recursive SQL will likely be your most efficient option.
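
As a minimal sketch, the recursive common table expression below walks a hypothetical employee reporting hierarchy starting from one manager; some DBMSs (such as Db2 and SQL Server) omit the RECURSIVE keyword:

-- Anchor: the starting manager; recursive member: everyone reporting to a prior row
WITH RECURSIVE org_chart (emp_id, emp_name, mgr_id, depth) AS (
  SELECT emp_id, emp_name, mgr_id, 0
  FROM   employee
  WHERE  emp_id = 100
  UNION ALL
  SELECT e.emp_id, e.emp_name, e.mgr_id, c.depth + 1
  FROM   employee e
  JOIN   org_chart c ON e.mgr_id = c.emp_id
)
SELECT emp_id, emp_name, depth
FROM   org_chart
ORDER BY depth, emp_name;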

Posted May 08, 2014

Driving a wide range of applications, from operational applications such as fraud detection to strategic analysis such as customer segmentation, advanced analytics goes deeper than traditional business intelligence activities into the "why" of the situation, and delivers likely outcomes.

Posted April 04, 2014

Database disaster recovery must be an integral component of your overall business recovery plan. Just because your organization has not yet experienced a disaster, or is not in a high-risk area, does not absolve you from the need for contingency planning. Here is what all DBAs should do to be prepared.

Posted March 12, 2014

Metadata is required to make data usable by applications, yet many data professionals do not understand or simply ignore metadata management. Just think about the metadata you must manage on your iPod—or whatever MP3 player you use—to be able to successfully select the music you want to hear.

Posted February 10, 2014

DB2 11 for z/OS brings with it a bevy of interesting and useful new features. They run the gamut from development to administration to performance to integration with big data. Even if your company won't be migrating right away, now that DB2 11 is out in the field and available, the time has come for all DB2 users to take some time to learn what this latest and greatest version of DB2 can do.

Posted December 04, 2013

Although the concept of NoSQL is relatively new, some of the players have actually been around for a while. Yet, the concept of what NoSQL means continues to change. With all that in mind, let's take a look at the world of NoSQL.

Posted November 13, 2013

A successful DBA must understand application development and the issues involved in programming and design. Here are some things that every DBA must know about application development and the design projects of their organization.

Posted October 09, 2013

DBAs need to make many different types of changes to a database over its lifetime. Some will be simple and easy to implement, others much more difficult and complex. It is the DBA's job to understand the best way to implement any type of database change, but often, simple changes become more difficult in the real world. Database change management tools help to make this job easier and are among the first tools acquired by many organizations when they implement a database of any size.

Posted September 11, 2013

Just about every company with a DBMS has that binder full of corporate and/or IT standards. That one over there in the corner with the cobwebs on it — the one that you only use when you need an excuse to avoid work. Okay, well, maybe it's not quite that bad. Your standards documents could be on the company intranet or some other online mechanism (but chances are there will be virtual cobwebs on your online standards manuals, too).

Posted August 07, 2013

Perhaps the most common question I am asked by readers is "How can I become a DBA?" The question is actually not as simple as it seems and there are many different aspects to the answer. Sometimes it is an application programmer who wants to become a DBA. Sometimes a DBA for one DBMS wants to support a different DBMS; for example, a SQL Server DBA wants to move to become an Oracle DBA. And other times someone with no IT background at all asks the question. The answer is different for all of these folks.

Posted July 09, 2013

Although storage management can be an afterthought for the DBA, it really shouldn't be. Storage issues are vitally important and, unless managed appropriately, can be very costly. The cost of managing storage can be as much as 10 times higher than the initial cost of acquiring the storage—and the growth rate for disk storage was 37% between 1996 and 2007. Even so, it is common for storage-related issues to be relegated to the backburner by DBAs, but every database professional should understand modern storage basics.

Posted June 13, 2013

Data is not sedentary. Once data has been created, organizations tend to move it around to support many different purposes—different applications, different geographies, different users, different computing environments, and different DBMSs. Data is copied and transformed and cleansed and duplicated and stored many times throughout the organization. Different copies of the same data are used to support transaction processing and analysis; test, quality assurance, and operational systems; day-to-day operations and reporting; data warehouses, data marts, and data mining; and distributed databases. Controlling this vast sea of data falls on the DBA who uses many techniques and technologies to facilitate data movement and distribution.

Posted May 09, 2013

"Big data" and the impact of analytics on large quantities of data is a persistent meme in today's Information Technology market. One of the big questions looming in IT departments about big data is what, exactly, does it mean in terms of management and administration. Will traditional data management concepts such as data modeling, database administration, data quality, data governance, and data stewardship apply in the new age of big data? According to analysts at Wikibon, big data refers to datasets whose size, type and speed of creation make it impractical to process and analyze with traditional tools . So, given that definition, it would seem that traditional concepts are at the very least "impractical," right?

Posted April 10, 2013

When data professionals think about regulatory compliance we tend to consider only data in our production databases. After all, it is this data that runs our business and that must be protected. So we work to implement database auditing to know who did what to which data when; or we tackle database security and data protection initiatives to protect our data from prying eyes; or we focus on improving data quality to ensure the accuracy of our processes.

Posted March 14, 2013

Enterprise developers these days are usually heads down, in the trenches working on in-depth applications using Java or .NET with data stored in SQL Server or Oracle or DB2 databases. But there are other options. One of them is FileMaker, an elegant database system and development platform that can be used to quickly build visually appealing and robust applications that run on Macs, Windows PCs, smartphones, and iPads.

Posted February 13, 2013

Each new year, this column looks back over the most significant data and database-related events of the previous year. Keeping in mind that this column is written before the year is over (in November 2012) to meet publication deadlines, let's dive into the year that was in data.

Posted January 03, 2013

A proper database design cannot be thrown together quickly by novices. A practiced and formal approach to gathering data requirements and modeling data is mandatory. This modeling effort requires a formal approach to the discovery and identification of entities and data elements. Data normalization is a big part of data modeling and database design. A normalized data model reduces data redundancy and inconsistencies by ensuring that the data elements are designed appropriately.
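
To illustrate the point with a quick sketch (the tables and columns below are hypothetical, invented for this example): an unnormalized orders table repeats customer details on every row, while a normalized design stores them once and references them by key.

    -- Unnormalized: customer name and address repeat on every order row
    CREATE TABLE ORDERS_FLAT
      (ORDER_NO     INTEGER      NOT NULL PRIMARY KEY,
       CUST_NAME    VARCHAR(60)  NOT NULL,
       CUST_ADDRESS VARCHAR(120) NOT NULL,
       ORDER_DATE   DATE         NOT NULL);

    -- Normalized: customer data is stored once and referenced by key
    CREATE TABLE CUSTOMER
      (CUST_NO      INTEGER      NOT NULL PRIMARY KEY,
       CUST_NAME    VARCHAR(60)  NOT NULL,
       CUST_ADDRESS VARCHAR(120) NOT NULL);

    CREATE TABLE ORDERS
      (ORDER_NO     INTEGER      NOT NULL PRIMARY KEY,
       CUST_NO      INTEGER      NOT NULL REFERENCES CUSTOMER,
       ORDER_DATE   DATE         NOT NULL);

With the redundancy removed, a customer address change becomes a single update instead of one update per order row.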

Posted December 06, 2012

Every organization that manages data using a DBMS requires a database administration group to ensure the effective use and deployment of the company's databases. And since most modern organizations of every size use a DBMS, most organizations have DBAs, or at least people who perform the ongoing maintenance and optimization of the database infrastructure.

Posted November 13, 2012

As businesses push to reduce the data latency between analytical systems and operational systems, data warehouses begin to take on more of the character of a transactional system. For a data warehouse to deliver near real-time information, the choices generally are to update the warehouse more frequently or access data directly from operational systems. Either way, the push to reduce latency changes the nature of database performance to support the data warehouse. The main reason we created data warehouses in the first place was to separate resource-intensive analytical processing from shorter duration, but very frequent transaction processing. As the two worlds now come back together, the churn pressure on the database system can be significant.

Posted October 10, 2012

If you've worked with relational database systems for any length of time, you've probably participated in a discussion (argument?) about the topic of this month's column, surrogate keys. A great debate rages within the realm of database developers about the use of "synthetic" keys. And if you've ever Googled the term "surrogate key," you know the hornet's nest of opinions that swirls around on the topic. For those who haven't heard the term, here is my attempt at a quick summary: A surrogate key is a generated unique value that is used as the primary key of a database table; database designers tend to consider surrogate keys when the natural key consists of many columns, is very long, or may need to change.
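
To make that definition concrete, here is a minimal sketch (the table, columns, and identity clause are illustrative; the exact generated-key mechanism varies by DBMS):

    -- The natural key (name, birth date, postal code) is long and can change,
    -- so a generated value serves as the primary key instead.
    CREATE TABLE EMPLOYEE
      (EMP_ID      INTEGER NOT NULL GENERATED ALWAYS AS IDENTITY,
       LAST_NAME   VARCHAR(40) NOT NULL,
       FIRST_NAME  VARCHAR(40) NOT NULL,
       BIRTH_DATE  DATE        NOT NULL,
       POSTAL_CODE CHAR(10)    NOT NULL,
       PRIMARY KEY (EMP_ID),
       UNIQUE (LAST_NAME, FIRST_NAME, BIRTH_DATE, POSTAL_CODE));

Note the unique constraint on the natural key columns: a surrogate key does not remove the need to enforce whatever business rule actually identifies a row.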

Posted September 11, 2012

It is impossible to have missed the sweeping changes being thrust upon the data world due to regulatory compliance. But even if you've noticed, chances are that the sheer volume of regulations was too mind-boggling to fully digest. Compliance starts with the CEO, but it works its way down into the trenches, and impacts database administration. With that in mind, this month's column will offer a brief introduction to the regulatory landscape and its impact on database administration.

Posted August 09, 2012

Although data integrity is a pervasive problem, there are some data integrity issues that can be cleaned up using a touch of SQL. Consider the common data entry problem of extraneous spaces in a name field. Not only is it annoying, sometimes it can cause the system to ignore relationships between data elements.
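
A minimal sketch of that kind of cleanup (the table and column names are hypothetical; TRIM is widely supported, though function names differ in some DBMSs):

    -- Strip leading and trailing blanks, touching only the rows that need it
    UPDATE CUSTOMER
       SET CUST_NAME = TRIM(CUST_NAME)
     WHERE CUST_NAME <> TRIM(CUST_NAME);

Limiting the update with the WHERE clause keeps logging and lock contention down, and a SELECT with the same predicate lets you gauge the scope of the problem before changing anything.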

Posted July 11, 2012

Unless you plan for and issue regular COMMITs in your database programs, you will be causing locking problems. It is important for every programmer to issue COMMIT statements in all application programs where data is modified (INSERT, UPDATE, and DELETE). A COMMIT externalizes the modifications that occurred in the program since the beginning of the program or the last COMMIT. A COMMIT ensures that all modifications have been physically applied to the database, thereby ensuring data integrity and recoverability. Failing to code COMMITs in a data modification program is what I like to call "Bachelor Programming Syndrome" — in other words, fear of committing.
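
As a sketch of the pattern (the table, host variables, and unit-of-work size are all hypothetical), a batch program that modifies many rows should commit at regular intervals rather than holding everything until end of job:

    -- Inside the program's processing loop, after each batch of updates
    -- (say, every few hundred rows), externalize the work and free the locks:
    UPDATE ACCOUNT
       SET BALANCE = BALANCE + :adjustment
     WHERE ACCT_NO = :acct_no;

    -- ...repeat for the rest of the unit of work, then:
    COMMIT;

The program also needs to record a restart point before each COMMIT so that it can resume from the last completed unit of work after a failure.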

Posted June 13, 2012

Every now and then some sage consultant will offer advice like "Large tables should be partitioned" or "Be sure to use static SQL for your applications with high volume transaction workloads." But how useful is this advice? What do they mean by large and high volume? Terms such as these are nebulous and ever changing. Just what is a large database today?

Posted May 09, 2012

An important aspect of database security is designing your applications to avoid SQL injection attacks. SQL injection is a form of web hacking whereby SQL statements are specified in the fields of a web form to cause a poorly designed web application to dump database content to the attacker. Stories abound in the news where SQL injection was used for nefarious purposes. Several high-profile cases over the past few years impacted government websites, Microsoft in the U.K., a Swedish election, PBS (the Public Broadcasting Service), and Lady Gaga's website (among many others).
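
A minimal illustration of the mechanics (the table, column, and input values are hypothetical): when an application builds its SQL by splicing a web form field directly into the statement text, a crafted value changes the meaning of the statement.

    -- What the developer intended, with the form field spliced into the text:
    SELECT CUST_NO, CUST_NAME
      FROM CUSTOMER
     WHERE CUST_NAME = 'Smith';

    -- The same statement after the attacker enters:  ' OR '1'='1
    SELECT CUST_NO, CUST_NAME
      FROM CUSTOMER
     WHERE CUST_NAME = '' OR '1'='1';  -- the predicate is now always true

The basic defense is to never concatenate user input into SQL text; use parameterized statements (host variables or parameter markers) and validate input so that what the user types is treated as data, not as part of the statement.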

Posted April 11, 2012

As a DBA, it is almost inevitable that you will change jobs several times during your career. When making a job change, you will obviously consider requirements such as salary, bonus, benefits, frequency of reviews, and amount of vacation time. However, you also should consider how the company "treats" its DBAs. Different organizations place different value on the DBA job. It is imperative to your career development that you scout for progressive organizations that understand the complexity and ongoing learning requirements for the position.

Posted March 07, 2012

Many types of data change over time, and different users and applications have requirements to access data at different points in time. A traditional DBMS stores data that is implied to be valid at the current point in time; it does not track the past or future states of the data. For some, the current, up-to-date values for the data are sufficient. But for others, accessing earlier versions of the data is needed. Temporal support makes it possible to store different database states and to query the data "as of" those different states.
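
For example, with system-period temporal support (DB2 10 for z/OS introduced syntax along these lines; the table and timestamp here are hypothetical), a query can ask for rows as they existed at an earlier time:

    -- Return the policy as it existed at the end of June 2011
    SELECT POLICY_NO, COVERAGE_AMT
      FROM POLICY
      FOR SYSTEM_TIME AS OF '2011-06-30-23.59.59'
     WHERE POLICY_NO = 1234;

The DBMS maintains the history rows behind the scenes, so applications get "as of" access without hand-rolled history tables and triggers.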

Posted February 09, 2012

At the outset of each new year, I devote an edition of my column to review the significant data and database-related events of the previous year. Of course, to meet my deadlines, the column is written before the year is over (this column is being written in November 2011), so please excuse any significant news that may have happened late in December.

Posted January 11, 2012

In a world replete with regulations and threats, organizations today have to go well beyond just securing their data. Protecting this most valuable asset means that companies have to perpetually monitor their systems in order to know who did exactly what, when and how - to their data.

Posted December 01, 2011

Being a successful database administrator requires far more than technical acumen and database knowledge. DBAs should be armed with a proper attitude as well as sufficient fortitude and personality before attempting to practice database administration. Gaining the technical know-how is important, yes, but there are many sources that offer technical guidance for DBAs. The non-technical aspects of the DBA job are just as challenging, though. So with that in mind, this month's column will offer 10 "rules of thumb" for DBAs to follow as they improve their soft skills.

Posted November 10, 2011

Every professional programmer (and DBA) should have a library of books on SQL fundamentals. There are many SQL titles to choose from, and a lot of them are very good. But this month's column will outline four SQL books that should be on every database professional's bookshelf.

Posted October 15, 2011

Cost containment is an important IT department goal in this day and age of financial austerity. Every decision regarding your computing environment must be weighed not only against the value it can deliver to your organization but also against its cost to procure, implement, and maintain. If a positive return on investment cannot be rapidly delivered, then the software (or hardware) won't be adopted.

Posted September 14, 2011

Often, when the business of data management frustrates me, I look for inspiration in what may seem at first glance to be odd places. For instance, I think the Lewis Carroll "Alice in Wonderland" books offer sage advice for our particular industry.

Posted August 11, 2011

Simplification is important in today's era of increasing complexity and ever-changing software environments. A key component of simplification is to remember the basics and apply some elementary rules and practices to your database environment. Many problems arise because we don't keep track of the things most of us already know. I stumbled upon the idea for this month's column after recalling Malcolm Gladwell's excellent business book Blink: The Power of Thinking Without Thinking. In this book, Gladwell offers up case studies and examples depicting the benefit of our "adaptive unconscious" - a powerful innate ability that provides us with instant and sophisticated information.

Posted July 07, 2011

The DBA, often respected as a database guru, is just as frequently criticized as a curmudgeon with technical knowledge but limited people skills. Most programmers have their favorite DBA story. You know, those anecdotes that begin with "I had a problem ..." and end with "... and then he told me to stop bothering him and read the manual." DBAs do not have a warm and fuzzy image. This may have something to do with the nature and scope of the job. The DBMS spans the enterprise, effectively placing the DBA on call for the applications of the entire organization. Database issues can require periods of quiet reflection and analysis to resolve, so DBAs generally do not want to be disturbed. But their quiet time is usually less than quiet; constant interruptions to answer questions and solve problems are a daily fact of life.

Posted June 08, 2011

Design reviews are an important facet of the system development lifecycle for database applications. It is during the design review that all aspects of the database and application code are reviewed for efficiency, effectiveness, and accuracy. It is imperative that all database applications, regardless of their size, are reviewed to assure that the application was designed properly, that efficient coding techniques were used, and that the database is accessed and modified correctly and efficiently. The design review is an important process for checking the validity of design decisions and correcting errors before applications and databases are promoted to production status.

Posted May 12, 2011

What will you do when you find out you're about to acquire or consolidate with another firm or division? Are you aware of the risks you may be inheriting? What data is going to demand the highest availability? What IT regulations will you have to address, and how do you know if existing controls already address them? Here are 10 "data health" checks you can conduct to help answer these questions before giving a green light to an M&A or consolidation.

Posted April 05, 2011

One of the ongoing goals of database administration is to minimize downtime and improve availability. If the DBMS is down, data cannot be accessed. If the data is not available, applications cannot run. And if your applications cannot run, your company is losing business. Lost business translates into lower earnings and perhaps even a lower stock valuation for your company. These are all detrimental to the business and therefore, the DBA is called upon to do everything in his or her power to ensure that databases are kept online and operational.

Posted March 09, 2011

One of the most fertile grounds for disagreement between database professionals is the appropriate usage of views. Some analysts promote the liberal creation and usage of views, whereas others preach a more conservative approach. When properly implemented and managed, views can be fantastic tools that help to ease data access and simplify development. Although views are simple to create and implement, few organizations take a systematic and logical approach to view creation. And therein lies the controversy. A strategic and reasonable policy guiding the creation and maintenance of views is required to avoid a muddled and confused mish-mash of view usage. Basically, views are very useful when implemented wisely, but can be an administrative burden if implemented without planning.
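
As a small example of the "ease data access" side of the argument (object names are hypothetical), a view can hide a join and a filter behind one simple object:

    CREATE VIEW ACTIVE_CUSTOMER_ORDERS AS
      SELECT C.CUST_NO, C.CUST_NAME, O.ORDER_NO, O.ORDER_DATE, O.AMOUNT
        FROM CUSTOMER C
        JOIN ORDERS   O ON O.CUST_NO = C.CUST_NO
       WHERE C.STATUS = 'A';

Every view like this is also one more object to document, secure, and maintain, which is exactly why a deliberate view policy beats ad hoc view creation.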

Posted February 02, 2011

As my regular readers know, toward the end of each year I devote an edition of my column to review the significant data and database-related events of the past year. Of course, to meet my deadlines, the column is written before the year is over (this column is being written in November 2010), so please excuse any significant news that may have happened late in the year!

Posted January 07, 2011

Assuring optimal performance of database applications starts with coding properly formulated SQL. Poorly written SQL and application code is the cause of most performance problems. As much as 75% of poor relational performance is caused by "bad" SQL and application code. But writing efficient SQL statements can be tricky. This is especially so for programmers new to a relational database environment or those who have never been trained to properly write SQL.
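
One classic example of what "bad" SQL can look like (hypothetical table; assume ORDER_DATE is indexed, and note that function names vary by DBMS): wrapping an indexed column in a function within a predicate can keep the optimizer from using the index.

    -- Likely to force a scan: the function must be applied to every row
    SELECT ORDER_NO, AMOUNT
      FROM ORDERS
     WHERE YEAR(ORDER_DATE) = 2010;

    -- Index-friendly rewrite: the column is compared directly to a range
    SELECT ORDER_NO, AMOUNT
      FROM ORDERS
     WHERE ORDER_DATE BETWEEN '2010-01-01' AND '2010-12-31';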

Posted November 30, 2010

SQL is the lingua franca for modifying and reading database data and any DBA worth his (or her) paycheck should be proficient in writing SQL queries. But SQL is a flexible and feature-rich language, so there are always things that can be learned - even by senior technicians. As such, this month's column discusses several interesting SQL queries that you can put in your bag of tricks for future use.
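
In that spirit, here is one such query worth keeping handy, a quick duplicate finder (the table and columns are hypothetical):

    -- List combinations of identifying columns that appear on more than one row
    SELECT LAST_NAME, FIRST_NAME, BIRTH_DATE, COUNT(*) AS ROW_COUNT
      FROM EMPLOYEE
     GROUP BY LAST_NAME, FIRST_NAME, BIRTH_DATE
    HAVING COUNT(*) > 1;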

Posted November 09, 2010

If you are a DBA, or a database performance analyst, chances are that you deal with performance-related issues regarding your database systems every day of the week. But have you ever stopped for a moment and tried to define what you mean when you say "database performance"? Doing so can be a worthwhile exercise, if only to organize your thoughts on the matter. Think about it; don't we need a firm definition of database performance before we can attempt to manage the performance of our databases?

Posted October 12, 2010

Database systems require data files to store the data under management. These files, or data sets, reside on storage media. So storage management should be a key part of the database operations required of a database administrator (DBA). Unfortunately, storage is sometimes relegated to an afterthought; after all, don't we have storage administrators who deal with our disk arrays? But this way of thinking is misguided. To succeed, database administration and storage administration need to cooperate and work together.

Posted September 07, 2010

What is the most difficult thing about acquiring enterprise software? If you are like most IT technicians, your first inclination was probably something related to cost justification. Let's face it, enterprise software typically is very expensive ... and eventually, something will need to bring costs more in line with value.

Posted August 10, 2010

In this issue's column I'll be providing a fundamental introduction to database and database management concepts. Many of you may think that you understand the basic concepts and fundamentals of database technology. But quite a few of you likely do not, so please do not skip over this. First of all, what is a database? DB2 is not a database; neither are Informix, Oracle, and SQL Server. Each of these is a DBMS, or Database Management System. You can use DB2 (or Informix or SQL Server) to create a database, but DB2, in and of itself, is not a database.
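
Put another way, the DBMS is the software you install, and a database is an object you create with it; something along these lines (the name is hypothetical, and the exact statement and options vary by DBMS):

    -- Ask the DBMS to create a database, which will in turn hold
    -- tables, indexes, views, and the other objects your applications use
    CREATE DATABASE PAYROLL_DB;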

Posted July 12, 2010

Before we go any further, let me briefly answer the question posed in this column's title: "No Way!" OK ... with that out of the way, let's discuss the issue ... Every so often, some industry pundit gets his opinions published by declaring that "Database administrators are obsolete" or that "we no longer need DBAs." Every time I hear this, it makes me shake my head sadly as I regard just how gullible IT publications can be.

Posted June 07, 2010

Have you heard about stream computing? Basically, it involves the ingestion of data - structured or unstructured - from arbitrary sources and the processing of it without necessarily persisting it. Any digitized data is fair game for stream computing. As the data streams, it is analyzed and processed in a problem-specific manner. The "sweet spot" applications for stream computing are situations in which devices produce large amounts of instrumentation data on a regular basis. The data is difficult for humans to interpret easily and is likely to be too voluminous to be stored in a database somewhere. Examples of types of data that are well-suited for stream computing include healthcare, weather, telephony, stock trades, and so on.

Posted May 10, 2010

As you work to protect your data in this day-and-age of data breaches and regulatory compliance, technology and software solutions to data and database security spring to the top of most people's minds. This is to be expected because, after all, most of our data is stored on computers so technology and software are required to protect the data from unauthorized access. This is a good thing: Technology is a crucial component of protecting your valuable business data. But it is not the only thing.

Posted April 07, 2010

The continuing acceptance and growing usage of Linux as an enterprise computing platform has enlivened the open source community. The term "open source" refers to software that users are free to run, copy, distribute, study, change, and improve. Often "open source" gets misinterpreted to mean free software. This is understandable, but the open source concept of free is closer to liberty than it is to no charge.

Posted March 04, 2010

Have you ever read those inserts that your bank, credit card providers, insurance company, mutual fund company, and others slip inside your statements and bills? We all get them. You know, those flimsy pieces of paper, printed in small type and written in convoluted English. I have started collecting them - sort of like baseball cards. But I doubt they'll ever be valuable. They are entertaining, though ... and disheartening.

Posted February 09, 2010

My whole career has been based on managing data and producing information and, as such, I am intrigued with the issue of information overload - or the perception that there is too much information. A former boss called me an information bottom-feeder because I always seemed to have a nugget of information or two that applied to her projects and quests. You see, I'm of the opinion that you can never have enough information - at least regarding those things you care about.

Posted January 11, 2010

As per my regular custom, this final DBA Corner column of the year is a review of the most significant data and database-related events of the year. Of course, to meet my deadlines, it is October 2009 as I write this, so please excuse any significant news that may have happened late in the year!

Posted December 14, 2009

How many times have you been surfing the web only to encounter a form that requests a slew of personal information before you can continue on? You know what I'm talking about. A company markets a white paper or poll results or something else that intrigues you, so you click on the link, and bang, there you are. You don't have the information you wanted yet, but if you just fill out this form then you'll be redirected to the information.

Posted November 11, 2009

As my regular readers know, I am an avid reader, especially of technology books. And every now and then I review some of the more interesting database-related books in the DBA Corner column.

Posted October 13, 2009

If you use an IBM zSeries mainframe you've undoubtedly heard about zIIPs and zAAPs and other specialty processors. But maybe you haven't yet truly examined what they are, what they do, and why they exist. So, with that in mind, let's take a brief journey into the world of specialty processors.

Posted September 14, 2009

Managing Data Resources During Mergers and Acquisitions

Posted August 19, 2009

As the U.S. markets strive for a recovery in 2009, many IT managers are cringing at the thought of managing their data through what may be a record year of mergers and acquisitions. Managing an ever-increasing mountain of data is not a simple task in the best of times, but doing so while combining formerly separate entities during an economic slowdown can be a monumental challenge.

Posted August 14, 2009

Protecting the data in our enterprise databases is extremely important. But what exactly does that mean? Oh, at one level we have the database authorization and roles built directly into the DBMS products. You know what I'm talking about: GRANT and REVOKE statements that can be used to authorize access to database objects, resources and statements. Many organizations have adopted policies and products to migrate this type of security out of the DBMS and into their operating system security software.
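
For readers new to that layer of security, a couple of representative statements (the table and authorization ID are hypothetical):

    -- Allow a payroll clerk's ID to read and update the table, nothing more
    GRANT SELECT, UPDATE ON PAYROLL.EMPLOYEE TO PAYCLERK1;

    -- Remove the update capability when the job role changes
    REVOKE UPDATE ON PAYROLL.EMPLOYEE FROM PAYCLERK1;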

Posted July 13, 2009

Before we even begin this month's column I had better define what I mean by a "black box." Simply put, a black box is a database access program that sits between your application programs and the DBMS. It is designed so that all application programs call the black box for data instead of writing SQL statements that are embedded into a program. The general idea behind such a contraption is that it will simplify application development because programmers will not need to know how to write SQL. Instead, programmers call the black box to request data. SQL statements become calls, and every programmer knows how to code a call, right?

Posted June 15, 2009

The economy is a wreck and things will likely get worse before they improve. Unemployment is even worse; almost 600,000 jobs were lost in January 2009, sending the unemployment rate to 7.6%, the highest it has been in 16 years. So many data professionals are out there looking for their next challenge … and more probably will be job hunting before the year is out.

Posted May 15, 2009

Although the most important aspect of DBA tool selection is functionality and the way it satisfies your needs, the stability of the vendor that provides the product is also important.

Posted April 15, 2009

Have you noticed that dynamic SQL is more popular today than ever before? There are a number of factors contributing to the success of dynamic SQL. Commercial off-the-shelf applications, such as SAP, Siebel, and PeopleSoft, utilize dynamic SQL exclusively. In many cases, too, dynamic SQL is the default choice for in-house application development.
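
For contrast with static SQL, here is the classic embedded dynamic SQL pattern in rough form (the host variables are hypothetical, and precompiler details vary by programming language): the statement text is prepared at run time and then executed, possibly repeatedly, with values supplied for parameter markers.

    -- :stmt_text contains something like:
    --   UPDATE EMPLOYEE SET SALARY = SALARY * 1.04 WHERE DEPT_NO = ?
    EXEC SQL PREPARE STMT1 FROM :stmt_text;
    EXEC SQL EXECUTE STMT1 USING :dept_no;

Because the statement is not known until run time, the access path is chosen at run time too, which is exactly why dynamic SQL raises performance and monitoring questions that static SQL does not.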

Posted March 15, 2009

The optimizer is the heart and soul of a relational DBMS. It analyzes SQL statements and determines the most efficient access plan for satisfying each statement. The optimizer accomplishes this by parsing the SQL statement to determine which tables and columns must be accessed. It then queries system information and statistics stored in the system catalog and directory to determine the best method of accomplishing the tasks necessary to satisfy the request.
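
To see what the optimizer decided for a particular statement, you can externalize the access plan; in DB2 for z/OS, for example, something along these lines (the query and QUERYNO value are hypothetical) records the plan for inspection:

    -- Record the chosen access path instead of running the query
    EXPLAIN PLAN SET QUERYNO = 100 FOR
      SELECT CUST_NAME
        FROM CUSTOMER
       WHERE CUST_NO = 1234;

    -- Review the plan: which table, access type, index, and matching columns
    SELECT QUERYNO, TNAME, ACCESSTYPE, ACCESSNAME, MATCHCOLS
      FROM PLAN_TABLE
     WHERE QUERYNO = 100;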

Posted February 15, 2009

Staffing the DBA organization is not a simple matter. Several nontrivial considerations must be addressed, including the size of the DBA staff and the reporting structure for the DBAs.

Posted January 15, 2009

As per my regular custom, this final column of the year will take a look back at the most significant data and database-related events of the year.

Posted December 15, 2008

