At Data Summit Connect Fall 2020, data management expert Craig S. Mullins delivered a presentation titled, "The Changing Requirements of Database Administration." Mullins, who is president and principal consultant of Mullins Consulting, an IBM Gold Consultant, and a longtime DBTA columnist, looked at the industry trends and issues that are impacting DBAs and fundamentally transforming the duties associated with their jobs.
Videos of presentations from Data Summit Connect Fall 2020, a free series of data management and analytics webinars presented by DBTA and Big Data Quarterly, are available for viewing on the DBTA YouTube channel.
Mullins reviewed the current landscape for database management systems and DBAs and examined a host of trends—big data and data growth, the cloud, regulatory compliance, the prevalence and cost of data breaches, the use of AI and machine learning, database heterogeneity, the rise of DevOps and agile methodologies, and IoT—while touching on the impact of each on DBAs' responsibilities.
Some of the key issues affecting DBAs and changing their jobs are:
- Staffing: Despite what organizations are saying about seeking to become data-driven and valuing data, DBAs are a declining percentage of IT staff, putting more pressure on the people in those roles.
- Cloud: Cloud usage and adoption are increasing, but the future is hybrid—and preferably integrated—which means that DBAs are taking on additional responsibilities. According to 451 Research, 57% of survey respondents said they are moving to a hybrid IT environment that leverages both on-prem and off-prem cloud/hosted resources.
- Security: Data breaches remain a big risk, with the average data breach now costing $7 million, or $141 per record, according to Ponemon Research.
- Compliance: New privacy mandates also put more pressure on DBAs due to auditing and SIEM, encryption, data masking, the right to be forgotten, and metadata management.
- AI: AI works by combining large amounts of data with fast, iterative processing and intelligent algorithms, which means more data is needed to fuel AI and ML initiatives.
- DevOps: The methodology that combines development and operations teams to enable faster and more frequent updates runs counter to familiar DBA release practices, which tend to be slower and less frequent, forcing DBAs to work differently than they are accustomed to.
- IoT: There are now more sensors on the planet than people and these sensors produce data, much of which needs to be managed.
- Heterogeneity: In 2011, the top five database vendors (Oracle, Microsoft, IBM, SAP, and Teradata) controlled 91% of the revenue. By 2016, that percentage had declined to 86.9%. The database market is still dominated by SQL and relational systems but is becoming much more varied.
The bottom line, said Mullins, is that DBAs are being asked to do more—with larger amounts and more types of data being accessed more rapidly from more sources, with no prolonged downtime permitted—while using and supporting new database types and capabilities, and with fewer DBAs as a percentage of IT staff than ever before.
Mullins recommended that organizations:
- treat database management as a management discipline
- take a proactive versus reactive position
- automate what you can
- turn tasks over to the computer to free up DBA time
- use intelligent automation and autonomics
- integrate into DevOps practices
- embrace modern DBA tools and utilities that understand the new digital landscape, meaning tools that:
  - handle large amounts and types of data
  - support new functionality and technologies
  - integrate into DevOps pipelines
  - are always available and easy to use
Mullins was joined in the Data Summit Connect session by John Pocknell, senior market strategist, database solutions at Quest.
In Pocknell's presentation, titled "Tools for a Cross-Platform DBA," he explained how, in today's heterogeneous, hybrid database environment, having the right monitoring and management tools can provide valuable insight. Pocknell showcased tools that can help DBAs save time and be more efficient across increasingly complex environments that span cloud and on-prem, relational and NoSQL.
Among the key problems that are complicating database management, said Pocknell, are:
- Increased data diversity, which makes it harder for businesses to maintain operational control and extract value in the form of business insights.
- Cloud usage, which is causing complexity. According to the 2020 Flexera State of the Cloud report, most organizations are over-budget by 23% with approximately 30% of cloud spend being wasted. The overspending is often because companies find it hard to correctly right-size their database cloud service and can't adequately manage database performance in the cloud.
- A lack of the right tools. According to the 2020 State of Database DevOps survey, 38% of CIOs said that the speed of delivery of database changes was the top driver for Database DevOps. Yet many database dev teams still don't use version control and lack the right tools to be able to automate development processes such as testing and QA to reduce production defects.
- Data security vulnerabilities, which make organizations at risk for attack. According to the 2020 Insider Threat report, 90% of companies feel vulnerable to insider attack, with 53% confirming insider attacks in the last 12 months. This is often due to the sheer number of databases and quantity of data in production and non-production environments, making protection of personal data and auditing a challenge.
Pocknell described how Quest customers in industries spanning healthcare, financial services, and clinical research have used combinations of Quest tools, such as Foglight for Databases, ApexSQL, Benchmark Factory, Toad DevOps Toolkit, and Toad for Oracle, to address data management challenges. Featured use cases included the need to increase operational oversight and maintain SLAs while moving to a mixed database environment, increasing operational efficiency in order to deliver changes and business value faster, and ensuring compliance with GDPR.