Being able to keep a database running at peak performance is important for obvious reasons, but it may be surprising how difficult a task that can be. In a recent webinar Joseph McKendrick, lead analyst with Unisphere Research, and Robert Wijnbelt, senior product manager with Dell Software, discussed how to identify, prioritize and resolve database performance problems to ensure peak performance in an ever-changing data landscape.
One reason maintaining high database performance has become more difficult is the complexity of newer systems. While these systems offer greater capability, the trade-off is that they are far more complex than the databases of the past.
Also, while databases have improved, data environments have changed even more, with cloud, hybrid cloud, and other platforms to consider. In a survey of 300 data managers and professionals, a number of challenges were cited, with the top two being rapid diagnosis of database performance problems, cited by 52% of respondents, and keeping databases at new patch levels, mentioned by 51% of respondents, stated McKendrick.
Over the next 3 years, the top two challenges facing data managers will be data growth and data security. This is not surprising, considering the rapid advancements in technology driving data growth, and the soaring value of data, which makes keeping it secure that much more difficult. With the advances in automation, the role of the DBA is not fading away but is changing and becoming more important than ever, he added.
Wijnbelt noted that while maintaining peak performance is difficult, it still comes down to the basics: the database and the DBA. He outlined the “Pillars of Efficient Monitoring”: centralized architecture, remote collection, adaptive baseline alerting, service level monitoring, a consistent cross-platform interface, and integrated analytics.
“If you want to do this correctly, these are the most important items we think that you need to have in a system to address database performance,” explained Wijnbelt. “The strategy for database managing that we take with our Foglight product is on the one hand we have detailed database resource monitoring which looks at different performance metrics and puts it in an analytics engine. The second prong is transactional workload analytics. If you want to really look into your database and see why certain problems are arising, this transaction workload system will tell you exactly how users are accessing the system.”
To watch a replay of this webinar, go here.