Database Activity Monitoring Can be Accomplished Without Performance Overhead


Historically, database auditing and database performance have been like oil and water: they don’t mix. As a result, auditing is often eliminated because the performance drag it puts on critical systems is unacceptable.

In the past, native database auditing was the only option for monitoring database activity. While able to do the job, it added a tremendous amount of overhead to servers - resulting in a negative effect on configuration, management and separation-of-duties activities and, more critically, on database performance. High levels of native auditing consume the same disk I/O as the production databases they’re designed to audit. This, coupled with the possibility of downtime due to data mismanagement, leads organizations to give up on auditing - and run mission-critical systems with virtually no visibility into the actions of authorized users.

How does an enterprise integrate a high level of auditing to meet regulatory requirements such as Sarbanes-Oxley (SOX), the Gramm-Leach-Bliley Act (GLBA) and the Payment Card Industry Data Security Standard (PCI-DSS), protect sensitive information and still maintain the high-performance databases that serve the business needs of end users?

Fortunately, auditing activity no longer requires native tools. Database activity monitoring (DAM) tools are now being used to audit and report on data activity. These solutions can accomplish detailed auditing of databases very effectively without slowing the performance of the databases or other systems. DAM technologies enable multiple methods of activity monitoring (not always within the same solution), including network protocol monitoring, agent-based auditing and agent-less auditing. They are used to audit and report on activity for regulatory requirements, such as privileged-user activity for SOX (while maintaining separation of duties) or access to cardholder data for PCI. They’re also capable of detecting anomalous activity with data - something nearly impossible to find with native tools.

The least intrusive approach to DAM is passive network monitoring via a network appliance. Several solutions use this approach, and software-based options are also available. Passive network monitoring examines a database protocol - such as Oracle or SQL Server - usually via a switched port analyzer (SPAN) or mirror port, a capability found in most enterprise-level network infrastructure equipment. This allows the network to send the monitoring device copies of the SQL traffic flowing to and from the database server. Network and security engineers typically use sniffers or intrusion detection systems (IDS) in the same way. It lets a device other than the database server see the SQL traffic without service interruption or additional latency. Inline capabilities also exist, via a network tap or by placing the network appliance between the network infrastructure and the database server. This approach can be effective in certain circumstances, but latency, scalability and throughput issues must be considered.
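
As a rough illustration of passive capture on a SPAN or mirror port, the sketch below uses Python with the scapy library to watch mirrored traffic bound for a database listener. Everything specific here is an assumption for the sake of the example: the interface name ("eth1"), the port (1433, SQL Server’s default) and the print-out standing in for real protocol decoding, which in an actual DAM appliance would parse the full TDS or TNS wire format.

```python
# Minimal sketch of passive capture on a SPAN/mirror port.
# Assumptions: scapy is installed, "eth1" receives mirrored traffic,
# and the database listens on TCP 1433 (SQL Server). A real DAM
# appliance would fully decode the wire protocol rather than print bytes.
from scapy.all import sniff, IP, TCP, Raw

DB_PORT = 1433  # assumed database listener port

def handle(pkt):
    if pkt.haslayer(IP) and pkt.haslayer(TCP) and pkt.haslayer(Raw):
        payload = bytes(pkt[Raw].load)
        direction = "to-db" if pkt[TCP].dport == DB_PORT else "from-db"
        # Placeholder for protocol decoding: a real monitor recovers the
        # SQL text and response metadata from the payload here.
        print(pkt[IP].src, "->", pkt[IP].dst, direction, len(payload), "bytes")

# store=False keeps memory flat; the capture never touches the DB server.
sniff(iface="eth1", filter=f"tcp port {DB_PORT}", prn=handle, store=False)
```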

Once the protocol is decoded by a DAM solution, the activity is examined and stored within a database on the appliance. In the newest generation of DAM, this happens very quickly. DAM also allows you to be selective about what you’re auditing. If insert/update/delete activity against a certain table from the typical client-side application isn’t relevant, you don’t need to capture it. But what if that access comes from an unapproved application?
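
A selective audit policy like this amounts to a set of rules evaluated against every decoded statement. The fragment below is a hypothetical illustration - the table name, application name and rule structure are invented - of how a policy engine might decide whether an event is worth storing.

```python
# Hypothetical policy check: capture DML on a sensitive table only when it
# does NOT come from the approved client-side application.
APPROVED_APPS = {"OrderEntry"}          # assumed approved application name
SENSITIVE_TABLES = {"CARDHOLDER_DATA"}  # assumed table of interest

def should_audit(event: dict) -> bool:
    """event carries fields decoded from the wire: table, operation, app."""
    if event["table"] not in SENSITIVE_TABLES:
        return False
    if event["operation"] in {"INSERT", "UPDATE", "DELETE"}:
        # Routine DML from the approved app is noise; anything else is logged.
        return event["app"] not in APPROVED_APPS
    return True  # reads and DDL against sensitive tables are always kept

print(should_audit({"table": "CARDHOLDER_DATA", "operation": "UPDATE", "app": "Toad"}))  # True
```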

DAM policy engines vary, but targeted auditing should be considered a minimum requirement - providing audit records specific to the data of interest (without using the native audit capabilities of the target platform), such as: host and server username; client and server IP addresses; server TCP port; server-side application (i.e., Oracle, DB2, Sybase); client-side application used (i.e., Toad, SQL*Plus, Clarify); complete SQL command; success/failure of the command executed; and data about the response, such as size in rows and bytes.
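
Put concretely, each audited event should carry roughly the fields listed above. The structure below is one possible shape for such a record; the field names are illustrative and not taken from any particular product.

```python
# Illustrative audit record holding the minimum fields named above.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AuditRecord:
    captured_at: datetime
    host_username: str        # OS-level user on the client host
    server_username: str      # database login used for the session
    client_ip: str
    server_ip: str
    server_port: int          # e.g. 1433, 1521
    server_application: str   # e.g. "Oracle", "DB2", "Sybase"
    client_application: str   # e.g. "Toad", "SQL*Plus", "Clarify"
    sql_text: str             # complete SQL command as captured
    succeeded: bool           # success/failure of the executed command
    response_rows: int
    response_bytes: int
```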

Network-based monitoring can address more than 90 percent of the activity monitoring required in a typical client-server deployment. There are some exceptions, such as standalone deployments (where the application and database server run on the same hardware), direct console access and encrypted sessions.

Once network auditing is deployed, you will need to select a local auditing option. Local auditing is needed to meet compliance requirements and better protect data by monitoring the most privileged users - like DBAs. There are two main approaches to local auditing: agent-based and agent-less. They each have their advantages.

Agent-based local auditing comes in two flavors. The first is an all-purpose agent, which reviews both local and network-based activity. These agents use a mix of monitoring approaches, such as sniffing on the local Ethernet interface and parsing the native audit logs (such as the redo or transaction log). While effective, they require a repository to store audit data - typically a relational database just like any other database in your environment. However, because the agents gather data from the transaction log and the database itself, they can add value in two ways: they can expose sessions encrypted with a database application’s native encryption capabilities after those sessions have been decrypted, and they allow review of the previous values within a field.
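
As a simplified illustration of the previous-values capability, an agent that reads change records out of the transaction log can pair the old and new column images into a single audit event. The change-record format in the sketch below is invented for the example; real agents decode the platform’s own redo or transaction log structures.

```python
# Simplified sketch: pair before/after column images from a change record
# (the record format here is invented for illustration).
def diff_change(change: dict) -> dict:
    before, after = change["before"], change["after"]
    return {
        col: {"old": before.get(col), "new": after.get(col)}
        for col in after
        if before.get(col) != after.get(col)
    }

event = {"before": {"credit_limit": 5000, "status": "A"},
         "after":  {"credit_limit": 50000, "status": "A"}}
print(diff_change(event))  # {'credit_limit': {'old': 5000, 'new': 50000}}
```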

Network monitoring does an excellent job of capturing client-server traffic destined for the database server from other hosts in the environment, but what about the ad hoc database access that takes place on the system console? The second type of agent-based deployment is a lighter weight agent that complements the network monitoring approach. The agent reflects console activity to the monitoring appliance so that it can be reviewed. The agent could also be leveraged to capture more than just console activity. For instance, if IPSec encryption is used between an application server and the database, a lightweight agent could be used to forward that activity to a monitoring solution after it has been decrypted. This closes the gap left by the network monitor by allowing the same monitoring host to review both host- and network-based activity.
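
Conceptually, such a lightweight agent simply tails whatever local activity it can see and relays it to the appliance. In the sketch below, the log path, appliance address and plain-TCP framing are all assumptions; a production agent would add TLS, buffering and reconnection handling.

```python
# Minimal sketch of a lightweight forwarding agent. Assumptions: local
# console/audit activity lands in /var/log/db_console.log, and the
# monitoring appliance accepts newline-delimited events on TCP 5514.
import socket
import time

APPLIANCE = ("dam-appliance.example.com", 5514)  # assumed appliance endpoint
LOG_PATH = "/var/log/db_console.log"             # assumed local activity log

with socket.create_connection(APPLIANCE) as sock, open(LOG_PATH, "r") as log:
    log.seek(0, 2)  # start at end of file; only forward new activity
    while True:
        line = log.readline()
        if line:
            sock.sendall(line.encode("utf-8"))
        else:
            time.sleep(1)  # wait for new console activity
```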

There are situations where agents are not the best approach - for example, when system administrators already have so many agents running that they don’t want to introduce another piece of software, and the risk that comes with it, into the system. In this scenario, agent-less auditing can provide value. There are many knobs and switches in database platforms’ native auditing capabilities that can be tuned to capture specific activity. For example, SQL Server Profiler can be configured to capture only local activity. Native auditing could also be expanded to capture a greater amount of activity, depending on the auditing requirements of the environment. However, this needs to be done with great care: too much native auditing can create performance problems. Very selective native auditing can be used to capture privileged activity, console activity and other actions that may not be visible via network monitoring.
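
In an agent-less deployment, the monitoring host pulls records from the platform’s narrowly scoped native audit trail on a schedule, using a read-only account. The sketch below is deliberately generic: it assumes a DB-API style connection from whatever driver the platform uses, an audit view named AUDIT_TRAIL purely for illustration, and Oracle-style named bind variables (other drivers use ? or %s placeholders).

```python
# Agent-less collection sketch: poll a (selectively configured) native audit
# trail with a read-only account. The view name AUDIT_TRAIL and its columns
# are placeholders; substitute the platform's real audit view and driver.
def poll_audit_trail(conn, last_seen):
    """Fetch audit rows newer than last_seen using a DB-API 2.0 connection."""
    cur = conn.cursor()
    cur.execute(
        "SELECT event_time, db_user, os_user, action, object_name "
        "FROM AUDIT_TRAIL WHERE event_time > :since ORDER BY event_time",
        {"since": last_seen},
    )
    rows = cur.fetchall()
    if rows:
        last_seen = rows[-1][0]  # advance the high-water mark
    return rows, last_seen
```

Run on a schedule, this keeps the only footprint on the database server down to the native audit settings themselves plus a periodic read-only query.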

When looking at monitoring for compliance and data protection, it’s important to remember that the technology landscape has changed dramatically. We no longer have to live with performance and management overhead to gather the data required by a multitude of entities. We can leverage tools outside of the database, such as network, agent- and agent-less monitoring, to capture the data without impacting users.

In any DAM deployment, we want to maximize flexibility. What may work well for one enterprise may not work well for another. So finding a solution that can utilize any combination of network monitoring, agent-based or agent-less auditing helps maximize the value that can be extracted. DAM solutions help the business protect its core data assets and be more limber at the same time. It’s no longer a choice between database performance and auditing. We can finally have our cake and eat it too.

