<< back Page 2 of 3 next >>

To Ensure Data Security, Simplify and Analyze


If standards are set for different environments based on data classification and roles, they should be consistent across the databases so that, as data moves from one area to another, the authorizations can stay in place. While that may sound easy enough, transitioning to this state can take quite a bit of effort. A current assessment of access and an understanding of the data movement and workflows are places to start. Partnering with other teams, such as the big data and data quality teams, on data classification and workflows will be essential.
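One way to picture a standard that stays consistent as data moves between environments is a single authorization table keyed by data classification and role, checked identically everywhere. This is only a minimal sketch; the classifications, roles, and environment names below are illustrative assumptions, not anything prescribed by the article.

```python
# Illustrative sketch: define the authorization standard once, keyed by
# (data classification, role), and apply the same table in every
# environment so permissions survive data movement.

PERMISSIONS = {
    # (classification, role) -> allowed actions (hypothetical values)
    ("public", "analyst"): {"read"},
    ("confidential", "analyst"): set(),           # no access
    ("confidential", "dba"): {"read", "write"},
    ("restricted", "dba"): {"read"},
}

def is_allowed(classification: str, role: str, action: str) -> bool:
    """Check the one standard table, regardless of environment."""
    return action in PERMISSIONS.get((classification, role), set())

# The same check applies whether the data sits in staging or production:
for env in ("staging", "production"):
    assert is_allowed("confidential", "dba", "read")          # allowed everywhere
    assert not is_allowed("confidential", "analyst", "read")  # denied everywhere
```

Because the table, not the environment, is the source of truth, an authorization decision made in staging carries over unchanged when the data lands in production.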

There will always be exceptions that will prevent simplification, but reducing risk and looking at how to protect the data more easily should be the focus. In looking to protect data and databases, there are three key areas to review and manage:

  1. Authentication and authorization. This covers who can access the data and which actions they are permitted to perform. Authenticating to a server may also expose unprotected files, or grant access to data as it moves from a staging area to production or through stages of a batch process. Permissions and data access should be granted only to authorized users at every stage. Even as big data processes run, which servers and files can be accessed, and how, must be restricted to protect the data flows.
  2. Encryption. Encryption protects data at rest and in transit. To users without the keys to decrypt it, the information is useless. Tools are helpful here for key management and for providing consistent ways to encrypt files and data.
  3. Monitoring and auditing. Not all processes can be automated to enforce the policies and standards, so monitoring and auditing of activity against baselines are required. It is necessary to verify that unauthorized users are not gaining access and that the proper controls are in place for changes and activity. Analytics can assist here in developing preventive controls based on that activity.
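The monitoring-and-auditing point above can be sketched as a simple comparison of observed activity against a baseline. This is a deliberately minimal illustration; the user names, action counts, and tolerance threshold are all assumed values, and real monitoring tools are far richer.

```python
# Illustrative sketch: flag database activity that deviates from an
# established baseline of typical per-user action counts.
from collections import Counter

# Baseline: typical daily action counts per user (assumed numbers).
BASELINE = {
    "app_svc": Counter(SELECT=1000, UPDATE=50),
    "admin":   Counter(SELECT=20, GRANT=2),
}

def anomalies(user: str, observed: Counter, tolerance: float = 2.0) -> list:
    """Return actions whose observed count exceeds tolerance x baseline.

    Actions absent from the baseline are flagged on first occurrence.
    """
    expected = BASELINE.get(user, Counter())
    flagged = []
    for action, count in observed.items():
        if count > tolerance * expected.get(action, 0):
            flagged.append(action)
    return flagged

# An admin issuing a DROP, which has no baseline, gets flagged;
# 25 SELECTs stay within tolerance of the expected 20.
print(anomalies("admin", Counter(SELECT=25, DROP=1)))  # -> ['DROP']
```

The same comparison could feed a preventive control, for example blocking or alerting on the flagged actions rather than merely reporting them.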

The point is not to oversimplify what needs to get done to secure the environment but to focus on the main areas that will reduce the larger risks. It will be a continuous challenge to review and verify that the right information is being captured. Each area of data protection comes with its own challenges: dealing with legacy systems for which security was never planned, and finding ways to avoid slowing down processing and data movement, which matters as much for real-time business usage as keeping the data secure.

What’s Ahead

Monitoring database activity yields details for analytics, even though one difficulty may be the inability to capture all of the needed data. Questions can be developed to determine what data should be collected. As more data is collected, the questions may change, and additional information can be gathered or integrated with other systems to start to really understand the security posture of the environment.

Audit data and activity information can come from native auditing, activity monitoring tools, and other database and system logs. This data is usually filtered by auditing only the activity that is a priority or by looking at highly privileged actions. The filtering, too, must be reviewed so that the right information is being captured; there is a fine line between sending all of the information and adhering to policies that govern which activity is monitored. Whether the filtering is done at the gathering stage or the analytic stage, documentation and applied policies are extremely important for deeper review of abnormal behavior. For example, system logs might show someone logging in and elevating privileges, while the database audit logs capture only specific actions of the system administrator. Because the audit policy was based on specific commands, the database logs would not pass along the details needed to correlate the activities performed. If, instead, the privileged user were audited regardless of the commands, the database activity could be correlated with the system-level login that elevated the access.
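The correlation described above can be illustrated with a small sketch that joins system-level privilege-elevation events to database audit records by user within a time window. The log records, field names, and window size are hypothetical; real log formats and correlation engines differ.

```python
# Hypothetical sketch: pair each privilege-elevation event from the
# system log with database audit commands by the same user that occur
# within a time window, so the elevated session can be reviewed whole.
from datetime import datetime, timedelta

SYSTEM_LOG = [
    {"user": "jdoe", "event": "privilege_elevated",
     "time": datetime(2024, 1, 5, 9, 0)},
]
DB_AUDIT = [
    {"user": "jdoe", "command": "ALTER TABLE payroll ADD COLUMN bonus",
     "time": datetime(2024, 1, 5, 9, 3)},
    {"user": "jdoe", "command": "SELECT count(*) FROM payroll",
     "time": datetime(2024, 1, 5, 14, 0)},
]

def correlate(window: timedelta = timedelta(hours=1)) -> list:
    """Pair each elevation with same-user DB commands inside the window."""
    pairs = []
    for sys_ev in SYSTEM_LOG:
        if sys_ev["event"] != "privilege_elevated":
            continue
        for db_ev in DB_AUDIT:
            if (db_ev["user"] == sys_ev["user"]
                    and timedelta(0) <= db_ev["time"] - sys_ev["time"] <= window):
                pairs.append((sys_ev["user"], db_ev["command"]))
    return pairs

# Only the ALTER at 9:03 falls inside the one-hour window after the
# 9:00 elevation; the 14:00 SELECT does not.
print(correlate())
```

Note that this only works if the database side audits the privileged user's commands unconditionally; a command-specific audit policy would leave nothing to join against, which is exactly the gap the example in the text describes.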
