Thursday, 20 December 2012

Data Encryption and Protection - Part 2

In my last post I ran through the basics of SQL Server's database encryption options: Transparent Data Encryption (TDE) and cell-level encryption. In this post I'm going to discuss some of the considerations required before implementing encryption.

There are four main points to consider:

  1. What do I need to accomplish?
  2. Cost of implementation
  3. Cost of rollback/removal
  4. Performance impact

My last post covered the facts that inform the first question, and I feel that, in an ideal world, a combination of both encryption types is best. This post will deal with questions 2 and 3 for both Transparent Data Encryption (TDE) and cell-level encryption, as well as question 4, performance.
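To give a sense of the implementation cost, turning TDE on takes only a few statements. This is a minimal sketch, not a production script; the database name MyDatabase, certificate name TDECert, and password are placeholders:

```sql
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongP@ssw0rd!';
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate';
GO
USE MyDatabase;
CREATE DATABASE ENCRYPTION KEY
    WITH ALGORITHM = AES_256
    ENCRYPTION BY SERVER CERTIFICATE TDECert;
ALTER DATABASE MyDatabase SET ENCRYPTION ON;
GO
-- monitor progress: encryption_state 3 means fully encrypted
SELECT DB_NAME(database_id) AS db, encryption_state
FROM sys.dm_database_encryption_keys;
```

Backing up the certificate and its private key immediately is essential; without them the encrypted database cannot be restored on another server.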

Wednesday, 12 December 2012

Monitoring out of hours DB access

Recently a client asked me to monitor staff usage of a desktop application out of hours and outside the office. The specified criteria for tracking database access were:
  • Between 17:00 and 09:00 hours on weekdays
  • Weekends
  • Computers from outside company offices
The client uses SQL Server 2005, so the two solutions that immediately came to mind were:
  • Run a job with a query interrogating the sys.dm_exec_sessions DMV, checking for sessions within the specified criteria
  • Create a logon trigger that fires at each logon event and populates a table with the required data when a logon event occurs within the specified criteria
In this case I felt that creating a logon trigger would be the more efficient option, as it would fire only when a user logged on and would execute the query only during the specified times, whereas a job would need to be scheduled and would run even when not needed, i.e. when no one is accessing the database within the specified times.
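For comparison, the job-based alternative would be a query along these lines, scheduled to run periodically (the login names and IP prefix are placeholders):

```sql
SELECT s.login_time, s.[host_name], s.[program_name], c.client_net_address
FROM sys.dm_exec_sessions AS s
JOIN sys.dm_exec_connections AS c
    ON s.session_id = c.session_id
WHERE s.original_login_name IN ('AppLogin1', 'AppLogin2')
  AND (DATEPART(HOUR, s.login_time) >= 17
       OR DATEPART(HOUR, s.login_time) < 9
       OR DATEPART(WEEKDAY, s.login_time) IN (1, 7)) -- weekends
  AND c.client_net_address NOT LIKE '10.1.%';
```

Note that this only catches sessions still connected when the job runs, which is another reason the trigger approach appealed to me.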

A while back I posted about Logon Triggers when I first learned about them. I gave the example found in BOL about limiting the number of connections a user can have at any one time. Since then I haven't had an opportunity to put such triggers into practice. So it was, sadly, quite exciting to finally jump into it.

First I created a table in my AdminDB: 
USE [AdminDB]
GO
CREATE TABLE dbo.tblLogonTracking
(
    tblLogonTrackingID int IDENTITY(1,1) NOT NULL PRIMARY KEY CLUSTERED,
    login_time datetime NOT NULL,
    [host_name] varchar(50) NOT NULL,
    [program_name] varchar(100) NOT NULL,
    [EventData] varchar(15) NULL
)
Then the trigger itself:
USE [master]
GO
CREATE TRIGGER trLogonTracking
ON ALL SERVER
FOR LOGON
AS
-- out of hours: 17:00-09:00 on weekdays, plus weekends
IF (DATEPART(HOUR, GETDATE()) >= 17 OR DATEPART(HOUR, GETDATE()) < 9
    OR DATEPART(WEEKDAY, GETDATE()) IN (1, 7))
    INSERT INTO AdminDB.dbo.tblLogonTracking (login_time, [program_name], [host_name], [EventData])
    SELECT GETDATE(), APP_NAME(), HOST_NAME(),
           EVENTDATA().value('(/EVENT_INSTANCE/ClientHost)[1]', 'NVARCHAR(15)')
    WHERE ORIGINAL_LOGIN() IN ('AppLogin1', 'AppLogin2')
    AND (APP_NAME() LIKE '%app1_name%' OR APP_NAME() LIKE '%app2_name%')
    AND NOT EXISTS (SELECT 1 FROM AdminDB.dbo.tblLogonTracking AS a
                    WHERE DATEDIFF(mi, a.login_time, GETDATE()) < 60
                    AND APP_NAME() = a.[program_name]
                    AND HOST_NAME() = a.[host_name])
    AND EVENTDATA().value('(/EVENT_INSTANCE/ClientHost)[1]', 'NVARCHAR(15)') NOT LIKE '10.1.%'
I've created the trigger 'ON ALL SERVER' since I need to use it for several databases/applications. But it could be created for a single database by running it on the required database and using 'ON DATABASE'. Due to the nature of the applications I've also included a NOT EXISTS clause so I don't record the same user logging in and out successively. The goal here is to know that someone is accessing the database out of hours, not necessarily how many times they are accessing the system. But that should be relatively easy to modify based on requirements.
I also needed to access the IP address of the incoming connection - to filter out the company IP range. So I'm using the EVENTDATA() function. Event data is held as XML, so querying it requires the XQuery value() method. For more info on EVENTDATA() see MSDN.

With the above there is now a table to be accessed for usage/auditing reports. It would probably be a good idea to archive and/or clean up the LogonTracking table, just to keep things tidy and efficient.
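A simple cleanup step, run from a weekly job for example, might look like this (the six-month retention period is just an assumption; adjust to your auditing requirements):

```sql
-- purge tracking rows older than the retention period
DELETE FROM AdminDB.dbo.tblLogonTracking
WHERE login_time < DATEADD(MONTH, -6, GETDATE());
```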

It would be nice to hear what other people have used logon triggers for as well.

Monday, 26 November 2012

Data Encryption and Protection - Part 1

A client has recently revisited their need to protect sensitive data in their application and the database it connects to. As a DBA it is easy to focus solely on the database, but as I considered their infrastructure and all the parts of their system it became clear that the database was only a small piece of the puzzle. In terms of protecting data for an entire IT solution there are quite a few other areas to consider. In this and a few subsequent posts I will discuss the options for securing data and touch on their major advantages and disadvantages. Obviously, as I'm not a network or systems admin, I will only touch on the concepts beyond the scope of securing the database itself.
First I will cover what I know best: SQL Server. There are two main encryption methods serving different purposes:
  1. Transparent Data Encryption (TDE) - available from SQL Server 2008 (Enterprise edition)
  2. Cell-level encryption - available from SQL Server 2005
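As a quick taste of the second option: cell-level encryption works through a key hierarchy and explicit function calls in your queries. A minimal sketch, assuming a database master key already exists and using placeholder names (CellCert, CellKey, dbo.Customers):

```sql
CREATE CERTIFICATE CellCert WITH SUBJECT = 'Cell-level encryption';
CREATE SYMMETRIC KEY CellKey
    WITH ALGORITHM = AES_256
    ENCRYPTION BY CERTIFICATE CellCert;
GO
OPEN SYMMETRIC KEY CellKey DECRYPTION BY CERTIFICATE CellCert;

-- encrypt into a varbinary column
UPDATE dbo.Customers
SET CardNumberEnc = EncryptByKey(Key_GUID('CellKey'), CardNumber);

-- decrypt on the way out
SELECT CONVERT(varchar(25), DecryptByKey(CardNumberEnc)) AS CardNumber
FROM dbo.Customers;

CLOSE SYMMETRIC KEY CellKey;
```

Unlike TDE, this requires schema and query changes, which is a big part of the cost comparison I'll get into in the next post.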

Sunday, 11 November 2012

Implementing Change Data Capture Part 3 – Creating an SSIS Package

In the last post I covered creating a wrapper function for use within an SSIS package and staging tables in the data warehouse. This post will deal with creating the SSIS package(s) itself.

As I mentioned in my first post about Change Data Capture (CDC), SQL Server Books Online has a very good step-by-step how-to on setting up CDC and the related SSIS package to perform incremental loads to a data warehouse. My series of posts is meant to make that setup and implementation a little easier by somewhat automating the script and object generation process.

Sunday, 4 November 2012

Implementing Change Data Capture Part 2 – Preparing for SSIS

In the last post I discussed setting up Change Data Capture (CDC) on a database. This post deals with the first steps in setting up an incremental extract process. This would be, loosely speaking, the ‘E’ of a larger ETL (Extract, Transform, Load) process.

I feel that this is where CDC really shines. CDC is itself a transaction-log-based process, like log shipping and mirroring, and, set up as in the last post with the CDC tables on a separate filegroup (even a separate disk array), it is a very efficient method of extracting data to a data warehouse.

Microsoft BOL’s step-by-step guide calls this an ‘incremental load’. But I see this really as the extract part of the process. This and the next post deal with getting the data out of an OLTP database and preparing it for the transform and load parts of an ETL. As you will see there is no transform occurring and it is just incidental that the data is being loaded into the data warehouse.
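The extract itself boils down to querying a CDC table-valued function over an LSN range. A sketch, with dbo_Customers as a placeholder capture instance name:

```sql
DECLARE @from_lsn binary(10), @to_lsn binary(10);

SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_Customers');
SET @to_lsn   = sys.fn_cdc_get_max_lsn();

-- all changes (inserts, updates, deletes) in the LSN window
SELECT *
FROM cdc.fn_cdc_get_all_changes_dbo_Customers(@from_lsn, @to_lsn, N'all');
```

In a real incremental extract the starting LSN would come from wherever the previous run's end point was persisted, not from the minimum, which is exactly what the wrapper function from the last post handles.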

Monday, 22 October 2012

Implementing Change Data Capture Part 1 - Configure Database

I've finally gotten round to investigating Change Data Capture or CDC in SQL Server 2008. For ages it's seemed like this big unknown, but while studying for the SQL Server 2008 Developer MCITP certification I decided it would be a good time to dive in.

Surprisingly, Books Online (BOL), which I've always struggled using, has not only a good explanation of CDC but a great tutorial on setting it up. Plus the kind Microsoft people even take you through the implementation of an SSIS based incremental load. This was a great find, as one of the ways in which CDC really shines is when used as part of an ETL solution. There are other very useful applications, such as auditing data changes (not to be confused with SQL Server 2008's Audit feature), but I'll focus on the ETL application here.

For more in-depth information about what Change Data Capture is and how it works see the BOL entry 'Basics of Change Data Capture'.

What I'd like to do over the next few posts is share a few scripts I've come up with to streamline the implementation of CDC. While the basics are very easy to implement, it's the SSIS package(s) that could get rather tedious to build without some generic scripts.
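To show how easy the basics are, enabling CDC comes down to two system stored procedures. In this sketch MyOLTPDB, dbo.Customers and CDC_FG are placeholder names:

```sql
USE MyOLTPDB;
EXEC sys.sp_cdc_enable_db;

EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name = N'Customers',
    @role_name = NULL,
    @filegroup_name = N'CDC_FG',   -- keep change tables off the main filegroup
    @supports_net_changes = 1;     -- requires a primary key or unique index

-- verify
SELECT name, is_cdc_enabled
FROM sys.databases
WHERE name = 'MyOLTPDB';
```

Enabling CDC at the database level creates the cdc schema and system tables; enabling it per table creates the capture instance and its change table.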

Monday, 30 July 2012

Orphaned users

I'm sure many people have experienced the need to "reconnect" orphaned users after restoring a database from a different server. Not a difficult remedy, just run exec sp_change_users_login 'auto_fix','user_name' and you're good. If you run sp_change_users_login 'Report' first you can find all orphaned users for the given database.

Today I bumped into a situation where I ran the SP and received the error "An invalid parameter or option was specified for 'sys.sp_change_users_login'". I was perplexed, as I've run the SP many times before with no problems. I was starting to tear my hair out, having searched the web with no joy, until, and in hindsight this is silly, I looked to see if the login existed on the server. And guess what, it didn't.

So, as is often the case, the Microsoft error message is, if not wrong, then at least misleading! I would have expected to see an error like 'User doesn't exist' or similar. Alas, in future, when this error appears, check whether the login exists.
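A quick existence check saves the head-scratching; 'user_name' here is a placeholder:

```sql
-- does a matching server login actually exist?
SELECT name, type_desc
FROM sys.server_principals
WHERE name = 'user_name';

-- if it does, the usual remedy applies
EXEC sp_change_users_login 'auto_fix', 'user_name';
```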

See my other post with a query to resolve any and all orphaned users on a database.