Question

Auditing data changes in SQL Server 2008

I am trying to find a highly efficient method of auditing changes to data in a table. Currently I am using a trigger that looks at the INSERTED and DELETED tables to see what rows have changed and inserts these changes into an Audit table.

The problem is that this is proving to be very inefficient (obviously!). With 3,000 rows inserted into the database at one time (which wouldn't be unusual), as many as 215,000 rows can end up being inserted into the Audit table to record those changes.

What is a reasonable way to audit all this data without the inserts into the database taking a long time? It needs to be fast!

Thanks.

Answer

A correctly written trigger should be fast enough.
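For example, a set-based trigger that writes the whole batch to the audit table in a single INSERT (rather than looping over rows) scales well even for multi-thousand-row inserts. The sketch below is illustrative only: dbo.Orders, its Id primary key, and dbo.OrdersAudit are hypothetical names to be adapted to your schema.

    -- Minimal sketch; table and column names are hypothetical.
    CREATE TRIGGER trg_Orders_Audit
    ON dbo.Orders
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;

        -- One set-based INSERT per firing; no cursors or row-by-row work.
        INSERT INTO dbo.OrdersAudit
            (OrderId, OldStatus, NewStatus, OldAmount, NewAmount, AuditAction, AuditedAt)
        SELECT
            COALESCE(i.Id, d.Id),
            d.Status, i.Status,
            d.Amount, i.Amount,
            CASE WHEN i.Id IS NOT NULL AND d.Id IS NOT NULL THEN 'U'
                 WHEN i.Id IS NOT NULL THEN 'I'
                 ELSE 'D' END,
            SYSDATETIME()
        FROM inserted AS i
        FULL OUTER JOIN deleted AS d ON d.Id = i.Id;
    END;

The FULL OUTER JOIN lets one trigger handle all three events: inserted rows have no match in deleted, deleted rows have no match in inserted, and updates match on both sides.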

You could also look at Change Data Capture.
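Change Data Capture (Enterprise edition in SQL Server 2008) reads the transaction log asynchronously, so the auditing work stays off the insert path entirely. A sketch of enabling it, again using the hypothetical dbo.Orders table:

    -- Sketch: enable CDC for the current database and one table (names are hypothetical).
    EXEC sys.sp_cdc_enable_db;
    GO

    EXEC sys.sp_cdc_enable_table
        @source_schema        = N'dbo',
        @source_name          = N'Orders',
        @role_name            = NULL,   -- no gating role for reading the change data
        @supports_net_changes = 1;      -- requires a primary key or a unique index
    GO

Captured before/after images then accumulate in the generated change table for that capture instance and can be queried with the generated cdc.fn_cdc_get_all_changes_* function.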

I quite often use AutoAudit:

AutoAudit is a SQL Server (2005, 2008, 2012) Code-Gen utility that creates Audit Trail Triggers with:

  • Created, CreatedBy, Modified, ModifiedBy, and RowVersion (incrementing INT) columns added to the table

  • Insert events logged to the Audit table

  • Updates: old and new values logged to the Audit table

  • Deletes: all final values logged to the Audit table

  • A view to reconstruct deleted rows

  • A UDF to reconstruct row history

  • A schema audit trigger to track schema changes

  • Re-generation of the triggers when ALTER TABLE changes the table

Update:

A major upgrade to version 3.20 was released in November 2013 with these added features:

  • Handles tables with up to 5 PK columns

  • Performance improvements: up to 90% faster than version 2.00

  • Improved historical data retrieval UDF

  • Handles column/table names that need quoting with QUOTENAME ([ ])

  • Archival process to keep the live Audit tables smaller and faster while retaining older data in archive AutoAudit tables