Automate your Database Maintenance using SMO – Part 3

Allen White shows how to use Visual Basic to create simple routines that use SMO for inserting and deleting jobs from SQL Server Agent and for creating or deleting stored procedures. He then shows how to use these routines to automatically check the integrity of all the appropriate databases in any server, and optimize them. This is the third article in the series. See also …
Automate your Database backups using SMO and
Alert-based Transaction Log Backups

Once you’ve implemented a well-thought-out backup plan, you’ll want to perform regular optimization processes and check the integrity of your databases. These steps will help to keep the performance of your database activity at its peak. The integrity checks use the DBCC CHECKDB command to ensure no problems exist in your databases; Books Online has an excellent discussion of the DBCC CHECKDB command. Optimizations are done in this process by running an UPDATE STATISTICS command against all database tables and an ALTER INDEX REORGANIZE command against all indexes. If there are other steps you normally perform during regular maintenance, feel free to include them in your implementation of this code.

As I mentioned in Part 1 of this series, SMO (Server Management Objects) is an object library that allows you to write programs to manage SQL Server. These programs use SMO both to examine the server on which they run (to determine which databases exist) and to create the alerts and SQL Server Agent jobs that perform the required maintenance.

Creating and running the VB Script

The first thing you’ll need to do is to select Add Reference from the Project menu. Within the Add Reference dialog, use the Ctrl key to select the following assemblies (you may have to browse to the C:\Program Files\Microsoft SQL Server\90\SDK\Assemblies folder):
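The original list of assemblies is not reproduced here; based on the references used in Parts 1 and 2 of this series, the SMO assemblies you would typically select are:

- Microsoft.SqlServer.ConnectionInfo
- Microsoft.SqlServer.Smo
- Microsoft.SqlServer.SmoEnum
- Microsoft.SqlServer.SqlEnum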

Next, in the code window at the top of the code, before any declarations, insert the following lines:
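The article's original listing isn't shown here; a typical set of Imports statements for an SMO program like this one (the same namespaces used in Parts 1 and 2) would be:

```vb
' Namespaces for the SMO object model, the Agent objects and the connection classes
Imports Microsoft.SqlServer.Management.Smo
Imports Microsoft.SqlServer.Management.Smo.Agent
Imports Microsoft.SqlServer.Management.Common
```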

You’ll also need to change the Sub Main() statement to:
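The standard VB.NET signature for receiving command-line arguments is:

```vb
Sub Main(ByVal args() As String)
```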

This change will allow you to process the command-line arguments, which is the first thing we need to do in each application.

Most of my servers use named instances so I can’t just tell the program to connect to “.”. Therefore, I pass in the server name as the first parameter.
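A sketch of the argument handling, assuming the -S (server) and -T (notification address) switches shown in the command line below; the variable names are my own, not the article's exact listing:

```vb
' Pull the server name and operator e-mail address from the command line
Dim srvname As String = ""
Dim toaddr As String = ""
For Each arg As String In args
    If arg.StartsWith("-S") Then srvname = arg.Substring(2)
    If arg.StartsWith("-T") Then toaddr = arg.Substring(2)
Next
If srvname.Length = 0 Then
    Console.WriteLine("usage: BuildWeeklyMaintJobs -S<server> -T<e-mail address>")
    Exit Sub
End If
```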

So, to rebuild the maintenance jobs you could issue the following command line (I run the jobs weekly, hence the name, but you can determine your own schedule):

  u:\DBMaint\BuildWeeklyMaintJobs -S"MyServer\Inst01" -Tmyname@myorg.com

The job will connect to server “MyServer\Inst01”, and notify myname@myorg.com of any failures.

The first thing we want to do is to connect to our server, set initial default fields to improve SMO performance, and determine the version of SQL Server we’re connecting to.
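A sketch of that connection logic, assuming the `srvname` variable from the argument handling above; restricting the default initial fields keeps SMO from retrieving every property of every database up front:

```vb
' Connect to the instance and note the version (8 = SQL 2000, 9 = SQL 2005)
Dim srv As New Server(srvname)
srv.SetDefaultInitFields(GetType(Database), "IsSystemObject")
Dim ver As Integer = srv.Information.Version.Major
```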

The next step is to drop the existing maintenance jobs.  I create separate jobs for system databases and user databases for optimizations, and one job for checking database integrity.  The three job names I use are ‘CheckDBIntegrity’, ‘SystemDBOptimizations’ and ‘UserDBOptimizations’.  These commands drop them so the rest of the program can build new copies.
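The drop logic can be sketched like this, assuming the `srv` Server object from the connection code:

```vb
' Drop each maintenance job if it already exists, so we can rebuild it
Dim jobnames() As String = {"CheckDBIntegrity", "SystemDBOptimizations", "UserDBOptimizations"}
For Each jobname As String In jobnames
    If srv.JobServer.Jobs.Contains(jobname) Then
        srv.JobServer.Jobs(jobname).Drop()
    End If
Next
```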

Now we’re ready to perform the subroutines that build the three jobs:
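The subroutine names below are my own guesses, not the article's exact listing; the downloadable source may name them differently:

```vb
' Build the three maintenance jobs in turn
BuildCheckIntegrityJob(srv, toaddr)
BuildSystemDBOptimizationsJob(srv, toaddr)
BuildUserDBOptimizationsJob(srv, toaddr)
```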

This is the end of the main subroutine, so we want to catch any errors that might have occurred, using the following code:
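A sketch of that error handling, assuming the body of Main is wrapped in a Try block; the messages are illustrative:

```vb
' Report SMO-specific errors separately from anything else
Catch smoex As SmoException
    Console.WriteLine("SMO Exception: " & smoex.Message)
Catch ex As Exception
    Console.WriteLine("Exception: " & ex.Message)
End Try
```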

We also need the BuildNotifyStep module we used in parts 1 and 2.  It can be included here or at the end of the module.

Creating a job to Check Database Integrity

The first job we’ll create is the one which checks the database integrity of all databases on our server, except tempdb.  (If you’re concerned about the integrity of tempdb, restart your server.  It’ll be refreshed from model and you’ll have a good tempdb again.)  We use the DBCC CHECKDB WITH NO_INFOMSGS command to verify the integrity of each database on the server.  First we’ll declare our subroutine and the variables it requires.

Next, we’ll get the database collection on the server, set the default initial fields for more efficient SMO operations, and create the Agent job to check the database integrity.

We have to step through each database object in the collection, and for each database (except tempdb) create a step to run the DBCC command against the database.  We’ll also set the starting step for the job.
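A sketch of that loop, assuming `jb` is the CheckDBIntegrity Job object created earlier in the subroutine; step names and variables are my own:

```vb
' One T-SQL step per database, skipping tempdb; each step falls through
' to the next on success
Dim stepcount As Integer = 0
For Each db As Database In srv.Databases
    If db.Name <> "tempdb" Then
        stepcount += 1
        Dim jbstp As New JobStep(jb, "Check " & db.Name & " Integrity")
        jbstp.SubSystem = AgentSubSystem.TransactSql
        jbstp.DatabaseName = "master"
        jbstp.Command = "DBCC CHECKDB (N'" & db.Name & "') WITH NO_INFOMSGS"
        jbstp.OnSuccessAction = StepCompletionAction.GoToNextStep
        jbstp.Create()
        If stepcount = 1 Then jb.StartStepID = jbstp.ID
    End If
Next
jb.Alter()   ' save the starting step
```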

We now build the step to notify us if the DBCC fails, and add a dummy step for the last database’s DBCC step to go to when it succeeds.

Then we create the schedule object and set the properties to occur once a week, on Sunday at midnight, starting today.
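The schedule can be sketched with the JobSchedule object; the schedule name is my own:

```vb
' Weekly schedule: every Sunday at midnight, starting today
Dim jbsch As New JobSchedule(jb, "WeeklyIntegrityCheck")
jbsch.FrequencyTypes = FrequencyTypes.Weekly
jbsch.FrequencyInterval = 1           ' bit 1 = Sunday
jbsch.FrequencyRecurrenceFactor = 1   ' every week
jbsch.ActiveStartDate = Today
jbsch.ActiveStartTimeOfDay = New TimeSpan(0, 0, 0)
jbsch.Create()
```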

The Catch block notifies us if any problems occur in the subroutine, using the same code as the Catch block in the main routine, and we’re done with the Check Integrity job.

System Database Optimizations

In each system database we want to shrink the database, leaving 10 percent free space; update the statistics for each table within the database; and update the statistics for each view, using the FULLSCAN and NORECOMPUTE options.  Your optimizations can include the tasks you find most useful.

Next, we’ll get the database collection on the server, set the default initial fields for more efficient SMO operations, and create the Agent job.

SQL Server 2000 provided the system stored procedure xp_sqlmaint, which performed these optimization steps for us (after we set the options in the Database Maintenance Plan Wizard).  The same actions are performed in SQL Server 2005 via the DBCC SHRINKDATABASE and UPDATE STATISTICS commands.

We need to assign the first job step to the job, build the failure notification step for each step in the job, and add a dummy step for the last step to go to when it succeeds.

Then we create the schedule object and set the properties to occur once a week, on Sunday at 3AM, starting today.

The Catch block notifies us if any problems occur in the subroutine, using the same code as the Catch block in the main routine, and we’re done with the System Database Optimizations job.

User Database Optimizations

I ran into a problem building the user database optimizations, because the command-length limit for a job step didn’t allow me to include commands to update the statistics for all of the tables in a database with many tables and views.  Because of this I built stored procedures in each of the databases with the necessary optimizations, and in the job I execute the stored procedure.  We’ll start with the stored procedure to reorg the indexes.

We need to set the initial fields so SMO performs well, get the database object for the selected database, and drop the existing stored procedure if it exists.

The stored procedure we’re creating in this routine is called db_index_reorg, because that’s what it does.

We’ll grab the collection of tables in the database and iterate through them, creating an ALTER INDEX command for each non-system table.
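That loop can be sketched as follows, building up the stored procedure body in a string; variable names are my own:

```vb
' db_index_reorg body: one ALTER INDEX ... REORGANIZE per user table
Dim txt As String = ""
For Each tbl As Table In db.Tables
    If Not tbl.IsSystemObject Then
        txt &= "ALTER INDEX ALL ON [" & tbl.Schema & "].[" & tbl.Name & "] REORGANIZE" & vbCrLf
    End If
Next
```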

Since views can have indexes as well we need to do the same thing for each view.

The body of the stored procedure is now complete, so we can assign it to the textbody property of the stored procedure object, and create the stored procedure.

The Catch block uses the same code as the Catch block in the main routine, and we’re done with this subroutine.

The subroutine to build the Update Statistics stored procedure is similar to the index reorg subroutine.

We need to set the initial fields so SMO performs well, get the database object for the selected database, and drop the existing stored procedure if it exists.

The stored procedure we’re creating in this routine is called db_update_stats.

We need to iterate through the table collection and build UPDATE STATISTICS statements for each non-system table in the database.
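A sketch of that loop, appending to the same body string as in the index-reorg routine (variable names are my own):

```vb
' db_update_stats body: one UPDATE STATISTICS per user table
For Each tbl As Table In db.Tables
    If Not tbl.IsSystemObject Then
        txt &= "UPDATE STATISTICS [" & tbl.Schema & "].[" & tbl.Name & "]" & vbCrLf
    End If
Next
```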

The same thing needs to be done for the views in the database.

The body of the stored procedure is now complete, so we can assign it to the textbody property of the stored procedure object, and create the stored procedure.

The Catch block uses the same code as the Catch block in the main routine, and we’re done with this subroutine as well.

Now that the stored procedures have been created (well, the subroutines that will create the stored procedures) we can build the last job.  This job will perform the optimizations for the user databases on the target server.

Next, we’ll get the database collection on the server, set the default initial fields for more efficient SMO operations, and create the Agent job.

In this job we need to create steps for each database, but we don’t want to create steps for system databases or for snapshot databases.  Snapshot databases only exist in SQL Server 2005, so we set up a Boolean variable to indicate that we want to create a step for the database, and initialize it to TRUE.  If the database is a system database, or if this server is a SQL Server 2005 server and the database is a snapshot database, we’ll set that flag to FALSE.  Then, if the flag is set to TRUE we’ll build the step in the job for the database.
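A sketch of that flag logic, assuming the `ver` variable captured when we connected; DatabaseSnapshotBaseName is only populated for snapshot databases:

```vb
' Skip system databases, and on SQL 2005 skip snapshot databases too
Dim buildstep As Boolean = True
If db.IsSystemObject Then buildstep = False
If ver = 9 Then
    If db.DatabaseSnapshotBaseName.Length > 0 Then buildstep = False
End If
If buildstep Then
    ' build the job step for this database
End If
```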

Now, we can build the optimization commands for the step, including a DBCC SHRINKDATABASE command as well as executing each of the two stored procedures we just created.  We then add that step, and if it’s the first step in the job, set the job to execute this step first.
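The command string for each step can be sketched like this; the stored procedure names match the ones built above, and the variable name is my own:

```vb
' Shrink to 10 percent free space, then run the two maintenance procedures
Dim stepcmd As String
stepcmd = "DBCC SHRINKDATABASE (N'" & db.Name & "', 10)" & vbCrLf
stepcmd &= "EXEC [" & db.Name & "].dbo.db_index_reorg" & vbCrLf
stepcmd &= "EXEC [" & db.Name & "].dbo.db_update_stats"
```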

We can now build the failure notification step for each step in the job, and add a dummy step for the last step to go to when it succeeds.

Then we create the schedule object and set the properties to occur once a week, on Sunday at 3AM, starting today.

The Catch block notifies us if any problems occur in the subroutine, using the same code as the Catch block in the main routine, and we’re done with the User Database Optimizations job.

There are many different tasks you can perform using this method, and you should adjust the code to perform the tasks you find most appropriate.  Keeping your databases clean will help you get the maximum performance out of them and will keep your users (and your boss) happy.