Hi...
I'm inserting and deleting about 30,000 records in two tables each day - importing them from a text file using DTS.
The users add about 1,000 records a day using Access and Windows .NET frontends...
Which T-SQL commands should I run frequently to keep the database up to speed?
I'm doing the following ... do you know of anything else?
BACKUP LOG MyDataBase WITH TRUNCATE_ONLY
DBCC SHRINKDATABASE (MyDataBase, 40)
GO
BACKUP LOG tempdb WITH TRUNCATE_ONLY
DBCC SHRINKDATABASE (tempdb, 70)
GO
Thanks!
Dave
There are a lot of different things you can do to keep your database running quickly, such as:
Set the database to the Simple recovery model and enable Auto Shrink; this performs the same function as your DBCC SHRINKDATABASE commands. There is some debate about whether Simple recovery and Auto Shrink hurt performance, but I have not seen any negative impact myself.
Partition your database across multiple physical drives; this can dramatically improve performance.
If you are using SAN disk, properly align the sector boundaries of your disks. Here is a rather large document on the subject, but it's a good one:
http://www.hertenberger.co.za/resources/diskpar.pdf#search='how%20to%20use%20diskpar'
Delete any data older than "x" days from your tables, which it sounds like you may already be doing.
Dedicate "x" amount of RAM to your instance.
Dedicate "x" number of CPUs to your instance.
The list goes on, but those are a few examples.
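To sketch the recovery model and Auto Shrink suggestions above as commands (MyDataBase stands in for your own database name, and MyImportTable and its ImportDate column in the purge example are hypothetical names):

ALTER DATABASE MyDataBase SET RECOVERY SIMPLE
GO
ALTER DATABASE MyDataBase SET AUTO_SHRINK ON
GO
-- purge rows older than 30 days; assumes a datetime column named ImportDate
DELETE FROM MyImportTable WHERE ImportDate < DATEADD(day, -30, GETDATE())
GO

Note that under the Simple recovery model the log is truncated on checkpoint, so the BACKUP LOG ... WITH TRUNCATE_ONLY step becomes unnecessary.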
|||Simple recovery model will not impact performance. However, autoshrink certainly will. If autoshrink kicks in while you are trying to process other queries, it will definitely slow everything down.|||Hi Dave,
What comes to mind immediately is that there will be a potentially large amount of fragmentation of the tables/indexes in your db. This is because: a) you are performing a large number of deletes and inserts often, and b) SHRINKDATABASE introduces logical fragmentation.
So, I would recommend you run DBCC SHOWCONTIG on your main tables, then rebuild as needed with either DBCC DBREINDEX or DBCC INDEXDEFRAG. As an aside, if there has been a large amount of fragmentation on several tables, make sure your statistics are up to date, and run sp_recompile on the tables in question so that any stored procs you have can make use of the new stats immediately.
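As a rough script of the above (MyTable, MyDataBase and IX_MyTable are placeholder names; substitute your own tables and indexes):

-- report fragmentation; scan density near 100% means little fragmentation
DBCC SHOWCONTIG ('MyTable')
-- rebuild all indexes on the table (locks the table while it runs)
DBCC DBREINDEX ('MyTable')
-- or defragment a single index with less blocking
DBCC INDEXDEFRAG (MyDataBase, 'MyTable', 'IX_MyTable')
-- have dependent stored procs pick up fresh statistics on their next run
EXEC sp_recompile 'MyTable'
GO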
Cheers
Rob
Hi, Lesego.
If you're going to be running queries against these tables that you're adding and removing data from, it's probably a good idea to update the statistics on them. UPDATE STATISTICS is the command to use, and you'll want to run it against every statistic the table has -- you can find those most easily by exploring in the Object Explorer in Management Studio, but you'll have one for each index on the table, plus any that you've created yourself, plus any that the server has created automatically.
UPDATE STATISTICS might not be too important if you're selecting data directly from the table, but it will be very important if you are using the table that's the target of your insert/delete batch job in any JOINs with other tables. The query optimizer makes many decisions about how best to execute a statement based on information it gathers from the statistics on the table.
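A few variations on the command, by way of illustration (MyTable and IX_MyTable are placeholder names):

-- refresh every statistic on the table, reading all rows
UPDATE STATISTICS MyTable WITH FULLSCAN
-- or refresh just the statistics behind one index
UPDATE STATISTICS MyTable IX_MyTable
-- or sweep the whole database, updating only stats with recent modifications
EXEC sp_updatestats
GO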
Hope that helps, and do let us know if you have more follow-up questions.
.B ekiM