Performing large numbers of schema updates and transaction log growth issues

by Craig Walker   Last Updated May 23, 2018 13:06 PM

We have an issue with transaction logs growing until they hit disk size limits.

Whenever we upgrade our software on a client's site, our upgrade installer runs a number of scripts that add columns to tables and move data around. Once those complete, it upgrades the web site that uses the database.

The typical database is of moderate size, around 1 GB, though a few of the tables contain tens of millions of rows; nothing that should stress the server too much.

We have very little control over our customers' backup policies. Although we recommend full recovery with transaction log backups every hour, we can't guarantee this is followed, and we have no control over when the customer runs the upgrade process.

Recently one of our scripts added a column to a table that typically has a large number of rows, 10-20 million. The column had a default value, so every row had to be updated.
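
For illustration, the kind of statement involved looks roughly like this (dbo.OrderLine and IsArchived are invented names, not our real schema):

    -- Hypothetical sketch of the problematic change; table and column
    -- names are placeholders.
    ALTER TABLE dbo.OrderLine
        ADD IsArchived bit NOT NULL
        CONSTRAINT DF_OrderLine_IsArchived DEFAULT (0);

Because the whole ALTER runs as a single transaction, the log has to retain every row modification until it commits, whatever the recovery model; on versions and editions where adding a NOT NULL column with a default is not a metadata-only change, that means log records for all 10-20 million rows.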

For one customer this caused an already large transaction log to grow until it filled the disk. The script failed, and at that point our installer stops.
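
When this happens, a quick way to see how full each log is, and why it can't be reused, is the following (standard commands, nothing site-specific):

    -- Percentage of each transaction log currently in use
    DBCC SQLPERF (LOGSPACE);

    -- What is preventing log truncation
    -- (e.g. ACTIVE_TRANSACTION, LOG_BACKUP)
    SELECT name, recovery_model_desc, log_reuse_wait_desc
    FROM sys.databases;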

I have read a few articles about truncating or shrinking the log, running checkpoints, and switching to the SIMPLE recovery model.
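
As I understand them, those remedies amount to something like this (MyAppDb and MyAppDb_log are placeholder database and logical file names; switching recovery models breaks the log backup chain, hence the full backup at the end):

    -- Sketch only: placeholder names, and this assumes the customer
    -- accepts losing point-in-time recovery across the upgrade.
    ALTER DATABASE MyAppDb SET RECOVERY SIMPLE;

    USE MyAppDb;
    CHECKPOINT;                            -- allow the log to be reused
    DBCC SHRINKFILE (MyAppDb_log, 1024);   -- target size in MB

    ALTER DATABASE MyAppDb SET RECOVERY FULL;
    BACKUP DATABASE MyAppDb                -- restart the log backup chain
        TO DISK = N'E:\Backups\MyAppDb.bak';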

I would like to avoid customers needing to do any database maintenance before upgrading, as most of them know little about it.

Does anyone have a best practice to follow when running these kinds of upgrade schema scripts?

The database is Microsoft SQL Server.

Any help would be much appreciated.

Tags : sql-server sql

