If you are dealing with SQL Azure and a large data volume, you’re bound to run into this error.

Msg 40552, Level 20, State 1, Line 1
The session has been terminated because of excessive transaction log space usage. Try modifying fewer rows in a single transaction

This error is more of an annoyance than anything else, and you can work around it.

The architecture SQL Azure uses, and the way it allocates resources overall, doesn't allow us to go in and simply size a transaction log the way we can with SQL Server on physical hardware.  At least, I haven't found a way.  Because of that, we run into a mess of headaches when we have anything over a few hundred thousand rows in one table.  Here is what you can do to get past this error.

Changing Data and Log Usage Errors

Changing data means just that – updating, inserting or deleting large amounts of data.  If you run into this error while loading data, try the age-old method of processing the data in batches.  This goes against almost every fiber of a performance mindset, given the advancements in data loading and transformation, but it's a limitation we have to work around in SQL Azure.  With SSIS, you can create multiple threads and work on each thread while only sending so much at once to SQL Azure.  SSIS is also fully capable of batch processing data on its own.
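The batching approach can also be done in plain T-SQL.  Here is a minimal sketch for a large delete – the table and column names ([dbo].[SalesStaging], [ProcessedDate]) are illustrative, and the batch size of 10,000 is only a starting point to tune against the log space limit.

	-- Delete in small batches so each transaction's log usage stays small.
	-- Table, column and batch size are examples; adjust for your workload.
	DECLARE @BatchSize INT = 10000;
	DECLARE @RowsAffected INT = 1;

	WHILE @RowsAffected > 0
	BEGIN
	    DELETE TOP (@BatchSize)
	    FROM [dbo].[SalesStaging]
	    WHERE [ProcessedDate] < '20120101';

	    SET @RowsAffected = @@ROWCOUNT;
	END

Each iteration commits its own transaction, so the log space consumed at any one time is bounded by the batch size rather than the total row count.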

Creating or Rebuilding a Big Index and Log Usage Errors

Indexing is what prompted this article.  If you have a few hundred million rows in a table and create a nonclustered index on it, you will undoubtedly hit the log space usage error.  For a moment, this raised great concern.  Truly, the data is no good in SQL Azure if you cannot read it in under 30 minutes, preferably seconds.  So you need indexing to support the queries that will be used to read the data effectively.  Luckily, SQL Azure does support ONLINE index operations.

In this case, the ONLINE option on a CREATE INDEX statement allows the index to be built during the main build phase, which reads the source table and then inserts the rows into the new index by means of bulk load operations.

To create the index with ONLINE, you simply specify ONLINE = ON in the WITH options:

	CREATE NONCLUSTERED INDEX [IX_SalesOrder_SalesOrderID]
	ON [Sales].[SalesOrderHeader] ([SalesOrderID] ASC)
	INCLUDE ([DueDate],
		[SubTotal])
	WITH (ONLINE = ON);

(The index and table names above are examples; the original fragment did not show them.)

This “should” allow the index creation to complete.  The build will take longer, but the importance of having indexing is, without a doubt, not something you can give up – unless the data is simply part of a DR strategy or some sort of backup.