SQL Server: A Guide to Updating Data in Bulk

I’m trying to find the optimal way to perform a bulk update from my SQL Server mini console application. I’ve written my own approach, shown below, but I’m unhappy with its performance: updating 50,000-100,000 records becomes sluggish, even in batches of 1,000 at a time. Is there a library or technique that could speed this up? Any help would be greatly appreciated.


I am experiencing slow performance when joining and updating a table with 10 million rows: it takes over an hour and grows my transaction log by more than 10 GB. Is there a better approach?

After each update, the indexes and constraints are checked and all of the changed data is logged. Can SQL Server be instructed to check constraints only once, after the update completes, and to keep update logging to a minimum?
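For the constraint half of that question, one commonly suggested option (a sketch, not something from the answers below) is to disable constraint checking on the target table for the duration of the update and re-validate afterwards; `WITH CHECK CHECK CONSTRAINT` makes SQL Server re-verify the existing rows so the constraints stay trusted:

```sql
-- Disable FK/check constraint enforcement on Orders for the bulk update
ALTER TABLE Orders NOCHECK CONSTRAINT ALL;

-- ... run the bulk UPDATE here ...

-- Re-enable and re-validate in one pass so the constraints remain trusted
ALTER TABLE Orders WITH CHECK CHECK CONSTRAINT ALL;
```

This trades per-row checks for one full validation at the end; it does not reduce logging.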

Below is my query; some names have been changed for readability.

UPDATE o
SET o.Info1 = u.Info1, o.Info2 = u.Info2, o.Info3 = u.Info3
FROM Orders o
INNER JOIN Users u ON u.ID = o.User_ID

In response to a comment, the table definition would resemble the following (with simplifications for the sake of a general question).

Table Orders

ID int PK
OrderNumber nvarchar(20)
User_ID int FK to table Users
Info1 int FK to table T1
Info2 int FK to table T2
Info3 int FK to table T3

Table Users

ID int PK
UserName nvarchar(20)
Info1 int FK to table T1
Info2 int FK to table T2
Info3 int FK to table T3

Solution 1:

There is no way to do exactly what you are asking for, but you can consider the following actions.

  1. If feasible, switch the database to the simple recovery model before performing this task.
  2. Drop the indexes before the update and recreate them after it finishes.
  3. Do the updates in smaller batches, something like:

    WHILE (1 = 1)
    BEGIN
        -- update 10,000 rows at a time; the WHERE lets the loop terminate
        UPDATE TOP (10000) o
        SET o.Info1 = u.Info1, o.Info2 = u.Info2, o.Info3 = u.Info3
        FROM Orders o
        INNER JOIN Users u ON u.ID = o.User_ID
        WHERE o.Info1 <> u.Info1 OR o.Info2 <> u.Info2 OR o.Info3 <> u.Info3;
        IF (@@ROWCOUNT = 0) BREAK;
    END


If you choose the simple recovery option, remember to take a full backup after switching the recovery model back to full: full logging and the log backup chain do not resume until a full backup is taken.
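The first two suggestions above can be sketched in T-SQL. The database name `MyDb` and index name `IX_Orders_Info1` are placeholders; confirm first that your environment can tolerate temporarily losing point-in-time recovery:

```sql
-- 1. Switch to the simple recovery model to minimize log growth
ALTER DATABASE MyDb SET RECOVERY SIMPLE;

-- 2. Disable nonclustered indexes on the target table (name is hypothetical)
ALTER INDEX IX_Orders_Info1 ON Orders DISABLE;

-- ... run the batched UPDATE here ...

-- Rebuild the disabled indexes after the update completes
ALTER INDEX IX_Orders_Info1 ON Orders REBUILD;

-- Switch back to full recovery, then take a full backup to restart the log chain
ALTER DATABASE MyDb SET RECOVERY FULL;
BACKUP DATABASE MyDb TO DISK = N'C:\Backups\MyDb.bak';
```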

Solution 2:

In a .NET WinForms application, I loaded the data on demand, created a new staging table, bulk-inserted the data into it, and then updated the primary table by joining it to the staging table. The whole process took about 5 seconds for a data set of 1 million rows.
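A sketch of that staging-table pattern in T-SQL, with table and column names assumed from the question; the staging table would typically be filled from the application via a bulk-copy API such as `SqlBulkCopy`:

```sql
-- Staging table holding the incoming values (hypothetical shape)
CREATE TABLE Staging_Users (
    ID    int PRIMARY KEY,
    Info1 int, Info2 int, Info3 int
);

-- ... bulk-insert rows into Staging_Users from the application ...

-- One set-based update joining the target to the staging table
UPDATE o
SET o.Info1 = s.Info1, o.Info2 = s.Info2, o.Info3 = s.Info3
FROM Orders o
INNER JOIN Staging_Users s ON s.ID = o.User_ID;

DROP TABLE Staging_Users;
```

A single set-based join against an indexed staging table is usually far faster than many round-trip UPDATE statements from the client.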
