Updating 100k records in one update query

Is it possible, or recommended at all, to run one update query, that will update nearly 100k records at once?

If so, how can I do that? I am trying to pass an array to my stored procedure, but it doesn't seem to work. This is my SP:

CREATE PROCEDURE [dbo].[UpdateAllClients]
    @ClientIDs varchar(max)
AS
BEGIN
    DECLARE @vSQL varchar(max)
    SET @vSQL = 'UPDATE Clients SET LastUpdate=GETDATE() WHERE ID IN (' + @ClientIDs + ')';
    EXEC(@vSQL);
END

I have no idea what's wrong, but it's just not updating the relevant records.

Anyone?

m0fo asked Sep 08 '25 11:09

2 Answers

The UPDATE is reading your @ClientIDs (a comma-separated string) as a whole. To illustrate, you are effectively doing this:

Assume @ClientIDs = '1,2,3,4,5'.

Your UPDATE command interprets it like this:

UPDATE Clients SET LastUpdate = GETDATE() WHERE ID IN ('1,2,3,4,5');

and not

UPDATE Clients SET LastUpdate = GETDATE() WHERE ID IN (1,2,3,4,5);

One suggestion is to use a subquery in your UPDATE, for example:

UPDATE Clients 
   SET LastUpdate = GETDATE() 
WHERE ID IN
    (
       SELECT ID
       FROM tableName
       -- WHERE condition
    )
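
If you do want to keep passing the IDs as a single comma-separated string, here is a minimal sketch of an alternative, assuming SQL Server 2016+ (where STRING_SPLIT is available) and that Clients.ID is an int; the procedure name is just for illustration:

CREATE PROCEDURE [dbo].[UpdateAllClientsSplit]
    @ClientIDs varchar(max)
AS
BEGIN
    -- STRING_SPLIT turns the CSV string into a one-column table of
    -- values, so the IN list contains real values rather than one
    -- big string, and no dynamic SQL is needed.
    UPDATE Clients
       SET LastUpdate = GETDATE()
     WHERE ID IN (SELECT CAST(value AS int)
                    FROM STRING_SPLIT(@ClientIDs, ','));
END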

Hope this makes sense.

John Woo answered Sep 10 '25 07:09

A few notes to be aware of.

Big updates like this can lock up the target table. If the operation takes more than roughly 5,000 row locks, SQL Server may escalate them to a table lock, which would block other processes. Worth bearing in mind if this could cause an issue in your scenario. See: Lock Escalation

With a large number of rows to update like this, the approach I'd consider is (in outline):

  1. bulk insert the 100K IDs into a staging table (e.g. from .NET, use SqlBulkCopy)
  2. update the target table, using a join onto the above staging table
  3. drop the staging table

This also gives you more room to control the process, e.g. by breaking the workload up into chunks and updating x rows at a time, as in the sketch below.
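
A minimal sketch of steps 2 and 3, assuming the IDs have already been bulk-loaded into a staging table (here hypothetically named #ClientIds), and batching the update to stay under the lock escalation threshold:

-- Assumes #ClientIds(ID int PRIMARY KEY) has already been bulk-loaded
-- (step 1), e.g. via SqlBulkCopy from .NET.
DECLARE @Batch TABLE (ID int PRIMARY KEY);

WHILE 1 = 1
BEGIN
    DELETE FROM @Batch;

    -- Drain the next 4,000 IDs from the staging table; staying below
    -- the ~5,000-lock threshold avoids escalation to a table lock.
    DELETE TOP (4000) s
    OUTPUT deleted.ID INTO @Batch
    FROM #ClientIds AS s;

    IF @@ROWCOUNT = 0 BREAK;   -- staging table is empty, we're done

    -- Step 2: update the target table via a join onto the batch.
    UPDATE c
       SET LastUpdate = GETDATE()
      FROM Clients AS c
      JOIN @Batch AS b ON b.ID = c.ID;
END

-- Step 3: drop the staging table.
DROP TABLE #ClientIds;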

AdaTheDev answered Sep 10 '25 09:09