DEV Community

Discussion on: Bulk Update Multiple Records with Separate Data — Laravel

Scott Carpenter

Using CASE statements is a clever way to do this! Just make sure you pay attention to the length of your query so that you don't exceed the maximum query length/packet size. In MySQL you can check the limit by inspecting the max_allowed_packet variable: show variables like 'max_allowed_packet';
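To make that concrete, here is a rough sketch of guarding a CASE-based bulk update against the packet limit. It assumes Laravel's DB facade and a hypothetical users table with a status column; the table, column, and $updates data are illustrative, not from the article.

```php
<?php

use Illuminate\Support\Facades\DB;

// Illustrative data: record IDs mapped to their new values.
$updates = [1 => 'active', 2 => 'inactive', 3 => 'active'];

// Build the CASE-based bulk update query.
$cases = '';
foreach ($updates as $id => $value) {
    $cases .= sprintf('WHEN %d THEN %s ', $id, DB::getPdo()->quote($value));
}
$ids = implode(',', array_keys($updates));
$sql = "UPDATE users SET status = CASE id {$cases}END WHERE id IN ({$ids})";

// Compare the query length against the server's packet limit before sending it.
$max = (int) DB::selectOne('SELECT @@max_allowed_packet AS max')->max;

if (strlen($sql) >= $max) {
    // Too large for a single round trip: split $updates into smaller
    // chunks and run one UPDATE per chunk instead.
    throw new RuntimeException('Bulk update exceeds max_allowed_packet');
}

DB::update($sql);
```

In practice you would chunk $updates rather than throw, but the length check is the key point: the CASE query grows linearly with the number of records.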

When I have a one-off update to run against a large number of records, I typically create an Artisan command (or a job) that updates a small subset of the records, and then I schedule it to run every minute (or at a longer interval, if needed). That spreads the load over time and makes it easy to track progress. Once it's done, I remove the call from the scheduler and delete the command/job class.
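A minimal sketch of that pattern might look like the following. The command name, table, and column are hypothetical; it assumes each run can identify unprocessed rows (here, a NULL status) so repeated invocations make steady progress.

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

// Hypothetical one-off command: each run updates a small batch of rows,
// so the full backfill is spread across many scheduled invocations.
class BackfillStatuses extends Command
{
    protected $signature = 'app:backfill-statuses {--chunk=500}';
    protected $description = 'Backfill the status column in small batches';

    public function handle(): int
    {
        $updated = DB::table('users')
            ->whereNull('status')                    // only rows not yet processed
            ->limit((int) $this->option('chunk'))    // small batch per run
            ->update(['status' => 'active']);

        $this->info("Updated {$updated} rows");

        return self::SUCCESS;
    }
}

// Scheduled every minute, e.g. in the console scheduler:
// Schedule::command('app:backfill-statuses')->everyMinute();
```

Once the whereNull filter matches nothing, each run becomes a no-op, which is the signal to remove the schedule entry and delete the class.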

Bertug Korucu • Edited

Spreading it across several jobs is indeed a great strategy! I guess it's all about picking the right approach for the task at hand.

If the task is not time-sensitive, spreading it over time like you said beats the bulk update, but if it's time-sensitive, I go with the bulk update strategy.