Aduramimo Oludare

How to perform very large Insert/Update operations in Laravel efficiently

Writing backend Laravel logic means wrestling with database manipulation problems from time to time. If you have ever had to insert or update very large record sets (say 15,000 rows), especially in cron jobs that crawl applications for large volumes of data and send back reports for business intelligence purposes, it might shock you that these operations can take anywhere from 10 minutes to an hour, depending on the timeout settings on your server, if care is not taken to optimize the code.

Writing a database insert for each record in an array, as in the following code, might seem fine in a local environment or on a server with few records to worry about:

    foreach ($rows as $record) {
        DB::table('my_table')->insert([
            'clientid'  => $record['id'],
            'full_name' => $record['full_name'],
            'status'    => $record['status'],
            'created'   => now(),
        ]);
    }

However, this is not efficient: each iteration fires a separate INSERT statement, so 15,000 records means 15,000 round trips to the database. Applied to large datasets, this logic can result in unpleasant scenarios such as memory leaks, abrupt SQL locks and so on.
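The underlying fix is to send many rows per query instead of one query per row. As a minimal sketch (assuming $all_rows is an array of column => value rows like the one built in step 3 below), a single multi-row insert already removes the per-record round trips. But with tens of thousands of rows, one statement can exceed MySQL's max_allowed_packet size or its prepared-statement placeholder limit, which is why the steps below add chunking on top:

    // One multi-row INSERT instead of one query per record.
    // Fine for moderate batches; very large arrays risk hitting
    // max_allowed_packet or MySQL's 65,535-placeholder limit.
    DB::table('my_table')->insert($all_rows);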

Some try to overcome this challenge by writing special queued jobs to do the work in the background. However, there is a better way around the issue using simple array manipulation, as outlined below:

1. Assume you have an array of 25,000 records stored in a variable $arrayData.
2. Define another empty array: $insertion_data = array();
3. Loop through $arrayData, mapping the values you want to store against your database columns, and push each item into the $insertion_data array:
    $insertion_data = array();

    foreach ($arrayData as $arr) {
        $new_data = [
            'data_requested_at' => date('Y-m-d H:i:s', strtotime($arr['data_requested_at'])),
            'response_code'     => $arr['response_code'],
            'status'            => $arr['status'],
            'created_at'        => date('Y-m-d H:i:s'),
            'updated_at'        => date('Y-m-d H:i:s'),
        ];

        $insertion_data[] = $new_data;
    }

4. Wrap your final insertion data in a Laravel collection:
$insertion_data = collect($insertion_data);

5. Leverage the Collection chunk method to control how many records are inserted per query. You may start with 500 and adjust according to your requirements. Now loop through the resulting chunks and perform your database insertion. Don't forget the try-catch so you can investigate database errors:

    $data_to_insert = $insertion_data->chunk(500);

    foreach ($data_to_insert as $key => $data) {
        try {
            DB::table('my_target_table')->insert($data->toArray());
        } catch (\Illuminate\Database\QueryException $e) {
            $error = $e->getMessage();
            echo $error;
        }
    }
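If you also need to update existing rows rather than insert duplicates, the same chunking technique combines with the query builder's upsert method (available since Laravel 8). A minimal sketch, assuming my_target_table has a unique index on client_id and that client_id, status and updated_at are the relevant columns; adjust the names to your schema:

    foreach ($insertion_data->chunk(500) as $data) {
        try {
            // Rows whose client_id already exists are updated;
            // the rest are inserted, all in one query per chunk.
            DB::table('my_target_table')->upsert(
                $data->toArray(),           // rows to insert or update
                ['client_id'],              // unique-by column(s)
                ['status', 'updated_at']    // columns to update on conflict
            );
        } catch (\Illuminate\Database\QueryException $e) {
            echo $e->getMessage();
        }
    }

With this in place, re-running the same cron job is idempotent: repeated rows update the existing records instead of piling up duplicates.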
