
Rails: Bulk Insert to DB

K Putra ・ 3 min read

If you continue to read this article, I assume that you know Ruby, OOP in Ruby, RoR, and Active Record.

The purpose of this article is simply to let you know that you can do bulk inserts, and to show you some of the available options. Therefore, I won't deep-dive into each option.

Table of Contents:
Problem
A. Using gems
B. Using the SQL INSERT statement
C. Rails 6
Final Word

Problem

Do you have an import function in your Rails app that reads csv/excel files and puts the records into the database? If you do, I believe some of you still do something like this:

def import_record
  CSV.read(file_path).each do |record|
    Article.create!(
      title:  record[0],
      author: record[1],
      body:   record[2]
    )
  end
end

The code above will make as many SQL calls as there are records in the csv file. If the file has 20 records, it will make 20 SQL calls.

If you only have 10 records per file, perhaps it is not a big problem. But what if you were facing 10,000 records per file? 1,000,000 records per file?

Well, 1,000,000 SQL calls are... a lot.

Can we 'refactor' it? Yes we can! Just use bulk insert, and we will make only 1 SQL call!

Let's start our journey!

A. Using gems

I believe most readers are not using Rails 6 yet. But if you are, you can go straight to C. Rails 6.

I will only cover 1 gem: activerecord-import.

Put this in your Gemfile, then run bundle install:

gem 'activerecord-import'

As the name suggests, this gem depends on Active Record. Make sure your app uses Active Record as its ORM.

Then, let's update our code above:

def import_record
  articles = array_of_records
  Article.import articles
end

def array_of_records
  records = []
  CSV.read(file_path).each do |record|
    records << extract_(record)
  end
  records
end

def extract_(record)
  {
    title: record[0],
    author: record[1],
    body: record[2]
  }
end

So, instead of making 20 SQL calls, your code will make only 1!

Not much of an improvement for 20 records, but a huge one for 10,000 records, 1,000,000 records, 20,000,000 records, and so on.
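For files with millions of rows, bulk insert is usually combined with batching, so that each insert stays at a manageable size for memory and the database. Here is a minimal sketch in plain Ruby using each_slice (the in_batches helper and BATCH_SIZE are hypothetical names for illustration, not part of the gem):

```ruby
# Hypothetical helper: yield rows in fixed-size batches so each
# bulk insert (e.g. Article.import(batch)) stays a manageable size.
BATCH_SIZE = 1_000

def in_batches(rows, batch_size: BATCH_SIZE)
  rows.each_slice(batch_size) { |batch| yield batch }
end

# Example: 2,500 rows become batches of 1,000, 1,000, and 500.
sizes = []
in_batches((1..2_500).to_a) { |batch| sizes << batch.size }
sizes  # => [1000, 1000, 500]
```

Inside the block you would call the bulk insert of your choice, so a 1,000,000-row file becomes 1,000 SQL calls instead of 1,000,000.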

Read the GitHub documentation for full information; the gem also supports options such as skipping validations and handling duplicate keys.

B. Using the SQL INSERT statement

No need to install any gems if you are using Rails, but you do have to understand the query language.

def import_record
  articles = array_of_records
  sql = build_sql_from_(articles)
  ActiveRecord::Base.connection.execute(sql)
end

def build_sql_from_(articles)
  connection = ActiveRecord::Base.connection
  # Quote every value so strings are escaped properly and the
  # statement is safe against SQL injection.
  sql_values = articles.map do |article|
    "(#{article.values.map { |value| connection.quote(value) }.join(', ')})"
  end
  "INSERT INTO articles (title, author, body) VALUES #{sql_values.join(', ')}"
end

def array_of_records
  ...
end

def extract_(record)
  ...
end
Enter fullscreen mode Exit fullscreen mode

I have never used this option myself, as it requires deeper knowledge of the database itself: the exact columns, quoting rules, and escaping.
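Quoting is the detail that bites people here: joining raw values into the SQL string breaks as soon as a value contains a quote character, and it opens the door to SQL injection. Rails exposes ActiveRecord::Base.connection.quote for this; the tiny pure-Ruby illustration below shows the escaping problem itself (the naive quote helper is for illustration only, never use it in a real app):

```ruby
# Illustration only: a naive single-quote escaper for SQL string
# literals. In a real app use ActiveRecord::Base.connection.quote,
# which also handles types, encodings, and NULLs.
def quote(value)
  "'#{value.to_s.gsub("'", "''")}'"
end

# Build a multi-row VALUES clause from arrays of column values.
def values_clause(rows)
  rows.map { |row| "(#{row.map { |v| quote(v) }.join(', ')})" }.join(', ')
end

rows = [["Title A", "Alice"], ["O'Brien", "Bob"]]
clause = values_clause(rows)
# => "('Title A', 'Alice'), ('O''Brien', 'Bob')"
```

Note how O'Brien is escaped to O''Brien; without that, the generated statement would be malformed SQL.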

C. Rails 6

Rails 6 supports bulk insert without any gems.

So, let's update our code:

def import_record
  articles = array_of_records
  Article.insert_all articles
end

def array_of_records
  ...
end

def extract_(record)
  ...
end

Not so different from using activerecord-import: we just change import to insert_all.

Of course, insert_all is not the only bulk insert method in Rails 6; there are also insert_all! and upsert_all. And they take many options too, just like activerecord-import.
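For intuition: upsert_all inserts new rows and updates rows that already exist, matched on a unique key. Here is a rough pure-Ruby sketch of that merge semantics (the upsert helper below is hypothetical and only models the idea; the real work happens in a single statement at the database level):

```ruby
# Hypothetical sketch of upsert semantics: incoming rows replace
# existing rows that share the unique key; the rest are appended.
def upsert(existing, incoming, key: :title)
  index = existing.each_with_object({}) { |row, h| h[row[key]] = row }
  incoming.each { |row| index[row[key]] = row }
  index.values
end

existing = [{ title: "A", body: "old" }]
incoming = [{ title: "A", body: "new" }, { title: "B", body: "fresh" }]
result = upsert(existing, incoming)
# => [{ title: "A", body: "new" }, { title: "B", body: "fresh" }]
```

Article "A" is updated in place while "B" is inserted, which is exactly the behavior you want when re-importing a file that partially overlaps existing data.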

If you want the details, there are many great articles by great authors that cover bulk insert in Rails 6 in depth. You can easily search for them.

Final Word

Bulk insert is not only for importing files as in the scenario above. There are many situations where bulk insert can save the day.

That's all from me.
