Arctype Team for Arctype

Originally published at arctype.com

Importing Data Using MySQL and Arctype

If you are a developer or database administrator who wants to load data into your databases quickly, you probably already know that you have quite a few options, at least as far as MySQL is concerned. In this blog, we will look through the options you have when it comes to importing data into your MySQL instances.

What Options Are There?

First of all, when it comes to importing data into MySQL, you have a couple of options:

  • You can use INSERT INTO and specify a table name along with the columns and the data you need to import into a given database instance.
  • You can also use LOAD DATA INFILE and specify the file you want to load data from into a given table.

Now, we could tell you that LOAD DATA INFILE is not your only option when importing data into MySQL or MariaDB-based instances and leave it at that. However, that's not what you're here for: you're here to learn the best mechanism for importing your data into MySQL-based instances.

By now you know that LOAD DATA INFILE and INSERT INTO are your only options. However, you might also know that we have covered some of the functionality provided by LOAD DATA INFILE in an earlier blog post of ours, so you might wonder why we are writing another one. Well, we haven't covered everything!

For example, we told you that LOAD DATA INFILE is faster, and it is. But when you need to quickly import data into your database instances, you have a couple of other options too. For example, you can use COMMIT statements like so:

SET autocommit=0;
INSERT INTO arctype (demo_column) VALUES ('Demo Column');
COMMIT;

Running queries in the above-specified fashion helps your database perform better when inserting data because COMMIT statements save the current state of the database. In other words, setting autocommit to 0, importing your data, and only committing afterwards may be a better option when importing data into MySQL, because you only save modifications to disk when you want to and relieve MySQL (or MariaDB) of that hassle. Also, consider setting unique_checks to 0 and foreign_key_checks to 0: turning these parameters off (0) during the import should improve your database performance as well (see the sketch below).
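Here is a minimal sketch of such an import, assuming a hypothetical arctype table with a demo_column column (the table, column, and values are placeholders for illustration):

SET autocommit=0;
SET unique_checks=0;
SET foreign_key_checks=0;

-- run your bulk INSERT (or LOAD DATA INFILE) statements here
INSERT INTO arctype (demo_column) VALUES ('Demo 1'), ('Demo 2');

COMMIT;

-- re-enable the checks once the import is finished
SET unique_checks=1;
SET foreign_key_checks=1;
SET autocommit=1;

Remember to turn the checks back on after the import: leaving them off means duplicate keys or orphaned foreign key references can slip into your tables unnoticed.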

Another quick way to import data into MySQL if you are running the MyISAM storage engine is to simply copy the .frm, .MYD, and .MYI files over into a given folder on a different database server. Here's a caveat though: you should only do this with MyISAM, because InnoDB works differently. Never try such an approach on it unless you are fully aware of the consequences.

You can read our blog post about InnoDB vs. big data to learn more, but essentially, InnoDB has one core file – ibdata1 – that is central to its operation. Simply copying files over like this would do more harm than good because the tablespace ID stored in ibdata1 would not match the tablespace IDs in the table files that were copied over to the new server.

Another quick way to speed up a given data import is to lock the table while importing. Run a LOCK TABLES query before importing your data and UNLOCK TABLES when you're done, and you should be good to go (see the sketch below). Such an approach is also recommended when you need to prevent other sessions from modifying the table during the import.
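A minimal sketch of the locking approach, again assuming a hypothetical arctype table with a demo_column column:

LOCK TABLES arctype WRITE;

-- the import runs while other sessions are blocked from touching the table
INSERT INTO arctype (demo_column) VALUES ('Demo 1'), ('Demo 2');

UNLOCK TABLES;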

You can also use the bulk importing functionality provided by INSERT statements. A statement like INSERT INTO arctype (demo_column) VALUES ('demo'), ('demo'); inserts rows faster than running ordinary, one-row INSERT statements.

However, these are not the only options you can employ. If you are using MyISAM (you shouldn't be, but if you find yourself forced to use such a storage engine...) you might also want to think about increasing the value of bulk_insert_buffer_size. According to the MySQL documentation, the variable limits the size of the cache tree in bytes per thread, so increasing it should help if you find yourself importing quite a lot of data when working with MyISAM.
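To inspect and adjust the variable, you can run something like the following (the 256 MB figure is only an example; tune it to your workload):

SHOW VARIABLES LIKE 'bulk_insert_buffer_size';

-- raise the buffer to 256 MB for the current session only
SET SESSION bulk_insert_buffer_size = 256 * 1024 * 1024;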

A consideration: the secure_file_priv variable

The secure_file_priv variable is heavily associated with bulk importing of data into your database instances. In prior posts, we said that LOAD DATA INFILE is significantly faster than INSERT statements because it comes with many so-called “bells and whistles” unique to itself. Part of that magic is governed by secure_file_priv. The variable usually resides in your my.cnf file (typically found in /etc/ or /etc/mysql/ on Linux, or inside your MySQL installation directory on Windows) and looks like so (the following example refers to the variable being used in Windows environments):

secure_file_priv="c:/wamp64/tmp"

This variable, simply put, restricts which directories can be used to load data into your MySQL database instance. In other words, if you run a LOAD DATA INFILE query and the file you are loading into your database instance does not reside in this directory, MySQL will come back with an error like so:

ERROR 1290 (HY000): The MySQL server is running with the --secure-file-priv option so it cannot execute this statement

To overcome this error, you either need to remove the --secure-file-priv setting from your my.cnf file or load your data from the directory specified as the value of the variable. Once you do so, your data should be loaded in (and exported) without any issues!
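For example, once your CSV file lives inside the permitted directory, a query along these lines should do the job (the file name, table name, and delimiters are assumptions; adjust them to match your data):

LOAD DATA INFILE 'c:/wamp64/tmp/arctype_demo.csv'
INTO TABLE arctype
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 LINES;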

Other Concerns

Aside from loading data into your MySQL database instances using LOAD DATA INFILE, the value of this variable is also relevant when exporting data out of MySQL using SELECT ... INTO OUTFILE. The variable can also be used to disable import and export operations through LOAD DATA INFILE and SELECT ... INTO OUTFILE altogether: simply set it to NULL in such a case. In general, you can run a query like SHOW VARIABLES LIKE 'secure_file_priv'; or SELECT @@secure_file_priv; in order to see the directory that this variable is set to.
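As a quick illustration of the export side, a sketch like the following writes the contents of a hypothetical arctype table to a CSV file inside the permitted directory:

SELECT demo_column
INTO OUTFILE 'c:/wamp64/tmp/arctype_export.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM arctype;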

Time to import some data!

So, you're ready to import some data. Can you just use INSERT queries and insert all of the rows one by one? Or perhaps you would use bulk inserting capabilities provided by INSERT?

Technically, yes, you can do that. You could insert your rows from a CSV file by copying them over into INSERT statements and then running them one by one, but that would take quite a lot of time, and you would also need to make sure you are copying the correct row values from the CSV file into each INSERT statement. That would be quite a hassle!

Thankfully, there are tools that can assist you in performing such tasks, and some of those tools are SQL clients. For example, the SQL client developed by Arctype lets you edit your tables like spreadsheets: you can select any cell you want to edit, delete rows by simply clicking delete, and so on.

However, Arctype has another feature that should be more relevant to the scope of this article – Arctype also lets you import CSV files into your database instances. Let's see how to do that!

Go ahead and launch Arctype. Towards the right-hand side of your table and above the structure of it, you will find three horizontal dots. Click on them:
Screenshot showing available options in Arctype
These dots denote the options available for you to use: you can create a table, refresh the list of tables, or import the contents of a given CSV file into your table. Click on Import CSV and select the file you want to import:
Screenshot showing how to import a file
Arctype will provide you with some information regarding your file and show a preview of the first five rows that your CSV file contains. Once you're satisfied with the output you see, it's time to import your data – go ahead and click Accept:

Once you click Accept, you will be able to define the columns of your table that you want to import your data into, and you will be able to either create a new table or elect to upload your data into an existing table.

Once you're satisfied with what you see on the screen, go ahead and click Import CSV to import your file: your data should now exist inside of your tables! That's it – it's really that simple.
Screenshot of an error message
Sometimes, when your data is too long to be imported, you might face errors like the one above (in that case, Arctype will import all of the rows leading up to the error), but generally, the import process should go smoothly.

The bottom line

When you find yourself required to quickly import data into your MySQL instances, LOAD DATA INFILE is far from your only option. You can also use the bulk inserting capabilities provided by INSERT statements, configure MySQL so that it commits only after the data is fully loaded into your database tables, and load data into specific columns using LOAD DATA INFILE as well.

We hope you stick around the blog since we have a lot more content prepared for you – this blog covers databases, security, and general advice for developers, so you will certainly find what you are looking for. And, if you want to play around with what you've learned so far, Arctype is the perfect tool with which to do so. Use the button below to download Arctype and put your skills to the test now!

Lukas is an ethical hacker, a MySQL database administrator, and a frequent conference speaker. Since 2014, Lukas has found and responsibly disclosed security flaws in some of the most visited websites in Lithuania and abroad including advertising, gift-buying, gaming, hosting websites as well as some websites of government institutions. Lukas runs one of the biggest & fastest data breach search engines in the world: BreachDirectory.com and frequently blogs in multiple places educating people about information security and other topics. He also runs his own blog over at lukasvileikis.com.
