Alejandro Cobar

Checking out dbForge’s Data Pump for SQL Server

Working as a SQL Server database administrator, or even as a database developer, means that you are going to be working a lot with data. Sooner or later, you will run into a requirement to export data from, or import data into, your databases. SQL Server Management Studio natively gives you some options to do such a thing, but I’ve seen cases where it simply falls short, and I refuse to jump on the Azure Data Studio bus (it just doesn’t feel the same to me). For that reason, Devart created the dbForge Data Pump for SQL Server SSMS extension, which allows you to perform your exports and imports with a lot of ease, options, and convenience. I have tried it, and I can summarize it in just two words: it’s awesome!

Installation
You can get the dbForge Data Pump SSMS extension by clicking the “Get Trial” button (I recommend the Professional Trial, because we’re pros, right?), as you can see in the screenshot below, and then proceeding with the installation wizard.

Note: All of this assumes that you already have SQL Server Management Studio installed on your machine.
[Screenshot: the dbForge download page with the “Get Trial” button]

Once done, you can go to SQL Server Management Studio, open a connection to the SQL Server instance of your choosing, and then right-click on any desired database:

You can see that we are presented with four options. “Help” is general help information for the tool, so I’m skipping that one and will be covering “Export Data” and “Import Data”. Regardless, feel free to dig into the rich and in-depth documentation for Data Pump for SQL Server.
[Screenshot: the Data Pump options shown in the database right-click menu]

Export
When we are working with databases, sometimes we need to export a set of data for whatever reason (e.g., reporting, handing it off to a developer, etc.). Even though SSMS gives you options to do it, the experience can be a bit rough in some cases. With that said, let’s see what Data Pump can do for us:

Right off the bat, you can see that Data Pump offers a variety of file formats to export our data to, so I’m going to check PDF to see what I get.
[Screenshot: the export wizard’s list of available file formats, with PDF selected]

Of course, before anything else, I need to pick a source connection from which the export will take place, so I’ll pick my default local SQL Server instance:
[Screenshot: selecting the source connection for the export]

In “Output settings”, you specify where the output file will be created, so make sure to provide a valid directory.
[Screenshot: the “Output settings” page]

In the “Options” section, you can format how your rows and content will look. For demonstration purposes, I’ve randomly picked different colors for the header and for the even and odd rows.
[Screenshot: the “Options” page with colors picked for the header and the even and odd rows]

Next, we have “Data formats”, the section in which we can specify any particular format we want applied to the data types, for example, how dates or numeric values should be rendered.
[Screenshot: the “Data formats” page]

Once you hit “Export” and the process finishes, let’s see what we get:
[Screenshot: the completed export]

[Screenshot: the resulting PDF output]

So I got my data exported to a PDF file without having to jump through a lot of hoops. Cool!

Let me now try to export the same data, but in JSON format, you know… for the sake of science ;)
[Screenshot: exporting the same data in JSON format]
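As a point of comparison (and not something Data Pump needs), SQL Server can also produce JSON natively with the FOR JSON clause. Here’s a minimal sketch, assuming a hypothetical dbo.Customers sample table:

```sql
-- Native T-SQL way to emit a result set as JSON, for comparison.
-- dbo.Customers and its columns are hypothetical stand-ins.
SELECT CustomerId, FirstName, LastName
FROM dbo.Customers
FOR JSON AUTO, ROOT('customers');
```

The difference, of course, is that Data Pump writes the file for you and handles the formatting details, instead of just handing you back a JSON blob in a result set.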

And like that, after a few clicks, I can get my data out in several different formats. Fantastic stuff!

I’m going to try just one more before moving on to the “Import Data” option. I want to see if Data Pump can generate a SQL script with statements to populate my data elsewhere:

Indeed! I personally find this very convenient because sometimes developers require a very specific set of data in certain tables, and you just don’t want to go through all the work of generating a database backup so that they can load it wherever they want. On top of that, in most cases, the developers aren’t going to be well-versed enough to load a file generated with bcp.
[Screenshot: the generated SQL script]
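To give you an idea of the output (the exact script depends on your table and the options you choose), what you essentially get is a series of plain INSERT statements, something along these lines for a hypothetical dbo.Customers table:

```sql
-- Sketch of what an exported SQL script essentially contains;
-- the table, columns, and values here are hypothetical.
INSERT INTO dbo.Customers (CustomerId, FirstName, LastName)
VALUES (1, N'Alice', N'Smith');
INSERT INTO dbo.Customers (CustomerId, FirstName, LastName)
VALUES (2, N'Bob', N'Jones');
```

Handing a developer a script like this, which they can run from any query window, beats walking them through bcp any day.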

Import
For the import section, it’s basically the other way around: Data Pump gives you a set of file formats as options to load data into your database tables. Now, for simplicity, I will truncate my sample table and reload it with the same .json file obtained as output of the previous “Export” exercise; the truncate step is just the one-liner below, so let’s get right to it:
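(Here, dbo.Customers is just a hypothetical stand-in for my sample table.)

```sql
-- Empty the sample table before the import test.
TRUNCATE TABLE dbo.Customers;
```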

Note: In this section, make sure you correctly specify the path where your file is located.
[Screenshot: selecting the source .json file for the import]

You’ll see right away that Data Pump gives you a preview of the content within the file and lets you pick an encoding for your data (if you wish to choose a particular one):
[Screenshot: the file content preview and encoding selection]

You can also map the source columns to the destination columns however you see fit:
[Screenshot: mapping source columns to destination columns]
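To appreciate what the wizard saves you from, here’s a rough sketch of the native T-SQL equivalent of this mapping step, using OPENROWSET and OPENJSON with hypothetical file, table, and column names:

```sql
-- Native T-SQL counterpart of the mapping step, for contrast only.
-- The path, table, and columns are all hypothetical.
DECLARE @json NVARCHAR(MAX) = (
    SELECT BulkColumn
    FROM OPENROWSET(BULK 'C:\exports\customers.json', SINGLE_CLOB) AS src
);

INSERT INTO dbo.Customers (CustomerId, FirstName, LastName)
SELECT CustomerId, FirstName, LastName
FROM OPENJSON(@json)
WITH (
    CustomerId INT          '$.CustomerId',
    FirstName  NVARCHAR(50) '$.FirstName',
    LastName   NVARCHAR(50) '$.LastName'
);
```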

Data Pump will ask whether you want to append the contents of the file to the destination table, or delete everything first and then load the data (which tells me that I truncated my table in vain, because Data Pump would’ve done it for me).
[Screenshot: choosing between appending data and deleting it before the load]
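In plain T-SQL terms, the two choices boil down to this (hypothetical table again):

```sql
-- "Append": insert the incoming rows on top of whatever is already there.
INSERT INTO dbo.Customers (CustomerId, FirstName, LastName)
VALUES (3, N'Carol', N'Diaz');

-- "Delete then load": clear the destination table first, then insert.
DELETE FROM dbo.Customers;
INSERT INTO dbo.Customers (CustomerId, FirstName, LastName)
VALUES (3, N'Carol', N'Diaz');
```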

Once everything is said and done, I’ll go and check my table to see its current state:
[Screenshot: querying the table after the import]
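For reference, the check itself is nothing fancy, just a couple of queries against the table (hypothetical name again):

```sql
-- Verify the import landed: row count plus a peek at the data.
SELECT COUNT(*) AS RowsLoaded FROM dbo.Customers;
SELECT TOP (10) * FROM dbo.Customers ORDER BY CustomerId;
```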

Bingo! Just as expected, my data is right there:
[Screenshot: the imported data back in the table]

Final thoughts
● If you are someone who regularly works with SQL Server databases at a professional level, it is very likely that you will eventually need to either export data from or import data into your databases. dbForge’s Data Pump for SQL Server not only gives you a robust tool to do so, it does it in a very convenient way, by extending the existing functionality of SQL Server Management Studio, which is a huge win in my opinion.

● Not only do you get a wide variety of file formats to work with, but the tool also gives you the option of integrating these tasks into your DevOps workflow (if you have one, of course), something that is very much welcome these days. I know I didn’t touch on this point in this article, but it is something I noticed in the documentation, and I think it should be mentioned. Here’s the link for using Data Pump in the DevOps space.

● I would definitely use the export functionality of this tool to back up individual tables whenever I’m about to test any data change in my workflow; a quick sketch of what I mean follows below. Just this week alone, I can think of one case where this would’ve saved me quite some time, because a developer issued an UPDATE statement without a WHERE clause (I know, the classic meme), and Microsoft is still falling short on providing us a native way to back up individual tables.
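For what it’s worth, the usual manual workaround today is a quick copy of the table right before touching it; a minimal sketch with hypothetical names:

```sql
-- Manual pre-change snapshot of a single table.
-- Note: SELECT ... INTO copies the data only, not indexes or constraints.
SELECT *
INTO dbo.Customers_backup
FROM dbo.Customers;

-- If that WHERE-less UPDATE does happen, the data can be pulled back:
-- (an identity column would also need SET IDENTITY_INSERT handling)
-- TRUNCATE TABLE dbo.Customers;
-- INSERT INTO dbo.Customers SELECT * FROM dbo.Customers_backup;
```

Data Pump’s export to a .sql script gets you to the same place, with the upside that the backup lives in a file instead of cluttering the database with extra backup tables.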
