Tyrel Chambers

Originally published at tyrelchambers.com

How I transferred Postgres data using Sequelize

Hey there!

Have you ever asked yourself "How do I transfer data from one database to another without actually putting any real effort into it?". Me too.

It would be nice if there were a tool that let us interact with and mutate data using a GUI, because I can't write SQL to save my life.

In this article I'm going to talk about how I transferred data from one database to another without exporting/importing the data. I had tried numerous ways using different export options within pgAdmin, but was generally unsuccessful. Growing frustrated, I decided to look outside the box and try something entirely different. This will be a general, high-level overview of how I did it rather than a step-by-step guide, since you might not be using the same tools as I am, and I'm not a pro at pgAdmin or exporting SQL data.

The new database had significant architectural changes, so a simple export/import wouldn't have been good enough (I'm also not that great at writing my own SQL to be able to transform it anyway).

An introduction

I recently remade one of my first production applications. The reason for the rewrite was that I had grown in my knowledge of programming overall and felt the current version of my app was flaky at best. This database wasn't big by any means, but it was being used in production, and the last thing you want to do is somehow lose data.

Using Sequelize to connect to both databases

This is probably a risky play if you're not careful because one brain-fart or wrong move and you could do some serious damage to both databases and your emotions.

Within my docker-compose file, I created another database that was blank. I named it something random because the name isn't important; it's only temporary. I exported the production database and imported it into my replica database, including the commands to create everything from scratch.
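If it helps, one way to get that blank replica in place and fill it looks roughly like this. The db service name, the reddex_old database name, the postgres user, and the PRODUCTION_DATABASE_URL variable are all placeholders rather than my exact setup, so adjust them to yours:

# create a blank replica database inside the running Postgres container
docker-compose exec db createdb -U postgres reddex_old

# dump the production database (schema + data) to a file
pg_dump "$PRODUCTION_DATABASE_URL" > prod_dump.sql

# import the dump into the replica, creating everything from scratch
docker-compose exec -T db psql -U postgres -d reddex_old < prod_dump.sql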

Once this data was successfully imported, I created a new Sequelize instance in the index.js file within my models folder (the one sequelize-cli can generate for you). I named it something crazy like sequelize2 and gave it the value of a Postgres URI string with the proper credentials to connect to that database.

I find the benefits of doing it this way are:

  • No risk of data loss in your production database from botched code
  • Not having to deal with foreign keys in the export

I thought 'why not use the tools I already know how to use?'

Technically, you could connect straight to your production database and go rogue and silly, but hey, if that's you, Godspeed!

We're connected to the replica database

Alright, we're here. I'm going to lay out some of my code so you can get an idea of my approach to the actual migration process. Make sure you have two connections set up; mine looks like this:

// second Sequelize instance pointing at the replica database
let sequelize2;

sequelize2 = new Sequelize(process.env.OLD_REDDEX_DB);

// expose the replica connection alongside the main one on the db object
db.sequelize2 = sequelize2;

When it's set up this way, we can access our replica database by referencing db.sequelize2 and our main database with db.sequelize. This allows us to easily pull from one database and insert into the other. Now, I don't know if this is "the best way to do it 🙄", but it worked for me and I'd do it again.
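For context, here's roughly what that part of my models/index.js looks like with both connections attached to the db object. This is a trimmed-down sketch of the file sequelize-cli generates (model loading omitted), and the REDDEX_DB variable name for the main connection is a guess; use whatever your main connection string is called:

const { Sequelize } = require("sequelize");

const db = {};

// main connection: the new database (env var name is an assumption)
const sequelize = new Sequelize(process.env.REDDEX_DB);

// second connection: the temporary replica holding the old data
const sequelize2 = new Sequelize(process.env.OLD_REDDEX_DB);

db.sequelize = sequelize;
db.sequelize2 = sequelize2;
db.Sequelize = Sequelize;

module.exports = db;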

Creating the query function

I created a function to query each table I wanted to include in the migration. All it did was fetch the data from the replica database and return it.

./migrate.js

// adjust this path to wherever your generated models/index.js lives
const db = require("./models");
const { QueryTypes } = require("sequelize");

const getUsers = async () => {
  // sql statement to return all users from the replica database
  const sql = `SELECT email,uuid,password,initial_message,repeat_message,website_id,youtube_id FROM users`;

  // run the raw query against the replica connection and return the rows
  const users = await db.sequelize2.query(sql, { type: QueryTypes.SELECT });

  return users;
};

As you can see, this queries the data held in my users table and returns it to me. However, we now need to mutate that data to fit our new tables and overall database structure.

Mutating the data

Next up, I created a function to receive the data returned by the function above and mutate it to fit the new table structure.

./migrate.js

const migrateUsers = async () => {
  // grab the old users from the replica database
  const users = await getUsers();

  for (let i = 0; i < users.length; i++) {
    const user = users[i];
    const {
      email,
      uuid,
      password,
      initial_message,
      repeat_message,
      website_id,
    } = user;

    // create the user in the new database
    const newUser = await db.User.create({
      uuid,
      email,
      password,
      email_confirmed: true,
    });

    // the old message columns now live on the user's profile
    await newUser.createProfile({
      greeting: initial_message,
      recurring: repeat_message,
      words_per_minute: 0,
    });

    if (website_id) {
      const website = await db.Website.create({
        uuid: website_id,
        userId: newUser.uuid,
      });

      newUser.websiteId = website.uuid;

      await newUser.save();
    }
  }
};

Now I can use Sequelize to do what I need in order to properly mutate that data and make it fit my new database. The only things that changed were the column names in some tables and general cleanup of unused or improperly named columns. I find that leaving tech debt like this (improperly named columns and a poor layout) can create issues down the road when you try to work with that data. In the new database, I made sure to take care of that from the start.

Executing the functions

At the end of the file, I created a general main() function that executes the functions above, in order. Any functions that query and return data are called within their respective mutation functions. I found this worked really well for me because if one function failed, it didn't mess with anything outside its functional scope. Breaking the process down into smaller functions is just good practice in general, anyway.

./migrate.js

const main = async () => {
  // await migrateUsers();
  // await migrateWebsites();
  // await migrateTags();
  // await migrateSubmittedStories();
  // await migrateRecentlySearched();
  // await migrateContacts();
  // await migrateContacted();
  await migrateStories();
};

main();

I had multiple functions targeting and mutating multiple different columns. You can just add whatever you need in order to complete the process; a sketch of what another one might look like follows below.
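To give you an idea, another query/mutation pair might look something like this. It's a hypothetical sketch rather than my exact code; the tags table and its columns (name, user_id) are assumptions to illustrate the pattern:

const getTags = async () => {
  // raw query against the replica, same pattern as getUsers
  const sql = `SELECT uuid,name,user_id FROM tags`;

  return db.sequelize2.query(sql, { type: QueryTypes.SELECT });
};

const migrateTags = async () => {
  const tags = await getTags();

  for (const tag of tags) {
    // rename user_id to userId to match the new schema
    await db.Tag.create({
      uuid: tag.uuid,
      name: tag.name,
      userId: tag.user_id,
    });
  }
};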

Conclusion

Now that we have our data and we've mutated it to suit our needs, we're good to go.

What I did was enter my Docker container that contained both databases, then run my migrate.js file (or whatever you named it) and let it do the rest.
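If your setup looks anything like mine, that last step is roughly this; the app service name is a placeholder for whichever container has Node and your script:

# open a shell in the container that can reach both databases
docker-compose exec app sh

# inside the container, run the migration
node migrate.js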

I hope this has helped give you an overall view of how to do something in a bit of an unorthodox way (in my opinion). It didn't make a whole lot of sense to try and do it in a way I was unfamiliar with; instead, I just used the tools I was given.

Happy hacking!
