If you've read my last blog post, then you know I wanted to leave Notion for good.
The problem is that the other solutions just aren't quite there yet for me. I really tried, but I'm probably just too accustomed to Notion and its features.
One of the issues I described in that post was Notion's performance, or rather the lack of it. It feels like the team behind Notion made some backend improvements in the last few days (also a database upgrade), because it definitely feels snappier now. Searching is faster, databases load quicker, and you can actually use templates without waiting forever for them to be set up in the page you're trying to create.
But this post is not really about this, so let's not get into too much detail and focus on our topic.
In Notion's settings there is an "Export all workspace content" button. I did those exports regularly for backup purposes, but depending on the size of your workspace it can take a while, plus it's a pretty manual process that you constantly have to remind yourself about.
And I'm the type of guy that tries to automate everything away so I can focus on the important things in my life (plus it's fun to do for an engineer, let's be honest).
For this reason I have written a small open source command line application in Go that can do just that: back up your Notion workspace or just specific parts of it - automatically. And thanks to Go it's platform independent, so you can even run it on macOS or Windows (there are pre-built binaries in the repository, if you don't want to build it yourself).
You can find it on GitHub at 5hay/notionbackup: a small utility command line application that can recursively download Notion pages.
The easiest way to run is by downloading a pre-built binary, but since it's open source you can of course build it yourself.
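If you'd rather build it yourself, something like the following should do the trick, assuming you have a recent Go toolchain installed (the exact steps may differ slightly from what the repository's README says):

```bash
# Clone the repository and build the binary with the standard Go tooling
git clone https://github.com/5hay/notionbackup.git
cd notionbackup
go build   # produces a notionbackup binary in the current directory
```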
Then you have to set at least 2 environment variables and you're good to go.
The necessary env vars are:

- `NOTION_TOKEN`
  - The `token_v2` cookie for Notion that is used for authentication
  - You should find enough search results if you just google "Notion token_v2"
  - Here's a quick explanation on how to retrieve that token in case you can't find anything: https://www.redgregory.com/notion/2020/6/15/9zuzav95gwzwewdu1dspweqbv481s5
- `NOTION_PAGEID`
  - This is the Notion page ID you want to set up for the automatic backup
  - On every page you can hit the 3 dots in the corner and click "Copy link"
  - Alternatively, these 3 dots also appear in the sidebar for every page when you hover over them
  - The page ID can look something like this: https://www.notion.so/username/PageTitle-3514g811b36849b3zi8322fdac38287f
  - You can set this env variable to either the whole URL or just the ID part at the end after the last `-`
On Linux-based systems you can put them into your `.*rc` files (`.bashrc`, `.zshrc`, ...), so they are always available.
export NOTION_TOKEN=someLongAlphanumericValue
export NOTION_PAGEID="https://www.notion.so/username/......"
After that, `source` your `.*rc` file and you're already set up.
source ~/.bashrc
There are two other environment variables you can set for this application, but they have defaults and don't necessarily need to be set (check the GitHub repository if you want to set them anyway).
Now you can execute the binary by calling `./notionbackup` (assuming you have already `cd`'d into that folder).
That's pretty much it. Once the application has finished downloading your backup (it's gonna be a .zip file) you can do whatever you want with it, e.g. integrate it into your backup solution and forget about it.
You could also automatically unzip it after every export so your backup solution can deduplicate the files better. There are endless possibilities. The important thing is just to have a backup in case something happens.
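As a rough sketch, a wrapper script for that could look like this (the zip filename here is just a placeholder, use whatever name the tool actually writes):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Run the export, then unpack it so the backup solution sees individual files.
# "notion-export.zip" is a placeholder; adjust it to the file notionbackup produces.
./notionbackup
unzip -o notion-export.zip -d notion-export/
```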
I keep it pretty simple.
I have a cron job set up that runs once per week, backs up my workspace, and then pushes the file into my encrypted Google Drive folder with rclone (a great tool if you're not familiar with it yet). The contents of my encrypted Google Drive folder are also replicated into other backup systems, but that's irrelevant for this post.
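In case you want to set up something similar, a crontab entry along these lines would work. Keep in mind that cron doesn't read your `.bashrc`, so the two environment variables have to be set in the crontab itself or in a wrapper script; the paths, the zip filename and the rclone remote name below are examples, not my actual setup:

```bash
# Weekly Notion backup, Mondays at 03:00
0 3 * * 1  cd /home/user/notionbackup && NOTION_TOKEN=xxx NOTION_PAGEID=xxx ./notionbackup && rclone copy notion-export.zip gdrive-crypt:notion-backups/
```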
I hope you found this blog post and the tool useful. If there's only one thing to learn from this post, it's to always have backups for the worst case scenario.
If you want to follow me or just chat, you can find me on Twitter: @5hay