Set up Automatic Notion Backup to GitLab

Update (June 14, 2021): The GitLab pipeline has failed the last two times it ran, so it no longer seems to work reliably for me. It still works for some people, though, so you might want to give it a try, but there’s no guarantee.

I have 3,000+ pages and 100+ PDFs (and other files) uploaded to Notion.

And since an offline mode is not yet available, it’s very important to take regular backups of your Notion data.

But taking manual backups on a regular basis is tedious.

So I started searching for automated ways to back up an entire Notion account, and luckily, I found one.

Related: Why it’s important to learn Git as a developer

Automated Notion Backup

All thanks go to Artur Burtsev, who wrote an amazing article explaining how to automate the process.

Let’s proceed.

Since Notion doesn’t provide an official API yet, this method mimics Notion’s manual export behaviour and automatically pushes the backup file to a remote GitLab repository.

First of all, register a free account with GitLab (if you don’t have one already).

GitLab’s free account lets you store up to 10 GB, which is enough for most users. If you have more than 10 GB of data in Notion, you will need to upgrade your GitLab account; otherwise, this method won’t work for you.

#1. Fetch “token_v2” and “spaceId”

Be careful! This is the most complicated and tricky part of the entire process.

In this step, we will extract token_v2 and spaceId from the network request that Notion’s “Export all workspace content” feature makes.

To make it easy, here’s a step-by-step guide:

  1. Open Google Chrome and log in to your Notion account
  2. Navigate to Settings and Members > Settings
  3. Open the Chrome DevTools by pressing ctrl + shift + j (on Windows/Linux) or cmd + option + j (on macOS)
  4. Now proceed with the next instructions carefully
    1. Click on the Network tab
    2. Enable XHR filter
    3. Clear the console by clicking on the cancel icon
    4. Click on “Export all workspace content”, select your preferred Export format, and click on the Export button
    5. Select enqueueTask from the Name column
    6. Move to the Headers tab and scroll down until you see “cookie:”
    7. Copy the token_v2 value and save it in a text note as NOTION_TOKEN_V2
    8. Copy the spaceId value from the Request Payload section and save it in a text note as NOTION_SPACE_ID
Notion Automatic Backup Setup in Google Chrome DevTools
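For context, the enqueueTask request you’re inspecting carries the spaceId inside its JSON body, while token_v2 travels separately as a cookie. Here’s a Python sketch of roughly what that payload looks like. This is based on the unofficial, undocumented Notion v3 API, so field names are illustrative and may change; the spaceId below is a placeholder:

```python
import json

# Placeholder value — in practice this comes from step 8 above.
NOTION_SPACE_ID = "00000000-0000-0000-0000-000000000000"

# Illustrative shape of the unofficial enqueueTask request body,
# roughly as seen in the DevTools Request Payload section.
payload = {
    "task": {
        "eventName": "exportSpace",
        "request": {
            "spaceId": NOTION_SPACE_ID,  # the value you copy in step 8
            "exportOptions": {"exportType": "markdown", "timeZone": "UTC"},
        },
    },
}

# token_v2 is not part of the JSON body; it is sent as a cookie (step 7).
headers = {"Cookie": "token_v2=<your NOTION_TOKEN_V2>"}

print(json.dumps(payload, indent=2))
```

Knowing where each value lives makes it easier to spot them in DevTools: spaceId sits in the Request Payload pane, token_v2 in the cookie header.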

The first and most complicated step is done.

#2. Set up GitLab for the backups

After logging in to your GitLab account, go to Settings → Access Tokens to get an access token with proper rights.

Provide any name for the personal access token, check the write_repository option, and click on the Create personal access token button as shown in the screenshot.

GitLab Access Token for Automatic Notion Backup

Copy the value under “Your new personal access token” from the next screen and save it in a text note as CI_PUSH_TOKEN.

Now, create a new blank project, give it any name of your choice, and make sure the Private option is selected. Click on the Create project button as shown in the screenshot.

Create New Blank GitLab project

Your new project is created.

Now, set up the CI/CD script by clicking on the Set up CI/CD option, which will open an editor for the file .gitlab-ci.yml.

Go to this link, copy the whole script, paste it into the .gitlab-ci.yml file, and click Commit changes without modifying anything.

Commit GitLab CI-CD Script
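If you’re curious what the script you just committed is doing, a scheduled backup pipeline of this kind typically looks something like the skeleton below. This is a simplified, hypothetical sketch — the job name, Docker image, and helper script are placeholders, not the actual contents of the linked script — so commit the real script from the link, not this:

```yaml
# Hypothetical skeleton of a scheduled Notion backup pipeline — illustration only.
backup:
  image: python:3.9            # placeholder image
  only:
    - schedules                # run only when the pipeline schedule fires
  script:
    # 1) Trigger the Notion export and download the archive (hypothetical helper)
    - python notion_export.py
    # 2) Commit the exported files and push them back using CI_PUSH_TOKEN
    - git config user.email "backup-bot@example.com"
    - git config user.name "backup-bot"
    - git add backup/
    - git commit -m "Automatic Notion backup" || echo "nothing to commit"
    - git push "https://ci:${CI_PUSH_TOKEN}@${CI_SERVER_HOST}/${CI_PROJECT_PATH}.git" HEAD:master
```

The key idea is the last line: the job pushes the downloaded export back into the same repository, authenticating with the CI_PUSH_TOKEN variable you will set in the next step.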

Now, it’s time to schedule the backup job.

Click on the Schedules option under CI/CD in the left sidebar and create a New schedule.

Schedule Regular Notion Backup in GitLab

A new form will open; fill it in as shown in the screenshot.

Set Pipeline Schedule in GitLab CI-CD Schedules

Here’s what you need to put there:

  • Description: Any name of your choice
  • Interval Pattern: Every day, Every week, or Every month
  • Cron Timezone: Your timezone
  • Target Branch: master
  • Variables:
    • CI_PUSH_TOKEN – Your GitLab personal access token
    • EXPORT_FILENAME – /tmp/
    • NOTION_SPACE_ID – the spaceId that you copied in #1
    • NOTION_TOKEN_V2 – the token_v2 that you copied in #1
    • TZ – Pick your “TZ database name” from this Wikipedia page
  • At last, click on the Save pipeline schedule button
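One of these variables is easy to get wrong: TZ must be a valid TZ database name such as Asia/Kolkata, not an abbreviation like IST. If you have Python 3.9+ handy, you can sanity-check your value before saving the schedule (this is just a convenience check I’m suggesting, not part of the backup script):

```python
from zoneinfo import ZoneInfo, ZoneInfoNotFoundError  # Python 3.9+

def is_valid_tz(name: str) -> bool:
    """Return True if `name` is a valid TZ database name (e.g. 'Asia/Kolkata')."""
    try:
        ZoneInfo(name)
        return True
    except (ZoneInfoNotFoundError, ValueError):
        return False

print(is_valid_tz("Asia/Kolkata"))  # a proper TZ database name
print(is_valid_tz("Not/AZone"))     # not a real zone, so this is rejected
```

If the check returns False for your value, look it up again on the Wikipedia page linked above before saving the pipeline schedule.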

Now, under CI/CD > Schedules, the newly created pipeline will appear immediately.

Notion Automatic Backup Schedule in GitLab

Click on the play button to test whether the backup process is working correctly. It will take a few minutes for the green checkmark to appear in the Last Pipeline column; the completion time depends on the number of pages saved in your Notion account.

Once it turns green, your setup is successful, and your Notion data will be backed up automatically on a regular basis.

Note: If you would like to see the logs, click on the numbered link (e.g. #12345679) in the Last Pipeline column and then click on the running job.

After the task completes, a folder named “backup” will be created automatically, containing all your Notion files.

Notion Backup Folder in GitLab Project

If you want a local copy of all the files, you can clone the complete GitLab project repository to your computer by using the git clone command.

For that, click on the Clone button (top right corner, as shown in the screenshot above) and copy the Clone with HTTPS URL. Open up your Terminal (on macOS or Linux) or PowerShell (on Windows) and type the following command:

git clone <the HTTPS URL you copied>

Since the GitLab project we created is Private, git will ask for your GitLab login credentials; as soon as you provide them, your Notion files will start copying to your computer.

But again, there is little point in saving a local copy after this long process; you could have simply downloaded the export directly from Notion.

This process simply helps you keep your Notion data in more than one place by taking automated backups regularly.

That’s it.

Stuck somewhere? Just let me know by dropping a quick comment.




  1. Paul Filippakis says:

    Hey Deepak,
    I am getting the following error from Gitlab, on the Pipeline – Failed Jobs section.
    Enqueued task 1d39e781-f459-4e9e-b2c2-40cc15524679
    Traceback (most recent call last):
    File "./", line 61, in
    File "./", line 45, in export
    print(f'\rPages exported: {task.get("status").get("pagesExported")}', end="")
    AttributeError: 'NoneType' object has no attribute 'get'
    Cleaning up file based variables
    ERROR: Job failed: exit code 1
    * My Notion backup is about 580mb

    1. Hi Paul,

      My setup is running correctly but one day it had failed with a similar error. But, I re-run the schedule and everything worked fine after that. Here’s a screenshot of that.

      So, I would suggest you re-run the schedule manually.

      Hope that helps.

      Thank you.

  2. Hi Deepak,

    Same here for me, repeated errors. The script worked the first time I manually activated it, but since then, it hasn’t worked at all… If you have any leads, I would be glad! 🙂


    1. Hi Julien,

      This method is not a reliable method to backup your Notion. This just mimics the behaviour of downloading a local backup and it does fail sometimes.

      And, when it fails there’s nothing you can do other than running it manually, again.

      Hope that helps.

      Thank you.

      1. Hi Deepak,

        Thanks for your response. The problem is that even manually (in GitLab), the process seems to fail each time. I will try this alternative solution.

        Best regards,


        1. Let me give a try to the alternative too.

  3. Thanks Deepak for the article.

    I just configured it and it works like a charm.

    Just one thing I noted: The export contains my own private documents in Notion but no other private documents of other users.

    I think it’s important to know…

    Best regards

    1. Glad that it worked for you, Lukas.

      Yes, even if you have full access to the documents of the other users, you’re not the owner and it won’t export those.

      Thank you.

  4. Hi Deepak,

    Thank you for sharing this document.
    – Do you know how I can reduce the number of logs? I get this error each time:
    Job’s log exceeded limit of 4194304 bytes.
    Job execution will continue but no more output will be collected.
    and nothing happens after that.
    – Do you think we can adapt the script from GitHub to GitLab?

    Thank you

    1. Now, it’s not working for me either, Didi.

      I have no idea about that script, if it works then you can go with it.


  5. ConfusedCat says:

    I’m confused about what this article adds that the original Medium post doesn’t provide. Am I missing something, or are we just copying things over now?

    1. Hi,

      I tried the original solution and shared my experience of how it worked for me. I have explicitly mentioned the original author’s name and linked to the original article here.

      I haven’t used any screenshots from the original and no text is copied either.

      Still, I am sorry you felt that way.

      Thank you 🥂