Add "AWS S3 Support" section to README.md for large Notion backup files #30

`README.md`: 98 additions, 0 deletions
You won't be able to backup files exceeding a size of 100MB unless you enable [G

```
*.jpeg filter=lfs diff=lfs merge=lfs -text
*.psd filter=lfs diff=lfs merge=lfs -text
```

## AWS S3 Support

Due to Git LFS storage constraints (`batch response: This repository is over its data quota. Account responsible for LFS bandwidth should purchase more data packs to restore access.`), large backup files can't be stored in LFS. To work around this, you can store the Notion export files in AWS S3 instead.
Before using this workflow, create an S3 bucket, an IAM user with a policy that allows `s3:PutObject` (see the "AWS S3 IAM Policy" section below), and an access key for that user. Then add `AWS_S3_BUCKET_NAME`, `AWS_ACCESS_KEY_ID`, `AWS_SECRET_ACCESS_KEY`, and `AWS_DEFAULT_REGION` to the repository's GitHub Actions secrets.
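The one-time setup above can be done from the command line as well. This is a sketch, not part of the workflow: the bucket name, IAM user name, and region below are placeholders you should replace, and it assumes AWS CLI v2 and the GitHub CLI (`gh`) are installed and authenticated:

```shell
# Placeholder names -- substitute your own.
BUCKET=my-notion-backups
REGION=us-east-1

# 1. Create the bucket that will hold the Notion export zips.
#    (Outside us-east-1, also pass:
#     --create-bucket-configuration LocationConstraint=$REGION)
aws s3api create-bucket --bucket "$BUCKET" --region "$REGION"

# 2. Create a dedicated IAM user and an access key for it.
aws iam create-user --user-name notion-backup
aws iam create-access-key --user-name notion-backup
# Note the AccessKeyId / SecretAccessKey in the output above.

# 3. Store the values as GitHub Actions secrets for this repository.
gh secret set AWS_S3_BUCKET_NAME --body "$BUCKET"
gh secret set AWS_DEFAULT_REGION --body "$REGION"
gh secret set AWS_ACCESS_KEY_ID      # prompts for the value
gh secret set AWS_SECRET_ACCESS_KEY  # prompts for the value
```

Creating a dedicated IAM user keeps the workflow's credentials limited to the one bucket, so a leaked secret can't touch anything else in the account.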

### Workflow file
```yaml
name: "Notion backup"

on:
  push:
    branches:
      - master
  schedule:
    - cron: "0 */4 * * *"

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

env:
  AWS_S3_BUCKET_NAME: ${{ secrets.AWS_S3_BUCKET_NAME }}
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: ${{ secrets.AWS_DEFAULT_REGION }}

jobs:
  backup:
    runs-on: ubuntu-latest
    name: Backup
    timeout-minutes: 120
    steps:
      - uses: actions/checkout@v3

      - uses: actions/setup-node@v2
        with:
          node-version: '18'

      - name: Delete previous backup
        run: rm -rf markdown html *.zip

      - name: Setup dependencies
        run: npm install -g notion-backup

      - name: Install AWS CLI
        id: install-aws-cli
        uses: unfor19/install-aws-cli-action@master
        with:
          version: 2

      - name: Run backup
        run: notion-backup
        env:
          NOTION_TOKEN: ${{ secrets.NOTION_TOKEN }}
          NOTION_FILE_TOKEN: ${{ secrets.NOTION_FILE_TOKEN }}
          NOTION_SPACE_ID: ${{ secrets.NOTION_SPACE_ID }}
          NODE_OPTIONS: "--max-http-header-size 15000"

      - name: Upload to S3
        if: "${{ env.AWS_ACCESS_KEY_ID != '' }}"
        run: |
          aws s3 cp . s3://$AWS_S3_BUCKET_NAME/ --recursive --exclude "*" --include "*.zip"

      - name: Delete zips
        if: "${{ env.AWS_ACCESS_KEY_ID == '' }}"
        run: |
          rm -f *.zip
          rm -f markdown/*-Part*.zip
          rm -f html/*-Part*.zip

      - name: Commit changes
        if: "${{ env.AWS_ACCESS_KEY_ID == '' }}"
        run: |
          git config user.name github-actions
          git config user.email [email protected]
          git add .
          git commit -m "Automated snapshot"
          git push
```
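Once the workflow has run, you can check and retrieve the uploaded archives with the AWS CLI. The commands below are a sketch assuming your shell has the same `AWS_S3_BUCKET_NAME` (and AWS credentials) set; the archive file name in the last command is only an example:

```shell
# List what the workflow has uploaded so far.
aws s3 ls "s3://$AWS_S3_BUCKET_NAME/" --human-readable --summarize

# Preview exactly which local files the "Upload to S3" step would copy,
# without transferring anything: --exclude "*" drops everything, then
# --include "*.zip" re-adds only the zip archives.
aws s3 cp . "s3://$AWS_S3_BUCKET_NAME/" --recursive \
  --exclude "*" --include "*.zip" --dryrun

# Download one backup archive (file name is an example).
aws s3 cp "s3://$AWS_S3_BUCKET_NAME/markdown.zip" ./markdown.zip
```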
### AWS S3 IAM Policy
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": [
                "arn:aws:s3:::<Your S3 Bucket Name>",
                "arn:aws:s3:::<Your S3 Bucket Name>/*"
            ]
        }
    ]
}
```
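If you prefer not to use the AWS console, the policy above can be rendered for a concrete bucket and attached as an inline policy from the command line. This is a sketch: the bucket name is a placeholder, and it assumes the IAM user is called `notion-backup`:

```shell
# Placeholder bucket name -- substitute your own.
BUCKET=my-notion-backups

# Render the policy for the concrete bucket.
cat > notion-backup-policy.json <<EOF
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": [
                "arn:aws:s3:::${BUCKET}",
                "arn:aws:s3:::${BUCKET}/*"
            ]
        }
    ]
}
EOF

# Attach it as an inline policy on the backup user
# (requires IAM permissions; skip if you attach it in the console):
# aws iam put-user-policy --user-name notion-backup \
#   --policy-name notion-backup-s3-put \
#   --policy-document file://notion-backup-policy.json
```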