Manual backups are backups waiting to be forgotten. Automating your Odoo backups to S3-compatible object storage gives you durable, off-site copies with virtually unlimited retention — and it takes less than 30 minutes to set up.
## Why S3-Compatible Storage?
Object storage like Amazon S3, Backblaze B2, Cloudflare R2, or Hetzner Object Storage offers several advantages over storing backups on the same server:
- Backups survive server failure, accidental deletion, or ransomware
- Lifecycle policies automatically expire old backups — no manual cleanup
- Costs pennies per GB per month at rest
- The AWS CLI works with all S3-compatible providers via `--endpoint-url`
## Prerequisites
- A Linux server running Odoo (any version 14–19)
- An S3-compatible bucket created and access credentials ready
- AWS CLI v2 installed (or compatible CLI for your provider)
- The `postgresql-client` package installed for `pg_dump`
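On Debian/Ubuntu (an assumption; adjust the package manager for your distribution), the client tools install with:

```shell
sudo apt-get update
sudo apt-get install -y postgresql-client unzip curl
```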
## Step 1: Install and Configure the AWS CLI
```bash
# Install AWS CLI v2
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o /tmp/awscliv2.zip
unzip /tmp/awscliv2.zip -d /tmp
sudo /tmp/aws/install

# Configure credentials
aws configure
```
When prompted, enter your Access Key ID, Secret Access Key, default region, and output format (json).
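Before wiring up any scripts, it is worth confirming the credentials work. On AWS proper, both commands below should succeed (the bucket name is a placeholder; `aws sts get-caller-identity` does not apply to non-AWS providers):

```shell
# Should print your account ID and ARN if the keys are valid
aws sts get-caller-identity
# Should list the bucket contents, or at least not fail with AccessDenied
aws s3 ls s3://my-odoo-backups/
```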
For non-AWS providers, configure a named profile; the endpoint URL itself is supplied per-command with `--endpoint-url`:
```ini
# ~/.aws/credentials
[backblaze]
aws_access_key_id = YOUR_KEY_ID
aws_secret_access_key = YOUR_APPLICATION_KEY

# ~/.aws/config
[profile backblaze]
region = us-west-002
```
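Commands then select the profile explicitly and pass the endpoint on the command line (the bucket name below is a placeholder):

```shell
aws s3 ls s3://my-backblaze-bucket/ \
    --profile backblaze \
    --endpoint-url https://s3.us-west-002.backblazeb2.com
```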
## Step 2: Create the Backup Script
Save this as `/opt/odoo-backup/backup.sh` and make it executable with `chmod +x`:
```bash
#!/bin/bash
set -euo pipefail

# --- Configuration ---
DB_NAME="myodoo"
FILESTORE_PATH="/var/lib/odoo/.local/share/Odoo/filestore/${DB_NAME}"
BACKUP_DIR="/tmp/odoo_backup_staging"
S3_BUCKET="s3://my-odoo-backups"
# Default prefix, overridable from the environment (used by the weekly cron entry)
S3_PREFIX="${S3_PREFIX:-production}"
# For non-AWS: set e.g. https://s3.us-west-002.backblazeb2.com
S3_ENDPOINT=""
# Retention is handled by S3 lifecycle rules (Step 4), not by this script.

TIMESTAMP=$(date +%Y%m%d_%H%M%S)
BACKUP_NAME="odoo_${DB_NAME}_${TIMESTAMP}"

# --- Prepare staging directory ---
mkdir -p "${BACKUP_DIR}/${BACKUP_NAME}"

# --- Dump database ---
# Run pg_dump as the postgres OS user so peer authentication succeeds;
# redirect stdout so the dump file is written by the invoking user.
echo "[$(date)] Dumping database..."
sudo -u postgres pg_dump -Fc "${DB_NAME}" \
    > "${BACKUP_DIR}/${BACKUP_NAME}/db.dump"

# --- Archive filestore ---
echo "[$(date)] Archiving filestore..."
tar -czf "${BACKUP_DIR}/${BACKUP_NAME}/filestore.tar.gz" \
    -C "$(dirname "${FILESTORE_PATH}")" \
    "$(basename "${FILESTORE_PATH}")"

# --- Create final archive ---
echo "[$(date)] Creating final archive..."
tar -czf "/tmp/${BACKUP_NAME}.tar.gz" \
    -C "${BACKUP_DIR}" \
    "${BACKUP_NAME}"

# --- Upload to S3 ---
echo "[$(date)] Uploading to S3..."
ENDPOINT_ARGS=()
if [ -n "${S3_ENDPOINT}" ]; then
    ENDPOINT_ARGS=(--endpoint-url "${S3_ENDPOINT}")
fi
aws s3 cp "/tmp/${BACKUP_NAME}.tar.gz" \
    "${S3_BUCKET}/${S3_PREFIX}/${BACKUP_NAME}.tar.gz" \
    ${ENDPOINT_ARGS[@]+"${ENDPOINT_ARGS[@]}"}

# --- Cleanup local staging ---
rm -rf "${BACKUP_DIR}/${BACKUP_NAME}" "/tmp/${BACKUP_NAME}.tar.gz"

echo "[$(date)] Backup complete: ${S3_BUCKET}/${S3_PREFIX}/${BACKUP_NAME}.tar.gz"
```
## Step 3: Schedule with Cron
Edit root's crontab (the script calls `sudo -u postgres`, so it must run with root privileges):
```bash
sudo crontab -e
```
Add a daily backup at 2 AM and a weekly backup on Sundays at 3 AM:
```bash
# Daily backup at 02:00
0 2 * * * /opt/odoo-backup/backup.sh >> /var/log/odoo-backup.log 2>&1
# Weekly full backup on Sunday at 03:00 (same script, different S3 prefix via env var)
0 3 * * 0 S3_PREFIX=weekly /opt/odoo-backup/backup.sh >> /var/log/odoo-backup.log 2>&1
```
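Both entries append to the same log file indefinitely. A small logrotate snippet keeps it bounded (the file path and rotation policy here are assumptions; tune them to taste):

```
# /etc/logrotate.d/odoo-backup
/var/log/odoo-backup.log {
    weekly
    rotate 8
    compress
    missingok
    notifempty
}
```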
## Step 4: S3 Lifecycle Policies
Set lifecycle rules to automatically expire old backups. In the AWS console, go to your bucket > Management > Lifecycle rules:
- Daily backups (`production/` prefix): expire after 30 days
- Weekly backups (`weekly/` prefix): expire after 90 days
- Monthly backups (`monthly/` prefix): expire after 365 days
Via AWS CLI:
```bash
aws s3api put-bucket-lifecycle-configuration \
  --bucket my-odoo-backups \
  --lifecycle-configuration '{
    "Rules": [
      {
        "ID": "expire-daily",
        "Filter": {"Prefix": "production/"},
        "Status": "Enabled",
        "Expiration": {"Days": 30}
      },
      {
        "ID": "expire-weekly",
        "Filter": {"Prefix": "weekly/"},
        "Status": "Enabled",
        "Expiration": {"Days": 90}
      }
    ]
  }'
```
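To confirm the rules took effect, you can read the configuration back:

```shell
aws s3api get-bucket-lifecycle-configuration --bucket my-odoo-backups
```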
## Step 5: Test the Backup and Restore
A backup you have never restored is a backup you cannot trust. After setting up automation:
- Run the script manually: `sudo /opt/odoo-backup/backup.sh`
- Confirm the file appears in S3: `aws s3 ls s3://my-odoo-backups/production/`
- Download and perform a test restore on a staging server
- Verify Odoo starts and data is intact
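A rough restore sequence on a staging box might look like the sketch below; the archive name, database name, and filestore path are assumptions carried over from the backup script:

```shell
# Hypothetical example archive from the daily prefix
BACKUP="odoo_myodoo_20250101_020000"

# 1. Download and unpack
aws s3 cp "s3://my-odoo-backups/production/${BACKUP}.tar.gz" /tmp/
tar -xzf "/tmp/${BACKUP}.tar.gz" -C /tmp

# 2. Restore the custom-format dump (--no-owner since ownership
#    is usually reassigned to the local odoo role anyway)
sudo -u postgres createdb myodoo
sudo -u postgres pg_restore --no-owner -d myodoo "/tmp/${BACKUP}/db.dump"

# 3. Restore the filestore; the tarball contains a directory named after the DB
sudo tar -xzf "/tmp/${BACKUP}/filestore.tar.gz" \
    -C /var/lib/odoo/.local/share/Odoo/filestore/
sudo chown -R odoo:odoo /var/lib/odoo/.local/share/Odoo/filestore/myodoo
```

Then start Odoo against the restored database and click through a few records to confirm attachments load.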
## S3-Compatible Provider Notes
- Backblaze B2: Use `--endpoint-url https://s3.us-west-002.backblazeb2.com` and set the region to match your bucket region
- Cloudflare R2: Use `--endpoint-url https://<ACCOUNT_ID>.r2.cloudflarestorage.com`; R2 has no egress fees
- Hetzner Object Storage: Use `--endpoint-url https://<location>.your-objectstorage.com`
## How DeployMonkey Automates This for You
DeployMonkey includes built-in S3-compatible backup automation for every managed Odoo instance. Configure your bucket credentials once in the control panel, set your schedule and retention policy, and DeployMonkey handles everything — including backup health monitoring and one-click restores. No scripts, no cron jobs, no surprises. Get started free.
## Frequently Asked Questions
### Does the backup script work with Cloudflare R2?
Yes. Set `S3_ENDPOINT="https://<ACCOUNT_ID>.r2.cloudflarestorage.com"` in the script and configure your R2 API token as the AWS credentials. R2 is S3-compatible and has no egress fees, making it cost-effective for frequent restores.
### How large will Odoo backups be?
It depends on your data volume and attachment count. A typical small business Odoo instance is 500 MB–2 GB per backup after compression. The filestore (attachments, PDFs) usually accounts for 60–80% of the size.
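To see what your own backups actually weigh, `aws s3 ls` can summarize a prefix (bucket name assumed from the examples above):

```shell
aws s3 ls s3://my-odoo-backups/production/ --recursive --human-readable --summarize
```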
### Should I encrypt my S3 backups?
Yes, especially if your backups contain personal data subject to GDPR or similar regulations. Enable S3 server-side encryption (SSE-S3 or SSE-KMS) on the bucket, or encrypt locally before upload using `gpg --symmetric`.
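As a sketch of the local-encryption route (file paths and the passphrase below are placeholders; in `backup.sh` you would encrypt the archive just before the `aws s3 cp` step and keep the passphrase file root-only):

```shell
# Symmetric-encryption round trip with GnuPG
ARCHIVE="/tmp/odoo_demo.tar.gz"
PASSFILE="/tmp/backup-passphrase"   # in production: under /root, chmod 600

printf 'demo backup data' > "$ARCHIVE"              # stand-in for the real archive
printf 'correct horse battery staple' > "$PASSFILE"

# Encrypt (loopback pinentry lets gpg read the passphrase non-interactively)
gpg --batch --yes --pinentry-mode loopback --symmetric --cipher-algo AES256 \
    --passphrase-file "$PASSFILE" \
    --output "${ARCHIVE}.gpg" "$ARCHIVE"

# Decrypt (what you would do during a restore)
gpg --batch --yes --pinentry-mode loopback --decrypt \
    --passphrase-file "$PASSFILE" \
    --output "${ARCHIVE}.dec" "${ARCHIVE}.gpg"

cmp "$ARCHIVE" "${ARCHIVE}.dec" && echo "round-trip OK"
```

You would then upload `${BACKUP_NAME}.tar.gz.gpg` instead of the plain archive.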
### How do I get notified if a backup fails?
Add error notification to the script: wrap the main logic in a function and send an email or webhook on non-zero exit. Alternatively, use a cron monitoring service like Healthchecks.io — ping a URL on success, and get alerted if no ping arrives within the expected window.
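For the Healthchecks.io route, a single `curl` at the end of `backup.sh` is enough; the check URL below is a placeholder for your own check's UUID:

```shell
# Append to the end of backup.sh; with `set -e` above, this line is only
# reached when every earlier step succeeded.
curl -fsS -m 10 --retry 3 https://hc-ping.com/your-check-uuid > /dev/null
```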
### Can I back up multiple Odoo databases with one script?
Yes. Refactor the script to accept DB_NAME as a parameter and loop over a list of database names. Each database and its filestore will be backed up independently.
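A minimal sketch of that refactor, with `backup_one` as a stand-in for the dump/archive/upload steps from the script above (the database names are assumptions):

```shell
#!/bin/bash
# Loop over several databases; splice the real pg_dump/tar/aws steps
# into backup_one where the echo is.
set -euo pipefail

DATABASES=("myodoo" "myodoo_staging")   # assumed names

backup_one() {
    local db="$1"
    local ts
    ts=$(date +%Y%m%d_%H%M%S)
    local name="odoo_${db}_${ts}"
    # ... pg_dump "${db}", tar its filestore, upload "${name}.tar.gz" ...
    echo "would back up ${db} as ${name}.tar.gz"
}

for db in "${DATABASES[@]}"; do
    backup_one "$db"
done
```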