
Odoo Backups to Amazon S3: Automated Offsite Backup Guide

DeployMonkey Team · March 22, 2026 13 min read

Why S3 for Odoo Backups?

Local backups fail when the server fails. S3 offers 99.999999999% (11 nines) durability, redundant storage across multiple Availability Zones, versioning, lifecycle management (auto-expire old backups), and server-side encryption. At roughly $0.023/GB/month for S3 Standard, it costs pennies to protect your business data.

Setup Steps

Step 1: Create S3 Bucket

# AWS CLI
aws s3 mb s3://company-odoo-backups --region us-east-1

# Enable versioning (protects against accidental overwrites)
aws s3api put-bucket-versioning \
    --bucket company-odoo-backups \
    --versioning-configuration Status=Enabled

# Enable encryption
aws s3api put-bucket-encryption \
    --bucket company-odoo-backups \
    --server-side-encryption-configuration '{
        "Rules": [{"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]
    }'
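
Since backups contain sensitive business data, it is also worth shutting off every form of public access on the bucket. A minimal sketch using the standard `put-public-access-block` call (same bucket name as above):

```shell
# Block all public ACLs and public bucket policies on the backup bucket
aws s3api put-public-access-block \
    --bucket company-odoo-backups \
    --public-access-block-configuration \
    BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true
```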

Step 2: Create IAM User

# Create a dedicated IAM user for backups (principle of least privilege)
aws iam create-user --user-name odoo-backup

# Attach policy — ONLY S3 access to the backup bucket
aws iam put-user-policy --user-name odoo-backup \
    --policy-name OdooBackupPolicy \
    --policy-document '{
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject", "s3:ListBucket", "s3:DeleteObject"],
            "Resource": [
                "arn:aws:s3:::company-odoo-backups",
                "arn:aws:s3:::company-odoo-backups/*"
            ]
        }]
    }'

# Create access key
aws iam create-access-key --user-name odoo-backup

Step 3: Install AWS CLI on Odoo Server

# Install AWS CLI
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o awscliv2.zip
unzip awscliv2.zip && sudo ./aws/install

# Configure credentials (for the odoo user)
sudo -u odoo aws configure
# Enter: Access Key ID, Secret Key, Region, Output format (json)
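
Before wiring up the backup script, a quick sanity check that the CLI can actually authenticate; on success, `sts get-caller-identity` prints the account ID and the `odoo-backup` user ARN:

```shell
# Should return the AWS account and arn:aws:iam::<account>:user/odoo-backup
sudo -u odoo aws sts get-caller-identity
```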

Step 4: Backup Script

#!/bin/bash
# /opt/scripts/odoo-backup-s3.sh
set -euo pipefail

DB_NAME="odoo"
DB_USER="odoo"
BUCKET="s3://company-odoo-backups"
DATE=$(date +%Y%m%d_%H%M%S)
DAY_OF_WEEK=$(date +%u)  # 1=Monday, 7=Sunday
TMP_DIR="/tmp/odoo-backup-${DATE}"

mkdir -p "$TMP_DIR"

echo "[$(date)] Starting Odoo backup..."

# Database backup (custom format for parallel restore)
pg_dump -U "$DB_USER" -Fc "$DB_NAME" > "${TMP_DIR}/db.dump"
echo "[$(date)] Database dump complete: $(du -h "${TMP_DIR}/db.dump" | cut -f1)"

# Filestore backup
FILESTORE="/var/lib/odoo/.local/share/Odoo/filestore/${DB_NAME}"
if [ -d "$FILESTORE" ]; then
    tar -czf "${TMP_DIR}/filestore.tar.gz" -C "$FILESTORE" .
    echo "[$(date)] Filestore archive complete: $(du -h "${TMP_DIR}/filestore.tar.gz" | cut -f1)"
fi

# Upload to S3
# Daily backups go to daily/ prefix
# Sunday backups also go to weekly/ prefix
aws s3 cp "${TMP_DIR}/db.dump" "${BUCKET}/daily/${DATE}_db.dump"
aws s3 cp "${TMP_DIR}/filestore.tar.gz" "${BUCKET}/daily/${DATE}_filestore.tar.gz" 2>/dev/null || true

if [ "$DAY_OF_WEEK" = "7" ]; then
    aws s3 cp "${TMP_DIR}/db.dump" "${BUCKET}/weekly/${DATE}_db.dump"
    aws s3 cp "${TMP_DIR}/filestore.tar.gz" "${BUCKET}/weekly/${DATE}_filestore.tar.gz" 2>/dev/null || true
    echo "[$(date)] Weekly backup copied"
fi

# Cleanup temp files
rm -rf "$TMP_DIR"

echo "[$(date)] Backup complete and uploaded to S3"
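
One optional hardening step, not part of the script above: verify the dump is a readable archive before uploading it. `pg_restore --list` parses the archive's table of contents without touching any database, so a helper like this (a sketch; the function name is ours) can run right after the `pg_dump` line:

```shell
# verify_dump FILE: succeed only if FILE is a readable pg_dump custom-format archive.
# Intended to run right after the pg_dump line in the backup script.
verify_dump() {
    pg_restore --list "$1" > /dev/null 2>&1
}

# In the backup script (TMP_DIR as defined there):
# verify_dump "${TMP_DIR}/db.dump" || { echo "Dump verification failed" >&2; exit 1; }
```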

Step 5: Schedule with Cron

# Run daily at 3 AM
sudo crontab -u odoo -e
# Add:
0 3 * * * /opt/scripts/odoo-backup-s3.sh >> /var/log/odoo-backup.log 2>&1
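
If a backup ever runs longer than 24 hours (large filestore, slow uplink), cron would start a second copy on top of it. A common guard is to wrap the cron command in util-linux `flock`, so overlapping runs are skipped rather than stacked; a small sketch (the function name is ours):

```shell
# run_locked LOCKFILE CMD...: run CMD under an exclusive lock, or fail immediately
# (-n = non-blocking) if another instance already holds LOCKFILE.
run_locked() {
    local lockfile=$1; shift
    flock -n "$lockfile" "$@"
}
```

In the crontab this becomes: `0 3 * * * flock -n /var/lock/odoo-backup.lock /opt/scripts/odoo-backup-s3.sh >> /var/log/odoo-backup.log 2>&1`.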

S3 Lifecycle Rules

# Auto-delete old backups to control costs:
aws s3api put-bucket-lifecycle-configuration \
    --bucket company-odoo-backups \
    --lifecycle-configuration '{
        "Rules": [
            {
                "ID": "Delete daily after 30 days",
                "Filter": {"Prefix": "daily/"},
                "Status": "Enabled",
                "Expiration": {"Days": 30}
            },
            {
                "ID": "Delete weekly after 90 days",
                "Filter": {"Prefix": "weekly/"},
                "Status": "Enabled",
                "Expiration": {"Days": 90}
            },
            {
                "ID": "Move to Glacier after 60 days",
                "Filter": {"Prefix": "weekly/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 60, "StorageClass": "GLACIER"}]
            }
        ]
    }'

Restore from S3

# List available backups
aws s3 ls s3://company-odoo-backups/daily/ --human-readable | tail -10

# Download backup
aws s3 cp s3://company-odoo-backups/daily/20260322_030000_db.dump /tmp/restore.dump

# Stop Odoo
systemctl stop odoo

# Restore database
dropdb -U odoo odoo
createdb -U odoo odoo
pg_restore -U odoo -d odoo /tmp/restore.dump

# Restore filestore (if applicable)
aws s3 cp s3://company-odoo-backups/daily/20260322_030000_filestore.tar.gz /tmp/filestore.tar.gz
mkdir -p /var/lib/odoo/.local/share/Odoo/filestore/odoo/
tar -xzf /tmp/filestore.tar.gz -C /var/lib/odoo/.local/share/Odoo/filestore/odoo/
chown -R odoo:odoo /var/lib/odoo/

# Start Odoo
systemctl start odoo
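
A quick smoke test after the restart confirms the restore actually worked (assuming the default `odoo` role and database names used above): a healthy database reports a nonzero count of installed modules.

```shell
# Count installed Odoo modules; zero or an error means the restore failed
sudo -u odoo psql -d odoo -tAc \
    "SELECT count(*) FROM ir_module_module WHERE state = 'installed';"
```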

Cost Estimate

| Database Size | Daily (30 days retained) | Weekly (90 days retained) | Monthly Cost |
|---------------|--------------------------|---------------------------|--------------|
| 1 GB          | 30 GB                    | 13 GB                     | ~$1.00       |
| 5 GB          | 150 GB                   | 65 GB                     | ~$5.00       |
| 20 GB         | 600 GB                   | 260 GB                    | ~$20.00      |
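
The figures follow from the retention policy: 30 daily copies plus roughly 13 weekly copies, at S3 Standard's ~$0.023/GB-month (the Glacier transition at day 60 would lower the weekly portion further). A quick sketch of the arithmetic:

```shell
# Rough monthly S3 Standard cost for the retention scheme above:
# 30 daily copies + ~13 weekly copies, at $0.023 per GB-month.
estimate_monthly_cost() {
    awk -v gb="$1" 'BEGIN { printf "%.2f\n", gb * (30 + 13) * 0.023 }'
}

estimate_monthly_cost 1    # → 0.99
estimate_monthly_cost 20   # → 19.78
```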

Monitoring

# Check if backup ran successfully:
# 1. Check cron log
tail -5 /var/log/odoo-backup.log

# 2. Check S3 for today's backup
aws s3 ls s3://company-odoo-backups/daily/ | tail -3

# 3. Set up CloudWatch alarm for missing backups
# Alert if no new object in daily/ prefix for 26 hours
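
The CloudWatch route is the most robust, but a cron-side freshness check works too. The sketch below implements only the timestamp comparison; feeding it the newest key from `aws s3 ls` is shown in the comments. It assumes GNU date and the `YYYYmmdd_HHMMSS` prefix the backup script puts on each key.

```shell
# is_stale TIMESTAMP: succeed if TIMESTAMP (YYYYmmdd_HHMMSS, as produced by the
# backup script) is more than 26 hours old. Requires GNU date.
is_stale() {
    local ts=$1
    local then_epoch now_epoch
    then_epoch=$(date -d "${ts:0:8} ${ts:9:2}:${ts:11:2}:${ts:13:2}" +%s)
    now_epoch=$(date +%s)
    [ $(( now_epoch - then_epoch )) -gt $(( 26 * 3600 )) ]
}

# Example wiring (newest key in the daily/ prefix):
# latest=$(aws s3 ls s3://company-odoo-backups/daily/ | awk '{print $4}' | sort | tail -n1)
# is_stale "${latest:0:15}" && echo "ALERT: backups are stale"
```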

DeployMonkey

DeployMonkey includes automated S3 backups for all Odoo instances — daily database + filestore, lifecycle management, encryption, and one-click restore. Zero configuration needed.