Backing up an Odoo installation means backing up two things: the PostgreSQL database and the filestore directory. Miss either one and your backup is incomplete. This guide covers every method — from the built-in Odoo web interface to automated pg_dump scripts with S3 uploads — so you can choose the right approach for your setup.
What Needs to Be Backed Up
Odoo stores data in two places:
- PostgreSQL database: All your business records — customers, orders, invoices, inventory, users, settings. This is the primary data store.
- Filestore: Uploaded files — attachments, images, report PDFs, document scans. Located at /var/lib/odoo/filestore/<database_name>/ by default. This directory can grow large over time.
A database backup without the filestore is incomplete. Attachments will show as broken links after restore. Always back up both together.
The 3-2-1 Backup Rule
Follow the 3-2-1 rule for any production system:
- 3 copies of your data
- 2 different storage media or locations
- 1 offsite copy (e.g., S3, Backblaze B2, or another cloud provider)
For Odoo: keep one copy on the server, one on a separate server or NAS, and one in object storage like S3. If your VPS provider has an outage or deletes your data, the offsite copy saves you.
Method 1: Odoo Web Interface Backup
The easiest backup method for non-technical users. Navigate to https://your-odoo-instance.com/web/database/manager and click the download (backup) icon next to your database. Odoo will generate a ZIP file containing:
- A PostgreSQL dump (dump.sql)
- The filestore directory
- A manifest file
This ZIP can be used to restore the database through the same interface. Limitations: For large databases (over 1-2 GB), this method can time out or produce very large downloads. It is best suited for development instances or small databases. For production, use pg_dump directly.
Method 2: pg_dump (Recommended for Production)
pg_dump is PostgreSQL's native backup tool. It produces a consistent snapshot of the database at a point in time, even while Odoo is running (no downtime required).
Plain SQL format
pg_dump -U odoo -h localhost -d your_database_name > backup_$(date +%Y%m%d_%H%M%S).sql
Plain SQL is human-readable and can be restored with psql (for example, psql -U odoo -d new_database_name < backup.sql). It is slower to restore for large databases but can be inspected and edited.
Custom format (recommended)
pg_dump -U odoo -h localhost -Fc -d your_database_name > backup_$(date +%Y%m%d_%H%M%S).dump
Custom format (-Fc) is compressed, faster to restore, and supports parallel restore (for example, pg_restore -j 4 -U odoo -d your_database_name backup.dump). Use this for production databases.
With Docker
docker compose exec -T db pg_dump -U odoo -Fc odoo > backup_$(date +%Y%m%d_%H%M%S).dump
The -T flag disables TTY allocation; without it, terminal control characters can corrupt the binary dump.
Method 3: Backing Up the Filestore
After the database dump, back up the filestore directory. The location depends on your setup:
- Default: /var/lib/odoo/filestore/your_database_name/
- Docker named volume: accessible via docker run --rm -v odoo_filestore:/data alpine tar czf - /data > filestore.tar.gz
Create a compressed tar archive:
tar -czf filestore_$(date +%Y%m%d_%H%M%S).tar.gz /var/lib/odoo/filestore/your_database_name/
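Before relying on an archive, it is worth verifying that it can be read back. A minimal sketch (the paths here are illustrative stand-ins, not your real filestore):

```shell
# Create a throwaway directory standing in for the filestore, archive it,
# then list the archive's contents without extracting. A truncated or
# corrupt archive makes `tar -tzf` exit non-zero.
mkdir -p /tmp/filestore_demo
echo "attachment data" > /tmp/filestore_demo/attachment.bin
tar -czf /tmp/filestore_demo.tar.gz -C /tmp filestore_demo
tar -tzf /tmp/filestore_demo.tar.gz > /dev/null && echo "archive OK"
```

The same `tar -tzf` check can be added at the end of a backup script so a bad archive fails loudly instead of sitting unnoticed in storage.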
Automating Backups with Cron
Manual backups are only as good as the person who remembers to run them. Automate with a cron job and a shell script.
Backup script
#!/bin/bash
# /opt/scripts/backup_odoo.sh
set -euo pipefail  # stop on the first failed command so a broken backup is never silently uploaded
DB_NAME="your_database_name"
DB_USER="odoo"
BACKUP_DIR="/var/backups/odoo"
DATE=$(date +%Y%m%d_%H%M%S)
FILESTORE_PATH="/var/lib/odoo/filestore/${DB_NAME}"
S3_BUCKET="s3://your-bucket/odoo-backups"
RETENTION_DAYS=7
mkdir -p "$BACKUP_DIR"
# Database backup
pg_dump -U "$DB_USER" -Fc "$DB_NAME" > "$BACKUP_DIR/db_${DATE}.dump"
# Filestore backup
tar -czf "$BACKUP_DIR/filestore_${DATE}.tar.gz" -C "$(dirname "$FILESTORE_PATH")" "$(basename "$FILESTORE_PATH")"
# Upload to S3
aws s3 cp "$BACKUP_DIR/db_${DATE}.dump" "${S3_BUCKET}/db/"
aws s3 cp "$BACKUP_DIR/filestore_${DATE}.tar.gz" "${S3_BUCKET}/filestore/"
# Local retention cleanup
find "$BACKUP_DIR" -name "*.dump" -mtime +${RETENTION_DAYS} -delete
find "$BACKUP_DIR" -name "*.tar.gz" -mtime +${RETENTION_DAYS} -delete
echo "Backup completed: ${DATE}"
Make it executable and add to crontab:
chmod +x /opt/scripts/backup_odoo.sh
crontab -e
Add a line to run daily at 2 AM:
0 2 * * * /opt/scripts/backup_odoo.sh >> /var/log/odoo_backup.log 2>&1
S3 Upload Setup
The script above uses the AWS CLI to upload to S3-compatible storage. This works with AWS S3, Backblaze B2, Cloudflare R2, MinIO, and any S3-compatible provider. Install the AWS CLI and configure credentials:
pip install awscli
aws configure
For non-AWS S3-compatible providers, specify the endpoint:
aws s3 cp file.dump s3://bucket/path/ --endpoint-url https://your-s3-endpoint.com
Backup Retention Strategy
A tiered retention policy balances storage cost with recovery flexibility:
- Daily backups: Keep for 7 days. Covers accidental deletions and short-term issues.
- Weekly backups: Keep for 4 weeks (retain every Sunday's backup). Covers issues discovered after a week.
- Monthly backups: Keep for 12 months (retain the first of each month). Covers long-term compliance requirements and slow-developing data issues.
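One way to implement this tiering without extra tooling is to decide, at backup time, which S3 prefix a backup belongs to. A sketch of that decision logic (the tier_for function and the daily/weekly/monthly prefix names are hypothetical additions, not part of the script above):

```shell
# tier_for DAY_OF_MONTH DAY_OF_WEEK -> prints which retention tier a
# backup taken on that day belongs to. The 1st of the month wins over Sunday.
tier_for() {
  local dom="$1" dow="$2"   # dow: 1=Monday .. 7=Sunday (as in date +%u)
  if [ "$dom" = "01" ]; then
    echo "monthly"
  elif [ "$dow" = "7" ]; then
    echo "weekly"
  else
    echo "daily"
  fi
}

TIER=$(tier_for "$(date +%d)" "$(date +%u)")
echo "Today's backup tier: $TIER"
# The cron script could then upload to ${S3_BUCKET}/${TIER}/ so that each
# prefix gets its own lifecycle expiration.
```
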
S3 lifecycle policies can automate this. Set a lifecycle rule to delete daily backups after 7 days, weekly backups after 30 days, and monthly backups after 365 days.
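With tiered prefixes in place, expiration can be handed to the bucket itself. A sketch of such a lifecycle configuration, assuming backups are uploaded under daily/, weekly/, and monthly/ prefixes (the bucket name and prefixes are placeholders):

```shell
# lifecycle.json: one expiration rule per retention tier
cat > lifecycle.json <<'EOF'
{
  "Rules": [
    {"ID": "daily",   "Status": "Enabled", "Filter": {"Prefix": "odoo-backups/daily/"},   "Expiration": {"Days": 7}},
    {"ID": "weekly",  "Status": "Enabled", "Filter": {"Prefix": "odoo-backups/weekly/"},  "Expiration": {"Days": 30}},
    {"ID": "monthly", "Status": "Enabled", "Filter": {"Prefix": "odoo-backups/monthly/"}, "Expiration": {"Days": 365}}
  ]
}
EOF
# Apply it (requires configured AWS CLI credentials):
#   aws s3api put-bucket-lifecycle-configuration \
#     --bucket your-bucket --lifecycle-configuration file://lifecycle.json
```

Not every S3-compatible provider supports lifecycle rules; check your provider's documentation before relying on them.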
Restoring an Odoo Database
To restore from a pg_dump custom format backup:
- Create an empty PostgreSQL database: createdb -U odoo new_database_name
- Restore: pg_restore -U odoo -d new_database_name backup.dump
- Restore the filestore: tar -xzf filestore.tar.gz -C /var/lib/odoo/filestore/
- Update the database name in odoo.conf if restoring to a different name
- Restart Odoo
To restore via the Odoo web interface, go to /web/database/manager, click Restore, and upload the ZIP file produced by the web backup method.
Testing Your Backups
A backup you have never tested is not a backup — it is a hope. Schedule quarterly restore tests:
- Spin up a test server or use a local Docker container
- Restore the most recent backup
- Log into Odoo and verify data looks correct (check a recent order, invoice, or inventory record)
- Document the restore time — you need to know how long recovery takes before an incident happens
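To make the last step concrete, wrap the restore command so its duration lands in a log. A minimal sketch in which restore_cmd is a stand-in for your real pg_restore invocation:

```shell
# Time a restore drill and append the result to a log file.
restore_cmd() {
  sleep 1   # stand-in for: pg_restore -U odoo -d restore_test backup.dump
}

START=$(date +%s)
restore_cmd
END=$(date +%s)
DURATION=$((END - START))
echo "$(date +%F) restore drill took ${DURATION}s" >> restore_drill.log
echo "Restore took ${DURATION} seconds"
```

Over a few quarters, the log gives you a realistic recovery-time estimate instead of a guess.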
The Easy Way: Use DeployMonkey
Setting up and maintaining a reliable backup pipeline — cron jobs, S3 credentials, retention policies, restore testing — is ongoing operational work. DeployMonkey handles automated S3 backups for every Odoo instance you deploy. Backups run on a schedule, upload to your S3 bucket, and are accessible from the control panel for one-click restores.
You configure your S3 bucket once, and DeployMonkey manages the rest. No cron jobs to maintain, no scripts to debug when they silently fail at 2 AM. The backup status is visible in your dashboard so you always know the last successful backup time. See our pricing page for plan details — automated backups are included from the Base plan upward.
Start protecting your Odoo data today. Create a free DeployMonkey account and connect your server.
Summary
Back up both the PostgreSQL database and the filestore — never one without the other. Use pg_dump custom format for production databases, automate with a cron script, upload to S3-compatible storage, apply a tiered retention policy, and test restores quarterly. The 3-2-1 rule keeps your data safe even if your server is destroyed. With this setup, you can recover from almost any disaster in under an hour.