
N8N Backup & Restore: A Complete Disaster Recovery Strategy (2025)

2025-01-29 Sam from N8Nen.nl

🛡️ Summary

A robust backup strategy is crucial for every production N8N installation. This complete guide covers everything: from backing up workflows, credentials, and databases to automated backup scripts, cloud storage integration, disaster recovery planning, and zero-downtime restore procedures. Learn how to prevent data loss, hit RTO/RPO targets, and guarantee business continuity with proven backup strategies.


Why N8N Backups Are Essential

Without a proper backup strategy, you are running enormous risks:

⚠️ Risks Without a Backup

  • Loss of months of work
  • Credentials permanently lost
  • Business processes grind to a halt
  • Revenue loss from downtime
  • Compliance violations (GDPR)
  • Reputation damage
  • Irrecoverable data loss

✅ Benefits With a Backup

  • Fast recovery (< 1 hour)
  • Version control possible
  • Test/staging sync
  • Compliance ready
  • Peace of mind
  • Business continuity
  • Rollback options

💰 The Cost of Data Loss

| Scenario | Downtime | Recovery Time | Estimated Cost |
|----------|----------|---------------|----------------|
| With backup | < 1 hour | 30 minutes | €100-500 |
| Without backup | 1-7 days | Complete rebuild | €10,000-100,000+ |

What Should You Back Up in N8N?

A complete N8N backup consists of several components:

  • Database (PostgreSQL/MySQL): workflows, executions, settings
  • Credentials (encrypted storage): API keys, passwords, OAuth tokens
  • Files (file system): binary data, static files, logs
  • Configuration (environment): .env files, Docker configs, SSL certs
  • Redis (queue mode only): queue state, job data, sessions
  • Backup storage targets: S3 / GCS / local / Git

📊 Backup Priorities

| Component | Priority | Frequency | Retention |
|-----------|----------|-----------|-----------|
| Database | Critical | Hourly | 30 days |
| Credentials | Critical | On change | Indefinite |
| Workflows | High | Daily | 90 days |
| Binary Data | Medium | Weekly | 7 days |
| Logs | Low | Monthly | 30 days |

Backup Strategies & Best Practices

Choose the right backup strategy for your situation:

🎯 The 3-2-1 Rule

  • 3 copies of your data
  • 2 different storage media
  • 1 offsite copy
  • The gold standard

⏰ RTO/RPO Targets

  • RTO (Recovery Time Objective): < 4 hours
  • RPO (Recovery Point Objective): < 1 hour
  • Test monthly
  • Document procedures

🔄 Backup Types

  • Full: weekly
  • Incremental: daily (see the sketch below)
  • Snapshot: hourly
  • Continuous: real-time
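
For the file-level side of an incremental scheme, GNU tar's listed-incremental mode is a simple building block: a snapshot file records what has already been archived, so each run only captures changes. A minimal sketch, with paths and schedule as assumptions:

#!/bin/bash
# incremental-files-backup.sh - level-0 backup on Sunday, incrementals otherwise
SNAPSHOT=/backup/n8n-files.snar

# Removing the snapshot file forces the next tar run to be a full backup
[ "$(date +%u)" -eq 7 ] && rm -f "$SNAPSHOT"

tar --listed-incremental="$SNAPSHOT" \
  -czf "/backup/n8n-files_$(date +%Y%m%d).tar.gz" \
  -C /home/node .n8n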

Database Backup Procedures

The database is the heart of N8N. Here are backup procedures for each database type:

🐘 PostgreSQL Backup

#!/bin/bash
# postgresql-backup.sh - Complete PostgreSQL backup script

# Configuration
DB_HOST="localhost"
DB_PORT="5432"
DB_NAME="n8n"
DB_USER="n8n"
DB_PASSWORD="your-password"
BACKUP_DIR="/backup/postgres"
S3_BUCKET="s3://your-bucket/n8n-backups/postgres"
RETENTION_DAYS=30

# Create backup directory
DATE=$(date +"%Y%m%d_%H%M%S")
BACKUP_FILE="${BACKUP_DIR}/n8n_${DATE}.sql.gz"
mkdir -p "${BACKUP_DIR}"

# Export password for pg_dump
export PGPASSWORD="${DB_PASSWORD}"

echo "[$(date)] Starting PostgreSQL backup..."

# Create backup with compression
pg_dump \
  --host="${DB_HOST}" \
  --port="${DB_PORT}" \
  --username="${DB_USER}" \
  --dbname="${DB_NAME}" \
  --no-password \
  --verbose \
  --format=custom \
  --compress=9 \
  --file="${BACKUP_FILE%.gz}" \
  2>&1 | tee -a /var/log/n8n-backup.log

# Wrap in gzip (the custom-format dump is already compressed internally;
# this mainly gives a uniform .gz name for the checksum and cleanup steps)
gzip -9 "${BACKUP_FILE%.gz}"

# Calculate checksum
sha256sum "${BACKUP_FILE}" > "${BACKUP_FILE}.sha256"

# Upload to S3
if command -v aws &> /dev/null; then
    echo "[$(date)] Uploading to S3..."
    aws s3 cp "${BACKUP_FILE}" "${S3_BUCKET}/" \
      --storage-class STANDARD_IA \
      --metadata "backup-date=${DATE},db-name=${DB_NAME}"
    
    aws s3 cp "${BACKUP_FILE}.sha256" "${S3_BUCKET}/"
fi

# Clean old local backups
echo "[$(date)] Cleaning old backups..."
find "${BACKUP_DIR}" -name "n8n_*.sql.gz" -mtime +${RETENTION_DAYS} -delete

# Verify backup
echo "[$(date)] Verifying backup..."
gunzip -t "${BACKUP_FILE}" && echo "Backup verified successfully" || exit 1

echo "[$(date)] Backup completed: ${BACKUP_FILE}"

# Send notification (optional)
curl -X POST https://hooks.slack.com/services/YOUR/WEBHOOK/URL \
  -H 'Content-Type: application/json' \
  -d "{\"text\":\"✅ N8N Database backup successful: ${BACKUP_FILE}\"}"

unset PGPASSWORD
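
The gunzip -t check above only validates the gzip layer. Since the dump uses pg_dump's custom format, a deeper check is to let pg_restore read the archive's table of contents; a small sketch reusing the BACKUP_FILE variable from the script:

# Deeper verification: confirm pg_restore can read the archive's contents
gunzip -c "${BACKUP_FILE}" | pg_restore --list > /dev/null \
  && echo "Archive contents readable" \
  || { echo "Archive damaged"; exit 1; }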

🐬 MySQL/MariaDB Backup

#!/bin/bash
# mysql-backup.sh - MySQL backup with point-in-time recovery

DB_HOST="localhost"
DB_NAME="n8n"
DB_USER="n8n"
DB_PASSWORD="your-password"
BACKUP_DIR="/backup/mysql"
DATE=$(date +"%Y%m%d_%H%M%S")
mkdir -p "${BACKUP_DIR}"

# Full backup with binary logs
mysqldump \
  --host="${DB_HOST}" \
  --user="${DB_USER}" \
  --password="${DB_PASSWORD}" \
  --single-transaction \
  --routines \
  --triggers \
  --events \
  --hex-blob \
  --databases "${DB_NAME}" \
  --result-file="${BACKUP_DIR}/n8n_${DATE}.sql"

# Compress backup
gzip -9 "${BACKUP_DIR}/n8n_${DATE}.sql"

# Back up binary logs for point-in-time recovery
# (mysqlbinlog needs the name of the first binlog file to read;
#  find it with SHOW BINARY LOGS and adjust mysql-bin.000001 below)
mysqlbinlog \
  --read-from-remote-server \
  --host="${DB_HOST}" \
  --user="${DB_USER}" \
  --password="${DB_PASSWORD}" \
  --to-last-log \
  --result-file="${BACKUP_DIR}/binlog_${DATE}.sql" \
  mysql-bin.000001
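
Because the script stores the binlog as decoded SQL text (no --raw flag), point-in-time recovery means restoring the last full dump and then replaying that SQL; any time-window filtering (--start-datetime / --stop-datetime) has to happen at mysqlbinlog time against the raw logs. A sketch with example file names:

# 1. Restore the most recent full dump
gunzip -c "${BACKUP_DIR}/n8n_20250129_020000.sql.gz" | \
  mysql --host="${DB_HOST}" --user="${DB_USER}" --password="${DB_PASSWORD}"

# 2. Replay the captured binlog SQL up to the point of failure
mysql --host="${DB_HOST}" --user="${DB_USER}" --password="${DB_PASSWORD}" \
  "${DB_NAME}" < "${BACKUP_DIR}/binlog_20250129_020000.sql"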

Workflows & Credentials Backup

N8N provides CLI commands for exporting workflows and credentials:

📦 N8N Export Commands

Exporting Workflows:

# Export all workflows
n8n export:workflow --all \
  --output=/backup/workflows_$(date +%Y%m%d).json

# Export a specific workflow
n8n export:workflow --id=5 \
  --output=/backup/workflow_5.json

# Export with pretty formatting
n8n export:workflow --all --pretty \
  --output=/backup/workflows.json
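
Because these exports are plain JSON, they also fit naturally in Git (one of the storage targets listed earlier), giving you version-controlled workflows for free. A minimal sketch, assuming a pre-initialized repository at /backup/workflows-repo and an n8n CLI version that supports --separate:

# Export one file per workflow into a git repo and commit the delta
n8n export:workflow --all --separate \
  --output=/backup/workflows-repo/
cd /backup/workflows-repo
git add -A
git commit -m "Workflow backup $(date +%Y-%m-%d)" || true  # no-op when unchanged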

Exporting Credentials (Encrypted):

# Export credentials (they remain encrypted)
n8n export:credentials --all \
  --output=/backup/credentials_$(date +%Y%m%d).json

# Decrypted export (HANDLE WITH CARE!)
n8n export:credentials --all --decrypted \
  --output=/secure/credentials_decrypted.json

# Encrypt directly with GPG
n8n export:credentials --all --decrypted | \
  gpg --encrypt -r backup@company.com > \
  /backup/credentials_$(date +%Y%m%d).gpg
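
One caveat with credential exports: an encrypted export can only be decrypted by an instance that uses the same N8N_ENCRYPTION_KEY. Back up that key separately, or the credential backups are worthless. A sketch, assuming the key lives in the default config file rather than in an environment variable:

# On a default install the key sits in the .n8n config file
cat /home/node/.n8n/config  # contains {"encryptionKey": "..."}

# Archive it GPG-encrypted, alongside the credential backups
gpg --encrypt -r backup@company.com \
  -o /backup/n8n_encryption_key_$(date +%Y%m%d).gpg \
  /home/node/.n8n/config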

Automated Backup Scripts

A complete, automated backup solution for N8N:

🤖 Master Backup Script

#!/usr/bin/env python3
# n8n-backup.py - Complete N8N backup automation

import os
import sys
import json
import shutil
import subprocess
import datetime
import hashlib
import boto3
from pathlib import Path
import logging
import smtplib
from email.mime.text import MIMEText
from email.mime.multipart import MIMEMultipart

# Configuration
class Config:
    # Paths
    N8N_DATA_DIR = "/home/node/.n8n"
    BACKUP_DIR = "/backup/n8n"
    TEMP_DIR = "/tmp/n8n-backup"
    
    # Database
    DB_TYPE = "postgres"  # postgres, mysql, sqlite
    DB_HOST = os.getenv("DB_HOST", "localhost")
    DB_NAME = os.getenv("DB_NAME", "n8n")
    DB_USER = os.getenv("DB_USER", "n8n")
    DB_PASSWORD = os.getenv("DB_PASSWORD")
    
    # S3
    S3_BUCKET = os.getenv("S3_BUCKET", "your-backup-bucket")
    S3_PREFIX = "n8n-backups"
    AWS_REGION = "eu-west-1"
    
    # Retention
    LOCAL_RETENTION_DAYS = 7
    S3_RETENTION_DAYS = 30
    
    # Notifications
    SMTP_HOST = "smtp.gmail.com"
    SMTP_PORT = 587
    SMTP_USER = os.getenv("SMTP_USER")
    SMTP_PASSWORD = os.getenv("SMTP_PASSWORD")
    NOTIFY_EMAIL = os.getenv("NOTIFY_EMAIL")

class N8NBackup:
    def __init__(self):
        self.timestamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
        self.backup_name = f"n8n_backup_{self.timestamp}"
        self.backup_path = Path(Config.BACKUP_DIR) / self.backup_name
        self.temp_path = Path(Config.TEMP_DIR) / self.backup_name
        
        # Setup logging
        logging.basicConfig(
            level=logging.INFO,
            format='%(asctime)s - %(levelname)s - %(message)s',
            handlers=[
                logging.FileHandler('/var/log/n8n-backup.log'),
                logging.StreamHandler()
            ]
        )
        self.logger = logging.getLogger(__name__)
        
        # Initialize S3 client
        self.s3 = boto3.client('s3', region_name=Config.AWS_REGION)
        
    def run(self):
        """Main backup execution"""
        try:
            self.logger.info(f"Starting N8N backup: {self.backup_name}")
            
            # Create directories
            self.temp_path.mkdir(parents=True, exist_ok=True)
            self.backup_path.mkdir(parents=True, exist_ok=True)
            
            # Backup components
            backup_files = []
            backup_files.append(self.backup_database())
            backup_files.append(self.backup_workflows())
            backup_files.append(self.backup_credentials())
            backup_files.append(self.backup_files())
            backup_files.append(self.backup_config())
            
            # Create archive
            archive_path = self.create_archive(backup_files)
            
            # Upload to S3
            self.upload_to_s3(archive_path)
            
            # Cleanup old backups
            self.cleanup_old_backups()
            
            # Send notification
            self.send_notification("success", archive_path)
            
            self.logger.info("Backup completed successfully")
            return True
            
        except Exception as e:
            self.logger.error(f"Backup failed: {e}")
            self.send_notification("failed", str(e))
            return False
        finally:
            # Clean up temp files
            if self.temp_path.exists():
                shutil.rmtree(self.temp_path, ignore_errors=True)
    
    def backup_database(self):
        """Backup database based on type"""
        self.logger.info("Backing up database...")
        db_backup = self.temp_path / "database.sql.gz"
        
        if Config.DB_TYPE == "postgres":
            cmd = [
                "pg_dump",
                f"--host={Config.DB_HOST}",
                f"--username={Config.DB_USER}",
                f"--dbname={Config.DB_NAME}",
                "--no-password",
                "--format=custom",
                "--compress=9",
                f"--file={db_backup}"
            ]
            env = os.environ.copy()
            env["PGPASSWORD"] = Config.DB_PASSWORD or ""
            subprocess.run(cmd, env=env, check=True)
            
        elif Config.DB_TYPE == "mysql":
            cmd = [
                "mysqldump",
                f"--host={Config.DB_HOST}",
                f"--user={Config.DB_USER}",
                f"--password={Config.DB_PASSWORD}",
                "--single-transaction",
                "--routines",
                "--triggers",
                Config.DB_NAME
            ]
            with open(db_backup.with_suffix('.sql'), 'w') as f:
                subprocess.run(cmd, stdout=f, check=True)
            subprocess.run(["gzip", "-9", str(db_backup.with_suffix('.sql'))])
            
        elif Config.DB_TYPE == "sqlite":
            sqlite_file = Path(Config.N8N_DATA_DIR) / "database.sqlite"
            subprocess.run([
                "sqlite3", str(sqlite_file),
                f".backup {db_backup.with_suffix('.db')}"
            ], check=True)
            subprocess.run(["gzip", "-9", str(db_backup.with_suffix('.db'))])
        
        return db_backup
    
    def backup_workflows(self):
        """Export all workflows"""
        self.logger.info("Backing up workflows...")
        workflow_backup = self.temp_path / "workflows.json"
        
        cmd = [
            "n8n", "export:workflow",
            "--all",
            "--pretty",
            f"--output={workflow_backup}"
        ]
        subprocess.run(cmd, check=True)
        
        # Compress
        subprocess.run(["gzip", "-9", str(workflow_backup)])
        return workflow_backup.with_suffix('.json.gz')
    
    def backup_credentials(self):
        """Export credentials (encrypted)"""
        self.logger.info("Backing up credentials...")
        cred_backup = self.temp_path / "credentials.json.gpg"
        
        # Export and encrypt in one step
        export_cmd = [
            "n8n", "export:credentials",
            "--all",
            "--decrypted"
        ]
        
        encrypt_cmd = [
            "gpg",
            "--encrypt",
            "--armor",
            "-r", "backup@company.com",
            "-o", str(cred_backup)
        ]
        
        # Pipe the export straight into gpg so plaintext never touches disk
        export_proc = subprocess.Popen(export_cmd, stdout=subprocess.PIPE)
        encrypt_proc = subprocess.Popen(encrypt_cmd, stdin=export_proc.stdout)
        export_proc.stdout.close()
        encrypt_proc.communicate()
        if export_proc.wait() != 0 or encrypt_proc.returncode != 0:
            raise RuntimeError("Credential export or encryption failed")
        
        return cred_backup
    
    def backup_files(self):
        """Backup binary data and static files"""
        self.logger.info("Backing up files...")
        files_backup = self.temp_path / "files.tar.gz"
        
        # Files to backup
        files_dirs = [
            Path(Config.N8N_DATA_DIR) / "files",
            Path(Config.N8N_DATA_DIR) / "nodes",
            Path(Config.N8N_DATA_DIR) / "custom"
        ]
        
        tar_cmd = [
            "tar", "-czf", str(files_backup),
            "-C", Config.N8N_DATA_DIR
        ]
        
        members = [
            str(d.relative_to(Config.N8N_DATA_DIR))
            for d in files_dirs if d.exists()
        ]
        if members:  # tar errors out when given no members to archive
            subprocess.run(tar_cmd + members, check=True)
        return files_backup
    
    def backup_config(self):
        """Backup configuration files"""
        self.logger.info("Backing up configuration...")
        config_backup = self.temp_path / "config.tar.gz"
        
        config_files = [
            "/etc/n8n/.env",
            "/etc/n8n/config.json",
            "/home/node/.n8n/config"
        ]
        
        existing_configs = [f for f in config_files if Path(f).exists()]
        
        if existing_configs:
            subprocess.run([
                "tar", "-czf", str(config_backup),
                *existing_configs
            ], check=True)
        
        return config_backup
    
    def create_archive(self, files):
        """Create final backup archive"""
        self.logger.info("Creating backup archive...")
        archive_path = self.backup_path / f"{self.backup_name}.tar.gz"
        
        # Archive everything under temp_path (the files list is informational)
        subprocess.run([
            "tar", "-czf", str(archive_path),
            "-C", str(self.temp_path),
            "."
        ], check=True)
        
        # Generate checksum; append .sha256 to the full archive name
        checksum = self.calculate_checksum(archive_path)
        checksum_file = Path(f"{archive_path}.sha256")
        checksum_file.write_text(f"{checksum}  {archive_path.name}\n")
        
        return archive_path
    
    def calculate_checksum(self, file_path):
        """Calculate SHA256 checksum"""
        sha256_hash = hashlib.sha256()
        with open(file_path, "rb") as f:
            for byte_block in iter(lambda: f.read(4096), b""):
                sha256_hash.update(byte_block)
        return sha256_hash.hexdigest()
    
    def upload_to_s3(self, archive_path):
        """Upload backup to S3"""
        self.logger.info("Uploading to S3...")
        
        s3_key = f"{Config.S3_PREFIX}/{archive_path.name}"
        
        # Upload with metadata
        self.s3.upload_file(
            str(archive_path),
            Config.S3_BUCKET,
            s3_key,
            ExtraArgs={
                'StorageClass': 'STANDARD_IA',
                'Metadata': {
                    'backup-date': self.timestamp,
                    'backup-type': 'full',
                    'n8n-version': self.get_n8n_version()
                }
            }
        )
        
        # Upload checksum
        checksum_file = Path(f"{archive_path}.sha256")
        if checksum_file.exists():
            self.s3.upload_file(
                str(checksum_file),
                Config.S3_BUCKET,
                f"{Config.S3_PREFIX}/{checksum_file.name}"
            )
        
        self.logger.info(f"Uploaded to S3: {s3_key}")
    
    def cleanup_old_backups(self):
        """Remove old backup files"""
        self.logger.info("Cleaning up old backups...")
        
        # Clean local backups
        cutoff_date = datetime.datetime.now() - datetime.timedelta(days=Config.LOCAL_RETENTION_DAYS)
        for backup_file in Path(Config.BACKUP_DIR).glob("n8n_backup_*.tar.gz"):
            if backup_file.stat().st_mtime < cutoff_date.timestamp():
                backup_file.unlink()
                self.logger.info(f"Deleted old backup: {backup_file.name}")
        
        # Clean S3 backups
        self.cleanup_s3_backups()
    
    def cleanup_s3_backups(self):
        """Clean old S3 backups"""
        cutoff_date = datetime.datetime.now() - datetime.timedelta(days=Config.S3_RETENTION_DAYS)
        
        response = self.s3.list_objects_v2(
            Bucket=Config.S3_BUCKET,
            Prefix=Config.S3_PREFIX
        )
        
        if 'Contents' in response:
            for obj in response['Contents']:
                if obj['LastModified'].replace(tzinfo=None) < cutoff_date:
                    self.s3.delete_object(
                        Bucket=Config.S3_BUCKET,
                        Key=obj['Key']
                    )
                    self.logger.info(f"Deleted S3 backup: {obj['Key']}")
    
    def get_n8n_version(self):
        """Get N8N version"""
        try:
            result = subprocess.run(
                ["n8n", "--version"],
                capture_output=True,
                text=True
            )
            return result.stdout.strip()
        except Exception:
            return "unknown"
    
    def send_notification(self, status, details):
        """Send email notification"""
        if not Config.NOTIFY_EMAIL:
            return
        
        subject = f"N8N Backup {status.upper()}: {self.backup_name}"
        
        if status == "success":
            body = f"""
            N8N Backup Successful!
            
            Backup Name: {self.backup_name}
            Timestamp: {self.timestamp}
            Location: {details}
            S3 Bucket: {Config.S3_BUCKET}
            
            Components backed up:
            - Database
            - Workflows
            - Credentials (encrypted)
            - Files
            - Configuration
            
            Next backup: Tomorrow at same time
            """
        else:
            body = f"""
            N8N Backup FAILED!
            
            Backup Name: {self.backup_name}
            Timestamp: {self.timestamp}
            Error: {details}
            
            Please check logs at /var/log/n8n-backup.log
            """
        
        # Send email
        msg = MIMEMultipart()
        msg['From'] = Config.SMTP_USER
        msg['To'] = Config.NOTIFY_EMAIL
        msg['Subject'] = subject
        msg.attach(MIMEText(body, 'plain'))
        
        try:
            server = smtplib.SMTP(Config.SMTP_HOST, Config.SMTP_PORT)
            server.starttls()
            server.login(Config.SMTP_USER, Config.SMTP_PASSWORD)
            server.send_message(msg)
            server.quit()
            self.logger.info("Notification sent")
        except Exception as e:
            self.logger.error(f"Failed to send notification: {e}")

if __name__ == "__main__":
    backup = N8NBackup()
    sys.exit(0 if backup.run() else 1)
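One way to run the script, with secrets supplied through the environment variables the Config class reads (all values are placeholders):

# Manual run; in production, wire this into cron as shown below
export DB_PASSWORD='your-password'
export S3_BUCKET='your-backup-bucket'
export SMTP_USER='alerts@example.com'
export SMTP_PASSWORD='app-password'
export NOTIFY_EMAIL='ops@example.com'
python3 /backup/scripts/n8n-backup.py && echo "Backup OK" || echo "Backup FAILED"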

⏰ Cron Schedule for Automatic Backups

# Crontab entries for N8N backups
# Edit with: crontab -e

# Database backup every hour
0 * * * * /backup/scripts/postgresql-backup.sh >> /var/log/n8n-backup.log 2>&1

# Complete backup daily at 2:00 AM
0 2 * * * /usr/bin/python3 /backup/scripts/n8n-backup.py >> /var/log/n8n-backup.log 2>&1

# Workflow export every 6 hours
0 */6 * * * n8n export:workflow --all --output=/backup/workflows/workflows_$(date +\%Y\%m\%d_\%H).json

# Weekly full backup on Sunday
0 3 * * 0 /backup/scripts/weekly-full-backup.sh

# Monthly archive on the 1st of the month
0 4 1 * * /backup/scripts/monthly-archive.sh

Cloud Storage Integration

Offsite backups to cloud storage for disaster recovery:

☁️ AWS S3 Setup

# Create an S3 bucket with versioning
aws s3api create-bucket \
  --bucket n8n-backups \
  --region eu-west-1 \
  --create-bucket-configuration \
  LocationConstraint=eu-west-1

# Enable versioning
aws s3api put-bucket-versioning \
  --bucket n8n-backups \
  --versioning-configuration Status=Enabled

# Lifecycle policy
aws s3api put-bucket-lifecycle-configuration \
  --bucket n8n-backups \
  --lifecycle-configuration file://lifecycle.json
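
The lifecycle.json referenced by the last command is not shown above; a minimal sketch that moves backups to Glacier after 30 days and deletes them after a year (tune both to your retention targets):

{
  "Rules": [
    {
      "ID": "n8n-backup-retention",
      "Status": "Enabled",
      "Filter": { "Prefix": "n8n-backups/" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}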

☁️ Google Cloud Storage

# Create bucket
gsutil mb -p your-project \
  -c STANDARD \
  -l EU \
  gs://n8n-backups/

# Enable versioning
gsutil versioning set on gs://n8n-backups/

# Upload backup
gsutil -m cp -r /backup/* \
  gs://n8n-backups/$(date +%Y%m%d)/

# Set lifecycle
gsutil lifecycle set lifecycle.json \
  gs://n8n-backups/
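
Note that GCS lifecycle configuration uses a different JSON schema than S3; a sketch equivalent to the policy above:

{
  "rule": [
    {
      "action": { "type": "SetStorageClass", "storageClass": "COLDLINE" },
      "condition": { "age": 30 }
    },
    {
      "action": { "type": "Delete" },
      "condition": { "age": 365 }
    }
  ]
}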

Restore Procedures

Step-by-step restore procedures for different scenarios:

🔄 Complete System Restore

1. Stop N8N Services:

docker-compose down
# or
systemctl stop n8n

2. Restore Database:

# PostgreSQL
gunzip -c /backup/n8n_20250129.sql.gz | \
  pg_restore --host=localhost --username=n8n \
  --dbname=n8n --clean --if-exists

# MySQL
gunzip -c /backup/n8n_20250129.sql.gz | \
  mysql -h localhost -u n8n -p n8n

3. Restore Files:

tar -xzf /backup/files.tar.gz \
  -C /home/node/.n8n/

4. Import Workflows:

n8n import:workflow \
  --input=/backup/workflows.json
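
5. Import Credentials:

If credentials were exported GPG-encrypted as in the scripts above, decrypt and re-import them (paths and recipient are examples; restoring an encrypted, non-decrypted export additionally requires the original N8N_ENCRYPTION_KEY on the target instance):

gpg --decrypt /backup/credentials_20250129.gpg > /tmp/credentials.json
n8n import:credentials --input=/tmp/credentials.json
shred -u /tmp/credentials.json  # remove the plaintext immediately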

6. Start Services:

docker-compose up -d
# Verify
curl http://localhost:5678/healthz

Disaster Recovery Planning

A complete disaster recovery plan for business continuity:

🚨 Disaster Recovery Checklist

Prevention:

  • ☑️ Daily backups active
  • ☑️ Offsite backup configured
  • ☑️ Monitoring alerts working
  • ☑️ Documentation up to date
  • ☑️ Team trained

Response:

  • ☑️ Incident commander appointed
  • ☑️ Communication plan active
  • ☑️ Restore procedures tested
  • ☑️ Failover environment ready
  • ☑️ Status page updated

🛡️ Protect Your N8N Installation

Implement a robust backup strategy and sleep soundly. From automated backups to disaster recovery: everything you need for business continuity!

Backup Testing & Validation

A backup is only as good as your ability to restore it. Test regularly:

✅ Test Procedures

Monthly Test:

  • Restore to a test environment (see the sketch after these lists)
  • Verify workflow functionality
  • Check credential decryption
  • Test data integrity

Quarterly DR Drill:

  • Complete failover test
  • Time the recovery process
  • Document any issues
  • Update procedures
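
Much of the monthly test can be scripted. A sketch that restores the latest workflow export into a throwaway Docker container (image tag, port, and paths are assumptions):

#!/bin/bash
# restore-test.sh - import the newest workflow export into a scratch instance
docker run -d --name n8n-restore-test -p 5679:5678 n8nio/n8n
sleep 15  # give the instance time to boot

LATEST=$(ls -t /backup/workflows/workflows_*.json | head -1)
docker cp "$LATEST" n8n-restore-test:/tmp/workflows.json
docker exec -u node n8n-restore-test \
  n8n import:workflow --input=/tmp/workflows.json

# Basic health check, then clean up
curl -fsS http://localhost:5679/healthz && echo "Restore test OK"
docker rm -f n8n-restore-test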

Conclusion

A good backup strategy is not a luxury but a necessity for any production N8N installation. With the scripts and procedures in this guide, you can prevent data loss and recover quickly from disasters.

Start implementing automated backups today. It takes a few hours to set up, but it can save your business when disaster strikes.

Need help with backup strategy and disaster recovery planning? Contact me for a professional backup setup!

#n8n #backup #restore #disasterrecovery #database #workflows #credentials #businesscontinuity