# OpenBSD Server Backup Script

A robust, production-ready backup solution for OpenBSD servers that compresses directories and uploads them to cloud storage with comprehensive logging and email notifications.

## Author

**Biswa Kalyan Bhuyan**
📧 [biswa@surgot.in](mailto:biswa@surgot.in)

## Features

- ✅ **Individual tar archives** for each directory
- ✅ **Maximum gzip compression** (level 9)
- ✅ **Cloud upload** via rclone
- ✅ **Automatic cleanup** after successful upload
- ✅ **Email notifications** with detailed logs
- ✅ **Directory validation** before backup
- ✅ **Disk space checking** (1 GB minimum requirement)
- ✅ **Log rotation** (keeps the last 10 logs)
- ✅ **Permission validation** with root privilege warnings
- ✅ **Dry-run mode** for testing
- ✅ **Server-optimized** with production-ready configurations

## Requirements

### System Requirements

- **OpenBSD** (tested and optimized for OpenBSD)
- **Root access** (recommended for complete system backups)
- **Minimum 1 GB free space** in `/tmp`

### Dependencies

```bash
# Install required packages
pkg_add rclone mailx

# Configure rclone (set up the 'cf' remote)
rclone config
```

## Installation

### 1. Clone the Repository

```bash
git clone <repository-url>
cd openbsd-backup-script
```

### 2. Deploy to the Server

```bash
# Copy the script to a system location
sudo cp backup.sh /usr/local/bin/
sudo chmod +x /usr/local/bin/backup.sh
sudo chown root:wheel /usr/local/bin/backup.sh
```

### 3. Configure

- Set up an rclone remote named 'cf' pointing to your cloud storage
- Verify email functionality: `echo "Test" | mail -s "Test" root@mail.surgot.in`
- Edit the script to change `EMAIL_RECIPIENT` if needed

## Usage

### Basic Commands

```bash
# Test mode (no actual upload)
sudo backup.sh --dir /etc --dry-run

# Back up a single directory
sudo backup.sh --dir /etc

# Back up multiple directories
sudo backup.sh --dir /etc,/var/www,/var/log

# Process an existing tarball
sudo backup.sh /path/to/existing.tar

# Show help
backup.sh --help
```

### Server-Specific Examples

#### Web Server Backup

```bash
sudo backup.sh --dir /etc,/var/www,/var/log
```

#### Mail Server Backup

```bash
sudo backup.sh --dir /etc,/var/mail,/var/spool/mail,/usr/local/etc
```

#### Database Server Backup

```bash
sudo backup.sh --dir /etc,/var/lib/mysql,/var/postgresql,/usr/local/etc
```

#### Complete System Backup

```bash
sudo backup.sh --dir /etc,/var/www,/var/log,/home,/usr/local,/var/mail
```

## Automation with Cron

### Daily Backup (2:00 AM)

```bash
# Edit root's crontab
sudo crontab -e

# Add a daily backup
0 2 * * * /usr/local/bin/backup.sh --dir /etc,/var/www,/var/log >/dev/null 2>&1
```

### Weekly Full Backup (Sunday 3:00 AM)

```bash
# Weekly comprehensive backup
0 3 * * 0 /usr/local/bin/backup.sh --dir /etc,/var/www,/var/log,/home,/usr/local/etc >/dev/null 2>&1
```

### Multiple Schedules

```bash
# Daily configs (2 AM)
0 2 * * * /usr/local/bin/backup.sh --dir /etc,/usr/local/etc >/dev/null 2>&1

# Daily web content (3 AM)
0 3 * * * /usr/local/bin/backup.sh --dir /var/www >/dev/null 2>&1

# Weekly logs (Sunday 4 AM)
0 4 * * 0 /usr/local/bin/backup.sh --dir /var/log >/dev/null 2>&1
```

## Configuration

### Script Configuration

Edit these variables in `backup.sh`:

```bash
RCLONE_REMOTE="cf:backups/"            # Cloud storage remote
EMAIL_RECIPIENT="root@mail.surgot.in"  # Email for notifications
COMPRESSION_LEVEL=9                    # Gzip compression level
MIN_FREE_SPACE_MB=1024                 # Minimum required space (MB)
MAX_LOG_FILES=10                       # Log retention count
```

### File Naming Convention

- **Archive files**: `etc-20250606.tar.gz`, `var-www-20250606.tar.gz`
- **Log files**: `/var/log/backup-log-20250606.log`
- **Backup staging**: `/tmp/backups-20250606/`

## Monitoring & Logs

### Log Locations

- **Primary**: `/var/log/backup-log-YYYYMMDD.log`
- **Fallback**: `/tmp/backup-log-YYYYMMDD.log` (if `/var/log/` is not writable)

### Email Reports

- **Success**: `[SUCCESS] Backup Script Log - hostname - date`
- **Failure**: `[FAILED] Backup Script Log - hostname - date`
- **Content**: complete execution log with statistics

### Log Rotation

- Automatically keeps the last 10 log files
- Older logs are automatically removed

## Output Examples

### Archive Creation

```
[INFO] Creating tar archive for: /var/www
[INFO] Directory size: 486M
[INFO] Available space in backup location: 5229 MB
[INFO] Creating archive: /tmp/backups-20250606/var-www-20250606.tar
[SUCCESS] Tar archive created in 3 seconds
[INFO] Archive size: 484 MB
```

### Compression & Upload

```
[INFO] Compressing 'var-www-20250606.tar' with maximum compression (level 9)...
[SUCCESS] Compression completed in 45 seconds
[INFO] Original size: 484 MB
[INFO] Compressed size: 156 MB
[INFO] Compression ratio: 67%
[INFO] Space saved: 328 MB
[INFO] Uploading 'var-www-20250606.tar.gz' to cf:backups/...
[SUCCESS] Upload completed successfully in 23 seconds
```

## Error Handling

### Common Issues & Solutions

#### Permission Denied

```bash
# Run as root for system directories
sudo backup.sh --dir /etc
```

#### Insufficient Disk Space

```bash
# Check available space
df -h /tmp

# Clean up old files or increase disk space
```

#### Directory Not Found

```
# The script validates directories before starting
[ERROR] Directory does not exist: /var/www
```

#### rclone Not Configured

```bash
# Configure the rclone remote (set up a remote named 'cf')
sudo rclone config
```

## Security Considerations

- **Run as root** for complete system access
- **Sensitive files** (such as `/etc/shadow`) may show permission warnings (normal behavior)
- **Cloud credentials** are stored in the rclone config
- **Log files** contain backup statistics but not file contents
- **Email logs** are sent to the configured recipient

## Performance

### Typical Performance

- **Compression**: ~10-15 MB/s (depends on data type and CPU)
- **Upload**: depends on network bandwidth and cloud provider
- **Disk usage**: temporary files in `/tmp/` (auto-cleaned)

### Optimization Tips

- **Schedule during off-peak hours** (2-4 AM)
- **Separate large directories** into different backup jobs
- **Monitor disk space** in `/tmp/`
- **Use SSD storage** for better compression performance

## Contributing

1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

## License

This project is open source and available under the [BSD License](LICENSE).
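## Appendix: Core Flow Sketch

The pipeline the script implements (stage a tar archive under `/tmp/backups-YYYYMMDD/`, compress with gzip level 9, upload via rclone, then clean up) can be sketched in a few lines of portable shell. This is an illustrative sketch, not the production script: the demo source directory and the dry-run upload line are assumptions chosen so the sketch is safe to run anywhere, and the real script adds validation, logging, and email reporting on top.

```sh
#!/bin/sh
# Illustrative sketch of the backup flow: tar -> gzip -9 -> rclone upload -> cleanup.
# Not the production script; the demo defaults below are assumptions.
set -eu

RCLONE_REMOTE="cf:backups/"        # mirrors the script's configuration variable
SRC=${1:-/tmp/backup-demo}         # directory to back up (safe demo default)
mkdir -p "$SRC"                    # make sure the demo source exists

DATE=$(date +%Y%m%d)
STAGING="/tmp/backups-$DATE"       # staging area, per the naming convention above
NAME=$(printf '%s' "$SRC" | sed 's|^/||; s|/|-|g')   # /var/www -> var-www
mkdir -p "$STAGING"

tar -cf "$STAGING/$NAME-$DATE.tar" -C / "${SRC#/}"   # 1. create the archive
gzip -9 -f "$STAGING/$NAME-$DATE.tar"                # 2. maximum compression
# 3. upload, shown as a dry run so the sketch does not require a configured remote
echo "would run: rclone copy $STAGING/$NAME-$DATE.tar.gz $RCLONE_REMOTE"
# 4. the real script removes the staged archive only after a successful upload
```

The production script wraps each of these steps with timing, size reporting, disk-space checks, and error handling, and performs the cleanup step only after rclone reports success.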
## Support

For issues, questions, or contributions:

- **Email**: [biswa@surgot.in](mailto:biswa@surgot.in)
- **Issues**: use the email above for bug reports and feature requests

## Changelog

### v1.0.0

- Initial release
- OpenBSD-optimized backup script
- Cloud upload via rclone
- Email notifications
- Comprehensive logging
- Directory validation
- Dry-run mode
- Automatic cleanup
- Log rotation

---

**Made with ❤️ for OpenBSD servers**