  • For smaller backups (<10GB each) I run a three-phase approach (a rough sketch follows the list):

    • rsync the data to a local folder /src/backup/<service>
    • rsync that to a remote NAS
    • rclone that to a B2 bucket
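
    A minimal sketch of what one per-service script could look like; the service name, paths, the nas: ssh host, and the b2: rclone remote are all placeholder assumptions:

    ```bash
    #!/usr/bin/env bash
    set -euo pipefail

    SERVICE="myservice"                      # hypothetical service name
    SRC="/var/lib/${SERVICE}/"               # data to back up
    LOCAL="/src/backup/${SERVICE}/"          # phase 1: local copy
    NAS="nas:/backup/${SERVICE}/"            # phase 2: remote NAS over ssh
    BUCKET="b2:example-bucket/${SERVICE}"    # phase 3: rclone remote + bucket

    # Phase 1: service data -> local backup folder
    rsync -a --delete "$SRC" "$LOCAL"

    # Phase 2: local backup folder -> NAS
    rsync -a --delete "$LOCAL" "$NAS"

    # Phase 3: local backup folder -> B2
    rclone sync "$LOCAL" "$BUCKET"
    ```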

    These scripts run from cron, and I log the output to a file using the --log-file option of rsync/rclone so I can spot-check the results.
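
    Concretely, the cron entries and logging might look something like this (times, script names, and log paths are made-up examples):

    ```bash
    # crontab: staggered overnight so jobs don't compete for disk/network
    30 1 * * * /usr/local/bin/backup-myservice.sh
    30 3 * * * /usr/local/bin/backup-otherservice.sh

    # inside the scripts, both tools can write their own log:
    #   rsync -a --delete --log-file=/var/log/backup/myservice-nas.log ...
    #   rclone sync ... --log-file /var/log/backup/myservice-b2.log --log-level INFO
    ```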

    This way I have local access to the data if the network is down, remote access on a separate networked machine that any other device can browse, and finally an offsite cloud backup.

    Setting this up manually with rsync/rclone has been important for building the domain knowledge to reason about the overall process: scheduling the backups at different times overnight so they don't overload the drive and network, keeping versions of files that might require them, and making sure I don't burn through too many B2 API calls.
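
    As one way to handle those last two points (paths and remote names are hypothetical): rclone's --backup-dir can stash anything a sync would overwrite or delete, and --tpslimit/--transfers throttle how hard it hits the B2 API:

    ```bash
    # Keep anything sync would overwrite/delete under a dated archive
    # path, and throttle requests to stay within B2's API allowances.
    rclone sync /src/backup/docs b2:example-bucket/docs \
        --backup-dir "b2:example-bucket/archive/$(date +%F)" \
        --tpslimit 4 \
        --transfers 2 \
        --log-file /var/log/backup/docs.log --log-level INFO
    ```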

    For large media backups (>200GB) I only use the rclone script and set it to run for 3 hours every night, after all the more important backups have finished. It's not important that it finishes ASAP; a steady drip of any changes up to B2 matters more.
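
    A sketch of that shape (bucket and paths are placeholders): rclone's --max-duration stops the run after the window, and the next night's sync picks up whatever is left:

    ```bash
    # Nightly media drip: run for at most 3h, then stop and resume tomorrow
    rclone sync /mnt/media b2:example-bucket/media \
        --max-duration 3h \
        --bwlimit 8M \
        --log-file /var/log/backup/media.log --log-level INFO
    ```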

    My next step is probably to figure out a process for emailing the backup logs every so often, or to look into a full application that can take over with better error-catching capabilities.
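
    A simple version of the email idea could be a weekly cron job that mails the tail of each log, assuming a sendmail-compatible MTA is configured (the address and paths here are made up):

    ```bash
    #!/usr/bin/env bash
    # Mail the last lines of each backup log for a weekly spot check
    {
      for f in /var/log/backup/*.log; do
        echo "== ${f} =="
        tail -n 20 "$f"
        echo
      done
    } | mail -s "backup logs $(date +%F)" admin@example.com
    ```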