Basically title. I’m in the process of setting up a proper backup for my configured containers on Unraid, and I’m wondering how often I should run my backup script. Right now I have a cron job set to run on Monday and Friday nights; is this too frequent? What’s your schedule, and do you strictly back up your appdata (container configs), or do you include other data in your backups?
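
For reference, the current cron entry looks something like this (the script path is just an example, not my exact setup):

    # run the backup script Monday and Friday at 23:30
    30 23 * * 1,5  /boot/scripts/backup-appdata.sh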

  • @Darkassassin07@lemmy.ca · 27 days ago

    I run Borg nightly, backing up the majority of the data on my boot disk, including Docker volumes and config, plus a few extra folders.

    Each individual archive is around 550 GB, but thanks to de-duplication and compression it’s only ~800 MB of new data each day, and the backup takes around 3 minutes to complete.

    Borg’s de-duplication is honestly incredible. I keep 7 daily backups, 3 weekly, 11 monthly, then one for each year beyond that. The 21 historical backups I have right now would be 10.98 TB of data raw. After de-duplication and compression they only take up 407.98 GB on disk.

    With that kind of space savings, I see no reason not to keep such frequent backups. Hell, the whole archive takes up less space than one copy of the original data.
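
    The nightly job is roughly this shape (repo path, source dirs and compression level are placeholders rather than my exact script):

      export BORG_REPO=/mnt/backups/borg                 # placeholder repo location
      # nightly archive; dedup + compression is what keeps ~550 GB down to ~800 MB of new data per day
      borg create --stats --compression zstd \
          ::'{hostname}-{now:%Y-%m-%d}' \
          /etc /var/lib/docker/volumes /opt/extra        # example source dirs
      # retention matching the scheme above; the large yearly count effectively means "one per year, forever"
      borg prune --keep-daily 7 --keep-weekly 3 --keep-monthly 11 --keep-yearly 99
      borg compact                                       # reclaim space freed by pruning (borg 1.2+)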

    • Sips' (OP) · 27 days ago

      Thanks for sharing the details on this, very interesting!

    • @FryAndBender@lemmy.world · 26 days ago

      +1 for borg


                         Original size    Compressed size    Deduplicated size
      This archive:          602.47 GB          569.64 GB             15.68 MB
      All archives:           16.33 TB           15.45 TB            607.71 GB

                         Unique chunks       Total chunks
      Chunk index:             2703719           18695670
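
      For anyone wondering, that summary is what borg prints when you pass --stats to borg create, or run borg info on the repo (paths here are placeholders):

        borg create --stats /mnt/backups/borg::'{hostname}-{now}' /data
        borg info /mnt/backups/borg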

  • Andres Salomon · 27 days ago

    @Sunny Backups are done weekly, using Restic (and with --read-data-subset=9% to verify that the backup data is still valid).

    But that’s in addition to nightly SnapRAID syncs for the larger media, and Syncthing for photos & documents (which means I have copies on 2+ machines).
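
    Roughly, the weekly Restic job boils down to something like this (repository and paths are placeholders):

      # weekly backup run
      restic -r sftp:backup-host:/srv/restic backup /home /etc /var/lib
      # read back a random ~9% of the pack data to verify it
      restic -r sftp:backup-host:/srv/restic check --read-data-subset=9%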

  • @AMillionMonkeys@lemmy.world · 27 days ago

    I tried Kopia, but it was unstable and janky, so now it’s whenever I remember to manually run a bunch of rsync jobs. I back up my desktop to cold storage on the first of the month, so I should get in the habit of backing up my server to the NAS then as well.
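
    By “a bunch of rsync” I mean something along these lines (hostnames and paths are made up):

      rsync -aHv --delete /home/me/ nas:/backups/desktop/home/
      rsync -aHv --delete /srv/ nas:/backups/server/srv/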

  • @truxnell@infosec.pub · 27 days ago

    Daily backups. Currently using restic on my NixOS servers. To keep the backups consistent, I take a ZFS snapshot at 2am, and after that restic backs up my mutable data dirs to both my local NAS and Cloudflare R2. The NAS backup folder is also synced to Backblaze nightly as a colder copy.
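
    The nightly flow is roughly this (dataset name, repo URLs and the R2 account ID are placeholders; on NixOS it’s actually declared in the system config rather than run as a loose script):

      SNAP="restic-$(date +%F)"
      zfs snapshot tank/data@"$SNAP"                     # freeze a consistent view at 2am
      # back up the snapshot contents to both repositories (credentials come from the environment)
      restic -r rest:https://nas.local:8000/ backup "/tank/data/.zfs/snapshot/$SNAP"
      restic -r s3:https://ACCOUNT_ID.r2.cloudflarestorage.com/backups backup "/tank/data/.zfs/snapshot/$SNAP"
      zfs destroy tank/data@"$SNAP"                      # drop the snapshot once both runs finish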

  • itsame · 27 days ago

    Using Kopia, backups are made multiple times per day to Google Drive. Only changes are transferred.

    Configurations are backed up once per week (plus manual runs) and kept for 4 weeks. Website and Nextcloud data is backed up every hour and kept for a year (although I’ve only been doing this for 7 months so far).

    Kopia is magic, recommended!
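
    The per-source schedules and retention are just Kopia policies; assuming the repository is already connected, it’s roughly this (paths and retention numbers are illustrative):

      # hourly-snapshotted data, kept roughly a year
      kopia policy set /srv/nextcloud --keep-hourly 24 --keep-daily 30 --keep-monthly 12
      # weekly config backups, kept 4 weeks
      kopia policy set /etc/configs --keep-weekly 4
      kopia snapshot create /srv/nextcloud /etc/configs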

  • @desentizised@lemm.ee · 26 days ago

    rsync from ZFS to an off-site Unraid box every 24 hours, 5 days a week. On the sixth day it does a checksum-based rsync, which obviously means more stress, so I only do that once a week. The seventh day is reserved for ZFS scrubbing, every two weeks.
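
    In shell terms it’s roughly this (paths and host are illustrative):

      # nightly run: quick, relies on size/mtime to detect changes
      rsync -a --delete /tank/data/ offsite-unraid:/mnt/user/backup/data/
      # weekly run: -c forces full checksum comparison, so every file is read on both ends
      rsync -ac --delete /tank/data/ offsite-unraid:/mnt/user/backup/data/
      # every other week: verify on-disk checksums
      zpool scrub tank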

  • @HeyJoe@lemmy.world · 27 days ago

    I honestly don’t have too much to back up, so I run one full backup job every Sunday for the different directories I care about. The jobs check each directory and only copy changed or new files. I don’t have the space to back up everything, so I only take the smaller, most important stuff. The backup software also allows live monitoring if I enable it, and I have that turned on for some of my jobs since I didn’t see any reason not to. To save money, when a NAS drive starts reporting errors I replace it with a new one and reuse the old drive for these backups. So far, so good.

    The backup software is Bvckup 2; Reddit was a huge fan of it years ago, so I gave it a try. A lifetime license was super cheap at the time, and it’s super lightweight. Sorry, there is no Linux version.

  • battlesheep · 26 days ago

    I back up all of my Proxmox LXCs/VMs to a Proxmox Backup Server every night, and sync those backups to another PBS in another town. A second Proxmox backup runs every noon to my NAS. (I know, the 3-2-1 rule isn’t quite met…)
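
    The nightly and noon jobs are roughly vzdump runs against different storages (the storage names are placeholders; the copy to the second PBS is a separate sync job configured on that server):

      # nightly, to the local Proxmox Backup Server
      vzdump --all --mode snapshot --storage pbs-local
      # noon, to the NAS-backed storage
      vzdump --all --mode snapshot --storage nas-backup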

  • Shimitar · 26 days ago

    Daily, to all three of my locations:

    • local on the server
    • in-house but on a different device
    • offsite

    But not all three destinations back up the same amount of data, due to storage limitations.

  • @JASN_DE@lemmy.world · 27 days ago

    Nextcloud data daily, same for the docker configs. Less important/rarely changing data once per week. Automatic sync to NAS and online storage. Irregular and manual sync to an external disk.

    7 daily backups, 4 weekly backups, “infinite” monthly backups retained (until I clean them up by hand).