I just started getting into self-hosting using Docker Compose and I'm wondering about possible backup solutions. So far I only have to save my Docker config, but I want to back up host files as well. What software and hardware are you using for backup?

    • neardeaf@lemm.ee · 1 year ago

      Seconding this. On my unRAID host, I run a Docker container called “Vorta” that uses Borg as its backend mechanism to back up to my Synology NAS over NFS. Then on my Syno, I run two backup jobs using Hyper Backup: one goes to my cousin’s NAS over a site-to-site OpenVPN connection between our edge devices (Ubiquiti UniFi Security Gateway Pro <-> UDM Pro), the other goes to Backblaze B2 cloud storage.

      OP, let me know if you need any assistance setting something like this up. Gotta share the knowledge over here on Lemmy that we’re still used to searching evil Reddit for.
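      For anyone curious what Vorta is doing under the hood, the nightly Borg job boils down to a handful of commands. A minimal sketch; the repo path, archive name, and source dirs are placeholder assumptions (commands are echoed rather than executed, so it's safe to paste and inspect first):

```shell
#!/bin/sh
# Sketch of the kind of Borg job Vorta schedules for you.
# Repo path, passphrase handling, and source dirs are placeholders,
# not the actual setup described above.
REPO=/mnt/nas/borg-repo              # e.g. the NAS mounted over NFS
ARCHIVE="docker-$(date +%F)"         # one archive per day, named by date

# Echoed so this sketch runs without touching anything:
echo borg init --encryption=repokey "$REPO"                  # one-time
echo borg create --stats "$REPO::$ARCHIVE" /docker /etc      # nightly
echo borg prune --keep-daily=7 --keep-monthly=3 "$REPO"      # retention
```

      Vorta adds scheduling and a GUI on top, but it's the same init/create/prune cycle underneath.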

        • neardeaf@lemm.ee · 10 months ago

          Niiiice, quick question, are both of y’all running the latest UniFi Controller version & using the new WebUI view layout?

          • PlutoniumAcid@lemmy.world · 10 months ago

            His gear is v7 (UniFi and also Synology DSM) and I am still on v6 because I didn’t have a good reason to upgrade. If it works, don’t fix it, you know? Feature-wise they’re the same anyway, just a different UI. But sure, give me a good reason to upgrade, and I will :)

    • Llamajockey@lemmy.world · 1 year ago

      I had to upgrade to Hopes&Prayers+ after I ran out of hope and my prayers kept getting returned to sender.

    • webjukebox@mujico.org · edited · 1 year ago

      I was in the same boat, until my prayers went unanswered and my hopes died.

      I lost some important data from my phone a few days ago. My plan was to back up at night, but chaos struck that same day in the morning.

  • 0110010001100010@kbin.social · 1 year ago

    Local backup to my Synology NAS every night which is then replicated to another NAS at my folks house through a secure VPN tunnel. Pretty simple and easy to deploy.

      • neardeaf@lemm.ee · 1 year ago

        Most likely Hyper Backup & Hyper Backup Vault, two applications built into Synology’s DSM software that runs on their NAS devices.

  • bier@lemmy.blahaj.zone · 1 year ago

    My 20 TB of storage is currently hosted by Hetzner as an SMB share with an accompanying server; the storage is accessible via NFS/SMB. I have a Windows 10 VPS running Backblaze Personal Backup for $7/month with unlimited storage, mounting the SMB share as a “physical drive” using Dokan, because Backblaze Personal Backup doesn’t allow backing up network shares. If your storage is local, you can run the Windows backup agent in a Docker container.

  • francisco_1844@discuss.online · 1 year ago

    Restic for backup - can send backups to S3 and SFTP amongst other target options.

    There are S3-compatible object storage services, such as Backblaze’s B2, which are very affordable for backups.
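    To make that concrete: restic speaks B2 natively (and generic S3 as well). A minimal sketch, where the account keys, bucket name, and backup path are placeholder assumptions (commands are echoed rather than executed, so the sketch is inert):

```shell
#!/bin/sh
# restic-to-B2 sketch; keys, bucket, and paths below are placeholders.
export B2_ACCOUNT_ID="000xxxxxxxx0000000001"
export B2_ACCOUNT_KEY="K000xxxxxxxxxxxxxxxxxxx"
export RESTIC_PASSWORD="changeme"       # encrypts the repository
REPO="b2:my-backup-bucket:server"

# Echoed so the sketch is safe to run as-is:
echo restic -r "$REPO" init                       # one-time repo setup
echo restic -r "$REPO" backup /docker             # nightly backup
echo restic -r "$REPO" forget --keep-daily 7 --keep-weekly 4 --prune
```

    The forget/prune line is the retention policy; adjust the keep counts to taste.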

  • DataDreadnought@lemmy.one · 1 year ago

    I doubt you’re using NixOS, so this config might seem useless, but at its core it’s just a systemd timer, a service, and some bash scripting.

    To convert this to another OS, use cron to call the script at the time you want. Copy the part between the script = '' quotes, then change out variables like the location of docker-compose, since it’s different on NixOS.

    Let me explain the script. We start by defining the backupDate variable; this will be the name of the zip file. As of now that variable would be 2023-07-12. We then go to each folder with a docker-compose.yml file and take the stack down. You could also replace down with stop if you don’t plan on updating each night like I do. I use rclone to connect to Dropbox, but rclone supports many providers, so check it out and see if it has the one you need. Lastly, I use rclone to delete anything older than 7 days in the Dropbox backup folder. If you end up going my route and get stuck, let me know and I can help out. Good luck.

    systemd = {
      timers.docker-backup = {
        wantedBy = [ "timers.target" ];
        partOf = [ "docker-backup.service" ];
        timerConfig.OnCalendar = "*-*-* 3:30:00";
      };
      services.docker-backup = {
        serviceConfig.Type = "oneshot";
        serviceConfig.User = "root";
        script = ''
          backupDate=$(date +'%F')

          cd /docker/apps/rss
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/apps/paaster
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/no-backup-apps/nextcloud
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/apps/nginx-proxy-manager
          ${pkgs.docker-compose}/bin/docker-compose down

          cd /docker/backups/
          ${pkgs.zip}/bin/zip -r server-backup-$backupDate.zip /docker/apps

          cd /docker/apps/nginx-proxy-manager
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/apps/paaster
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/apps/rss
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/no-backup-apps/nextcloud
          ${pkgs.docker-compose}/bin/docker-compose pull
          ${pkgs.docker-compose}/bin/docker-compose up -d

          cd /docker/backups/
          ${pkgs.rclone}/bin/rclone copy server-backup-$backupDate.zip Dropbox:Server-Backup/
          rm server-backup-$backupDate.zip
          ${pkgs.rclone}/bin/rclone delete --min-age 7d Dropbox:Server-Backup/
        '';
      };
    };
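
    For the cron route mentioned above, the whole timer half of this config translates to a single crontab line; the script path here is a placeholder, and 30 3 mirrors the 3:30 OnCalendar schedule:

```
# /etc/crontab entry (path to the extracted script is a placeholder):
# run the backup script nightly at 03:30 as root
30 3 * * * root /usr/local/bin/docker-backup.sh
```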
    
    
    • thejevans@lemmy.ml · 1 year ago

      Thanks! I just started setting up NixOS on my laptop and I’m planning to use it for servers next. Saving this for later!

  • lynny@lemmy.world · 1 year ago

    Someone on lemmy here suggested Restic, a backup solution written in Go.

    I back up to an internal 4TB HDD every 30 minutes. My most important files are stored in an encrypted file storage online in the cloud.

    Restic is good stuff.

    • mr.nEJC@lemmy.world · 1 year ago

      Thanks for this tip. Seems interesting - watched this tutorial/presentation video. Will try it out asap 😅

  • stown@sedd.it · 1 year ago

    I host everything on Proxmox VMs, so I just take daily snapshots to my NAS.

  • angrox@feddit.de · 1 year ago

    At home I have a Synology NAS for backups of the local desktops. Offsite backups are done with restic to Backblaze B2 and to another location.

  • thisbenzingring@lemmy.sdf.org · 1 year ago

    Veeam is pretty simple and powerful; the community version is free if you are only using it for a small environment (CPU cores are what it counts).

    I haven’t used it for Docker, but it says it is supported.

    • vairfoley@reddthat.com · 1 year ago

      I use Veeam to backup shares on my NAS to rotated external drives. I also backup a Linux server.

  • pirate526@kbin.social · 1 year ago

    I run a second Unraid server with a couple of backup-related applications, as well as Duplicati. I have my main server network-mounted and run scheduled jobs that copy data from the main pool to the backup pool, as well as to Backblaze. It’s nice having an on-site backup as well as a cloud-based one.

    I occasionally burn to 100 GB Blu-rays as well for a physical backup.

  • WxFisch@lemmy.world · 1 year ago

    Backblaze B2. Any software that is S3 compatible can use B2 as the target and it’s reasonably priced for the service. I backup all the PCs and services to a Synology NAS and then backup that to B2 (everything except my Plex media, that would be pricy and it’s easy enough to re-rip from disc if needed).

  • RxBrad@lemmy.world · 1 year ago

    Rsnapshot to an external USB drive.

    Probably not the best, but it works for my little 6TB OpenMediaVault server with some Docker thrown in.

  • Chifilly@lemmy.world · 1 year ago

    I use kup to back up my important PC files (the basic pre-installed backup software on KDE neon), which backs up to a separate drive on my PC, and that gets synced to my Nextcloud instance on my local server, and that - along with all the other data for my containers running on it - gets backed up by Kopia to DigitalOcean spaces.

    I couldn’t recommend Kopia strongly enough, because you have such fine control of what gets backed up, when it gets backed up, how many to keep etc. and it is versioned so doesn’t grow exponentially, and it compresses and encrypts the backup. I also have a setup where it executes a script before and after the backup starts that stops and starts the containers to maintain file integrity since nothing will be writing to the files. And it’s also a Docker container so it can just fit into your current compose setup.