I need a backup solution, and I wanted to explore how Duplicati worked. I’ve been using Attic, but the code seems to be old and not getting updated. Before I go installing it on my actual server, I wanted to run through it once in a virtual environment.
Spoiler alert: Duplicati didn’t do what I wanted. I’m glad I ran through it in a test environment before committing to it. It seems like it is designed for personal or single computer setups. It doesn’t have a good way to copy files over the network to the server. It might be able to copy them to a remote drive, but not pull them from a remote computer/laptop.
So, here’s a walkthrough of what I tried.
I have two virtual machines set up to test this out. One is an Ubuntu 22.10 Server machine. That’s the machine where I want to install Duplicati and house all the backups. The other is an Ubuntu 22.10 Desktop machine. That’s the machine I want to back up. Both machines are basically default installations.
The only non-default thing is that I set up the SSH server. I added a host-only network adapter to the virtual machine. Then, I can use the “ip address” command to find that the address is 192.168.56.102, so I can use “ssh 192.168.56.102” on the host to get access. This makes it much easier to copy and paste commands to run.
I did put a few files on the server to test backing up. I tried to pick a few different file formats, with enough total size to exercise the deduplication feature. Here’s what I have:
$ du -h *
4.0K	February2023_8933.qfx
16M	Fiery Serpents.pptx
8.0K	Google Fi Summary.xlsx
2.6M	IMG_20230227_184723916_HDR.jpg
3.0M	IMG_20230301_105021853.jpg
3.9M	IMG_20230301_111744010.jpg
664K	matthew_backup.sql
24K	Online_Security_SQL_problem.docx
39M	Photos-001.zip
2.7M	PXL_20230224_140417679.MP
4.2M	PXL_20230224_140417679.MP.jpg
1.6M	PXL_20230224_192040851.jpg
1.4M	PXL_20230227_004318122.jpg
2.4M	PXL_20230301_175831837.MP
4.5M	PXL_20230301_175831837.MP.jpg
4.2M	PXL_20230301_175845815.MP
6.3M	PXL_20230301_175845815.MP.jpg
2.0M	PXL_20230302_005003487.jpg
32M	Refiner's Fire - How is Silver Refined.webm
And, to confirm the files’ contents, I did a checksum:
$ md5sum *
45b51f0e1f87b900725196cef83a7844  February2023_8933.qfx
c0f42ab9f71d1b8e02da0df45d1dbc43  Fiery Serpents.pptx
66a9dc3f4fd2095af80867779208fb43  Google Fi Summary.xlsx
4fef662d0ebe7013cf46fd5b3ae2fcfa  IMG_20230227_184723916_HDR.jpg
fc3a57346262c7f2c4210fe138b93378  IMG_20230301_105021853.jpg
f994e530f02f7dda138a2b70f6679a43  IMG_20230301_111744010.jpg
8979d1b730e1aa730de831edb7329fcf  matthew_backup.sql
3572c134abd3a9b0cc0a6bb6098393d1  Online_Security_SQL_problem.docx
c9c20e556496c19ed782ce5de695b0e1  Photos-001.zip
d0f26d34c92d08c2b7cc197fffb1d680  PXL_20230224_140417679.MP
5702a93b3e97f7c00e1dffe5e74e081d  PXL_20230224_140417679.MP.jpg
94d540e676b411403076fc23ae9f85ed  PXL_20230224_192040851.jpg
77978e3dea655197d629ded492d02c56  PXL_20230227_004318122.jpg
ad4e9d94a21baea012edf081560fe18e  PXL_20230301_175831837.MP
bc51e401e941496c135b3d16d23f2ede  PXL_20230301_175831837.MP.jpg
eef1653209cc54f3b2e70290285e8afd  PXL_20230301_175845815.MP
254a6930c5fdf632b52f24716172e458  PXL_20230301_175845815.MP.jpg
1d3b32fea98d19f079b72baf89b1e0d4  PXL_20230302_005003487.jpg
803cec96b838bf165b8ca9a8d66d66bd  Refiner's Fire - How is Silver Refined.webm
The total size is 124M.
$ du -hs .
124M	.
I followed the install instructions on the server. The first step is to install Mono. I couldn’t find any instructions for Ubuntu 22.10, so I just went with the 20.04 (focal) instructions:
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 3FA7E0328081BFF6A14DA29AA6A19B38D3D831EF
echo "deb https://download.mono-project.com/repo/ubuntu stable-focal main" | sudo tee /etc/apt/sources.list.d/mono-official-stable.list
sudo apt update
sudo apt install mono-devel gtk-sharp2 libappindicator0.1-cil libmono-2.0-1
There are a few prerequisite packages suggested:
sudo apt install apt-transport-https nano git-core software-properties-common dirmngr -y
Then, I downloaded Duplicati from the download page. Rather than try to download it on the host and transfer, I just used the Ubuntu link and wget:
wget https://updates.duplicati.com/beta/duplicati_220.127.116.11-1_all.deb
sudo apt install ./duplicati_18.104.22.168-1_all.deb
After installing, I created the duplicati service file. This should ensure that it stays running in the background.
sudo vi /etc/systemd/system/duplicati.service
I put this in the file:
[Unit]
Description=Duplicati web-server
After=network.target

[Service]
Nice=19
IOSchedulingClass=idle
EnvironmentFile=-/etc/default/duplicati
ExecStart=/usr/bin/duplicati-server $DAEMON_OPTS
Restart=always

[Install]
WantedBy=multi-user.target
I went ahead and updated the options from the installation instructions:
sudo vi /etc/default/duplicati
Here is the option line:
DAEMON_OPTS="--webservice-interface=any --webservice-port=8200 --portable-mode"
Then, I enabled the service…
sudo systemctl enable duplicati.service
sudo systemctl daemon-reload
sudo systemctl start duplicati.service
sudo systemctl status duplicati.service
I used the GUI web page to configure Duplicati. The IP address came from the “ip address” command, and I used the default port 8200.
When I first logged in, I saw this:
I clicked Yes because I have this available on my network where others could get to it. That took me to the settings page where I entered a new password.
When I clicked OK on the settings page, it told me that I wasn’t logged in, and it made me enter the password. My new password worked to get me in.
Backing Up the Server
To start, I tried backing up my test files on the server itself. I used the Add Backup option on the menu.
I picked a new configuration and entered some general information:
I created a new directory in /opt.
For the source, I picked the directory where all my files are.
I went ahead and scheduled it, though I don’t intend to leave the machine on overnight. Afterward, on the home screen, I found where I could run the backup now.
After the backup was done, here’s what the backup directory looks like:
$ ls -lh /opt/backup/
total 124M
-rw-r--r-- 1 root root 3.0K Mar  2 04:41 duplicati-20230302T044042Z.dlist.zip.aes
-rw-r--r-- 1 root root  50M Mar  2 04:41 duplicati-b457de0ebf23144fd8b58a412a48ee27b.dblock.zip.aes
-rw-r--r-- 1 root root  50M Mar  2 04:40 duplicati-b66203a3f4e9446e88840747b9ca24f61.dblock.zip.aes
-rw-r--r-- 1 root root  24M Mar  2 04:41 duplicati-bf474459f125b4f1088e99fc33f77a991.dblock.zip.aes
-rw-r--r-- 1 root root  23K Mar  2 04:41 duplicati-i07035fa9fb2244868e5e2f063241c7c8.dindex.zip.aes
-rw-r--r-- 1 root root  32K Mar  2 04:41 duplicati-i1f10d47aa65d4705894dd8166dec9ac9.dindex.zip.aes
-rw-r--r-- 1 root root  36K Mar  2 04:40 duplicati-i38b3fe631c2f45f4a62d304be08e0aed.dindex.zip.aes
And, the backup directory is 124MB. That tells me that there is no compression; the SQL file in my test set should have been considerably smaller if there were.
$ du -hs /opt/backup/
124M	/opt/backup/
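To be fair, most of my test set is already-compressed media (jpg, zip, webm), which won’t shrink no matter what Duplicati does; the 664K SQL dump is the main compressible file, and its savings could easily hide in the rounding of du. As a standalone sanity check (using a generated file, not my actual dump), here’s roughly how much repetitive SQL-like text shrinks under gzip:

```shell
# Generate a repetitive SQL-like text file, then compare its gzipped size.
printf 'INSERT INTO t VALUES (1, "aaaa");\n%.0s' $(seq 1000) > /tmp/sample.sql
gzip -c /tmp/sample.sql > /tmp/sample.sql.gz
orig=$(wc -c < /tmp/sample.sql)
comp=$(wc -c < /tmp/sample.sql.gz)
echo "original: $orig bytes, gzipped: $comp bytes"
```

Text like that compresses to a small fraction of its size, so a backup tool that compresses would show it.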
Restoring on the server
So, now, before we declare this a victory, we have to be able to restore the files. I just used the Restore feature in the web GUI.
It allows me to choose which files I want to restore. For now, I’ll just choose all of them.
I picked a different directory to restore to so that I could confirm it worked.
To check, I created an md5sum file from my original files. Then, I ran the checks against the restored files.
$ cd my_test_data/
$ md5sum * > ../checks.md5sum
$ cd ../my_test_restore/
$ md5sum --check ../checks.md5sum
February2023_8933.qfx: OK
Fiery Serpents.pptx: OK
Google Fi Summary.xlsx: OK
IMG_20230227_184723916_HDR.jpg: OK
IMG_20230301_105021853.jpg: OK
IMG_20230301_111744010.jpg: OK
matthew_backup.sql: OK
Online_Security_SQL_problem.docx: OK
Photos-001.zip: OK
PXL_20230224_140417679.MP: OK
PXL_20230224_140417679.MP.jpg: OK
PXL_20230224_192040851.jpg: OK
PXL_20230227_004318122.jpg: OK
PXL_20230301_175831837.MP: OK
PXL_20230301_175831837.MP.jpg: OK
PXL_20230301_175845815.MP: OK
PXL_20230301_175845815.MP.jpg: OK
PXL_20230302_005003487.jpg: OK
Refiner's Fire - How is Silver Refined.webm: OK
Everything seems to look great at this point.
From the Command Line
Now, I wanted to see if I could script the backup from the command line. I found I could cheat and export my GUI job; there’s an export option on the Home page.
I used the Command-line option:
Here’s the command line that I got:
mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup file:///opt/backup /home/skp/my_test_data/ \
  --backup-name="My Test directory" \
  --dbpath=/usr/lib/duplicati/data/LOHYBEDFWH.sqlite \
  --encryption-module=aes --compression-module=zip --dblock-size=50mb \
  --passphrase="This is my test" --disable-module=console-password-input
When I ran it, it failed. It said it couldn’t open the database. I think it’s a permission thing.
Failed to load connection with path '/usr/lib/duplicati/data/LOHYBEDFWH.sqlite'. => Unable to open the database file
Mono.Data.Sqlite.SqliteException (0x80004005): Unable to open the database file
  at Mono.Data.Sqlite.SQLite3.Open (System.String strFilename, Mono.Data.Sqlite.SQLiteOpenFlagsEnum flags, System.Int32 maxPoolSize, System.Boolean usePool) [0x00077] in <132a2409ede24587885b92b7f48af81f>:0
After adding sudo in front, the command worked fine:
Checking remote backup ...
Listing remote folder ...
Scanning local files ...
0 files need to be examined (0 bytes)
Uploading file (22.86 KB) ...
Uploading file (1.03 KB) ...
Uploading file (2.93 KB) ...
Compacting remote backup ...
Checking remote backup ...
Listing remote folder ...
Verifying remote backup ...
Remote backup verification completed
Downloading file (2.93 KB) ...
Downloading file (1.03 KB) ...
Downloading file (23.55 MB) ...
Duration of backup: 00:00:04
Remote files: 10
Remote size: 123.59 MB
Total remote quota: 13.67 GB
Available remote quota: 5.75 GB
Files added: 0
Files deleted: 0
Files changed: 2
Data uploaded: 26.82 KB
Data downloaded: 23.56 MB
Backup completed successfully!
I can see that it picked up on the 2 files that I changed.
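To run the exported job on a schedule, it could be dropped into a small root-owned wrapper script for cron. This is only a sketch: the /etc/duplicati/passphrase file is my own invention, and it assumes Duplicati’s CLI picks the passphrase up from the PASSPHRASE environment variable when --passphrase is omitted (check the Duplicati documentation before relying on that).

```shell
#!/bin/sh
# Hypothetical cron wrapper around the exported Duplicati command line.
set -eu
# Read the passphrase from a root-only file instead of hard-coding it here.
PASSPHRASE=$(cat /etc/duplicati/passphrase)
export PASSPHRASE
exec mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
    file:///opt/backup /home/skp/my_test_data/ \
    --backup-name="My Test directory" \
    --dbpath=/usr/lib/duplicati/data/LOHYBEDFWH.sqlite \
    --encryption-module=aes --compression-module=zip --dblock-size=50mb \
    --disable-module=console-password-input
```

Keeping the passphrase out of the command line also keeps it out of ps output and shell history.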
Backing Up My “Remote” computer
So, the “server” computer is pretty easy to back up, both from the web GUI and the command line. Now, the question is whether we can back up a remote computer. So, on the “remote” computer, I installed SSH.
sudo apt install openssh-server
Next, back on the server, I generated a key. Rather than putting my email in the comment, I put duplicati, and I changed the file name to id_backup. I also left the passphrase blank.
$ ssh-keygen -t rsa -b 4096 -C "duplicati"
Generating public/private rsa key pair.
Enter file in which to save the key (/home/skp/.ssh/id_rsa): /home/skp/.ssh/id_backup
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/skp/.ssh/id_backup
Your public key has been saved in /home/skp/.ssh/id_backup.pub
The key fingerprint is:
SHA256:YXC5/yqpjwJwtiZ1LTwbfmNNNXCm5jTAVQ3x1/1/NxE duplicati
The key's randomart image is:
+---[RSA 4096]----+
| .o.++*+ |
| .+.+o.. ..|
| . . Bo .. .Eo|
|. + * .=oo . o|
| = + = oS. ..|
|. + o + . . o|
| o . o . . . .+|
| . .o . +|
| .oo.... |
+----[SHA256]-----+
Next, I used ssh-copy-id to copy the key over to the remote computer.
$ ssh-copy-id 10.0.2.15
/usr/bin/ssh-copy-id: INFO: Source of key(s) to be installed: "/home/skp/.ssh/id_backup.pub"
/usr/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/usr/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
firstname.lastname@example.org's password:

Number of key(s) added: 1

Now try logging into the machine, with:   "ssh '10.0.2.15'"
and check to make sure that only the key(s) you wanted were added.
Now, to connect, I have to specify the key. That’s only because I changed the key name from id_rsa.
ssh -i ~/.ssh/id_backup 10.0.2.15
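Alternatively, the key could be pinned in ~/.ssh/config so that plain ssh (and anything built on it, like sshfs) finds it automatically. A sketch of such an entry:

```
Host 10.0.2.15
    User skp
    IdentityFile ~/.ssh/id_backup
```

With that in place, “ssh 10.0.2.15” should work without the -i flag.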
Next, I need to mount the “remote” computer’s file system so that Duplicati can get to it. I used sshfs, which I needed to install.
sudo apt install sshfs
Then, I created a directory in my home directory for mounting. And, I used the sshfs command to mount it.
$ mkdir mnt
$ sshfs -o ssh_command='ssh -i ~/.ssh/id_backup' 10.0.2.15:/home/skp ./mnt
$ ls mnt
Now that the folder is accessible, I can use the GUI to set up a backup. I just selected that mnt folder that I created. (Note: I forgot the trailing slash in the following screenshot, but you get the point.)
When I ran that backup, I got an Unauthorized to access source folder error message. I assume that this is because I have it mounted under my user while Duplicati is running as root. Let me try moving it to the /mnt folder…
umount mnt
sudo sshfs -o ssh_command='ssh -i /home/skp/.ssh/id_backup' email@example.com:/home/skp /mnt
sudo ls /mnt
Then, I updated my backup configuration to point to /mnt instead of the mnt in the home directory.
I did get this message, which may be because my files are the same on both computers. I chose to run the repair.
I never could get this backup to work, and I’m not exactly sure I understand why. I ran out of motivation to play with it because it was clear it wasn’t designed for this.
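In hindsight, the original Unauthorized error was probably FUSE’s default behavior: a FUSE mount is visible only to the user who mounted it, and even root is locked out. The usual workaround, rather than re-mounting as root, is the allow_other mount option, which requires user_allow_other to be uncommented in /etc/fuse.conf. A sketch:

```
# /etc/fuse.conf must contain an uncommented line: user_allow_other
sshfs -o allow_other,ssh_command='ssh -i /home/skp/.ssh/id_backup' \
    10.0.2.15:/home/skp /mnt
```

Mounted this way as my own user, the root-run Duplicati service should have been able to read the files.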
I probably should have installed Duplicati on my test-desktop computer. Then, I could expose my backup directory with sftp and have the test-desktop send the files to the server. Maybe some other time I’ll explore that idea.
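If I ever revisit it, the push setup on the desktop would look roughly like the exported command line, but with Duplicati’s SSH/SFTP backend as the destination. This is a guess at the URL syntax from memory; verify the backend’s option names against the Duplicati documentation before using it:

```
mono /usr/lib/duplicati/Duplicati.CommandLine.exe backup \
    "ssh://192.168.56.102//opt/backup?auth-username=skp" \
    /home/skp/Documents \
    --backup-name="Desktop to server" \
    --passphrase="..."
```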