Have a bit of time on my hands (all COVID induced…) and trying out / comparing the different backup solutions. I hope to make this a living resource, so happy to do more testing based on feedback and include it here. Do let me know any testing desired and I'll see what I can do.

Backup source is ~107k files totalling 460GB of storage:

- photo/video collection (about 95% of the storage) - long tail here, with most files being small (3-4MB jpgs) but with some largish videos making up the majority of the space.

I currently use Duplicati / Arq5 as "production" backup (production in so far as I still experiment with them…!)

Computer is a desktop PC connected via wifi to a gigabit pipe to the world. Wifi bandwidth to the router is generally 30-50MB/s in favourable (large-file bulk transfer) conditions.

Borg - no Windows binary available; currently running through a Docker container. (TO DO: update when WSL2 is out and stable)

Borg/Restic - very important: disabled the Windows antivirus "real-time protection" for the backup folder/process. These Linux port-ins do not set themselves up properly, and without this the backup times can be 30 times worse. File modification detection is based on size and modification date, not inode.

Duplicati - the backups have the option "backup-test-samples" set to zero. This is a change from the default. It means that after checking for updates, Duplicati will not download one block and test it for consistency. While this is not the default, as mentioned above Duplicati is my production backup, and I have found this step adds an unnecessary slowdown, so I have had it disabled for years.

The table is getting a bit wide due to requests for different backends; if you're having difficulty reading it, you can download the Excel version here.
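For reference, the Borg-in-Docker arrangement is along these lines - a sketch only, since there was no official Windows image at the time; the image name below is a hypothetical stand-in for whichever Borg image you use, and the paths are placeholders:

```powershell
# Mount the source read-only and the repository read-write into a Linux container.
# <borg-image> is a hypothetical placeholder - substitute a Borg image you trust.
docker run --rm `
  -v C:\Photos:/source:ro `
  -v D:\BorgRepo:/repo `
  <borg-image> `
  borg create --stats '/repo::photos-{now}' /source
```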
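The antivirus exclusions can be scripted with the built-in Defender cmdlets; a minimal sketch, assuming Defender is the antivirus in question, with placeholder paths and process names:

```powershell
# Run from an elevated PowerShell prompt.
# Exclude the backup source, target and process from real-time scanning.
Add-MpPreference -ExclusionPath "C:\Photos"        # backup source (placeholder)
Add-MpPreference -ExclusionPath "D:\BackupRepo"    # local target (placeholder)
Add-MpPreference -ExclusionProcess "restic.exe"    # the backup tool itself

# Verify what is currently excluded.
Get-MpPreference | Select-Object ExclusionPath, ExclusionProcess
```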
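And the Duplicati tweak, as it would look from the command line - the target URL, source path and passphrase are examples, not my actual job:

```powershell
# --backup-test-samples=0 skips the post-backup step where Duplicati
# downloads a sample of the uploaded volumes and verifies them.
Duplicati.CommandLine.exe backup `
  "ssh://user@backuphost//backups/photos" `
  "C:\Photos" `
  --backup-test-samples=0 `
  --passphrase="..."
```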
Hi there! Thanks for this; however, having different target backends for each is a bit misleading, as backing up to a local HDD is always going to be faster (unless you have more than gigabit on upload). Also, it would be great to have a single target cloud backend to further improve consistency. Two more things that could be tested are incremental updates with some changes, and restore times.

Agree that comparability is going to be difficult. FYI - I am actually on a gigabit connection, but the issue isn't really bandwidth, as the "little file" approach means the link is unlikely to be saturated. The bigger issue is latency, where a local HDD will hose the other providers. Unfortunately, I have limited disk space and can't keep a 400GB backup store for each backup provider, so this is unlikely to be revised. Updates and restore timings - it is the production backup, so basically when I have more baby photos to back up I will update. Restores I will tackle after a few rounds of updates - likely to be artificially induced restores, so any suggestions on setup I can look into…

I have been using both Duplicati and Cloudbacko together for just over 3 years now. I first heard about Duplicati v1 through Steve Gibson's podcast, so I have known about Duplicati for a long time. For the avoidance of doubt, I am not here as an "advertiser" for Cloudbacko. I mentioned Cloudbacko because I thought that, like me, you were always on the lookout for cloud backup programs, and that is the reason why you did the comparison. So, please don't worry about doing a Cloudbacko comparison for me; that was not my intention.

In case you are interested, the reason why I use both Duplicati and Cloudbacko is simple (in my eyes!): at the time, I struggled to get VSS working with Duplicati, and Cloudbacko works with VSS out of the box (I realise now that you have to have Duplicati running as a service, but back then the helpful guides hadn't been written!). Secondly, when I had Cloudbacko backing up a large amount of data to OneDrive, it was fine for a while and then started having errors, and I had to rebuild the backup a few times (a pain on my slow connection). After a bit of research on Cloudbacko's forums I realised that OneDrive isn't great at being backed up to, so I changed to an SFTP server, but from what I remember I still had problems, so I switched to using Duplicati. There have been many Cloudbacko program revisions and bugs worked out since this experience 3 years ago, however. So I keep Cloudbacko backing up a tiny subset of program data that I need VSS for, and it has remained fine on SFTP. I had a need recently to trial Cloudbacko on another PC for 30 days and didn't have any issues with it backing up to Azure. I've only recently come across Restic and Arq.

I've used restic in the past and just started to use Duplicacy, because pruning requires full downloads/repacking/re-uploads of most of the content with a restic repo - which just isn't feasible on a slower internet connection. Thank you for the comparison data that you have provided.
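On the VSS point above, for anyone reading along: with Duplicati running as a service (so it has the needed privileges), VSS is switched on through the snapshot-policy option - a sketch with placeholder job details:

```powershell
# --snapshot-policy=required aborts the backup if a VSS snapshot cannot be taken
# (use "auto" to fall back to a plain copy instead). Needs elevated rights,
# e.g. Duplicati running as a Windows service. Paths and URL are placeholders.
Duplicati.CommandLine.exe backup `
  "ssh://user@backuphost//backups/appdata" `
  "C:\ProgramData\SomeApp" `
  --snapshot-policy=required `
  --passphrase="..."
```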
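And for context on the restic pruning cost in the last reply: the expensive step is forget/prune, which repacks partially-unreferenced pack files. Something like the following, with example repository and retention values:

```powershell
# Drop old snapshots per the retention policy, then prune unreferenced data.
# Pruning downloads and rewrites pack files that are only partly still referenced -
# the repack/re-upload traffic described above.
restic -r sftp:user@backuphost:/backups/photos forget `
  --keep-daily 7 --keep-weekly 4 --keep-monthly 12 `
  --prune
```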