Hello everyone! I'm having a problem that I'm sure others have had, but I'm struggling to figure out the easiest and most reliable solution. We're a small, growing company trying to use low-cost solutions for the moment.
It might be the classic "long fat pipe" situation, but I'm a bit surprised that there are no clean and easy OOTB solutions, at least not that I've directly found.
We have two sites, each with a 1 Gbit fibre connection to the internet. Ping times are very low: Google is under 1 ms, and site-to-site is under 5 ms. SpeedTest.net on both ends shows full gigabit with multiple connections, and single connections easily exceed 600 Mbit on each side.
We have backups that are > 1 TB in size that we want sent from one site to the other. The backup software we're using doesn't support splitting the file, and it's better to have it as a single file for instant-recovery purposes. Herein lies the problem.
I see several different solutions that handle multi-stream or segmented file transfers, but they all (as far as I can tell) split the file into multiple pieces locally, then recombine them afterward. Considering both Windows and Linux support sparse files, I have no idea why this is the default behavior. Recombining a 1 TB file this way is incredibly inefficient and slow, especially as we're using external 2 TB USB drives on the receiving end for the offsite copy of the backup (we have them on hand and they work fine for redundant storage).
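To illustrate why the recombine step shouldn't be necessary: on both Windows and Linux you can pre-size a file and have parallel workers write their segments directly at the right offsets, with the untouched regions stored as sparse holes. A minimal sketch in Python (the segment size, `fetch` callback, and file names here are illustrative, not any particular tool's API):

```python
import os
from concurrent.futures import ThreadPoolExecutor

SEGMENT = 64 * 1024 * 1024  # 64 MiB per segment (illustrative choice)

def write_segment(path, offset, data):
    # Each worker opens its own handle and writes at its own offset.
    # Writing past the current data creates sparse holes; nothing
    # ever needs to be recombined afterward.
    with open(path, "r+b") as f:
        f.seek(offset)
        f.write(data)

def assemble(path, total_size, fetch):
    # Pre-create the target at its final size; on NTFS/ext4/XFS this
    # allocates (almost) no disk blocks until segments are written.
    with open(path, "wb") as f:
        f.truncate(total_size)
    with ThreadPoolExecutor(max_workers=8) as pool:
        for off in range(0, total_size, SEGMENT):
            length = min(SEGMENT, total_size - off)
            # fetch(off, length) stands in for "download bytes
            # off .. off+length over one of the parallel streams".
            pool.submit(write_segment, path, off, fetch(off, length))
```

Separate file descriptors writing to disjoint offsets of the same file are safe to run concurrently, which is exactly the behavior a segmented downloader would need.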
Standard SFTP, FTP+TLS, FTP, and SCP work with varying levels of success, but all land anywhere between 15 and 80 MB/s depending on which way the wind is blowing. I assume it's mostly a matter of what path happens to be found between the two sites across the internet.
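Those numbers are consistent with a single TCP stream being capped by its effective window rather than by the link, which is the "long fat pipe" effect in miniature (SCP and SFTP implementations often add their own fixed internal buffers on top). A back-of-the-envelope check using the ping times above:

```python
# Bandwidth-delay product: the in-flight window needed to keep the path full.
link_bps = 1_000_000_000      # 1 Gbit/s link
rtt_s = 0.005                 # ~5 ms round trip between the two sites
bdp_bytes = link_bps / 8 * rtt_s
print(f"window needed: {bdp_bytes / 1024:.0f} KiB")   # ~610 KiB

# Conversely, a fixed 64 KiB effective window caps a single stream at:
window = 64 * 1024
max_Bps = window / rtt_s
print(f"throughput cap: {max_Bps / 1e6:.1f} MB/s")    # ~13.1 MB/s
```

So a single stream only fills the pipe if every buffer along the way allows ~610 KiB in flight; any fixed 64-256 KiB buffer in the chain lands you right in that 15-80 MB/s range, which is why multi-stream tools get so much closer to line rate.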
Ideally I'd want an automated solution, but I'm not against manual ones for now. I'm toying with the idea of download accelerators, but those are manual, and all the ones I've seen use separate files and recombine at the end. I need to wrap my head around FDT (Fast Data Transfer, http://monalisa.cern.ch/FDT/download.html) to see if that'll cover it, and I'm also looking at rclone to see if it will do the job.
TL;DR - in today's age of multi-Gbit connections and multi-TB files, I can't seem to find an easy, reliable way to get gigabit transfer speed across the internet that simply segments a single file transfer without splitting it into multiple files on one end.