In a possibly related situation, we have a requirement to move
many terabytes of data from one machine to another in close
proximity.  We were using rsync, but transfers would take many
hours, if not days, even over a 10 GbE link.  So we installed
rsh and rsh-server and configured our rsync invocations to use rsh
instead of ssh.  This gave us a bandwidth boost of about 2.5x.
Yes, we're using jumbo frames too.
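
For reference, a minimal sketch of that kind of invocation (the paths
and host name here are placeholders, not our real ones):

    # copy /data to otherhost, using rsh as the transport instead of ssh
    rsync -a -e rsh /data/ otherhost:/data/

The same effect can be had by setting RSYNC_RSH=rsh in the environment
instead of passing -e on every invocation.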

Regards,
Lew

On 04/03/2020 09:52 AM, Derek Atkins wrote:
Are you sure that SSH encryption is actually slowing down the backup?  I
wouldn't think so, as most devices have AES in hardware and SSH is pretty
efficient.  Before you just bypass it, I would test to see if that's
actually the bottleneck.

You could just time a local rdiff-backup run against an
rdiff-backup-over-ssh run to test.  I suspect local I/O is the
bottleneck, not encryption or network speed.
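
For example, something along these lines (assuming a throwaway
destination and a placeholder host name) should show where the time
really goes:

    # local run: disk I/O only, no network or encryption in the path
    time rdiff-backup /data /mnt/scratch/backup-local

    # remote run over ssh: same source, with ssh and the network added
    time rdiff-backup /data backuphost::/mnt/scratch/backup-remote

    # and, separately, how fast ssh alone can push bytes over the link
    dd if=/dev/zero bs=1M count=2048 | ssh backuphost 'cat > /dev/null'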

-derek

On Fri, April 3, 2020 12:18 pm, Dark Empathy wrote:
Hi,

Please excuse the simple question, but I am unable to work out what to do
from the man page alone.

Is there a way to disable encryption on an internal network where it is
not required (Linux to Linux)?  Is this possible with, perhaps, an rsync
server and --remote-schema?
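
(To show what I mean, and I may well be misreading the man page, I was
imagining something like the following, where %s is the placeholder
rdiff-backup substitutes into the remote command, and the host and
paths are made up:

    rdiff-backup --remote-schema 'rsh %s rdiff-backup --server' \
        /data pi3host::/backup/data

but I don't know whether that is the intended use of the option.)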

I have a situation where we have data on Raspberry Pi 3s spread over a
private WAN.  Data security, in this particular case, is of zero
importance.  However, I would prefer not to disable encryption on the SSH
server.

Are there any suggestions for running rdiff-backup at maximum speed,
with no encryption, on an internal network?


