What would be the reason for getting such large data sets back on premises? Why not leave them in the cloud, for example in an Amazon S3 bucket or Google data store?
Regards,
Jonathan

-----Original Message-----
From: Beowulf <beowulf-boun...@beowulf.org> On Behalf Of Chris Samuel
Sent: Sunday, 28 July 2019 03:36
To: beowulf@beowulf.org
Subject: Re: [Beowulf] Lustre on google cloud

On Friday, 26 July 2019 4:46:56 AM PDT John Hearns via Beowulf wrote:

> Terabyte scale data movement into or out of the cloud is not scary in 2019.
> You can move data into and out of the cloud at basically the line rate
> of your internet connection as long as you take a little care in
> selecting and tuning your firewalls and inline security devices. Pushing
> 1TB/day etc. into the cloud these days is no big deal and that level of
> volume is now normal for a ton of different markets and industries.

Whilst this is true, as Chris points out, this does not mean that there won't be data transport costs imposed by the cloud provider (usually for egress).

All the best,
Chris
--
Chris Samuel : http://www.csamuel.org/ : Berkeley, CA, USA

_______________________________________________
Beowulf mailing list, Beowulf@beowulf.org sponsored by Penguin Computing
To change your subscription (digest mode or unsubscribe) visit https://beowulf.org/cgi-bin/mailman/listinfo/beowulf
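The line-rate and egress-cost points above are easy to sanity-check with back-of-the-envelope arithmetic. Here's a minimal Python sketch; the link speeds are just examples, and the egress price used is purely illustrative (not a quoted rate from any provider):

```python
def transfer_hours(data_bytes, link_bits_per_s, efficiency=1.0):
    """Idealized time in hours to move data_bytes over a link.

    efficiency lets you derate the link for protocol overhead,
    firewalls, and inline security devices (e.g. 0.8 = 80% of line rate).
    """
    bits = data_bytes * 8
    return bits / (link_bits_per_s * efficiency) / 3600

TB = 10**12  # 1 TB in bytes (decimal)

# Time to push 1 TB at various example line rates
for rate_gbps in (0.1, 1, 10):
    hours = transfer_hours(TB, rate_gbps * 1e9)
    print(f"1 TB over {rate_gbps} Gbit/s: {hours:.1f} h")

# Egress cost sketch: price per GB is a made-up illustrative figure,
# NOT an actual provider rate -- check your provider's pricing page.
illustrative_price_per_gb = 0.09
print(f"1 TB egress at ${illustrative_price_per_gb}/GB: "
      f"${1000 * illustrative_price_per_gb:.0f}")
```

So at 1 Gbit/s, 1 TB moves in a little over two hours even before tuning, which matches the "1TB/day is no big deal" point, while the egress side shows why repeatedly pulling large data sets back out of the cloud can dominate the cost conversation.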