On Nov 9 00:56, Linda Walsh wrote:
> I was running some network bandwidth tests using "dd" (WinXP2/Cygwin).
>
> I was timing copies of a 300MB file from local disk to a remote
> server. The local computer has enough memory to hold the file
> in the memory cache once it is loaded.
>
> I ran through increasing power-of-two block sizes from
> 512 bytes up to 512MB, where it could theoretically transfer
> the whole file in a single read and a single write.
> Essentially (say "s:" is a network drive):
>
>     for (i = 512; i <= 512MB; i = i*2) do {
>         dd bs=$i if=/tmp/input of=/s/Video/output.dat
>     }
>
> Unfortunately, the test /fails/ at or above 64MB (actually at or
> above 65,005KB). The error message is:
>
>     dd: writing '/s/Video/output.dat': Resource temporarily unavailable
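For reference, the pseudocode loop above can be written as an actual runnable POSIX shell sweep. This is only a sketch: the original wrote /tmp/input out to a network drive (/s/Video/output.dat), while this version uses small local temp files and a scaled-down maximum block size so it is safe to run anywhere.

```shell
#!/bin/sh
# Sketch of the power-of-two block-size sweep from the post above.
# Assumptions: local temp files instead of the network path, and a
# 64 KB cap instead of 512 MB, purely so the demo runs quickly.

infile=$(mktemp)
outfile=$(mktemp)
dd if=/dev/zero of="$infile" bs=1024 count=64 2>/dev/null  # 64 KB test input

bs=512
max=$((64 * 1024))   # the original swept up to 512 MB
fail=0
while [ "$bs" -le "$max" ]; do
    # Record the first block size at which dd reports an error,
    # mirroring the "Resource temporarily unavailable" failure point.
    dd bs="$bs" if="$infile" of="$outfile" 2>/dev/null || fail=$bs
    bs=$((bs * 2))
done

insize=$(wc -c < "$infile")
outsize=$(wc -c < "$outfile")
rm -f "$infile" "$outfile"
```

On a local filesystem every pass should succeed and the output size should match the input at each block size; the reported failure only appears when the destination is the network drive.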
I tried this with a 120 MB and a 1.2 GB file, and dd works for me with
all block sizes up to the tested 512 MB. Could this be a network issue
with big block sizes, maybe?

Corinna

--
Corinna Vinschen                  Please, send mails regarding Cygwin to
Cygwin Project Co-Leader          cygwin AT cygwin DOT com
Red Hat