Package: coreutils
Version: 6.10-6
Severity: minor
Here is a simple patch that allows shred to use a fixed-size random-source file. Shred currently has the limitation that if --random-source=<file> is specified, the source must be at least as large as the destination, which is often not feasible. Without this patch, shred exits prematurely when it reaches the end of the source file, rather than completing the job by reusing the given data as many times as necessary.
This is very helpful if you are not concerned about blocks of random data being duplicated and mainly want a significant speedup over reading /dev/urandom, or if you want to overwrite with a specific pattern.
===== cut here =====
*** lib/randread.c.orig	2008-01-17 03:14:51.000000000 -0600
--- lib/randread.c	2009-09-30 19:38:10.000000000 -0500
***************
*** 217,222 ****
--- 217,226 ----
        size -= inbytes;
        if (size == 0)
          break;
+       if (feof(s->source)) {
+         rewind(s->source);
+         continue;
+       }
        errno = (ferror (s->source) ? fread_errno : 0);
        s->handler (s->handler_arg);
    }
===== cut here =====

Thanks.

Evan