Re: Copying and taring large amounts of data

2002-02-19 Thread Carel Fellinger
On Mon, Feb 18, 2002 at 06:46:14PM -0800, Alvin Oga wrote: ... > if copying vfat files... you've got to use find /vfat -print | tar .. -T - > to solve the problems with "tom's budget for $$$ next year" type filenames I first thought, yep, should keep that in mind. But then I realized that it's only
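The find-into-tar pipe quoted above can be sketched as follows. A safer variant uses NUL-terminated names (GNU find's -print0 with GNU tar's --null) so that spaces and quotes in FAT filenames survive the pipe intact; the /tmp paths below are throwaway placeholders for illustration:

```shell
# Create a demo directory with an awkward FAT-style filename (placeholder paths)
mkdir -p /tmp/vfat-demo
touch "/tmp/vfat-demo/tom's budget for next year.txt"

# find emits NUL-terminated names; tar --null -T - reads them safely,
# so whitespace and quote characters in names cannot split an entry
find /tmp/vfat-demo -type f -print0 \
    | tar --null -czf /tmp/vfat-backup.tar.gz -T -

# List the archive to confirm the odd filename made it in
tar -tzf /tmp/vfat-backup.tar.gz
```

With plain -print and -T -, a newline in a filename would be read as two separate entries; the NUL-terminated form has no such failure mode.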

Re: Copying and taring large amounts of data

2002-02-19 Thread Alvin Oga
hi ya osamu > On Mon, Feb 18, 2002 at 06:46:14PM -0800, Alvin Oga wrote: > > gnu tar ( 1.13.19 ) does not have any problems transferring > > 4GB sized files... from machine-A (linux w/ ext2) to machine-B ( ext2 ) > > ( tar --version ) > > i think some older gzip has a problem w/ > 2GB

Re: Copying and taring large amounts of data

2002-02-19 Thread Osamu Aoki
Hi Alvin, On Mon, Feb 18, 2002 at 06:46:14PM -0800, Alvin Oga wrote: > gnu tar ( 1.13.19 ) does not have any problems transferring > 4GB sized files... from machine-A (linux w/ ext2) to machine-B ( ext2 ) > ( tar --version ) > i think some older gzip has a problem w/ > 2GB files tho

Re: Copying and taring large amounts of data

2002-02-18 Thread Alvin Oga
hi ya osamu... gnu tar ( 1.13.19 ) does not have any problems transferring 4GB sized files... from machine-A (linux w/ ext2) to machine-B ( ext2 ) ( tar --version ) i think some older gzip has a problem w/ > 2GB files though if you're transferring vfat files... that's a differen
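The >2GB concern raised in this thread can be checked directly. A minimal sketch, assuming GNU tar and coreutils' truncate: a sparse 3GB test file crosses the old 2GB (signed 32-bit offset) boundary without consuming real disk space, and -S tells tar to store the holes compactly:

```shell
# Make a sparse 3GB file: crosses the 2^31-byte limit, occupies ~0 disk blocks
truncate -s 3G /tmp/bigfile

# -S (--sparse) keeps the archive small; -C avoids archiving absolute paths
tar -Scf /tmp/big.tar -C /tmp bigfile

# The verbose listing shows the full 3221225472-byte size was preserved
tar -tvf /tmp/big.tar
```

If a tool in the pipeline still carries the old limit, it typically fails or silently wraps at the 2GB mark, so a round-trip test like this is a quick sanity check before trusting a backup.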

Re: Copying and taring large amounts of data

2002-02-18 Thread Karsten M. Self
on Mon, Feb 18, 2002 at 06:05:46PM -0800, Osamu Aoki ([EMAIL PROTECTED]) wrote: > On Mon, Feb 18, 2002 at 12:47:44PM -0500, Scott Henson wrote: > > I need to move large amounts of data from one disk to another and then > > tar it up for back up purposes. I have tried cp and mv, but both take > > v

Re: Copying and taring large amounts of data

2002-02-18 Thread Osamu Aoki
On Mon, Feb 18, 2002 at 09:15:33PM -0500, Alan Shutko wrote: > Osamu Aoki <[EMAIL PROTECTED]> writes: > > > Many utilities still have 2GB file size limitation hidden somewhere, > > even though kernel should be able to handle large files. > > > > So just do not listen to other posts suggesting "tar

Re: Copying and taring large amounts of data

2002-02-18 Thread Alan Shutko
Osamu Aoki <[EMAIL PROTECTED]> writes: > Many utilities still have 2GB file size limitation hidden somewhere, > even though kernel should be able to handle large files. > > So just do not listen to other posts suggesting "tar ..." or similar. The version of tar in woody doesn't have the 2GB limit

Re: Copying and taring large amounts of data

2002-02-18 Thread Osamu Aoki
On Mon, Feb 18, 2002 at 12:47:44PM -0500, Scott Henson wrote: > I need to move large amounts of data from one disk to another and then > tar it up for back up purposes. I have tried cp and mv, but both take > very large amounts of time with many ide resets and faults. The amount > of data I am tr

Re: Copying and taring large amounts of data

2002-02-18 Thread Tom Cook
Elizabeth Barham wrote: > > tar cf - /path-to-be-archived | gzip -c > /new/place/to/store-file.tar.gz tar -czf /new/place/to/store-file.tar.gz /path-to-be-archived is identical but easier to type ;-) Tom
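The equivalence Tom points out can be checked with a small experiment (the /tmp paths are throwaway placeholders assumed for illustration):

```shell
# Sample data to archive
mkdir -p /tmp/eqdemo
echo hello > /tmp/eqdemo/a.txt

# Form 1: explicit pipe through gzip
tar cf - -C /tmp eqdemo | gzip -c > /tmp/pipe.tar.gz

# Form 2: tar's built-in -z compression
tar -czf /tmp/builtin.tar.gz -C /tmp eqdemo

# Both archives contain the same member list
tar -tzf /tmp/pipe.tar.gz
tar -tzf /tmp/builtin.tar.gz
```

The two archives are not guaranteed to be byte-identical (the gzip header embeds a timestamp), but the archived contents are the same, which is what matters for a backup.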

Re: Copying and taring large amounts of data

2002-02-18 Thread Dougie Nisbet
On Monday 18 February 2002 5:47 pm, Scott Henson wrote: > I need to move large amounts of data from one disk to another and then > tar it up for back up purposes. I have tried cp and mv, but both take > very large amounts of time with many ide resets and faults. I don't know why you're having the r

Re: Copying and taring large amounts of data

2002-02-18 Thread Alex Malinovich
On Mon, 2002-02-18 at 13:44, Elizabeth Barham wrote: > > tar cf - /path-to-be-archived | gzip -c > /new/place/to/store-file.tar.gz > > Note that this will take a while, too. Hopefully the system won't > stop, though. You might consider running this command with at, say "at > 1am" or so, and just

Re: Copying and taring large amounts of data

2002-02-18 Thread Elizabeth Barham
tar cf - /path-to-be-archived | gzip -c > /new/place/to/store-file.tar.gz Note that this will take a while, too. Hopefully the system won't stop, though. You might consider running this command with at, say "at 1am" or so, and just let it run for a few hours (days?). At will run the command with
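The at(1) suggestion above can be sketched like this (assumes the at daemon is installed; the archive paths are the placeholders from the original post). Writing the job to a small script first keeps the shell quoting simple:

```shell
# Save the backup pipeline as a standalone script (placeholder paths)
cat > /tmp/nightly-backup.sh <<'EOF'
#!/bin/sh
tar cf - /path-to-be-archived | gzip -c > /new/place/to/store-file.tar.gz
EOF
chmod +x /tmp/nightly-backup.sh

# Queue it for 1am; at mails the job's stdout/stderr to you when it finishes
# (left commented so the sketch runs even where atd is not active):
# at -f /tmp/nightly-backup.sh 1am
```

Because at detaches the job from the terminal, the transfer keeps running after logout, which suits a multi-hour archive job.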