Hi, all.

I'm coming across an occasional oddity in 0.119 when saving large binary
files to disk:  the file gets split into two parts.  Instead of getting,
e.g.

   filename.r01         15,000,000

I end up with 

   filename.r01         14,750,400
   filename_copy_2.r01     249,600
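(A quick sanity check: the two part sizes do sum exactly to the expected
15,000,000 bytes, which is why I suspect nothing is actually lost:)

```shell
# Shell arithmetic check of the two part sizes from the listing above
echo $((14750400 + 249600))    # 15000000
```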

If I delete the saved files and resave from the cached copy, it
consistently splits at the same point as the previous save.  If I
redownload it (instead of using the cached copy), it may split again, or
it may save just fine.  Aaargh!

If I cat the two parts together (since byte-wise it all appears to be
there), par2repair still reports missing data blocks.
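To be concrete, the join step is just a plain concatenation.  Here is a
tiny demonstration with throwaway placeholder files standing in for the
two real parts, so the byte math is easy to see:

```shell
# Stand-ins for filename.r01 and filename_copy_2.r01
printf 'AAAA' > part1.bin            # 4 bytes
printf 'BB'   > part2.bin            # 2 bytes
# Concatenate the parts back into one file, in order
cat part1.bin part2.bin > joined.bin
wc -c < joined.bin                   # 6 bytes = 4 + 2
```

With the real files the joined size matches the expected total, yet
par2repair still flags missing blocks, which is the confusing part.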

On an unrelated topic, has the rules/filters idea gone away?  I miss the
raw power to (locally) delete large swaths of the Usenet population. :)

-- 
Victor Ducedre 


_______________________________________________
Pan-users mailing list
Pan-users@nongnu.org
http://lists.nongnu.org/mailman/listinfo/pan-users
