Ron Johnson <ron.l.john...@cox.net> posted 4a558d7b.5010...@cox.net, excerpted below, on Thu, 09 Jul 2009 01:26:03 -0500:
>> Of course, that assumes it's not a simple permissions issue...
>
> Nope.
>
> $ dir alt.binaries.dvdr*
> -rw------- 1 me me 3658599777 2009-07-02 03:37:45 alt.binaries.dvdr

Just that isolated file doesn't help a lot...

Is that supposed to be the <PanDataDir>/groups/ file for that group?  I'm
guessing so, but it would help if you'd indicated that somewhere.  At first
I thought it might be your filesave dir for that group, but a 3.6 gig
directory size didn't make sense, and (after writing the bit below about
the groups subdir) I decided it had to be the group's header file.

> $ dir
[snip]
> drwxr-xr-x 2 me me 4096 2009-07-09 00:59:06 groups/

But what about the files in that directory?  Or was the first listing up
top supposed to show exactly that, the perms and size of that specific
file?  If so, you didn't say.

Oh... I think I might have just spotted your issue!

That alt.binaries.dvdr group file, assuming that's what it is: 3.6 gigs,
a single file, right?

Are you on x86_32 or x86_64 (assuming Linux; maybe I should ask about that
too)?  How much memory do you have, and if you're 32-bit, what are your
kernel bigmem options?  Also, what filesystem is that file on?  (The
commands at the end of this message show what I'd check.)

Obviously, I'm wondering whether you're hitting a resource limit somewhere:
either memory (trying to load a 3.6 gig file into RAM), or possibly the
filesystem's file-size limit, although that would normally be a rounder
number, 2 gigs, 4 gigs, or the like.

If that's the size of the header data you're dealing with, which seems
likely, OUCH!  No WONDER you're complaining about performance!

I'd try to keep it under, say, 2 gigs.  If that means deleting posts as
soon as you're done with them, every day, that's what you'll have to do,
though I think (and hope) once or twice a week will be sufficient.

Meanwhile, you need to figure out how to reduce the file size.  One thing
you could do is simply delete it.  I'd move it elsewhere first, so you can
move it back and try something else if pan goes crazy, but I think pan
should be able to reconstruct it by re-downloading headers if it starts up
and doesn't find the file, without losing your read-message tracking,
since that's stored in the newsrc files.  You can tell pan to redownload a
single day's worth, change groups so pan writes the file back out, and see
how big that is before deciding how many more days to download at once.
(Again, sketched at the end of this message.)

FWIW, I'd be interested in seeing the size for one day's worth, or a
week's worth if it'll take it (since that averages out daily fluctuations
a bit better), just to know how big the problem actually is.

It seems to me that file is far more likely to cause huge issues than the
comparatively small 1/3 gig tasks.nzb file you've been complaining about,
particularly since the kernel should be caching tasks.nzb and it's
normally rewritten fast enough that the writes won't all hit disk anyway.
But a 3.6 gig single file, ESPECIALLY on 32-bit, is going to cause
**HUGE** issues if pan is trying to work with the whole thing in memory at
once, as I suspect it is.  If you're on 64-bit with a decent filesystem
the issues won't be as bad, but it's still a huge amount of data to be
shuffling around!

I guess it's time to post this and see if I'm right, and perhaps get some
idea of just how much daily or weekly header data we're dealing with.
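To be concrete about the questions above, this is roughly what I'd run to
gather that info.  Note the paths are assumptions on my part: ~/.pan2 is
just pan's usual default data dir, and the /boot/config-* file is only
there if your distro installs the kernel config; adjust both for your
setup.

$ uname -m                                  # i686 etc. = 32-bit, x86_64 = 64-bit
$ free -m                                   # RAM and swap, in MiB
$ grep -i highmem /boot/config-$(uname -r)  # kernel bigmem/HIGHMEM options
$ df -Th ~/.pan2/groups                     # filesystem type and free space
$ ls -lh ~/.pan2/groups/                    # perms/sizes of all the header files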
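And if you do try moving the header file out of the way so pan rebuilds
it, a rough sketch (same assumed ~/.pan2 default data dir, and the backup
dir name is arbitrary; shut pan down first, and keep the copy around until
you're sure pan behaves):

$ mkdir -p ~/pan-header-backup
$ mv ~/.pan2/groups/alt.binaries.dvdr ~/pan-header-backup/
# start pan, fetch one day's worth of headers, then change groups so pan
# writes the new, smaller file out, and check its size:
$ ls -lh ~/.pan2/groups/alt.binaries.dvdr
# if pan misbehaves, quit it and put the original back:
$ mv ~/pan-header-backup/alt.binaries.dvdr ~/.pan2/groups/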
-- 
Duncan - List replies preferred.  No HTML msgs.
"Every nonfree program has a lord, a master --
and if you use the program, he is your master."  Richard Stallman