Ron Johnson <ron.l.johnson@...> writes:

> Having to upgrade to Ubuntu Maverick because Natty sucks, I decided to
> also migrate to 64 bits now that Adobe has released a 64 bit Flash.
>
> One of the first things that I did was try out Pan on a binary group.
>
> Many hours later, it had fetched 6 weeks of headers and consumed 6.8GB
> of RAM. The 2+ years of data in Giganews would require 123GB of RAM.
>
> :(
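(The extrapolation checks out, by the way -- a quick back-of-the-envelope
calculation, assuming header memory use grows linearly with retention:

    weeks_fetched = 6
    ram_used_gb = 6.8
    gb_per_week = ram_used_gb / weeks_fetched   # ~1.13 GB/week
    print(123 / gb_per_week / 52)               # ~2.1 years of retention

so 123GB for Giganews' 2+ years is just what linear growth predicts.)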
At the risk of exposing myself as a Known Idiot... is this 64-bit
performance actually different from 32-bit performance, and can you
'prove' it? ;-)

As for the multi-bazillion-header binary groups... is there *any* 'old
style' newsreader capable of downloading all their headers? By 'old style'
I mean newsreaders meant for conversation as well as binaries. Giganews,
for one, would seem to me to make this nearly impossible, given its vast
retention span.

My perception is that Usenet usage has changed, especially with regard to
binary newsgroups, to revolve around nzb-centric hit-and-run operations,
with a user base that includes very few people who even realize that
non-nzb-centric Usenet functions exist. For nzb-centric users, header
retrieval from date x to date y isn't a factor.

The only way I can see 'old style' newsreaders coping with today's
multi-terabyte daily Usenet feeds is to adopt the (brilliant) Xnews method
of using slider controls to grab manageable chunks of headers. That method
makes it possible to accrete the retrieved ranges into a 'reef' of headers
larger than any single retrieval operation could manage (a rough sketch of
the idea follows at the end of this message). Attempting to swallow the
entire 'reef' in one go is prohibitive outside some mad cluster design, or
so it seems to my admittedly ignorant self.

_________________________

On a tangential note, Heinrich Müller's work on Pan is exciting and
inspiring and much appreciated by this user.
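For the curious, here is a minimal sketch of what that chunked retrieval
looks like at the NNTP level, using Python's nntplib (in the standard
library through 3.12). This is not Pan's or Xnews's actual code; the
server name, group name, chunk size, and the persist() helper are all
made up for illustration:

    import nntplib

    SERVER = "news.example.com"      # hypothetical server
    GROUP = "alt.binaries.example"   # hypothetical group
    CHUNK = 50000                    # articles per slice; sized to fit in RAM

    def persist(lo, hi, headers):
        # Stand-in for a real on-disk header store (sqlite, flat
        # files, ...); here it just records each range so the accreted
        # ranges -- the 'reef' -- can be tracked across sessions.
        with open("fetched_ranges.txt", "a") as f:
            f.write("%d-%d: %d headers\n" % (lo, hi, len(headers)))

    with nntplib.NNTP(SERVER) as conn:
        resp, count, first, last, name = conn.group(GROUP)
        # Walk backward from the newest article one slice at a time,
        # flushing each slice to disk before fetching the next, so the
        # whole group's headers never sit in memory at once.
        hi = last
        while hi >= first:
            lo = max(first, hi - CHUNK + 1)
            resp, overviews = conn.over((lo, hi))  # OVER/XOVER for the range
            persist(lo, hi, overviews)
            hi = lo - 1

The point is that each slice is bounded: the on-disk 'reef' can grow
without limit while the in-memory footprint stays at one slice, which is
presumably what a Pan that builds everything in RAM cannot do.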