Re: [Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread Travis
- Original Message - From: "Duncan" <1i5t5.dun...@cox.net> To: Sent: Thursday, July 02, 2009 21:31 PM Subject: [Pan-users] Re: Better processing of very large groups? > "Travis" posted > ee3985f694f44887a37b26eac04af...@travispc, excerpted below, on Thu, 02 > Jul 2009 19:18:45 -0700:

[Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread Duncan
"Travis" posted ee3985f694f44887a37b26eac04af...@travispc, excerpted below, on Thu, 02 Jul 2009 19:18:45 -0700: >> Our recent pan angel, K. Haley, has already included that change in >> his/her git archive (git://github.com/lostcoder/pan2.git) for which I >> once again send my thanks to him/her.

[Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread Duncan
Ron Johnson posted 4a4d68a7.4050...@cox.net, excerpted below, on Thu, 02 Jul 2009 21:10:47 -0500: > On 2009-07-02 20:28, walt wrote: > [snip] >> >> The basic problem overwhelming usenet is that people are using it for >> file sharing, a purpose for which it was not intended and is not well >> suited.

Re: [Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread Travis
- Original Message - From: "walt" To: Sent: Thursday, July 02, 2009 18:28 PM Subject: [Pan-users] Re: Better processing of very large groups? > Our recent pan angel, K. Haley, has already included that change in his/her > git archive (git://github.com/lostcoder/pan2.git) for which I once again send my thanks to him/her.

Re: [Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread Ron Johnson
On 2009-07-02 20:28, walt wrote: [snip] The basic problem overwhelming usenet is that people are using it for file sharing, a purpose for which it was not intended and is not well suited. Maybe (definitely!) not, but uuencode/decode have been around for a long time... But you knew that

[Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread walt
On Thu, 02 Jul 2009 23:53:53 +, Duncan wrote: > Ron Johnson posted > 4a4cf8fc.8030...@cox.net, excerpted below, on Thu, > 02 Jul 2009 13:14:20 -0500: > >> Because giganews has such a long retention period, some groups can have >> a very *large number* of messages. If you subscribe to two or more of >> them, you could run out of memory.

Re: [Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread Ron Johnson
On 2009-07-02 18:53, Duncan wrote: [snip] BTW, for 32-bit users at least (I'm not sure if the number is 32-bit or 64-bit for 64-bit users), at least one group on Giganews has "rolled over" the 32-bit article sequence integer pan uses. It needs to be a 64-bit number, or at least 33-bit. More

[Pan-users] Re: Better processing of very large groups?

2009-07-02 Thread Duncan
Ron Johnson posted 4a4cf8fc.8030...@cox.net, excerpted below, on Thu, 02 Jul 2009 13:14:20 -0500: > Because giganews has such a long retention period, some groups can have > a very *large number* of messages. If you subscribe to two or more of > them, you could run out of memory. > > As it is, pan seems to sequentially scan thru all messages when marking a group of them as Read.

Re: [Pan-users] Better processing of very large groups?

2009-07-02 Thread Ron Johnson
On 2009-07-02 15:46, Jeff Berman wrote: From: Ron Johnson Subject: [Pan-users] Better processing of very large groups? As it is, pan seems to sequentially scan thru all messages when marking a group of them as Read. There needs to be a better and less memory intensive method of handling huge groups.

Re: [Pan-users] Better processing of very large groups?

2009-07-02 Thread Jeff Berman
> From: Ron Johnson > Subject: [Pan-users] Better processing of very large groups? > > As it is, pan seems to sequentially scan thru all messages when marking a > group > of them as Read. > > There needs to be a better and less memory intensive method of handling huge > groups. B-trees, hashes

[Pan-users] Better processing of very large groups?

2009-07-02 Thread Ron Johnson
Because giganews has such a long retention period, some groups can have a very *large number* of messages. If you subscribe to two or more of them, you could run out of memory. As it is, pan seems to sequentially scan thru all messages when marking a group of them as Read. There needs to be a better and less memory intensive method of handling huge groups.