No, Herb - you're just plain wrong. Check your statistics textbooks. The signal-to-noise ratio in the sum of n random samples is increased by a factor of sqrt(n) over the signal-to-noise ratio in a single sample. So if summing extra samples adds N bits to the result, you don't get all of that as signal; it's N/2 bits of signal, and N/2 bits of noise. Or to put it another way, doubling the number of samples increases the SNR by 0.5 bits.
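If you don't believe the textbooks, a quick Monte Carlo sketch (plain Python, made-up signal and noise levels) shows the same thing: averaging 4 samples improves the SNR by about sqrt(4) = 2, i.e. one bit, not two.

```python
import random
import math

def snr_of_mean(n, signal=1.0, sigma=0.5, trials=20000):
    """Estimate the SNR of the mean of n noisy samples of a constant signal."""
    means = []
    for _ in range(trials):
        total = sum(signal + random.gauss(0.0, sigma) for _ in range(n))
        means.append(total / n)
    mu = sum(means) / trials
    var = sum((m - mu) ** 2 for m in means) / trials
    return mu / math.sqrt(var)

random.seed(1)
snr1 = snr_of_mean(1)
snr4 = snr_of_mean(4)
# Quadrupling the samples should roughly double the SNR -
# two doublings, 0.5 bits each, one extra bit in total.
print(snr1, snr4, snr4 / snr1)
```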
And that's the best you can do. If quantisation error is a significant contributor (which it is down in the lower sample levels) you don't get any improvement in that, no matter how much oversampling you do. So in practice you don't even get that sqrt(n) improvement in the parts of the image that are most affected by noise.

On Wed, Sep 14, 2005 at 07:21:14PM -0400, Herb Chong wrote:
> doubling the number of samples increases the SNR by 1 bit assuming that
> there is only thermal noise and that noise temperature remains constant
> across samples. what measure you use for SNR determines by what factor the
> number increases.
>
> Herb...
> ----- Original Message -----
> From: "John Francis" <[EMAIL PROTECTED]>
> To: <[email protected]>
> Sent: Wednesday, September 14, 2005 12:56 PM
> Subject: Re: What Ever Happened to Chrome? was: Being There
>
>
> > That's correct. But it doesn't add one bit of signal; the noise
> > level increases as well. That's where the sqrt factor comes from.
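P.S. The quantisation point is easy to demonstrate too. A toy sketch (plain Python, values picked purely for illustration): with no noise to act as dither, a level sitting below one quantisation step produces the identical error on every sample, so averaging any number of samples recovers nothing.

```python
def quantize(x, step=1.0):
    """Mid-tread quantiser: round to the nearest multiple of step."""
    return step * round(x / step)

true_value = 0.3   # a level well below one quantisation step
n = 10000
# Without noise (no dither), every sample quantises identically,
# so the average of n quantised samples carries the full error.
avg = sum(quantize(true_value) for _ in range(n)) / n
error = abs(avg - true_value)
print(error)  # still 0.3 - oversampling bought nothing
```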

