Moore's Law depends on shrinking the geometries of the lines, traces and
devices in ICs. There are limits to this. Digital cameras are nearing
the point where it gets hard to shrink things any further.
It should also be noted that Moore's Law applies to CPUs. It's much
harder to improve the performance of a whole system than of a single part.
I also think that the original comment was made tongue in cheek.
BR
[EMAIL PROTECTED] wrote:
Now there is a brave prediction. Absolutely can't happen! Wilbur and
Orville's critics probably said the same thing. Give me one good
reason why digital cameras won't technologically advance at the same
rate as other electronics?