If you have a "32bit" setting, you have a 24bit@32bit display with your video card. The color is normally assigned as three 8-bit bytes of a 32-bit long word; the fourth 8-bit byte is often used for alpha channel overlay stuff on some systems. It's simply more efficient to push around 32-bit long words than to push around smaller chunks of data.
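
To make that concrete, here's a minimal C sketch of packing three 8-bit color channels plus an alpha byte into one 32-bit word. This is my own illustration, not anything from an actual driver, and the 0xAARRGGBB byte layout is an assumption; real hardware and platforms vary.

#include <stdint.h>
#include <stdio.h>

/* Pack 8-bit alpha, red, green, blue into a single 32-bit word.
 * Assumed layout: 0xAARRGGBB (alpha in the high byte). */
static uint32_t pack_argb(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)a << 24) | ((uint32_t)r << 16) |
           ((uint32_t)g << 8)  |  (uint32_t)b;
}

int main(void)
{
    /* Opaque pure red: alpha byte set, 24 bits of color in the rest. */
    uint32_t pixel = pack_argb(0xFF, 0xFF, 0x00, 0x00);
    printf("packed pixel: 0x%08X\n", (unsigned)pixel); /* 0xFFFF0000 */
    return 0;
}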

Sorry, this is technical stuff ... when one is discussing computer technology, there's only so simply it can be described without losing meaning or going on for pages of descriptive information.

Needless to say, for you: just set "Highest - 32-bits" and you'll be fine.

Godfrey

On Apr 25, 2005, at 10:29 PM, Shel Belinkoff wrote:

I have no idea what you're talking about. Far too technical for me. What's
a quantization space? I never heard of 24bit@32bit.
My display says the color quality is "Highest - 32-bits" but I don't think I
have a 24-bit video card.


Maybe someone can explain this in language other than technospeak.
Thing is, I'm not a complete dolt about this. I know what it means to scan
or shoot digital with 8-, 12-, 14-, or 16-bit color. Anyway, thanks for the
explanation ... ;-((


