On Sep 7, 2005, at 6:53 PM, Shel Belinkoff wrote:
What I'm seeing with digi are more and more fried highlights in more and more pics, and Jens' comment seemed to indicate that it was acceptable.

I can't speak for Jens or Paul, but burned-out Zone IX areas are anathema to me. I choose very carefully what I want to burn out, as it were, and render the image to achieve what was in my mind's eye when I made the exposure.

I noticed bright areas in the photo of your granddaughter that seem acceptable to you and to others (at least no one commented on them), that I would not find acceptable, and which probably (note the qualifier) would not have appeared had the photo been made with film, or perhaps with greater care and attention to detail (again, not to pick on you specifically).

It's very hard, if not impossible, to make a web-resolution rendering that expresses all the tonal subtleties of a larger rendering without a lot of extra work on the web version. When I look at a normal-size photograph on a web page, I think of it primarily as a proof-quality image: the real, full-resolution image, printed to maximize its quality, will be far nicer, and is visible only in a large physical print.

The same is true of scanned film images. Remember: no matter what you're seeing on screen, it's a digital image rendered at relatively low resolution.

Of course, getting precisely the *right* exposure for a RAW file and processing it with the correct parameters is at least as critical for digital capture as it is for film, and is essential for scenes like a bride in a white dress against natural foliage outdoors. Great attention to detail and care in making the rendering is necessary, whether working with film or digital. It's not so different as it might seem.

A sidebar:
One of the interesting things, to me anyway, about digital capture and digital printing is that printing larger reveals more of the captured resolution and improves tonal gradation, up to the point where pixelation makes the image quality fall apart; the onset of pixelation kills quality very quickly. Film images, magnified and printed with an enlarger, degrade slowly as magnification grows ... beyond a certain optimal point, it's a linear, monotonic degradation. Scanned film images don't quite behave like either: when you print them at larger and larger magnifications, you also get increased grain, which degrades resolution and tonal qualities along with the pixelation issues.
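The digital side of that tradeoff can be sketched with simple arithmetic: a capture has a fixed pixel count, so the pixels per inch available on paper drop as the print grows, until they fall below the point where pixelation becomes visible. The pixel width and print sizes below are hypothetical, chosen only for illustration:

```python
# Sketch: effective print resolution for a fixed-pixel digital capture.
# A 3000-pixel-wide file is an assumption (roughly an 8 MP camera),
# not a figure from the original post.

def effective_ppi(pixel_width, print_width_inches):
    """Pixels per inch available at a given print width."""
    return pixel_width / print_width_inches

for width in (8, 12, 16, 24, 30):
    ppi = effective_ppi(3000, width)
    print(f"{width:2d} in wide -> {ppi:6.1f} ppi")
```

Below some threshold (often taken to be somewhere around 150-200 ppi for close viewing, though that figure is a rule of thumb, not from the post), the individual pixels start to show, and quality collapses rather than fading gradually the way enlarger prints do.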

The test photos I made in the garden showing the tree leaves, shot raw and presented unaltered, showed the difference that 1/3 stop of exposure could make.

There's no such thing as an "unaltered" RAW conversion. You made a conversion, a rendering of your captured data to RGB, using the defaults that the camera metadata provided in conjunction with however the RAW converter interpreted them. You could easily have changed the curves and recovered the highlight detail in those leaves, unless they were right over the edge of photosite saturation. That's the key: choose carefully where you're going to place your saturation point. With digital capture, it's a hard, hard edge, unlike with negative film.
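The difference between that hard edge and negative film's behavior can be sketched with two toy response curves. Both functions here are illustrative assumptions, not real sensor or film characteristic curves:

```python
import math

def digital_response(exposure, saturation=1.0):
    # Linear up to photosite saturation, then a hard clip: every value
    # past the saturation point records identically, so all highlight
    # detail above it is gone for good.
    return min(exposure, saturation)

def film_shoulder(exposure):
    # Hypothetical smooth "shoulder": negative film compresses
    # highlights gradually, so brighter still records as brighter,
    # just with diminishing separation.
    return 1.0 - math.exp(-exposure)

for ev in (0.5, 1.0, 1.5, 2.0, 4.0):
    print(f"exposure {ev:4.1f}: "
          f"digital {digital_response(ev):.3f}  "
          f"film {film_shoulder(ev):.3f}")
```

Once the digital curve hits saturation, a 1.5-stop and a 4-stop overexposure are indistinguishable; the film curve still separates them, which is why placement of the saturation point matters so much more with digital capture.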

Overall, I'm seeing a decline in what many photographers and editors consider acceptable quality. Is this a result of digital? I suspect that it is to a degree. I also attribute it to other factors.

I think the other factors far outweigh the influence of the media change.

However, I'd like to see more photographers taking greater care with the photos they present, learning more about what makes a good photo (at least technically), and spending more time correcting small details. I'm disheartened to see what I perceive as an overall decline in the quality of photography.

Written history going back to at least the Greeks is a long testament to older men bemoaning lost quality with the displacement of their world by the work of younger men.

]'-)

Godfrey
