Photography

Aug 14, 2007 19:21

I have heard it's possible to take multiple photos at different exposure levels and combine them, and thereby get an image closer to what the human eye sees (since we can perceive both the bright and dark areas at once, while a photo meters for only one or the other). I've seen these done by others, the prime example being the inside ( Read more... )
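For reference, here is a minimal sketch of the kind of multi-exposure merge described above, using Python and OpenCV's exposure-fusion routine. The filenames are placeholders, and this is just one possible way to combine a bracketed series, not necessarily what anyone in this thread used.

    # Merge a bracketed exposure series with OpenCV's Mertens exposure fusion.
    # The filenames below are placeholders for your own under/normal/over shots.
    import cv2
    import numpy as np

    files = ["under.jpg", "normal.jpg", "over.jpg"]   # hypothetical filenames
    images = [cv2.imread(f) for f in files]

    # Exposure fusion keeps the well-exposed parts of each frame directly,
    # without building a true HDR radiance map or needing exposure times.
    fusion = cv2.createMergeMertens()
    fused = fusion.process(images)        # float image, roughly in [0, 1]

    # Scale back to 8 bits for saving and viewing.
    cv2.imwrite("fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))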

images, resources, photography, questions


Comments 18

q10 August 15 2007, 00:22:20 UTC
do humans really have significantly better dynamic range than good film, or is it just a trick?

for example, i suspect that our excellent subjective depth of field is in large part an illusion. after all, we notice that something is out of focus mainly when we give it our attention, and when we give something our attention our eyes refocus on it. a shot that's already been exposed doesn't have this luxury, so when our attention shifts we notice how much is out of focus.

could our subjectively good dynamic range be in part due to something like that? when we're attending to the darker parts of our field of view we can enlarge our pupils, and we can shrink them again when we attend to the brighter parts.

sorry i don't have any actual help. i have heard of that sort of thing but have never tried it. i suspect it's more in demand these days as most digital sensors have significantly worse dynamic range than film used to have.


zandperl August 15 2007, 01:58:50 UTC
I know that the human eye responds logarithmically while CCDs (the digital sensors used for astronomy) respond linearly. I'm guessing that film is roughly logarithmic, and the sensors used for digital cameras (which aren't CCDs, but I forget what they are) are roughly linear, but I could be wrong.

I'm sure you're right that our apparent depth of field is mostly due to how we change focus so quickly; I'm not sure about our dynamic range, but your guess seems reasonable.
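To make the log-versus-linear contrast concrete, here is a toy numpy sketch (all numbers invented, only meant to show the shape of the two responses): a linear sensor clips the top of a ten-stop scene, while a rough logarithmic mapping squeezes the whole range into the output.

    # Toy comparison of a linear sensor response and a rough logarithmic one
    # over a scene spanning ten stops. All numbers are made up for illustration.
    import numpy as np

    luminance = 2.0 ** np.arange(0, 11)        # ten stops: 1, 2, 4, ..., 1024

    # Linear response with the gain set so mid-tones land in range:
    # anything above full scale simply clips at 255.
    gain = 8.0
    linear = np.clip(gain * luminance, 0, 255)

    # Rough logarithmic response: equal luminance *ratios* map to equal
    # output steps, so the whole ten-stop range fits without clipping.
    log_resp = 255 * np.log2(luminance) / np.log2(luminance.max())

    for L, lin, lg in zip(luminance, linear, log_resp):
        print(f"luminance {L:6.0f}   linear {lin:5.0f}   log {lg:5.0f}")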


q10 August 15 2007, 02:13:55 UTC
a lot of digital camera sensors are CCDs. those that aren't are CMOS-based.

you are right about the linearity, for pretty much all digital photosensor technologies used in cameras.



calzephyr77 August 15 2007, 03:09:37 UTC
I believe it's called HDR (high dynamic range) photography, although it's not the same as the film-making technique. This is just one pool on Flickr - http://www.flickr.com/groups/88604496@N00/pool/ Some of it is good, some of it is bad :-)

If you Google HDR and Photoshop, I'm sure you can find tutorials or info.


seekingferret August 15 2007, 03:50:12 UTC
Thank you for introducing me to a bunch of nifty new wallpapers.


calzephyr77 August 15 2007, 12:17:49 UTC
You're welcome :-) I really like the otherworldly feel of the better-done pictures. Buildings really seem to benefit from the treatment.


seekingferret August 15 2007, 17:16:42 UTC
Yeah, I found a bunch of HDR pictures of NYC buildings and they're absolutely magnificent.



framefolly August 15 2007, 06:35:44 UTC
Dunno any of the science behind it, but I know that film (the kind with emulsion, not digital) has a lot less range for brightness and hue than human eyes do. Someday I should learn how digital image capture works...

As for Photoshop or similar, I suspect that in addition to adjusting transparencies, you'll need to make a series of mattes and play a bit with the curves -- but you probably already knew that.
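If it helps, here is one way to read the mattes idea in code: a rough Python/OpenCV sketch that blends a dark and a bright exposure of the same scene using a blurred luminosity mask as the matte. The filenames and the blur radius are placeholders, and this is only a guess at the workflow, not a recipe.

    # Blend a dark and a bright exposure with a luminosity "matte".
    # Filenames and the blur amount are placeholders.
    import cv2
    import numpy as np

    bright = cv2.imread("overexposed.jpg").astype(np.float32) / 255.0
    dark   = cv2.imread("underexposed.jpg").astype(np.float32) / 255.0

    # The matte: where the bright frame approaches blowing out, the weight
    # shifts toward the dark frame, which still holds highlight detail.
    lum  = bright.mean(axis=2)                           # rough luminosity
    mask = cv2.GaussianBlur(lum, (0, 0), 25)[..., None]  # soften the matte edges

    blended = (1.0 - mask) * bright + mask * dark
    cv2.imwrite("blended.jpg", np.clip(blended * 255, 0, 255).astype(np.uint8))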


zandperl August 15 2007, 23:44:11 UTC
you'll need to make a series of mattes and play a bit with the curves -- but you probably already knew that.

Um, no. Why will I have to make mattes? And I really hope I don't have to play with curves, that's a pain.


framefolly August 16 2007, 06:55:05 UTC
That was my first thought -- hence the "I suspect." Mattes and curves together can get a lot done, pain or not. But clearly a lot of your respondents have better ideas, so no need to go through the hassle.


zandperl August 15 2007, 23:49:37 UTC
Someday I should learn how digital image capture works...

http://en.wikipedia.org/wiki/Photoelectric_effect
http://en.wikipedia.org/wiki/Charge-coupled_device

In short, each pixel is made of a material where, when a photon hits it, an electron pops out (the photoelectric effect, the only one of Einstein's famous 1905 results to actually win him the Nobel Prize). Put a bunch of them together and you get a CCD (astronomy) or a CMOS sensor (consumer digital cameras). Everything other than the sensor works essentially the same way in film and digital cameras.
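Here is a toy simulation of that photon-in, electron-out picture (every number invented for illustration): each pixel collects electrons in proportion to the photons hitting it, up to a full-well limit where the highlights clip, which is exactly the linear behavior mentioned upthread.

    # Toy model of a pixel: photons arrive, a fraction free electrons
    # (photoelectric effect), and the pixel saturates once its well is full.
    import numpy as np

    rng = np.random.default_rng(0)

    photons   = np.array([50, 500, 5_000, 50_000, 500_000])  # increasing brightness
    qe        = 0.6        # quantum efficiency: fraction of photons that free an electron
    full_well = 40_000     # electrons a pixel can hold before it saturates

    electrons = np.minimum(rng.poisson(photons * qe), full_well)

    # Readout is linear in the number of collected electrons.
    counts = np.round(electrons / full_well * 255).astype(int)
    print(counts)          # roughly [0, 2, 19, 191, 255]: linear until the well fills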



kelsin August 15 2007, 14:14:24 UTC
Yeah, the other comment about HDR has the right term to google. Photoshop can handle it in CS2 and above, I think:

http://www.cambridgeincolour.com/tutorials/high-dynamic-range.htm

That looks like a good tutorial. Just google HDR and Photoshop and you'll find more.


zandperl August 15 2007, 23:50:23 UTC
Hm, unfortunately that looks like a very specific tool in Photoshop. I should've specified that I'm actually using Gimp. I will google that though.


kelsin August 16 2007, 02:03:16 UTC
Yeah, Gimp's engine is only 8 bits per channel, so it can't handle true HDR. There isn't much on Linux (or open source in general) that can handle HDR, unfortunately.
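Rough arithmetic on why 8 bits per channel falls short, under the simplifying assumption of linear coding: the brightest representable value is only 255 times the darkest nonzero one, about eight stops, while a bracketed merge can easily span a dozen or more; 16- or 32-bit float pipelines sidestep that ceiling.

    # Back-of-the-envelope dynamic range of an 8-bit linear channel.
    import math

    stops_8bit_linear = math.log2(255 / 1)   # brightest / darkest nonzero code
    print(f"8-bit linear: about {stops_8bit_linear:.1f} stops")   # ~8.0 stops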



