iOS 4.1 now incorporates HDR image processing as a standard camera feature. I was keen to try this out, having already been using an HDR "app" on the iPhone and professional HDR desktop software in my image workflows.
These impromptu tests are designed to show the iPhone HDR functionality (new in iOS 4.1) in operation under "in the field" conditions.
HDR processing is a technique designed to allow digital cameras to handle demanding contrast conditions more satisfactorily. In simple terms, it works by combining multiple images taken at different exposure levels - taking the best bits (the most detail) from each of the individual images and merging them into one final image.
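To make that "best bits from each exposure" idea concrete, here is a minimal sketch in Python/NumPy of one common approach (exposure fusion): each pixel in each bracketed frame is weighted by how well-exposed it is (closeness to mid-grey), and the weighted frames are averaged. The function name and the Gaussian weighting are my own illustrative choices - this is not how Apple's implementation is documented to work.

```python
import numpy as np

def fuse_exposures(images, sigma=0.2):
    """Blend a bracketed exposure stack into one image by weighting
    each pixel by its 'well-exposedness' (closeness to mid-grey 0.5).
    `images` is a list of float arrays in [0, 1], all the same shape."""
    stack = np.stack([img.astype(float) for img in images])
    # Gaussian weight centred on 0.5: well-exposed pixels count most,
    # blown highlights and crushed shadows count least.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0)           # normalise weights per pixel
    return (weights * stack).sum(axis=0)

# Three fake "exposures" of the same simple scene: under, normal, over.
scene = np.linspace(0.0, 1.0, 5)
under, normal, over = scene * 0.5, scene, np.clip(scene * 1.5, 0, 1)
fused = fuse_exposures([under, normal, over])
```

In a real pipeline this would run per colour channel and the frames would first be aligned, but the principle - detail survives from whichever exposure captured it best - is the same.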
The HDR technique is designed to help digital cameras deal with demanding exposure scenarios. That's why these tests are not done in straightforward conditions - my point was to test the phone/camera in conditions where HDR is (theoretically) likely to bring some benefit. So, if you think shooting into the sky or even the sun is a bit tough on the camera - it's meant to be! (There's no point showing what HDR can do if the result is only going to be the same as the basic camera image.)
These are not strict, comprehensive "lab" tests using controlled conditions and studio lighting - I may do those later, so that the HDR function can be calibrated against some numbers (such as light levels). In the meantime, I hope these images give some insight into when HDR will benefit you and when it won't.
When commenting on results, no account has been taken of factors in the pictures such as the level of noise, sharpness, etc. The pictures on this page look relatively noise-free and sharp due to the resizing (by roughly a factor of 10) that has been applied, but they are not necessarily so.
Remember, it will usually come down to personal preference as to what factors make the "best" image - you may, for example, tolerate more noise to get a more saturated (rich in colour) picture. Or you may like something very contrasty to deliver more 'punch'.
I did a couple of basic tests with the camera - shooting through a window. It's a classic test where HDR can often help rescue a tricky picture - bright outdoors framed by dark indoors. The first image shows an improvement delivered by the HDR function (on the right). The contrast is flatter - as expected - allowing more detail indoors, as well as bringing detail back into the whitest parts of the clouds, some of the sky and a little colour in the red window blinds. If there's a complaint, it's an undue loss of contrast on the buildings and loss of colour saturation. This is unusual for an HDR image, which would typically be more saturated.
I'm already starting to think Apple have really tamed this function to deliver only modest, well-controlled differences. What is noticeable is the distinct lack of "halo" effect where the multiple images are blended; more on that later.
The lower image tests contrast control on a bright (but cloudy) day - thus diffuse light - but also with movement. Again, the HDR image is on the right. We see an uplift in exposure in the HDR image - possibly even an unnecessary one, resulting in a slight perceived loss of detail - but overall not too much loss of contrast. Tellingly, we can see the composition of the three raw images into one HDR image in the right-hand result - the person walking appears ghosted. The second exposure of the three is the most obvious component of the final image, but further testing is needed to confirm whether this is fixed behaviour.
However, what this test does show is that the iOS 4.1 HDR processor is not suited to moving subjects, even at fairly low speed (unless you actually want to create this specific artistic effect). This is one area where the fully automated functionality cannot compete with professional-level HDR processing, which would normally allow you to mask and remove any moving elements of the image.
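Professional packages expose that masking step to the user; purely to illustrate the idea, here is a toy sketch in Python/NumPy (the function name and threshold are my own, not how any particular package implements it): where a second exposure disagrees strongly with the reference frame - i.e. something moved - the blend falls back to the reference pixel alone, so the walker doesn't appear twice.

```python
import numpy as np

def deghost_blend(reference, other, threshold=0.15):
    """Average two aligned exposures, but where `other` disagrees
    strongly with `reference` (a moving subject), keep the reference
    pixel alone instead of blending, avoiding a ghost."""
    ref = reference.astype(float)
    oth = other.astype(float)
    moving = np.abs(oth - ref) > threshold   # crude per-pixel motion mask
    blended = 0.5 * (ref + oth)              # naive blend for static areas
    return np.where(moving, ref, blended)
```

Real deghosting is smarter (it compensates for the deliberate exposure difference before comparing pixels, and cleans up the mask), but this is the essence of why desktop tools can remove the ghosted walker while a fully automatic blend cannot.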
Furthermore - the onboard HDR processing (and indeed image capture) is extremely rapid, and this still leaves me wondering exactly how Apple have implemented the feature and what design compromises have been made. It would usually be the case that with multiple exposures the HDR software would have to realign the individual source images - and this takes time, often 20-30 seconds. The iPhone displays no such behaviour, so I tend to conclude it works on the basis that the source pictures are taken in sufficiently rapid succession that minimal camera movement has occurred. Perhaps the onboard gyros and accelerometers help track the phone's movement? Either way, it's an impressive feat, and knowing Apple they will have come up with a creative approach (as they have with 'multi-tasking').
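How Apple actually handles (or skips) alignment is unknown; for context, one standard technique desktop HDR tools can use to register two frames is phase correlation - find the peak of the inverse FFT of the normalised cross-power spectrum. A minimal NumPy sketch (integer-pixel shifts only; the function name is my own):

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the integer (row, col) translation that maps frame `a`
    onto frame `b`, via phase correlation."""
    A, B = np.fft.fft2(a), np.fft.fft2(b)
    cross = B * np.conj(A)
    cross /= np.abs(cross) + 1e-12           # keep only phase information
    corr = np.fft.ifft2(cross).real          # impulse at the true shift
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past the halfway point wrap around to negative shifts.
    return tuple(p - n if p > n // 2 else p for p, n in zip(peak, a.shape))
```

Even a fast method like this adds measurable work per frame, which is why the phone's apparent "no alignment delay" behaviour suggests it simply trusts the rapid burst (perhaps sensor-assisted) to keep the frames registered.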
I used the following approach in the next set of tests: the first photo (leftmost) is the standard image from the iPhone. The second photo is the HDR version of the same image as taken by the iOS 4.1 HDR function. The third image is an HDR image created using the HDR PRO app on the iPhone. This app actually takes two exposures (not three, like Apple's implementation) and combines them, allowing the user some control over the final result (e.g. brightness and contrast).
I recently used this app on holiday and pretty much ditched my pro DSLR camera for the duration and got some very good results.
Finally, the fourth picture, where shown, is an HDR image created using professional desktop software on my laptop. I did this where I felt there would be some difference and "something to show". For fairness (and to make a point, I guess) I used the same raw images that the HDR PRO app took to create its result. So essentially this is an additional test of HDR PRO app versus the professional desktop/laptop software.
Let's deal with the simple stuff first - I think it's fair to say the professional desktop HDR software produces the best results. So it should. It costs significantly more than the iPhone app and the results are not fully automated, but rely on some human input and interpretation. Also the software offers an incredible level of control.
Second - I think in all cases the iPhone HDR PRO app out-performs the basic camera and basic HDR function, even though it only uses two input images compared to Apple's implementation, which uses three. But again, perhaps this is to be expected, since it offers greater control over both the input (you can select how to expose the source images) and the output - even though it's only a small amount of control. The HDR PRO app produces images of greater saturation and contrast, which thus have (in a sense) a more natural feel - i.e. more akin to what the eye is used to seeing (skies that make you go "wow!", for example).
The main problem with the HDR PRO app images is the lack of control over the "halo-ing" effect where the application blends the two image exposures together. This is a common problem in HDR processing - one which Apple seem to have magically avoided. In the desktop software there is total control of this artifacting, not to mention numerous tone-mapping algorithms to use completely different blending methods, avoiding the halo effect altogether.
What's interesting is the general lack of difference between the iPhone standard image and iPhone HDR versions. In most cases the HDR version shows an uplift in exposure and a little more detail in the shadows - but not much - and certainly does not deliver the 'punch' that has become synonymous with HDR imaging. Personally I feel that in most cases the iPhone HDR image is better than the standard one - although not always. For me the candle image is much better as the basic image, though I haven't figured out why the HDR function performed so badly here - unless perhaps it was using some kind of centre-weighted metering system and was fooled by the candle.
Certainly it would lead me to conclude that, for static images, nine times out of ten the HDR function will deliver a better image; and even if it doesn't, it can be set to save the basic image as well. So, if you have the memory space, shooting in HDR by default is probably a safe bet.
The picture under the tree/shrubbery is interesting as it exhibits the opposite effect - here the HDR function has very successfully tamed the stray light diffusion in the basic image (which makes it look a bit washed out) and produced another good result, adding apparent contrast that was not there in the original.
The HDR function - and indeed the camera in general - is not without its limitations. The main one is its ability to shoot in low light, and sadly this limits the HDR function's ability to pull detail out of dark spaces, especially indoors. This is where HDR is traditionally at its most powerful for pro photographers, so this is disappointing to say the least, and by itself means the iPhone will not be replacing any DSLRs - or producing serious indoor photography - any time soon!
This contrasts with its performance in bright light, which is very good, producing sharp images of very acceptable quality. In bright light the HDR function works well because the camera is better able to reduce the exposure (i.e. let less light in and make the picture darker) and thus create multiple exposures with much greater variation. This makes for much punchier HDR images, which is why the outdoor cathedral and sky/building images are more eye-grabbing.
These tests are pointers for me - not the final result - they help to figure out how the camera works and under what conditions it works best. They will probably lead to more detailed and controlled tests which help to understand how and when the HDR function benefits the image. Really I also need to produce histograms for the images so we can see how the camera chooses to meter and spread the light in the picture, and also assess colour balance. In the meantime I'll be shooting with the HDR function on anyway - because it costs nothing and in most cases produces a better image; sometimes a significantly better one.