The old vs new “HDR” photography
True HDR photography is one of the most significant advancements in image display in decades – I would argue the biggest thing since color. But a lot of people seem to overlook it due to confusion with the older “HDR”, which is completely different (other than confusingly having the same name). This tutorial is intended to help clarify how these are completely different, and why the new HDR is so exciting.
The “old” HDR (Photomatix, etc):
The old HDR involves software like Photomatix, Nik HDR Efex Pro, and Luminar Neo (via “HDR merge”)**. This technique was most famously associated with Trey Ratcliff (whom I greatly respect as both an artist and friend). It was extremely popular for a while, but ultimately was more of a trend and is used much less often now than it was a decade ago.
Almost every photographer is familiar with the old HDR. Some love its ability to reveal more shadow detail and add color and local contrast. Many dislike it because they feel the results show excessive noise, unrealistically bright shadows, over-saturated colors, and generally deviate too much from the real world.
If I can digress for just a moment… Personally, I feel that a large number of bad images were the result of improper processing. I always treated it as a filter to blend into my images at low opacity and have probably never shared an image that used the old HDR technique at anything over 40-50%. I don’t intend to dive into this approach, but merely bring it up to make the point: these are all tools and any tool can produce bad results if not used properly. Give photographers a knob and someone is going to turn it up to 11 (the new HDR isn’t going to escape misuse either, it’s inevitable with almost any new tool – especially while learning how to use it optimally).
Getting back on topic… What is the old HDR? Technically, it is “tone mapping” high dynamic range image data to an SDR display. In simple terms, it’s a way of dealing with the fact that our cameras capture far more dynamic range than our monitors have traditionally been capable of showing. Most monitors support “standard dynamic range” or SDR. These monitors are capable of producing about 8 stops of dynamic range. However, digital cameras have been capable of recording much more than that for decades. Modern digital cameras are often capable of capturing 14-15 stops of dynamic range. There are only two fundamental ways to display such rich image data on a limited display: you can reduce the contrast or discard some of the data (clip highlights/shadows) – or use some mix of the two.
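The stops arithmetic above can be made concrete: a “stop” is a doubling of light, so a display’s dynamic range in stops is the base-2 log of its contrast ratio. A quick sketch in Python (the 256:1 SDR figure is an illustrative round number, not a measured spec):

```python
import math

def stops(contrast_ratio: float) -> float:
    """Dynamic range in photographic stops: each stop is a doubling of light."""
    return math.log2(contrast_ratio)

# An SDR display with a roughly 256:1 usable contrast ratio:
print(stops(256))        # 8.0 stops

# A modern sensor capturing ~14 stops leaves a large gap:
print(14 - stops(256))   # 6.0 stops that must be compressed or clipped
```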
Standard RAW processing deals with this problem primarily by compressing and clipping highlight details by default (because the most important content in an image is the shadow and midtone detail). This often leads to a loss of saturation in the highlights (for example, the only way to brighten a blue sky beyond the limits of a blue sub-pixel is to light up the red and green subpixels, which makes white). You can of course darken the image to help recover sky detail and color, but then you start to lose the shadows. These are significant tradeoffs.
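The desaturation effect described above is easy to demonstrate: when exposure is raised, the dominant channel clips first, and the spread between channels (i.e. the saturation) collapses. A minimal sketch with hypothetical linear RGB values:

```python
def clip(rgb):
    """Clamp each channel to the display's maximum (1.0)."""
    return [min(1.0, c) for c in rgb]

def saturation(rgb):
    """HSV-style saturation: spread between channels relative to the max."""
    return (max(rgb) - min(rgb)) / max(rgb)

sky = [0.2, 0.4, 0.9]                  # a deep blue sky in linear values
brighter = clip([2 * c for c in sky])  # +1 stop of exposure, then clipping

print(round(saturation(sky), 2))       # 0.78
print(round(saturation(brighter), 2))  # 0.6 -> the sky drifts toward white
```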
The old HDR (tone mapping) tries to offer a different approach to the same problem. It tries to avoid clipping the data in order to simulate a wider dynamic range. Simply using a low contrast curve to show the full range of image data would look dark and dull. Instead, tone mapping employs sophisticated algorithms which try to preserve contrast in local parts of the image. Those methods are what produce the results that some people love and some people hate. Those methods also vary depending on the software used and the results are of course dependent on how the photographer uses the available settings.
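To give a flavor of what a tone-mapping curve does, here is the classic global Reinhard operator, which squeezes an unbounded luminance range into [0, 1). Real tone-mapping software uses far more sophisticated local operators, so treat this only as a minimal illustration of the idea:

```python
def reinhard(luminance: float) -> float:
    """Global Reinhard tone mapping: L / (1 + L) maps [0, inf) into [0, 1)."""
    return luminance / (1.0 + luminance)

# Scene-referred luminances spanning many stops all fit on an SDR display:
for L in [0.1, 1.0, 10.0, 100.0]:
    print(round(reinhard(L), 3))   # 0.091, 0.5, 0.909, 0.99
```

Note how the midtones keep most of their contrast while the bright end is compressed ever harder – that asymmetry is what local tone-mapping algorithms then try to counteract region by region.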
Many people associate tone mapping with merging several exposures together to increase dynamic range. This is a source of tremendous confusion. It is completely unnecessary in most situations because a properly exposed RAW file already contains vastly more dynamic range than a standard monitor can display. So in a way, it’s just leaning into the problem: the dynamic range of the image is getting that much greater than the monitor can handle. In practice, it can be beneficial to help reduce shadow noise – but it’s not mandatory for tone mapping and is used much more often than is necessary. It’s not terribly important to this discussion: you can use one or multiple exposures with either the old or the new approaches to HDR – but the biggest differences have nothing to do with how many exposures you use.
In the end, the maximum dynamic range is the same whether you use standard RAW processing or the old HDR methods (tone mapping via software): it’s an inherent limitation of older monitors determined by the darkest and brightest pixels they can display. Whether you like tone mapping results or not, that’s the key point: it is designed for technology limits which no longer apply with many displays and we now have much better options.
** Note that Lightroom (LR) and Adobe Camera RAW (ACR) actually support both this old approach and the new approach we’ll discuss below (the old approach in LR uses “merge to HDR” without subsequently enabling the “HDR” editing mode).
The “new” HDR (requires new display technology capable of brighter pixels):
The “new” HDR tries to address the same problem, but in a totally different and superior way: with better monitors. The new HDR involves new monitor technology which is capable of displaying brighter pixels, while still offering deep (or even darker) black shadow values. This involves technology like mini-LED or bright OLED displays.
The peak brightness of a monitor can be measured in “nits” (which is identical to cd/m^2, often seen in calibration software). Older SDR display technology often shows a maximum white value of about 100-200 nits (but potentially higher for use in bright situations such as near a window). New monitors which support HDR offer a peak brightness of typically 400-1600 nits, with 1000 nits being the level of support where these displays really start to look incredible (though you’ll likely have a great experience at lower levels if you’re in a sufficiently dark room).
These new HDR displays offer up to 4 stops of additional dynamic range over an SDR display. That does not completely close the gap with the capability of our RAW files, but nearly so. That means there is no longer a need to significantly compromise highlights by reducing contrast, clipping, or tone mapping. The result is more colorful sunsets, city lights that truly glow, truly higher dynamic range, and an image which is much closer to representing the real world light captured by the camera.
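The “up to 4 stops” figure follows directly from the nit values above: each stop doubles brightness, so the headroom is the base-2 log of the ratio between the HDR peak and SDR reference white (taking ~100 nits as the SDR reference, an assumption for illustration):

```python
import math

def hdr_headroom_stops(hdr_peak_nits: float, sdr_white_nits: float = 100.0) -> float:
    """Extra stops an HDR display offers above SDR reference white."""
    return math.log2(hdr_peak_nits / sdr_white_nits)

for peak in [400, 1000, 1600]:
    print(round(hdr_headroom_stops(peak), 2))   # 2.0, 3.32, 4.0
```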
So what is the new HDR? It is true high dynamic range display of high dynamic range image data. It may still involve some need to compress the dynamic range, as even an HDR monitor has limits. But those limits are much higher and if there is any compromise of dynamic range, it is far less than when editing for an older SDR display.
If you have not seen a properly edited HDR photo on a good HDR monitor, it is impossible to really appreciate how incredible the results are. It would be like trying to understand the benefit of a hi-fi stereo by listening to an old AM radio. After seeing a properly edited image on a great HDR display, I consistently hear photographers say “wow!” or things like “everything else looks dull in comparison”. You really have to see it for yourself. A great way to do that is to view my comparison images with Chrome on any of the 14 or 16″ Apple Silicon MacBook Pros (i.e. any M1 or later).
Are images processed for an HDR monitor better? In many cases, yes and the results are dramatically better. But there are many images where using the HDR range would be a terrible creative decision. This is no different from many other creative options. You shouldn’t use a bunch of filters on every image just because they’re installed on your computer. You shouldn’t boost the saturation of your images to the maximum just because you have a wide gamut P3-capable monitor. And you shouldn’t make your images brighter just because you can. All of these options are just creative tools which are neither universally good nor universally bad. They can greatly enhance the right image when used properly, or make a visual mess when used incorrectly.
The benefit of true HDR display depends on your subject and your creative vision. If you shoot images in dramatic light such as sunset landscapes, cityscapes at the blue hour, or concerts with elaborate stage lights – then you almost certainly have many images which will clearly look better in HDR. But if you primarily shoot corporate headshots or closeups of wildlife in soft light, you probably won’t have many images that will benefit. Human skin and animal fur shouldn’t glow, and you probably don’t need more contrast. The scenes where HDR will really shine are the ones where the limitations of SDR have compromised highlight detail and color which are an important part of the visual narrative.
Common misperceptions about the new HDR:
Even for those who have some appreciation that there is a completely new HDR display technology, there are a few common misperceptions which merit discussion.
First, there is a common misperception that HDR displays are rare. It is true that they are rare for external monitors. However, they are widely available on many other displays. Almost all decent TVs sold in the past several years have good HDR (including some models as far back as 2016). The majority of smart phones sold in the past 3-4 years have great HDR displays. Almost every Apple monitor sold since 2018 has at least some degree of support and the MacBook Pros since 2020 have been incredible. There are an increasing number of PC laptops with 600+ nit OLED displays. In many cases, these displays have not been appreciated due to a lack of software or content. The software gaps have been significantly reduced over the past couple years, and we are now very close to a point where you can easily create HDR images which can be appreciated by a large audience (as Instagram is rolling out support for HDR and most people use Instagram on HDR-capable phones).
The cost of an external monitor can be a real barrier for creators who don’t have an Apple laptop or edit on a mobile device. This is much newer and better technology, so it doesn’t yet come at SDR prices. ASUS has great options (including with hardware calibration and support for both Windows and MacOS) at a range of price points. If you’re willing to use a TV as a monitor (which can work very well), you can get excellent results at a great price. For example, you could probably find a used 42″ LG C2 for $600. Prices and selection will improve considerably in the years ahead. See my list of recommended HDR monitors for much more discussion on the best options and what to consider.
Second, there is a concern that HDR images cannot be shared with those who lack HDR monitors. If we had to wait for everyone to have an HDR display, that would be a serious limitation. Thankfully, we already have a solution that makes these images completely backwards compatible with any display. A new way of encoding images with a “gain map” effectively allows you to put both a standard (SDR) and an enhanced (HDR) rendition into the same file. This can be done with minimal effort and ensures that everyone sees a great result. The viewer either gets something as good as your best standard images, or a vastly better HDR display. With the gain map ISO standard for sharing HDR images likely to be finalized in late 2024, 2025 should be a significant inflection point in the awareness and adoption of the new HDR photography. At that point, the value of the already large installed base of HDR displays will become much more apparent.
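Conceptually, a gain map stores, per pixel, how many stops brighter the HDR rendition is than the SDR base image; the viewer then applies as much of that gain as its display headroom allows. The sketch below is a simplified illustration of that idea only (the actual ISO gain map format adds offsets, gamma encoding, and metadata, and the pixel values here are hypothetical):

```python
import math

def encode_gain(sdr: float, hdr: float, eps: float = 1e-6) -> float:
    """Per-pixel gain in stops: log2 ratio of HDR to SDR luminance."""
    return math.log2((hdr + eps) / (sdr + eps))

def render(sdr: float, gain: float, weight: float) -> float:
    """weight=0 reproduces the SDR image; weight=1 the full HDR image."""
    return sdr * (2.0 ** (gain * weight))

sdr_pixel, hdr_pixel = 0.25, 1.0       # hypothetical linear luminances
g = encode_gain(sdr_pixel, hdr_pixel)  # ~2 stops of gain for this pixel

print(round(render(sdr_pixel, g, 0.0), 3))  # 0.25 on an SDR display
print(round(render(sdr_pixel, g, 1.0), 3))  # 1.0 on a capable HDR display
```

A display with only partial headroom would pick an intermediate weight, which is why every viewer gets a sensible result from the same file.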
Third, there is a common concern that you cannot print HDR. Our prints are not getting brighter (the literal interpretation of HDR on paper would require ink that glows), so it is true that trying to directly print an HDR image will likely produce poor results. Thankfully, that doesn’t really matter. It is very easy to adopt workflows which support both print and HDR display of the same image. In fact, you can edit for print and automatically generate an enhanced HDR image using Web Sharp Pro. So you can do both with no change to your existing workflow if you like (as well as upgrade your existing SDR edits). Realistically, most images are never printed and HDR display removes a lot of the time and technical challenge of editing images for an SDR display. So even if you prefer not to have an HDR version of an image you print, it may offer a lot of benefit for the rest of your images.
Fourth, some viewers have voiced concerns that an HDR video / image may be too bright. This is potentially true if viewing a bad edit (excessive use of HDR) in very dark ambient light (such as viewing a phone in the bedroom). But you probably need both of those things to be true in order for it to be an uncomfortable experience (which is the result of the display being far too bright relative to the ambient light). Most pixels in a properly edited HDR image should remain in the SDR range; it’s only a small portion of the image which should be brighter (the highlights which would have been compromised on an SDR display).

Of course, that doesn’t stop someone from making bad edits. But keep in mind that even the brightest pixels (around 1,600 nits) are much darker than the original subject (for example, a tungsten filament is probably closer to 30,000 nits). And you probably won’t get anywhere near that maximum when your screen brightness is set appropriately for the ambient light. For example, the peak brightness of a phone or computer is typically limited when brightness is below 50% (even if you have a 1600-nit-capable MacBook Pro, you won’t see anything close to that in your brightest HDR pixel when you set the brightness to a low level).

I have seen a lot of confusion around this topic, such as a comment from someone on a test post of mine that it “hurt his eyes” while at the same time making it very clear that his display lacked HDR support and he was just looking at a block of standard SDR white. It’s entirely possible that SDR content is too bright if you’re in a nearly black room. But I don’t want to dismiss the issue either. To the degree that there is a real concern for some poorly edited HDR images viewed in dark conditions, there is also potential for improved software here as HDR continues to evolve.
For example, peak brightness might best be limited if a device’s ambient light sensor detects a nearly black room or the image uses too many bright HDR pixels. And perhaps such a solution might offer some degree of user choice in the operating system, as everyone has different preferences and that’s completely valid. Beauty is in the eye of the beholder.
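Such a policy could be as simple as clamping the panel’s peak brightness based on the ambient light sensor and how much of the frame sits above SDR white. All thresholds below are hypothetical, purely to illustrate the idea:

```python
def clamp_peak_nits(panel_peak: float, ambient_lux: float,
                    bright_fraction: float) -> float:
    """Hypothetical limiter: rein in HDR peaks in dark rooms or for
    edits that push too much of the frame into the HDR range."""
    if ambient_lux < 5:            # nearly black room
        panel_peak = min(panel_peak, 600.0)
    if bright_fraction > 0.25:     # over a quarter of pixels above SDR white
        panel_peak = min(panel_peak, 800.0)
    return panel_peak

print(clamp_peak_nits(1600, ambient_lux=2, bright_fraction=0.1))    # 600.0
print(clamp_peak_nits(1600, ambient_lux=200, bright_fraction=0.1))  # 1600
```

A user-facing preference could simply scale or disable these caps, matching the point above that everyone’s comfort level is different.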
So there are some opportunities for HDR to improve, and that’s not at all surprising for a technology which is rapidly evolving. But most of the concern around these points tends to boil down to reading misleading information, a lack of experience, not spending enough time to get up the learning curve of a new technology and its artistic choices, or a lack of awareness of how quickly things are progressing and what’s right around the corner. I had similar questions too when I first started exploring HDR. I encourage everyone to spend some time exploring HDR before forming any strong opinions.
Conclusions
Both the old and new HDR technology try to help us get the best results out of high dynamic range RAW files. The old HDR tone mapping methods were designed to give us an alternative way to represent that great data within the limits of standard dynamic range displays. New HDR display technology removes the need for such compromises, as we finally have monitors that can live up to the dynamic range our cameras have been capturing for decades.
The new hardware is widely available in TVs, smart phones, and Apple displays – and continues to advance rapidly. Recent advances in editing software and browsers finally offer great support for that hardware. And as the gain map standard is finalized and critical sites like Instagram continue to offer support, we are quickly approaching a world where it will be very easy to share HDR content with a large audience (and without sacrificing the experience of those still using older SDR displays). Now is an excellent time to start experimenting with editing HDR images, and our ability to share that work with others is set to greatly expand in the very near future. The benefits are substantial: HDR is going to be an important part of the future of photography.