Instagram now supports HDR photos!

Instagram (IG) and Threads just added support for HDR (High Dynamic Range) photos! This means we are starting to see the first steps toward mainstream use of these enhanced images. HDR photos look more true to life by retaining the full color and dynamic range captured by the camera sensor.

I’m still testing to see what works best, but have posted a few HDR test images to my IG account (note that I have yet to find a way to get a final post that fully matches the complex edits I’m doing on my computer, which isn’t surprising, as support for sharing phone captures is the natural first step for a mobile-first platform).

 

How can you view HDR photos on Instagram?

IG supports HDR on all major platforms: iPhone, Android, and the web (with a supporting browser like Chrome). Most smartphones support HDR, as do most Apple laptops sold in the last several years (the 14-16″ M1 or later MacBook Pros look especially stunning). The majority of your audience will likely be able to see your HDR images, as IG is primarily used on mobile phones.

Support is automatic for most users. There’s nothing you need to do on a phone other than ensure you’re up to date on your operating system and the Instagram app. For a computer, see my tests to confirm that your display supports HDR.

Photos will show as HDR in the feed and when viewed large (such as by clicking an image in the grid on a web browser). They currently show as SDR (standard dynamic range) when viewing a profile grid on a mobile device until you click to view a specific image (the grid is HDR on the web). The images are shared as gain maps, which is a standard intended to help optimize display on any monitor (regardless of whether it supports HDR or just SDR).
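To make the gain map idea concrete, here is a minimal conceptual sketch in Python. This is not the actual Adobe/ISO gain map format (real gain maps carry metadata and interpolate smoothly based on display headroom); it only illustrates the core idea of storing a per-pixel brightness ratio so one file can serve both SDR and HDR displays:

```python
import math

def make_gain_map(sdr, hdr):
    """Store the per-pixel log2 ratio of HDR to SDR linear luminance."""
    return [math.log2(h / s) for s, h in zip(sdr, hdr)]

def render(sdr, gain_map, display_headroom_stops):
    """Apply only as much gain as the display can show.
    A real implementation scales the map smoothly; this simply clamps."""
    out = []
    for s, g in zip(sdr, gain_map):
        applied = min(g, display_headroom_stops)  # clamp to display capability
        out.append(s * 2 ** applied)
    return out

sdr = [0.2, 0.5, 1.0]   # SDR rendition (linear, 1.0 = SDR white)
hdr = [0.2, 1.0, 4.0]   # HDR rendition (up to 2 stops above SDR white)
gm = make_gain_map(sdr, hdr)

print(render(sdr, gm, 0))  # SDR display: falls back to the SDR rendition
print(render(sdr, gm, 2))  # 2-stop HDR display: full HDR rendition
```

The key design point is that the SDR rendition travels with the file, so an SDR screen never has to guess at a tone mapping – it just ignores the gain.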

Support appears to be rolling out across other Meta-owned platforms as well. The Threads iOS app now supports HDR. And if you share an IG or Threads post to Facebook, it will show as HDR when viewed in a supporting web browser. Support seems to be expanding and evolving quickly.

 

How do you capture and upload HDR photos to your account?

Initial support appears focused on sharing images from smartphone cameras (which makes sense, as that is the primary way most people interact with IG). Just use the native iOS / Android camera app and you should have HDR support. I have tested JPG, HEIC, and RAW on the iPhone, and all of those formats are supported when captured with the native camera. On Android, I have found that both the native camera app and IG’s built-in camera work.

You can also upload images exported from Web Sharp Pro, Lightroom, and ACR as an HDR AVIF encoded in the P3 colorspace – at least when uploading through the iOS IG app. This is not an optimal solution, as it does not include a gain map for optimal SDR rendering, but that’s less important here because most images on Instagram will be viewed on HDR-capable mobile phones. The following video provides more context and shows the ACR pathway through Web Sharp Pro (which offers an “enhance SDR to HDR” option to easily upgrade any image to HDR, Instagram size / cropping templates, conversion to P3, optimized sharpening, optional borders, etc).

 

That’s an incredible start, but what about other options? It would be amazing to have support for every camera app, custom gain maps, etc right now – but we have to start somewhere. I haven’t even seen support documentation on these capabilities, so we are probably in beta territory at this point. Instagram has done an amazing job with this initial support and deserves a lot of credit for this achievement (along with the other companies behind the technologies involved here: Google, Apple, Samsung, etc).

 

What isn’t supported?

I have found the following scenarios do NOT currently support HDR (this is not an exhaustive list, and limitations here may be resolved by the time you read this):

  • Full HDR capacity. If you look at my latest test post, you’ll see an image with 1.5 stops of HDR headroom – but my source image is edited up to 4 stops and looks much more compelling on my computer. This just looks like a glitch resulting from me trying to share something outside the initially planned support (ie images captured on a phone). So as nice as it is now, it could get much better with full support in the future.
  • Custom gain maps. The full gain map spec used by Adobe is not yet supported on IG, so we can’t provide HDR images which are also optimized for SDR fallback. I’m ok with the tradeoff for now, as HDR is already the norm for mobile devices.
  • Some iOS native camera captures do not include the required gain map. This includes pano images, as well as portrait mode shots if you shoot too quickly (before text like “natural light” shows in yellow) or use either of the mono (black and white) modes.
  • Third-party camera apps may not work.
    • I tested ProCamera and found that JPG / HEIF images did not include gain maps and RAW / ProRAW captures did not generate gain maps when uploaded (unlike RAW files from the native iOS camera).
    • The IG camera on iOS did not support HDR in my testing (but direct captures within IG work great on Android / S24).
  • Filters work great for HDR in the Android app. However, selecting any IG filter in the iOS app during upload will produce an SDR result.
  • Uploading through a web browser on a computer will fail to preserve HDR in most cases.
  • Uploading iOS captures on Android or vice versa (Apple and Google use different gain map formats).

None of these limitations for advanced HDR photography surprise me for initial support. There are different gain map specs and implementations at this point, which will likely require further work from multiple companies to allow full support across platforms. Focusing on sharing smartphone images is the right first step, and the results are gorgeous and very easy to obtain (automatic, actually). I had assumed we probably wouldn’t see social media support for HDR any sooner than the end of 2024, so this is a very exciting development.

Keep an eye on my newsletter and Instagram account for updates as HDR support continues to evolve on social media.

What the HDR histogram can teach you about your RAW images

Lightroom (LR) and Adobe Camera RAW (ACR) don’t offer a RAW histogram, but there is a new way to get more information about your RAW images: the HDR histogram. And you can use it even if you do not have an HDR monitor or do not intend to process your image as HDR. Just click the “HDR” button in the develop module and watch how the right side of the histogram changes. Detail which is bunched up towards the right side of the SDR histogram can now be properly rendered as it was in the original scene. It is not a RAW histogram, but it will show you much more about the highlight data in your RAW image. **

 

And this full histogram helps illustrate a couple of lessons:

We don’t “recover” highlights – we just process them so they fit into the capabilities of our monitors:

Truly “blown” pixels cannot be recovered with RAW processing. RAW processing tools like LR and ACR do not invent highlight detail (though I hope someone invents an AI tool to help manage slightly blown skies – that would be quite useful). We can clip pixels in our SDR processing, but if you are able to extract highlight detail from RAW, it was always there.

The HDR vs SDR histogram helps show the difference between truly blown pixels in the RAW vs “highlight rolloff” (compression of the highlights to squeeze them into the SDR range). You can also visualize this information on an SDR display simply by reducing the exposure slider. Everything is now too dark, but it does show what’s possible with your image.

As a side note for those of you who haven’t experienced HDR yet, you can also use this information to help determine whether HDR display would improve your image. For example, if you see color and detail restored when you set exposure to -2, then you know that 2 stops of HDR headroom would be sufficient to see these pixels properly on an HDR display. Given that many displays offer 4 or even 5 stops of headroom (such as the M1 or later MacBook Pro), you would easily be able to see detail which becomes visible when you slide exposure down to -4.
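The arithmetic behind that rule of thumb is simple: each stop of negative exposure that reveals detail corresponds to one stop of HDR headroom needed to display it. A minimal sketch (the function names are mine, just for illustration):

```python
def headroom_needed(exposure_offset_stops):
    """Stops of HDR headroom needed to properly display detail that
    only becomes visible at this (negative) exposure slider setting."""
    return max(0, -exposure_offset_stops)

def display_can_show(exposure_offset_stops, display_headroom_stops):
    """True if a display with the given headroom can show that detail."""
    return display_headroom_stops >= headroom_needed(exposure_offset_stops)

print(headroom_needed(-2))      # detail restored at -2 needs 2 stops of headroom
print(display_can_show(-4, 5))  # a 5-stop display covers detail seen at -4
```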

 

If the untouched RAW doesn’t use much of the HDR range, it is probably under-exposed:

If you are properly exposing to the right (ETTR), your RAW images should frequently look too bright at first (ie require that you set exposure slightly negative when you process them). The reason for this is that brighter exposures have less noise (better signal to noise ratio).

There isn’t an exact point where the HDR histogram tells you your camera would have clipped – it varies by ISO and probably by camera. However, if you consistently see little or no use of the HDR range in the histogram, you are probably frequently under-exposing your images. Your RAW at base ISO can probably offer 3 or more stops of HDR support. So you might have an image that looks properly exposed for SDR, but the HDR histogram might be telling you it could have safely been exposed 3 stops brighter. In other words, your ISO 100 image might have the noise quality of an ISO 400 – 800 image and you could have avoided that with a longer shutter speed.
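The noise tradeoff above can be sketched with a simplified model: each stop of unused highlight headroom roughly doubles the effective ISO in noise terms. Real sensor behavior varies by camera and ISO, so treat this as an illustration of the ETTR principle, not a measurement:

```python
def effective_noise_iso(shot_iso, unused_headroom_stops):
    """Rough noise-equivalent ISO when an image could have been exposed
    brighter by the given number of stops (simplified ETTR model:
    each wasted stop of headroom doubles the effective noise ISO)."""
    return shot_iso * 2 ** unused_headroom_stops

# Shot at ISO 100, but 3 unused stops of highlight headroom:
print(effective_noise_iso(100, 3))  # noise comparable to an ISO 800 shot
```

This matches the upper end of the ISO 400 – 800 range mentioned above; in practice the penalty is often somewhat less than a full doubling per stop.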

Note that the histogram will likely show exposure increasing by more than the amount you set on the slider; not everything here is based on precise physics (this is still just an HDR histogram based on Adobe processing, not a RAW histogram). This is just a visual way to help understand ETTR principles. You can also watch for predicted clipping when bumping the exposure slider in SDR mode; you just won’t be seeing the same details in the histogram.

Of course, sometimes slower shutters or wider apertures aren’t ideal given other considerations for movement or detail in the image. But if you know the limits, you can reduce noise when possible.

 

** To see an actual RAW histogram:

The HDR histogram helps you see more of your image data in LR / ACR. A true RAW histogram would show the mosaic sensor data, typically including two green channels. It wouldn’t have color data and it wouldn’t be responsive to any RAW processing choices (since it would be measured from the original data). This can be helpful for better understanding your camera and shooting decisions.

If you want to go a step further and see a true RAW histogram, you might want to check out RAW Digger. I have not personally used it, but have heard great things both about the software and the developer’s excellent support of it.

Real estate exposure blending

Many of you have asked me for more real estate editing tutorials, especially for exposure blending of windows using the Lumenzia luminosity masking panel. New Zealand-based photographer Anthony Turnham has a great YouTube channel with several outstanding tutorials on that very topic (plus many other great post-processing videos). So I’m creating this post to share a collection of his videos to help answer your questions. The videos are set to play at the time point where Lumenzia is used, but I highly recommend watching the full videos as they are packed with great information on the complete edit.

His videos are also a great complement to my Exposure Blending Master Course, which goes into great depth on the fundamentals of blending – but doesn’t cover interiors with the depth Anthony does.

 

If you’re looking for even more videos showing how others use Lumenzia, be sure to check out this playlist on my channel as well.

Custom luminosity masks with Lumenzia v11.7

Lumenzia v11.7 now includes numerous enhancements to make it easier than ever to create the perfect luminosity mask. The previews are now fully interactive so you can quickly isolate the foreground from the sky, separate a red flower from green grass, let the levels automatically refine themselves for a stronger mask and much more. This significantly reduces the number of steps it takes to create custom luminosity masks and selections.

Under the flyout menu (top-right 4 bars icon) is a new “orange preview options…” dialog which offers the following enhancements while you preview L2, L3, etc:

  • Activate your favorite selection or pen tool so that you can quickly target luminosity for specific subjects in your image. When you then load the preview as a mask/selection, the luminosity mask will only use the areas you’ve selected in the preview.
  • Auto-optimize the levels layer. Each time you create a preview or adjust the sliders, the levels will be automatically optimized to give you a stronger mask. This eliminates the need to manually refine the levels in most cases.
  • Immediately paint / dodge the preview. The paint brush and black/white paint will automatically be selected for you to start brushing.

The active tool will automatically switch for you based on the active layer, so you can easily use any combination of these new tools. You can also set the initial default, so that you can immediately start with your favorite refinement method. And when you’re done with the preview, the active tool and paint will be set back to where they were before the preview so you can just keep working. This helps eliminate numerous clicks and decisions so you can work more quickly and efficiently, and should ultimately make it easier to get the perfect luminosity mask for your image.

For more details, please see the release notes.

How to avoid color shifts with Lightroom curves

Adobe Lightroom (LR) and Adobe Camera RAW (ACR) recently added a very powerful but mostly overlooked feature: “refine saturation” (or just “refine” in ACR). You’re probably well aware of the luminosity blend mode for layers in Photoshop and how it can often be useful to avoid unwanted shifts in color when making adjustments. The same problem affects LR / ACR and this new slider gives us a way to help manage it.

How does refine saturation work?

  • When you increase contrast with a point curve, colors get more saturated (hue may also shift somewhat).
  • When you decrease contrast with a point curve, colors start to desaturate (hue may also shift somewhat).
  • In either case, adjusting the refine slider down will offset those changes. Dragging it down is therefore a lot like the luminosity blend mode in Photoshop, but with the full range of a slider instead of only offering one extreme or the other.
  • The refine slider has no impact on gray pixels (and won’t even be available if the image is fully black and white).
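Dragging refine fully down behaves a lot like applying the curve to luminosity only. This is not Adobe’s actual algorithm – just a simplified model of the idea, assuming linear RGB and Rec. 709 luma weights, with the curve left unclamped for simplicity:

```python
def apply_curve_luminosity_only(rgb, curve):
    """Apply a tone curve to luma only, scaling all channels by the
    same factor so hue and saturation ratios are preserved."""
    r, g, b = rgb
    luma = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights
    if luma == 0:
        return rgb
    scale = curve(luma) / luma
    return (r * scale, g * scale, b * scale)

def s_curve(x):
    """Simple contrast boost pivoting around middle gray (0.5)."""
    return 0.5 + (x - 0.5) * 1.5

print(apply_curve_luminosity_only((0.6, 0.4, 0.3), s_curve))
```

Note that this model also reproduces the gray-pixel behavior above: a neutral pixel scales equally on all channels, so the result stays neutral.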

You’re much more likely to be using a curve to add contrast, so most of the time it would seem that the refine slider desaturates the image – but it really depends on the curve.

This refine slider is available for both global and local adjustments. However, it is only available for the point curve, not the parametric one with sliders (you can create a similar point curve though) or the R,G,B curves (which are intended for color adjustment).

What are the best ways to use refine saturation?

  • Any time you see unwanted saturation or hue shift with a curve, it is worth adjusting the slider to see if you get improved results.
  • Contrast boosts on skin tones or highly saturated colors are especially important cases, as boosting saturation further in either is typically unwanted.
  • If the “contrast” slider shows unwanted color shifts, try using an S-curve and its refine slider instead.
  • If you need to make tonal adjustments for highlights / shadows, you can use a local adjustment based on a luminance range and then use a local curve with refine slider.

Note that just like luminosity blend mode in PS, the refine slider may affect apparent luminosity of shadows and may not hold the hue visually constant. It helps quite a bit to try intermediate slider values to find the optimal results, and you may wish to make small tweaks to the curve after big moves in the refine slider.

Greg Benz Photography