
Info: High Dynamic Range

Author
Time
 (Edited)

Hi guys,

I’m starting to look into HDR, because it seems to be all the rage now, and some of the scans we have are done at such quality that the projects could definitely benefit from an HDR workflow and release. HDR is still quite new on the consumer market, so I’m not saying we’ll start seeing HDR preservations tomorrow, but it would be good to be ready when the time comes, and to keep it in mind for future-proofing projects.

I tried reading up on it but to be honest, I find the whole thing quite confusing - does anyone have any experience or insight they could share?

What tools and formats can be currently used to produce HDR renders and encodes that would fit the new home video HDR standards?

And is it possible to convert SDR footage to HDR? I realize there would probably be little benefit in doing this to an entire project, but it could matter in cases where we need to patch things up with SDR footage, like restoring missing frames using a BD release.

Author
Time
 (Edited)

HDR (high dynamic range) in photography is basically when you shoot the same scene three times: one shot at normal exposure, one at a lower exposure, and a third at a higher exposure. You then combine them to get a picture where the highlights are NOT blown out to white (think of the sky) and the shadows are dark but still show fine detail. This is in terms of photography, not video.

In terms of video, and what is seen on new 4K HDR TVs, the major benefit is less banding. HDR video on the new 4K Blu-rays is encoded with HEVC (e.g. the x265 encoder) at 10-bit, instead of the usual 8-bit.

[Image: 8-bit vs. 10-bit comparison]

You can encode traditional 8-bit video (current regular Blu-rays) to x265 10-bit and get very good compression while maintaining very high quality. I’ve been encoding some of my own personal Blu-rays to x265 10-bit and the results are quite good.

http://x264.nl/x264/10bit_02-ateme-why_does_10bit_save_bandwidth.pdf (PDF file)

Also, here is a short x265 10-bit clip I made from the DeSpec 2.7. The original was around 185MB; this is only 35MB. VLC and MPC-HC can play this file.
https://drive.google.com/file/d/0B9tjpukbbRggZWdtdlhNNzduT0k/view?usp=sharing (MKV file @ 720p)

x265 10-bit still has some trouble maintaining grain, but there wasn’t a whole lot of it in the original v2.7.
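For anyone who wants to try a similar encode, here is a rough sketch of my own using ffmpeg’s libx265 (a minimal example, not a definitive recipe; filenames and the CRF value are placeholders, adjust to taste):

```python
# Minimal sketch: re-encode an 8-bit Blu-ray remux to 10-bit HEVC.
# Requires an ffmpeg build with libx265; "source.mkv" is a placeholder name.
import subprocess

subprocess.run([
    "ffmpeg", "-i", "source.mkv",
    "-c:v", "libx265",
    "-pix_fmt", "yuv420p10le",  # store the video in 10 bits, even from an 8-bit source
    "-crf", "18",               # quality target: lower = better quality, bigger file
    "-preset", "slow",
    # "-tune", "grain",         # worth trying on grainy film sources
    "-c:a", "copy",             # pass the audio through untouched
    "output_10bit.mkv",
], check=True)
```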

As for encoders, I use Staxrip.

If someone else can offer more insight (and correct me if I’m wrong), that’d be great.
Hope this helps!

“Your eyes can deceive you. Don’t trust them.” – Obi-Wan Kenobi

Author
Time

Harmy said:
I tried reading up on it but to be honest, I find the whole thing quite confusing …

That’s what happens when step-up technology is put into the hands of the Ad department.

Who would junk their present HD-TV for some step-up technology that shows all their present media … looking the same? So it gets some hot new name that lets the imagination fly. (Ever see a store demo of a UHD-TV? The picture “pops”! Wow! Is that the new technology? Well … no, because the store-mode setting is “high saturation” and “high contrast”. But you can do the same thing at home with your lowly HD-TV, because one UHD panel is what they manufacture to break up into four panels for four HD-TVs.)

It really is more like this: more visible spectrum being saved in a bigger container, to hold the extra range. Digitally speaking, a few more bits for the extra colors. (Because our technology can’t affordably do more right now.)

That’s about as much as I can perceive without technical details – slow in coming because “an extra bit (or 2) per Red, Green, and Blue” wouldn’t sell more TVs. 😃

Author
Time
 (Edited)

I’m currently finishing off grading an HDR project; I’ll do a writeup when I’m done.
I have created HDR workflows for Resolve and Nucoda that work fairly well. Ideally you will want an HDR screen for monitoring, but it isn’t strictly necessary.

The comments above miss some key details. HDR for video and projection is not the same as what people call HDR in photography. For televisions and projectors it is all about colour detail, colour range, and light output, and controlling them.

The new TVs and projectors have staggeringly bright light output, so you can display scenes with a luminance range that wasn’t previously possible. They can put out enough light to make you wince: if the sun were in a scene, for example, instead of just being a white circle on screen as on a standard TV, it would blow out to nearly painful-to-look-at levels, like in real life.
Also there is a turdload™ more detail in the colours, and reds are finally red.

Good HDR sets look incredible; they will be an easy sell once people see them. Normal TVs alongside them look insipid and flat by comparison.

Grading is a challenge, but doable.

Donations welcome: paypal.me/poit
bitcoin:13QDjXjt7w7BFiQc4Q7wpRGPtYKYchnm8x
Help get The Original Trilogy preserved!

Author
Time

@ poita
Very cool! I guess we’ll have to beware when reading the specs, like HD-TVs and their phony “refresh rates”, if we want “the real thing”.

@ Anonymous105
So you encoded from 8-bit to 10-bit and it plays back at 8-bit? And that processing is enough to mitigate banding? If so, that’s unexpectedly good. Could you post a before-and-after screenshot that may demonstrate this? (Of course, I also like the idea of smaller file size at same/better quality.)

Author
Time

From what I understand, 8-bit vs 10-bit is just a small part of it; the real difference (and the real challenge in dealing with HDR video) is the Rec.709 vs Rec.2020 colour spaces.
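To illustrate the colourspace side: mathematically, putting linear Rec.709 RGB into a Rec.2020 container is just a 3×3 matrix away (the matrix below is the BT.2087 conversion; treat my transcription of it as an assumption). The colours don’t become wider, they are merely re-expressed in the bigger gamut, which is why patching an HDR project with SDR footage is at least plausible:

```python
# Minimal sketch: re-express linear (not gamma-encoded) Rec.709 RGB
# in Rec.2020 primaries, per the BT.2087 conversion matrix.
import numpy as np

M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rec709_to_rec2020(rgb_linear: np.ndarray) -> np.ndarray:
    """Map linear Rec.709 RGB triplets into the Rec.2020 container."""
    return rgb_linear @ M_709_TO_2020.T

# Pure Rec.709 red lands well inside the Rec.2020 gamut:
print(rec709_to_rec2020(np.array([1.0, 0.0, 0.0])))  # ~[0.6274 0.0691 0.0164]
```

The hard part is the grading and tone mapping around it, not this arithmetic.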

Author
Time

Thanks! I guess it’s going-to-HDR-school time for me. (Another week of free time – shot.)

Author
Time

Before we get into it, let’s set up the nomenclature so we can discuss it without confusion.

EOTF - The electro-optical transfer function of your (hopefully calibrated) display, typically defined by BT.1886 for normal HD displays.

HLG - The Hybrid Log-Gamma standard. Maximum peak luminance of 5000 nits.

Nits - A shorthand term for luminance (light output), measured in candelas per square metre (cd/m²).

SDR - Standard Dynamic Range (independent of resolution, e.g. SDR 1080P, SDR UHD, SDR 4K). As a delivery format this is usually DVD or Blu-ray in Rec.709 where your peak luminance is 100 nits (ST.2080-1 standard).

HDR - Video created to be displayed with much, much higher peak white levels (4000 nits on the Dolby Pulsar reference monitor, vs the 100 nits of SDR!). That is, instead of SDR’s BT.1886 EOTF, HDR uses an EOTF described by either HLG or the ST.2084 standard. There is no resolution or colourspace requirement for HDR, but typically P3 is being used, and once displays improve it will likely switch to Rec.2020. Delivery requires at least 10 bits, and mastering requires 12 bits, so both the colour gamut and the colour precision are vastly improved over SDR.
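To make the ST.2084 side concrete, here is a minimal sketch of the PQ EOTF it defines (constants as published in the SMPTE ST.2084 spec); it maps a normalized code value from a 10- or 12-bit signal to absolute luminance in nits:

```python
# Minimal sketch of the SMPTE ST.2084 (PQ) EOTF: normalized code value
# (0.0-1.0) -> absolute display luminance in cd/m² (nits), 10,000-nit ceiling.
m1 = 2610 / 16384        # ~0.1593
m2 = 2523 / 4096 * 128   # ~78.8438
c1 = 3424 / 4096         # ~0.8359
c2 = 2413 / 4096 * 32    # ~18.8516
c3 = 2392 / 4096 * 32    # ~18.6875

def pq_eotf(code: float) -> float:
    """Convert a normalized PQ code value to luminance in nits."""
    p = code ** (1 / m2)
    return 10000.0 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

print(pq_eotf(1.0))  # 10000.0 -- the absolute PQ peak
print(pq_eotf(0.5))  # ~92 -- half the code range is spent below ~100 nits
```

That last line is the point: PQ spends most of its code values on the dark end of the range, which is exactly where the eye (and banding) is most sensitive.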

DaVinci Resolve 12.5 currently supports three flavours of HDR: HLG, Dolby Vision, and HDR10 (using ST.2084). You can play with this via LUTs, but the best way is to use Resolve Colour Management in your project settings. However, if you want to play with Dolby Vision you will need the CMU hardware from Dolby themselves, so that is not all that likely for anyone on these boards, but I can go into how it works if anyone is interested.

Everyone else will be using SMPTE ST.2084 or HLG. These don’t require any licensing fees or special equipment, other than an HDR monitor for grading and setting Resolve to the correct settings.

I will write more later when I have time, but I do want to say: once you have seen a good HDR TV or projector, you will never want to watch a normal display again. Even the most non-videophile person can see the difference instantly; it is way, way more impressive than the jump from 1080p to UHD, or even from DVD to BD.

Donations welcome: paypal.me/poit
bitcoin:13QDjXjt7w7BFiQc4Q7wpRGPtYKYchnm8x
Help get The Original Trilogy preserved!

Author
Time

Something else to mention is that, as usual, TV manufacturers are being a little creative with their specifications when it comes to the nits output of their HDR TVs.

The current luminance king, the Samsung KS9500, for example, is rated at 1400 candelas, but that is only with 10% of the screen at maximum white; it drops off to 500 candelas at 100% white.

You can see how the current models fare here:
http://www.rtings.com/tv/tests/movies/hdr/peak-brightness

Broadcast HDR monitors are still crazily expensive; I think the Sony is the cheapest, at around $42,000 Australian Dollarydoos, for their BVM-X300 30" monitor. Hopefully FSI will come up with something more affordable.

Donations welcome: paypal.me/poit
bitcoin:13QDjXjt7w7BFiQc4Q7wpRGPtYKYchnm8x
Help get The Original Trilogy preserved!

Author
Time
 (Edited)

Everyone else pointed out some good stuff as well!

@ Spaced Ranger
In terms of encoding 8-bit to 10-bit and playing it back at 8-bit, it really depends on the source and how far you are compressing. Encoding from the highest-quality source is crucial. Other than the new 4K Blu-rays, I’m not aware of any other high-quality 10-bit source available to the general public.

I tried to put together a quick comparison here from the movie Saving Private Ryan; granted, this movie is probably a worst-case scenario for good encoding at small file sizes, but still, 45GB vs. 7.8GB is not too bad. Most of my encodes come in right around 2GB and still look extremely good compared to the Blu-ray.
http://screenshotcomparison.com/comparison/175765
I also hope you checked out the short x265 clip from DeSpecialized 2.7 in my last post.

The main benefit I see in encoding a Blu-ray source (or other high-quality source) to x265 10-bit is the small file size while maintaining, I’d say, at least 90% of the original quality.

It also depends on whether the movie has a lot of action, whether the camera moves a lot, and whether colors stay similar throughout a scene.

I’ve learned a lot already just reading through this page.

“Your eyes can deceive you. Don’t trust them.” – Obi-Wan Kenobi

Author
Time

Anonymous105 said:

HDR (high dynamic range) in photography is basically when you shoot the same scene three times: one shot at normal exposure, one at a lower exposure, and a third at a higher exposure. You then combine them to get a picture where the highlights are NOT blown out to white (think of the sky) and the shadows are dark but still show fine detail.

If someone else can offer more insight (and correct me if I’m wrong), that’d be great.
Hope this helps!

This is not at all what HDR video is. HDR for photography (bracketing and combining exposures) is a totally different animal.

Donations welcome: paypal.me/poit
bitcoin:13QDjXjt7w7BFiQc4Q7wpRGPtYKYchnm8x
Help get The Original Trilogy preserved!

Author
Time
 (Edited)

@ poita
HDR video is indeed different; I was referring specifically to the photography side in my first post. I should have made that clearer; my apologies.

I don’t know too much about HDR video, but I am quickly learning about the x265 10bit compression side of things.

“Your eyes can deceive you. Don’t trust them.” – Obi-Wan Kenobi

Author
Time
 (Edited)

My understanding is that HDR video isn’t all that different from HDR photography, except that instead of photographing the scene more than once, they transfer the print more than once.

It’s still somewhat unclear exactly what is being done, and there are no public tools for transferring or encoding in this format yet.

Many films have a digital intermediate (DI) at approximately 2K (this includes post-production work and effects). They then do some ‘magic’ to extract the brightest and darkest regions from the film, create an artificial 4K LAYER that they place on top of the 2K DI, and re-colour-time it to take advantage of the wider Rec.2020 colours.

Don’t get me wrong, there are true 4K films, but percentage-wise not as many as will be released as UHD. The process above allows them to call a film UHD even though there is upscaling involved.

The important thing to realize is that this is just 21st-century colorization. Directors and cinematographers of today barely understand the technology right now, and certainly none of those who shot films before the last few years intended their films to look the way UHD/HDR discs are being created.

I’ll watch new films intended for HDR, but applying the tech to older films is a travesty, just like colorizing black and white.

By the way, it should be mentioned that the open-source x265 encoder is still immature and inferior to x264. A good alternative, if you are doing your own 10-bit encodes, is Hi10P. It’s less compatible with hardware devices, but it is a mature part of x264 that allows 10-bit color.
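If anyone wants to experiment with the Hi10P route, here is a rough sketch via ffmpeg (assuming a build whose libx264 has 10-bit support enabled; filenames and CRF are placeholders):

```python
# Minimal sketch: Hi10P, i.e. 10-bit H.264 via libx264. Requesting a 10-bit
# pixel format selects the High 10 profile (needs a 10-bit-capable libx264).
import subprocess

subprocess.run([
    "ffmpeg", "-i", "source.mkv",
    "-c:v", "libx264",
    "-pix_fmt", "yuv420p10le",  # triggers the Hi10P (High 10) profile
    "-crf", "18", "-preset", "slow",
    "-c:a", "copy",
    "output_hi10p.mkv",
], check=True)
```

Bear in mind the hardware-compatibility caveat above: very few set-top players decode Hi10P.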

Dr. M

Author
Time

If using HDR to re-grade an old film, they are usually working from the neg, not prints, and current scanners are able to capture the neg without the need for multiple transfers.

You are right that the original grade for older movies is totally different from what you could do with an HDR grade, and any home version is revisionism really; all home versions are radically different from the cinema version, HDR or not.

However, exactly what the grade becomes is up to the colourist. You could do a totally new grade to change the experience into something completely new, or you could use the wider colour gamut and finer colour detail of HDR to create a home version of a movie that is closer to the original cinema release than ever before.

It is up to the people using it, their skill and their intention. It is a huge step forward in visual fidelity and brings the recorded image much closer to reality. What directors and others choose to do with the tools is another thing altogether, but HDR can certainly be used to make versions of films that are more true to the originals.

Donations welcome: paypal.me/poit
bitcoin:13QDjXjt7w7BFiQc4Q7wpRGPtYKYchnm8x
Help get The Original Trilogy preserved!

Author
Time

Question: in your workflow, do you have to keep in mind the two different HDR standards currently used in TVs? HDR10 uses 10-bit, Dolby Vision uses 12-bit. Whether that could become a problem for private scans, I don’t know. I mean, do you have to think about that when you want to export to a final (possibly H.265) file?
Would appreciate an answer.

Do, or do not. There is no “try”.

Author
Time
 (Edited)

It was dat right toyn, doc …

.

@ Anonymous105

These newer encoders do a nice job. And I did watch the vid … looks great. Time to bid a fond farewell to MPEG-2.

I was trying to see how well banding was handled (but I’m not aware of any particular shot where it would originally show).

.

poita said:
Broadcast HDR monitors are still crazily expensive; I think the Sony is the cheapest, at around $42,000 Australian Dollarydoos, for their BVM-X300 30" monitor.

Sounds very affordable … in cartoon currency (at least Australia has bigger numbers in their bucks than the U.S. does) 😃

Author
Time
 (Edited)

poita said:

If using HDR to re-grade an old film, they are usually working from the neg, not prints, and current scanners are able to capture the neg without the need for multiple transfers.

You are right that the original grade for older movies is totally different from what you could do with an HDR grade, and any home version is revisionism really; all home versions are radically different from the cinema version, HDR or not.

However, exactly what the grade becomes is up to the colourist. You could do a totally new grade to change the experience into something completely new, or you could use the wider colour gamut and finer colour detail of HDR to create a home version of a movie that is closer to the original cinema release than ever before.

It is up to the people using it, their skill and their intention. It is a huge step forward in visual fidelity and brings the recorded image much closer to reality. What directors and others choose to do with the tools is another thing altogether, but HDR can certainly be used to make versions of films that are more true to the originals.

That’s the part I’m actually sure about (via details at TheDigitalBits). For non-4K films, they start with the DI, not the original negative, so that they have a finished cut with completed effects and such. They may go back to the negative to re-extract the HDR, but I’ve not seen that specifically mentioned. While you could get more than 2K out of the original negatives, any visual effects wouldn’t be present and might stand out if you layered a pre-FX HDR layer onto post-FX video.

What they don’t do is re-scan the original negatives, recut the film, and recomposite the effects and whatnot. That would be a full restoration, and too expensive.

You are completely right that theatrical presentation usually has a wider color gamut than what has been brought to home theaters before now, but that’s just a portion of it.

The confusion comes from there being three things going on with UHD BDs:

  1. The color: Theatrical colors are quite wide, far superior to NTSC & PAL and better than standard HD. A wider gamut is good; we can get closer to a theatrical presentation with UHD discs. Unfortunately, regular viewers probably wouldn’t know the difference (I’m not sure I would), so it is a poor selling point.

  2. The resolution: 4K is good for 4K films. That works best for 70mm prints and newer 4K digital films. Unfortunately, many or most catalog titles, even if shot on film capable of 4K scans, were edited and finished around 2K (or a bit better). That means either a limited number of UHD BD titles, or upscaling films and hoping you don’t get sued… which leads us to why #3 exists.

  3. HDR: Extracting more light information from the source material (probably with some computer jiggery-pokery) to ‘enhance’ the picture and make it look more natural or realistic.

Good or bad, this gives them the ability to do what I was talking about: upscale 2K to 4K and then create a 4K layer of HDR that can be overlaid on the film. You can then legitimately call it 4K even if the original print of the film you’re working with is only 2K.

HDR lets studios sell catalog titles as 4K when there was no other way they could remarket them! Without it, BD is the end of the upgrade treadmill.

Edit: I also wouldn’t worry about Dolby Vision HDR. They are late to the party, with low adoption from content providers and limited TV-manufacturer support. One of their bigger ‘gets’ was Vizio, which has thrown a shoe this year by deciding their TVs are only “Home Theater Displays”, AKA dumb monitors. Without tuners they can’t even legally call them TVs; without apps they can’t call them ‘smart devices’. Vizio fans are pushing back.

In a few years, if Dolby Vision is still around, it may prove the superior HDR, but I think they’re going to end up the losers in the HDR format war.

Dr. M

Author
Time

monol0g said:

Question: in your workflow, do you have to keep in mind the two different HDR standards currently used in TVs? HDR10 uses 10-bit, Dolby Vision uses 12-bit. Whether that could become a problem for private scans, I don’t know. I mean, do you have to think about that when you want to export to a final (possibly H.265) file?
Would appreciate an answer.

As I am scanning in 16-bit and then mastering in 12-bit, it isn’t much work to create a version from there in whichever standard one wants.

However, when doing the grade, the workflow is very different depending on which direction you choose, which I will go into later.

Remember, the range of different HDR TVs varies greatly, so it depends what your aim is in creating an HDR version. It doesn’t make a difference for my own viewing, as that happens in 12-bit 4:4:4, but for mainstream delivery, yes, there are a lot of considerations.
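For anyone curious what the bit-depth step of such a workflow looks like in principle, here is a minimal numpy sketch (with made-up frame data) of taking a 16-bit scan down to a 10-bit deliverable; a touch of dither before quantising trades banding for a little noise:

```python
# Minimal sketch: quantize a 16-bit scan to 10 bits, with optional dither.
import numpy as np

def to_10bit(frame16: np.ndarray, dither: bool = True) -> np.ndarray:
    """Reduce 16-bit integer image data to 10-bit codes (0-1023)."""
    x = frame16.astype(np.float64) / 65535.0             # normalize to 0.0-1.0
    if dither:
        x += (np.random.random(x.shape) - 0.5) / 1023.0  # break up banding
    return np.clip(np.round(x * 1023.0), 0, 1023).astype(np.uint16)

# Made-up stand-in for one scanned UHD frame:
frame = np.random.randint(0, 65536, (2160, 3840, 3), dtype=np.uint16)
print(to_10bit(frame).max() <= 1023)  # True
```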

Donations welcome: paypal.me/poit
bitcoin:13QDjXjt7w7BFiQc4Q7wpRGPtYKYchnm8x
Help get The Original Trilogy preserved!

Author
Time
 (Edited)

Here is a good read comparing the two HDR formats: http://www.hdtvtest.co.uk/news/dolby-hdr-201606214303.htm

What is a bit misleading is that it shows Dolby Vision using a much higher peak brightness (1000-10000 nits!!! (that burning in your eyes will never go away)) than HDR10 (540-4000 nits), but they had to lobby the UHD Alliance to accept that as a trade-off for the fact that Dolby Vision has much poorer black levels: 0.05 nits, whereas HDR10 reaches 0.0005 nits.

So while DV may go much, much brighter, its articulation of dark regions should be worse.
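Running the numbers from that article makes the trade-off plain; sequential contrast is just peak luminance divided by black level (purely illustrative arithmetic using the figures quoted above):

```python
# Contrast ratio = peak nits / black-level nits, using the figures above.
specs = {
    "Dolby Vision": (10000, 0.05),   # brighter peak, poorer blacks
    "HDR10":        (4000, 0.0005),  # dimmer peak, far deeper blacks
}
for name, (peak_nits, black_nits) in specs.items():
    print(f"{name}: {peak_nits / black_nits:,.0f}:1")
# Dolby Vision: 200,000:1
# HDR10: 8,000,000:1
```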

Dr. M