
Post #1306740

Author
Jay
Parent topic
4K restoration on Star Wars
Link to post in topic
https://originaltrilogy.com/post/id/1306740/action/topic#1306740
Date created
21-Nov-2019, 12:21 PM

DrDre said:

Turns out the problem with washed-out colors and a flat image is at least partly caused by the HDR settings on the display device. HDR can look radically different from one device to another (there are no real standards for HDR).

I hear this said frequently, and it’s partially true, especially on the display side of things. There are standards for creating HDR content (like the PQ EOTF curve), but there are few guidelines for translating that data for display. If video is mastered with a peak level of 4000 nits and the display can’t actually hit 4000 nits (none of them can at this point, I think), the display has to tone map the image to fit within its own dynamic range, at which point it comes down to the manufacturer’s goals and philosophy. They can either maximize dynamic range (render all the highlight detail while reducing average brightness) or sacrifice some dynamic range to maintain average brightness (keep the average picture bright and crush some highlight detail). But since we’re talking about Dolby Vision here, these decisions are being made in the material itself rather than in the display (unless the display doesn’t support Dolby Vision and it falls back to the HDR10 layer).
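To make that trade-off concrete, here’s a toy sketch in Python. This is my own illustration, not any manufacturer’s actual algorithm, and the 700-nit display peak is just an assumed value for the example:

```python
# A toy model (not any manufacturer's actual algorithm) of the two
# tone-mapping philosophies: content mastered to 4000 nits shown on a
# hypothetical 700-nit display.

MASTER_PEAK = 4000.0   # peak nit level the content was graded against
DISPLAY_PEAK = 700.0   # peak nit level the panel can actually reach (assumed)

def preserve_detail(y_nits):
    """Philosophy 1: maximize dynamic range. Scale the whole range down
    so every highlight gradation survives -- average brightness drops."""
    return y_nits * (DISPLAY_PEAK / MASTER_PEAK)

def preserve_brightness(y_nits):
    """Philosophy 2: maintain average brightness. Reproduce mid-tones at
    their mastered level and clip what the panel can't reach -- highlight
    detail above the display peak is crushed."""
    return min(y_nits, DISPLAY_PEAK)

for y in (100, 500, 1000, 4000):  # a mid-tone, two highlights, the peak
    print(f"{y:>5} nits -> detail-first: {preserve_detail(y):6.1f}, "
          f"brightness-first: {preserve_brightness(y):6.1f}")
```

With the first approach a 100-nit mid-tone gets dimmed to 17.5 nits but nothing ever clips; with the second, mid-tones are untouched but everything from 700 to 4000 nits collapses into a single value. Real displays use smoother roll-off curves somewhere between these extremes.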

Are you optimizing your display with settings specific to the OT, or are you referring to general optimizations for all HDR content? This is a case where I think Sony’s approach in their consumer displays is the right one: you calibrate the display for SDR, and the display then performs the necessary calculations and adjustments to render HDR content.

Your screenshots do look great, so I might sign up just to check out these transfers.

Fang Zei said:

Jay said:

HDTVTest evaluates the OT and ST in HDR on Disney+

https://www.youtube.com/watch?v=VGZmMjPJiAk

Vincent of HDTVTest is a respected reviewer and display calibrator. He doesn’t get into color grading or anything like that, but it’s an interesting look into whether the OT on Disney+ is true HDR or just some contrast tweaks (spoiler: it’s not real HDR).

I wonder how much of the issue is “fake HDR.” They may simply have graded it that way intentionally, giving it a restrained HDR pass. It may also be that there isn’t a whole lot of dynamic range left to squeeze out of the camera negatives at this point, if there ever was much to begin with.

But if I had to guess, I’d say that even if they could have gotten more HDR “pop,” they still chose not to, if only so that the OT would still recognizably look like the OT. For all the time they’ve spent keeping the OOT buried, Lucasfilm sure seems to be striving for authenticity as far as the color and contrast go.

It’s possible, but as soon as you take film that was intended for a large screen in a dark room at 16 fL and transfer it to a digital medium for viewing on a consumer display calibrated for 100 nits (~30 fL, and that’s conservative), it no longer looks like projected film anyway. I can appreciate wanting to maintain the aesthetic (I watched the Criterion edition of Scanners back in October, and it looked about as close to film as you can possibly get on video, and I loved it), but we are talking about movies with glowing laser swords, laser pistols, gleaming golden robots, and big explosions in this case.
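For anyone checking those numbers, the conversion between foot-lamberts and nits is a fixed factor (1 fL = 3.426 cd/m², i.e., nits):

```python
# Sanity-checking the brightness figures above.
# 1 foot-lambert (fL) = 3.426 cd/m^2 (nits).

FL_TO_NITS = 3.426

print(f"16 fL film projection   = {16 * FL_TO_NITS:.0f} nits")  # ~55 nits
print(f"100-nit SDR calibration = {100 / FL_TO_NITS:.0f} fL")   # ~29 fL
```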

It’s also important to note that the goal of HDR (“High Dynamic Range”) isn’t necessarily eye-searing brightness levels, but high peak brightness (which would be seen sparingly in the objects noted above) with extra dynamic range for everything else. This is where Dolby Vision (and I suppose HDR10+) shines, allowing adjustments on a scene-by-scene basis.
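As a rough illustration of why that scene-by-scene capability matters, here’s a small Python sketch. The “tone curve” is a deliberately crude linear compression, and all the nit values are invented for the example:

```python
# Static metadata (HDR10) sizes one tone curve for the brightest moment
# in the whole film; dynamic metadata (Dolby Vision / HDR10+) lets each
# scene be mapped against its own measured peak.

DISPLAY_PEAK = 700.0   # hypothetical panel capability, in nits
FILM_PEAK = 4000.0     # static metadata: the whole film's maximum

def tone_map(y_nits, reference_peak):
    """Compress linearly against a reference peak (never brighten)."""
    return y_nits * min(1.0, DISPLAY_PEAK / reference_peak)

scenes = {"dim interior": 150.0,      # per-scene peaks (dynamic metadata)
          "lightsaber duel": 4000.0}

highlight = 150.0  # a 150-nit highlight that appears in both scenes
for name, scene_peak in scenes.items():
    print(f"{name:>15}: static -> {tone_map(highlight, FILM_PEAK):5.1f} nits, "
          f"dynamic -> {tone_map(highlight, scene_peak):5.1f} nits")
```

With a single static curve, the dim scene’s 150-nit highlight gets pulled down to ~26 nits just because a lightsaber duel elsewhere in the film hits 4000; with per-scene metadata it can be shown at its intended intensity.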

I think Vincent’s key point isn’t that we get lower peak brightness, but that it’s indicative of reduced dynamic range.