Is the DNR so bad just so Jar-Jar Binks doesn’t look so out of place?
I’ve been watching the PT in fits and starts over the past couple of days, 20 or 30 minutes here and there when I need a break from writing papers. Anyway, I’m now into ROTS, so I’ve got an idea of how each looks. I’d say TPM is the worst-looking; the color’s a definite improvement, but it suffers from the same weird DNR waxiness as the Blu-ray. AOTC is probably the most improved: the quality of the upscaling is impressive, and the HDR does a reasonably effective job of selling the green-screen environments. ROTS doesn’t look too different from the Blu-ray through a 4K player. The HDR is nice but doesn’t really transform the look of the movie like it does in Clones, and the upscaling doesn’t look any better or worse to me than what my 4K disc player does to the Blu-ray on the fly.
Maybe my eyes are deceiving me, but I’m pretty sure that the atmosphere around Tatooine in ROTJ is no longer there.
HDTVTest evaluates the OT and ST in HDR on Disney+
https://www.youtube.com/watch?v=VGZmMjPJiAk
Vincent of HDTVTest is a respected reviewer and display calibrator. He doesn’t get into color grading or anything like that, but it’s an interesting look into whether the OT on Disney+ is true HDR or just some contrast tweaks (spoiler: it’s not real HDR).
MTFBWY…A
Turns out the problem with washed-out colors and a flat image is at least partly caused by the HDR settings on the display device. HDR can look radically different from one device to another (there are no real standards for HDR), and after optimizing my TV settings I can now report that at least the 4K HDR transfer of the OT looks quite stunning, with great detail and color, and is actually surprisingly faithful to the 1997 SE prints. Just to get an idea of what the 4K color grading looks like with these optimized HDR settings, here are a few photos I took of my screen (the image is less contrasty and saturated in reality, but so are photos of a 35mm screening, so you’ll get the idea):
Here’s a second set:
Here are a few shots of Empire and Jedi:
@DrDre: The grading is really great, I have to admit.
Vincent of HDTVTest is a respected reviewer and display calibrator. He doesn’t get into color grading or anything like that, but it’s an interesting look into whether the OT on Disney+ is true HDR or just some contrast tweaks (spoiler: it’s not real HDR).
I wonder how much of the issue is “fake HDR.” They may have simply intentionally graded it that way, giving it a restrained HDR pass. It may also be that there isn’t a whole lot of dynamic range to squeeze out of the camera negatives at this point, if there ever was to begin with.
But if I had to guess, I’d say that even if they could have gotten more HDR “pop” they still chose not to, if only so that the OT would still recognizably look like the OT. For all the time they’ve spent keeping the OOT buried, Lucasfilm sure seems to be striving for authenticity as far as the color and contrast goes.
I was thinking the same thing as I was watching that video from Vincent linked above. From what I’ve seen of movies and TV, unless something was intentionally shot very bright, the bright areas aren’t really that bright (to avoid loss of detail). The Star Wars OT has been criticized in the past for not getting the brightness right and for crushing the blacks. I bet they avoided using the full extent of HDR to avoid criticism and to keep it more faithful to the original look. I don’t think it really matters that the cores of the lightsabers and other FX in the OT aren’t as bright as in the ST.
Turns out the problem with washed-out colors and a flat image is at least partly caused by the HDR settings on the display device. HDR can look radically different from one device to another (there are no real standards for HDR)
I hear this said frequently and it’s partially true, especially on the display side of things. There are standards for creating HDR content (like the PQ EOTF curve), but there are few guidelines for translating that data for display. If video is mastered with a peak level of 4000 nits and the display can’t actually hit 4000 nits (none of them can at this point, I think), the display has to tone map the image to fit within its own dynamic range, at which point it comes down to the manufacturer’s goals and philosophy. They can either maximize dynamic range (render all the highlight detail while reducing average brightness) or sacrifice some dynamic range to maintain average brightness (keep a high average picture level and crush some highlight detail). But since we’re talking about Dolby Vision here, these decisions are being made in the material itself rather than in the display (unless the display doesn’t support Dolby Vision and it falls back to the HDR10 layer).
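To make that trade-off a bit more concrete, here’s a toy Python sketch (my own illustration, not Dolby’s or any TV manufacturer’s actual algorithm): it decodes a PQ (ST 2084) code value to nits, then maps it onto a hypothetical 700-nit panel in the two ways described above, hard clipping versus a simple highlight roll-off. The knee position and roll-off formula are invented purely for illustration.

```python
# Toy illustration of display-side tone mapping, not any real TV's algorithm.
# The ST 2084 constants are standard; the 700-nit panel, the knee position,
# and the roll-off curve are assumptions made up for this example.

def pq_eotf(code_value: float) -> float:
    """Decode a normalized SMPTE ST 2084 (PQ) code value in [0, 1] to nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = code_value ** (1 / m2)
    y = max(e - c1, 0.0) / (c2 - c3 * e)
    return 10000.0 * y ** (1 / m1)

def tonemap_clip(nits: float, display_peak: float) -> float:
    """Keep average brightness, discard highlight detail above the panel's peak."""
    return min(nits, display_peak)

def tonemap_rolloff(nits: float, display_peak: float, knee: float = 0.75) -> float:
    """Compress everything above a knee toward the panel's peak, preserving
    highlight gradation at the cost of dimming bright parts of the image."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    headroom = display_peak - knee_nits
    excess = nits - knee_nits
    # Approaches display_peak asymptotically instead of hard clipping.
    return knee_nits + headroom * excess / (excess + headroom)

if __name__ == "__main__":
    panel_peak = 700.0  # hypothetical consumer display
    for code in (0.5, 0.75, 0.9):  # roughly 90, 1000, and 3900 nits on a PQ master
        master_nits = pq_eotf(code)
        print(f"{master_nits:7.1f} nits -> clip {tonemap_clip(master_nits, panel_peak):6.1f}"
              f" / roll-off {tonemap_rolloff(master_nits, panel_peak):6.1f}")
```

The clip version leaves mid-tones and moderate highlights alone but flattens everything above 700 nits into a single value; the roll-off version keeps some gradation up there but dims highlights the panel could otherwise have shown at full brightness. That’s the goals-and-philosophy choice in a nutshell.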
Are you optimizing your display with settings specific to the OT or are you referring to general optimizations for all HDR content? This is a case where I think Sony’s approach in their consumer displays is the right one; you calibrate the display for SDR, then the display performs the necessary calculations and adjustments to render HDR content.
Your screenshots do look great, so I might sign up just to check out these transfers.
I wonder how much of the issue is “fake HDR.” They may have simply intentionally graded it that way, giving it a restrained HDR pass. It may also be that there isn’t a whole lot of dynamic range to squeeze out of the camera negatives at this point, if there ever was to begin with.
But if I had to guess, I’d say that even if they could have gotten more HDR “pop” they still chose not to, if only so that the OT would still recognizably look like the OT. For all the time they’ve spent keeping the OOT buried, Lucasfilm sure seems to be striving for authenticity as far as the color and contrast goes.
It’s possible, but as soon as you take film that was intended for a large screen in a dark room at 16 fL and transfer it to a digital medium for viewing on a consumer display calibrated for 100 nits (~30 fL, and that’s conservative), it no longer looks like projected film anyway. I can appreciate wanting to maintain the aesthetic (I watched the Criterion edition of Scanners back in October, and it looked about as close to film as you can possibly get on video, and I loved it), but we are talking about movies with glowing laser swords, laser pistols, gleaming golden robots, and big explosions.
It’s also important to note that the goal of HDR (“High Dynamic Range”) isn’t necessarily eye-searing brightness levels, but high peak brightness (which would be seen sparingly in the objects noted above) with extra dynamic range for everything else. This is where Dolby Vision (and I suppose HDR10+) shines, allowing adjustments on a scene-by-scene basis.
I think Vincent’s key point isn’t that we get lower peak brightness, but that it’s indicative of reduced dynamic range.
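Incidentally, the fL/nits comparison above is easy to sanity-check, since the two units differ only by a constant factor (1 foot-lambert ≈ 3.426 cd/m², i.e. nits). A trivial snippet, with the example levels taken from this post rather than any official spec:

```python
# 1 foot-lambert ~= 3.426 cd/m^2 (nits); the sample values are just the ones
# discussed above (16 fL projection, 100-nit SDR home reference).
FL_TO_NITS = 3.426

def fl_to_nits(fl: float) -> float:
    return fl * FL_TO_NITS

def nits_to_fl(nits: float) -> float:
    return nits / FL_TO_NITS

print(fl_to_nits(16))   # ~55 nits for a 16 fL projected image
print(nits_to_fl(100))  # ~29 fL for a 100-nit SDR display, hence the "~30 fL" above
```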
@Dr Dre - it’s a little sad to see those shots, knowing it’s the crap SEs that I’ll never watch again. Not saying the new transfers are perfect, but if those were the OOTs I’d finally be content.
Unfortunately I’ve now begun to have a little hope again that they could still release them when the supposed 4K discs come out next year. Just setting myself up for further disappointment, I guess…
“In the future it will become even easier for old negatives to become lost and be “replaced” by new altered negatives. This would be a great loss to our society. Our cultural history must not be allowed to be rewritten.” - George Lucas
Are you optimizing your display with settings specific to the OT or are you referring to general optimizations for all HDR content? This is a case where I think Sony’s approach in their consumer displays is the right one; you calibrate the display for SDR, then the display performs the necessary calculations and adjustments to render HDR content.
In this case I optimized the display for HDR content in general, which always appeared somewhat dark, flat, and desaturated on my TV screen compared to SDR content, even if the dynamic range was obviously increased. So I now have two customized profiles on my TV: one for SDR and one for HDR content.
@Dr Dre - it’s a little sad to see those shots, knowing it’s the crap SEs that I’ll never watch again. Not saying the new transfers are perfect, but if those were the OOTs I’d finally be content.
Unfortunately I’ve now begun to have a little hope again that they could still release them when the supposed 4K discs come out next year. Just setting myself up for further disappointment, I guess…
While I would obviously very much like to see an OOT release, I probably disliked the revisionist color grading even more than the changes. At least these versions look like the Star Wars I remember 95% of the time. So, from my perspective the OOT has finally been 95% restored to its former glory. Now for the remaining 5%…
“In this case I optimized the display for HDR content in general, which always appeared somewhat dark, flat, and desaturated on my TV screen compared to SDR content, even if the dynamic range was obviously increased.”
The problem is that most people have their TVs set with very high SDR brightness levels.
At least for me the recommended SDR brightness of 100 to 200 nits (according to sources, most mention 100) is just too dark…
So yes, HDR will look dark in comparison, but that’s how it was mastered…
Exactly. You can’t take something mastered for 4000 nits (or 10,000 nits in some cases), smash it into 700 nits, and expect it to look as punchy (on average) as SDR content when most consumers adjust their settings to something well above the recommended 100 nits. This is why some manufacturers (Sony, for one) clip some highlight detail in HDR in favor of maintaining a higher APL.
The HDR spec is well ahead of today’s display capabilities, so it’s going to take some time for the full benefits to present themselves.
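As a back-of-the-envelope example of why HDR can read as “dim” in that situation, here’s a tiny sketch; the 100-nit SDR reference is standard, but the “as watched” and diffuse-white numbers are assumptions I picked for illustration, not measurements:

```python
# Illustrative numbers only: 100 nits is the SDR reference white; the other
# two values are assumed for the sake of the example.
sdr_reference_white = 100   # nits, what SDR content is nominally mastered for
sdr_as_watched      = 300   # nits, an assumed "bright living room" SDR setting
hdr_diffuse_white   = 200   # nits, where an HDR grade might place ordinary whites

print(f"SDR is being watched {sdr_as_watched / sdr_reference_white:.1f}x brighter than mastered")
print(f"HDR diffuse white lands at {hdr_diffuse_white / sdr_as_watched:.2f}x that SDR level")
# Ordinary scene content can therefore measure darker in HDR than in the
# viewer's boosted SDR mode, even though HDR has far more highlight headroom.
```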
I don’t have my SDR set to very high brightness levels. I don’t like high contrast, brightness, and saturation on my TV. I set the colors to a natural profile for SDR content, which generally means I have to adjust, since most TVs by default go for high contrast and punchy colors.
What do you use to calibrate your display?
For my own viewing pleasure, I optimize my TV to my personal preferences, so I wouldn’t call it calibrating.
Fair enough, and totally valid. I like to clarify this sort of thing so when I see screenshots, I have some context.
I’m the same way as DrDre: natural color palette and nice contrast (good detail in the highlights and in dark scenes), but nothing too fancy.
Edit: With a slight pop effect.
One day we will have properly restored versions of the Original Unaltered Trilogy (OUT); or 1977, 1980, 1983 Theatrical released versions (Like 4K77,4K80 and 4K83); including Prequels. So that future generations can enjoy these historic films that changed cinema forever.
Yoda: Try not, do or do not, there is no try.
DrDre, can you look at AOTC again now that you’ve made adjustments to your display? Although nowhere near as atrocious as the Blu-ray’s teal shift, I’m still seeing a light teal shift. It now looks like a cross between the DVD and the Blu-ray.
I’m still watching the 1080p HD D+ version, as I don’t have any 4K tech yet.
Thanks.
All modern TVs come with a “calibrated” mode, usually called “movie” or “cinema”.
That’s quite close to calibrated settings, and unless you have measurement equipment, it’s as good as you will get.
If your TV is 1-2 years old it should be quite good really.
I have my TV set like this (movie), but with higher brightness than the default for SDR. I see no problem with personal preference, as long as you keep the baseline in mind.
You can check this even with no equipment.
Whites on the TV should match these references 😉
Having looked at the PT in general, my conclusion is that TPM looks the worst and ROTS looks the best. TPM has more DNR applied to it, and its colors are more contrasty and saturated than any of the other films. I was surprised how orange the faces are in the scene where Obi-Wan and Qui-Gon discuss Anakin being a vergence. Overall I think this transfer is somewhat subpar compared to other films based on a 2K DI. I was surprised how good AOTC looks, given it was filmed with first-generation digital cameras. If a teal effect was applied to the HDR color grading, it was done very conservatively and with good taste, imo. Skies generally look pretty natural. ROTS is pretty much comparable to a modern-day transfer based on a 2K DI. The colors in AOTC and ROTS are a bit more saturated than in the OT, but I think it fits the material. The HDR color grading in my view improves the blend of practical and digital effects, particularly for AOTC.
Will the forthcoming “filmmaker mode” standard apply to colors/brightness/contrast or just to other picture settings like the soap opera effect?
I would put this in my sig if I weren’t so lazy.