poita said:
If using HDR to re-grade an old film, the colorists are usually working from the neg, not prints, and current scanners can capture the neg without the need for multiple transfers.
You are right that the original grade for older movies is totally different from what you could do with an HDR grade, and any home version is revisionism really; all home versions are radically different from the cinema version, HDR or not.
However, exactly what the grade becomes is up to the colourist. You could do a totally new grade that changes the experience into something completely new, or you could use the wider colour gamut and finer colour detail of HDR to create a home version of a movie that is closer to the original cinema release than ever before.
It is up to the people using it, their skill and intention. It is a huge step forward in visual fidelity and brings the recorded image much closer to reality. What directors and others choose to do with the tools is another thing altogether, but HDR can certainly be used to make versions of films that are more true to the originals.
That’s the part I’m actually sure about (via details at TheDigitalBits). For non-4k films, they start with the DI (digital intermediate), not the original negative, so they have a finished cut with completed effects and such. They may go back to the negative to re-extract the HDR, but I’ve not seen that specifically mentioned. While you could get more than 2k out of the original negatives, any visual effects wouldn’t be present, and a pre-fx HDR layer might stand out if you layered it onto a post-fx video.
What they don’t do is re-scan the original negatives, recut the film and recomposite effects and what-not. That would be a full restoration and too expensive.
You are completely right: theatrical presentations usually have a wider color gamut than anything brought to home theaters before now, but that’s just a portion of it.
The confusion comes from there being 3 things going on with UHD BDs.
-
The color: Theatrical colors are quite wide. Far superior to NTSC & PAL, and better than standard HD. Wider gamut is good, we can get closer to a theatrical presentation with UHD discs. Unfortunately, regular viewers probably wouldn’t know the difference (not sure I would). So it is a poor selling point.
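To put a rough number on “wider”, you can compare the area of each standard’s primary triangle on the CIE 1931 xy chromaticity diagram. This is only a crude sketch of gamut size (triangle area on the xy diagram is not a perceptual measure), using the published primaries for Rec.709 (HD/Blu-ray), DCI-P3 (digital cinema), and Rec.2020 (the UHD disc container):

```python
# Crude gamut-size comparison: area of each standard's RGB primary
# triangle on the CIE 1931 xy chromaticity diagram (shoelace formula).
def triangle_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

# Published xy chromaticities for the R, G, B primaries of each standard
rec709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # HD / Blu-ray
dci_p3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]  # digital cinema
rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]  # UHD container

for name, prims in [("Rec.709", rec709), ("DCI-P3", dci_p3), ("Rec.2020", rec2020)]:
    print(f"{name:8s} triangle area: {triangle_area(prims):.4f}")
```

Running it shows the ordering the post describes: Rec.2020 covers the largest triangle, DCI-P3 sits in between, and Rec.709 (what HD Blu-ray uses) is the smallest.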
-
The resolution: 4k is good for 4k films. That works best for 70mm prints and newer 4k digital films. Unfortunately, many (if not most) catalog titles, even if shot on film capable of 4k scans, were edited and finished at around 2k (or a bit better).
That means either a limited number of UHD BD titles, or upscaling films and hoping you don’t get sued… which leads us to why #3 exists.
-
HDR: Extracting more light information from the source material (probably with some computer jiggery-pokery) to ‘enhance’ the picture and make it look more natural or realistic.
Good or bad, this gives them the ability to do what I was talking about: upscale 2k to 4k and then create a 4k layer of HDR that can be overlaid on the film. You can then legitimately call it 4k even if the original print of the film you’re working with is only 2k.
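The two-step process described above (spatial upscale, then a higher-resolution light layer applied on top) can be sketched as a toy example. This is purely illustrative: real remastering pipelines are far more sophisticated, and the nearest-neighbour upscaling and the gain-map values here are invented for the sketch.

```python
# Toy sketch: (1) upscale a low-res luminance frame 2x, then
# (2) modulate it with a separately derived, full-resolution gain layer.
def upscale_2x(img):
    """Nearest-neighbour 2x upscale of a 2-D list of luminance values."""
    out = []
    for row in img:
        wide = [v for v in row for _ in (0, 1)]  # duplicate each column
        out.append(wide)
        out.append(list(wide))                   # duplicate each row
    return out

def apply_gain(img, gain):
    """Multiply each pixel by a same-sized per-pixel gain map."""
    return [[v * g for v, g in zip(r_img, r_gain)]
            for r_img, r_gain in zip(img, gain)]

# A 2x2 stand-in for the "2k" frame, upscaled to 4x4, then boosted by a
# 4x4 gain map standing in for the finer-grained HDR light information.
frame = [[0.2, 0.4],
         [0.6, 0.8]]
gain  = [[1.0, 1.0, 1.5, 1.5],
         [1.0, 1.0, 1.5, 1.5],
         [2.0, 2.0, 1.0, 1.0],
         [2.0, 2.0, 1.0, 1.0]]
hdr_frame = apply_gain(upscale_2x(frame), gain)
print(hdr_frame)
```

The point of the sketch is that the gain layer carries detail at the output resolution even though the picture detail underneath is only upscaled, which is exactly the loophole being described.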
HDR lets studios sell catalog titles as 4k when there was no other way they could remarket them! BD is the end of the upgrade treadmill without it.
Edit: I also wouldn’t worry about Dolby Vision HDR. They are late to the party, have low adoption from content providers, and limited TV-manufacturer support. One of the bigger ‘gets’ was Vizio, which has thrown a shoe this year by deciding its TVs are only “Home Theater Displays”, AKA dumb monitors. Without tuners, they can’t even legally be called TVs; without apps, they can’t be called ‘smart devices’. Vizio fans are pushing back.
In a few years, if Dolby Vision is still around, it may prove to be the superior HDR, but I think they’re going to end up on the losing side of the HDR format war.