I’m fairly frequently confused by the HDR thing. I don’t have any 4K displays in my house, although I do have an HDR-capable TV, but only a few apps/games have actually kicked HDR on (Fleabag on Amazon automatically shifted over to HDR for season 2, and honestly I found it didn’t look as good. It might just be that the title wasn’t converted/graded well).

I understand that there are separate tags/flags sent along with the picture info that tell the TV what color range and values to apply to the image. So my understanding (and this is the part where I want someone to please pick apart and explain to me, haha) is that when the stream is fast enough to make 4K HDR viable, it switches over from the SDR file to this 4K file. That 4K file is basically as flat as possible, because the HDR tags/flags need to be applied to it, and those tags will change on a scene-to-scene basis.
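To make the “what values to apply” part concrete for myself, I looked up the PQ (SMPTE ST 2084) transfer function that HDR10/Dolby Vision signals use. This is just the published standard math, nothing specific to Amazon or Disney+, and the Python below is only my own sketch of it:

```python
# PQ (SMPTE ST 2084) EOTF: maps a normalized HDR code value to an
# absolute luminance in nits. The constants come from the standard.
m1 = 2610 / 16384          # ~0.1593
m2 = 2523 / 4096 * 128     # ~78.84
c1 = 3424 / 4096           # ~0.8359
c2 = 2413 / 4096 * 32     # ~18.85
c3 = 2392 / 4096 * 32     # ~18.69

def pq_to_nits(code_value: int, bit_depth: int = 10) -> float:
    """Convert a PQ-encoded code value (0-1023 for 10-bit) to nits."""
    e = code_value / (2 ** bit_depth - 1)   # normalize to [0, 1]
    p = e ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

# Unlike SDR gamma, where "full white" is whatever the display decides,
# a PQ value pins down an exact brightness: 1023 -> 10,000 nits,
# and the 50% code value lands around only ~93 nits.
for cv in (0, 512, 767, 1023):
    print(cv, round(pq_to_nits(cv), 2), "nits")
```

And as I understand it, the metadata riding alongside (MaxCLL, mastering display info, the per-scene stuff in HDR10+/Dolby Vision) is what tells the TV how to tone-map values its panel can’t actually reach.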
But the 1080p files I’m getting when I watch any of the Star Wars movies are sourced from the same 4K master. Is the primary difference that their color grading isn’t sent along as additional picture information, but baked into the file/stream itself?
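The way I picture “baked in” (and please correct me if this is off) is something like the sketch below. The Reinhard curve is purely a stand-in I picked for illustration, not whatever trim pass the actual colorists do:

```python
# Hypothetical sketch of "baking in" an SDR grade: the HDR->SDR mapping
# is applied once, up front, and the resulting pixel values are what
# gets encoded. Reinhard here is just a stand-in for the real grade.
def bake_sdr_pixel(nits: float, sdr_peak: float = 100.0) -> int:
    """Tone-map an absolute HDR luminance down to an 8-bit SDR value."""
    x = nits / sdr_peak
    mapped = x / (1 + x)           # Reinhard curve: compresses highlights
    video = mapped ** (1 / 2.4)    # approximate SDR gamma encoding
    return round(video * 255)

# A 1,000-nit and a 4,000-nit highlight end up as nearly the same SDR
# code value -- that decision is locked into the file, no metadata left:
print(bake_sdr_pixel(100))    # diffuse white -> 191
print(bake_sdr_pixel(1000))   # -> 245
print(bake_sdr_pixel(4000))   # -> 252
```

Point being: once those SDR values are encoded, every TV shows roughly the same thing, and there’s nothing left for the display to reinterpret scene by scene.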
And while HDR colors/contrast should have more pop, depth, and vibrancy, there should still be a general similarity between what it was graded to look like at 2K/1080p and what it looks like in 4K, right? Like, you shouldn’t go from one to the other and have it look markedly different in color temperature and contrast levels, beyond the basic advantages the expanded range provides?
When I bring up Attack of the Clones in 1080p, it doesn’t look the same as my Blu-ray. When I bring up Star Wars, it DEFINITELY doesn’t look the same as my Blu-ray. That’s how it’s supposed to be even in SDR, correct?