“In this case I optimized the display for HDR content in general, which always appeared somewhat dark, flat, and desaturated on my TV screen compared to SDR content, even though the dynamic range was obviously greater.”
The problem is that most people have their TVs set to very high SDR brightness levels.
At least for me, the recommended SDR brightness of 100 to 200 nits (most sources cite 100) is just too dark…
So yes, HDR will look dark in comparison, but that’s how it was mastered…
Exactly. You can’t take something mastered for 4,000 nits (or 10,000 nits in some cases), compress it into 700 nits, and expect it to look as punchy (on average) as SDR content when most consumers set their SDR brightness well above the recommended 100 nits. This is why some manufacturers (Sony, for example) clip some highlight detail in HDR in favor of maintaining a higher average picture level (APL).
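To make the trade-off concrete, here is a minimal sketch in Python (assumptions: a simple linear soft-knee rolloff with hypothetical knee and peak values, not any vendor’s actual tone mapper) of the two strategies: hard clipping keeps mid-tones at their mastered brightness but destroys all detail above the display peak, while a rolloff preserves highlight gradation at the cost of dimming everything above the knee.

```python
def clip_map(nits, peak=700.0):
    """Hard clip: mid-tones keep their mastered brightness,
    but everything above the display peak loses all detail."""
    return min(nits, peak)

def rolloff_map(nits, knee=450.0, peak=700.0, master_peak=4000.0):
    """Soft rolloff: preserve highlight detail by linearly
    compressing [knee, master_peak] into [knee, peak]."""
    if nits <= knee:
        return nits
    frac = (nits - knee) / (master_peak - knee)
    return knee + frac * (peak - knee)

if __name__ == "__main__":
    # Compare how a few mastered luminance values land on a 700-nit display.
    for n in [100, 450, 1000, 4000]:
        print(f"{n:>5} nits -> clip: {clip_map(n):6.1f}, rolloff: {rolloff_map(n):6.1f}")
```

A 1,000-nit highlight comes out at 700 nits with clipping (indistinguishable from a 4,000-nit one) but around 489 nits with the rolloff, which keeps the two distinct while lowering APL. Real tone mappers (e.g., the BT.2390 EETF) use a smoother curve, but the tension is the same.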
The HDR spec is well ahead of today’s display capabilities, so it’s going to take some time for its full benefits to materialize.