You have not done anything wrong, but video will be produced only in 10-bit color depth from now on, whereas previously a lot of content was produced in 8-bit. That 8-bit content sometimes showed color banding, where smooth gradients break up into solid stripes, in some video sources.
8-bit depth was a mistake, really.
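To see why, here is a minimal sketch (Python with NumPy, my example, not anything from the article) of quantising one smooth gradient at each depth. 8 bits gives only 256 steps per channel across the ramp; 10 bits gives 1024, so each 8-bit step is four times as coarse and that much more likely to show as a visible band:

```python
# Minimal sketch: quantise a smooth luminance ramp to 8-bit and 10-bit
# code values and count the distinct steps left in each. Fewer, coarser
# steps are what you see as banding on a large, slow gradient.
import numpy as np

ramp = np.linspace(0.0, 1.0, 3840)           # one sample per column of a 4K-wide gradient

levels_8bit = np.round(ramp * 255) / 255     # snap to 8-bit code values
levels_10bit = np.round(ramp * 1023) / 1023  # snap to 10-bit code values

print("distinct 8-bit steps: ", len(np.unique(levels_8bit)))   # 256
print("distinct 10-bit steps:", len(np.unique(levels_10bit)))  # 1024
```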
To display 10-bit color depth and high contrast you need a display that can handle it. If you want to get a new TV, only get one that is labelled “Ultra HD Premium”; anything else does not meet any HDR standard. A TV can also exceed the Ultra HD Premium standard.
I would not bother holding out for a 12-bit color depth display either, though. The LCD panel needs to be native 10-bit, not 8-bit with dithering (FRC), to be proper HDR.
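For what it's worth, here is a rough sketch of what “8-bit dithered” (frame rate control, FRC) actually does: the panel cycles between the two nearest 8-bit levels so the time-averaged output lands near the 10-bit target. The cycle length and the example code value below are made up purely for illustration:

```python
# Rough FRC sketch: fake a 10-bit level on an 8-bit panel by flipping
# between the two nearest 8-bit levels in a repeating frame cycle.
# Real panels mix spatial and temporal patterns; this is the idea only.
import numpy as np

code_10bit = 513                      # a 10-bit value with no exact 8-bit equivalent
target = code_10bit / 1023            # normalised target intensity

lo = np.floor(target * 255) / 255     # nearest displayable 8-bit level below
hi = np.ceil(target * 255) / 255      # nearest displayable 8-bit level above
frac = (target - lo) / (hi - lo)      # share of frames that must show the brighter level

cycle = 8                             # frames per dithering cycle (illustrative)
shown = np.where(np.arange(240) % cycle < round(frac * cycle), hi, lo)

print(f"target {target:.5f}  time-average {shown.mean():.5f}")
```

The averages match, but only over time; on a static dark scene the flicker and patterning can still be visible, which is why a native 10-bit panel is preferable.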
There is very little demand for 4K video as such. There is demand for an improved image: more detail, better contrast, and more color range. Nobody wants to stream 4K video to everyone; it is unrealistic, and the bandwidth it demands is huge.
All of those improvements can be delivered without the massive burden of 4K, which needs a massive screen to show any benefit anyway, a screen most people don't have the space for. The infrastructure to provide that bandwidth to everyone is not there either. It is also a stretch for broadcast; nobody is going to get 4K down a normal aerial.
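Some back-of-the-envelope numbers make the bandwidth point concrete. These are my own rough figures: the compression ratio is an assumption, not a measured value, and the raw rates assume full 4:4:4 sampling where real delivery uses 4:2:0:

```python
# Rough bitrate arithmetic: uncompressed rate scales with pixels x fps x
# bit depth, and 4K is simply four times the pixels of 1080p.
def raw_mbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e6

raw_1080p = raw_mbps(1920, 1080, 50, 10)   # ~3,110 Mbit/s uncompressed
raw_4k    = raw_mbps(3840, 2160, 50, 10)   # ~12,440 Mbit/s uncompressed

ratio = 400   # assumed HEVC-style compression ratio, purely illustrative
print(f"1080p/50 10-bit: ~{raw_1080p / ratio:.0f} Mbit/s delivered")
print(f"4K/50 10-bit:    ~{raw_4k / ratio:.0f} Mbit/s delivered")
```

Whatever ratio you assume, 4K needs roughly four times the delivered bitrate of 1080p for the same quality, while 10-bit HDR on its own adds only about a quarter over 8-bit.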
There is, however, everything right with scanning old films at 4K and with a wider exposure range, and then displaying them on a TV with HDR at whatever resolution suits the size of your room and screen.
I would suggest this is the way forward for most people. Average, everyday people, that is.
http://www.techradar.com/news/sony-is-bringing-hdr-to-its-1080p-tvs-but-only-ps4-owners-will-benefit