canofhumdingers said:
I should also mention that, when scanning film, the quality of the source material is incredibly important. Working with a high-quality OCN or IP can certainly provide enough information to make 4K scans or better worthwhile. It's just important to understand you're not seeing anything NEAR that at even the best theater. Which then opens the debate: should the goal of home video be to reproduce what you would've seen opening night, or to provide the best possible viewing experience, even if it's technically significantly BETTER than even an absolutely perfect theatrical presentation?
Depends on when the film was made. Like I said, more recent films go through a digital intermediate for colour correction, so what you get at the theater is no better than whatever resolution the DI was finished at. Older films are analog from negative to print, so even though perceptible detail drops with each generation, you're still getting a "pure" transfer - it's more of a softening than a downrezzing...
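To make that distinction concrete, here's a toy 1-D sketch (Python/NumPy) of the difference. Every number in it is an illustrative assumption, not a real film measurement: the analog chain is modelled as repeated softening that attenuates fine detail but never re-samples it, while a lower-resolution DI is modelled as a hard cut that throws away everything above its own resolution limit.

```python
import numpy as np

# Toy 1-D model of one scanline: an all-analog photochemical chain vs. a
# pipeline that passes through a lower-resolution digital intermediate (DI).
# All numbers are illustrative assumptions, not real film measurements.

n = 4096                                   # samples across the camera negative
x = np.arange(n)
detail_freq = 1100                         # cycles/frame: detail beyond a 2K-style DI
scanline = np.sin(2 * np.pi * detail_freq * x / n)

def analog_generation(signal, sigma=0.6):
    """One photochemical copy (IP, IN, release print): fine detail is softened
    but never re-sampled, so a trace of it survives every generation."""
    k = np.arange(-10, 11)
    kernel = np.exp(-k**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    return np.convolve(signal, kernel, mode="same")

def digital_intermediate(signal, di_resolution=2048):
    """A DI finished at lower resolution, modelled as an ideal low-pass:
    everything above the DI's own Nyquist limit is discarded outright."""
    spectrum = np.fft.rfft(signal)
    spectrum[di_resolution // 2:] = 0.0
    return np.fft.irfft(spectrum, n)

def detail_left(signal):
    """Fraction of the original fine-detail amplitude still present."""
    return 2 * abs(np.fft.rfft(signal)[detail_freq]) / n

# Analog path: camera negative -> interpositive -> internegative -> print
analog = scanline
for _ in range(3):
    analog = analog_generation(analog)

digital = digital_intermediate(scanline)

print(f"fine detail left, analog release print: {detail_left(analog):.2f}")  # softened, not zero
print(f"fine detail left, 2K-style DI pipeline: {detail_left(digital):.2f}")  # gone entirely
```

The point of the toy numbers: after three analog generations a fraction of the original fine detail is still there, just softened, whereas once the DI has been finished at a lower resolution that detail is gone for good, no matter how good the projection or a later scan is.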
Here's how I like to frame the debate. If you watched Star Wars in 1977, photons from the studio lights bounced off of Mark Hamill and were chemically recorded onto film. That image was then chemically transferred down several generations to the theatrical print. Along that chain, the light that hit the film on set carried through each successive generation and eventually into your eyes, so what you got to see was the equivalent of sitting on the set with a hazy filter in front of you.
If you went to see Attack of the Clones, the studio lights bounced off of...one of the actors, and each photon was mechanically estimated in almost exactly the same way every time by a digital sensor. What you got to see was a very sharp binary equivalent of the set.
So what would you prefer? A blurry direct view of the action, or a very accurate estimation of the action by a group of transistors?