Originally posted by: zombie84
Of course there is generational loss; there is generational loss in HD as well. But film starts off at such a high resolution that even by the time it gets down to the release print, it still holds a tremendous amount of detail, many times more than HD, which also gets degraded as it's put into a computer and printed back out to film.
Presumably, one of the reasons Lucas shot digital was to reduce the generational losses inherent in working with film. If you're comparing fine-grain camera negative, I agree there's no scientific basis for disputing that film has more resolution than HD (HD at 1080p, anyway). But by the time the results are up on the screen? Well, then, I think it's far less of a slam dunk for good ol' film.
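To put rough numbers on why the on-screen comparison is different: each photochemical duplication acts more or less like a low-pass filter, so whatever detail survives one stage gets cut down again at the next. Here's a quick back-of-the-envelope sketch in Python; the 80 lp/mm starting point and the 0.7 per-generation retention factor are invented illustrative figures, not measurements of any real stock or lab chain:

    # Back-of-the-envelope sketch of detail loss through a photochemical
    # release chain. The starting figure and the retention factor are
    # illustrative assumptions, NOT measured values.
    CAMERA_NEG_LP_MM = 80.0  # assumed resolving power of the camera negative
    RETENTION = 0.7          # assumed fraction of detail surviving each duplication

    # Traditional chain: negative -> interpositive -> internegative -> release print
    stages = ["camera negative", "interpositive", "internegative", "release print"]

    resolution = CAMERA_NEG_LP_MM
    for stage in stages:
        print(f"{stage:>16}: ~{resolution:.0f} lp/mm")
        resolution *= RETENTION

Under those (admittedly invented) assumptions, the release print ends up with roughly a third of the negative's resolving power, which is exactly why "camera neg vs. HD" and "what's on the screen vs. HD" are two very different arguments.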
Of course there is generational loss; there is generational loss in HD as well. But film starts off at such a high resolution that even by the time it gets down to the release print, it still holds a tremendous amount of detail, many times more than HD, which also gets degraded as it's put into a computer and printed back out to film.
I think that's crap. There are two claims in there that you're pulling out of your ass, and that you have not proven:
1. "There is generational loss in HD" -- I'd like to see proof (or at least a rationale) for this. HD is digital, and the workflow is typically of significant precision that I don't see how there's a generational loss anywhere.
2. "[A release print] still holds a tremendous amount of detail, many times more than HD" -- I would allow that a release print might have "somewhat" more detail than HD, but "many times more"? I think that's your fanciful imagination. Release print stock might have the capacity for holding much better resolution, but by the time the processed image is on there? I think you'd be hard pressed to argue that so much more detail from the camera neg has ended up on a film release print than if you'd gone HD.