Originally posted by: Karyudo
I think that's crap. I think there are two points there that you're pulling out of your ass, and that you have not proven:
1. "There is generational loss in HD" -- I'd like to see proof (or at least a rationale) for this. HD is digital, and the workflow is typically of significant precision that I don't see how there's a generational loss anywhere.
Originally posted by: zombie84
Of course there is generational loss--there is generational loss in HD as well. But film starts off at such a high resolution that even by the time it gets down to the release print it still holds a tremendous amount of detail, many times more than HD, which also gets degraded as it's put into a computer and printed back out to film.
Presumably, one of the reasons Lucas shot digital was to reduce the generational losses inherent in working with film. If you're comparing fine-grain camera negative, I agree that there's no scientific basis for arguing against the fact that film has more resolution than HD (at 1080p, anyway). But by the time the results are up on the screen? Well, then, I think it's far less of a slam dunk for good ol' film.
Of course there is generational loss--there is generational loss in HD as well. But film starts off at such a high resolution that even by the time it gets down to the release print it still holds a tremendous amount of detail, many times more than HD, which also gets degraded as it's put into a computer and printed back out to film.
I think that's crap. I think there are two points there that you're pulling out of your ass, and that you have not proven:
1. "There is generational loss in HD" -- I'd like to see proof (or at least a rationale) for this. HD is digital, and the workflow is typically of significant precision that I don't see how there's a generational loss anywhere.
This is a misunderstanding of HD. HD is currently recorded onto tape. In the case of Star Wars II and III it was shot on HDCAM tape, so after the tape is done it is duplicated, and then the original is put into an HDCAM deck and captured to the computer. Then, after the edit is done, it is printed back onto film as an IP, which the release prints are then duplicated from. And this is starting from 1K resolution. So there is your generational loss. I don't know why you would think I am pulling this out of my ass. (Now there is actually storage technology coming out for HDD recording, but this is still very cumbersome and only practical for studio shooting.)
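Just to make the chain concrete, here's a rough sketch of the pipeline I'm describing; the stage names and the notes on where loss creeps in are my own ballpark shorthand, not official specs:

```python
# Rough sketch of the HD finishing chain described above (Episode II/III
# style HDCAM workflow). The "where loss happens" notes are ballpark
# assumptions, not hard specs.

hd_chain = [
    ("HDCAM camera original (tape)",        "compressed at acquisition"),
    ("protection dupe of the tape",         "tape-to-tape copy"),
    ("capture from HDCAM deck to computer", "decode / re-encode"),
    ("edit and finish in the computer",     "renders, colour work"),
    ("film-out to an interpositive (IP)",   "digital-to-analogue step"),
    ("release prints struck from the IP",   "optical duplication"),
]

for generation, (stage, loss) in enumerate(hd_chain):
    print(f"gen {generation}: {stage:42s} -> {loss}")

print(f"\n{len(hd_chain) - 1} hand-offs after the camera original")
```

The exact amount lost at each stage varies by facility, but every hand-off is a generation.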
Film goes through a process of being scanned into a computer if a DI is being made--at 2K or nowadays 4K resolution--and then printed back onto film for the IP from which release prints are made. So even in the highly-degraded DI method of release it is still many times higher in resolution than HD. And one day we will have 6K DIs, meaning near-lossless scanning.
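The resolution gap is easy to put rough numbers on. The scan dimensions below are typical full-aperture rasters that I'm assuming purely for illustration; the 6K figure is just extrapolated from the 2K one:

```python
# Back-of-the-envelope pixel counts behind the "many times higher" claim.
# Scan dimensions are typical full-aperture rasters, assumed here for
# illustration; 6K is simply extrapolated from the 2K numbers.

hd_w, hd_h = 1920, 1080

formats = {
    "HD 1080p":     (hd_w, hd_h),
    "2K film scan": (2048, 1556),
    "4K film scan": (4096, 3112),
    "6K film scan": (6144, 4668),  # hypothetical future scan
}

hd_pixels = hd_w * hd_h
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name:13s} {w}x{h} = {pixels/1e6:5.1f} Mpx  (~{pixels/hd_pixels:.1f}x HD)")
```

Even a 2K scan carries more pixels than 1080p HD, and a 4K scan carries several times more.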
A lot of times a film is merely optically printed--the negative is duplicated in a printer for color timing and the resulting IP is used for release prints. This is much higher resolution than current DI technology and actually preserves a truer film look--for an example of why early DI technology sucked, see X-Men 2.