
Info Wanted: Trying to understand film preservation... perhaps a stupid question, but shouldn't digital masters be struck from theatrical prints?

Author
Time
 (Edited)

So today I was thinking about film preservation, and more specifically about the process that goes into scanning a movie for home video release (e.g., Blu-ray).  I kind of had an “aha” moment (that could very well be wrong!).

Now, I know very, very little about this subject, so this may be an ignorant question… but for home video releases, doesn’t it make the most sense to scan a theatrical print of a film, rather than an earlier generation source, such as an interpositive?

I could be totally off-base with this, but my reasoning is: when filmmakers make a movie, don’t they take into consideration the generational losses that occur when going from the original camera negative to the interpositive, from the interpositive to the internegative, and finally from the internegative to the theatrical print?  And aren’t those losses part of what makes the movie “the movie”?

That is, I would assume they make filmmaking decisions knowing that grain will increase, and resolution will be lost by the time theatrical prints are made.  For example, when doing practical effects, perhaps they realize that the harness cables that yank the stunt man across the room may be clearly visible on early generation prints, but won’t show up on theatrical prints.
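To put some rough numbers on that intuition: each optical duplication multiplies in the copying stock's sharpness (MTF), so fine detail decays geometrically down the chain. Here's a toy Python sketch; the per-stage figures are invented purely for illustration, not measured from any real stock:

```python
# Toy model of generational sharpness loss through the duplication chain.
# The per-stage MTF values below are made up for illustration -- real
# numbers depend on film stock, printer optics, and spatial frequency.

stages = [
    ("camera original negative", 1.00),  # reference point
    ("interpositive",            0.85),  # each duplication multiplies in
    ("internegative",            0.85),  # another copy of the printer/stock MTF
    ("release print",            0.85),
]

mtf = 1.0
for name, stage_mtf in stages:
    mtf *= stage_mtf
    print(f"{name:28s} cumulative MTF ~ {mtf:.2f}")

# With these made-up numbers the release print retains only ~61% of the
# o-neg's contrast at this spatial frequency -- thin detail like stunt
# wires can drop below visibility by the final generation.
```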

So, when a digital master is struck using an earlier generation print, is it possible the audience might see things that the filmmakers did not intend them to see?  Other than the filmmakers themselves, does anyone actually ever get to see an interpositive projected?  It sounds crazy, but wouldn’t a scan of an interpositive be too good, and not represent what the film was actually supposed to look like?

In other words, isn’t the theatrical print the final product, and is in fact “the movie”?  As Star Wars fans who want to see what audiences saw in 1977, shouldn’t we actually want a (preferably pristine) scan of what would’ve been projected in theaters?  Wouldn’t a digital scan of an interpositive not represent that?

Of course, I assume there are no “pristine” theatrical prints floating around, and scanning one and cleaning it up takes a tremendous amount of effort (thank you team_negative1!).  But couldn’t a studio that’s releasing an old movie clean up the interpositive, print a new internegative from it, and then print a new “theatrical” print from that (all on film), and then scan this new “theatrical” print for the blu ray release?  I know this sounds crazy, but wouldn’t the resulting digital master be the closest thing to what was projected in theaters when the film was released?

Does this make sense?  I know I have a lot to learn, so please be gentle. 😃

Anyone remember different camera angles from ROTJ?

Author
Time
 (Edited)

 

I don't know much about the subject either, but I believe early-generation material probably represents filmmaker intent better, since that's the level of quality they compose their movies at. If we are interested in color timing as well, then the final answer print might be "the movie." Even with an effects-heavy picture like Star Wars, where generational loss could help cover some "problems", the filmmakers would probably prefer a clean, highly resolved image that could be projected large.

Also, the Negative1 project might actually show more than what viewers in 1977 could see in theatrical showings, because I believe their telecine's focus is more precise than that of a regular movie projector.

 

 

Author
Time

There was a long, contentious discussion about this at the Home Theater Forum some time back, concerning the Blu-ray of The Wizard of Oz. Some people were dismayed that the wire moving the Cowardly Lion's tail had been digitally removed. It was argued that on a 1939 release print, audiences would never have been able to see the wire, because the filmmakers knew the film grain would conceal it.

I'd wager you probably wouldn't see the wires on the Learjet model in an old print of Goldfinger, either. Lowry painted them out for the UE master.

An article about matte paintings that I read as a kid specifically mentioned that film grain helps them blend into the live action.

Forum Moderator

Where were you in '77?

Author
Time

People complaining about The Wizard of Oz have to be crazy.  It's a beautiful 8K scan from the OCN, except for the black-and-white opening, which came from a duplicate negative because the original was lost.

A beautiful Technicolor restoration, even better than the one done on The Adventures of Robin Hood.

I didn't even notice any digital mucking about on the Blu-ray; it looks about as close to the premiere print as possible on a Blu-ray.

“Always loved Vader’s wordless self sacrifice. Another shitty, clueless, revision like Greedo and young Anakin’s ghost. What a fucking shame.” -Simon Pegg.

Author
Time

I don't think the issue was the look of the film, but rather the ethics of erasing the wires.

There is also supposed to be a bit of Judy's dialogue still missing that was accidentally deleted several years before, an error that has carried over to the original mono mix on the Blu-ray.

Forum Moderator

Where were you in '77?

Author
Time

This is a good philosophical point. He's right. Discuss!

Author
Time

The primary reason for using the IP and other early-generation sources is to get the cleanest transfer possible to begin with, and the one closest to the o-neg. I agree with you that the theatrical presentation should always be kept in mind, and to be perfectly honest, if I had my way there would also be a comparison scan of a well-kept theatrical print for reference, so as to match the original color timing and presentation as fully as possible.

The color timing is what is typically lost in transfers, and without the theatrical presentation in mind there's no chance of fully replicating it. The best example of doing it correctly that I can think of is the Criterion Spartacus, for which Robert Harris and the colorist used specific multi-frame references from the 1991 restoration to hand-color the DVD master in order to properly preserve the film. Universal scrapped all of this for their Blu-ray, in addition to applying heavy image manipulation.
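For anyone curious what "matching a reference" could mean digitally, one crude approach is per-channel statistics matching: shift and scale each color channel of the new scan so its mean and spread line up with a scanned reference frame. A minimal Python sketch, assuming both frames are loaded as float RGB arrays; this is my own illustration, not how Harris's team actually worked:

```python
import numpy as np

def match_color_timing(scan, reference):
    """Shift/scale each RGB channel of `scan` so its mean and standard
    deviation match `reference` -- a crude stand-in for the kind of
    frame-by-frame reference grading described above.

    Both inputs are float arrays of shape (H, W, 3) in [0, 1]; the two
    frames need not be the same size.
    """
    out = scan.copy()
    for c in range(3):
        s_mean, s_std = scan[..., c].mean(), scan[..., c].std()
        r_mean, r_std = reference[..., c].mean(), reference[..., c].std()
        out[..., c] = (scan[..., c] - s_mean) * (r_std / (s_std + 1e-8)) + r_mean
    return np.clip(out, 0.0, 1.0)
```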

With a pristine release print, it is possible to make a great master and release. What you have to deal with is a greater chance of deterioration, generational loss, improper printing, or anything else that could have arisen in the printing process. Additionally, you are stuck with the print audio instead of earlier-generation materials, such as the audio masters.

VADER!? WHERE THE HELL IS MY MOCHA LATTE? -Palpy on a very bad day.
“George didn’t think there was any future in dead Han toys.”-Harrison Ford
YT channel:
https://www.youtube.com/c/DamnFoolIdealisticCrusader

Author
Time

There were two processes used in distributing film. The first is chemically developed film. The second is printed film.

Chemically developed film uses the master negative (usually referred to as the o-neg) to create a limited number of interpositives. This is pretty much the last stage at which the production team/studio has any say in the matter. The negative is color timed to create the interpositives, and then the distribution company uses the interpositives to create a set of internegatives, which are used to create the final film prints. Other than color interpositives having a tint (which I believe aids the chemical duplication process), they are the best source for correctly color-timed images. There is no guarantee that any of the internegatives or the final prints aren't a bit off in color. If the o-neg still exists and is in good shape, it still needs the correct color timing. That data can be recorded, but the interpositive can also be used to recover that information.
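For what it's worth, that color timing is traditionally expressed as "printer lights": per-channel exposure offsets measured in points. A rough Python sketch of what re-timing amounts to, assuming the common convention of roughly 0.025 log exposure per point (conventions and sign directions vary by lab); the function is purely illustrative:

```python
import numpy as np

POINT_LOG_E = 0.025  # approx. log-exposure change per printer point; labs vary

def apply_printer_lights(linear_rgb, points_rgb):
    """Simulate re-timing a shot by +/- printer points per channel.

    `linear_rgb`: float array (H, W, 3) of linear-light values.
    `points_rgb`: e.g. (+2, 0, -1) = 2 points more red, 1 point less blue
    (sign conventions vary between labs; this is just one choice).
    """
    gains = 10.0 ** (np.asarray(points_rgb, dtype=float) * POINT_LOG_E)
    return linear_rgb * gains  # an exposure change is a gain in linear light
```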

Technicolor is the best example of printed film. Originally they used three separate negatives, one for each color, but the records can be derived from color film as well. Again, you have to start with the o-neg and create a color-timed master. As this is a printing process, it requires a master for each color. These are called the separation masters. Using a chemical process, the masters are used to create the printing matrices. What it amounts to is that the process pits the film: the more color, the deeper the pit. In the printing process, the pits are filled with dye in cyan, magenta, and yellow, and transferred onto the final print. When the dye dries, it flattens out and leaves the final print smooth. It also creates some inherent alignment issues.
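To picture those alignment issues, here's a small numpy sketch that stacks three separation records back into a color frame and deliberately nudges the record driving one dye pass by a couple of pixels, which is roughly what a misregistered transfer pass does (the shift amount is arbitrary):

```python
import numpy as np

def combine_separations(red_rec, green_rec, blue_rec, shift=(2, 0)):
    """Rebuild an RGB frame from three grayscale separation records
    (each a float array of shape (H, W) in [0, 1]), deliberately
    misregistering the blue record -- the one the yellow dye pass is
    printed from -- to show the color fringing a badly aligned
    transfer pass produces.  The 2-pixel shift is arbitrary.
    """
    misaligned = np.roll(blue_rec, shift=shift, axis=(0, 1))  # fake registration error
    return np.stack([red_rec, green_rec, misaligned], axis=-1)
```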

So no matter the movie, the best source is the o-neg, properly timed, and the next best source is an interpositive or the color separation masters. In some cases, they have used part of the o-neg and part of the separation masters, usually retrieving the cyan and magenta from the o-neg and the yellow from the separation master. There is now an all-digital Technicolor restoration process that aligns the separate color records and produces a truly outstanding picture, clearer than any Technicolor IB print ever was.
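For the curious, the core of that digital alignment step can be approximated with plain FFT phase correlation; a minimal sketch, assuming two same-sized grayscale separation scans and a purely translational registration error (real restoration tools handle rotation, warping, and subpixel shifts too):

```python
import numpy as np

def estimate_shift(ref, moving):
    """Estimate the integer (row, col) translation between two grayscale
    separation scans via FFT phase correlation, returning the shift that
    (approximately) registers `moving` onto `ref`.
    """
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moving)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the halfway point wrap around to negative shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# Usage: registered = np.roll(moving, estimate_shift(ref, moving), axis=(0, 1))
```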

So whether it is a chemically developed print or a Technicolor IB print, it is sufficiently removed from the work of the production team, and prone to enough errors in production, that it doesn't represent the best source from which to retrieve a film. For the DVD market and earlier, they often did use prints (cheaper and easier than going to the effort of doing a restoration from the negative), but with the advent of HD and Blu-ray, the added clarity calls for doing a proper job from the best source material available. In some cases we have what we have: Citizen Kane was restored from a print because the negative was lost. But in the case of most modern films, a print will produce an inferior product. HD still isn't high enough resolution to reveal the majority of filmmaking sleight of hand.

Author
Time
 (Edited)

I think zombie84 nails it, that this is a philosophical discussion, and your position can vary depending on how you define your terms and what your basic assumptions are.

I think most of the people restoring classic films from original negatives would argue that "fidelity" means "closest to what was captured on film" rather than "closest to what was seen in theatres", for example.  It all depends on what you consider "the film", and what you think of the various processes that happen to film between shooting and projecting.

If you think "fidelity" could include any source higher up the chain than the prints audiences saw in theatres, then it becomes important to sort out what you think of those processes. I think the processes that happen to a film along the way can be separated into a few categories:

1) Changes that a filmmaker expects to happen and they take these changes into consideration during filming.
2) Changes that a filmmaker expects to happen but which are not taken into consideration during filming.
3) Changes that a filmmaker does not expect.

What falls into these categories, of course, can vary a bit from film to film, but #1 could include generational loss hiding detail or aspect ratio cropping, #2 could include reel change cue marks and gate weave, #3 could include tears, scratches, bad duplication, etc.  Or to be more topical, Tantive IV's burn marks.

And there's the rub.  Assuming for a moment that these issues can be so neatly segregated, I personally think #2 and #3 are okay to take out by using a higher-generation source, but #1 is not.  How can you get some of these without the others?

Using The Wizard of Oz example, I think they should have used sources as close as possible to the original negatives--this would ensure a clean, undamaged print with no cue marks (color accuracy would be taken from whatever source best preserved it, not necessarily the earliest generation).  However, taking out the wires was IMO a mistake.  First, you have to ask yourself if the filmmaker really expected the wires to be hidden by generational loss--audiences at the time had no expectation of perfect visual effects, and seeing the wires would have been unremarkable to a contemporary theatre-goer.  So if you land on that side, leave the wires in--no big deal.  Or if you think, as I do, that there was just an industry-wide trust that generational loss would tend to hide wires, then the right solution would be to soften the entire film until the wires disappeared, simulating generational loss, but without the other unwanted effects.
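If anyone wanted to try that "soften the whole film" idea, a crude digital stand-in for one optical duplication generation is a light Gaussian blur plus synthetic grain. A rough Python sketch; the blur and grain values are guesses, not measured film characteristics:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def simulate_generation_loss(frame, blur_sigma=1.2, grain_strength=0.02, seed=0):
    """Soften a frame and add synthetic grain, as a crude stand-in for
    one optical duplication generation.  `frame` is a float array of
    shape (H, W) or (H, W, 3) in [0, 1]; the parameter defaults are
    guesses chosen for illustration only.
    """
    rng = np.random.default_rng(seed)
    if frame.ndim == 3:
        # Blur each color channel spatially, not across channels
        soft = np.stack([gaussian_filter(frame[..., c], blur_sigma)
                         for c in range(frame.shape[-1])], axis=-1)
    else:
        soft = gaussian_filter(frame, blur_sigma)
    grain = rng.normal(0.0, grain_strength, size=soft.shape)
    return np.clip(soft + grain, 0.0, 1.0)
```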

I also think this is ultimately academic, because unlike a lot of film tinkering that goes on during Blu-ray restorations, this one is largely user-fixable.  If you think your 1080p Blu-ray is too sharp, set your device to 720p.  Still too sharp?  Go for 480p.  Yes, you still have to take positive steps to achieve the results you want, but compared to getting rid of DNR or revisionist color timing, it's pretty easy.  At least my player allows this; maybe it's not common?

Project Threepio (Star Wars OOT subtitles)