
Making our own 35mm preservation--my crazy proposal — Page 6

Author
Time
Presumably, one of the reasons Lucas shot digital was to reduce the generational losses inherent in working with film. If you're comparing fine-grain camera negative, I agree that there's no scientific basis for arguing against the fact that film has more resolution than HD. (HD at 1080p, anyway). But by the time the results are up on the screen? Well, then, I think it's far less of a slam dunk for good ol' film.

Of course there is generational loss--there is generational loss in HD as well. But film starts off at such a high resolution that even by the time it gets down to the release print it still holds a tremendous amount of detail--many times more than HD, which also gets degraded as it's put into a computer and printed back out to film.


Originally posted by: boris
The fact is I'm not saying you can't get 6000 lines or more from 35MM - but the level of detail in 35MM is about the level Lucas achieved with his digital filming. I don't see my views as "extreme" - like I said, look at the DVD transfer for Last House on the Left - now it wasn't made from the original camera negatives - but they did use the highest quality sources for the movie available - and the film was shot in 16MM - it's not meant to have a lot in common with SW, except to say that the film was in relatively bad condition. There's no grain removal either, and in my opinion the level of detail in the film is fully brought out by the DVD resolution, which means that the level of detail in the theatrical print reels they used is less than the level of detail expressed by DVD resolution. That's how it was shown theatrically - and the sources they used were the best quality ones they could find.

I already addressed this--a shoe-string budget using 1970 technology versus ultra-high-budget 2005 technology. My previous post debunks this, and it actually comes from first-hand experience, not home video viewing. I noticed you ignored it.

Now as far as I know, they mastered the LHOTL DVD from 35MM prints - which are "blown up" from the 16MM negatives, and those 35MM prints will hold the quality and detail in there better than a 16MM print will, if that makes sense. You know, like if you get a photo developed onto A4 it'll hold more detail and quality than if you get it developed onto your standard-sized photo, because there's more information in there. With SW you don't get that - it was transferred from 35MM anamorphic negatives to 35MM prints - so the prints are at the same level of detail and quality as the source (or less detailed if anything).

If LHOTL was indeed blown up to 35mm for the DVD, this actually degrades the original image. It introduces grain and softens the image, and then the scanning of the print introduces more errors. If they had gone back to the original 16mm negatives the film would look much better, but again, this is a shoe-string-budget, forty-year-old film.

I've said I think 35MM and HD are roughly equal in terms of detail and quality. Now I think most 16MM films would have a bit more detail in there than LHOTL has, but it would still only be about standard definition in quality.

If the DVD of Last House on the Left is your only reference for 16mm resolution, then how can you make these kinds of statements? 16mm, by scientific fact, is higher resolution than HD, and when photographed properly can rival 35mm in apparent sharpness. I say "apparent" sharpness because even though 16mm film shot with an SR3 with a Cooke S4 prime lens and 50D stock will look similar or equal to 35mm to the eye, in actuality it is not, and the resolution is the same as LHOTL's, which is just over HD.

An example of a modern 16mm film, I guess, would be The Devil's Rejects--I am positive that everyone who viewed this in theaters simply assumed it was a 35mm film. A film like Crash, for instance, which had a kind of "dirtiness" to the image, appears nearly equal to the eye to The Devil's Rejects in terms of image resolution.

To zombie84: Super35 is not always cropped, and can use "all the negative" without becoming anamorphic - this lets you film for longer.


Well yeah, that's what Super-35 is--it exposes the whole frame, but then there is an optical step where it is cropped and blown up. When comparing it to anamorphic widescreen this implies a 2.35:1 aspect ratio, so there is indeed extensive cropping and blow-up.
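To put rough numbers on the crop, here's a back-of-the-envelope sketch (mine, using approximate, commonly quoted gate dimensions--treat the exact figures as assumptions, and I'm using 2.39:1 for the scope extraction):

```python
# Rough comparison (my assumption, approximate gate sizes in mm) of how
# much negative area survives a 2.39:1 crop from Super-35 versus an
# anamorphic frame, which keeps nearly the full gate height.
super35_w, super35_h = 24.89, 18.66        # full-aperture gate, mm
anamorphic_w, anamorphic_h = 21.95, 18.60  # approx. anamorphic camera gate, mm

cropped_h = super35_w / 2.39               # height kept after the scope crop
super35_area = super35_w * cropped_h
anamorphic_area = anamorphic_w * anamorphic_h

print(f"Super-35 after crop: {super35_area:.0f} mm^2")
print(f"Anamorphic:          {anamorphic_area:.0f} mm^2")
```

The point of the sketch is just the ratio: the anamorphic frame exposes a substantially larger negative area for the same final aspect ratio, which is where the "extensive cropping and blow-up" cost comes from.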

And anamorphic filming presents its own problems that are created by stretching the image vertically onto the film, such as depth of field - which will always be expressed better using a non-anamorphic lens.


The qualities created by an anamorphic lens are considered desirable, and as for the apparent decrease in depth of field, this is also considered a very beautiful quality and is one of the strongest assets of the anamorphic format--DPs are usually fighting for less depth of field, not more.

"Let me tell you something though--HD, and AOTC and ROTS obviously, do indeed appear sharper and more detailed than 35mm because HD gives that impression."

Then why are you saying 16MM is better? If it "gives that impression" that's all it's supposed to do.


You just totally missed the point. 16mm is still better. HD lacks the inherent softness that film--any size of film--has: the edges are harsh, the image is very clear, and the limited exposure latitude creates more contrast. Thus the image becomes crisp and sharp-looking and fools the eye into looking higher quality than it actually is. Think of edge enhancement on a DVD--an untreated image and a sharpened image have the same resolution, but the edge enhancement creates the impression of a higher-quality image. If you were to examine a resolution test chart you would see that 16mm and HD are about equal (again, resolution is also dictated by the lens and stock choice, but as far as the inherent limits of the formats go, 16mm has the advantage over HD in theory).
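If you want to see the edge-enhancement effect in the abstract, here's a toy sketch (purely my own illustration, nothing from any real DVD pipeline) of unsharp masking on a 1-D signal--the sample count, i.e. the resolution, is unchanged, but the edge gets the overshoot/undershoot halo that reads as "sharper":

```python
# Toy illustration: unsharp masking raises apparent sharpness without
# adding any real resolution. We sharpen a 1-D "edge" signal.

def box_blur(signal, radius=1):
    """Simple moving-average blur; window is clamped at the ends."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=1.0):
    """sharpened = original + amount * (original - blurred)."""
    blurred = box_blur(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0, 0, 0, 0, 10, 10, 10, 10]   # an edge, 8 samples
sharp = unsharp_mask(edge, amount=1.0)

print(len(edge) == len(sharp))        # same number of samples either way
print(sharp)                          # note the dip before and spike after the edge
```

The sharpened list has exactly as many samples as the original; only the transition is exaggerated, which is the whole trick.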

And there aren't ugly "too sharp" shots in Superman Returns.


There are. Superman Returns is the best-looking HD film I have ever seen, but it is still there--oh man, is it still there. The fact that you can't tell the difference only demonstrates your lack of expertise in the area. Casual viewers don't notice because they are not educated in film and don't have a discerning eye for the subtleties of the format--I doubt people even noticed the difference in a film as horribly photographed as AOTC.

Boris, I know your arguments seem logical to you, but they really are not. I work in cinematography; I'm professionally trained to know this kind of stuff and actually have experience working with 16mm and 35mm film as well as HD. You obviously are not any kind of professional in the field, and while you do seem to know a little about imaging systems, it's that little bit of knowledge that has led you to HUGE misunderstandings. No one who actually has experience in shooting would make the claims you are making. I'm not trying to belittle you or anything, but you literally don't know what the fuck you are talking about.

And no, I'm not some "anti-HD" film drone. HD has its advantages in some areas--but not in image quality. No one ever shoots in HD because it would yield a better image. HD can be less expensive and allow a better post workflow, but there is absolutely no advantage visually, and it is flawed with tons of problems that will take at least a decade to work out.

As for buying a film scanner or telecine--yup, it would cost more than just the price of the machine, because aside from that you need all sorts of controllers and high-tech accessories that add up to more than the machine itself.
Author
Time
Originally posted by: zombie84
Presumably, one of the reasons Lucas shot digital was to reduce the generational losses inherent in working with film. If you're comparing fine-grain camera negative, I agree that there's no scientific basis for arguing against the fact that film has more resolution than HD. (HD at 1080p, anyway). But by the time the results are up on the screen? Well, then, I think it's far less of a slam dunk for good ol' film.


Of course there is generational loss--there is generational loss in HD as well. But film starts off at such a high resolution that even by the time it gets down to the release print it still holds a tremendous amount of detail--many times more than HD, which also gets degraded as it's put into a computer and printed back out to film.


I think that's crap. I think there are two points there that you're pulling out of your ass, and that you have not proven:

1. "There is generational loss in HD" -- I'd like to see proof (or at least a rationale) for this. HD is digital, and the workflow is typically of sufficient precision that I don't see how there's a generational loss anywhere.

2. "[A release print] still holds a tremendous amount of detail, many times more than HD" -- I would allow that a release print might have "somewhat" more detail than HD, but "many times more"? I think that's your fanciful imagination. Release print stock might have the capacity for holding much better resolution, but by the time the processed image is on there? I think you'd be hard pressed to argue that so much more detail from the camera neg has ended up on a film release print than if you'd gone HD.
Author
Time
Originally posted by: Karyudo
Originally posted by: zombie84
Presumably, one of the reasons Lucas shot digital was to reduce the generational losses inherent in working with film. If you're comparing fine-grain camera negative, I agree that there's no scientific basis for arguing against the fact that film has more resolution than HD. (HD at 1080p, anyway). But by the time the results are up on the screen? Well, then, I think it's far less of a slam dunk for good ol' film.


Of course there is generational loss--there is generational loss in HD as well. But film starts off at such a high resolution that even by the time it gets down to the release print it still holds a tremendous amount of detail--many times more than HD, which also gets degraded as it's put into a computer and printed back out to film.


I think that's crap. I think there are two points there that you're pulling out of your ass, and that you have not proven:

1. "There is generational loss in HD" -- I'd like to see proof (or at least a rationale) for this. HD is digital, and the workflow is typically of sufficient precision that I don't see how there's a generational loss anywhere.


This is a misunderstanding of HD. HD is currently recorded onto tape. In the case of Star Wars II and III it was shot on HDCAM tape, so after the tape is done it is duplicated, and then the original is put into an HDCAM deck and captured to the computer. Then after the edit is done, it is printed back onto film as an IP, from which the release prints are duplicated. And this is starting from 1K resolution. So there is your generational loss. I don't know why you would think I am pulling this out of my ass. (There is actually storage technology coming out for HDD recording, but this is still very cumbersome and only practical for studio shooting.)
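To illustrate the kind of compounding I mean, here's a toy simulation (purely my own sketch, not a model of any real lab chain): treat each analog duplication step as a slight low-pass filter, and watch a crisp edge get softer with every printing generation:

```python
# Toy model: each analog duplication acts a little like a blur, and the
# softening accumulates across generations (neg -> IP -> IN -> print).

def copy_generation(signal):
    """One duplication step, modelled as a radius-1 moving average."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - 1), min(n, i + 2)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

edge = [0.0] * 5 + [1.0] * 5          # a perfectly crisp edge on the "negative"
gen = edge
for _ in range(3):                    # three duplication generations
    gen = copy_generation(gen)

# count samples stuck in the transition zone: the wider this is, the softer
# the edge has become
rise = [i for i, v in enumerate(gen) if 0.05 < v < 0.95]
print(len(rise))
```

The original edge has zero samples in the transition zone; after three copies there are several, which is the "generational loss" in miniature. The same logic applies to an HD chain whenever a lossy step (tape, compression, film-out) sits between generations.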

Film goes through a process of being scanned into a computer if a DI is being made--at 2K or, nowadays, 4K resolution--and then printed back onto film for the IP from which release prints are made. So even in the highly degraded DI method of release it is still many times higher. And one day we will have 6K DIs, meaning near-lossless scanning.
A lot of the time a film is merely optically printed--the negative is duplicated in a printer for color timing and the resulting IP is used for release prints. This is much higher quality than current DI technology and actually preserves a truer film look--for an example of why early DI technology sucked, see X-Men 2.
Author
Time
Originally posted by: zombie84
HD is currently recorded onto tape. In the case of Star Wars II and III it was shot on HDCAM tape, so after the tape is done it is duplicated, and then the original is put into an HDCAM deck and captured to the computer. Then after the edit is done, it is printed back onto film as an IP, from which the release prints are duplicated. And this is starting from 1K resolution. So there is your generational loss.


Where? I see camera to digital tape (no loss); digital tape to computer (no loss); computer editing (no loss); digital projection. That's how I saw Ep II, for example. The only generational losses you have indicated are film based. Without introducing crappy old film back into the equation (like Lucas wants to avoid, for example), it's lossless until it's projected.
Author
Time
Originally posted by: Karyudo

Where? I see camera to digital tape (no loss); digital tape to computer (no loss); computer editing (no loss); digital projection. That's how I saw Ep II, for example. The only generational losses you have indicated are film based. Without introducing crappy old film back into the equation (like Lucas wants to avoid, for example), it's lossless until it's projected.


The act of digitizing it is lossy, and the act of converting it back to data for digital projection is lossy as well--obviously, though, the loss is very minimal. However, digital projectors currently cannot even faithfully render the image properly, so this is moot. One day HD will be lossless in this manner--but not yet. I am not arguing that HD will never be good, ever--obviously digital video is the future, and one day it will rival and more or less replace film. But not for a while.

I actually prefer HD transferred to film rather than projected directly--it hides the inherent ugliness of the HD image by softening it and introducing some grain.

And all of this ignores the fact that 35mm film is still higher quality even with the loss introduced through duplication. The loss from negative to release print, while notable, still yields an image of much higher quality than anything HD can give us, even if you were projecting the original HD footage straight from the original tape. The only instance where the two are about equal is in the case of the early 2K DIs from years ago.

But this is all beside the point. Even though HD is not up there as far as resolution goes, the detail level still holds up fairly well--but this is not the issue. The issue is the inherent flaws of digital video that have not yet been fixed: the sharpness, the unnatural clarity, the colour-space issues, the small latitude. These, IMO, are more important than resolution, which is currently at a level that is acceptable for most people (including myself--though of course it would be preferable to have more). This is why most still prefer 16mm over HD--the resolution is almost the same (16mm actually has a bit more), but the inherent beauty of 16mm makes it preferable.

I would say it will be at least five to ten years before the playing field actually gets levelled--once we start seeing 2K HD cameras with acceptable image characteristics, the film-versus-HD argument will begin to evaporate. But even then film will still be technically superior, and still preferred and used by most productions that spend millions of dollars to make the film look good and are used to working with film. My estimate is twenty to thirty years before film actually begins to go away, and even then there will still be those who prefer it.

Author
Time
Originally posted by: zombie84
[M]ost still prefer 16mm over HD--the resolution is almost the same (16mm actually has a bit more)


Hope boris doesn't see this! You might have to try to convince him that no, DVD isn't better than 16 mm film...

I appreciate your arguments and your expertise. Like someone said about two pages back, this has been one of the more fun and interesting threads around here in a while!

I guess we're already seeing a move to higher-resolution cameras and workflows -- 'Superman Returns' used a camera with a 12.4 megapixel, native 16:9 sensor that's the same width as 35 mm film. It hasn't been that many years since George used Cinealtas at just 1920 x 1080, so things are already improving.

In fact, I suspect your estimate on timing is probably slow. I can imagine (although I'm not exactly in a position to predict) that once a few more pictures are done on HD and the idea of HD sort of reaches its "tipping point" (thanks, Malcolm Gladwell), then motion pictures will go the way still cameras have gone in recent years. Could anyone have predicted in 1996 that just 10 years later, digital would be king in the consumer camera market? I bought my first digital camera in 1997, and my friends and family thought digital photography was certainly intriguing, but not ever going to be worth their own hard-earned cash...

Author
Time
The reason I give a rather modest time estimate is that I have heard the argument "film is dead!" so many times over the years that I know it simply won't go away. I have an article from Variety from 1980 which proclaimed "film is dead!" in light of the video developments being made at the time, such as the high-quality Beta format. In fact, in the '70s this argument was made as well. Then in the late 1990s, as HD cameras were first being introduced, the old argument was brought out again: "film is dead!" Well, no--here we are many years later, and not only is film not dead, it is stronger than it has ever been. Kodak's sales of 35mm motion picture stock are higher than they have ever been in the company's 100+ years of existence, and more productions are out there shooting on film than ever before. There is also the fact that many people simply won't want to switch to HD, because they can do the same thing with film, only better--I don't expect Roger Deakins ever to contemplate shooting something on video, no matter how good the developments get.

And this is a large part of the issue: sure, HD will one day be able to emulate film with 95% faithfulness--but film is film, with 100% faithfulness. And as technology develops and HD prices go down, so too do film-to-digital prices--a 2K DI is now affordable and common for indie films, whereas in 1999 it was extraordinarily expensive. So if a low-budget production can shoot on film, there's no reason why a big-budget studio production can't. There are so many institutions and companies well ingrained in the film world, catering to the actual film market, that even if HD comes out with a 6K perfect-looking camera, these companies and individuals are not going to go away overnight; political and economic factors play a big role.

And yeah, the digital still growth is quite impressive--but a lot of professional photographers still don't use digital cameras unless it is for sports- and news-type shooting, where the speed, cheapness and ease of use are a welcome tradeoff for the lost quality. For professional photographers of almost any other kind, e.g. fashion, the majority use 35mm film, and for the really high-quality shoots they have to use medium-format film, because not only can digital get nowhere near the resolution, film simply looks better. When will we see digital equivalents to medium-format cameras that offer the same image characteristics? Probably ten years, plus another five or so for professionals to make the transition. When you consider that digital stills have been around since the late nineties, that makes roughly a thirty-year development period, which is about the same as I estimate for motion pictures.

In motion pictures, the process you talk about in the consumer digital still world has already happened as well. Look at the camcorders on the market--they are all either MiniDV or DVD. I haven't seen Hi8 or 8mm or VHS or VHS-C sold for a long time (although if you look hard you can still find them). Low-budget and no-budget filmmakers also shoot on DigiBeta, BetaCAM, DVCAM, MiniDV or any other form of SD videotape, because the ease of use, speed and inexpensiveness are an acceptable tradeoff for the level of work they need it for. HD is in this category as well--in fact, HD cameras were designed with documentary use in mind, which is why the Sony F900 resembles the long, news-style shoulder-mount cameras. The Arri D-20, which I believe hasn't even come out yet, is really the first HD camera designed with dramatic motion pictures in mind.

Really, the so-called HD revolution hasn't begun yet, because all the efforts thus far have been the equivalent of low-budget filmmakers picking up camcorders and news cameras and making movies with them.
Author
Time
I just want to add my 2 cents here. I am basically with Zombie84 on the topic all the way. I shoot film all the time and I have to say that digital isn't film. In fact, it doesn't and shouldn't have to try to be film.

Film is light. Digital is a bunch of 1s and 0s.

Digital has fixed resolutions. Film is silver halide crystals 'painted' on a strip of plastic (it doesn't really have a "fixed" resolution. Not all the crystals are in the same place for each frame. It moves - it's very organic and a major component in the look of film). Depending on a number of factors (ASA speed, color/BW, reversal/negative) the crystals will be bigger or smaller.

Now here's what I'd like to see: I'd like to see digital try to capture light in a way that film cannot. I'd like to see digital technology not TRY (I stress this word) to act like film, but become its own entity.

In fine arts, you have watercolors, acrylic, oils, not to mention canvas, glass, copper to paint on. All these different tools to give you different feels and looks.

I wish people would stop trying to kill off film and embrace it for what it is. Then I want people to stop embracing digital for what it isn't, and instead explore what it can be.

What’s the internal temperature of a TaunTaun? Luke warm.

Author
Time
My last 2c worth.

Can a theatrical print hold more resolution than HD?
Yes it can but doesn't always. Anyone can make a crappy print, and it depends what you consider HD to be.
Almost any theatrical print that was shot well and printed reasonably holds more than 1080P does, that is easily proved by scanning a frame of any given feature at above 1080P and looking at the detail.
You could however find some features that have around 1080P or less detail wise if you looked hard enough.
Film resolution is dependent on lots of things. Weave reduces the resolution markedly, so depending on the camera used the resolution could be much lower than its theoretical limit.
Loss through the printing process - this also reduces the resolution because of light transfer loss and the grain issue - grain boundaries are different on each frame of film, so detail is lost with each optical process you do.
The prints I have messed with on Star Wars do have better than 1080P resolution, I can state that with certainty.
If you consider HD to be 2K or 4K then it comes down to a case by case basis.

Colour resolution is another thing. Once again, if you consider HD to be 8bit colour in 4:2:0 then any film print will have far better colour resolution. If you consider HD to be 12bit 4:4:4 then you come out about square either way.
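To make that colour-resolution comparison concrete, here's a quick back-of-the-envelope (my own sketch; uncompressed rates, with 1080p at 24 fps assumed):

```python
# Average bits per pixel for Y'CbCr at common subsampling schemes:
# luma is always full resolution; each chroma plane is scaled by the
# subsampling fraction (4:2:0 keeps a quarter of the chroma samples).

def bits_per_pixel(bit_depth, subsampling):
    chroma_fraction = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}[subsampling]
    return bit_depth * (1 + 2 * chroma_fraction)

w, h, fps = 1920, 1080, 24
for depth, sub in [(8, "4:2:0"), (12, "4:4:4")]:
    bpp = bits_per_pixel(depth, sub)
    mbps = w * h * fps * bpp / 1e6
    print(f"{depth}-bit {sub}: {bpp} bits/pixel, ~{mbps:.0f} Mb/s uncompressed")
```

The 12-bit 4:4:4 case carries three times the data per pixel of 8-bit 4:2:0, which is why the two "HD" definitions land on opposite sides of a film print.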

Latitude, dynamic range, ANSI CR whatever you want to call it, film outshines current HD in most iterations of HD. However before I left we were working with some prototype HDR digital cine cameras that *far* outstripped the dynamic range of film (and recorded in OpenEXR format), they were incredible and should make it to market in the next 5 years.

Depth of field. This is probably the thing that makes the biggest difference between film and video as far as 'look' goes. It is also the easiest to solve: it simply means making the chip the same size as the film frame. Then you get the same depth of field, can use the same lenses, etc. Why isn't the 'better' depth of field on small-chip digitals better than film's shallow depth of field?
Well, for dramas, comedies etc. a shallow depth of field is preferable, as it leads the viewer's eye to the subject and makes the backgrounds less distracting. That is a big reason why, up until recently, video always looked like video, even to the total layman.
For some things though a deep depth of field can be preferable, like 3D movies or wide shots etc. or even just for a particular look.

Film is a funny thing; a lot of its attraction is that we grew up with it, and we are comfortable with its 'look'. Had we grown up with a grain-free recording medium (i.e. if film had never existed) and film was then introduced *now*, people would probably want to know what is with all the 'noise' in the picture. They would probably also want to know why the pans are all so blurry (low framerate), and why it can't do true blacks. (OK, perhaps an IB print could, but in a cinema a black frame from a theatrical print will always let some light through and give you grey.)

We all grew up with film, as did the artists who use it, so we have come to find inventive ways to exploit its limitations and they have become 'features' to many of us, and we have grown to love the look.

There is no reason film cannot be emulated digitally: if the resolution were high enough (say 16K) and the HDR system employed, then the rest can be simulated. Organic grain can be emulated exactly if you have enough res to play with (you can either do it algorithmically or scan the grain from a reel of stock and transpose it into the picture). Weave through the gate can also be easily achieved, and DoF is the same if you use the right camera. So it could be done even today if you wanted to throw the time and money at it--but it would be easier to just use film anyway.
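The algorithmic route is simpler than it sounds. A minimal sketch (mine, purely illustrative--real grain emulation shapes the noise by density and colour layer): overlay a fresh random field on every frame, so the "grain" moves frame to frame the way real crystals do:

```python
import random

def add_grain(frame, strength=0.05, seed=None):
    """Overlay per-pixel Gaussian noise on a frame of 0..1 values.
    A new random field each frame mimics the way real grain differs
    from frame to frame (the 'organic' movement)."""
    rng = random.Random(seed)
    return [min(1.0, max(0.0, v + rng.gauss(0, strength))) for v in frame]

flat_grey = [0.5] * 16                 # a flat patch of mid-grey
frame_a = add_grain(flat_grey, seed=1)
frame_b = add_grain(flat_grey, seed=2)
print(frame_a != frame_b)              # the grain pattern differs per frame
```

In practice you'd draw the noise field per colour layer and modulate its strength with image density, but the frame-to-frame randomness above is the part that makes grain read as "alive" rather than as a static texture.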

In practice, though, not enough people in the general public (i.e. the people who pay to go to the movies) care about the intricacies of film; they will be happy if the DoF is right, the picture looks good and they can't see any artefacts. A lot of people prefer a 'punchier' picture (i.e. crushed blacks and high contrast), so even the reduced latitude of current digital doesn't bother them. I think the move away from film will be sudden once the new cameras come through. Film's death has been foretold falsely many times, but never before have there been systems available to do uncompressed 12-bit with standard lenses and identical DoF--it will be too attractive for the mainstream studio stuff. It may never die completely, but I think it will disappear from the mainstream suddenly when it does go.

I agree that if you want the film look, you might as well shoot film, and that digital is a great hope for bringing something new to filmmaking: better framerates, HDR and as-yet-unthought-of possibilities. There doesn't seem to be much point in reproducing the limitations of film--you would be better off trying to exploit the new things that digital offers.

Short answers are though that:

1) In general, for 35mm feature films, even the prints exceed 1080p 4:2:0 in resolution and colour depth in nearly all cases. Once you get into uncompressed 2K or better with 4:4:4 colour, then you could find examples in both camps that would exceed the other.

2) Distribution prints are nowhere near as good as negatives or even the archive prints, but are still very high quality, and still maintain well over 1080 lines of measurable resolution.

3) DVD is far below the quality of any film stock: 720x576 or 720x480, compressed, with colour at 4:2:0, is laughably low resolution, and doesn't even look great on 42" televisions, let alone projection systems. Any argument that film only measures up to DVD is ludicrous. Even the T2 pics I posted show that easily. DVD is the MP3 of the visual world, i.e. it is great for what it is designed for (TV-size viewing / listening on portable music players) but isn't great when pushed (watching on a large screen / listening on a great sound system in a good room). I'll say it again--any commercial feature shot even half-assed on 35mm film exceeds DVD's capabilities, end of story.
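The raw pixel counts make the point by themselves (my own tally; the 2K/4K scan sizes are common full-aperture scanner dimensions, used here as assumptions):

```python
# Pixel counts of the formats discussed above, relative to 1080p.
formats = {
    "DVD (PAL)":  (720, 576),
    "DVD (NTSC)": (720, 480),
    "1080p HD":   (1920, 1080),
    "2K scan":    (2048, 1556),   # full-aperture 2K, a common scanner size
    "4K scan":    (4096, 3112),   # full-aperture 4K
}
hd = 1920 * 1080
for name, (w, h) in formats.items():
    px = w * h
    print(f"{name:11s} {px:>10,d} px  ({px / hd:.2f}x 1080p)")
```

DVD carries roughly a fifth of the pixels of 1080p, and a full-aperture 4K scan carries over six times as many, so "film only measures up to DVD" fails on arithmetic alone.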

4) Generational loss. Optical transfers of film suffer generational loss pretty badly. This is why Lucasfilm went and bought up all the VistaVision cameras: they had to do lots of optical composites for the OT, which meant lots of generational loss with each composite. For non-effects work, though, the generational loss between the negative and the print isn't severe at all.
HD can also suffer generational loss if it is captured using a lossy compression method and you have to do composites. If the frames have effects added then they have to be *recompressed* causing compression artefacts and detail loss. This isn't a problem if using a lossless/uncompressed workflow however. It is then usually recompressed with a different codec for digital projection which can cause another 'generational' loss. Once again however, if it gets to an uncompressed workflow from start to finish then digital will be lossless and generational issues will become a thing of the past.

5) Digital projection also currently can't do blacks and has a limited dynamic range, and limited resolution (in some cinemas) compared to film (or even CRT in some cases)

When it comes to the whole analogue vs digital debate, to say one is "better" than the other is to say that Jet is better than The Strokes, or oil paintings are better than watercolours, or that Heavy Metal is better than Punk...


For our purposes here, if you could find a mint Technicolor print, and could have a professional run it through a high-end scanner like the Arri with Digital ICE, then you would get a better result than the current DVD.

A standard theatrical print transferred by an untrained individual on lower end equipment would probably fall short of the DVD, but would be interesting.

As I said before, first step would be to find a blinder of a print, until you find that, the rest is just fantasy.


Author
Time
That sounds like the last word to me.
Author
Time
Laserman, it is so good to see you back =]
I've been gone for a bit and just caught up on this thread. All I have to say is film is way better than HD, and whoever thinks the other way around is crazy. I mean, I am 15 and even I think film is far better.
Author
Time
Originally posted by: Laserman
5) Digital projection also currently can't do blacks and has a limited dynamic range, and limited resolution (in some cinemas) compared to film (or even CRT in some cases)
I heard, just yesterday, that digital projection now can do blacks, with 0% measurable light in pure-black pixels.
Some were not blessed with brains.
Originally posted by: BadAssKeith

You are passing up on a great opportunity to makes lots of money,
make Lucas lose a lot of his money
and make him look bad to the entire world
and you could be well known and liked

None of us here like Lucas or Lucasfilm.
I have death wishes on Lucas and Macullum.
we could all probably get 10s of thousands of dollars!
Author
Time
I'm bumping this up again so that everyone can reread boris' ridiculous remarks here.
Author
Time
Good ol' fashioned telecine:
Get a Sony Z1 HDV camera ($5,000), plus a good computer with an HD capture card, a RAID, and a couple of terabytes of storage (several thousand $). Set up the camera on a tripod, project the print on a small, clear screen, run component HD out of the camera (uncompressed) into the computer, and then do cleanup in post.

Cheaper version:
If you can't do uncompressed, you can record in HDV mode (25 Mb/s MPEG2 in HD) and it'll take up less than 30 GB and still look great. This doesn't need to cost a lot--I'm sure we could find someone with a 35mm print and coerce him or her into letting us shoot it.
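Quick sanity check on that 30 GB figure (my own arithmetic; the 2.1-hour runtime is an assumption--HDV is a constant 25 Mb/s stream, so size scales linearly with runtime):

```python
# HDV storage estimate: constant-bitrate stream, so size = rate x time.
bitrate_mbps = 25              # HDV transport stream, megabits per second
runtime_hours = 2.1            # assumed feature runtime incl. credits
gigabytes = bitrate_mbps / 8 * runtime_hours * 3600 / 1000
print(f"~{gigabytes:.1f} GB")
```

That lands in the low-20s of GB, comfortably under 30 even with some slack for retakes.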

I must have overlooked something, so feel free to put your two cents in. I don't think we'll get the OOT in HD, and the longer we wait, the more the prints will deteriorate. I think a solid bootleg with a 3-chip HD camera and some fixes in post would look fantastic, especially compared to laserdisc.
Author
Time
Originally posted by: mcfly89
Cheaper version:
If you can't do uncompressed, you can record in HDV mode (25 Mb/s MPEG2 in HD) and it'll only take up less than 30 GB and still look great.


The problem with filming projected film off a screen is synchronizing the projector's frame changes with the camera... otherwise you might catch a frame in between two projected frames, resulting in a blur. Alternatively, you could play the film back at 1/4 speed, so that you'd most probably end up with at least one clean capture of each frame. To increase resolution you could shoot the movie in sections... top-left, top-right, bottom-left, bottom-right... and stitch them together in post. Problem: lens distortion. The edges of the frame might distort a bit, depending on the lens used, making it difficult to comp the four sections together.
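That quarter-speed trick could be partly automated in post: capture at roughly 4x oversampling, then keep only the sharpest capture in each group, since the "in-between" captures straddling a frame change will be blurred. A minimal sketch in NumPy; the gradient-energy sharpness measure and the group size of 4 are my own assumptions, not anything from the thread:

```python
import numpy as np

def sharpness(frame):
    """Gradient energy: blurry 'in-between' captures score lower."""
    gy, gx = np.gradient(frame.astype(float))
    return float(np.mean(gx**2 + gy**2))

def pick_best(frames, group=4):
    """From each group of oversampled captures, keep the sharpest one."""
    return [max(frames[i:i + group], key=sharpness)
            for i in range(0, len(frames), group)]
```

With the film at 1/4 speed and the camera running at its normal rate, each film frame yields about four captures, and at most one of them should straddle a pulldown, so picking the sharpest per group discards the blurred ones.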

Generally filming off of a screen isn't the best way, since you lose a lot of light, color and general quality.

Author
Time
There's a reason film scanners (and old skool telecines) cost a fortune: you can't achieve comparable results taping off a screen.
Author
Time
Filming it off a screen is not at all the same as a telecine. The differences in exposure alone are enough to rule it out, plus colour issues, noise, distortion of all kinds, and frame sync. Individual camera models also have their own issues--the Sony Z1 has some very bad colour rendering issues, and especially noise and problems with black levels. It's also HDV and not true HD, if I am not mistaken.
Author
Time
HDV is true HD, but it is compressed. The reason I picked the Sony Z1 is that it's cheap, and it can output uncompressed HD through component RGB. I have an FX1 (the cheaper sibling of the Z1) and the image clarity is outstanding, and that's in HDV mode, with compression. The frame sync sounds like a tough one to get around, but it may be worth looking into, since it may be our only hope of seeing Star Wars in HD. I'll see if I can get access to a projector and use my Sony FX1 for a test on some random film.
Author
Time
Originally posted by: mcfly89
HDV is true HD


No, that's not correct... I think. True HD would be 1920 x 1080, while HDV works at an anamorphic 1440 x 1080, which has to be stretched horizontally to reach the correct 16:9 aspect ratio.
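The stretch is just pixel aspect ratio math: HDV stores 1440x1080 with 4:3-shaped pixels, so each stored pixel is displayed a third wider than it is tall. A quick check of the numbers (nothing here beyond what's stated above):

```python
from fractions import Fraction

STORED_W, STORED_H = 1440, 1080   # what HDV actually records
PAR = Fraction(4, 3)              # pixel aspect ratio of HDV's anamorphic storage

display_w = STORED_W * PAR        # 1440 * 4/3 = 1920
dar = Fraction(int(display_w), STORED_H)
print(display_w, dar)             # 1920 16/9
```

So the displayed raster matches full HD's width, but the horizontal detail actually sampled is only 1440 pixels per line.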
Author
Time
Originally posted by: mcfly89
Good ol' fashioned telecine:
Get a Sony Z1 HDV camera ($5,000), a good computer with an HD card, RAID, and a couple terabytes (several thousand $). Set up the camera on a tripod, project the print on a small, clear screen, run component HD out of the camera (uncompressed) into the computer, and then do cleanup in post.

Cheaper version:
If you can't do uncompressed, you can record in HDV mode (25 Mb/s MPEG2 in HD) and it'll only take up less than 30 GB and still look great. This doesn't need to cost a lot--I'm sure we could find someone with a 35mm print and coerce him or her into letting us shoot it.

I may have overlooked something, so feel free to put your two cents in. I don't think we'll get the OOT in HD, and the longer we wait, the more the prints will deteriorate. I think a solid bootleg with a 3-chip HD camera and some fixes in post would look fantastic, especially compared to laserdisc.


Couple of problems.

1) Finding a 35mm projector that you can fire onto a small screen.
2) Your ANSI Contrast Ratio drops through the floor by filming the projected image, so a *lot* of detail is completely lost.
3) The Sony records an interlaced image, the film moves in the gate between fields so every frame will have interlace 'stepping' problems. You will need to re-align all of the fields in post. Not impossible but problematic. You will also need to stabilise the footage unless using a pin registered projector.

4) You could use a Decklink HD card - does the sony actually put out 4:4:4 uncompressed via RGB or component? I really didn't think it did.

5) You will get a turdload of frames with the shutter in shot - you would have to replace or remove the shutter in the projector - you would still get thousands of frames that are 'inbetween' frames.

6) "Do a cleanup in post." Sounds easy when it is put that way, but cleaning up 170,000 frames of HD takes a bit of work. Film is *dirty* and even if professionally cleaned is still *dirty* and pretty much unwatchable on TV, especially in HD. Once again, can be done, but is a hell of a lot of work.

7) Exposure. This is the biggy, there is no way to capture anywhere near the full exposure range in a single pass using a projector and a camera, multiple passes would have to be shot, aligned and merged (similar to the HDR process used by photographers) to get an even remotely acceptable exposure.

8) The projector will also need the gate widened if it isn't already.

You could end up with something usable, it may be better than the OUT release even, but this method would be unlikely to even come close to the official DVD release.

But hey, if you can find someone with the projection equipment and a good print, give it a go with HDV, there is nothing to lose except buckets of time if you already have the camera.
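Laserman's point 7, the multi-pass exposure merge, can be sketched in a few lines. This is a toy version of the idea, not the full Debevec-style HDR pipeline: shoot the same frame at several shutter settings, weight each capture toward its well-exposed pixels, and blend the radiance estimates. The function names and the hat-shaped weighting are my own assumptions:

```python
import numpy as np

def merge_exposures(captures, exposure_times):
    """Blend several captures of one frame into a single radiance map.

    captures: list of float arrays in [0, 1], one per exposure pass.
    exposure_times: relative shutter time for each pass.
    """
    num = np.zeros_like(captures[0], dtype=float)
    den = np.zeros_like(captures[0], dtype=float)
    for img, t in zip(captures, exposure_times):
        # Hat weighting: trust mid-tones, distrust crushed blacks and clipped whites.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        num += w * (img / t)   # estimate scene radiance from this pass
        den += w
    return num / np.maximum(den, 1e-6)
```

The alignment problem Laserman mentions is the real killer, though: unless every pass is pin-registered, the merged frames will ghost, so this only works on top of rock-solid registration.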


Author
Time
I did a very rough test with an old 8mm projector, just to see the frame-blurring, and it is prominent. There may be a work-around, but I don't have access to a 35mm projector and won't be able to do a serious test unless I find someone who does.

When looking into the cost of buying a 35mm projector (several thousand $), it occurred to me that it would be cheaper to pay for a true telecine. I found a quote through Google for about $1,000 for 120 minutes, and that includes cleaning. Not bad, assuming we can find a print and the telecine house is willing to transfer copyrighted material.
Author
Time
Originally posted by: mcfly89
I did a very rough test with an old 8mm projector, just to see the frame-blurring, and it is prominent. There may be a work-around, but I don't have access to a 35mm projector and won't be able to do a serious test unless I find someone who does.

When looking into the cost of buying a 35mm projector (several thousand $), it occurred to me that it would be cheaper to pay for a true telecine. I found a quote through Google for about $1,000 for 120 minutes, and that includes cleaning. Not bad, assuming we can find a print and the telecine house is willing to transfer copyrighted material.



And I guess this forum has enough users to chip in a buck or two to pay for the whole thing... I'd be in.
Author
Time
I've never been able to find a telecine shop that would touch a commercial film, but your mileage may vary.
Author
Time
Originally posted by: Laserschwert
Originally posted by: mcfly89
HDV is true HD


No, that's not correct... I think. True HD would be 1920 x 1080, while HDV is working at the anamorphic 1440 x 1080, which has to be stretched to the correct 16:9 aspect ratio.



There are a number of HD standards, covering resolution, frame rate, and interlaced versus progressive scanning. If you want to look only at resolution, Lucas shot AOTC using Panasonic Varicam HD cameras, at a resolution of 1280 × 720 (less, considering he CROPPED the 16:9 image to 2.35:1). Most interlaced HD (like the Sony camera) doesn't record more than 1440 samples per line, but I've upsampled to 1920 and deinterlaced and the image is still stellar.

Obviously, $5,000 HD is not the same as $90,000 HD, but the compromises don't make it any less "true." That would be like saying DV isn't "true SD" because it's compressed, or because it's shot with a chip with fewer lines of resolution than the broadcast standard.
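To put raw numbers on that comparison, here is my arithmetic on the resolutions named in the thread (the 2.35:1 crop height is approximate, and the 1280 × 720 figure is the poster's claim about AOTC, taken at face value):

```python
# Raw stored-pixel counts for the formats mentioned in the thread.
formats = {
    "720p as claimed for AOTC (1280x720)": 1280 * 720,
    "AOTC after 2.35:1 crop (approx.)": 1280 * round(1280 / 2.35),
    "HDV stored (1440x1080)": 1440 * 1080,
    "Full HD (1920x1080)": 1920 * 1080,
}
for name, px in formats.items():
    print(f"{name}: {px:,} px")
```

By pixel count alone, HDV's stored raster carries well over twice the pixels of a 2.35:1 crop out of a 720p frame, which is the core of mcfly89's argument here.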