
Star Wars coming to Blu Ray (UPDATE: August 30 2011, No! NOOOOOOOOOOOOOOOOOOOOOOOO!!!!!!!!) — Page 34

Author
Time
 (Edited)

Some would say the advantages of 4K are a myth:

So 4K is not these 8 mega pixel or 9 mega pixel or 10 mega pixel CMOS images for the Bayer pattern where they add up all the pixels in a row and say hey, we got 4K. The great perpetrators of that mythology have been RED and Dalsa. That's why I call these “marketing pixels." It's intentional obfuscation. Because they really do nothing to improve image quality. They may improve sales volume. But they don't do anything to quality.

But somehow the world has accepted that that's 4K. It's purely semantic. It's like saying, “I don't like my weight in pounds so I converted to kilos. It sounds better!” You'd be amazed at how many non-technical people I meet, often producers and directors, but sometimes even cinematographers get fooled by that stuff.

<SNIP>

So if you had true 4K resolution in your local theater, everybody would have to be sitting in the first 6 rows. Otherwise they wouldn't see any extra detail. Their eyes wouldn't LET them see it. You know this intuitively from passing by these beautiful new monitors at trade shows. You find yourself getting absolutely as close as possible to see the detail, and to see if there are any visible artifacts. At normal viewing distances, you can't.

So the whole 2K 4K thing is a little bit of a red herring.

Creative Cow: What do you think about IMAX as a filmgoer?

John Galt: I don't like the frame rate. I saw Gorillas in the Mist and the gorillas were flying across the forest floor. Every frame they seemed to travel like 3 feet. [laughs]. It's really annoying. I mean I loved Showscan: 70mm running at 60 fps. In terms of a sense of reality, I think it was far superior to IMAX.

That's why I subscribe to Jim Cameron's argument, which is we would get much better image quality by doubling the frame rate than by adding more pixel resolution.

 

 

http://magazine.creativecow.net/article/the-truth-about-2k-4k-the-future-of-pixels

 

I saw Star Wars in 1977. Many, many, many times. For 3 years it was just Star Wars...period. I saw it in good theaters, cheap theaters and drive-ins with those clunky metal speakers you hang on your window. The screen and sound quality never subtracted from the excitement. I can watch the original cut right now, over 30 years later, on some beat up VHS tape and enjoy it. It's the story that makes this movie. Nothing else.

kurtb8474 1 week ago

http://www.youtube.com/all_comments?v=SkAZxd-5Hp8


Author
Time
 (Edited)
JAMES CAMERON:
Because people have been asking the wrong question for years. They have been so focused on resolution, and counting pixels and lines, that they have forgotten about frame rate. Perceived resolution = pixels x replacement rate. A 2K image at 48 frames per second looks as sharp as a 4K image at 24 frames per second ... with one fundamental difference: the 4K/24 image will judder miserably during a panning shot, and the 2K/48 won't. Higher pixel counts only preserve motion artifacts like strobing with greater fidelity. They don't solve them at all.

Read more: http://www.variety.com/article/VR1117983864.html?categoryid=1043&cs=1#ixzz0zsxWPYR3
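Just to put the quote's numbers side by side (a rough sketch only; the frame sizes below are the usual DCI containers, which the quote itself doesn't specify):

# Back-of-the-envelope check of "perceived resolution = pixels x replacement rate".
cases = {
    "2K @ 48 fps": (2048 * 1080, 48),
    "4K @ 24 fps": (4096 * 2160, 24),
}
for name, (pixels, fps) in cases.items():
    print(f"{name}: {pixels * fps / 1e6:.0f} million pixel-updates per second")
# 2K/48 works out to roughly half the raw pixel throughput of 4K/24, so the
# quoted equivalence is about perceived sharpness in motion (less judder and
# strobing), not a literal equality of the formula.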



Author
Time

In fact, everything talking about 4K belies the fact that most of the theater installations around the world are basically going at 2K. I mean the only commercial 4K digital cinema projector that I am aware of is the Sony 4K projector. But the bulk of theatrical installations around the world are the Texas Instruments DLP. And its maximum resolution is 2048x1080. I mean, let's face it. The difference between 1920 and 2048 is 6%. Believe me, you cannot see a 6% difference. Six percent is irrelevant.

http://magazine.creativecow.net/article/the-truth-about-2k-4k-the-future-of-pixels
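Incidentally, the 6% figure in the quote is easy to check; it comes to about 6.7%, and because the height is 1080 in both cases the total pixel-count difference is the same ratio:

# Width difference between HD (1920x1080) and DCI 2K (2048x1080).
hd_width, dci_2k_width = 1920, 2048
print(f"{(dci_2k_width / hd_width - 1) * 100:.1f}% more horizontal pixels")  # ~6.7%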



Author
Time

Yeah, but what I can see (and believe me, I can, because I do prefer to sit in the front 6 rows and have very good eyesight) is the projection grid (well, mainly just in white areas). Hell, I even see the grid on my laptop's 15" monitor now. If the projection was 4K, I would see the extra detail and I would not see the annoying grid in the whites...

Author
Time

Douglas Trumbull's new Showscan digital is using 60fps and 120fps. It's kinda scary.

They can blab on and on about how much clarity HD can give etc.

It still isn't the same. The technology is lifeless, their work is lifeless, and the videos (not films!) are lifeless.

70mm versus digital? If you say digital then you need your head examined.

VADER!? WHERE THE HELL IS MY MOCHA LATTE? -Palpy on a very bad day.
“George didn’t think there was any future in dead Han toys.”-Harrison Ford
YT channel:
https://www.youtube.com/c/DamnFoolIdealisticCrusader

Author
Time
 (Edited)

James Cameron is full of shit. You can see the difference between a 35mm negative and 1920x800 with your eyes. What he's talking about is digital 4K camera capturing, which is different than scanning a film.

I saw Inception digitally a couple weeks ago and the picture quality was awful. Tons of artifacts, and a general softness. I forgot about it once the movie got going, because it's an absorbing piece of cinema, but once you start examining the image you can see how weak it is. Of course projection is not the same as capturing. Resolution isn't the be-all and end-all for film; otherwise everyone would be shooting on 65mm and IMAX. Especially when it comes to video--don't forget, DVD and VHS have the exact same resolution, but one looks like shit and the other can be projected in a small screening room to pretty good quality.

However, as it relates to scanning a film from 35mm in 4K, this is a separate issue. You can film something in 1920x1080 today and have it look pretty good because the sensors of the cameras have improved. The prequels look like shit not just because of the resolution, but because of the sensors, which were 2/3" CCDs with 4:2:2 colour. But when you start comparing scanning 35mm in HD and 4K, it's a totally different game. An HD scan simply cannot retain the same amount of picture information that a 4K scan can, because there is more information on the negative than 1920x1080 is capable of displaying. This talk about HD cameras versus 4K cameras is misleading because a telecine machine is not the same as a motion picture camera.

Author
Time
 (Edited)

Actually, what he was clearly saying was that you don't ever see 4K resolution in theatres, because a 35mm print is 4 generations from the negative and the resolution of the prints is no more than 1080p. So 1080p (in terms of resolution) is the same or better than what you see in theatres from 35mm. And I just saw Inception too; it was projected from film and I'd say the level of detail was about as high as 720p, so I must agree with that. But it was great; film has that special, warmer feeling to it, which digital just lacks. (And the film itself was awesome too, one of the best I've seen in the last few years actually.)

Author
Time

And btw, I don't think that VHS and DVD have the same resolution. I don't remember the exact numbers, but VHS is about half the resolution of DVD (and LD is somewhere in between, closer to DVD).

Author
Time

Unfortunately, AMC has phased out real film projectors. Luckily, I would guess the other chains have not just yet.

A Goon in a Gaggle of 'em

Author
Time

1080p is nowhere close to the level of detail a 35mm camera negative has.  Hell, I'd go so far as to say that an original camera-negative 35mm frame has at least 8K worth of detail, probably more.

Now release prints, those have somewhere between 2K-4K worth of detail (probably nudging closer to the 2K side).

Even 16mm has more detail than 1080p (though not nearly as dramatic as 35mm).

So yeah, Cameron's full of shit.

So you don't think I'm full of shit, too:

I worked on a grad film last year.  The director was contemplating shooting 16mm, 35mm, 1080p/24, or RED 4K.  So we did tests for each.

The 16mm and 35mm were scanned in at 4K and 2K (we hadn't decided what to do about the DI at that point, so we did both).

For the 16mm test, you couldn't tell the difference between 2K and 4K.  They looked pretty much the same.

For the 35mm, the 4K was the clear winner.  You could pick up much more detail than you could in the 2K scan.  This was my vote (film on 35mm, do a 4K DI).

Then we compared the 35mm 4K, 2K, the 1080p/24, and the RED 4K.

The 35mm 4K blew them all out of the water.  The 35mm 2K looked better than the 1080p, but not as crisp as the RED 4K.  The RED footage looked very good, but it didn't have the "feel" of film that I love so much.  But as I said, it did look better than the 35mm 2K, as the RED footage was shot at 4K.

While my vote was still 35mm scanned at 4K, the director decided to go with the RED.  He did a second comparison between RED 4K and RED 2K.  The RED 2K looked a little better than the 1080p, but not as good as either 35mm scan.

Against my protests, the director decided it "looked good enough," and went with the RED 2K, simply because the post-production process would be simpler and cheaper.

What was the point of that rant?

Oh, yeah.  There is a point to 4K.  It looks better.  And even when you downscale it to 1080p, you can tell.

The only thing Cameron has right is that release prints, after they're run for a while, do have less detail than a 1080p digital "print" would, as digital files don't degrade with use like film does.  If you release your film digitally, it will always look the same, no matter what.  That, in my opinion, is the most convincing argument for digital.

Author
Time

bkev said:

Unfortunately, AMC has phased out real film projectors. Luckily, I would guess the other chains have not just yet.

What?!

NNNNNNNNNNNOOOOOOOOOOOOOOOOOOOOOO!

That just made me very sad, as all the theaters in my area are AMC's.  :-(

Author
Time

Again, that is all surely true, but you keep talking about the o-neg, while the question is what I as a consumer see in the cinema. If the film is shot at 4K digital and projected at 4K digital, I'll basically see in the cinema the equivalent of what the o-neg is for film; it will be the best possible quality for the film (of course there are other factors to digital projection than just resolution, so not entirely, but close). When I watch a 35mm print, I watch the film in 4-generations-worse quality, and the situation is very similar to watching a 1080p downscale of a 4K digital film. The best situation here would probably be to have the film shot on 35mm, then scan the o-neg at 4K or even 8K and project that digitally at 4K.

Author
Time

Rather, they are in the process of phasing them out and expect to be done by 2012 according to this press release. Naturally, with LA being a big-time market, at least all of my theaters have already been converted.

A Goon in a Gaggle of 'em

Author
Time

What you see in a theater on opening day, the first (or second, or third) time a print is run, is probably somewhere between 2K-4K quality.

Today, though, any film print will have been made from a 2K or 4K DI anyway, so it doesn't really matter.  If you're only concerned about the most detail, see it digitally with a 4K projector.

I, however, prefer seeing something that was shot on film projected on film, for aesthetic reasons, regardless of whether or not the 4K digital theater across the hall has more detail or not.  If it was shot digitally, it should be seen digitally.

Basically, I want to see it as close to the way it was shot as possible.  Which is why I don't see films in IMAX unless, like The Dark Knight, at least part of the film was shot on IMAX.

But if you want to capture Star Wars the way it was seen in theaters in 1977, it's likely to be somewhere between 2K and 4K quality.  Unless you saw it in 70mm, in which case it was probably above 4K quality.

Author
Time

AMC switching completely over to digital projection by 2012 is good news to me and I'll tell you why. Like ChainsawAsh said, every new movie these days goes through a DI anyway, so you're not really gaining anything by seeing it on film.

Secondly, let me give you a couple stories:

I saw Star Trek on opening day and then went back to see it again less than two weeks later. There was a line running down the right side of the print during the last act. Yeah, it's really thin and easy enough to ignore if the scene is relatively dark, but when Spock was flying his ship above the BIG BRIGHT EARTH through the BIG BRIGHT ATMOSPHERE, the line was really noticeable and just made me think "wow, that wouldn't be there if I was seeing this in digital projection."

Then there's my Inception story, oh boy ....

Didn't see it until the Monday after it opened. For starters, there was a thread/fiber/hair/whatever stuck in the gate, so on the lower center of the screen there was this shape sticking out and moving every 24th of a second, really distracting especially during those bright white snow scenes towards the end. It was there from when the trailers started and never went away throughout the entire movie (as a side note, I've learned that if you ever notice something weird about the projection during the trailers then you'd better say something because that's how it's gonna be during the movie). Also, about 30 minutes into the movie the projector turned off and then started up again a few seconds later.

Funny thing is, there was also a screw-up with the one movie I saw projected digitally this summer. Splice was shot in Super 35 but framed at 1.85:1. In other words, it was shot Godfather III style. Whoever was in charge must've thought it was framed at 2.35:1, forgotten to open up the screen, who knows. Subtitles got cut off, tops of heads were getting cut off to an unusual extent (even for Super 35) and - even though I hadn't seen the movie before - it was pretty clear that there was picture information we weren't seeing. For instance, in an establishing shot where we're supposed to see a car driving across the horizon .... we only hear it. I also noticed the odd framing during, ahem, that scene (you know the one I'm talking about if you've seen the movie).

Author
Time
 (Edited)

 

Gotta agree with Harmy

What was on the o-neg of Star Wars in 1977 is not what people saw on the release IPs in 1977!

What all of us saw in the cinemas in either '77, '78, '79, '81, '82 or '97 was nothing more than 500-800 lpph (not that much more than standard definition).

 

This report performed tests:

http://www.etconsult.com/papers/Technical%20Issues%20in%20Cinema%20Resolution.pdf

and this was their verdict:

Film theoretically has very good resolution capabilities. What is delivered to the theatre is another story. If we believe the ITU tests, then images captured at almost 2400 lines per picture height on the camera negative deliver significantly degraded on screen resolution through the projection system – in the range of 500 – 800 lines per picture height. 500 lines corresponds to about 9 line pairs per degree from 2 screen heights.
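For what it's worth, the "about 9 line pairs per degree" figure in that verdict checks out with simple trigonometry (a quick sketch, assuming the stated viewing distance of two screen heights):

import math

# 500 lines per picture height, viewed from two screen heights away.
screen_height = 1.0                  # arbitrary units; only the ratio matters
viewing_distance = 2 * screen_height
vertical_angle_deg = 2 * math.degrees(math.atan((screen_height / 2) / viewing_distance))
line_pairs_per_degree = (500 / 2) / vertical_angle_deg
print(f"{vertical_angle_deg:.1f} degrees, {line_pairs_per_degree:.1f} lp/degree")
# -> roughly 28 degrees of vertical viewing angle and ~8.9 line pairs per degree.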

 

ChainsawAsh wrote:

But if you want to capture Star Wars the way it was seen in theaters in 1977, it's likely to be somewhere between 2K and 4K quality.  Unless you saw it in 70mm, in which case it was probably above 4K quality.

 

I saw SW in '81 and '83 on the big screen and it looked fantastic ... compared to my 1982 rental tape!!

And that is the problem. Old school cinema only had to compete with crappy 70s/80s TVs and video systems. But now the playing field has been redefined.

A 1080p/24 presentation would be better (but not necessarily superior) than a late-70s/early-80s Deluxe (or Technicolor) 3rd/4th generation InterPositive release print.

This study confirms it:


To create an equivalence to the release prints tested by ITU, if the pixels on screen are “1 to 1” with resolution, current 1280 x 1024 projectors are adequate. Oversampled displays will substantially reduce pixelization, driving towards 2K x 1K display requirements to satisfy an equivalence to 500 to 800 lines per picture height.



Author
Time
 (Edited)

ChainsawAsh said:

What you see in a theater on opening day, the first (or second, or third) time a print is run, is probably somewhere between 2K-4K quality.

Today, though, any film print will have been made from a 2K or 4K DI anyway, so it doesn't really matter.  If you're only concerned about the most detail, see it digitally with a 4K projector.

I, however, prefer seeing something that was shot on film projected on film, for aesthetic reasons, regardless of whether or not the 4K digital theater across the hall has more detail or not.  If it was shot digitally, it should be seen digitally.

Basically, I want to see it as close to the way it was shot as possible.  Which is why I don't see films in IMAX unless, like The Dark Knight, at least part of the film was shot on IMAX.

But if you want to capture Star Wars the way it was seen in theaters in 1977, it's likely to be somewhere between 2K and 4K quality.  Unless you saw it in 70mm, in which case it was probably above 4K quality.

You refuse to see films in IMAX but you think Star Wars in 70mm has better quality? IMAX and 70mm are the same principle. In both cases you are seeing a 35mm original blown up to a duplicate format approximately double its original size. And in both cases, I doubt you actually gain any resolution.

 

As for the argument for digital projection--Cameron is right there. Most prints you see are not in 4K when first run. Resolution does not degrade the more a print is run; either a print will resolve four thousand lines of resolution or it won't, and the fact is that a print will never, ever give you 4K resolution. It is roughly HD resolution, and in many cases between 720 and 1080 resolution. There were studies conducted by a European cinema group [EDIT: I think that is what was linked above], and they did tests across dozens of theatres with different subjects, and they found that on average viewers could only discern 800 lines of resolution in a print. Now, I take issue with this, as I have seen prints that have more picture information in them than their HD counterparts, so this tells me that prints can outperform HD. However, not all prints are the same--even the reels differ. Some reels are sharp and some are soft within the same film. This is because it varies as they are printed, due to light pollution and registration among other things. I would say that in a best-case scenario, if you saw a perfect print from a limited run (which, because they make only a couple hundred prints and not 50,000, tends to be much better quality), it would be in the 2K range. But otherwise, your typical theatrical print could not possibly be in the 4K range, just because it is not possible to retain that much information when it is (at least) three generations removed from the negative.

However, as I mentioned previously, digital projection often tends to "look" digital (you can see the video artifacts), which is why I prefer film most of the time. It's not because it is digital per se; if you saw it from the DI itself it would look great. It's the HD downconvert and the projectors used. When I saw Inception last week it looked bad, even though the resolution was about the same as a film print. However, when I saw the Final Cut of Blade Runner in 2007, it was one of the best-looking projections I have seen in my entire life, because it was from a premium projector in a single-screen theatre, not those typical multiplex ones. Maybe it was a 4K projector, but I don't know if any commercial films were being shipped to theatres in 4K.

Also, though, and I will get into this as it relates to film, just because something is being projected in 1920x1080 doesn't mean that's what the resolution is. That is the size of the image, true, but film and video aren't measured the same way; because film has no pixel resolution, you have to measure the resolving power of both. A lot of HD downconverts for multiplexes simply do not have a thousand lines of vertical resolution when projected. The picture may technically be that pixel size, but if the movie were preceded by a lens chart you might discover that it only resolves about 800 lines. The digital projection of Inception I saw might have been such a case. That's why it is misleading to just look at resolution and say "this number is bigger": resolving power is different from resolution, and film has no fixed pixel size, so what you will be comparing is what actually ends up being captured.

On a similar train of thought, you also have to keep in mind that while 35mm films can resolve more than 4K resolution, in a practical sense they often don't. The conventional wisdom is that 35mm film resolves about five thousand lines. But film isn't like digital video, where the resolution is fixed--the resolving power of film depends on what you put into it. There are two main factors that contribute to this: 1) the lens, and 2) the film stock. If you shoot on a 500 ASA stock with a 200mm zoom lens, you aren't going to get more than 2K on the negative. But if you shoot on a 100 ASA fine-grain stock with something like a 50mm Cooke S4 prime, you will probably get all five thousand glorious lines of resolving power on that negative.
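To illustrate how the lens and the stock gang up on you, one common photographic rule of thumb combines their resolving powers as 1/R_system = 1/R_lens + 1/R_film. The numbers below are invented purely to show the trend, not measurements of any real lens or stock:

# Illustrative sketch only; the lp/mm figures are made up for the example.
def system_lp_per_mm(lens_lp, film_lp):
    return 1 / (1 / lens_lp + 1 / film_lp)

frame_width_mm = 24.9  # approximate Super 35 aperture width

for label, lens_lp, film_lp in [
    ("soft zoom + fast 500 ASA stock", 50, 80),
    ("sharp prime + fine-grain 100 ASA stock", 200, 200),
]:
    lp = system_lp_per_mm(lens_lp, film_lp)
    lines = 2 * lp * frame_width_mm  # line pairs/mm -> total lines across the frame width
    print(f"{label}: ~{lp:.0f} lp/mm, ~{lines:.0f} lines")
# The first combination lands well under 2K; the second lands near 5,000 lines,
# which is roughly the range being discussed above.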

However, stock and lenses today are not at all the same as the ones from eras past. In fact, in the 1990s, film stock became so sharp that cinematographers started complaining it was beginning to look like video, which is part of what prompted the move to grainy, coarser looks like Saving Private Ryan. And the lens technology of the last 20 years is absolutely incredible--with stuff like the Cooke S4 series, the sharpness you get is just incredible. So, a movie from the 1970s like Star Wars, because it was shot on older lenses and stocks that aren't as clear and defined as today, might only pick up 4K picture information in the best of instances. It's hard to say without examining the negative itself, but that is generally what should be the case.

That's why you can't say "film = [this resolution]". Maybe it does for one film, but on another film it will be totally different. 35mm has no fixed resolution; the resolution is the lens and the stock, and in ideal conditions with modern stocks and lenses it performs at about five thousand lines of resolving power, but in practice it will fluctuate depending on the production and even the shot.

Also bear in mind that most films are done with 4K DIs these days, so even in the rare case where you exceeded 4K resolution, the actual negative you print out will be 4K resolution, because that is what the camera negs were scanned at. So, cool, you used the best lenses and stocks and I count 5500 lines on the lens chart, that is great, but the negative becomes a 4K digital file, so you threw away the difference, and then prints and downconversions just get degraded from there. In most cases, though, an 8K scan is unnecessary. In the traditional world, also, if we're doing straight film to film, the negative is not the completed movie; it would have no colour correction, and any opticals would already be second generation, so by the time your pristine interpositive is printed it's a mix of second and third generation film material and probably only shows just above 2K resolution.

So, uh, I've gone on an aimless tangent here, but what I'm trying to say is that Cameron does have a point, because pixel count ("resolution" ... which really isn't an accurate term in the sense that we are using it) isn't everything, and his notion of frame rate and perceived resolution is interesting and probably correct. But I don't think anyone who has worked with film would say HD can beat 35mm film. I did a bunch of tests in 2007 using the camera Attack of the Clones was shot on, among others, against 35mm, and the difference was night and day. That was 2007, though, and things have improved greatly since then, but I still find it hard to believe. ChainsawAsh's own tests seem to confirm my suspicions.

Author
Time
 (Edited)

The thing is, there are many different factors involved. In other threads we discussed the new transfer of Aliens and how important the quality of the transfer is. What I'm trying to say is that under ideal conditions you shouldn't be able to tell the difference between a high definition digital projection of scanned film and a projection from a good 35mm print. A high quality digital projection of a really well done digital transfer should look exactly like a projection of a very pristine 35mm print. It doesn't have to be one or the other; they are both just different ways of achieving the same thing.

Author
Time

@Zombie

This is a pretty cool discussion and I know where you are coming from mate.

I agree that a 2K scan of the 35mm o-neg would not be as good as a 4K scan of the same 35mm o-neg.

But every official study I have seen claims that the all-important release prints are less than 1K:

 

 

 

http://www.efilm.com/publish/2008/05/19/4K%20plus.pdf

I think this explains why Lucas went for 1080p for the prequels.

A digital release at 1080p is better than a 4th gen 35mm release print (which is what the original trilogy was).

 

 

 

 



Author
Time

Oh, and concerning the frame rates, I can't agree with Mr. Cameron. It is only in 3D that the frame rate really matters, because of motion blur. In 2D the human brain perceives motion blur as natural, and when you increase the frame rate it looks more fluid but less film-like. 3D, on the other hand, tries to look less like film and more like reality, and that is how the brain perceives it, so the motion blur is felt as a disruption; that's why they need to raise the frame rate, to minimize the motion blur while retaining the illusion of fluid movement.

I have a TV with the 100Hz function, which basically means that the TV shows 100fps even when you watch a 25fps or 30fps video, making up the missing frames more or less successfully, which results in smoother, more fluid movement. But when this function is used on a movie, it immediately ceases to look like film and starts to look like a soap opera, so to me the current frame rate is part of the magic of film and I'd hate to see that change.
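To make it concrete, here is the crudest possible version of what those sets are doing: synthesizing in-between frames. Real 100Hz/motion-flow processing estimates motion vectors rather than just blending neighbouring frames (which is why fast motion can look rubbery), but the blending sketch below shows where the extra frames come from:

import numpy as np

def naive_double_rate(frames):
    # Crude frame-rate doubling: insert a 50/50 blend between consecutive frames.
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(a.dtype))
    out.append(frames[-1])
    return out

# Toy example: three 2x2 grayscale "frames" become five frames at double the rate.
frames = [np.full((2, 2), v, dtype=np.uint8) for v in (0, 100, 200)]
print(len(naive_double_rate(frames)))  # 5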

Author
Time

Pretty interesting. It's too bad all I see in the theater is the light from all the fucking iPhones and BlackBerrys anyway, which makes all this very academic. As for Star Wars, if Episodes II and III have such a low "ceiling", how viable are they really for a big 3D release? I mean, by the time they get to that, the standards will be pretty high and the tickets will be 20 bucks. In other words, it's going to cost 100 bucks to take your kids and slog through Attack of the Clones again, so how will they make it worthwhile?

Author
Time
 (Edited)

@ Danny Boy:

Yes, an HD cam to HD projection should have better resolution than a 35mm theatrical print.

In theory.

In practice, it is not. In 2002, for example, digital projection was a joke. It looked awful. It still often looks mediocre today, and in 2002 it was experimental. So, yes, if you could take the HD DI from the computer at Lucasfilm and put that on the screen, it would be better. But that's not what you see. First of all, you have a file conversion that gets compressed, so what is possible to see in a theatre is not the same as what came from the computer. But more importantly, with digital the projector is a key element. With film, the film is the element and the projector is just a light to shine behind it. But with digital the projector is actually creating the image. So, you have a compressed file that is being shown on a projector that loses resolving power and introduces tons of artifacts. If you saw AOTC digitally in 2002, for example, you would have noticed that the stars were square blocks, because that's how the projector or file interpreted them.

So it seems an unfair comparison. People say "oh, 35mm resolves 5000 lines but in practice it is only 1080 when you get to a print". Fair enough. But HD can resolve 1080 lines, and when you get to a projection it's about 700 and it looks like ass. So make the comparisons fair. But as I said, resolution isn't even the full picture here, because in the case of Lucas you have the camera sensor, which breaks up in dark areas, can't display gradients, doesn't have any dynamic range, has no shallow depth of field because it is a 2/3" CCD, and has 4:2:2 colour space (plus it limits your shot choices, and takes more time and money to set up). This is why I always thought Lucas was a fucking moron for shooting the prequels digitally. No DP ever shot digitally because they knew all the limitations, technical and aesthetic. All the early adopters were directors, like Lucas, Rodriguez and Bryan Singer, but they weren't trained professionals; all of their DPs tried to talk them out of it, and they were right.

The picture is changing nowadays of course. But there are still issues to be worked out.

Author
Time

Actually, as far as my knowledge of the matter goes, films for cinematic projection are delivered to cinemas on huge external HDDs because they are uncompressed HD. A losslessly stored digital film at 1080p will come to somewhere around 300GB, which by today's standards is not that much; a 300GB HDD is considerably cheaper than 2hrs worth of 35mm stock, and the transfer of the digital data to the HDD is virtually free, so there is no reason for the cinemas to project from compressed sources; that wouldn't make any kind of practical sense.
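As a sanity check on that figure (rough arithmetic, assuming 8-bit RGB and a 2-hour running time): truly uncompressed 1080p/24 actually comes to about a terabyte, so a ~300GB delivery implies some compression; real DCI cinema packages are JPEG 2000 encoded at up to 250 Mbit/s, which over two hours tops out around 225GB:

# Rough arithmetic for a 2-hour 1080p/24 feature (8-bit RGB assumed).
width, height, bytes_per_pixel = 1920, 1080, 3
fps, seconds = 24, 2 * 60 * 60

uncompressed_bytes = width * height * bytes_per_pixel * fps * seconds
print(f"Truly uncompressed: ~{uncompressed_bytes / 1e12:.2f} TB")  # ~1.07 TB

dcp_max_bytes = 250e6 / 8 * seconds  # DCI JPEG 2000 cap of 250 Mbit/s
print(f"DCI maximum over 2 hours: ~{dcp_max_bytes / 1e9:.0f} GB")  # ~225 GB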

But I agree that the picture quality depends greatly on the quality of the digital projector in question and also on the size of the screen. Where film simply looks less detailed on a bigger screen, digital will reveal the atrocious projection grid, which really annoys me...

Author
Time

Also, I'm not sure if that is common practice, but it would make a lot of sense to use the same anamorphic principle used for film on HD projections; in other words, have the 2.35:1 picture stretched to fill 1920x1080 and unsqueeze it optically upon projection.
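The potential gain from that is easy to quantify (a sketch following the post's premise of a 1920x1080 container; whether a given projection chain actually does this is another matter):

# Vertical samples available for a 2.35:1 picture inside a 1920x1080 container.
width, height, aspect = 1920, 1080, 2.35

letterboxed_height = round(width / aspect)          # hard-matted inside 16:9
print(f"Letterboxed: 1920 x {letterboxed_height}")  # 1920 x 817

gain = height / letterboxed_height - 1
print(f"Anamorphic squeeze to full height: ~{gain * 100:.0f}% more vertical samples")  # ~32%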

Author
Time

Zombie84:

I was simply saying that a 70mm blow-up release print of Star Wars would have more detail than a 35mm release print, due to its larger size being able to capture more of the detail from the internegative than the 35mm print would.

As I don't have any real experience with 70mm, I could be wrong about that.

 

Oh, and about 120Hz or 240Hz or whatever displays - I hate them.  I was at Best Buy yesterday, and they were showing Avatar on Blu-ray with the 240Hz setting turned on.  Every movement looked rubbery and fake.  And there were people there commenting on how great it looked.  I just hung my head and walked away.