
Attack of the Clones 35mm - on eBay, bought - and now project thread (a WIP) — Page 5

Author
Time
 (Edited)

more from the Blu-ray screencaps site that are similar to my frames, for you to compare with (more and more it looks like either these caps were taken from a computer that didn’t do full decoding or had the JPGs too compressed, or that Lucasfilm actually did some filtering and various jiggering around with the image):

https://i1.wp.com/caps.pictures/200/2-starwars2/full/starwars2-movie-screencaps.com-352.jpg
https://i0.wp.com/caps.pictures/200/2-starwars2/full/starwars2-movie-screencaps.com-592.jpg

I guess my scans just have tons more contrast and a different luminance curve.
Now I’d need to go look at the 35mm frames themselves: did my scanning capture them properly, or did it get the wrong curve?

It will be interesting to see how this project turns out. Hopefully they have a scanner and everything calibrated perfectly to the film stock; if so, it will be interesting to see which it ends up looking more like, the Blu-ray or my scans here.

Author
Time

The Blu-ray has significantly different color from the DVD; better to compare with the HDTV version.

Author
Time
 (Edited)

Ah dang, I forgot: I got rid of my 35mm AOTC reel (the final one, with the different marriage scene) when I heard the Blu-ray was coming out. Arrrrr.
Now that was silly.

I did keep Reel 6 from ROTS (a super quick peek at it just now suggests it is in excellent condition, basically fresh, with the reel intro, countdown markers and taping intact). So that is actually the only full reel I have, along with a few trailers, including the AOTC trailer the pics above came from and a 1977 SW trailer that has far less red shift than most, though still shifted.

Author
Time

“Once those are done, I will move onto AOTC, once I have the HDDs to store the scan on. It will be later this year, couldn’t put a timeframe on it at the moment.”

What is happening to the print reels once scanned later this year?

Author
Time

Hey, I was wondering whether anyone is currently restoring this project in 4K? Would be happy to donate if necessary.

Author
Time
 (Edited)

AotC was digitally shot in 1.5K… So there is really no need to restore it in 4K.

Author
Time

The film was shipped to a gentleman in LA with connections to Poita. I have not heard from Poita himself for quite a while (I hope all is well). At this point, I wish I could offer more information, but that’s all I can say at the moment.


Author
Time

ZigZig said:

AotC was digitally shot in 1.5K… So there is really no need to restore it in 4K.

WHY LUCAS WHY! You could have done both digital and FILM! 😦

Author
Time

ZigZig said:

AotC was digitally shot in 1.5K… So there is really no need to restore it in 4K.

It was shot in HD and then cropped. Restoring it in 4K from film will keep the full detail that made it to film when the digital image was printed out. It is also the only way we will be able to compare it frame by frame to finally find all the changes. The in-theater camera recording lacks the resolution to really be sure. We’ve found a few, but Rick McCallum indicated there could be more.

Author
Time
 (Edited)

It was shot with a 2/3″ 3-CCD EFP camera, which captured 3:1 compressed 1440 x 1080 (= 1.5K).

Author
Time

ZigZig said:

AotC was digitally shot in 1.5K… So there is really no need to restore it in 4K.

It was indeed shot at less than 2K; however, motion picture film should still be scanned at 4K or more. The reasons are many, but in broad strokes: digital video resolution describes the size of the ‘container’ that holds the image. During film-out, the digital image for any given frame is printed onto a piece of film, and film is a very different ‘container’ than digital. These two kinds of ‘containers’ differ far more than most consumer-facing explanations let on. They often say film’s resolution equivalent is somewhere around 4-6K, but that is a poor analogy because the two systems are fundamentally different: film’s grain structure, and thus its image structure, is continuous, whereas digital is discrete. The image does not retain its discrete nature once it’s on film; it is now rendered by the film as continuous. So to capture that image as well as possible, a higher scanning resolution is needed. I’m not suggesting the image ‘gains’ quality from the film-out process; I’m saying that when you transfer a continuous film image to digital, you lose spatial resolution if you scan at, say, 2K. Basically, film and digital are not 1:1; in fact they are far from it. In other words, whether an image on film came from a digital film-out or from an optical/photochemical process, you lose quality by scanning at lower resolutions. The size of these losses is debatable, but remember that the pixel resolution of an image is only one part of its image resolution, especially when transferring back and forth between mediums.
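To make that concrete, here is a rough numerical sketch (my own illustration, not anything from this project; all the sizes are arbitrary). It models one scanline of film as a very fine grid and compares how much of the fine grain/detail variation survives a 2K-wide scan versus a 4K-wide scan:

```python
# Illustrative only: a quasi-continuous "film" scanline, scanned by averaging
# the film under each scanner pixel. All sizes here are made up for the demo.
import numpy as np

rng = np.random.default_rng(0)
fine = 16384                                   # stand-in for continuous film
x = np.linspace(0, 1, fine)
row = np.sin(2 * np.pi * 40 * x)               # smooth picture content
row += 0.3 * rng.standard_normal(fine)         # very fine grain structure

def scan(signal, width):
    """Crude scanner model: each output pixel averages the film beneath it."""
    return signal.reshape(width, -1).mean(axis=1)

def detail(signal):
    """Amount of pixel-to-pixel variation left in the scan."""
    return np.std(np.diff(signal))

print("film grid detail:", detail(row))
print("4K scan detail:  ", detail(scan(row, 4096)))
print("2K scan detail:  ", detail(scan(row, 2048)))   # lowest of the three
```

The 2K scan averages away more of the grain-level variation than the 4K scan does, which is the kind of loss of spatial resolution being described.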

Author
Time
 (Edited)

I respectfully disagree.

Even if the original master had been shot in 6K, after going through a traditional interpositive, then an internegative, then the release-print stage in the lab, it wouldn’t stay above 2K. Add the weave and lack of pin registration in most projectors, plus less-than-optimal focus, and you could easily wind up well below 1080 resolution.
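Purely as an illustration of why those losses stack up (the per-stage factors below are invented for the example, not measurements of any real lab chain):

```python
# Hypothetical numbers only: if each duplication/projection stage keeps a
# fraction of the effective resolution, the losses multiply down the chain.
stages = {
    "interpositive": 0.8,
    "internegative": 0.8,
    "release print": 0.8,
    "projector weave / focus": 0.7,
}
effective = 2048          # pretend the master is worth a clean 2K
for name, keep in stages.items():
    effective *= keep
    print(f"after {name:<24} ~{effective:.0f} lines")
```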

Furthermore, considering that the master was digitally shot at only 1.5K, there is really nothing to gain at 4K but dust and film grain.

Author
Time
 (Edited)

These look great with the film grain. It was added in post, if I recall correctly?

EDIT: ZigZig cleared things up for me regarding the resolution AOTC was shot at; sad we can’t truly get better. Thanks! Looking forward to your 4K release of TPM.

Author
Time
 (Edited)

ZigZig said:

It was shot with a 2/3″ 3-CCD EFP camera, which captured 3:1 compressed 1440 x 1080 (= 1.5K).

It is a little more complicated than that. The image that camera captures is 1920 x 1080, but part of the image (I believe the Y portion) is less. But when you examine the final image, each pixel is distinct from the others. It wasn’t until the next generation (used on ROTS) that it was true 1920 x 1080, meaning each pixel was uniquely captured and recorded instead of the partial processing the Sony HDW-F900 used. But the key thing is that the final image was 1920 x 1080 HD, not 1440 x 1080. If you research the camera, that is very clearly stated and backed up by the resulting image. And HD is not 2k. The 2k format is slightly higher resolution. After ROTS was shot, Panavision developed a lens to compress the image so the 1080 lines didn’t need to be cropped for the wider aspect ratios Lucas used for Star Wars. The resolution for both AOTC and ROTS is 1920 x 816; ROTS has more color depth.

https://cinequipt.com/cms-files/sony-hdw-f900-brochure.pdf

Author
Time
 (Edited)

Again, I respectfully disagree… the resolution of AotC is definitely 1440x800.

Keith Walters said:

His choice of format caused many a sideways glance among those who actually understood these things at the time: It was the Sony HDW – F900; a ½” * 3-CCD EFP camera which captured 3:1 compressed 1440 x 1080 component video in “SR”, a bastardised “segmented” Tape format. That basically means each progressively scanned frame is converted into a pseudo-interlaced format, and each “field” is recorded as two separate JPEG-like images, (which does NOT give the same result as storing the whole frame as a single image).

Since SW2 was to be displayed as 2.35:1, and Panavision were not able to come up with the promised Anamorphics to work with a Prism-splitter 3-CCD camera, the movie was shot letterboxed, so the master images were only 1440 x (about) 800. At the time, Cinema video projectors were very thin on the ground, which meant the vast majority of punters wound up watching a 4th generation film print, struck from a master video image with considerably less resolution that a 4th generation film print struck from 35mm negative! And there weren’t no Arrilasers then either, just a lot of rather dodgy CRT video printers.

A few years before this epoch-marking event we’d already been told that the then-new HDW – 750 was already a “Replacement for 35mm film” and we laughed hysterically, so hence we were left wondering what had been done to the aforesaid 750 to give us the F900.
Well … apart from adding 150 and an “F” to the model number … not a lot….

Well anyway, Boy George went on to produce exactly the sort of results we said he’d get, and nothing daunted, he then proceeded to sever all ties with the aforesaid Panavision and pitched woo to a new upstart startup called “Plus8 Digital” (nee “Plus 8 Video”) to equip his next instalment: SW3 “Revenge of the Sith”.
This time he used Sony HDC-F950 cameras - still 1/2” prism jobs, * but with true 1920 x 1080 recording, which produced noticeably better pictures than Episode 2, (by now the Arrilaser had become available which also helped) but still crap compared to Episode 1, which was still shot on film….
(Plus8 Digital then proceeded to go broke and were eventually bought by Panavision, ROTS apparently being the only feature of any significance to be shot on their brace of expensive new cameras…)

(https://cinematography.com/index.php?/topic/63610-star-wars-episode-2-a-millstone-in-cinematic-history/)

yotsuya said:

And HD is not 2k. The 2k format is slightly higher resolution.

About 2K vs. HD: I never wrote anything about HD, I just said that it wasn’t shot in 2K.

Author
Time

ZigZig said:

Again, I respectfully disagree… the resolution of AotC is definitely 1440x800.


yotsuya said:

And HD is not 2k. The 2k format is slightly higher resolution.

About 2K vs. HD: I never wrote anything about HD, I just said that it wasn’t shot in 2K.

That’s why I included the brochure for the camera Lucas used on AOTC. It was the HDW–F900, and according to the brochure your estimate is way off: 1440 x 1080 is 1,555,200 pixels, and the camera is rated at 2,200,000 pixels. The pickup device is listed as a 3-chip 2/3-type FIT CCD. My understanding is that the yellow chip was indeed 1440x1080, but the other two were 1920x1080, resulting in an image that is almost as good as the next generation of cameras, but not 1440x1080. The final cropped image is 1920x816, exactly what we get on the Blu-rays. It pays to investigate and read the documentation on the camera used on the film. The brochure I linked is copyrighted 2002, so it is not some later, updated product but the very one used for AOTC. This topic has been discussed fully before and I remember most of the details. So the HDW–F900 was slightly inferior, but once you print it to film, as all the FX shots were going to be anyway, and make the distribution prints, viewers can’t tell the difference.
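For reference, the raw pixel counts behind the figures being thrown around here (the 2.2-million number is the brochure’s; what it maps to in practice is exactly what is being debated):

```python
# Straight arithmetic on the numbers cited above; no interpretation implied.
subsampled = 1440 * 1080      # HDCAM's stored luma raster
full_hd    = 1920 * 1080      # a full 1080-line HD raster
brochure   = 2_200_000        # pixel figure quoted from the Sony HDW-F900 brochure

print(f"1440 x 1080 = {subsampled:,}")   # 1,555,200
print(f"1920 x 1080 = {full_hd:,}")      # 2,073,600
print(f"brochure    = {brochure:,}")
```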

While the o-neg itself can produce nice crisp images that benefit from being scanned at very high resolution, distribution prints fall somewhere below 1080p, so these digital cameras enabled cutting-edge digital editing and digital intermediates, and they made the whole movie match in quality. We know better today with our DLP projectors and 4K TVs, but in 2002 they were not looking that far forward, just as many filmmakers never imagined that some of the tricks they used, once obscured by the old optical printing process, would be revealed by modern digital scans. We have surpassed the quality level they planned for. And until digital FX started being done higher than 1920x1080 or 2K, there wasn’t much point in the rest of the movie being at a noticeably higher resolution.

And my comment that HD is not 2k was just a general comment, not a reply to you.

Author
Time
 (Edited)

@yotsuya: Again, I respectfully disagree.
AotC was definitely shot in 1440x800.

I think the important part is this:

Since SW2 was to be displayed as 2.35:1, and Panavision were not able to come up with the promised Anamorphics to work with a Prism-splitter 3-CCD camera, the movie was shot letterboxed, so the master images were only 1440 x (about) 800.

Lucas didn’t plan to use a stock HDW-F900, but a ‘Panavised’ one (HDW-F900F). But Panavision didn’t deliver in time, so Lucas had no choice but to crop his shots.

Furthermore, the HDCAM SR tape format was not yet available, so he had to use a ‘bastardized’ HDCAM 4:2:2 (instead of 3:1:1, but not SR), limited to 1440x1080.

So the final cropped shots were 1440x800 (which is still HD).
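As an aside, the various frame heights quoted in this thread (roughly 800, and 816-818) all fall out of simple aspect-ratio arithmetic on a 1080-line frame; a quick sketch of my own:

```python
# A scope crop keeps only part of the 1080-line frame; the display width is
# 1920, so the kept height is 1920 divided by the aspect ratio. HDCAM then
# stores each of those lines as 1440 luma samples, hence "1440 x (about) 800".
for ar in (2.35, 2.39):
    lines = 1920 / ar
    print(f"{ar}:1 crop of a 1080-line frame keeps about {lines:.0f} lines")
# 2.35:1 -> ~817 lines, 2.39:1 -> ~803 lines
```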

Some other interesting quotes:

this meant that, unfortunately as is the case with digital masters in general, 1440x1080 would remain 1440x1080 until the end of Time

(https://www.redsharknews.com/technology/item/2990-how-george-lucas-pioneered-the-use-of-digital-video-in-feature-films-with-the-sony-hdw-f900)

According to a comment on the Wikipedia talk page:

The actual resolution of Attack of the Clones is not 2k, but just 817x1440 pixels. This is because the HDCAM format subsamples the 1920 horizontal lines to 1440. The 1080p aspect ratio of the camera only applies when the 16:9 aspect ratio is used. To produce the 2.39:1 aspect ratio, the top and bottom of the image are cropped, reducing detail. This cropping is why Spy Kids 2, (shot with the same camera) looks better then attack of the clones. Spy Kids used the native 16:9 aspect ratio and thus used all the pixels of the camera. (Anamorphic lenses could have allowed the full 1080 lines to be used, but were not available for the HDW-F900.) . --Algr (talk) 19:17, 27 June 2019 (UTC)

(https://en.wikipedia.org/wiki/Talk:Star_Wars:Episode_II%E2%80%93_Attack_of_the_Clones)

And the final word goes to ILM HD Supervisor Fred Meyers himself:

With the earlier equipment, RGB color from the camera was converted into 4:2:2 YUV format when it was recorded. This format effectively slices the color bandwidth in half because one color value represents more than one pixel. The result is fewer chroma (color) samples than luma (luminance). This chroma sub-sampling combined with spatial sub-sampling effectively reduced HD’s 1920 resolution to 1440 for luma and 960 for chroma.

(https://boards.theforce.net/threads/were-the-cameras-used-on-2-and-3-really-that-bad.50033313/#post-52654498)

JEDIT: ChewieLewis is right, most of the movie is CGI, which is not tied to the HD cameras.
IIRC, the CGI was rendered in 2K, so there is really no need to scan AotC in 4K, which is the main question here.

Author
Time

I disagree about a 4K scan being pointless. It doesn’t matter what resolution the original digital source was. The print needs to be scanned to preserve the unique version of the film, and the scan would benefit from the higher resolution, which better reproduces the grain on the print. Scanning in 4K also means it can be released in native 4K for those who want to watch a 4K version.

TPM has what, 70% or more shots with digital effects? That’s most of the film that’s maxed out at 2K. But it’s still worth a 4K scan of the print.


Author
Time
 (Edited)

Plus there’s the fact that the “pixel grid,” for lack of a better term, that’s printed on the film probably won’t line up exactly with the pixel grid of a 2K scanner, meaning some original pixels could be lost during scanning. Scanning in 4K makes this much less of an issue, since each original pixel on the print would be covered by up to four scanner pixels.
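A toy model of that point (mine, not from this thread): sample a row of alternating printed pixels with a scanner grid that is shifted by half a printed pixel, once at 1:1 density and once at 2x.

```python
# Crude area-sampling scanner model; one printed pixel = 1 unit wide.
import numpy as np

printed = np.tile([0.0, 1.0], 8)        # a row of alternating printed pixels

def scan_row(pixels, samples_per_pixel, offset):
    """Sample `pixels` on a grid `samples_per_pixel` times denser than the
    printed grid, shifted by `offset` printed-pixel widths."""
    fine = np.repeat(pixels, 100)                     # quasi-continuous row
    step = 100 // samples_per_pixel
    starts = np.arange(int(offset * 100), len(fine) - step + 1, step)
    return np.array([fine[s:s + step].mean() for s in starts])

print(scan_row(printed, 1, 0.5))  # 1:1 scan, half-pixel misalignment: all ~0.5
print(scan_row(printed, 2, 0.5))  # 2x scan, same misalignment: pattern survives
```

With the misaligned 1:1 grid the alternating pixels average out to a flat 0.5; at 2x the original on/off pattern is still fully recoverable.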

Author
Time

ZigZig said:

@yotsuya: Again, I respectfully disagree.
AotC was definitely shot in 1440x800.

Lucas didn’t plan to use a stock HDW-F900, but a ‘Panavised’ one (HDW-F900F). But Panavision didn’t deliver in time, so Lucas had no choice but to crop his shots.

The Panavision lens was to compress the vertical image so the full 1080 lines were used. The camera is rated for 1920x1080, not 1440x1080; that was the previous camera. The Panavision lens vertically compresses that 1080 into the region AOTC and ROTS cropped down to 816 (or 818, by some sources), giving 262-264 more lines of vertical resolution to the widescreen frame. And yes, the color levels were compromised compared to what came after, but the pixel resolution was not. I did some tests on what impact it would have on a frame if the yellow channel were horizontally compressed (1/3 of the image), and there is more noise from compression artifacts than from doing that. And looking at the image of many films, the yellow layer is the lowest-resolution of the three (if you study how film is made, there are some interesting tricks that give us what we perceive as full color without giving each of the three colors equal clarity).

And if what you say is true, the evidence should be there in the frames. I isolated a frame that lacks any FX (not easy to do in those two films), and if what you say is true, I should be able to compress any frame to 1440x1080 and expand it back to 1920x1080 with no quality loss. Well, there is quality loss. That process degrades the image in a detectable way. It is not readily apparent to the naked eye, but it is there. I ran the same process on a couple of other images (not from movies) and they show the same level of detail and the same degradation if I compress them the same way. I don’t see any evidence that the image was reduced to 1440 and expanded, while I do see evidence that it wasn’t.
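For anyone who wants to repeat that kind of check, here is a minimal sketch of the round trip using Pillow and numpy. The file name is a placeholder for whatever clean frame grab you are testing, and Lanczos resampling is just one reasonable choice, not necessarily what was used above:

```python
# Squeeze a full-width frame to 1440 samples wide, expand it back, and measure
# the damage. "frame.png" is a placeholder for your own FX-free frame grab.
import numpy as np
from PIL import Image

original = Image.open("frame.png").convert("RGB")        # e.g. a 1920-wide frame
w, h = original.size

squeezed = original.resize((1440, h), Image.LANCZOS)     # HDCAM-style 1440 width
restored = squeezed.resize((w, h), Image.LANCZOS)        # back up to full width

a = np.asarray(original, dtype=np.float64)
b = np.asarray(restored, dtype=np.float64)
mse = np.mean((a - b) ** 2)
psnr = 10 * np.log10(255.0 ** 2 / mse) if mse else float("inf")
print(f"PSNR after the 1440 round trip: {psnr:.2f} dB")
# A source that had already been through a 1440-sample stage should survive
# this round trip with very little loss; clearly measurable loss is the kind
# of evidence described above.
```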

But regardless of the exact resolution of the image that was printed to 35mm film, scanning at 4K is the best way to get a good result from the print. Scanning at the exact resolution of the printed image will lose quality, whereas scanning at a higher resolution (and then reducing it properly if needed) will preserve it.

Author
Time
 (Edited)

yotsuya said:

And if what you say is true…

Hi yotsuya,

I’m sorry, I don’t want to argue with you, I’m just reading and believing what ILM HD Supervisor Fred Meyers said about this matter, and I’m assuming it is true: “This chroma sub-sampling combined with spatial sub-sampling effectively reduced HD’s 1920 resolution to 1440 for luma and 960 for chroma.”

About the ideal scanning resolution, as I said before: even if the original master had been shot in 6K, after going through a traditional interpositive, then an internegative, then the release-print stage in the lab, it wouldn’t stay above 2K. Add the weave and lack of pin registration in most projectors, plus less-than-optimal focus, and you could easily wind up well below 1080 resolution.

Believe me, I’m currently scanning The Phantom Menace in 4K: there is nothing to gain over 2K (and Harmy seems to think the same).

Author
Time
 (Edited)

ZigZig said:

Believe me, I’m currently scanning The Phantom Menace in 4K: there is nothing to gain over 2K (and Harmy seems to think the same).

I have spent years scanning photos, and you always want to scan higher and then reduce after the scan. A movie is just a series of 180,000 photos. The post-scanning image handling tools are much more sophisticated than the scanning tools, so it pays to capture more and then reduce to what you really want. It also helps during the repair process (removing dirt and scratches). I’ve had to repair a number of old photos with missing corners, and enlarging them 2x, doing the repair, and then shrinking them back to the original size helps hide the signs of the repair and results in a better end product. So scanning at 4K and then fixing the dirt and scratches will give a better final product than scanning at 2K or HD.
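A minimal sketch of that “work big, then shrink” idea using Pillow; the file names are placeholders, and the median filter just stands in for whatever dust/scratch cleanup you actually use:

```python
# Enlarge the scan, do the (crude) cleanup at the larger size, then shrink
# back, so the repair work itself gets smoothed out on the way down.
from PIL import Image, ImageFilter

scan = Image.open("scan_frame.png")                       # placeholder file name
w, h = scan.size

big     = scan.resize((w * 2, h * 2), Image.LANCZOS)      # work at 2x
cleaned = big.filter(ImageFilter.MedianFilter(size=3))    # stand-in dust pass
final   = cleaned.resize((w, h), Image.LANCZOS)           # back to scan size

final.save("scan_frame_cleaned.png")
```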

As for the resolution, it appears to be a recording device limitation, not the camera. And they used a number of different recording devices. Some were probably limited to the resolution you are describing, and the couple of frames I checked may have been done with higher-end equipment. And when you consider how the scenes in many movies are made, the quality has always varied a bit depending on whether the editor/director wants to use the scene as shot or crop it for a better picture. I remember noticing in 1997 that the front shot of Luke looking at the binary sunset had more grain than the surrounding shots, indicating it was originally a wider shot. That one shot of Qui-Gon Jinn in the Council chamber from TPM is just awful because it was digitally shot and then cropped.

Author
Time
 (Edited)

yotsuya said:


I have spent years scanning photos, and you always want to scan higher and then reduce after the scan. A movie is just a series of 180,000 photos. The post-scanning image handling tools are much more sophisticated than the scanning tools, so it pays to capture more and then reduce to what you really want. It also helps during the repair process (removing dirt and scratches). I’ve had to repair a number of old photos with missing corners, and enlarging them 2x, doing the repair, and then shrinking them back to the original size helps hide the signs of the repair and results in a better end product. So scanning at 4K and then fixing the dirt and scratches will give a better final product than scanning at 2K or HD.

It depends on the quality of the 35mm. My guess, based on what I see on the 35mm of TPM, is that you’ll get something far below the sharpness of 1080p.

As for the resolution, it appears to be a recording device limitation, not the camera.

It is both a limitation of the camera (read carefully: it had to be cropped to 1440x800 due to the lack of the promised Panavision lenses) and of the tape format.

Anyway, it confirms that the original shots are limited to 1440x800.

Author
Time
 (Edited)

Like I said, I’ve read that they used different recording formats and not all of them were limited. I think you are right that some shots were limited to 1440x818, but others weren’t. But also, as you have pointed out, it was a hybrid format that was part 1440 and part 960; mathematically, when you put those two together, you end up with a 1920 image where every pixel is distinct. And when you have a film where nearly every shot is an FX shot done digitally at 2K, a few shots that look a little soft are not going to hurt it. Look at ANH: a great many FX shots in that film are soft and lack the crispness we expect today. So overall I don’t think the film is as low quality as you are making it out to be, and it definitely isn’t a flaw in the camera. They have continued to use it for other films, and I bet that with the right recording equipment and lenses it gives a nice HD picture. You are focusing on one instance where it wasn’t as high quality as they expected.

Elsewhere I have read that they used a number of different recorders, some very advanced and others portable. And it’s not as if the people involved in making a film always tell the full story, or as if interviews are fully accurate. Unless you have talked to him in person, you can’t take what he is quoted as saying as 100% accurate in all instances; other places have reported slightly different information, and both can be correct. I will admit that if one tape format was inferior and only did 4:3 HD recordings, that is the kind of thing a member of the team would remember and relate. But the camera specs and the reports of multiple types of recording equipment being used lead to the conclusion that some shots are compromised and others are not. I would guess the location work, such as Tatooine, is where you will find the lower-quality recordings; portable technology is always behind bulkier studio technology.