
Speculation about the 4K Future of AOTC and ROTS

Author
Time

I hope some of you with the technical insight about these things will chime in.

As we all know, AOTC and ROTS were shot with then-state-of-the-art digital cinema cameras rather than on film, the former being the first major movie shot completely digitally. This arguably did the industry a favor by pioneering or what have you, but the films seem to pay the price by having substantially less picture information than their film-based predecessors to hand off to increasingly higher-resolution remastering.

AOTC and ROTS were shot, mastered, and distributed in 1080p, and short of re-rendering the CGI created around 2001-2004 (which I imagine would be wildly impractical), they seem to be locked into that resolution forever, since there is no additional picture information to pull out of the masters, or even out of the ‘original negative,’ as it were.

I know that the new UHD 4K format offers more than just a sheer resolution increase, like HDR and better color depth. What I want to know is whether anything at all about the new format could actually benefit movies like AOTC and ROTS, beyond what the upscales people do around here (like emanswfan’s) already achieve. Would the studios just be doing exactly what they are doing here? Or does the new format allow color detail, if not outright resolution detail, to come through better from these two movies?

I wonder, too, if they’ll do TPM justice in 4K if its two sibling movies can’t be helped.

My stance on revising fan edits.

Author
Time

The overwhelming majority of current movies are still finished at 2K resolution. That’s not much more than 1080p, so an upscale will be just fine.

Han: Hey Lando! You kept your promise, right? Not a scratch?
Lando: Well, what’s left of her isn’t scratched. All the scratched parts got knocked off along the way.
Han (exasperated): Knocked off?!

Author
Time

AFAIK, AotC and RotS, though shot in 1920 by 817, would’ve been finished at cinema 2k res for scope (2048 by 858) and had their vfx shots rendered out at that res as well. For the entirely cgi shots, this would’ve been relatively straightforward. Shots involving live action footage were presumably upscaled very slightly, but I could be wrong about that. It’s possible that ILM worked on all the live-action shots at their native res and they weren’t upscaled to “true 2k” until the final DI pass.

The other thing to keep in mind is that the blu-rays are not the best these movies can look. Lucasfilm would presumably have the 2k, 10/12-bit, 4:4:4, DCI-P3 digital cinema source master files on a hard drive somewhere in the vault whereas the blu-rays are 1920, 8-bit, 4:2:0, rec 709 hd video. These movies can look a lot better than their current home video iterations. Even if we take the limitations of 1080p bd into account, I’d still really love to know how AotC turned out looking so tinted and blurry.
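
Just to put rough numbers on that gap (back-of-the-envelope only; the 12-bit 4:4:4 master spec is my guess from above, the blu-ray numbers are the active scope picture, and real files are compressed anyway):

```python
# Rough uncompressed bits-per-frame comparison: assumed DCI master vs. blu-ray video.
def frame_bits(width, height, bit_depth, subsampling):
    luma = width * height
    chroma_fraction = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[subsampling]
    return (luma + luma * chroma_fraction) * bit_depth  # luma plus both chroma planes

master = frame_bits(2048, 858, 12, "4:4:4")   # assumed digital cinema source master
bluray = frame_bits(1920, 817, 8, "4:2:0")    # active scope picture on the BD
print(f"master: {master / 8 / 1e6:.1f} MB/frame, blu-ray: {bluray / 8 / 1e6:.1f} MB/frame")
print(f"roughly {master / bluray:.1f}x more raw data per frame in the master")
```

Obviously a toy calculation, but it gives a sense of how much headroom the source master has over what made it to BD.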

There’s actually already been at least one movie shot in 1920×1080 released on UHD: Resident Evil: Afterlife. The pixel count of the cameras used on AoTC/RotS isn’t the problem; it’s more the other specs, like color sub-sampling and whatnot. But like I said, these movies were still finished at 2k.

TPM, though shot on film, had vfx added to almost every last shot. This would’ve been done at 2k. So the “original negative,” for all intents and purposes, is the set of finished rendered frames that were filmed out to actual 35mm negative 20 years (and at least a couple weeks) ago today. The digital filmout tapes were still readable in 2011 and were used to rebuild the movie.

It’s worth noting that 2k vfx were the norm until only very recently. Even a movie like After Earth, which was shot and finished in 4k, only had its vfx rendered at 2k. The effects for Blade Runner 2049 (which Harmy worked on) were finished at 3.4k for the live-action shots (to match the native res of the cameras) and at 4k for the fully cg shots. TLJ had its vfx rendered out at 2048 by 1718 so that when it was “unsqueezed” by 2x (just like the cinemascope camera footage) it would be a full 4096 wide.

Author
Time

Thanks for that answer; it clears up some things I was wondering about. The color bits/range/whatever should stand to be improved in almost any film moving from BluRay to UHD, it sounds like.

My stance on revising fan edits.

Author
Time
 (Edited)

It is like putting a PAL master on BR. For those in the UK there will be a little improvement over DVD, but for us in the States it is a vast improvement.

Same with these two movies. Upscaling them and releasing them in 4k, even without any other processing, will produce something that looks far better than the BR release. For one thing, the compression artifacts get hidden by the increased pixel density.

Author
Time

If that recent report from an “unnamed UK source” has any truth to it, AotC and RotS were actually the most challenging of the first six movies to remaster for the UHD format.

Author
Time

So you’re saying that they were good enough for big screen projection but not for current home media standards? I don’t get the mechanics of this. Not that I would care to own them… just curious.

Author
Time

I have a question about the color bit (depth?). Can you explain it to me like I’m a five year old? How does that differ from the resolution of the image? Is it that there are more colors to choose from, or something about how they are displayed?

My stance on revising fan edits.

Author
Time
 (Edited)

Hal 9000 said:

I have a question about the color bit (depth?). Can you explain it to me like I’m a five year old? How does that differ from the resolution of the image? Is it that there are more colors to choose from, or something about how they are displayed?

In binary you only have two “numbers” to work with, 0 and 1.

8-bit is 2 to the 8th power, or 256, for each primary color. 256 cubed (since there are three primary colors: red, green, and blue) gets you a little over 16 million possible colors with 8-bit.

Extrapolate that out to 10-bit and you get over a billion colors, and 12-bit gets you tens of billions.

If I’ve got this right, regular blu-ray still uses 8-bit color but UHD blu-ray uses 10-bit.
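
Laid out as a quick sketch, that counting looks like this:

```python
# Colors per pixel at each bit depth: 2**bits levels per channel, cubed for R/G/B.
for bits in (8, 10, 12, 16):
    levels = 2 ** bits
    print(f"{bits}-bit: {levels:>6} levels per channel, {levels ** 3:,} possible colors")
```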

RotS was probably finished in 10-bit, although the actual cameras recorded in 12-bit color.

I’m not sure about TPM, AotC, or the four most recent movies, but Reliance MediaWorks’ 4k restoration of the OT was done in 16-bit. Alfonso Cuaron’s recent film Roma was also finished this way.

Author
Time

I remember reading that AOTC was only shot in 8-bit 4:2:0, and ROTS in 8-bit 4:2:2. These cameras were basically on par with the $2,000 prosumer Sony A7SII of today.

Author
Time

It’s also worth noting that the extra bit depth is incredibly useful for color grading work in fanedits.

I’ve run into many issues regrading the prequels due to the limited color information available on the current blu-rays. Meanwhile, working on Solo from the 4K blu-ray, it’s incredible just how much you can alter the image.
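
A toy illustration of that headroom difference (not a real grading workflow, just numpy): push the same heavy gamma grade on a smooth ramp stored at 8-bit and at 10-bit, and count how many distinct levels survive. Fewer surviving levels means visible banding in skies and gradients.

```python
import numpy as np

def levels_after_grade(bit_depth, gamma=2.2):
    peak = 2 ** bit_depth - 1
    ramp = np.arange(peak + 1) / peak            # smooth 0..1 gradient
    graded = np.round((ramp ** gamma) * peak)    # heavy grade, re-quantized to the source depth
    return len(np.unique(graded))

for bits in (8, 10):
    print(f"{bits}-bit source: {levels_after_grade(bits)} distinct levels left after the push")
```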

Author
Time

emanswfan said:

It’s also worth noting that the extra bit depth is incredibly useful for color grading work in fanedits.

I’ve run into many issues regrading the prequels due to the limited color information available on the current blu-rays. Meanwhile, working on Solo from the 4K blu-ray, it’s incredible just how much you can alter the image.

Are you one of the twelve people in the world who actually owns a UHD optical drive?

Author
Time

Fang Zei said:

Are you one of the twelve people in the world who actually owns a UHD optical drive?

Yep, I ended up upgrading to one of those, since my other blu-ray drive is internal and is in my currently dead PC, which I need to either fix or replace.

They’re pretty much the same price as what I paid for my internal blu-ray drive 8 years ago.

Author
Time

I think the HDR/Dolby Vision is a bigger reason to buy UHD discs than the 4K resolution. I hope we don’t get pink Sith lightsabers though.

Author
Time

While I was working on upscaling the existing blu-ray of AOTC, I was reminded of just how god-awful it looks. There’s digital noise, compression artifacts, misaligned color channels, etc. ROTS is actually a huge improvement, due to them using a newer digital camera.

So I’m hoping they really clean up the footage well, because it’s a pain to fix this.

This is the program I was planning on using if they don’t release 4K versions: https://www.videoartifact.com/va/

It can do some amazing stuff!

But I’d hope Lucasfilm can do better than me with cleaning up the footage.

Author
Time
 (Edited)

They just ought to scan a 35mm film-out master and call it a day. 😉

Forum Moderator

Where were you in '77?

Author
Time

Hal 9000 said:

Eman, I’d be curious about your candid opinion of the HDTV master we have from schorman13 for AOTC. Head-to-head, how does it do compared to the BluRay?

You know, I actually haven’t seen it yet. I was at one point considering switching to it as the main source for AOTC, since it has the theatrical colors.

I’ll take a look at it soon, and get back.

Author
Time
 (Edited)

Maybe you should overlay the chroma from the HDTV master over the luma from the Blu-ray, @emanswfan.
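
Something like this per frame, in principle (assuming the two sources are already aligned and at the same resolution; the file names are just placeholders). In practice you’d do it across the whole movie with something like AviSynth’s MergeChroma, but the idea is:

```python
import cv2

# Keep the blu-ray's luma plane (detail), take the chroma from the HDTV master.
# Assumes the two frames are already aligned and the same resolution.
bd = cv2.imread("bluray_frame.png")    # placeholder file names
tv = cv2.imread("hdtv_frame.png")

bd_ycc = cv2.cvtColor(bd, cv2.COLOR_BGR2YCrCb)
tv_ycc = cv2.cvtColor(tv, cv2.COLOR_BGR2YCrCb)

merged = bd_ycc.copy()
merged[:, :, 1:] = tv_ycc[:, :, 1:]    # swap in Cr/Cb from the HDTV source

cv2.imwrite("merged_frame.png", cv2.cvtColor(merged, cv2.COLOR_YCrCb2BGR))
```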

Author
Time
 (Edited)

This video could be helpful for understanding some things. It doesn’t go into HDR as much as it should, but it does explain that our upscales usually use just one method, while the professionals sometimes use different upscaling methods per shot. So AotC/RotS would benefit from a professional upscale, not to mention the lighter compression UHD provides AND the wider color gamut.

https://www.engadget.com/2019/06/19/upscaled-uhd-4k-digital-intermediate-explainer/

Star Wars Revisited Wordpress

Star Wars Visual Comparisons WordPress

Author
Time

doubleofive said:

This video could be helpful for understanding some things. It doesn’t go into HDR as much as it should, but it does explain that our upscales usually use just one method, while the professionals sometimes use different upscaling methods per shot. So AotC/RotS would benefit from a professional upscale, not to mention the lighter compression UHD provides AND the wider color gamut.

https://www.engadget.com/2019/06/19/upscaled-uhd-4k-digital-intermediate-explainer/

This video does a great job of explaining things. I’ve been doing a lot of research into upscaling footage, and there are a lot of newer algorithms that are pretty exciting, such as AI- and neural-network-driven upscaling.

I’ve long advocated that even SD material, like the many shows shot on video before digital HD, would benefit from professional upscaling and other clean-up. Doctor Who has been re-releasing its classic series on blu-ray, and even those releases manage to get more out of sub-HD material and allow for a cleaner, less compressed image.

Author
Time

PAL-sourced material always looked better to my eyes, even as a kid. If Dr. Who had been shot in NTSC, those 16mm telerecordings of lost episodes would have looked hideous.

Forum Moderator

Where were you in '77?

Author
Time

Anamorphic 16:9 PAL video isn’t that far off from 720p resolution, really.
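
Rough pixel counts, for what it’s worth:

```python
# Anamorphic PAL stretched for 16:9 display vs. 720p, just the raw pixel counts.
pal_169 = 1024 * 576   # 720x576 storage, ~1024x576 once the anamorphic squeeze is undone
hd_720p = 1280 * 720
print(f"PAL 16:9: {pal_169:,} px, 720p: {hd_720p:,} px ({pal_169 / hd_720p:.0%})")
```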

Author
Time

So in terms of upscaling AOTC, I compared a straight lanczos3 4K upscale against a waifu2x (photo model) denoised 4K upscale.

http://screenshotcomparison.com/comparison/1743

As you can see, objects that already have sharp lines scale well, but most of the image is still a blurry mess and doesn’t look that much better. I used the maximum denoising setting, which cleans up the whole image but does scrub away some detail. So essentially there’s no big difference.

The reason is that the shot was already cropped in from the full 1080p output of the camera and then poorly upscaled, so the footage is effectively sub-1080p. George Lucas frequently cropped shots in, sometimes for a zoom effect. AI upscaling usually only looks good if the input is at its native resolution; if it’s been previously upscaled, the input is blurred and the AI can’t find the edges and fine detail.

What might work better is a multi-frame upscale for each shot that accounts for details that change across the frames.
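
For reference, the “straight lanczos” baseline in that comparison is basically just a single-frame resize, something like this (OpenCV’s Lanczos tap count differs slightly from lanczos3, and the file name is a placeholder):

```python
import cv2

# Single-frame 1080p to 4K resize, no denoising or AI: the baseline side of the comparison.
frame = cv2.imread("aotc_frame_1080p.png")           # placeholder extracted frame
upscaled = cv2.resize(frame, None, fx=2, fy=2,
                      interpolation=cv2.INTER_LANCZOS4)
cv2.imwrite("aotc_frame_4k.png", upscaled)
```

A multi-frame approach would instead align several neighbouring frames before combining them, which is where any extra detail would have to come from.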