- Post
- #1518452
- Topic
- Star Wars Episode I: Cloak Of Deception (Released)
- Link
- https://originaltrilogy.com/post/id/1518452/action/topic#1518452
PM sent!
My ‘Custom Special Edition’ of ANH addresses this. I use a syllable from the same scene to have him say, “I wanted you to have this….” And I don’t think it sounds too bad.
Also, the ROTS ‘canon cut’ by NFBisms tackles this differently. As Anakin and Padme have their hairbrushing balcony scene, there is dialogue implying Anakin is suggesting giving the baby a lightsaber.
What about Obi-Wan explaining his relationship with Anakin in Return of the Jedi not remotely lining up with The Phantom Menace?
🤷
I’ll do that. I was erring on the side of more smoothing only because I assumed it mattered how closely the test and reference images were identically aligned.
If I understand correctly, I could shuffle the pixels of both images into a random order and get the same result.
Have you found a smoothing number that works best when the sources just aren’t cropped identically or don’t share the same aspect ratio?
It… doesn’t matter if the images are aligned? I assumed the way it worked was to compare each pixel with the pixel at the same address in the other image.
If it doesn’t matter what order the images are in, I guess it shouldn’t matter if the two images are even perfectly aligned. If one were slightly stretched or squished relative to the other… that may not matter?
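If the tool derives its correction from per-channel color statistics rather than from per-pixel correspondences (an assumption about its internals, not something confirmed here), then pixel order genuinely wouldn’t matter. A minimal numpy sketch of that idea — shuffling a channel’s pixels leaves its distribution, and so any curve built from it, unchanged:

```python
import numpy as np

def channel_cdf(values, bins=256):
    """Normalized cumulative distribution of one 8-bit color channel."""
    hist, _ = np.histogram(values, bins=bins, range=(0, bins))
    cdf = hist.cumsum().astype(np.float64)
    return cdf / cdf[-1]

rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=10_000)  # stand-in for one channel of a frame

# Shuffling the pixels changes nothing about the channel's distribution,
# so any curve/LUT derived from the CDF comes out identical.
shuffled = rng.permutation(pixels)
assert np.array_equal(channel_cdf(pixels), channel_cdf(shuffled))
```

If the tool instead did any spatially aware comparison, this invariance would break — so this is only a sketch of the distribution-matching interpretation.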
Thank you, that helps me plan how to wrangle them into usable input. Should it matter whether they are stacked horizontally or vertically? It’d be easier and more precise for me to do that than to arrange them into a grid.
I’ve used VMware for years and had no problems. VirtualBox is a good free one I had used in the past.
Also, stitching multiple frames together seems to work really well. However, mammoth images seem to choke it, and it doesn’t progress past a certain point. But even with a small number of color spaces (since a large number choked it), the results are pretty darn good across a whole movie with very different locations.
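On the earlier horizontal-versus-vertical stacking question: assuming the tool only pools color statistics from the combined image (again an assumption about its internals), the stacking direction can’t change the result, since both layouts contain exactly the same pixels. A quick check with stand-in frames:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two stand-in 4x6 RGB "frames" in place of real movie stills
frame_a = rng.integers(0, 256, size=(4, 6, 3), dtype=np.uint8)
frame_b = rng.integers(0, 256, size=(4, 6, 3), dtype=np.uint8)

horizontal = np.hstack([frame_a, frame_b])  # side by side
vertical = np.vstack([frame_a, frame_b])    # one above the other

# Both layouts hold exactly the same pixels, so per-channel color
# statistics (histograms, means, etc.) are identical either way.
h_sorted = np.sort(horizontal.reshape(-1, 3), axis=0)
v_sorted = np.sort(vertical.reshape(-1, 3), axis=0)
assert np.array_equal(h_sorted, v_sorted)
```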
I’ve just been running it within a Windows VM.
Cool, I’ll try that in order to hopefully get away with a scene-by-scene correction rather than shot for shot. Thanks for the orientation.
How would you correlate the number of color spaces with the exactness of the image alignment, and the same question for the smoothing parameter?
Sounds like a higher smoothing parameter is better when the sources aren’t pixel-perfect. Is a higher number of color spaces always preferable, with the only downside being time?
Also, would it make any sense at all to create a collage of multiple frames (twice, per source) and feed that into the tool for a more informed LUT to export?
I’m trying to use this for two different sources for ESB, one from 2004 and another the 19SE. I align the images as best I can before generating the two frames to feed into the tool, but I can’t get it to be exact.
Would it be better to use a higher smoothing parameter? I haven’t seen DrDre recommend anything higher than 0.1, but it can go as high as 1. If the two images are not totally identical (same frame, but the sources aren’t pixel-perfect), is there a drawback to using 1 as the smoothing parameter?
Looking forward to this. Thank you.
I don’t know what horrible life decisions I’ve been making NOT to have tried this out for myself sooner. I just generated several LUTs for the 19SE Blu-ray of ESB trained on Adywan’s color corrected 2004 transfer, and the results look really good. Very much looking forward to having time to play with this further.
Good! I’d suggest cutting to the shot of Shmi watching from a distance (end of the scene), as Jar Jar continues to scream, taken from the bongo sequence.
That’s really pretty well done, but I wouldn’t be interested in overwriting JEJ with someone else.
Oh, don’t worry. When I saw the shape of the wall of text and capital letters, I smirked and shook my head gently with a chuckle while reading the first sentence.
Fan edits… may not be your thing. Like any and all fan edits, LOE is an alternate cut of a movie which in no way precludes the original version from being accessed and enjoyed.
The idea is to step into a hypothetical alternate 2005 and see how this cut hits you. I’m glad you gave the edit a try, evidently without having looked over what was changed, but you still have ROTS unharmed to return to.
That said, I think you realize this is likely your last post before a ban, coming in hot with language and saying you would like to see me dead.
I’d have liked for RJ to tweak things after Carrie passed so that Luke could survive TLJ and throw Trevorrow a bone.
I really should check this out and at least play around with it.
Sometimes I’ve wondered what it’d look like to take a bunch of our fan projects and essentially average out their color correction. It sounds like you could just about do that: create one or more LUTs for each of these, then apply them to your unmodified source material. Just apply them all partially — if there are ten references, apply each of their LUTs to your source at 10%.
You could take the GOUT, 97SE, Despecialized Edition, 4K77/80/83, and whatever else and put them in the blender. Maybe arrive at something that reflects them all in a mitigated way.
As someone who doesn’t really know how to color correct things in any detailed way, treating it that way sounds fun.
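One way to sketch that “blender” idea in numpy — a simplification: here the fully corrected renders are averaged with equal weight, which roughly approximates applying each LUT partially (stacking true partial LUT applications would give a slightly different result), and the frames below are hypothetical stand-ins:

```python
import numpy as np

def blend_corrected(versions):
    """Per-pixel average of several color-corrected renders of one frame.

    Equal-weight averaging of N finished renders, as a rough stand-in
    for applying each reference's LUT at 1/N strength.
    """
    stack = np.stack([v.astype(np.float64) for v in versions])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

# Two stand-in "corrections" of the same 2x2 frame
warm = np.full((2, 2, 3), 200, dtype=np.uint8)
cool = np.full((2, 2, 3), 100, dtype=np.uint8)
blended = blend_corrected([warm, cool])
assert int(blended[0, 0, 0]) == 150
```

With real sources you’d run this per frame after each reference’s LUT has been applied to the same untouched master.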
Good job!
Never can have enough hard drives. It’s a small miracle things hold together enough to even do all this. Appreciative of your time and effort bringing this to us. Hoping to lay eyes on it in 2022, but will be glad to whenever.
Does this edit use that shot of C-3PO on the Falcon that you fixed in your 2020 4K of A New Hope?
This project uses the 2011 Blu-ray as its ultimate source (as color corrected by Adywan circa 2018). His other, longer-term project — the one demo’d with that shot of 3PO on the Falcon — uses the 19SE as its primary source.
I don’t know whether he plans to merge the two sometime in the future, but the 19SE-based project aims to present a good base for editors to use going forward.
I wish I had an idea about how to troubleshoot the problem.
Is the .mov file you’re outputting a master file or intended for regular viewing? I don’t suppose you could shuffle around the contents of your boot drive to accommodate the space needed.
Best way is to message the editor. You could also try fanedit (dot) info, but it’s not updated consistently.
I think that would be very difficult because he and Adywan are working from entirely different primary sources.