- Post: #1367133
- Topic: Harmy's STAR WARS Despecialized Edition HD - V2.7 - MKV (Released)
- Link: https://originaltrilogy.com/post/id/1367133/action/topic#1367133
Link to start of previous discussion:
Maybe this technique will work great to remove grain and enhance detail in shots that don’t have too much motion?
Why would we want to remove grain? The UHD is already overly grain “managed”.
When you mix 4K77 (heavy grain) and the UHD (little/frozen grain), something’s gotta give or they won’t blend seamlessly. A mixture of degraining and fake grain is pretty much inevitable at some level.
I wasn’t able to repro the issue either, in both VLC and MPC-HC, the only software video players I’ve ever used. So IMO whatever I saw was either a bug fixed in a newer release, or something in the Windows audio mixdown to stereo, which may be different since I’m using a different machine now than when I saw this before.
PM sent. If you are fluent in both Japanese and English, I’d love some help comparing two different translations to see if one is distinctly better than the other.
PM sent.
Woohoo! A whole reel! Oh, and the colors look good too 😉
There’s a README file included with the project which explains all of the file names. I haven’t used the term “forced” in years, so I think you have a very old version. I sent you a link to the current version.
Summary of the meanings of the various filenames:
Oh, and IMO it’s not entirely clear the problem is a missing center channel. You should still get dialogue bleed-over in the L/R channels if you just mute the center. But this thing, IIRC, had zero dialogue – like a whole different mix, or something created by combining the surround channels only. I’ll also see if I can repro the issue here, but it may take a while.
Yay!
Could also be a bug in the mixdown from 6.1 to 2.0 in the software somewhere, if that’s being done.
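For reference, a stereo fold-down typically looks something like this (a hypothetical Python sketch; the 0.7071 coefficients are the common ITU-style Lo/Ro values, and any given player may do it differently). It also shows why a decoder bug that zeroes the center channel would wipe out the dialogue:

```python
# Hypothetical Lo/Ro stereo fold-down. Coefficients are the common
# -3 dB (0.7071) values, not necessarily what any given player uses.
# LFE is typically dropped, and a 6.1 back-center is usually folded
# into the two surrounds before this step.
def downmix_to_stereo(fl, fr, fc, sl, sr):
    c = 0.7071  # -3 dB contribution for center and surrounds
    left = fl + c * (fc + sl)
    right = fr + c * (fc + sr)
    return left, right

# Dialogue lives mostly in the center channel, so if a buggy decoder
# hands the downmix a zeroed center, the dialogue simply vanishes:
print(downmix_to_stereo(0.1, 0.1, 0.8, 0.05, 0.05))  # normal
print(downmix_to_stereo(0.1, 0.1, 0.0, 0.05, 0.05))  # dead center channel
```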
PM sent.
Also saw the same issues with the audio tracks right off the commercial discs, so I doubt it’s specific to a version. It may be hard to test if the conditions for making it fail aren’t fully understood – Matt may be invaluable here, as the only one who can reliably repro the issue. I could extract and re-encode the DTS-MA audio if that would help (I have all the right tools for it), but I’m heading out for a week, so someone else may be better placed for that.
There is definitely something odd about the audio track – I’ve seen this same behavior, but I forget which software did it (could have been a version of VLC). I’ve been using burned discs, so I don’t see the problem on my hardware player. I’m wondering if the DTS lossy core doesn’t match the DTS lossless stream – that would explain why the problem only appears in players that have trouble decoding the lossless part and fall back to the core. I saw the same with the audio on the official discs. To test/fix this, you could decode the DTS-MA lossless track to per-channel WAVs and re-encode it, after which the lossy core would presumably match the lossless audio again.
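If anyone wants to try the decode half of that, here’s a minimal sketch (Python driving ffmpeg, which is assumed to be on the PATH; the file name and the 6.1 layout are my assumptions based on the discussion above). Re-encoding back to DTS-HD MA would then need a dedicated DTS encoder, which ffmpeg doesn’t provide:

```python
# Sketch of the decode step only: fan the DTS-HD MA track out into
# per-channel 24-bit WAVs. File name and 6.1 layout are assumptions.
import subprocess

SOURCE = "star_wars_despecialized.mkv"  # hypothetical input file
PADS = ["FL", "FR", "FC", "LFE", "BC", "SL", "SR"]  # ffmpeg's 6.1 layout

# channelsplit turns the single 6.1 stream into seven mono streams.
split = "channelsplit=channel_layout=6.1" + "".join(f"[{p}]" for p in PADS)

cmd = ["ffmpeg", "-i", SOURCE, "-filter_complex", split]
for pad in PADS:
    cmd += ["-map", f"[{pad}]", "-c:a", "pcm_s24le", f"audio_{pad}.wav"]

subprocess.run(cmd, check=True)
```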
I, myself, prefer Despecialized for its cleaner, more polished look. The various 4K restorations all look incredible though; I guess I’ve never been one to watch or own a film exactly the way it was seen in a theater.
In my case, the drive-in with the little metal clip-on speaker and the reel that broke partway through isn’t anything I’d ever want to repeat. Not that the film wasn’t amazing anyway, but still.
Challenge accepted. We have to get a preservation of the Original Trilogy into the Seed Vault.
Projected Derann print:
Any idea what bulb was used for that photo? A lot of old films projected with a newer xenon bulb or some such thing will look much cooler than they would have theatrically with a vintage bulb. Not saying this one was, but it looks pretty cool to me.
I don’t know, but I recently saw one of these prints projected, and it looked very similar. This is also what I see on 1997 SE frames I have. I’m personally more interested in what is seen on the print, because while it’s great to use vintage bulbs, the underlying assumption is that the bulb still burns at the same temperature as it did a few decades ago, whereas I believe bulbs become warmer with age, so projecting a film with a vintage bulb may not be the most accurate representation.
That’s a fair point about aging bulbs, but any recent projection you saw was also likely with a cooler bulb unless someone went really out of their way. Still, I like your results regardless of the bulb discussion, and that’s what matters most. I’d consider your take a neutral “what’s on the print” timing. Even if you do want to bias it for a tungsten bulb down the line, your timing is still the best place to start.
I don’t get how it’s even possible for 44rh1n to send people a Dungeon Master. I mean, that’s cool and all, but put air holes in the packaging.
While I think it’s looking pretty darn great so far, I think I ought to throw my two cents into the teal walls debate. I’m sure some of it would have been visible in '77, but it would have been projected with very warm bulbs, which would have counteracted the teal in the walls a bit – not a ton, but a bit. I don’t have much of a preference, as I think either way is going to be “accurate” in one sense or another. However, I do think you’ve gone a bit overboard on that teal look in certain shots. For instance, I’m seeing quite a bit of teal in Leia’s dress in some of those shots, which I don’t think would have been the case. It may have picked up a bit from the walls around the edges of her dress, but the fabric isn’t reflective enough, I think, for the wall color to drastically affect our perception of the dress color itself.
Just a little nitpick, but I think it’s worth addressing.
Question from a person clueless in this process: how much of this would be attenuated if using a TV set to warmest (which I do) vs viewing on my relatively cool computer display?
Hard to say. One display’s warm setting won’t match another’s, and I doubt any preset would emulate a tungsten bulb very well. I think generally people target a calibrated display, and then any variation from that is considered the display’s fault/user preference. Anything other than calibrated, and you just have to feel out whether you think it looks good or not.
As for Darth Lucas’s observation, I think bulb-matching is a legit aesthetic, but if you want to do it, it should be at the very end – i.e., a neutral timing is best for this stage, as it’s ultimately just raw material for the larger project. It will then go into the meat grinder of despecialization and grain-matching and so on, and if Harmy wants to give it that tungsten-bulb touch of yellow, he can choose to do that in one global final color pass.
That wouldn’t happen to be the one frame at the end of Reel 5 I mentioned, would it?
It is, but the frame is not a DVD/BD exclusive – it’s a theatrical frame. It’s mentioned in this thread: https://originaltrilogy.com/topic/Whats-missing-from-GOUT/id/6725/page/1
Having said all that, isn’t theatrical sync the same as GOUT sync for Star Wars anyway?
No, Star Wars GOUT sync isn’t the same as theatrical. 4K77 isn’t theatrical sync either; they decided to GOUT-sync that one, and it’s only one frame different. 4K83 was the first project to test the non-GOUT-synced waters.
So yeah, if we go with theatrical, we’re breaking sync for 4K77 and all of Harmy’s projects (and countless minor/derived projects). And we could even opt to break sync for everything if anyone entertains my crazy ideas about how to implement the theatrical frame standard. Which is kinda why I want the project leads to get together and hammer out some sort of agreement, because this has the makings of a giant cluster if each project just decides to do its own thing.
This illustrates why the GOUT standard had such staying power – it’s been used since 1993 (a decade before the GOUT it’s currently named for existed), so all kinds of projects gravitated to it for mutual compatibility. You could take an audio track from a 20-year-old Laserdisc preservation project and drop it onto Despecialized released yesterday and it would sync perfectly. Changes to this frame standard should be done in a careful, considered, coordinated manner.
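For scale, here’s what those frame offsets come to in time (a quick sketch, assuming the usual 24000/1001 ≈ 23.976 fps these NTSC-derived transfers run at):

```python
# Convert a frame-count offset to milliseconds at NTSC film rate.
FPS = 24000 / 1001  # ~23.976 fps, assumed for these transfers

def frames_to_ms(frames: int) -> float:
    return frames * 1000 / FPS

print(f"{frames_to_ms(1):.1f} ms")   # ~41.7 ms – the one-frame GOUT/4K77 gap
print(f"{frames_to_ms(12):.1f} ms")  # ~500.5 ms – an unmistakable half second
```

That ~42 ms is small enough to leave people with a vague “something’s off” feeling rather than certainty, which is part of the argument further down for making any break a big, obvious one.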
PM sent.
PM sent.
Changing the subtitles would be trivial in terms of effort (see the retiming sketch after this post). It would even be unnecessary if the change were just a couple of frames (e.g. to the 4Kxx timing). If Harmy decides to change to a standard that’s neither GOUT nor 4Kxx (which would fix at least two issues with the current situation*), then subtitles could still be used with an offset (and so could some audio). Rendering the subtitle graphics could still take a week or so, but that’s a fire-and-forget process. All I need is lead time.
Audio would take some work to modify, no doubt about that. Much advance notice would be nice.
* the HDMI handshake muting part of the fanfare on some systems, and users not knowing if their audio is properly synced or not, even while watching.
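To make the “trivial” claim concrete, here’s a minimal retiming sketch (hypothetical file names; assumes well-formed SRT timestamps) that shifts every subtitle by a fixed offset:

```python
# Shift all SRT timestamps by OFFSET_MS. File names are hypothetical.
import re
from datetime import timedelta

OFFSET_MS = 42  # e.g. roughly one frame at 23.976 fps

STAMP = re.compile(r"(\d{2}):(\d{2}):(\d{2}),(\d{3})")

def shift(match: re.Match) -> str:
    h, m, s, ms = map(int, match.groups())
    t = timedelta(hours=h, minutes=m, seconds=s, milliseconds=ms)
    t += timedelta(milliseconds=OFFSET_MS)
    total_ms = max(0, int(t.total_seconds() * 1000))  # clamp at zero
    h, rem = divmod(total_ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

with open("despecialized_en.srt", encoding="utf-8") as f:
    text = f.read()

with open("despecialized_en_shifted.srt", "w", encoding="utf-8") as f:
    f.write(STAMP.sub(shift, text))
```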
May be a bit too early to tell, but does anyone know if 3.0 will be GOUT- or theatrically synced?
That’s a big question for me too. Last time I asked him, when 4Kxx first started syncing to a new frame standard, Harmy said he had no plans to jump onto a new sync standard (the experience of finally nailing down the one we have now was harrowing enough), but things may have changed in the intervening time. 4Kxx is popular, and it may not be such a drag anymore to move to a new standard.
If so, I’ll happily sync my stuff to whatever frame standard Harmy wants to use, but I’ll need some advance notice. And certainly there are others who’ll need to do the same.
My opinion is that as long as we’re breaking sync on a lot of audio tracks, we may as well get all the major project leads together to discuss it, rather than doing it unilaterally, so we don’t just break sync again later when someone has a new idea for how things should sync. For example, in my opinion, if we’re going to go to the trouble of breaking sync, why not pad the beginning with an extra half-second of silence? That would fix the issue where HDMI handshakes mute the first bit of the Fox fanfare on some systems, and it would be just as “theatrical” as any other sync standard out there, since theatrically there were leaders and previews and such before the film anyway. I’d prefer a big difference like this over something that’s only a couple of frames off, so people will know with certainty that their audio is out of sync, instead of just having a vague feeling that something’s not right without being able to put their finger on it.
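To sketch the half-second-of-silence idea (pure Python standard library; file names are hypothetical, and it assumes 16- or 24-bit signed PCM, where silence is all-zero bytes):

```python
# Prepend half a second of silence to a PCM WAV track.
import wave

PAD_SECONDS = 0.5

with wave.open("fanfare_in.wav", "rb") as src:
    params = src.getparams()
    frames = src.readframes(params.nframes)

# Silence = zero samples; byte count covers all channels at full width.
pad_frames = int(params.framerate * PAD_SECONDS)
silence = b"\x00" * (pad_frames * params.nchannels * params.sampwidth)

with wave.open("fanfare_out.wav", "wb") as dst:
    dst.setparams(params)  # nframes is corrected automatically on close
    dst.writeframes(silence + frames)
```

On systems with the handshake problem, the mute would then land on the padding instead of on the first notes of the fanfare.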