it ar a guud
Every single one of husezni’s posts made me want to punch babies.
Actually, just one baby in particular: the one with his face on it.
About three and a half years ago I was going through a bad time and was feeling reckless, so I tried pot for the first time. (I interned at a recording studio for a year, so I was constantly around musicians, and those folks always have some around.) I overdid it and ended up standing in a corner for a few hours, somewhat afraid for no reason and completely forgetting how to talk. It was rather odd, to say the least.
I tried it several more times over the next few months, but it never quite agreed with me the way it does for other people. In general I found using it to be slightly unpleasant, so after a while I just kind of stopped. Thankfully it has no addictive qualities, so there’s absolutely no compulsion to keep going if you don’t want to.
The one thing about it that is rather remarkable is that if you’re really stoned and involved in creative activity, it puts your mind into places it wouldn’t ordinarily go. That first time, I was sitting in the room with my audio mentor as he worked on mixing a song (he himself gets high on a regular basis and loves it), and I had the rather extraordinary feeling of being ‘inside’ the music. It quite literally felt like my mind was inside the song itself, and everything about the way it was written, performed, and mixed just made so much sense, in a powerful way I can hardly describe. When that happened I began to understand why musicians use it so frequently, and I have to admit I’d probably use it more if I didn’t dislike the physical sensation of breathing it in. I’m not sure I could actually function as an audio mixer if I were that stoned, though!
I can’t really be all that arsed about texture packs… they can be kind of nice but I don’t regard them as essential. I love Twilight Princess HD (in fact I got a Wii U specifically so that I could play that version of the game, having already loved the original), but the Gamecube/Wii era lends itself to that sort of thing much better, because the 3D models are more sophisticated. With N64 stuff, it takes you into a weird franken-zone of having detailed textures pasted on top of blocky and simplistic 3D models. And running the whole thing at 1080p or whatever makes the popups and switches between hi- and low-res stuff extremely obvious, when they should not be. To me that looks much more bizarre than simply running the game at its original resolution.
Yes, playing it from the real system with only 320 x 240 resolution does look soft and not very detailed, but it also doesn’t present any of the weirdness that results from running in ways it was not designed for. And using scanlines really does break up the pixelation and render it almost unnoticeable. The Framemeister’s image can be made very close to how it appears on a CRT… and the thing is, playing them that way 20 years ago was just fine. I wasn’t sitting around fretting about how crappy and soft things looked in 1997, I was enjoying playing the games. If I enjoyed it then, I can enjoy it now the same way. I don’t require massive upgrades in the picture to think something can continue to be relevant.
One game I do run with hi-res textures is Dark Forces 2: Jedi Knight. But the reason that works is because all of the 3D models can be replaced with versions that have a much higher polygon count, as well. You can’t do that with 64 stuff.
Another thing with emulation is that the timing of events in the games is often not right due to it running at the wrong speed. People tend to complain about frame rates and such, and when they see a version that runs without slowdown they’ll pronounce it superior, but I’m starting to believe that this is not always a good thing. If a game ran at a certain speed, then eliminating the slowdown completely makes it feel ‘off’, in an almost indescribable but significant way. The first time I noticed this was in GoldenEye: at the end of the Runway level, a brief fanfare plays as Bond flies off in the airplane. When emulated, this scene happens much faster than it does on the actual N64, with the result that the fanfare is always cut off before it is heard to completion. I’m becoming increasingly convinced, too, that the speed of the gameplay was designed around the amount of slowdown that the N64 produces, because many levels just don’t ‘feel’ the same in their pacing as they do on the console, and the slower version ironically seems more natural. It’s definitely easier to keep Natalya alive in the Control Center on the real N64 than on the computer, because the enemies aren’t shooting as quickly or as often.
And to point out a very dramatic and obvious example in Zelda, when firing a Light Arrow at Ganondorf, the huge flash of light that results from hitting him causes the system to slow down by a huge amount for several seconds until the flash has dissipated. The visual effect of this is rather stunning, because it causes the impact of the Light Arrow to remain onscreen for far longer than it otherwise would, turning it into an awe-inspiring moment. When emulated, the flash runs at full speed without any slowdown at all, with the result that it is over and done with before it can really call any special attention to itself. It’s still a nice-looking effect, but it lacks the majestic grandeur of the slower version. (Apparently in the 3DS re-release they actually intentionally added the slowdown back in for this scene, though I’ve never played that one.) I’d actually forgotten about this for a while, having been so used to playing it emulated on the faster Gamecube version, that going back to the real thing was kind of shocking, in the best sense of the word.
I can hardly even begin to talk about my frustrations with the sound problems of N64 emulation. Crackling and dropouts abound, and the audio often visibly lags behind the image by quite a bit. Not good.
I seem to have spent quite some time complaining about these things on here… I didn’t really mean to come in here ranting and raving about how emulation sucks or whatever. I don’t mind emulators when they actually work properly–all of my SNES playing for the past few years has been emulated, mostly on the Wii Virtual Console but sometimes on homebrew, since my sister has the SNES we grew up with and hasn’t sent it back to me yet. The few N64 games they have on the VC tend to work pretty well for the most part.

A few years ago I was trying to convince myself that I could set the real system aside and just emulate, since my controller joystick was busted and I thought my cartridges didn’t work anymore (I didn’t realize that cleaning them with rubbing alcohol could work such wonders in getting them going again!), but after running into so many problems with games not running right (graphics glitches, sound problems, crashes, and the constant grind of changing settings), I got sick of it and made the effort to start using the real thing again. Once I got a suitable replacement joystick (I currently use the Gamecube-style stick with a custom-made replacement circuit board, which calibrates the sensitivity to accurately replicate the original stick), it was like a reunion with long-lost friends.

I tend to divide my time equally between playing on a CRT and using the Framemeister, and it looks great on both through S-video cables. Really it depends whether I feel like sitting in a chair and playing on a larger screen, or sitting on the floor and using a smaller screen, like we did back then. I’m just happy to be able to keep using my N64, rather than having to give it up and get rid of it like I thought I would.
In a few days my new Everdrive should arrive in the mail, and I suspect that I’m going to enjoy the hell out of that thing. I’m okay with using the Wii VC for SNES games for now (it can be set to output 240p, meaning they look really good over component cables on the Framemeister and CRT both), but someday I do want to get back my real SNES too, because I’ve been missing it lately.
So a few weeks ago I played through GoldenEye on the N64 and finished the entire thing on 00 Agent difficulty, something I’d only ever done once before (about a year ago). A lot of the levels are perfectly doable, but man, those last few are crazy. The Aztec on 00? Holy crap! It can take several tries just to make it out of the first room alive, let alone surviving the rest of it. The enemies are so insanely fast and have such good aim, you have to be really careful to stay on top of it and not make any mistakes, otherwise you’re a goner…
Now I’m playing through Perfect Dark, on Perfect Agent difficulty; and as hard as GoldenEye is, this makes it look like a walk in the park. Most of the tricks that could be used in GE to avoid taking damage from enemies no longer work, enemy gunfire depletes your health much more rapidly, and the whole thing seems to have been designed to appeal to people with a masochistic need for punishment in their games. Last year I did manage to get through all the regular levels (though not the bonus levels), but many of them required multiple attempts before I could pull it off. I’m pretty sure I must have tried and failed to beat the Skedar Attack Ship at least 35 times before finally managing to scrape through with almost no health left. It is brutally unforgiving, and if you make more than one mistake in the beginning, you pretty much have no chance of completing it. I want to see if I can beat them all again, but I’m not entirely convinced I’ll be able to manage it this time!
GoldenEye and Perfect Dark are among the main reasons why I eschew N64 emulation in favor of using the real system (and was willing to spend $400 on the Framemeister to get acceptable picture quality from it). There are far too many graphics glitches and emulation inaccuracies when trying to run these on the computer, to the point that it often hardly even feels like playing the same game. Using a non-N64 controller for games designed with a six-button layout in mind is also really irritating, and trying to dial in the joystick sensitivity to allow the weapon aiming to work the way it’s supposed to is an exercise in frustration. Much better to just use the original version, which ‘just works’, and skip all that other garbage.
Is it just me, or does Poita have the worst luck in the world?
What is this, Be Reasonable In The Political Thread Day?
Where’s the name-calling and mindless bickering over nonsensical things that don’t really matter?
Come on guys, I’m disappointed. :p
Game systems that originally rendered in 240p resolution (i.e., everything Nintendo 64 and earlier) should always be viewed on a CRT display if at all possible, or with something that mimics what a CRT does to the image. It’s the only way for it to look right, and to see what the game designers would have actually seen at the time the game was made. Anything else–whether it be horrid blocky pixelation, or destructive smearing filters–just looks wrong and ruins the intended presentation.
The CRT effect preserves all the resolution the game produces while smoothing pixelation in a way that our eyes naturally respond to. Blocky edges are rounded out very nicely, and the visible ‘scan lines’ (which are actually blank space in between the lines of resolution, and appear due to the nature of the 240p signal) break up the harshness of the pixelation, tricking our brains into seeing a more detailed image than what is actually shown. In a way it’s similar to how film grain can trick us into thinking we see a sharper image than we really do–even though it’s an illusion, it is one that fits very well with our perception. In contrast with this analog goodness, digital displays are very unforgiving and show only the harshness of the raw image, ruining the illusion the low resolution art is trying to create unless it can be brought back as a post-processing effect.
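To make the scanline idea concrete, here’s a minimal sketch in Python with NumPy (the function name and parameters are my own invention, not from any emulator or shader) of the basic trick: upscale the 240p image by an integer factor, then darken the rows that fall between the original lines of resolution to mimic the blank space a 240p signal leaves on a CRT.

```python
import numpy as np

def add_scanlines(frame, scale=4, darkness=0.5):
    # frame: 2-D grayscale array (height x width), values in 0.0-1.0.
    # Nearest-neighbor upscale: repeat each pixel `scale` times both ways.
    big = np.repeat(np.repeat(frame, scale, axis=0), scale, axis=1)
    # Darken the last row of each scaled-up scanline to suggest the gap
    # between lines of resolution in a 240p signal.
    mask = np.ones((big.shape[0], 1))
    mask[scale - 1 :: scale] = darkness
    return big * mask
```

Real CRT shaders go much further (phosphor masks, horizontal blur, bloom), but even this crude version breaks up the blocky pixel edges in the way described above.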
Just recently I took the plunge and ordered the XRGB-Mini Framemeister upscaler, which excels at turning 240p signals into HD resolutions and can add convincing scan lines as well. The experience of playing classic games through this unit (using S-video, component, or RGB connections) far exceeds the results typically experienced by plugging an old console straight into an HDTV. Almost entirely without fail, modern TVs misinterpret the 240p signal as 480i, applying unnecessary deinterlacing and ruining image detail, as well as adding a significant amount of input lag. The result is thoroughly unsatisfying, and nothing at all like the original experience. The Framemeister brilliantly restores the image quality and playability of classic games on modern displays, and I’m thrilled to have it. Last year I did pull my old CRT out of the closet and I’ve played a bunch of old games on it since then, and nothing can really beat the experience of using the real thing (that analog ‘glow’ it imparts on the image cannot be entirely replicated by any digital display, although CRT shaders are becoming quite nice), but since CRTs are no longer made, I like having the ability to use my old systems on newer TVs as well.
A couple years ago I spent a lot of time messing around with emulators, but lately I haven’t been all that interested in them. With some systems, like the SNES, it has become good enough that I don’t mind it, but others like the Nintendo 64 continue to be unsatisfying. I became very frustrated with the ubiquitous graphics glitches and inaccuracies, the near-constant need to mess with settings, and my progress being hampered by crashes. Once I got my controller fixed and started using the real N64 again, the experience was so much more satisfying that I never intend to go back to emulation if I can help it. Playing on a console is just better than doing it on a computer; it feels more ‘real’ somehow, and in the near future I want to get an Everdrive so I can play a wider selection of games than I’ve been able to find actual copies of.
Are you going to leave the bar scene with its home video red tint or revert it to its more neutral film appearance?
Excardon me, miss; could you help an old wintergreen pick up his spectacles?
Protuberances abound, Regilith…
Nah, the 1993 versions are mono surround only.
The one for the first movie was made by using the four main channels of the 70mm mix (L, C, R, S), combining it with additional bass derived from a separate sfx-only master, and then adding new sound effects on top of it in certain places. The mono surround of the 70mm is presented as is with no modifications, aside from some of the new additions also appearing in the rear from time to time.
For ESB, they did not use the 70mm version, but instead took it from the original four-track master conformed to the 35mm edit, adding in bass using the same method. Minor differences exist between this version and both theatrical mixes, but they are very small and the overall sound is very close to what the 70 would have been (when the film was slightly re-edited for 35mm, the entire mix was not re-done but only given minor modification, mainly for the new edits). Again, the surround effects are mono.
Unlike the other two, RotJ '93 is a new mix from multitrack stems, and so does not reflect what the 70mm would have sounded like except in the general sense. However, it was done in the same way, and again has mono surrounds. Since these were only made for home video viewing and intended to be decoded by Dolby Prologic (stereo surround decoding did not yet exist in any Dolby product), there was no reason to do them in any sort of 5.1 style. No 4 or 5 channel version of the 1993 versions was ever made; they were matrixed stereo digital mixes only.
When decoded with Prologic II or other stereo-surround capable algorithms, it is true that these mixes will show separation between the derived surround channels. However, this content consists only of crosstalk from the front. The actual surround effects themselves are mono; they are equal in level between both rear channels. Discrete channel 70mm versions would not have had such crosstalk, so a decoding scheme that allows for the least perceptible amount of it will most accurately reflect what the source would have sounded like prior to matrix encoding.
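The crosstalk described above follows directly from the matrix arithmetic. Here’s a simplified sketch in plain Python (it ignores the 90-degree phase shift a real Dolby Surround encoder applies to the surround channel, and the function names are mine) showing why center-panned content cancels completely in a passively derived surround channel while hard-panned front content leaks into it:

```python
def encode_lt_rt(L, C, R, S):
    # Simplified Dolby Surround matrix encode: center and surround are
    # folded into the stereo pair at -3 dB, surround with opposite polarity.
    # (Real encoders also phase-shift the surround by +/-90 degrees.)
    k = 0.7071  # -3 dB as a linear gain
    Lt = L + k * C + k * S
    Rt = R + k * C - k * S
    return Lt, Rt

def passive_decode_surround(Lt, Rt):
    # A basic passive decoder derives the surround as the L/R difference.
    return Lt - Rt

# Center-panned dialog cancels completely in the derived surround...
Lt, Rt = encode_lt_rt(L=0.0, C=1.0, R=0.0, S=0.0)
assert passive_decode_surround(Lt, Rt) == 0.0
# ...but hard-left front content leaks straight into it: this is the
# crosstalk that a discrete 70mm track would not have had.
Lt, Rt = encode_lt_rt(L=1.0, C=0.0, R=0.0, S=0.0)
assert passive_decode_surround(Lt, Rt) == 1.0
```

An active decoder like Prologic II steers against this leakage dynamically, but cannot eliminate it, which is why minimizing its audibility matters when judging how close a decode gets to the discrete source.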
I’m a millipede hermaphrodite.
If you’re upscaling to 5.1 just for the purpose of including the LFE, I wouldn’t bother as it’s a lot more trouble than it’s worth. Maybe mix in the LFE @ -10dB.
Or there’s a more elegant way of delivering stereo+LFE - direct bitstream configuration.
AC3 & DTSMA can support 2.1 natively, and encoders are readily available.
If you’ve actually heard what my 5.1 mixes sound like with the LFE, it’s unlikely you’d say it wasn’t worth it.
As for encoding in 2.1 format, I did try that a long time ago, but I won’t do it again. 2.1 is out of spec for AC3, and only non-Dolby encoders allow for this channel configuration. DTS does allow it, but receivers can be quite unreliable as to whether they’ll actually play it back properly. Some of them will, but others will treat it as a stereo signal only and ignore the .1 altogether. Upmixing to five channels and combining with LFE in order to create a standard 5.1 format was the only way I could reliably obtain both the surround audio and the enhanced bass response together in one mix.
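For anyone curious about the arithmetic behind the ‘mix in the LFE @ -10dB’ suggestion above, here’s a hypothetical sketch in Python (the function names are mine, not from any encoder) of converting a decibel level to a linear gain and folding an LFE channel into a stereo pair:

```python
def db_to_gain(db):
    # Convert a level in decibels to a linear amplitude multiplier.
    # 0 dB -> 1.0, -10 dB -> ~0.316, -3 dB -> ~0.707.
    return 10 ** (db / 20.0)

def mix_lfe_into_stereo(left, right, lfe, lfe_db=-10.0):
    # Fold the LFE samples into both stereo channels at the given level.
    # left/right/lfe are equal-length lists of samples.
    g = db_to_gain(lfe_db)
    out_l = [l + g * x for l, x in zip(left, lfe)]
    out_r = [r + g * x for r, x in zip(right, lfe)]
    return out_l, out_r
```

Note this is just the level math; a real fold-down would also have to worry about headroom and clipping, which is part of why keeping the LFE as a discrete .1 channel is preferable when the format supports it.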
I just got CatBus’ PM about this, and have downloaded the file posted by Puggo. Since I’ve been rather swamped with work lately I might not get to it right away, but I will investigate this mono version for any possible differences to the stereo mix.
Without having heard any of it yet, it seems likely to me that it would be a separate mix, but with only minimal differences in content. If the other two movies are anything to go by, there might be a few minor discrepancies in which sound effects are included, but probably nothing particularly significant.
TV’s Frink said:
We’re already in a handbasket
I just heard that NBC will air Megyn Kelly interviewing Alex Jones. What the fuck?
Her time would have been much better spent interviewing Tuck Buckford: https://www.youtube.com/watch?v=gv2RnmQUMhI
If you read the RotJ novelization, it states clearly during the Obi-wan scene that Leia was taken to Alderaan by her mother, and that her mother lived there with her for a while, dying when Leia was a few years old.
In this context, Leia’s ability to remember her is suddenly much more plausible. It also says that when Vader turned to the dark side and joined the Emperor, he had no idea his wife was pregnant — indicating that his turn happened for quite different reasons. And before anyone tries to discount the validity of the novelization as a source, remember that it was based closely on the film script and contains quite a bit of dialog that was written by Kasdan and Lucas but never actually made the final cut of the movie.
So even though these details aren’t in the movie itself, they are nonetheless quite important to the backstory context the film-makers had in mind while working on it. I’ve been aware of them ever since reading the novelization when I was 9. I fully expected that the prequels would adhere to them, and was extremely disappointed when I realized the extent of lazy revisionism that was going on.
Well it’s exactly as I predicted, except that the Silly Party won.
Election Night Special
Myself, I voted for Tarquin Fin-Tim-Lim-Bim-Whin-Bim-Lim-Bus-Stop-F’tang-F’tang-Ole’-Biscuitbarrel.
Instead of ‘oh crap’ or ‘oh shit’, I often find myself saying, “Oh, butts.”
Sometimes I’ll change it up a little by saying, “Oh, buttboobs.” (Because buttboobs are totally a thing, as we know.)
I rest my case about serious mental gymnastics being required to try to stitch these contradictory elements together.
They do not fit and are not even telling the same story. Nothing will ever make them fit, no matter how much threadbare string and worn-out duct tape you try to wrap around them.
These repeated attempts at retconning remind me of a straight woman married to a gay man, who remains adamantly convinced she can make him be attracted to her despite all the evidence that this will never happen. The only solution is to let it go and seek happiness elsewhere…
I’ve occasionally thought of making such a thing. A combination of the 1993 and 1997 mixes would probably be the best way to do it, but in some places that wouldn’t work and more creative editing would be necessary.
Any such thing could only ever be a rough approximation of the mono mix in stereo form, of course: the mono version has so many changes to the balance of the mix that for many of them it is probably not possible to get any results that truly resemble it. The 1997 version did add many mono mix effects, but they are often quite loud compared to the rest of the track, much louder than they were in the real mono mix. To actually blend them relatively seamlessly would take a lot of effort… if I cared more about the mono version I’d probably have already tried to do this, but since I quite prefer the mix balance of the stereo and 70mm versions I could never really be bothered. Still, one day I may end up trying it.
These kinds of retcons suck and are lame. They are obviously not at all what anyone had in mind when the earlier stories were made, and frankly they insult the intelligence of the viewer by expecting us to believe they actually make any kind of sense.
It is far preferable to simply dismiss the contradictory elements rather than trying to pretend they fit together as part of some grand plan. The mental gymnastics required to twist them around into such illogical shapes end up being too tiring and frustrating to keep track of, so at this point I don’t bother even paying attention to a word of any so-called ‘official’ continuity explanations.
The reason for this is that it seems film prints didn’t actually look like that, and the green tint slathered all over the image most likely was only added for home video. During the production itself, the main methods by which the unreality of the Matrix was emphasized visually were bleaching out the sky to eliminate as much blue from it as they possibly could, creating unusual contrast through lighting on set, and sometimes using green filters in-camera and in the color timing. On film, this is evident because some of the Matrix scenes do look somewhat greenish, but not all of them, and not nearly to the extent that they do on video. The DVD already diverges from the original look to a considerable extent, and the Bluray is particularly revisionist (in a nasty, ‘digital’ sort of way), and looks nothing like the film version.
Ghostbusters 2 on Bluray looks pretty good for the most part. It can be a bit over-contrasty sometimes, and the slime seems oddly oversaturated in the magenta range, but it’s not too bad.
The only real problem it has is that in the scenes of Vigo in the museum at the end, particularly as he is about to be reborn, earlier versions had a warm, golden glow throughout, while the Bluray seems to have abandoned this and merely has a neutral color scheme. Evidently they did not look at any existing prints of the film when deciding on the colors for this scene, which is disappointing because it looked rather better in the earlier versions.
True, the surround channel delay is supposed to be implemented by playback hardware only. And it is only supposed to happen during upmixing, not when playing 5.1 mixes.
In this case, the ‘hardware’ is the Dolby Media Decoder application that created the upmix. So for this scenario, it takes the place of the upmixer in the receiver, and the receiver only sees a 5.1 mix. It has no way of knowing that it originally came from a 2-channel source, and so will not apply any additional surround delay beyond what it is already doing to time-align the speakers. Thus when playing it back in 5.1 format, it will sound the same as if the receiver had done the upmix, aside from the addition of the LFE channel. (The LFE channel is, of course, the only reason for even bothering with making it 5.1 in the first place; if not for this, I would have distributed my edits in stereo and just let people upmix them in their receivers like they would anything else.)
So really the only part that is up to interpretation is how much surround delay should be applied in the Dolby software, since it gives full control over parameters which are normally hidden in most consumer equipment. I used longer delays in Empire and Jedi specifically to minimize the comb-filtering issue when the 5.1 is downmixed, since I knew I could not prevent people from doing this no matter how much I urged them not to. The only way to eliminate it completely would be to apply no surround delay at all, which is not an option in Prologic II movie mode or in the original Prologic. Only Prologic II music mode offers this option, but the lower channel separation in this mode is not ideal for film content. I could manually compensate for the delay in Pro Tools after recording the upmix, but then we’re back to the problem of crosstalk influencing the listener into thinking sounds that are front-panned are coming from the back of the room. Dolby specifically designed their movie upmixers to take advantage of the Haas effect, and since my 5.1 is an unusual case, I realized I had to follow their principles as well as I could while taking the differences into account.
Anyway, like I said, I believe I’ll be able to further reduce this in a subsequent version to the point that it will no longer be an issue, taking advantage of the audio engineering experience I’ve gained since then.
The delay of the upmixed surround channels is separate from the delay set by the receiver to time-align the speakers. This is done deliberately by the Dolby process in order to take advantage of the Haas precedence effect.
Due to the way our hearing works, when we hear closely repeated versions of the same sound coming from multiple locations, we perceive it as only emanating from the direction of the closest source, providing the time delay between them is less than 40 milliseconds. Longer delays are perceived as discrete echoes, but for shorter delay values our ears/brain fuse them together and only use the more distant sources to give clues to the size of the space the sound is located in. We cannot distinctly hear identical events that are that closely spaced together. Dolby upmixers take advantage of the Haas effect by delaying the surround channels compared to the front, with a variable range from 15 to 35 milliseconds, so that the inevitable leakage of sound from the front into the rear channels will arrive at the listeners’ ears after the same sounds have already arrived from the front channels. This way it is less likely that crosstalk will influence the listener into believing that front channel cues have come from the surrounds, thus increasing immersion into the aural landscape of the film.
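As a rough illustration of the delay itself, here’s a hypothetical Python/NumPy sketch (not Dolby’s actual implementation; the function names are mine) that delays the surround channels relative to the fronts by padding them with leading silence, using the 15-35 millisecond range mentioned above:

```python
import numpy as np

def delay_samples(ms, sample_rate=48000):
    # Convert a delay in milliseconds to a whole number of samples.
    return int(round(ms * sample_rate / 1000.0))

def delay_surrounds(ls, rs, delay_ms=20.0, sample_rate=48000):
    # Delay the surround channels relative to the fronts so that any
    # front-channel crosstalk arrives at the listener *after* the same
    # sound from the fronts, letting the Haas (precedence) effect fuse
    # the two arrivals and keep localization at the front.
    n = delay_samples(delay_ms, sample_rate)
    pad = np.zeros(n)
    return np.concatenate([pad, ls]), np.concatenate([pad, rs])
```

In a real decoder the fronts would also be latency-compensated so everything stays in sync overall; this sketch only shows the relative offset.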
This system works very well when upmixing a matrixed stereo track into multiple channels. When the channels combine acoustically in the air, it sounds as it should. But when downmixing them again after this, which is never intended to happen, a hollow comb-filtered sound is hard to avoid. Shorter delay times, as I used for the first movie, sound rather worse than longer ones. The other two films are less bad in this regard because I set the surround delays longer, resulting in less phase cancellation and weirdness.
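The comb-filtering tradeoff above can be seen in the spacing of the cancellation nulls: summing a signal with a copy of itself delayed by d seconds cancels at frequencies where the delay equals an odd number of half-periods, f = (2k + 1) / (2d). A small illustrative Python function (my own, purely to show the math):

```python
def comb_null_frequencies(delay_ms, max_hz=2000):
    # Frequencies (Hz) where summing a signal with a copy delayed by
    # `delay_ms` cancels: f = (2k + 1) / (2 * delay_seconds).
    # A longer delay packs the nulls more densely and makes each notch
    # narrower, which tends to be less audible than a few wide notches
    # from a short delay.
    d = delay_ms / 1000.0
    nulls, k = [], 0
    while True:
        f = (2 * k + 1) / (2 * d)
        if f > max_hz:
            return nulls
        nulls.append(f)
        k += 1
```

With a 20 ms surround delay the first null sits at 25 Hz and they repeat every 50 Hz, while a 35 ms delay pushes them even closer together, consistent with the longer delays sounding less hollow when the channels are folded back down.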