Over time, I’ve read a lot about film vs. video frame rates. The most confusing issue, perhaps, is the distinction between true 24p (24.0 fps) and NTSC video 24p (23.976 fps, or exactly 24000/1001). As I understand it, most films are shot in true 24p, so I assume the original Star Wars movies were as well (especially given their age). I notice that, e.g., Harmy’s Despecialized versions report a rate of 23.976 when I play them in mplayer (Linux). That means the movie runs about 0.1% slower than it was originally filmed (and viewed in theaters). The audio is 48000 samples per second, and to stay in sync with the 0.1% slower video it is also slower and lower in pitch by the same amount (I know it’s a small amount…).
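To sanity-check that 0.1% figure, here’s the arithmetic (the 120-minute runtime is just a hypothetical round number, not the actual length of any of the films):

```python
# The exact NTSC "23.976" rate is 24000/1001 fps.
true_rate = 24.0
ntsc_rate = 24000 / 1001

# Factor by which playback is slowed relative to true 24p:
slowdown = true_rate / ntsc_rate
print(slowdown)  # exactly 1001/1000, i.e. the 0.1% slowdown

# For a hypothetical 120-minute feature, the extra running time is:
runtime_s = 120 * 60
extra_s = runtime_s * (slowdown - 1)
print(round(extra_s, 1))  # about 7.2 extra seconds at 23.976
```

So a two-hour film picks up roughly seven seconds of extra runtime, which also explains why the pitch shift is small enough that most people never notice it.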
I was playing around today with the -speed option in mplayer. If I do:
mplayer -speed 1.001 star_wars.mkv
I should effectively be seeing it in true 24p, if all of the above is correct. I tried “-speed 1.3”, e.g., and it is noticeably faster and higher in pitch, so the option does seem to work. Should this be the way we really view these old movies? If so, does it make sense to make video files at 24.0 fps with 48048 samples per second (or keep 48000 by resampling, if that’s the only way)?
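If someone wanted to bake that correction into a file instead of applying it at playback time, something like the following ffmpeg invocation should do it (a sketch only; I haven’t run it on these files, and the filenames and codec choices are placeholders). The setpts filter speeds the video up by 1.001x, and asetrate raises the audio rate to 48048, which speeds it up and raises the pitch together, the same way mplayer’s -speed does:

```shell
# Sketch only -- filenames and codecs are placeholders, not from any real project.
# Video: retime 23.976 -> 24.0 fps. Audio: play 48000 Hz samples at 48048 Hz
# (faster AND higher in pitch), then resample back down to a standard 48000 Hz.
ffmpeg -i star_wars.mkv \
  -vf "setpts=PTS/1.001" -r 24 \
  -af "asetrate=48048,aresample=48000" \
  -c:v libx264 -c:a flac star_wars_24p.mkv
```

To instead keep the 48048-samples-per-second idea from above, you could drop the aresample step, though 48048 Hz is a nonstandard rate and some players may handle it poorly.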
Just something to ponder…