Should we attempt to watch Star Wars (original trilogy) in true 24p?

Author
Time

Over time, I’ve read a lot about film vs. video frame rates. The most confusing issue, perhaps, is the distinction between true 24p (24.0 fps) and NTSC video 24p (23.976 fps). As I understand it, most films are shot in true 24p, and therefore I assume the original Star Wars movies were as well (especially given their age). I notice that, e.g., Harmy’s Despecialized versions report a rate of 23.976 when I play them in mplayer (Linux). This means the movie runs about 0.1% slower than it was originally filmed (and viewed in theaters). The audio is 48000 samples per second, and to match the 0.1% slower video, it is also slowed and lowered in pitch by that amount (I know it’s a small amount…).

I was playing around today with the -speed option in mplayer. If I do:

mplayer -speed 1.001 star_wars.mkv

I should effectively be seeing it in 24p, if all of the above is correct. I tried “-speed 1.3”, for example, and it is noticeably faster and higher in pitch, so the option seems to work. Is this the way we really should be viewing these old movies? If so, does it make sense to make video files with 24.0 fps and 48048 samples per second (or keeping 48000 but resampling, if that’s the only way)?
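The arithmetic behind that speed factor can be sanity-checked in a few lines of Python (a sketch; it assumes the exact NTSC rational 24000/1001 as the value behind the “23.976” label):

```python
# "23.976" fps is exactly 24000/1001; playing it at -speed 1.001
# multiplies by 1001/1000, which cancels exactly back to 24.0 fps.
ntsc_rate = 24000 / 1001          # the true value behind "23.976 fps"
sped_up = ntsc_rate * 1.001       # what -speed 1.001 effectively plays
print(round(sped_up, 9))          # 24.0

# The matching audio rate for a 24.0 fps remux would be:
print(round(48000 * 1.001, 6))    # 48048.0 samples per second
```

So the 48048 figure above is exactly the 0.1% speed-up applied to a 48000 Hz track.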

Just something to ponder…

Author
Time

I’m not sure if you’re going to notice any difference between 23.97 and 24.

Author
Time

I’m not sure if you’re going to notice any difference between 23.97 and 24.

That’s a good question and could vary from person to person. But the question is really a “purist” one: should that step be done (if there are no drawbacks) to get as close to the original experience as possible? If changing the actual files would create problems (and I bet it very well might), should someone, in their home theater, change the playback speed on these movies by 1.001 times, or would that not be technically the right thing to do?

Author
Time
 (Edited)

Well, the difference is about 7 seconds over a two-hour movie. If you want to cut out those extra 7 seconds, go for it. Others might not see it as a worthwhile endeavour.
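For anyone curious where that figure comes from, here is the back-of-the-envelope calculation (assuming a two-hour film and the exact NTSC rate of 24000/1001 fps):

```python
# The same set of frames, played at 23.976 instead of 24.0 fps,
# takes about 7.2 seconds longer over a two-hour runtime.
frames = 24.0 * 2 * 60 * 60              # frames in a 2-hour film at 24 fps
runtime_ntsc = frames / (24000 / 1001)   # seconds when played at "23.976"
runtime_true = frames / 24.0             # seconds at true 24 fps
print(round(runtime_ntsc - runtime_true, 2))  # 7.2
```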

Author
Time

I noticed. There was an extra 0.03 frames every second, so I called Lucasfilm to complain.

Author
Time
 (Edited)

Actually, you would lose 0.03 frames every second.

But such small differences are beyond my limit of caring. You would probably get a better response from the folks over at the Preservation board.

Author
Time

Personally, I think there’s a difference between being a “purist” and going overboard. I think the idea here is mainly to preserve the original theatrical cuts (more or less), but I definitely think there’s such a thing as being too persnickety about things like a decimal change in frame rates or exacting color timing.

Keep Circulating the Tapes.

END OF LINE

(It hasn’t happened yet)

Author
Time

Most of the color correction stuff I can sympathize with; 24 vs. 23.976 is overboard.

If anybody really cared, you could just export the video at exactly 24 fps, though you may be stretching out certain sources that were exported at 23.976, arriving at a similar problem.

Author
Time

A lot of the color correction is misguided too.

The Person in Question

Author
Time

Tyrphanax said:

Personally, I think there’s a difference between being a “purist” and going overboard. I think the idea here is mainly to preserve the original theatrical cuts (more or less), but I definitely think there’s such a thing as being too persnickety about things like a decimal change in frame rates or exacting color timing.

As opposed to simply accurate color timing. Which is currently lacking in official releases of the OT.

“That Darth Vader, man. Sure does love eating Jedi.”

Author
Time

ATMachine said:

Tyrphanax said:

Personally, I think there’s a difference between being a “purist” and going overboard. I think the idea here is mainly to preserve the original theatrical cuts (more or less), but I definitely think there’s such a thing as being too persnickety about things like a decimal change in frame rates or exacting color timing.

As opposed to simply accurate color timing. Which is currently lacking in official releases of the OT.

Yes. There’s accurate and then there’s nitpicky and just plain speculative.

Author
Time

The speculative ones, inconsistent for the sake of so-called purity, are where it loses me.

Author
Time

wildlava said:

As I understand, most films are shot in true 24p, and therefore I assume the original Star Wars movies were as well (especially given their age).

Your understanding, as well as your assumption, is incorrect. Very few movies, unless they are very early digital movies, were shot in true 24 fps. Digital cameras today only have the capability of shooting 23.976 fps because early adopters of digital video in the film industry wanted the same frame rate as film, since they needed to print to film for distribution (there were very few, if any, digital theaters in the early days of digital). The frame rate has stuck around because it has become synonymous with the elusive “film look” that people shooting digitally strive for.

Author
Time

Darth Lucas said:
Your understanding, as well as your assumption, are incorrect.
Very few movies, unless they are very early digital movies, were shot in true 24fps.

My understanding is that 24 fps (true 24.0 fps) was settled on in the early 1900s (just Google it). This was for films (not video). And that makes sense: why would someone decide on a film frame rate of “23.976 fps” back in those days?

The 23.976 fps rate is related to the 29.97 fps of color TV. 29.97 was used to avoid problems with old TV hardware (see https://en.wikipedia.org/wiki/NTSC).

The Wikipedia page for “24p” (in the “23.976p” section here: https://en.wikipedia.org/wiki/24p#23.976p) states, “Nevertheless, even in NTSC regions, film productions are often shot at exactly 24 frame/s.” It also states that, “Some 24p productions, especially those made only for NTSC TV and video distribution (e.g. in Canada or the USA), actually have a frame rate of 24 * 29.97 / 30 frame/s, or 23.976 frame/s.”

So my understanding/assumption that most film productions destined for the cinema were shot at true 24.0 fps is pretty reasonable (if you find a reference that contradicts this, please post it). And if true, the Star Wars trilogy was also shot at 24.0 fps.

To address some earlier posts, I am not talking about dropping frames or anything like that. The video sources of the movies contain the actual frames that were shot by the film camera, and I am just saying to play those frames - all of them - sped up by 0.1% (during playback). Drawbacks could include “beating” with the refresh rate of your monitor, etc., but that’s another topic (and may be why this is not done typically).

Now, were those old cameras and projectors that accurate to begin with? Not sure.

And I know 0.1% is very, very small - I’m not arguing it’s really significant - this is more of a philosophical question. A “why not?”, if you will.

Author
Time
 (Edited)

I think you’re just confused. 24 fps has really just always been used as shorthand for 23.976 fps. 24 fps was decided on as the frame rate of motion pictures in the early days because it is the lowest frame rate at which you can still achieve the illusion of fluid motion. You have to understand that film is a mechanical process, and it was near impossible to achieve TRUE 24 fps in the early days; they eventually figured out that what the cameras were actually shooting was 23.976 fps. But the frame rate stuck due to aesthetics and tradition, even when it got to the point where we COULD accurately shoot true 24 fps. Even today, the standard frame rate for theatrical films, both digital and film, is 23.976. It came from tradition; it would be kind of odd if that was just a random decimal someone decided to use for the hell of it, right? Now, you could screen any 23.976 fps motion picture at true 24 fps and not really notice any difference, but I promise (at least in the case of Star Wars, but the majority of others as well) it is more accurate to screen at 23.976, since that is what the cameras captured and the projectors projected. (My source for this info is four years of film school and three years of industry experience, which you may or may not find more reliable than Wikipedia.)

But long story short, if you ever hear anyone in the industry say “24fps” what they mean is “23.976”. They are used interchangeably and very seldom is anything shot at TRUE 24fps.

Author
Time
 (Edited)

And just to avoid confusion, what your Wikipedia article is talking about is 24p, which is a video/broadcast format, not an analogue film projection format, which is how Star Wars was originally viewed. Hope I’m coming across well. I have a hard time explaining technical stuff like this but I’m just trying to help out your understanding.

When you’re talking about cinema, the answer is almost always 23.976. When you start getting into video standards, that’s when the waters get really muddy and confusion starts to set in.

Author
Time

Interesting insights, Darth Lucas! Thanks!

Author
Time
 (Edited)

Personally, I think this obsession with trying to make everything as close to a 1977 screening as possible is more than a little misguided. At the end of the day, you’re still watching digital video; short of striking a new print, you’re only ever going to get a simulation of authenticity. This refusal to optimize these things for home video has kept me from fully embracing a lot of the 35mm-based projects that have been released. I would love to see a 35mm scan of an entire Star Wars film all cleaned up and degrained, like what Harmy did in ROTJ v2.5 with that sequence that was out of focus on the Blu-ray, but the insistence on carrying over all of the imperfections to a new format where they don’t even really make sense kind of ruins it for me.

Author
Time

Running a film at its proper frame rate isn’t the same as watching a beat-up film scan, warts and all. Why on Earth should any film (or any digital video production, for that matter) be run at a different frame rate than the one it was made at?

Army of Darkness: The Medieval Deadit | The Terminator - Color Regrade | The Wrong Trousers - Audio Preservation
SONIC RACES THROUGH THE GREEN FIELDS.
THE SUN RACES THROUGH A BLUE SKY FILLED WITH WHITE CLOUDS.
THE WAYS OF HIS HEART ARE MUCH LIKE THE SUN. SONIC RUNS AND RESTS; THE SUN RISES AND SETS.
DON’T GIVE UP ON THE SUN. DON’T MAKE THE SUN LAUGH AT YOU.

Author
Time

The “23.976 fps” frame rate is a direct result of video frame rates established when color TV was developed. Doing the 3:2 pulldown process (used to convert film to video), and matching the NTSC frame rate of 29.97 fps, we get:

24 * 29.97 / 30 = 23.976 fps

This rate did not result, accidentally, from the imprecision of old mechanical film cameras/projectors. You are right that 24 fps was chosen for film, but, as you say, the old cameras did not have the kind of precision to all be “actually shooting at 23.976.” It was a result of the chosen NTSC TV scan rate. So my point is that old film cameras shot at 24.0 fps (or rather as close to that as mechanically possible: +/- some error), not precisely 23.976 (which is used for video and derived from the formula above).
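The formula can be checked numerically (a sketch; it assumes the exact NTSC rationals 30000/1001 and 24000/1001 behind the rounded “29.97” and “23.976” labels):

```python
# The NTSC frame rate is exactly 30000/1001 ("29.97"), so film
# slowed for transfer to NTSC video lands at exactly 24000/1001.
frame_rate_ntsc = 30000 / 1001
film_rate_video = 24 * frame_rate_ntsc / 30   # the formula above
print(round(film_rate_video, 3))              # 23.976
print(round(24000 / 1001, 3))                 # 23.976 -- the same number
```

In other words, 24 * (30000/1001) / 30 simplifies algebraically to 24000/1001, which is where the “23.976” figure comes from.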

Author
Time

Correct. When color TV was introduced here in the States, they had to make the signal compatible with the roughly 30 million b&w TV sets that had already been sold. What they discovered was that if the audio, the b&w video, and the additional color signal all ran at exactly the same refresh rate, they interfered with one another. So the video refresh rate was altered very slightly, to 29.97, and hence film-based material had to be slowed down from 24 to 23.976.

Author
Time

TheQuazz said:

So, is the OP right?

Yes, this video explains it pretty well: https://www.youtube.com/watch?v=mjYjFEp9Yx0
Essentially, since the scan rate of NTSC is 59.94 fields per second, there are 29.97 “frames” per second (one field is half of a frame). Film-to-video transfers were done using a process known as a 2:3 pull-down, which distributes 4 film frames amongst 10 fields (5 frames). The resulting perceived frame rate is 23.976 fps, since 4/5 = 23.976/29.97. Overall, 23.976 fps is merely a consequence of video technology and has nothing to do with a film camera’s inability to maintain a constant 24 fps. What I am not sure about, however, is why it is kept around even when modern equipment can shoot and project at exactly 24 fps.

Author
Time

TheQuazz said:

So, is the OP right?

If your question is, “Is Darth Lucas blowing some stinking hot vapor?” then the answer is very much a “yes.”