Well, the thing to remember is: why do you think 24 FPS was chosen as a standard? No, really--it seems like a stupid thing to even ask, but if you know a little movie history, what I'm about to say may be a bit clearer.
Back in the early 1900s, there were no standards for movies. It was a new, experimental technology, and people were branching out and doing all kinds of weird things with it, many of which never caught on. One of the bigger variables was frame rate. Most early films were shot at something like 16 to 20 FPS. At the time, when you projected it, the motion wasn't totally smooth; it was slightly staccato. But now the standard is 24 FPS, so those old 16 and 18 FPS films are played back at 24 FPS. This gives the sometimes comical effect of fast motion. That's the reason many old films look sped up.
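Just to put numbers on that fast-motion effect, here's a quick back-of-the-envelope sketch in Python (the frame rates are illustrative examples, not tied to any particular film):

```python
# Footage shot at one frame rate but projected at a higher one plays
# back faster by the ratio of the two rates.

def speedup(shot_fps: float, projected_fps: float) -> float:
    """Factor by which on-screen motion is accelerated."""
    return projected_fps / shot_fps

for shot in (16, 18, 20):
    factor = speedup(shot, 24)
    print(f"Shot at {shot} FPS, projected at 24 FPS: {factor:.2f}x speed "
          f"(a 60-minute film runs in {60 / factor:.1f} minutes)")
```

So an 18 FPS film projected at 24 FPS runs a third faster than life, which is exactly that jittery, comical hustle you see in silent-era footage.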
Why choose a frame rate that looks jerky when projected natively, or fast when played back at the current standard of 24 FPS? Money. Film is very, very, very expensive. Five minutes of film costs about $1000 for raw stock, and another $1000 to develop and transfer--in today's dollars. At least it did when I worked with 35mm a few years ago. They could have filmed at 24 FPS instead of 12, 16, 18, or 20 FPS. They could have also easily filmed at 48 FPS--and they did. But film was so expensive that shooting a feature that way would make the costs skyrocket. They chose an under-24 FPS rate at first because it was cheap. It was affordable. So what if it looks a bit jerky--people will be wowed anyway, and it's a good compromise. Around the 1920s, as the medium matured and a whole industry sprouted, they had to decide on standards. (The arrival of synchronized sound in the late 1920s forced the issue: an optical soundtrack needs one fixed projection speed, and a fast enough one to sound decent.) So, what became the standard frame rate? 24 frames per second. NOT AT ALL because it was ideal, or realistic. They chose that rate because it was the best compromise between affordability and realism. It wasn't all that realistic, but it was halfway there, and it was a fraction of the cost of filming at something like 60 FPS, which most of the nice-looking video games of today run at.
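To see how directly cost scales with frame rate, here's a rough Python sketch extrapolating from my remembered figures above. This covers stock and processing only, ignoring everything else that goes into a production, so treat it as illustrative:

```python
# Back-of-the-envelope stock-and-processing cost at different frame
# rates, using the rough figures from this post: ~$1000 of raw stock
# plus ~$1000 to develop and transfer per 5 minutes of 24 FPS footage.
# Film consumption -- and therefore cost -- scales linearly with FPS.

COST_PER_MINUTE_AT_24 = 2000 / 5  # ~$400 per finished minute at 24 FPS

def stock_cost(minutes: float, fps: float) -> float:
    """Estimated stock + processing cost, scaling linearly with frame rate."""
    return minutes * COST_PER_MINUTE_AT_24 * (fps / 24)

for fps in (16, 24, 48, 60):
    print(f"{fps:>2} FPS: ${stock_cost(90, fps):,.0f} for 90 minutes of footage")
```

Double the frame rate, double the footage, double the bill--and that's before reshoots, extra takes, and extra prints for distribution, which all scale the same way.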
So, from day one it was an arbitrary figure based on financial compromise. But since we are exposed to it daily from birth, it seems "natural" to us. It isn't. There is nothing "natural" or "right" about it; it's an arbitrary decision we are simply used to. 30 FPS looks ugly because it's too small an improvement. 48 FPS might just be enough of a difference to cause a shift in perception. I would prefer Dougie's 128 FPS standard, since we aren't printing film anymore, but at least 48 FPS is a doubling of what we have now, and just far enough into the "realism" zone to qualify. 24 FPS looks "nice," but that's subjective--mainly because we have been raised from birth looking at it. The history behind the standard really comes down to money.