Most movies are shot at 24 frames per second, or sometimes 25. Should you play your video games at 24 FPS to get the same effect? I have considered playing my games at 24 FPS, but unfortunately not many titles let you set VSync to that number. Do you believe that video games lose their cinematic effect above 24-25 FPS? Would our video game experiences be superior at 24 or 25 FPS?
24fps as a cinema standard came into existence around 1926. The variable 16-26fps frame rates of early silent movies only worked because the audio was separate from the film (e.g. an organist in the cinema's "pit" playing live to viewers). When "talkies" were introduced and audio was recorded onto the film itself, reliable playback at the correct pitch required a standardised frame rate. That's all there has really ever been to the 24fps "cinematic standard". It was never about realism or "aesthetics" even 95 years ago, and comparisons with games are bogus for several reasons:

1. fps is not Hz. Even 100 years ago, projectors used dual- and triple-blade shutters (each frame was flashed 2-3 times, raising the actual refresh rate shown to viewers to 48-72Hz from 24fps source material) to reduce eye strain from the chronic flicker that a true 24Hz display comes with. Old CRT displays also had phosphor persistence that better hid effects such as interlacing on broadcast TV than today's flicker-free TFTs do. Forcing a monitor to VSync at just 24Hz today is nonsense when that isn't how 24fps movies themselves have ever been shown, even 100 years ago.

2. Film-based movies have natural motion blur; rendered games do not. Pan quickly (turn very fast) in a game at 24fps and it will be juddery and stuttery as hell, but the picture will be sharp when paused on a single frame. Movies are the opposite: they appear smoother, but pause during a fast pan and the frame is blurry as hell.

3. 24fps only "works" on domestic TVs (the 50Hz PAL / 60Hz NTSC standards were chosen purely because they matched the 240V 50Hz / 120V 60Hz mains frequencies, which were easy to use as a timing reference in early TVs) by either speeding it up 4.167% (24 -> 25fps for PAL) or resorting to the awful 2:3 pulldown (spreading 4 source frames across 5 output frames, 24 -> 30fps for NTSC), and then further splitting each of those frames into half-frame interlaced "fields" (25p -> 50i or 30p -> 60i). Why on Earth anyone would find that desirable to replicate for locally rendered video games, when it has always been a massive mess of a compromise forced on broadcasting, is beyond me.

4. The "cinematic look" is mostly psychological habituation. People have been "used to it" as normal for 95 years and so find 60fps movies "abnormal". Had something else been selected as the standard in 1926, e.g. 72fps, they'd feel exactly the same way about anything different: "24fps looks awful because it's not what I'm used to in cinemas."
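If it helps to see the arithmetic behind points 1 and 2, here's a quick Python sketch. The 3-blade shutter and the 180-degree shutter angle are conventional film values, and the pan speed is purely an illustrative assumption:

```python
# Rough numbers behind the shutter-flicker and motion-blur points.
FPS = 24

# Point 1: a multi-blade shutter flashes each frame several times, so the
# flicker rate the audience actually sees is a multiple of the frame rate.
for blades in (2, 3):
    print(f"{blades}-blade shutter: {FPS}fps shown as {FPS * blades}Hz flicker")

# Point 2: film exposure time follows from the shutter angle; a rendered
# game frame is an instantaneous snapshot with no exposure window at all.
shutter_angle = 180                      # degrees, the classic film shutter
exposure_s = (shutter_angle / 360) / FPS # 1/48 s, about 20.8 ms per frame
print(f"Exposure per frame: {exposure_s * 1000:.1f} ms")

# During a fast pan of, say, 1000 pixels/second across the screen
# (an assumed figure), everything smears by this much on each frame:
pan_px_per_s = 1000
blur_px = pan_px_per_s * exposure_s
print(f"Motion blur during the pan: about {blur_px:.0f} pixels")
```

That ~21-pixel smear is exactly what you see when you pause a film during a fast pan, and exactly what a sharp-but-stuttery 24fps game frame lacks.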
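And to make the 2:3 pulldown mess from point 3 concrete, here's a minimal sketch of the cadence: four 24fps film frames become ten interlaced fields, i.e. five 30fps (60i) video frames, two of which mix content from different film frames (the source of telecine judder). The function name is just for illustration:

```python
def pulldown_2_3(frames):
    """Expand film frames into interlaced fields using the 2:3 cadence."""
    fields = []
    for i, frame in enumerate(frames):
        copies = 2 if i % 2 == 0 else 3  # alternate 2 fields, then 3 fields
        fields.extend(frame for _ in range(copies))
    return fields

film = ["A", "B", "C", "D"]   # 4 source frames = 1/6 of a second at 24fps
fields = pulldown_2_3(film)   # A A B B B C C D D D  (10 fields)

# Pair fields back up into output video frames:
video_frames = [tuple(fields[i:i + 2]) for i in range(0, len(fields), 2)]
print(video_frames)  # 5 frames; ('B','C') and ('C','D') mix two film frames
```

Four frames in, five frames out is precisely the 24 -> 30fps ratio, and those mixed-source frames are why NTSC telecine pans look so ugly.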
In short: the 24fps standard was never about "realism", "aesthetics", or the limits of human vision, even at its introduction; 24fps movies have never actually been shown at 24Hz; modern TFTs do not work like older CRTs, and VSyncing one to 24Hz would produce unusably harsh flicker; and the "joys" of interlacing and 2:3 telecine are things we weep tears of joy to be rid of, not things to artificially replicate in a home environment for no real reason. Almost nothing about 60+fps games is comparable in any way with 24fps movies or broadcast TV.