When news broke that The Hobbit would utilize a frame rate double that of standard film in order to ease the 3D effect, I knew audiences were gonna hate it… they just wouldn’t know why.
Cinemas project film at 24 frames per second, an industry standard (call it 24fps). Soap operas, news programs, and most reality TV are shot at 60 interlaced fields per second, which weave together into roughly 30 full frames (that’s called 60i, but let’s just say 30fps for our purposes; it’s close enough). Here’s a video demonstration of the difference:
If your eyes can’t tell the difference between 60i/30fps and 24fps, don’t beat yourself up, because even if your eyes can’t, your brain actually can and does, like magic. We actually have an organ that does shit we can’t comprehend even though we live right underneath it. It’s easy to accept something so amazing.

So when you watch 30fps, your brain subconsciously associates it with reality TV, news, and soap operas (or TV dramas), which are shot at that frame rate. When you watch 24fps, your brain associates it with narrative storytelling. There’s nothing magic about this part; it’s just how these frame rates have been used for half a century, and it’s a cultural norm that’s been wired into our brains, which is fine. It works.

The rule of thumb for filmmakers has been: if you want your audience to buy into the story, shoot it in 24fps. If you want them to feel the reality of what you’re showing them, shoot it in 30fps (sports, news, documentaries, etc.). Shooting 24fps doesn’t guarantee a “cinematic look” unless you also fulfill basic cinematic requirements like lighting, good storytelling, and believable acting, and likewise some documentaries are better suited to 24fps, but as a guiding principle in the mechanics of film, the gateway to telling the audience they’re watching a narrative is to shoot it at 24 frames per second.
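For the tech-minded, here’s the raw arithmetic behind those two standards (nominal numbers only; NTSC is really 59.94 fields and 29.97 frames, but like I said, close enough):

```python
# Nominal numbers behind the two standards: film runs at 24 progressive
# frames per second; broadcast video runs at 60 interlaced fields per
# second (60i), and every two fields weave together into one full frame.

FILM_FPS = 24
VIDEO_FIELDS_PER_SECOND = 60

film_frame_ms = 1000 / FILM_FPS          # ~41.7 ms of screen time per film frame
video_fps = VIDEO_FIELDS_PER_SECOND / 2  # ~30 full frames per second
video_frame_ms = 1000 / video_fps        # ~33.3 ms of screen time per video frame

print(f"film:  {FILM_FPS} fps, {film_frame_ms:.1f} ms per frame")
print(f"video: {video_fps:.0f} fps, {video_frame_ms:.1f} ms per frame")
```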
Now Peter Jackson has decided to buck that tradition with The Hobbit, and journalists are already smelling something funny:
While 48fps may create a more realistic, “you are there” picture quality, it actually works against The Hobbit from the 10 minutes of footage we saw. This undeniable “reality” kept pulling me out of the movie rather than immersing me fully into its world as the Lord of the Rings films did; the very fantasy element, the artifice of it all (whether it’s the wigs, fake beards or CG monsters) was plainly, at times painfully, evident. There was none of the painterly gentleness that film offers a fantasy film, as was so beautifully the case with the original (shot on film) LOTR trilogy. I fully expect the 48fps issue to become the much-talked about “mumbling Bane” flap to come out of CinemaCon.
Right as he is about his own perceptions, the journalist misses the actual cause for alarm: audiences aren’t going to notice why their cinematic experience now feels like a cheap reality TV show or a news item. They only subconsciously register the difference between 24fps and 30fps, and anything beyond that, including 48fps, just looks like 30fps. To make matters worse, at home they can’t even tell that their new television sets render 24fps into silky smooth 30fps for “clarity”. Stu Maschwitz explains a trend that, despite its good intentions, ruins the look of a film at home (it gets technical, but read past the numbers and you get the gist; there’s a rough sketch of what that “smoothing” amounts to after his quote):
Fortunately, you probably won’t be seeing the hyper-colorful showroom “torch mode” version either. Most reputable manufacturers are seeking Energy Star approval on their flatscreens these days, and part of that certification means that the sets cannot come off the truck in “demo mode,” also known as torch mode. You may be presented with a choice when you first power up the set: demo mode or something like “home” mode. Pick “home” mode and your default settings will be somewhat tamer than the “hey, look at me” showroom floor configuration.
And that, right there, is as far as 95% of TV owners will ever go toward “calibrating” their TVs. More than ever, this is a tragedy.
…
[The new] 120Hz and 240Hz TVs have the potential to show you each 24p frame for exactly 1/24th of a second, perfectly replicating The Way Movies Look, and that’s great. The problem is, it’s hard to make them do that, because of awful motion-smoothing settings that are On by default.

Manufacturers somehow decided that The Way Movies Look is bad, and that they should “fix” this using technology. The same technology that is used in software like Twixtor and Kronos to change frame rates of video. Why show a mere 24 frames per second when we can magically build, or interpolate, new in-between frames and show 120 or even 240 frames per second?
The results, while varied, are certainly “smoother” than 24p. And the engineers rejoiced. “We’ve fixed that horrible ‘film look!’”
You’ve seen this. It’s in every store. A big, bright, blue LCD set blasting Avatar. The image is so smooth and “live” looking, you catch yourself trying to figure out if it’s the game or the movie. Or it’s some classic film, playing big and bright and smooth and causing you to wonder if you’re somehow seeing the EPK “behind the scenes” video instead of the actual movie.
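That “smoothing” is the TV manufacturing frames the camera never shot. Here’s a minimal sketch of the idea, assuming nothing fancier than a linear crossfade between neighboring 24p frames (actual sets use motion-compensated interpolation, which tracks moving objects rather than just blending, but the net effect is the same):

```python
import numpy as np

def interpolate_to_120(frames_24p):
    """Naive 24p -> 120fps conversion by crossfading.

    Inserts four blended in-between frames for every pair of neighboring
    source frames (24 x 5 = 120). Real TVs use motion-compensated
    interpolation instead of a plain blend, but either way the new
    frames are ones the camera never captured.
    """
    out = []
    for a, b in zip(frames_24p[:-1], frames_24p[1:]):
        out.append(a)
        for i in range(1, 5):                # four synthetic frames per pair
            t = i / 5.0
            out.append((1 - t) * a + t * b)  # crossfade toward the next real frame
    out.append(frames_24p[-1])
    return out

# Toy example: two 2x2 grayscale 'frames', black fading to white.
frame_a = np.zeros((2, 2))
frame_b = np.ones((2, 2))
smoothed = interpolate_to_120([frame_a, frame_b])
print(f"{len(smoothed)} frames out, only 2 of them real")
```

Feed it 24 real frames every second and it hands the panel 120, only a fifth of which ever came out of a camera.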
It’s not that consumers are incapable of seeing the difference between frame rates; it’s that being fed a new frame rate for films will subconsciously erode the audience’s ability to treat film as “film”, both at home on the television and, possibly now, in the theater.
Whether Jackson’s 48fps helps the 3D or not isn’t what matters; other 3D films seem to do fine without it. The issue is that when a producer bundles a new idea like this into a massive hundred-plus-million-dollar package like The Hobbit, with the utterly idiotic assumption that it will go unnoticed in the grand scheme of the film, something will still smell bad to the audience; since they’re unable to pinpoint it, they won’t know what exactly hit ’em. They’re left with a general feeling of, “Yes, it was a good movie, but something was off, and I can’t explain what it was.” Meaning, they were distracted by something, and so the film failed to engage them fully.
Like the hundreds of shoddy economic policies hidden in the folds of a 400-page bill, producers manage to sneak stuff like 48fps, nauseating shaky camerawork, and rapid-fire editing into these film packages. Producers of any kind serve their clients best when production serves the ultimate end of the product as formally defined. If the product is “good government”, producers protect rights and enforce the law. If the product is an “action film”, producers let the audience see what the hell the action is. If the product is plainly “cinema”, then just give us 24fps.
EDIT – Coming from the perspective of an action filmmaker, my being so adamant about using 24fps for film stems from making action films in 30fps for years (alright, alright! For you tech buffs, I mean 60i!). It’s more difficult to pull off a fight scene in 30fps without frame-dropping or some sort of “cheating”. Moves take up more frames and appear to take longer, and lining up hits takes more accuracy. Shifting the shutter speed to 1/120 or higher just to make the action appear faster only aggravates the process of lining up hits, since the subjects have such exact outlines in the frame that they intersect less often than they would if slightly blurred. A little blur at a 1/48 or 1/60 shutter speed makes hits line up better with no cheating while still avoiding the ugly “video” look. If you’re okay with cheating, you can drop frames, undercrank, or speed up in post, but the subjects lose frames in their movement and separate from reality, viewer engagement declines, and your hits lose impact.
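To put rough numbers on why the shutter matters (the punch speed below is an assumed figure for illustration, not something measured on set): the blur streak a fist leaves in a single frame is just its speed multiplied by how long the shutter stays open, so going from 1/60 to 1/120 halves the blur and makes every frame that much more “exact” to line up.

```python
# Back-of-the-envelope motion blur per frame at different shutter speeds.
# PUNCH_SPEED_M_PER_S is an assumed figure for illustration, not a measurement.

PUNCH_SPEED_M_PER_S = 7.0   # a quick jab, very roughly

for denominator in (48, 60, 120):
    exposure_s = 1 / denominator                      # time the shutter is open each frame
    blur_cm = PUNCH_SPEED_M_PER_S * exposure_s * 100  # length of the streak, in centimeters
    print(f"1/{denominator} shutter: ~{blur_cm:.1f} cm of blur")
```

At 1/120 each fist sits in the frame as a crisp, exact silhouette, which is exactly why the hits are harder to line up: the subjects genuinely overlap in fewer frames than they seem to when a little blur smears them together.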