The Big Picture: Frame Rate


Frame Rate

How many frames per second does it take to anger critics?

Old farts, I mean critics, will spin their heads if movies go to 60 fps like most gamers love.

Hm... I can certainly see the problem they face, and I can understand that to some people it will seem slightly irritating at first. Seriously though, change is not bad. Progress in technology is not bad. Those critics are wrong.

One thing I know is that higher frame rates are the bane of the post production crew.

Thanks Bob, that explains why my roommate was going on about frame rates right before we went to see The Hobbit. But what does it have to do with Supergirl?

Hmpf... I don't care, but I wish they'd stop with the CGI. When CGI gets old it gets VERY ugly, to the point of us saying "dude, that's not how I remember it." Props and practical effects, though, when they're well done, stick forever, or at least get a classic stamp on them.

Boy, I'm sure glad I'm so obtuse when it comes to the making of a film. I was wondering why the movie got mixed reviews, and really, I didn't, and still kind of don't, see why that is. I probably didn't see it in the 48 fps format, but I don't think it matters either way. Maybe it does to some people, but if a movie is more "clear" and more visually impressive, I honestly don't see how that can be viewed as a negative thing.

Milanezi:
Hmpf... I don't care, but I wish they'd stop with the CGI. When CGI gets old it gets VERY ugly, to the point of us saying "dude, that's not how I remember it." Props and practical effects, though, when they're well done, stick forever, or at least get a classic stamp on them.

Who was talking about CGI?

This frame rate "issue" is just plain dumb to me.

PC gamers are far from alien to the term "FPS" (no, I don't mean First Person Shooter). PC gamers with decent enough rigs strive for 60+ FPS in every game they run, and for good reason: the same reason professional e-sports players always play on a game's minimum settings to get maximum performance, which means less delay between player command and game reaction.
The difference between 30 and 60 FPS, and even 120 FPS, is quite noticeable. This tech is what "the PC gaming master race" has been developing for YEARS! And they're still working on it, as graphics rendering and fidelity develop constantly. For example, those CGI-rendered cutscenes we see in trailers, and the current CGI-animated films (see DreamWorks/Pixar), take days to render frame by frame; actual movies take months, and they're rendered by supercomputers with dozens of processors and GPUs. I'm not very tech savvy, but this is what I learned from 5 years of lurking around the internet.
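To put some numbers on the latency point above: converting frame rate into per-frame time shows why each jump matters, and why the gains shrink as the rate climbs. A quick back-of-the-envelope sketch (mine, not the poster's), in Python:

```python
# Per-frame time budget at common frame rates: each doubling of the rate
# halves the time between new images, which is where the "less delay between
# player command and game reaction" point comes from.
for fps in (24, 30, 48, 60, 120):
    frame_ms = 1000.0 / fps
    print(f"{fps:>3} fps -> {frame_ms:4.1f} ms per frame")

#  24 fps -> 41.7 ms per frame
#  30 fps -> 33.3 ms per frame
#  48 fps -> 20.8 ms per frame
#  60 fps -> 16.7 ms per frame
# 120 fps ->  8.3 ms per frame
```

Note the diminishing absolute returns: 30 to 60 saves about 17 ms per frame, while 60 to 120 saves only about 8 ms.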

Granted, it's quite a luxury for games, let alone movies, to run above 24/30 FPS. It is noticeable, but the change isn't staggering enough to be a dealbreaker or worth fussing over.

TL;DR: 48 frames is objectively better, but it's a luxury for now.

MovieBob:
Frame Rate

How many frames per second does it take to anger critics?

"This will sound like Klingon to most of you"

Haha, oh man. After years of frame-rate dips, every single "HD gen" gamer has a very good working knowledge of what a frame rate is. Everyone who has ever been near PC gaming has a borderline obsessive relationship with frames per second, too. I know I do. I can notice whether something is running at 40 or 60 fps after years of nit-picking in the settings menu, never mind a dip to 24 fps. Some Blu-ray movie conversions especially suffer from pretty noticeable stuttering at 24 fps.

I don't know what the fps of most HD televisions is, nor if that's even a factor. What I can say is that the clarity of the image, especially when there's movement, is off-putting. Anyone who says this is more like real life is going to hell for lying. The sharper image of, say, a tennis match looks less real than the old non-HD image.

I think the problem may be a bit of an uncanny-valley effect. As filmmaking technology approaches what the human eye sees in real life, the differences become more glaring and off-putting. I'm not even talking about makeup, sets, and prop effects not being up to snuff in the sharper image. I was watching a tennis match on my parents' HD television, and it didn't look like real life; the effect of all the movement, watching the ball and such, was off-putting.

So when I eventually see The Hobbit, it will be in a non-3D, 24 fps theater. I don't need to pay movie ticket prices to have a bad experience.

I was planning to go see this tomorrow, since I'm off work, but I just checked online and my local theater has the HFR version, so I'll see what all the fuss is about.

I saw it in 48 FPS. Here are my spoiler-free thoughts:

When it came to the fully-CGI aspects of the film, especially CGI characters and creatures, it was excellent. They felt really alive and authentic. We're used to this from video game cutscenes and the like, though. I think a fully-CGI film will look absolutely great in 48 FPS.

Conversely, when it came to scenes with live actors, it often looked too real. Like a bunch of LARPers with bad makeup putting on a play. This might have been fine with a story more grounded in reality, but my girlfriend and I felt it detracted from the mythic/fantasy illusion they were going for. It got a bit better later on. We want to see it again in 24 FPS and see if that restores the illusion.

Then again, the 3D did look a lot more convincing in 48 FPS than most 24 FPS films do.

On the other hand, when the camera moved very quickly, especially during action scenes, it almost gave me nausea. With static shots or slow pans it was fine. But if the side benefit of this is the death of shaky-cam, consider me an instant convert.

I'm not dismissive of the format as a whole, and I think it will definitely improve over time, but, like Bob, I feel filmmakers will have to do a lot of reinventing of their craft. Not just camera placement, but lighting, editing techniques, new types of makeup, and so on. And I'm not sure having a fantasy film being the first test of the format was the wisest decision, but as my friend said, someone had to be first.

MB202:
Boy, I'm sure glad I'm so obtuse when it comes to the making of a film. I was wondering why the movie got mixed reviews, and really, I didn't, and still kind of don't, see why that is. I probably didn't see it in the 48 fps format, but I don't think it matters either way. Maybe it does to some people, but if a movie is more "clear" and more visually impressive, I honestly don't see how that can be viewed as a negative thing.

I haven't seen The Hobbit, but people have described what it looks like by comparing it to things I have seen. I remember looking at HD TVs in a Sony Store, and some of them have a type of technology that makes the picture look like what I've been told 48 fps looks like. It's kind of hard to describe, but it's sort of like this: you know the little screen on camcorders that lets you see what you're recording? Imagine a movie that looks like you were seeing it through that. It may sound like it doesn't make a difference, but trust me, it does. The way I imagine it, the movie looks like behind-the-scenes footage, where you can tell the props are fake and everything. Like I said, it's hard to explain.

I would also like to add that many critics referred to it as "too bright." This is actually a legitimate comment: the higher projection speed means there is less 'between time' during frame transitions, which actually increases the perceived amount of light. I'm not sure what compensation was made for this in the digital grading, but traditional theaters and screens may have to adapt, since light intensity is calibrated for 24 FPS.

Just an interesting technical point.
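A toy model of that point, with the caveat that it's my simplification and not how any real projector is specced: treat perceived brightness as proportional to the fraction of time light actually reaches the screen. A traditional 24 fps film projector with a 180-degree shutter is dark roughly half the time, while a digital HFR presentation with near-instant frame changes is lit almost continuously.

```python
# Toy brightness model (my assumption, not a projector spec sheet): perceived
# brightness scales with the duty cycle, i.e. the fraction of each second
# that light actually reaches the screen.
def relative_brightness(duty_cycle):
    return duty_cycle  # relative to a screen lit 100% of the time

film_24fps = relative_brightness(0.50)  # 180-degree shutter: dark half the time
hfr_48fps = relative_brightness(0.95)   # assumed near-instant frame changes

print(f"24 fps film: {film_24fps:.0%} of peak light")
print(f"48 fps HFR : {hfr_48fps:.0%} of peak light")
```

On this (very rough) model, the same lamp would read nearly twice as bright at the higher rate, which lines up with the "too bright" complaints.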

brazuca:
Old farts, I mean critics, will spin their heads if movies go to 60 fps like most gamers love.

Well, if it looks as shitty as 48 FPS does because they're using old tricks on new tech, they'd have every right to.

Speaking of which, 48 FPS isn't new tech. It's old tech that just isn't commonly used for cinematography. Some of Bob's points still hold, however: props look cheap, makeup looks bad, and so on. It's like HD porn: good in theory, but the clarity it gives often shows things you don't want to see.

Compact Discs and digital mastering went through growing pains like this, with few of the early masters being worth a crap. They spent a freaking FORTUNE on the Beatles CDs, and even those aren't so hot compared to what we can do now (though they've aged much better than almost anything else).

The tech probably will be good eventually, but you're applying new technology to techniques that have been around for decades. If you did the same thing with gaming, it would probably still "spin" the heads of gamers. This is falling asleep in Kansas and waking up in Oz.

And the real reason this is getting so much attention is because of how much hype there was behind it. If you tout your visuals, people will look at your visuals. No mystery here, Bob.

the antithesis:
I don't know what the fps of most HD televisions is, nor if that's even a factor.

HDTVs can do frame rates in excess of 48.

I saw the "High Frame Rate" (48 fps) version of The Hobbit yesterday. Looked a bit odd for the first few minutes, as the actors seemed to move really fast, but by the time Bilbo left the Shire everything felt perfectly fine.

Hell, this was the first film I've seen in 3D where the visual depth of the scenes felt both natural and constant for the duration of it all. (Note that I have yet to see an animated movie in 3D, due to various circumstances.)

Zachary Amaranth:
HDTVs can do frame rates in excess of 48.

Ah, good. So it is a factor. Put me squarely in the "hate it" camp, then. Something that's supposed to make the image even clearer actually making it worse is just false advertising. We should sue.

I was really hoping someone would tackle this subject.
Thanks Bob.

brazuca:
Old farts, I mean critics, will spin their heads if movies go to 60 fps like most gamers love.

60 fps is a target, but many of those games, due to even small differences, will drop to somewhere around 40-50 on PC, while mostly maintaining the high 50s on consoles. Though it should be noted that there is still some debate as to whether the human eye can even tell the difference:

The test is to take a 1-2 minute segment of a continuous shot, randomly splice in a single different frame, show it 3 different times (in case someone wasn't paying attention the other 2), and then have each person state where the spliced-in frame was. The result is that many people can't really detect it past the upper 40s.
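For what it's worth, the protocol is simple enough to sketch. Here's a hypothetical harness (names and parameters are mine, not from any actual study): build a clip of identical frames, splice one oddball in at a random spot, and check whether the viewer's reported position lands close to the true one.

```python
import random

# Hypothetical spliced-frame test harness (illustrative, not a real study's
# code): hide one oddball frame in an otherwise uniform clip and score a
# viewer's guess at its position.
def make_test_clip(n_frames, normal=".", oddball="X"):
    clip = [normal] * n_frames
    spliced_at = random.randrange(n_frames)
    clip[spliced_at] = oddball
    return clip, spliced_at

def is_hit(reported, actual, tolerance=3):
    # allow a few frames of slack; nobody reports an exact frame index
    return abs(reported - actual) <= tolerance

clip, actual = make_test_clip(n_frames=48 * 90)  # ~90 seconds at 48 fps
print(f"oddball hidden at frame {actual} of {len(clip)}")
print("hit" if is_hit(reported=actual + 2, actual=actual) else "miss")
```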

There is also a big difference between "film" and digital (video games, and some movies). Film (the medium in which light is exposed onto a strip of thin plastic) is controlled by the rate of the motor moving it, while digital is controlled by the refresh rate of the system/screen displaying it (along with other calculations and controls taking place). The reason it usually isn't noticed in a game is that frame B is not so much moving in to replace frame A as frame A is thrown away and frame B is drawn in its place.
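That "thrown away and redrawn" behavior is classic double buffering. A minimal sketch of the idea (the names are illustrative, nothing engine-specific):

```python
# Minimal double-buffering sketch: the screen shows the front buffer while
# the renderer fills the back buffer; a swap just trades the two roles, so
# no frame ever visibly "moves" to replace another.
front_buffer = "frame A"  # currently on screen
back_buffer = "frame B"   # just finished rendering

front_buffer, back_buffer = back_buffer, front_buffer  # the swap
print("now displaying:", front_buffer)  # frame B; frame A's buffer gets reused
```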

Not to mention that for movies to reach 60 fps, the way they are recorded/captured would have to change dramatically, and in some cases the result would not be as noticeable as you would think.

In other words: "Get off my lawn, young people!!!"

I liked the 48 fps. But man, am I glad to hear that those were the complaints. I noticed those problems, but couldn't figure out whether they were just in my head.

Saw it in the best theater possible: 3D (shutter glasses), HFR, Dolby Atmos. It was a whole new way to watch movies and a whole lot of fun. Certain camera movements and fast-moving objects were a touch distracting, but once those get ironed out, watching 3D this way will be mandatory for me. I've never seen such clarity in a 3D presentation, and the movie was deeply engaging. Love it.

Those disadvantages Bob mentioned (things move unnaturally fast, and CG and effects are easier to pick out) are exactly why I hated seeing the movie in 48 fps. I'm sure it will be fine once the technology catches up, but it's not there yet. Special effects in The Hobbit felt more fake than they did in the now decade-old Lord of the Rings trilogy. The whole thing broke the immersion for me, and that's the opposite of what it was supposed to do.

2:10

"The theatres showing it in 44 FPS..."

48*

Is it normal for the intro title to be different from the actual video title? Frame Rate vs. Frame Job.

Scrumpmonkey:
I would also like to add that many critics referred to it as "too bright." This is actually a legitimate comment: the higher projection speed means there is less 'between time' during frame transitions, which actually increases the perceived amount of light. I'm not sure what compensation was made for this in the digital grading, but traditional theaters and screens may have to adapt, since light intensity is calibrated for 24 FPS.

Just an interesting technical point.

I'm fairly certain the reason Peter Jackson shot the movie in 48 FPS was to address the problems the movie would have with 3D technology on top of that (it is in 3D, right? I haven't seen it yet, and my theater probably won't get it until January 2013 at best).

I read a lot of Roger Ebert, and his reviews of 3D movies tend to say "3D glasses make movies look too dark, and therefore nowhere near as great as if you watched the movie in 2D." But if the consensus is that 48 FPS makes the screen brighter, then by Hobbit Part 2 (or even 3) the movie will probably run much better and look and feel better, and if 3D is involved, it will seem like the movie was shot with the audience's best interests in mind (and Hollywood's, for making boatloads of money with experimental frame rate techniques and 3D).

Just my two cents on it. I know my movie theater's projector (still running 35mm) runs our movies at something like 18 FPS (according to my boss), but I'd argue it's closer to 24 FPS than what he says. That's just me, though.

Eabus:
Thanks Bob, that explains why my roommate was going on about frame rates right before we went to see The Hobbit. But what does it have to do with Supergirl?

Lol. I also came here thinking this would at least be about comics again.

I figure the unnaturally fast movement is just a product of us not being used to the format. For the first 20-30 minutes I had the impression that everything was on fast forward, but I did eventually get used to it. While the CGI monsters themselves looked great in my opinion, the scenes shot in front of a green screen were much more jarring.

Watched The Hobbit this weekend, but the 24-frame version. The 48-frame one was only available in "3D"... Are there any theaters at all that show a non-"3D" 48-frame version?

gardian06:
60 fps is a target, but many of those games, due to even small differences, will drop to somewhere around 40-50 on PC, while mostly maintaining the high 50s on consoles

Most stuff on consoles runs at 30 frames or sub-1080p (or both). Almost any game on PC can run at 60 fps and way higher, as long as I don't try to do that at 2560x1440 with AA on.

"The inherent crappiness of a standard based on near century old technology must be preserved for all time."

No thank you, Mr. Movie Critic.

"The first attempt at using any technology will have certain flaws, therefore it's better to abandon any hope of progress."

Yeah, that doesn't sound any better.

I haven't seen the 48 fps version, but I can understand that images moving too fluidly can feel weird. The newest HD TVs with that clear-motion tech are already too pristine for my taste.

I understand the critics' dislike of higher FPS (although I never saw The Hobbit), mainly because similar high refresh rates on high-end televisions have a fairly disorienting effect. You see everything, and the picture is crisp and fluid, but the brain recognizes something off about what it's seeing, like something scratching at the back of the mind, and it's really not a pleasant feeling for most.

Am I opposed to this transition? No. Like HD television a few years back, it's still in its infancy and a lot of the kinks need to be worked out.

the antithesis:
I don't know what the fps of most HD televisions is, nor if that's even a factor. What I can say is that the clarity of the image, especially when there's movement, is off-putting. Anyone who says this is more like real life is going to hell for lying. The sharper image of, say, a tennis match looks less real than the old non-HD image.

I think the problem may be a bit of an uncanny-valley effect. As filmmaking technology approaches what the human eye sees in real life, the differences become more glaring and off-putting. I'm not even talking about makeup, sets, and prop effects not being up to snuff in the sharper image. I was watching a tennis match on my parents' HD television, and it didn't look like real life; the effect of all the movement, watching the ball and such, was off-putting.

So when I eventually see The Hobbit, it will be in a non-3D, 24 fps theater. I don't need to pay movie ticket prices to have a bad experience.

A TV's FPS rate is its Hz rate, so all normal TVs, whether SD or HD, are 60 Hz (60 fps) TVs; the ones with 120 Hz or 240 Hz modes can display frame rates up to those numbers.
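One concrete consequence, sketched below (my arithmetic, not the poster's): a frame rate maps cleanly onto a display only when the refresh rate is an integer multiple of it. 24 fps on a 60 Hz set needs uneven 3:2 frame repetition, which is one reason 24 fps Blu-ray playback can stutter, as noted earlier in the thread, while a 120 Hz set shows it evenly.

```python
# Which source frame rates map evenly onto which refresh rates? An integer
# multiple means every frame is held for the same number of refreshes; a
# non-integer one forces uneven cadence, i.e. judder.
for hz in (60, 120, 240):
    for fps in (24, 30, 48, 60):
        ratio = hz / fps
        verdict = "even" if ratio.is_integer() else "uneven (judder)"
        print(f"{fps:>2} fps on {hz:>3} Hz: x{ratio:<4} {verdict}")
```

Note that 48 fps is uneven even at 120 Hz (x2.5); it only divides cleanly into 240 Hz.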

The 48 fps looks like shit to me. And it's NOT the same as games, by the way. I tire of hearing people say "oh, well, games look better at 60 fps, so what's the problem?!" Games are fully graphically rendered; many of them can afford to be silky smooth without their seams showing. Not so for film. Not only do the effects tend to look worse, but there's a different aesthetic mentality we view films with, and 24 fps seems to lend itself better to that mentality in many cases. Maybe they'll learn how to circumvent the format's problems in the future, but for now it looks like crap.

Also, "new" tech (which 48fps filming is not) doesn't necessarily mean better. I think a lot of movies would look better on film rather than being filmed digitally, despite digital being the newer tech. Hell, there's a reason why people laud Breaking Bad for being shot on 35mm as opposed to digitally. It looks good.

I saw it in 3D with 48 FPS here at the local theater.

The CGI is a seriously mixed bag. Many of the CGI creatures (such as Gollum, the Goblins, the Eagles, and the Trolls) looked really good in 48 FPS. On the other hand, some scenes with heavy use of CGI looked appallingly bad in 48 FPS, particularly some of the scenes where Radagast was on a sled being pulled by bunnies. In fact, one particular scene in an open field looked like the type of CGI you'd get in an early-2000s video game; I was laughing at it in spite of myself in the theater.

As for the live-action stuff, I'll second the point about the cameras. Whenever the camera was moving smoothly or not at all, it made the live-action scenes look really good. But when they were jerking the camera around really quickly (as with most battle scenes), it looked terrible - like I was watching the movie on fast forward. This didn't disappear after the first 20-30 minutes, either.
