The Big Picture: Frame Rate

 

Zachary Amaranth:

HDTVs can do frame rates in excess of 48.

I'm guessing that's why almost everything I watch on the new HDTV my dad got himself last Christmas looks like a stage theater production rather than a movie or TV broadcast.

The first thing I saw on that TV was "Christmas Vacation", and I was weirded out by how real it looked.

I never understood the problem with 48 fps. What's wrong with technology advancing a little? Bob did say it's been 24 fps for quite a while now. I'd be willing to watch movies in 48 fps once they work the kinks out.

Aardvaarkman:
Eyes don't have a frame rate, because they don't use frames. Where are you getting the 15fps figure from? It sounds like quackery to me.

That doesn't make any sense. If the film is moving faster than your eyes/brain can perceive, then you will perceive that as "blur," just as you would with real-life objects moving faster than you can perceive in detail.

If your comment were true, it would mean that film-makers have found a way to bypass human perception, and give the brain more information than it can process outside of a cinema. That would be a pretty amazing discovery, something worthy of a Nobel Prize or other distinguished science award. I'm pretty sure that's not what's happening, especially as 48fps is a pretty low speed, and well within human perception unless you're intoxicated or have vision difficulties.

We do see in frames. Here is a book for reference.
http://books.google.com/books?id=jzbUUL0xJAEC&pg=PA24#v=onepage&q&f=false
We see at about 15fps when you're talking about color. The retina resets about every 1/15th of a second, which is 15 frames per second. For some it's as low as 12, and for others it could be a bit faster than 15.
There are a couple of notable exceptions, though. Your night vision, which is in grayscale, is more sensitive; it has a faster refresh than color. It's also why good compression tech splits RGB into YUV, which is grayscale (luma) plus blue-difference and red-difference chroma channels. Because we're more sensitive to changes in the grayscale, we put the highest-quality compression on the luma and the lossier compression on the chroma values.
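
If you want to see what that split actually looks like, here's a rough sketch in Python. I'm using the classic BT.601-style coefficients; the exact numbers vary by standard, so treat it as illustrative, not definitive:

```python
# Rough sketch of an RGB -> YUV split (BT.601-style coefficients;
# exact numbers vary by standard). Y is the grayscale (luma) channel;
# U and V are the chroma channels that codecs compress harder.

def rgb_to_yuv(r, g, b):
    """Convert one RGB pixel (0-255 per channel) to Y'UV."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: the detail we see sharpest
    u = 0.492 * (b - y)                    # blue-difference chroma
    v = 0.877 * (r - y)                    # red-difference chroma
    return y, u, v

# A reddish pixel: most of the visible detail survives in Y.
print(rgb_to_yuv(200, 60, 40))
```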

They haven't found a way to bypass human perception. They just found a way to display a fake image to the eye in a way the brain can tell is fake. You're also not getting more information. You're losing information about the motion of the image and gaining clarity of the image. You've actually lost information to gain the clarity, so you're not giving the brain more than it can take. But because it's not how the brain sees, it knows the image is fake. In part, this is because we evolved to pay attention to motion and motion blur more than clarity.

medv4380:
We do see in frames. Here is a book for reference.
http://books.google.com/books?id=jzbUUL0xJAEC&pg=PA24#v=onepage&q&f=false
We see at about 15fps when you're talking about color. The retina resets about every 1/15th of a second, which is 15 frames per second. For some it's as low as 12, and for others it could be a bit faster than 15.

Those aren't frames. Frames are photographic stills on a film reel. The human brain doesn't process things that way. And a 12-year-old book about film restoration is not an authoritative source for information about human perception. The book you refer to doesn't even have any citations for this claim, and it is a very basic text about film equipment, not a scientific text about how humans perceive motion.

Aardvaarkman:

medv4380:
We do see in frames. Here is a book for reference.
http://books.google.com/books?id=jzbUUL0xJAEC&pg=PA24#v=onepage&q&f=false
We see at about 15fps when you're talking about color. The retina resets about every 1/15th of a second, which is 15 frames per second. For some it's as low as 12, and for others it could be a bit faster than 15.

Those aren't frames. Frames are photographic stills on a film reel. The human brain doesn't process things that way.

The image on your retina is no different than a frame of film, and the fact that we can see it reset gives a clear indication of the maximum rate it takes images at.

medv4380:

The image on your retina is no different than a frame of film, and the fact that we can see it reset gives a clear indication of the maximum rate it takes images at.

It's very different than a frame on film. A frame on film captures a narrow field with specific boundaries. The human eye has more of an uneven field with detail at the center, and more peripheral vision, which tends to be sensitive to motion.

And what's your source for the idea that we can "see it reset"? If that were the case, wouldn't it indicate that we are capable of perceiving things beyond the supposed "frame rate"?

Which argument do I want to pick at? Games being good at 60 FPS while movies must be at 24 FPS to be good, despite both using the same moving-picture mechanics? People saying you need to give new tech a chance while hating on DRM and saying it will never work? Maybe noting that people call the critics' thinking old while also going on and on about nostalgic games? Oh, so many excellent things to point out and then immediately get shouted at for, but I think I'll just leave before I get blasted for having the gall to mention certain double standards in people's logic. AWAY!

I saw it in 48 FPS and loved it. Sure, it took a few minutes to get used to, and at first it pulled me a bit out of the movie. But once things got rolling, I found it just added more and drew me in further. Really hope more movies go 48 fps; it really does make a huge difference, in my opinion.

Why not just go to 60FPS and get it over with? Otherwise I don't see a downside.

MovieBob:
Frame Rate

How many frames per second does it take to anger critics?


The issue for me has nothing to do with the math. It has to do with the psychology.

When I'm watching something in real life, there's a certain amount of blur to it, since my mind can only focus on one detail at a time (like everyone else's). When I'm focusing intently on someone's nose, let's say, the rest of the "scene" gets a slight haze to it. When something is moving, motion blur does the same thing.

And matching the "frame rate" of the human eye is a flawed idea anyhow. See, in real life, as your eye tracks movement, you're not getting a consistent frame rate. Every time you move your eyes, your brain briefly shuts down visual input during that split second of movement -- this is why you don't see images "drip" or "smear" from one to another. Your brain fills in the gaps and grows accustomed to the out-of-focus elements of the scene, especially during relative movement.

When watching in a very high frame rate, a lot of that blur is undermined. The result is that things actually look less natural -- you become more aware that you're watching this scene from a greater distance, and that it's through someone else's eyes. Basically, your brain buys the illusion better when it fills in the gaps itself (think Inception).

TIL Movie Critics are Luddites.

Aardvaarkman:
It's very different than a frame on film. A frame on film captures a narrow field with specific boundaries. The human eye has more of an uneven field with detail at the center, and more peripheral vision, which tends to be sensitive to motion.

And what's your source for the idea that we can "see it reset"? If that were the case, wouldn't it indicate that we are capable of perceiving things beyond the supposed "frame rate"?

I've already provided one very good source for you. I'm not hunting down biology and neurology textbooks or experiments that show the image on the human eye being upside down and resetting.

The only part of your argument about the differences that has any weight is the field-of-view argument, and if you want a fisheye lens and a concave screen to display it on, you're welcome to it, but it won't change the film being used. You're also mistaking the focal length for the center of the eye. You can be looking at something with the center of the eye and have it be completely blurry due to focus. Your eye has to adjust the focal length in much the same way a cameraman does. Your brain is sensitive to motion regardless of which part of the eye it's in, but we focus using the center to orchestrate stereoscopic vision using both eyes.

Being able to tell that the cones and rods reset at a given rate doesn't indicate that we're capable of seeing at a faster rate. There are tests that can make them run slower, but nothing has ever made them run faster. Even adrenaline tests using bungee jumping couldn't get the eye to send images to the brain any faster.

At 48fps, each 1/15th-of-a-second retinal exposure receives 48/15 ≈ 3.2 film frames, each with a gap of time between them, so ~3 gaps.
At 24fps it only receives 24/15 = 1.6 film frames, with ~1 gap.
As far as the brain is concerned it only sees 1 fluid frame, but one version has a lot of gaps in the motion information and the other has very few. Keep in mind, the only reason we didn't settle on 15fps is because of optical effects that were happening. The brain does a good job of filling in the missing information, since it does this anyway, but the brain is lazy and doesn't like added work.
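
Here's that arithmetic in one place, taking my ~15fps retinal figure at face value (a toy model for illustration, not a claim about the actual biology):

```python
# Toy model: how many film frames land inside one retinal "exposure",
# taking the (disputed) ~15fps color integration rate at face value.

RETINA_FPS = 15  # claimed retinal reset rate for color vision

for film_fps in (24, 48):
    frames_per_window = film_fps / RETINA_FPS  # film frames per 1/15 s window
    gaps = int(frames_per_window)              # shutter-dark gaps in that window
    print(f"{film_fps}fps: {frames_per_window:.1f} film frames "
          f"per retinal window, ~{gaps} gap(s)")
```

Run it and you get the 3.2-frames/~3-gaps and 1.6-frames/~1-gap numbers above.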

The problem is the lack of understanding of what a fluid frame is. Think of a single fluid frame as a continuous stack of still frames. The brain can extrapolate a lot of information from that single frame. In part, that is why people don't think they see as slow as 15fps: they think of a frame as a still drawing and not as a 1/15th-of-a-second continuous exposure of information.

By going to 48fps they've made fluid frames look more like still frames. That gives you clarity, but at the sacrifice of the fluid motion. The brain actually cares more about that fluid motion than it does about the clarity of the image.

If all they wanted was clarity, they should have done something with the resolution first. Getting the maximum number of dots on the screen for the average theater seat would have been a better bet, and I believe that's only an issue with digital and not an issue with 35mm film.
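
For a sense of scale, here's a back-of-envelope calc. The seat and screen numbers are hypothetical, and the ~1 arcminute acuity figure is the usual rule of thumb, not gospel:

```python
# Back-of-envelope: how many horizontal "dots" a viewer can resolve,
# assuming ~1 arcminute visual acuity (a rule-of-thumb assumption)
# and a viewer seated a given distance from a screen of a given width.

import math

ACUITY_ARCMIN = 1.0  # ~1 arcminute per resolvable element (assumption)

def resolvable_pixels(screen_width_m, viewing_distance_m):
    """Horizontal pixel count beyond which extra resolution is wasted."""
    span_deg = math.degrees(2 * math.atan(screen_width_m / (2 * viewing_distance_m)))
    return int(span_deg * 60 / ACUITY_ARCMIN)  # arcminutes across the screen

# A 20 m wide screen seen from 15 m (hypothetical mid-theater seat):
print(resolvable_pixels(20, 15))  # ~4000 pixels -> roughly 4K territory
```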

I'm not sure if it was the new frame rate or not, but the special effects in this movie seemed worse to me than Fellowship's did over a decade ago. I don't know if the frame rate was affecting that, and THAT is what critics are complaining about, or if it's because Peter Jackson is working with a much smaller budget for the Hobbit trilogy.

I readily admit to having little knowledge in this field, but it didn't matter to me because I still really enjoyed the film start to finish.

I managed to see the movie in 48 frames, and I must admit that I didn't really like it.
On paper, a higher frame rate sounds great: more = better, and all it does is provide higher image clarity, right?

Well, not exactly. First of all, yes, the image is stunningly clear, which is great. On the other hand, as Bob explained in the video, this is why some of the props and prosthetics in this particular movie look a bit fake. Obviously, this isn't a huge problem, and movies without so many special effects won't have any trouble with this.

However, my biggest complaint with the 48FPS video is that it gives movement a very "unnatural" feel. When characters move, they look ever so slightly too fast, as if the video is being unintentionally played at a higher speed, but not constantly. As a result, human movement looks a little bit too quick and jerky at times. It's hard to explain, but it kind of looks like a video-game cinematic that keeps slowing down and speeding up (like when you have a weak GPU). The result is an immersion-breaking "feel" to the movie. You do kind of get used to it if there is constant movement, but if the pace of the movie slows down and then speeds up again, you start noticing it again.

The most ironic thing is that human movement in the new 48 frame technology actually reminds me of the footage from old hand-crank cameras from the beginning of the 20th century, for example:

http://www.youtube.com/watch?v=BJNbO1Mbl2w

I saw it in 48fps and it was great. I don't know what people are talking about where it makes things look faker than they normally would...

Wow, there are video games that run at 60. Though I heard that the lag in games produced between frames is usually one's TV and not the game itself.

Anyways, interesting read... uh, er, listen.

Eabus:
Thanks Bob, that explains why my roommate was going on about frame rates right before we went to see The Hobbit.

Seriously! I do music videos for fun, and some wanker posted a comment about it having "slow fps". As if 24 was now arctic slow. Next they'll say smart phones are something "my grandpa uses." Whiny entitled brats.

As far as the movie went, I didn't notice, or gave it a pass on how it looked. The only place I did a double take was Gollum looked... different. 'Course, that could have been an update on effects overall, and not the frame rate. Won't know till the extended edition DVD comes out.

WHEN DOES IT COME OUT?!?

As an animator, having twice as many frames per second to work with scares the ever loving shit outta me for reasons that should be entirely obvious. Why should I draw twice as many inbetweens for minimal added effect?

Thanks for the clarification. I was confused as to why it looked the same as any other movie when I saw it, but considering I live in Montana it's more than likely that it was in plain old 24 FPS.

uneek:

MB202:
Boy, I'm sure glad I'm so obtuse when it comes to the making of a film. I was wondering why the movie got mixed reviews, and really, I didn't, and still kind of don't, see why that is. I probably didn't see it in the 48 rate format, but I don't think it matters either way. Maybe it does to some people, but if a movie is more "clear" and more visually impressive, I honestly don't see how that can be viewed as a negative thing.

I haven't seen The Hobbit, but I've been told what it looks like by being reminded of something it's supposed to be like. I remember looking at HDTVs at a Sony Store, and some of them have some type of technology that makes them look like what I've been told 48fps looks like. It's kind of hard to describe, but it's sort of like this: you know the little screen on camcorders that lets you see what you're recording? Imagine a movie that looks like you were seeing it through that. It may sound like it doesn't make a difference, but trust me, it does. The way I imagine the movie is that it looks like behind-the-scenes footage and you can tell the props are fake and everything. Like I said, it's hard to explain.

The HDTV thing is motion interpolation or some shit. It works by taking two frames and creating an image that would fit in between those two to create the illusion of a higher frame rate. So basically you still get the motion blur and all that jazz with none of the additional clarity of HFR and it just looks smoother.
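
In its dumbest form it's just blending neighbouring frames, something like the Python below. (Real TVs estimate motion vectors rather than averaging pixels, but the idea of synthesizing in-between frames is the same.)

```python
# Dumbest-possible motion interpolation: synthesize a midpoint frame by
# averaging two neighbours. Real TVs use motion-vector estimation, but
# the goal is the same: fake a higher frame rate from existing frames.

def midpoint(frame_a, frame_b):
    """Blend two frames (flat lists of pixel values) into one."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def double_frame_rate(frames):
    """Insert one synthesized frame between each original pair."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.extend([a, midpoint(a, b)])
    out.append(frames[-1])
    return out

# Three one-pixel "frames" of a brightening dot become five frames.
print(double_frame_rate([[0], [100], [200]]))
# -> [[0], [50.0], [100], [150.0], [200]]
```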

Basically what I'm getting at is it's not really comparable to actual HFR.

Sexy Devil:

uneek:

MB202:
Boy, I'm sure glad I'm so obtuse when it comes to the making of a film. I was wondering why the movie got mixed reviews, and really, I didn't, and still kind of don't, see why that is. I probably didn't see it in the 48 rate format, but I don't think it matters either way. Maybe it does to some people, but if a movie is more "clear" and more visually impressive, I honestly don't see how that can be viewed as a negative thing.

I haven't seen The Hobbit, but I've been told what it looks like by being reminded of something it's supposed to be like. I remember looking at HDTVs at a Sony Store, and some of them have some type of technology that makes them look like what I've been told 48fps looks like. It's kind of hard to describe, but it's sort of like this: you know the little screen on camcorders that lets you see what you're recording? Imagine a movie that looks like you were seeing it through that. It may sound like it doesn't make a difference, but trust me, it does. The way I imagine the movie is that it looks like behind-the-scenes footage and you can tell the props are fake and everything. Like I said, it's hard to explain.

The HDTV thing is motion interpolation or some shit. It works by taking two frames and creating an image that would fit in between those two to create the illusion of a higher frame rate. So basically you still get the motion blur and all that jazz with none of the additional clarity of HFR and it just looks smoother.

Basically what I'm getting at is it's not really comparable to actual HFR.

But is it still in any way similar to how I described it?

Hutzpah Chicken:
It's all Greek to me...
I don't understand how the speed of the movie projection has anything to do with the content.

It's not the speed, necessarily. It's the number of frames per second. More frames in the movie means more real-looking motion. Doubling the framerate makes the movie look noticeably different. It's like a flip book: it looks better when it has more pages.

I had a lot of trouble adjusting to HD at first, but now I am on board. I just don't understand how FPS has any bearing when I thought almost everything was recorded and shown digitally. Why use film at all? I can't comprehend a scenario where film would have an advantage over digital. Of course, I know next to nothing about film and am probably of the last generation that even used film in home cameras.

People don't put up with less than 30 FPS in games, so why the fuss over a higher framerate in film? You'd think removing the effect of the eye's inability to see clearly with fast movement would be a good thing.

I got really distracted by how fast the swooping panoramic shots seemed to move. Saw it at an IMAX, and some scenes intended to be impressive were very reminiscent of old Benny Hill sketches. Everything moved just a tad too quick - I guess an adjustment in camera movement could solve this.

Just last year 3D became a thing over here, and then every movie theater got at least one room with 3D, from previously having just one movie theater with one 3D room in the whole country. So I'll probably get to see this "48fps" sorcery during 2020, or a bit later.

medv4380:

I've already provided one very good source for you.

No, you didn't. You linked to a book that just flatly stated the eye's effective "frame rate" without citing any actual sources or research for the statement. Not even a footnote. That's a terrible source, especially as it's a book that has very little to do with the topic. There are lots of these kinds of "rule of thumb" or "received wisdom" statements that just get flung around without any fact-checking.

As for the rest of your arguments, again, they are unsupported folk wisdom, not facts.

medv4380:

Aardvaarkman:
Eyes don't have a frame rate, because they don't use frames. Where are you getting the 15fps figure from? It sounds like quackery to me.

That doesn't make any sense. If the film is moving faster than your eyes/brain can perceive, then you will perceive that as "blur," just as you would with real-life objects moving faster than you can perceive in detail.

If your comment were true, it would mean that film-makers have found a way to bypass human perception, and give the brain more information than it can process outside of a cinema. That would be a pretty amazing discovery, something worthy of a Nobel Prize or other distinguished science award. I'm pretty sure that's not what's happening, especially as 48fps is a pretty low speed, and well within human perception unless you're intoxicated or have vision difficulties.

We do see in frames. Here is a book for reference.
http://books.google.com/books?id=jzbUUL0xJAEC&pg=PA24#v=onepage&q&f=false
We see at about 15fps when you're talking about color. The retina resets about every 1/15th of a second, which is 15 frames per second. For some it's as low as 12, and for others it could be a bit faster than 15.
There are a couple of notable exceptions, though. Your night vision, which is in grayscale, is more sensitive; it has a faster refresh than color. It's also why good compression tech splits RGB into YUV, which is grayscale (luma) plus blue-difference and red-difference chroma channels. Because we're more sensitive to changes in the grayscale, we put the highest-quality compression on the luma and the lossier compression on the chroma values.

They haven't found a way to bypass human perception. They just found a way to display a fake image to the eye in a way the brain can tell is fake. You're also not getting more information. You're losing information about the motion of the image and gaining clarity of the image. You've actually lost information to gain the clarity, so you're not giving the brain more than it can take. But because it's not how the brain sees, it knows the image is fake. In part, this is because we evolved to pay attention to motion and motion blur more than clarity.

We do NOT see in frames!
Our eyes have no shutters!

They send a constant stream of data to our brain!

If you used some simple logic, you would know that if our brain could only see at 15FPS, then anything higher would not make any difference!!
Objectively, we can see more than 15FPS! Just about all people can show that in a blind test!
Photons hit our eyes many more times than 15 in a second, so why does real life not look fake?!

Also, a higher frame rate cannot make anything look faster, unless the film was shot at something like 24FPS and then played at double speed to get 48FPS!
A film shot at 48 FPS and displayed in realtime at 48FPS will not be faster or slower.

P.S. A higher framerate will not make anything look "fake"; people and objects don't move in 24 discrete positions per second!
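
If you want it spelled out, the speed math is trivial (assuming constant capture and display rates):

```python
# Apparent playback speed only changes when the capture rate and the
# display rate differ. Shot at 48, shown at 48 = real time.

def apparent_speed(shot_fps, display_fps):
    """How many times faster than real life the action appears."""
    return display_fps / shot_fps

print(apparent_speed(48, 48))  # 1.0 -> real time
print(apparent_speed(24, 48))  # 2.0 -> double speed (the hand-crank look)
```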

Nurb:
People don't put up with less than 30 FPS in games, so why the fuss over a higher framerate in film? You'd think removing the effect of the eye's inability to see clearly with fast movement would be a good thing.

Everyone keeps comparing the 48 frame film to the frame rates in video games, but it's apples and oranges.
A high frame rate in games produces a more natural look and motion. In movies, however, the 24fps version already shows perfectly natural motion, and adding frames just throws it off.
The movie looks slightly like Benny Hill in the 48FPS version. Believe me, it's distracting.

Image clarity should be achieved with higher resolution photography and cameras, not fiddling with the frame rate. That way, you could get a clearer picture, but you wouldn't unintentionally mess up motion.

chozo_hybrid:

leviadragon99:
Well, for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.

Really?

I'm in New Zealand and we already have it; how could you not? You're the closest country to us.

(shrugs) buggered if I know why...

uneek:

Milanezi:
Hmpf... I don't care, but I wish they'd stop with the CGI. When CGI gets old it gets VERY ugly, to the point of us saying "dude, that's not how I remember it". Props and stuff, though, when they're well done, stick forever or at least get a classic stamp on them.

Who was talking about CGI?

I admit, MY bad, lol. I guess I just assumed CGI would be one of the things they'd try to make look better with a higher FPS.

I think it's time for the parable of the Shit and Turkey Sandwich.

We've been eating shit for nearly a century for no reason.

j-e-f-f-e-r-s:

...
These are all shots where there is not a lot happening. In the Star Wars case, it's just some kid looking into a flat desert and a pretty colourless sunset. And yet, when you see these scenes in motion, the grainy quality of the picture, and the rate at which the frames move, is precisely what gives these scenes their arresting atmosphere.
...

Snipped to save room.

Those are some interesting observations. I have a theory that our brains sometimes need some noise to make an impression stick. If the brain needs to put a little work into "digesting" the impression, it may last longer. I think film grain, low frame rates, or even artificial noise in music may serve to make the impressions a little harder to chew, but also make a more lasting impact because of it. Glossy quality can feel artificial in a way.

Or we may just need to get used to it, it's hard to tell.

Haven't seen The Hobbit yet and not sure whether to watch it in 24 or 48 fps. I fear the 48 rate will hurt my eyes.

Same problem with high-quality LCD TVs. I set one up for my sister recently, tuned the picture to look fantastic, and it does. It looks so good you can clearly see the difference in lighting when the show switches to a set, and in many shots, the difference in reflectivity of the costume materials from the real thing. This has that same yank-you-out-of-immersion effect as bad CGI or the 48fps issue Bob described.

