The Big Picture: Frame Rate


The framerate they use in cinemas is too low, in my opinion. In faster scenes it looks more like a fast slideshow than a moving image, and that really takes you out of the experience. I haven't seen 48 yet, so I don't know what that looks like, but what I can say is that 24 should not be the end of the line.

I'm in the "it feels odd" camp. My boyfriend keeps telling me it's just because I'm not used to it, but I spent nearly 3 hours with the Hobbit and, although it wasn't as distracting as the first hour, I could still see the fast-forward effect with all the characters. I agree it makes everything look crisper and sharper than ever, but the motion is still off. I'm not fixed on the 24 FPS film look, although it does prep my brain for a specific experience. Documentaries do seem more real when there's more detail and with more fluid movement (like in behind the scenes footage), but that's just it. It makes the movie feel like a weird documentary. The detail and sometimes the motion want my brain to think it's more real, but lighting, composition, effects, etc, want be to think it's not. It's a strange congnitive dissonance that just doesn't feel right, and yes, ends up seeming fake. I've had to disable such upscaling of frames on my TV because it was really ruining my enjoymnet of movies. I'm not sure the push for "life-like" realism and clarity adds much more to film than "realism" in games, which so far hasn't made them overall better, just more resource-hugging.

Somewhere in the thread someone wrote a bit that seems pretty relevant (sorry for not quoting you, I'm just too lazy to go back and find that specific post). Psychologically, it seems the brain is more drawn into the experience when it does the motion interpolation itself, not when it's fed the pre-processed image. I think the gaps it fills in by itself contribute to the movie-watching experience much like imagination does when reading a book.

Sheesh, what a bunch of total idiots.
Why not check out what we Europeans have to deal with? 29.97 FPS, with frame interlacing to cover the difference.
This is basically taking two frames and putting a "mixed" one in between them, and by "mixed" I mean sliced up like window blinds and laid semi-transparently over each other.
When paused or put in slow motion you can clearly see it, but my eyes also seem to pick it up at normal speed, and it's a completely horrible and terrible idea. It makes a movie unwatchable for me.

So before people start bitching about dumb things like 48 FPS, just try to think of worse ideas, because there are plenty of them.

I just wish they would use 60 FPS rather than 48. 60 is actually close to the maximum speed the human eye can perceive, so it would be a logical choice.

Just seen it. One word? EPIC!!! Loved every second of it. I expected to like it more than I did LotR (which I didn't like, actually), but I didn't expect to bloody LOVE it. Best movie I've seen this year, probably the best fantasy movie ever made. Fingers crossed for the next two. I'm glad now that Jackson split it in 3 :D More epicness to see.

Now... for the technical part. I watched it at a 3D HFR theater. Unfortunately, I had seats in the 5th row, and the screen was HUGE. Since this was a 3D movie, you can bet that I wished it wasn't 3D in the close battle shots and some interior shots. But for the rest of the movie... I was there, man... inside. And the improved framerate did help a lot, especially with camera panning and fast action. Even in the fast-paced goblin chase scene, my eyes adjusted pretty well. A word of caution though... in the first scenes you'll be going "wtf?! why are they moving so fast?", but by the end you'll just wish you could stay and see it again (maybe from the back row, though).

I don't get why 48fps is such a big issue for a lot of people. There are a lot of music videos, sports streams and even some movies done at 60 or 120 fps that look WAY MORE NATURAL in terms of human movement, and especially camera panning, than the standard 24fps stuff.

24fps has always felt jerky to me in theatres, and with newer movies and upgraded projectors in my home town the limitation is clearly visible. People like saying that it's the norm and the most brain-pleasing, but in reality 24fps is the LOWEST point at which the mind doesn't alarm you that something is wrong with what you're seeing, and I don't accept that this gives it the right to be the "be all end all" standard all the 48fps naysayers make it out to be.

Also, hello everyone, this is actually my first post here. I tried to join a couple of times in months past, but had some captcha issues that have mysteriously gone away now!

A lot of the newer movies are not even being released on film, they just ship a hard drive.

My local theatre (http://www.astortheatre.net.au/) has a 4K digital projector and still runs 70mm films (one of the only 70mm screens in the country).

You can definitely tell the difference with the 4K films; I recently saw Lawrence of Arabia at that theatre.

Some films are downloaded via satellite, which does not hold up as well, from what I have heard about the Universal Classic Monsters screenings - http://willmckinley.wordpress.com/2012/10/26/we-belong-dead-why-frankenstein-looked-horrific-on-the-big-screen/

As for the Hobbit, I will probably just end up watching it at the cheap $9 day at another local theatre. The Astor did do a triple bill of the Lord of the Rings movies recently, but I was away that weekend.

Hitchmeister:
"The inherent crappiness of a standard based on near century old technology must be preserved for all time."

No thank you, Mr. Movie Critic.

"The first attempt at using any technology will have certain flaws, therefore it's better to abandon any hope of progress."

Yeah, that doesn't sound any better.

If this is what I think it is (an attack against MovieBob), the man did say he liked the 48 FPS.

Moving On: There's a 95% chance that I saw the Hobbit in 24 FPS, so I really don't have an opinion besides "I bet 48 FPS would look pretty cool in a movie!"

I can tell you right now, without even seeing it, that I will dislike the change. A few years back I got a new TV that was 120Hz and did its own version of motion scaling. While not perfect, it ruined the illusion of many films and shows I played on it. It gave them the somewhat shot-on-video quality that most soap operas have, even for higher-quality shows and films. I doubt I'd hate 48FPS enough to get sick from it, or feel that it ruins a movie, but if my TV is anything to go by, the gritty, jerky quality of motion in a film is part of what makes it look real.
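(For anyone curious what the crudest version of that TV trick looks like, here is a minimal sketch in Python/NumPy, assuming 8-bit RGB frames. Real TVs do motion-compensated interpolation rather than plain blending, and the function names here are made up purely for illustration.)

import numpy as np

def blend_interpolate(frame_a, frame_b, t=0.5):
    # Naive "motion smoothing": synthesize an in-between frame by blending two
    # neighbouring frames. Real TVs estimate motion vectors instead of blending,
    # but the idea of manufacturing extra frames is the same.
    a = frame_a.astype(np.float32)
    b = frame_b.astype(np.float32)
    mid = (1.0 - t) * a + t * b               # weighted average of the two frames
    return np.clip(mid, 0, 255).astype(np.uint8)

def double_frame_rate(frames):
    # Turn a 24fps sequence into a 48fps one by inserting a blended frame
    # between every original pair (illustrative helper, not a real TV pipeline).
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(blend_interpolate(a, b))
    out.append(frames[-1])
    return out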

TheDAus:

We do NOT see in frames!
Our eyes have no shutters!

The cones on your retina activate in much the same way as film, and reset about once every 1/15th of a second. The "shutter" you're so tied up about is the resetting of the cones, in the same way a camera slides in a blank frame for the next exposure. Rods operate faster, but they only see light intensity, since they are mostly responsible for your black-and-white night vision.

Aardvaarkman:

medv4380:

I've already provided one very good source for you.

No, you didn't. You linked to a book that just flatly stated the eye's effective "frame rate" without citing any actual sources or research for the statement. Not even a footnote. That's a terrible source, especially as it was a book that had very little to do with the topic. There are lots of these kinds of "rule of thumb" or "received wisdom" statements that just get flung around without any fact-checking.

As for the rest of your arguments, again, they are unsupported folk wisdom, not facts.

Then flip to the bibliography on page 342.
Or you can pick up
"Topography of the layer of rods and cones in the human retina" (c. 1935),
which is a large source of that "received wisdom".
Or sit down and watch some good old Nova on PBS with the neurologist who tried to test whether people really experience things "slowly" while bungee jumping or thrill seeking. It showed that even though they report seeing things slowly, they don't actually see any faster.

But I'm not buying any of the books for you, or helping you out in any way. You already believe the nonsensical counterpoints without even looking up the information for yourself.

medv4380:

Aardvaarkman:
Eyes don't have a frame rate, because they don't use frames. Where are you getting the 15fps figure from? It sounds like quackery to me.

That doesn't make any sense. If the film is moving faster than your eyes/brain can perceive, then you will perceive that as "blur," just as you would with real-life objects moving faster than you can perceive in detail.

If your comment were true, it would mean that film-makers have found a way to bypass human perception, and give the brain more information than it can process outside of a cinema. That would be a pretty amazing discovery, something worthy of a Nobel Prize or other distinguished science award. I'm pretty sure that's not what's happening, especially as 48fps is a pretty low speed, and well within human perception if you're not intoxicated and don't have vision difficulties.

We do see in frames. Here is a book for reference.
http://books.google.com/books?id=jzbUUL0xJAEC&pg=PA24#v=onepage&q&f=false
We see at about 15fps when you're talking about color. The retina resets about every 1/15th of a second, which works out to 15 frames per second. For some it's as low as 12, and for others it could be a bit faster than 15.
There are a couple of notable exceptions, though. Your night vision, which is in grayscale, is more sensitive: it has a faster refresh than color. It's also why good compression tech splits RGB into YUV, which is grayscale (luma) plus red chroma and blue chroma. Because we're more sensitive to changes in the grayscale, we put the best compression on the luma and the lossy compression on the chroma values.
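(Rough illustration of that luma/chroma split, if anyone wants to see it spelled out. This is a minimal sketch using the common BT.601 conversion weights; the function names are mine, and real codecs do far more than this.)

import numpy as np

def rgb_to_ycbcr(rgb):
    # Convert an (H, W, 3) uint8 RGB image to YCbCr using BT.601 weights.
    # Y is the grayscale (luma) channel; Cb/Cr carry the colour differences.
    r = rgb[..., 0].astype(np.float32)
    g = rgb[..., 1].astype(np.float32)
    b = rgb[..., 2].astype(np.float32)
    y  = 0.299 * r + 0.587 * g + 0.114 * b
    cb = 128.0 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128.0 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return np.clip(np.stack([y, cb, cr], axis=-1), 0, 255).astype(np.uint8)

def subsample_chroma(ycbcr):
    # 4:2:0-style subsampling: keep luma at full resolution, halve the chroma
    # channels in both directions, which is where most of the "lossiness" hides.
    y = ycbcr[..., 0]
    cb = ycbcr[::2, ::2, 1]
    cr = ycbcr[::2, ::2, 2]
    return y, cb, cr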

That isn't precisely correct. While an individual eye cell is limited by chemical reaction speed and impulse transmission, 10 fps is correct only for an individual cell. It hasn't been determined for certain how these cells interact: whether two cells just sit side by side, each sending images at 10 fps, or whether the cells stagger themselves so that they work together to produce an image at 20 fps. From what I've found, the latter seems more likely.

It makes sense that the book you quoted would produce these numbers for the sake of film-making, but medical journals focusing on our ability to retrieve and react to visual input have found that the values vary enormously from person to person. For instance, saccadic eye movements can reach up to 1000 deg/s, on top of a constant 30-70 Hz tremor, all working to 'refresh' the image the eye provides to you, since (like the mighty T-Rex of Jurassic Park) we actually can't perceive things without motion.

Unfortunately I haven't been able to find a free medical journal article that discusses this type of visual acuity, but I did find this: http://www.dtic.mil/dtic/tr/fulltext/u2/a178485.pdf which discusses tests performed on aviators in WWII. Although the results can't be used as a direct evaluation of eye perception, it is notable that some of the pilots were able to react within 150 msec of a visual stimulus. Assuming an upper limit of 15 fps, that leaves 83 msec for the pilot to actually perform a physical action to signal the researchers. However, as noted by this fancy blog (because I'm tired now and don't feel like browsing journals I can't reasonably afford) [http://blogs.scientificamerican.com/observations/2011/09/15/time-on-the-brain-how-you-are-always-living-in-the-past-and-other-quirks-of-perception/], it takes the brain about 80 msec to catch up to the present.

That 80 msec actually comes after the 'fps' of our eyes, because it relates to the brain sorting out what was just sent to it, rather than being how long it takes the brain to receive and understand the information in general. So with a little simple subtraction we can see that a number of WWII pilots were able to perform a physical action within about 3 msec of registering the need to act. Based on this (admittedly extremely tenuously related) information, I would postulate that the human eye is capable of transmitting information much faster than 15 fps. It is possible that the registration time of these faster-reacting pilots is substantially lower, but even accepting that they process information 50% faster than the average human, 43 msec is not a very long time in which to perform a physical action.
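(Spelling out that arithmetic, since it goes by fast. This is just a back-of-the-envelope check using the numbers quoted above, which are claims made in the thread, not established facts.)

reaction_time_ms = 150        # fastest pilots' reported reaction time
retina_frame_ms = 1000 / 15   # one "frame" at the claimed 15 fps retina rate (~66.7 ms)
brain_lag_ms = 80             # time claimed for the brain to catch up to the present

time_left_to_act_ms = reaction_time_ms - retina_frame_ms - brain_lag_ms
print(round(time_left_to_act_ms, 1))  # ~3.3 ms left to perform the physical action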

That was fun, it's been a long time since I've actually researched on purpose.

medv4380:

Or you can pick up
"Topography of the layer of rods and cones in the human retina" c 1935
Which is a large source of that "received wisdom"

Gee, that's some cutting-edge research there, which has nothing to do with some kind of supposed "frame rate" of the human eye. You're just digging a bigger hole for yourself here.

j-e-f-f-e-r-s:
snip

You just spent a few paragraphs describing how you're a fan of the Reality is Unrealistic trope. Aesthetics borne out of technical limitations are not necessarily better. It's merely a form of stylization, one that can be maintained digitally if future artists so wish. I think it's premature to say that there is no promise in the technology.

The movie looks jaw-droppingly awesome. Once I got accustomed to the lack of the usual "movie feel", I started to appreciate the higher framerate greatly, especially during action scenes and fast movement in general. And the detail on both actors and CGI characters is unbelievable. It's the first time I've actually been amazed by a movie. I also found the higher framerate more comfortable to watch: it didn't strain my eyes at all, as opposed to normal 3D.
I can't wait to see more movies filmed this way.

T3hSource:
The difference between 30 and 60 FPS and even 120 FPS is quite noticeable.

Is it?

Games often drop from 60+ frames in unchallenging areas to barely 20 frames when the heat is up. Other than the visual stuttering that occurs under 20 fps (or under 30 fps for very fast movement), what difference should I notice?

I saw it in 48 fps and I loved the higher frame rate. Finally I can actually see what is going on in those fast-paced action scenes, and the overall picture quality looked absolutely beautiful, better than any film I can remember seeing. What's more, the 3D felt so much more natural than it did in other films I have seen. I only wish all the film purists weren't so tied to the "feel" of 24 fps. It's just like the old "black and white" vs "colour" argument, or "talkies" vs "silent film". 48 fps is better in every way; we just aren't used to it yet.

I will admit that it took me about 15 to 30 minutes for my mind to adjust to the new frame rate, during which time things seemed to move "in a strange way". Once I got used to it, it was amazing. I can't wait to see movies like Avatar 2 in 60 fps.

Yvressian:

Nurb:
People don't put up with less than 30 FPS in games, so why the fuss over a higher framerate in film? You'd think removing the effect of the eye's inability to see clearly during fast movement would be a good thing.

Everyone keeps comparing the 48fps film to frame rates in video games, but it's apples and oranges.
A high frame rate in games produces a more natural look and motion. In movies, however, 24fps already shows perfectly natural motion, and adding frames just throws it off.
The movie looks slightly like Benny Hill in the 48FPS version. Believe me, it's distracting.

Image clarity should be achieved with higher-resolution photography and cameras, not by fiddling with the frame rate. That way you could get a clearer picture without unintentionally messing up the motion.

It is completely the same thing: frames per second as a measure of how fast the image refreshes. It doesn't matter whether it's gaming or movies; the only difference is that you, and a lot of people, are just used to 24fps movies. 24fps mimics the motion blur of the eye, which is itself a flaw that compensates for our brain's limited processing of detail, but when we're watching a movie it doesn't need to be there at all. I like the smooth level of detail it gives, and when I've noticed higher frame rates on Blu-rays on the right screen I thought it looked "different", but as an improvement. I'm also the person who always disables motion blur in games, lol; most people don't want to play a game that demands precision and have pretend eye weaknesses muddling things up.

It'll take a while but everyone will get used to it.

JenSeven:
Sheesh, what a bunch of total idiots.
Why not check out what we Europeans have to deal with? 29.97 FPS, with frame interlacing to cover the difference.
This is basically taking two frames and putting a "mixed" one in between them, and by "mixed" I mean sliced up like window blinds and laid semi-transparently over each other.
When paused or put in slow motion you can clearly see it, but my eyes also seem to pick it up at normal speed, and it's a completely horrible and terrible idea. It makes a movie unwatchable for me.

So before people start bitching about dumb things like 48 FPS, just try to think of worse ideas, because there are plenty of them.

I don't mean to pry (but technically I am, sorry), but where do you live in Europe? I honestly thought all countries were either PAL (25fps) or SECAM (and so does Wikipedia, apparently). And interlacing is everywhere for SD formats, as well as most of the 1080 you see on TV (I'm not sure anyone broadcasts in 1080p, honestly, but I live in the US and am certifiably dumb). It was originally invented for cathode-ray-tube TVs, but I imagine if you own a TV that doesn't use a very good deinterlacing algorithm (or one not good enough for your eyes; everyone's different), you'd see artifacts from it (and with no deinterlacing at all, you'd see "teeth" on the edges of anything moving, which looks horrible).
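(If anyone wants to see what the simplest deinterlacing strategies actually do, here is a minimal sketch in Python/NumPy for grayscale fields. "Weave" and "bob" are the usual informal names; real TVs use motion-adaptive methods that are much smarter than either of these.)

import numpy as np

def weave(field_top, field_bottom):
    # "Weave" deinterlacing: interleave the two fields back into one full frame.
    # Looks perfect for static scenes, but moving objects show comb ("teeth")
    # artifacts because the two fields were captured at different moments.
    h, w = field_top.shape[0] * 2, field_top.shape[1]
    frame = np.empty((h, w), dtype=field_top.dtype)
    frame[0::2] = field_top
    frame[1::2] = field_bottom
    return frame

def bob(field):
    # "Bob" deinterlacing: build a full frame from a single field by line-doubling.
    # No combing, but you throw away half the vertical resolution.
    return np.repeat(field, 2, axis=0)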

Anyway...
<rant>
So, what the hell. Why on EARTH didn't they do this sooner? And, really, why 48fps? I thought the whole dang reason we kept 24fps around for so long was so that people who learned how to pan a camera at 24fps wouldn't have obsolete degrees (although this was hearsay, and I'd love to know the real reason if that wasn't it). Really, if we're going to change the damn framerate, why can't it be 29.97 or 59.94? Why do we have to keep TV and movies on separate framerates (or, while we're at it, why not just switch to 30 and 60 fps? No one needs to worry about that silly dropped frame anymore)? Really? Because here's how I see it going:

Right now:
Take a 24fps movie. Apply 3:2 pulldown (an ugly way to get more frames while keeping the movie at the same pace). Win.

After 48fps becomes a thing:
Take a 48fps movie. Remove half the frames. Apply 3:2 pulldown. Generate copious quantities of lag from having to do this math in real time. Make the poor people who have to encode this silly conversion cry. (There's a quick sketch of the pulldown cadence after this rant.)

Seriously, if we're going to the trouble of making all of those 24fps degrees obsolete, can't we just consolidate frame rates so they make sense? I'd suggest merging PAL and NTSC somehow, but I know that's asking too much. Ugh. At the end of the day, all of these silly video conventions are from before digital video and are dumb. But, as the old saying goes, why fix what isn't broke? Even if it makes absolutely no sense whatsoever anymore?
</rant>
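(For anyone who hasn't run into it, 3:2 pulldown is just a repeating cadence that maps every 4 film frames onto 10 video fields, i.e. 24 fps onto roughly 60 fields per second. Here is a minimal, purely illustrative Python sketch; the pulldown_3_2 name and the frame-level simplification are mine, and real telecine works on interlaced fields with the 59.94/1.001 timing.)

def pulldown_3_2(film_frames):
    # 3:2 pulldown: map every 4 film frames (24 fps) onto 10 video fields
    # (~60 fields/s NTSC) by holding frames for alternately 2 and 3 fields.
    # This repeating cadence is the judder that reverse telecine undoes.
    cadence = [2, 3, 2, 3]                       # fields held per film frame
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * cadence[i % 4])  # repeat the frame for 2 or 3 fields
    return fields

# Four film frames become ten fields: A A B B B C C D D D
print(pulldown_3_2(["A", "B", "C", "D"]))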

Saw it in 3D at 48 FPS yesterday.

I have never seen such a beautiful thing in my life before. I have no idea what people are seeing, but what I saw was a realistic looking, clear and extremely beautiful movie.

I normally go to the cinema less than once a year; that might have something to do with it. You guys are all used to the normal 'shitty' movie format, and that's why you think it looks fake or weird.

As far as the critics' response is concerned, I figure this complaint will go away as the technique becomes more widespread. For example, if you took a 128 kbps MP3 and played it on a high-end piece of stereo equipment, you might think it doesn't sound right because you've spent so much time listening to that track on your crappy earbuds. The fact that the higher fidelity does a lot to accentuate the flaws in the encoding doesn't make hi-fi playback bad, but it would take you some time to get used to just how different those tracks sound.

In short, you've just got to wait until you get used to it.

If you're going to change the frame rate, why stop at 48fps? I can see the logic in keeping to a multiple of 24, given that was the past standard and it would make converting old material easier, but why not make the new standard significantly higher? Say, 192fps. It would go nicely with the 192kHz sample rate for the audio.

Also, if the movie industry wants to say they're so much better than the TV or video game industry, shouldn't they put their money where their mouth is and actually give themselves a technological leg-up over their competition?

192 fps, 8192*4608 pixels, 384kHz, 26.8 channel audio. Better than TV! Better than games! Available only in theatres!

A lot of you will want to see an example and experience it for yourselves. I've found a fan-made conversion of the trailer to 48 fps. You might say that because it's fan-made it's not like the real deal, but when I watched it, it looked real enough to me.

http://www.fxphd.com/blog/what-might-the-hobbit-look-like-at-48-fps/

Download links in low, medium and high quality here: http://www.lukeletellier.com/?p=205

The first time I watched it, my reaction was: "It looks so weird when things move fast."
But the second time I thought: "It's so much more real!"

So even though it's kinda weird sometimes, chalk me up as a fan.

To MovieBob: people aren't stupid. We get frame rate and speed.

The Hobbit got criticism because it meandered about, tried to turn a 300-page book into 3 movies, and shoe-horned its future movies in by including other novel material.

the antithesis:
I don't know what the fps of most HD televisions is, nor whether that's even a factor. What I can say is that the clarity of the image, especially when there's movement, is off-putting. Anyone who says this is more like real life is going to hell for lying. The sharper image of, say, a tennis match looks less real than the old non-HD image.

I think the problem may be a bit of an uncanny valley effect. As filmmaking technology approaches what the human eye sees in real life, the differences become more glaring and off-putting. I'm not even talking about make-up, sets and prop effects not being up to snuff in the sharper image. I was watching a tennis match on my parents' HD television, and it didn't look like real life; the effect of all the movement, following the ball and such, was off-putting.

So when I eventually see The Hobbit, it will be in a non-3D 24 fps theater. I don't need to pay movie ticket prices to have a bad experience.

I just have absolutely no way to relate to that. I don't see how you find clarity during movement off-putting. It is certainly less real-looking when it comes to most sports, but I definitely prefer being able to see individual blades of grass on the field when I'm watching football; I can actually see where the footy is going, for a start... it was all guesswork on a CRT.

Whatislove:
I just have absolutely no way to relate to that. I don't see how you find clarity during movement off-putting. ...

The Red Letter Media guys said it looks like video, and that's an apt description, I think, if you've ever seen the picture quality of video from the early-to-mid '80s. It kind of looks like that. Maybe with some time I could find other things it looks like, but what I would not say it looks like is real life. For all that technology and expense, it's not an improvement. That's all I can really say. It would be one thing if I found it lifelike and just had to adjust to that, but that isn't the case. It's not like real life. It's not even closer to real life. It looks just as fake, but in a different way. That's not an improvement.

It's all about the motion blur. Everyone is used to the motion blur of 24fps, so 48 is going to be an adjustment. And to the guy complaining about 24fps films using a 3:2 cadence in 29.97: we're not talking about films shown on TV, we're talking about films in theaters. We have that same problem here in the US, and the 3:2 cadence sucks so badly that many of the better TVs adapt for it and reverse the cadence back to pure 24fps. It might be called inverse telecine, reverse telecine, or pulldown removal, depending on the product.

There's yet another challenge to overcome: rolling shutter. Even these super-duper expensive Red Epic cameras have CMOS sensors that are not read out all at once, but scanned top to bottom. Most of the time it doesn't seem like a problem, but it can be quite jarring at times.
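(A quick way to picture rolling shutter, as a toy sketch: here scene(t) stands for any hypothetical function that returns an (H, W) frame of the scene at time t. A global shutter samples every row at the same instant; a rolling shutter samples each row slightly later than the one above it, which is what skews fast-moving objects.)

import numpy as np

def rolling_shutter_frame(scene, t_start, row_readout_s, height, width):
    # Simulate a rolling shutter: each row of the output image is taken from
    # the scene at a slightly later time than the row above it.
    frame = np.empty((height, width), dtype=np.float32)
    for row in range(height):
        frame[row] = scene(t_start + row * row_readout_s)[row]  # this row, sampled later
    return frame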
