The Big Picture: Frame Rate

 

For some light reading on what this all looks like and the problems that arise, I'd suggest googling the "Soap Opera effect".

Falseprophet:
I saw it in 48 FPS. Here are my spoiler-free thoughts:

When it came to the fully-CGI aspects of the film, especially CGI characters and creatures, it was excellent. They felt really alive and authentic. We're used to this from video game cutscenes and the like, though. I think a fully-CGI film will look absolutely great in 48 FPS.

Conversely, when it came to scenes with live actors, it often looked too real. Like a bunch of LARPers with bad makeup putting on a play. This might have been fine with a story more grounded in reality, but my girlfriend and I felt it detracted from the mythic/fantasy illusion they were going for. It got a bit better later on. We want to see it again in 24 FPS and see if that restores the illusion.

Then again, the 3D did look a lot more convincing in 48 FPS than most 24 FPS films do.

On the other hand, when the camera moved very quickly, especially during action scenes, it almost gave me nausea. With static shots or slow pans it was fine. But if the side benefit of this is the death of shaky-cam, consider me an instant convert.

I'm not dismissive of the format as a whole, and I think it will definitely improve over time, but, like Bob, I feel filmmakers will have to do a lot of reinventing of their craft. Not just camera placement, but lighting, editing techniques, new types of makeup, and so on. And I'm not sure having a fantasy film being the first test of the format was the wisest decision, but as my friend said, someone had to be first.

I also found that the 48 fps helped the CGI characters a great deal, but made the live actors look CGI as well. Actually it seemed to make EVERYTHING look CGI. I think there's still some bugs to work out.

Well, for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.

I'll use a quote from Ian Malcolm in Jurassic Park to sum up my feelings:

Ian Malcolm:

If I may, I'll tell you the problem with the scientific power you're wielding here. It didn't require any discipline to attain it, y'know? You read what others had done, and you took the next step. You stood on the shoulders of geniuses to accomplish something as fast as you could, and before you even knew what you had, you patented it, and packaged it, and slapped it on a plastic lunchbox, and now you're selling it.

...

Your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.


This idea that people are already throwing around on the thread that 48fps is objectively better simply because it has a higher value is ludicrous.

Yes, 24fps grew out of economic constraints at the start of the film industry. But you know what? It looks damn good. Directors stuck with the format because it created that iconic look which instantly says 'film': a canvas-esque, expensive look that practically defines the medium.

With 24fps on 35mm film, you can shoot anything and, with the right direction, make it look incredible. Need proof? Take a regular old HD handycam, and go out to your nearest valley, forest or canyon and shoot some footage. What'll the result be? Some very nice looking scenery made to look very cheap because of the visual presentation of the camera. Shoot the same scenery on a 35mm camera rolling at 24fps? You get some of the most beautifully presented shots the medium has ever produced:

These are all shots where there is not a lot happening. In the Star Wars case, it's just some kid looking out over a flat desert and a pretty colourless sunset. And yet, when you see these scenes in motion, the grainy quality of the picture, and the rate at which the frames move, is precisely what gives these scenes their arresting atmosphere.

It's the same reason why you'll never go "OMG look at the scenery" when watching a news reporter on TV. By emulating real life fluidity, high-fps news footage removes the aesthetic appeal of most scenery, and simply presents it as ordinary. Scenery doesn't look like a work of art, it simply looks like the thing that the reporter is standing in. It takes the deliberate reduction of framerate, down to something less than reality, in order for our eyes to actually sit up and start taking notice of the aesthetic qualities of an image. The minute you try to make cinematic visuals 'real', you make them mundane and forgettable. The advantage of 24fps, 35mm film is that by deliberately avoiding trying to emulate reality, it allows the cinematography to be presented as a work of art to be appreciated visually. Higher framerates simply turn the same cinematography into a bad newsreel.

And yes, I hate films that have been smoothed over for HD tellies for just the same reasons. Higher frame values do not automatically mean better. Whichever best serves the aesthetics of film is better, and in my opinion, 24fps does just that. Directors with a better eye for visual shots than Jackson will ever have were perfectly fine with 24fps: Kubrick, Kurosawa, Leone, Lynch, Coppola, Tarantino... They all made stunning-looking films with 24fps. Why are we now acting as if it's some primitive technology, just because Jackson started using news cameras to shoot his latest film?

I liked the 48fps HFR 3D; it looked simply better, with great clarity and so on, BUT I agree with the people who felt that some parts in the beginning (i.e. before Bilbo left the Shire) looked a bit out of sync, like they were moving in fast-forward. I think a lot of that was the HFR, though; the movie felt at times like you were watching actors on a theatre stage (not cinema...), I guess largely because you could see all the fine movements and count their facial hairs... And sometimes the CGI might look a bit early-2000s...

Still, I enjoyed it and recommend it to everyone except those who have no interest in LotR or didn't like the first three; and hardcore fans and movie reviewers who were expecting something else or "better" can go visit Smaug... :)

Wax cylinders, vinyl, tape, CD... at no point has a reproductive medium failed to catch on because people were so attached to the sound of its flaws that they refused to adopt the objectively better new technology. This isn't like 3D, where the effect isn't actually the same as that produced by our eyes, a higher framerate produces a reproduction that is closer to what we would see if we were actually there. It might look a bit odd to start with since we're not used to that (or we are but associate it with cheap soap opera), but I'll be very surprised if 24fps is more than a niche product like vinyl in 10 years.

the antithesis:
I don't know what the fps of most HD televisions is nor if that's even a factor.

This gets goofy-bananas fairly quickly. And this only refers to North American video standards.

Take an image. Slice it up into 525 lines. Number the lines 1 through 525. Scan the odd-numbered lines out in 1/60th of a second. Then scan out the even-numbered lines in the next 1/60th of a second. You now have one frame of video, which took 1/30th of a second to display. Hence, the video field rate is 60 fields per second, and the video frame rate is 30 frames per second. This alternating display of odd then even then odd then even lines is called interlacing.
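
If it helps to see the mechanics, here's a toy sketch of that odd/even split (purely illustrative; the line count and rates are just the nominal figures above):

# Toy sketch of interlacing: one 525-line frame is delivered as two fields,
# odd-numbered lines first, then even-numbered lines.
LINES_PER_FRAME = 525
FIELD_RATE = 60               # fields per second (nominal, pre-colour)
FRAME_RATE = FIELD_RATE / 2   # two fields make one frame

def fields_from_frame(frame_lines):
    # Split a list of scanlines (index 0 = line 1) into odd and even fields.
    odd_field = frame_lines[0::2]    # lines 1, 3, 5, ...
    even_field = frame_lines[1::2]   # lines 2, 4, 6, ...
    return odd_field, even_field

def frame_from_fields(odd_field, even_field):
    # Re-interleave two fields back into one full frame.
    frame = [None] * (len(odd_field) + len(even_field))
    frame[0::2] = odd_field
    frame[1::2] = even_field
    return frame

frame = [f"line {n}" for n in range(1, LINES_PER_FRAME + 1)]
odd, even = fields_from_frame(frame)
assert frame_from_fields(odd, even) == frame
print(f"{len(odd)} odd lines + {len(even)} even lines at {FIELD_RATE} fields/s = {FRAME_RATE:.0f} frames/s")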

Now, introduce color television. It turns out you need a little extra space on every video line to synchronize the color circuitry (colorburst), but you can't speed up the horizontal sweep to compensate. Result: You're now scanning fields at 59.94 per second, and frames at 29.97 per second. So that's where those weird numbers come from.
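
(The way those figures are usually quoted is as the nominal rates slowed by a factor of 1000/1001 - quick arithmetic with that commonly cited factor, not anything from the post itself:)

# The widely cited NTSC color adjustment: slow the nominal rates by 1000/1001.
nominal_field_rate = 60.0
color_field_rate = nominal_field_rate * 1000 / 1001
color_frame_rate = color_field_rate / 2
print(f"fields: {color_field_rate:.4f} per second")   # ~59.9401
print(f"frames: {color_frame_rate:.4f} per second")   # ~29.9700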

Meanwhile, computer displays were originally repurposed television sets, so they swept out imagery at 60 fields/30 frames per second. And it was blurry and jittery. Someone said, "Say, can't we display all 525 lines in one sweep instead of two? The machines are fast enough now..." And thus were born progressive displays, where all the lines of an image are swept out at once, rather than being broken up over a series of fields.

Then the computer guys said, "You know, 640 * 480 really isn't enough pixels. Can we get more lines? And more pixels per line?" And thus was born the multisync monitor, which would try to adapt to whatever horizontal and vertical sweep rates the computer was generating. Soon there was 800 * 600, 1024 * 768, 1152 * 864, 1280 * 1024, and beyond. And you could display them at 60 frames per second, 72 frames per second, and eventually 240 frames per second.

Then someone saw how much nicer the computer displays were looking compared to their broadcast television counterparts and said, "Can we get some of that?" And thus were born the first "hi-def" video standards. But they had to be quasi-compatible with all the very expensive equipment the studios had already paid for. Hence, "Standard Def" is the old interlaced color video standard of 59.94 fields/29.97 frames per second, called "480i" (the 'i' means "interlaced"). 480p is the same number of pixels as 480i, but they're swept out over a single 1/59.94th second frame, rather than two 1/59.94th second fields.

But the new hi-def sets were being designed by the same groups of people as had been making computer monitors all that time, and they said, "There's no reason this multisync tech can't work in a TV. If we're getting a signal that's 59.94 fields per second interlaced, we'll sync to that. If we're getting a signal that's 720p at exactly 60.00 frames per second, we'll sync to that, too. Hell, plug in your old computer; we'll display whatever that thing's kicking out. 1024 * 768 @ 72 FPS? No problem..."

So the "standard" digital video format today is, "Whatever the content creator exported it as," since all the new displays just adapt. If you're going to broadcast over the air, then you are constrained by the FCC to a limited, well-defined set of formats, which are 480i, 480p, 720p, and 1080i. (I don't know if there's an FCC-sanctioned 720i or 1080p.) OTOH, if you're just sharing H.264 files over the Internet, then it can be whatever you think the recipient's video player can handle.

And now all that is stuck in your brain, too.

Nice and educational. Shiny. Movie Framerate = Klingon indeed.

I have not seen it in 48 frames, but I imagine that is where I would fall. I think it will require adjustments to the film-making process in terms of effects and post-production. The potential is there and I could easily see the issues being ironed out in the future.

I think it will be something people need to get used to, in terms of both watching and producing.

axlryder:
The 48 fps looks like shit to me. It's NOT the same as games, btw. I tire of hearing people like "oh, well, games look better at 60fps, so what's the problem?!" Games are fully graphically rendered. Many of them can afford to be silky smooth without their seams showing. Not the same for film. Not only do the effects tend to look worse, but there's a different aesthetic mentality that we view films with. 24fps seems to lend itself better to this mentality in many cases. Maybe they'll learn how to circumvent the format's problems in the future, but for now it looks crap.

Also, "new" tech (which 48fps filming is not) doesn't necessarily mean better. I think a lot of movies would look better on film rather than being filmed digitally, despite digital being the newer tech. Hell, there's a reason why people laud Breaking Bad for being shot on 35mm as opposed to digitally. It looks good.

I had no idea that BB was filmed on actual film rolls. Interesting!

leviadragon99:
Well, for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.

Really?

I'm in New Zealand and we already have it, how could you not, you're the closest country to us.

I saw it in 48 FPS and I'm honestly not sure how I feel about it.

To me it still gives it sort of a low-quality "soap opera" look that I find distracting. The fact that I already associate that look with low-quality material doesn't help.

So I feel like I can't really make a valid judgement on the tech until I get used to it enough that I don't automatically associate the 48FPS look with low quality material.

I saw it in 48 FPS and I didn't notice any difference at all.

The funny thing was that the 48FPS thing didn't really affect me too much, as I've been playing PC games at 60FPS for a long time now. Though the smoothness of the Hobbit was simply awesome. I'm going to find it hard going "back" to 24FPS films now without noticing the difference, the same way I found it hard not to notice how bad a "standard definition" picture looked compared to an HD one when I first encountered 1080p six years ago.

It's all Greek to me...
I don't understand how the speed of the movie projection has anything to do with the content.

OK, I have a question: is the 48 FPS film available in 2D? I'd like to see the 48 FPS one but not in 3D. I don't want to sound finicky, it's just awkward cos I wear glasses and wearing two pairs kind of messes up the experience.

We were given the option to draw 24FPS during my animation course... yeah, no one did... Fps dat, am I right!? THANK YOU, tip your waiters.

I may also point out that the "24" standard is far from the standard in some media, most notably video games. The standard there is actually 60 frames per second; the absolute minimum before people start thinking the game is unplayable is 30. In games, movement and reactions are delayed if the buttons are hit in the spaces "between" the frames. This is why some Counter-Strike tournaments are run on 1000-tick servers. No, I have no idea how they make it work.
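
To make the "between the frames" point concrete, here's a rough sketch of how a press gets rounded up to the next sample, assuming input is only polled once per frame or per server tick (the rates and timings are just illustrative):

import math

def reaction_delay_ms(press_time, sample_rate):
    # Delay before a press at press_time (seconds) is noticed,
    # assuming input is only sampled on frame/tick boundaries.
    interval = 1.0 / sample_rate
    next_sample = math.ceil(press_time / interval) * interval
    return (next_sample - press_time) * 1000

press = 0.0123  # seconds, landing between two samples
for rate in (30, 60, 128, 1000):
    print(f"{rate:>4} Hz sampling: press seen after {reaction_delay_ms(press, rate):.2f} ms")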

It's kind of a stretch to even call this "new technology" - cameras have been capable of filming much higher than 48fps for decades. It's just that cinema projectors have been stuck on the 24fps "standard" for so long. Even then, it's not an issue of technology, but one of institutional momentum and laziness.

Interesting mention of the 48fps but it misses out on a lot of the historical details.

15fps was the old standard before 24fps, and it wasn't abandoned due to old hand cranks. It was given up because at 15fps you get odd optical effects like wagon wheels moving backwards.

Back in the day a lot of testing was done, and there is plenty of literature on it, and the best frame rate to reduce optical effects while still appearing as motion is 24fps.

The problem is Computer Geeks have slipped into the Motion Picture filming arena, and they are marketing 48fps.

For some reason they don't understand
24 Fluid Frames Per Second is what Film is at
Games on the other hand run at
60 Still Frames Per Second

Your eye actually has a very slow frame rate (~15fps for color, much higher for grayscale), but it is taking fluid frames, and from them it extrapolates motion. Because game frames are still and have little or no accurate motion blur, you have to draw more frames to trick the eye into seeing a blur on the retina that it can interpret as motion.

48fps was marketed as a pipe dream. It was supposed to address the portion of the population that gets motion sick from 3D. If it did that, then I might agree with the change. However, there are still reports of people getting headaches from the 3D HFR version. They assumed that the issue with the headaches was the motion blur, and ignored anything to the contrary. The real issue is with your eyes looking at an image stereoscopically in an unnatural fashion. It causes muscle strain, and if your vision is even a little bad it builds up quicker, but even with good vision, if you watched something in 3D all day you'd get a headache too.

The next problem is that 48fps is an attempt to get the frames to have less blur, which makes a fast-moving scene clearer but also makes it look fake. The reason has nothing to do with the makeup or props. It's because your brain knows that something that is moving is supposed to have a lot of blur. Just move your fingers in front of your eyes if you don't believe me. If it doesn't see the blur, it knows the image is fake. Some people say it's just like HD, and in a way they are right, but not for the reasons they think. The reason some people still say HD looks fake is foolish CG touch-up on scenes. Each time I see a scene in a film on my HD TV that looks fake, it's because the background and the foreground are both in focus. Your eyes can't focus on two different planes at the same time, and when that happens your brain will know something is fake even if it can't pinpoint the exact issue. SD had the advantage of being just grainy enough that your brain wouldn't notice the background was in focus along with the foreground, and a theater screen is so big that your eyes dart around enough that your brain doesn't realize the image is in complete focus.

In a few years, 48+FPS will look normal and 24FPS will look crappy and dated, like black and white, used for artistic effect and not much else.

I don't think slipping a frame past a viewer actually proves anything, if the viewer can still tell you the framerate is higher or lower. On modern screens it's harder to notice - they're actually much better at displaying 60FPS because they don't go black between frames like CRTs did. But with CRTs, I could take a glance across the room and tell the difference between 60FPS and 75FPS.


chozo_hybrid:

leviadragon99:
Well, for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.

Really?

I'm in New Zealand and we already have it, how could you not, you're the closest country to us.

Because the movie distribution industry is even more backwards than the film-making industry?

Aardvaarkman:

chozo_hybrid:

leviadragon99:
Well, for some utterly arbitrary and asinine reason I won't be able to see the movie until Boxing Day anyway, because Australian cinemas are dumb like that.

Really?

I'm in New Zealand and we already have it, how could you not, you're the closest country to us.

Because the movie distribution industry is even more backwards than the film-making industry?

Damn, that sucks, I have friends in OZ looking forward to it.

Recently I've started to dislike general movie screen quality, and I really appreciate the improvements.
I just like that I can actually make out things that move fast across the screen now, and that I don't get dizzy when the camera moves, because it doesn't get so blurry and choppy.

medv4380:
Interesting mention of the 48fps but it misses out on a lot of the historical details.

For some reason they don't understand
24 Fluid Frames Per Second is what Film is at
Games on the other hand run at
60 Still Frames Per Second

This. An exposure =/= a still rendered image in a video game.

That said, I do see 48 as having potential >< particularly in sport, where shutter speed is already ramped up to try and keep the action as sharp as possible... It can also be a potential aesthetic choice... especially in certain sects of the avant-garde ><

I don't see one as being objectively better than the other, but it does irritate me when people seem to think that because it's a higher number, it ergo must be objectively better because they play video games at higher frame rates.

medv4380:

Your eye actually has a very slow frame rate (~15fps for color, much higher for grayscale),

Eyes don't have a frame rate, because they don't use frames. Where are you getting the 15fps figure from? It sounds like quackery to me.

medv4380:
The next problem is that 48fps is an attempt to get the frames to have less blur, which makes a fast-moving scene clearer but also makes it look fake. The reason has nothing to do with the makeup or props. It's because your brain knows that something that is moving is supposed to have a lot of blur.

That doesn't make any sense. If the film is moving faster than your eyes/brain can perceive, then you will perceive that as "blur," just as you would with real-life objects moving faster than you can perceive in detail.

If your comment were true, it would mean that film-makers have found a way to bypass human perception and give the brain more information than it can process outside of a cinema. That would be a pretty amazing discovery, something worthy of a Nobel Prize or other distinguished science award. I'm pretty sure that's not what's happening, especially as 48fps is a pretty low speed, well within human perception if you're not intoxicated and don't have vision difficulties.

Bob:
This will sound like Klingon to most of you

You are aware this is a gaming news website, right?

I'd say most of the people on this website have a pretty good idea what frame rates are.

j-e-f-f-e-r-s:
This idea that people are already throwing around on the thread that 48fps is objectively better simply because it has a higher value is ludicrous.

You're right, it is ludicrous, but so is treating 48fps as objectively worse. It's simply different, and artists who are actually thinking about what they are doing will choose the technology that better serves the movie they're trying to make. This whole thing makes me think of "realistic" graphics and the gaming industry; new technology didn't render old art styles obsolete, but realism is as valid a style to strive for as any other.

Rhys Davies:
This. An exposure =/= a still rendered image in a video game.

How is it not? The process to create it is not the same, but the end result is functionally identical. Actually, in computer-generated films (i.e Pixar, etc.) the process is also identical to that of video games.

I just remember the 3D making my eyes tear up

Aardvaarkman:

Eyes don't have a frame rate, because they don't use frames. Where are you getting the 15fps figure from? It sounds like quackery to me.

Actually, they technically do. There is a maximum number of times per second your optic nerve can transmit an impulse to convey an image. If you show an image for less than this time, you don't even see it. As long as the recording is above a certain threshold, the motion looks fluid. Anything far less and you start seeing jerky motion. There is no maximum limit to your eyes' FPS in terms of media, because it will look more fluid up to a point. There is most certainly a minimum value though, which IS about 15-24, above which motion looks fluid. The maximum FPS of your eyes is estimated to be around 200 (pilots can see and remember an image flashed for as little as 1/220th of a second). But that's totally unnecessary to be honest. The minimum value is all that is necessary for smooth motion, since your brain fills in the rest. The brain cuts a LOT of corners to make reality easier to distinguish and understand. Motion is one of them. Our max FPS doesn't have a huge effect on how we see motion since our brain will fill in the gaps after 24 FPS anyway.

Aardvaarkman:
If the film is moving faster than your eyes/brain can perceive, then you will perceive that as "blur," just as you would with real-life objects moving faster than you can perceive in detail.

Right on. Here's the interesting thing: the eye can perceive motion at a very high "frame rate", or rather it takes a very high frame rate to deceive the eye into seeing motion where there is none. (Even attempting to assign a number may be inappropriate, since for any object you can perceive at all, the ratio between the size of the moving object and the distance it moves per frame is more relevant than the speed per se, although I think you could find an upper limit where any object you can perceive moving is adequately represented; that upper limit would be very high.) 48FPS should need half as much motion blur as 24FPS, but it still needs motion blur IMO. Motion blur at 24FPS tends to be exaggerated; it "looks like film" because it is, but it doesn't look real, at least not to me. I haven't seen The Hobbit yet, so I don't know how it will look to me, nor even whether they're using frame-rate-appropriate motion blur.

Don't believe me about motion requiring a very high frame rate for eye-blur to kick in? Try this simple test. If you have motion blur turned on for your mouse pointer, turn it off. Now move the mouse around quickly. Do you see a blur, or do you see a cascade of discrete after-images? At 60FPS, you will easily see the discrete frames of the mouse being moved. The notion that the eye cannot distinguish frame rates above 15/24/60 is self-evidently nonsense, as anybody with a computer can test with a trivial exercise.

No, what the eye has difficulty with is detecting the difference between movement and well-implemented motion blur (and film can do a very fine job of simulating motion blur by the simple expedient of functioning similarly to the eye itself). Proving that you can fool the eye does not prove that the eye cannot distinguish frames.
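
To put rough numbers on both the blur and the after-image points (the speed is made up for illustration, and I'm assuming a 180-degree shutter, i.e. the exposure is half the frame interval):

def blur_and_gap(speed_px_per_s, fps, shutter_fraction=0.5):
    # Blur streak captured within one exposure, and the jump between
    # successive frame positions, for an object moving at constant speed.
    exposure = shutter_fraction / fps     # e.g. 1/48 s at 24 fps
    blur_px = speed_px_per_s * exposure   # smear recorded in one frame
    gap_px = speed_px_per_s / fps         # distance between after-images
    return blur_px, gap_px

speed = 2400  # px/s - roughly a fast mouse flick or a whip pan (made-up figure)
for fps in (24, 48, 60):
    blur, gap = blur_and_gap(speed, fps)
    print(f"{fps} fps: ~{blur:.0f} px of blur per frame, {gap:.0f} px between frames")

So doubling the frame rate halves both the blur each frame needs and the distance between the discrete after-images you see in the mouse test.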

Aardvaarkman:

Rhys Davies:
This. An exposure =/= a still rendered image in a video game.

How is it not? The process to create it is not the same, but the end result is functionally identical. Actually, in computer-generated films (i.e Pixar, etc.) the process is also identical to that of video games.

Because, as it's stated in the video, an exposure has motion blur, as each exposure lasts 1/48th of a second at 24fps (or 1/96th at 48fps). Video games, on the other hand, render completely still frames. Games try to overcome this by adding their own form of motion blur, but doing this in real time just doesn't look anywhere near as effective as actual motion blur from an exposure.

CGI and 3D animated films add motion blur into their renders, and as they don't have to render it in real time, the motion blur is applied correctly and subtly. Pause Toy Story when it's in motion and you can see the motion blur clearly... it's why films look smooth at 24 but games look jumpy.
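
One common way offline renderers approximate that exposure blur is to average many sub-frame samples per output frame. A minimal sketch of the idea (the 1D "renderer", speed, and sample count are all made up for illustration; real renderers and game engines use fancier methods such as velocity buffers):

import numpy as np

def render(t):
    # Stand-in renderer: a single bright pixel sweeping across a 1D screen.
    frame = np.zeros(64)
    frame[int(t * 240) % 64] = 1.0   # the dot moves at 240 px/s
    return frame

def motion_blurred_frame(t0, fps=24, shutter_fraction=0.5, samples=16):
    # Average sub-frame renders across the open-shutter interval,
    # approximating the smear a real exposure would capture.
    exposure = shutter_fraction / fps
    times = t0 + np.linspace(0.0, exposure, samples, endpoint=False)
    return np.mean([render(t) for t in times], axis=0)

sharp = render(0.0)
blurred = motion_blurred_frame(0.0)
print("pixels lit in the still frame:    ", int(np.count_nonzero(sharp)))    # 1
print("pixels smeared in the blurred one:", int(np.count_nonzero(blurred)))  # several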

Milanezi:
Hmpf... I don't care, but I wish they'd stop with the CGI. When CGI gets old it gets VERY ugly, to the point of us saying "dude, that's not how I remember it", using props and stuff though, when they're well done, those stick forever or at least get a classic stamp to them.

I wholeheartedly agree. And it's not just when it gets old, either. I saw this in the standard 24fps, and I thought the majority of the CGI (like in the LOTR trilogy) was extremely unconvincing. If, as Bob says, the props, makeup and (I assume) visual effects appear more obvious in 48fps, I can only imagine how terrible the CGI must look at that speed.

I really don't know how to explain it. I mean, you can go all the way back to Jurassic Park, and (for the most part) the CG still stands up, which I find surprising, because it's all living creatures - something CGI has a big problem with. CGI animals and CGI people always stand out in films, and I can never really figure out why. Something about the way they move perhaps, or how their skin moves, like they don't have a skeletal system, or they entirely lack weight.

When I first heard about this 48 fps thing, I naturally turned my nose up at it in disgust. I can't claim to enjoy change that much. Not that I'm a Luddite by any means, but when change is forced upon us when it is quite clearly unneeded (and for the most part doesn't work - that's right, I'm looking at YOU, 3D!) then I start to have a problem...

