Escapist Podcast - Science and Tech: 009: The "God Particle," 60 FPS, 4D Theaters

009: The "God Particle," 60 FPS, 4D Theaters

In this episode of The Escapist's Science and Tech podcast, host CJ Miozzi and Escapist news writers talk about recent headlines in the world of science and technology: the further confirmation of the discovery of the Higgs boson, 60 FPS coming to YouTube, whether 4D theaters are just a gimmick, and more.

Watch Video

You know, I was thinking about the 60 fps thing, and in general about the advancement of technology as applied to games and everything.

First though, after attending a lecture about VR at GDC, apparently there is a very physical requirement for 60+ fps in that medium. Supposedly it produces less motion sickness and a far more comfortable overall viewing experience. Having only tested the Oculus Rift on a few occasions, I couldn't really tell the difference, but I do take their word for it.

However, this research possibly led to Palmer Luckey throwing out some veiled passive-aggressive tweets (paraphrasing: "30fps on a modern game is a failure in game development"), possibly directed at Ready at Dawn's comments on The Order: 1886, because The Order creatives said they had intentionally aimed for 30 fps to give it a more cinematic feel.

Likewise, a Naughty Dog artist backed that up by saying he preferred the look of 30 fps over 60, for the same cinematic reasons (obvious justification for all the townsfolk to sharpen their pitchforks). Because he can't possibly be honest; he's only trying to justify the lack of power of the new consoles, right?

ANYHOW! This led me to a thought I've had for some time now. Instead of chasing the 120 fps 4K golden goose, couldn't we make a game today at low definition and low framerate that looks absolutely astoundingly awesome?

For example, I have a big Dell 1440p monitor, and yes, it looks fantastic, but often when playing games I turn the resolution down to 1080p, because it still looks impressive, but less jaggy.

In this sense, if we simulate actual camera motion blur and all the more complex physical properties of the rendering itself, maybe we could use "lo-fi" to make our games more believable.
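Just to make that idea concrete, here's a rough sketch of the general technique (not how any particular engine does it, and the render() function is just a hypothetical stand-in): one way to fake a camera shutter is to render several sub-frames across the shutter interval and average them.

import numpy as np

def motion_blurred_frame(render, t, shutter=1.0 / 48, samples=8):
    # render(t) is a hypothetical callable returning an HxWx3 float image for time t.
    # Averaging sub-frames across the shutter interval mimics the blur a physical
    # film camera would capture, rather than a razor-sharp instant in time.
    times = t + np.linspace(0.0, shutter, samples)
    frames = np.stack([render(ti) for ti in times])
    return frames.mean(axis=0)

The point being that the blur comes from real scene motion, not from a smear slapped on afterwards.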

I personally do think that gameplay is king here; some games possibly benefit very noticeably from 60 fps, while for others it really couldn't matter less.
A very anecdotal example: when playing Sleeping Dogs on my PC, there is an option to unlock the framerate or lock it at 30. Unlocking it normally gives me 70+ fps, but when it dips it drops to near 30, and I can definitely notice, even if it stays over 30.
I prefer locking it to a consistent 30, since it stays there reliably, and without the variation I can barely tell.

But yes, in general the internet is where people have found a place to vent their frustrations in very destructive ways: "everyone who thinks differently from my very specific, biased opinion is an obvious idiot." I really hope that wasn't the case... alas... fleshlings.

In the YouTube case, I did notice SOME difference in the footage, but not really enough to care much. I also noticed the stream took much longer to load in HD, which was something I had forgotten about. I suppose it's useful for showcasing some things, but I don't generally think it's worth it.

About the God particle: I live in Chile, and no one here really bats an eyelash about it. We do translate it literally as "God's particle" in Spanish, which is just as silly if not sillier.

4D cinema is a complete disgrace though (but I thought the new Spider-Man was OK, entertaining in general).

Phew, anyway, long post... first time listening to you guys, pretty interesting. I'll stay tuned.

Delcast:
*snip*

Thank you for the insightful response! I wish the internet had more people leaving intelligent, level-headed commentary like this.

The low-def idea is interesting -- in fact, we did a news piece a couple months back in which someone used some technology to "scan in" a very low-res version of himself into a VR setting, and the result felt very realistic, despite the awful resolution.

Good to know about "God's particle" in Chile.

-CJ

Strangely, I was trying to explain this to someone recently, and failing: that the problem with the Hobbit film was the high frame rate of the filming rather than the resolution of the film.

Strangely, films are currently shown at 72 Hz in cinemas; they just flash each frame of the 24 fps film three times. For games, on the other hand, it's about the change in frame rate versus the refresh rate of the screen: when they drop out of sync is when you see it.

Also, no one reported nausea or dizziness from 48 fps; that was the 3D.

Hoplon:
Strangely, I was trying to explain this to someone recently, and failing: that the problem with the Hobbit film was the high frame rate of the filming rather than the resolution of the film.

Strangely, films are currently shown at 72 Hz in cinemas; they just flash each frame of the 24 fps film three times. For games, on the other hand, it's about the change in frame rate versus the refresh rate of the screen: when they drop out of sync is when you see it.

Also, no one reported nausea or dizziness from 48 fps; that was the 3D.

Thanks for clearing that up!

This was my first time listening to this podcast, but I'm glad I did. Easily the best-paced: everyone gets a chance to give their input, no one completely dominates the whole thing, and it isn't bogged down with overly preachy message-mongering like some other Escapist podcasts.

Definitely will tune in a lot more from now on.

Kameburger:
This was my first time listening to this podcast, but I'm glad I did.

Definitely will tune in a lot more from now on.

Thanks! Glad you enjoyed it, and I look forward to reading your comments in the future!

Not a big deal, but NOD32 is made by ESET. I am pretty sure they are Slovak, not Russian.

vhailorx:
Not a big deal, but NOD32 is made by ESET. I am pretty sure they are Slovak, not Russian.

Yeah, they are Slovak. I cheated a little. That's why I said "Soviet" instead of Russian, heh.

It sounds strange when you say it's afternoon and I'm listening to it at 7 AM :D

Personally, I dislike the Higgs boson being called the "God particle" for the same reason physicists don't like it - it associates it with religion when it isn't anything religiously oriented.

"Allah particle". Well, since in thier language Allah literally means "god". Its not a name, it literally is translated to GOD. so Allah Particle would still mean God's Particle.
--------------------------

The difference between 30 and 60 frames per second?

It's proven humans can see the difference, quite easily in fact. If some people genuinely cannot see it, they should get their eyes checked, because they may have a medical condition. Not being able to tell means your eyes are not working properly. A healthy human can see a difference. And that's not an opinion, that's scientifically proven; if you want to argue, argue against biology.

Hobbit props looking fake - a higher framerate means less motion blur. Less motion blur means there is less blurring on the props. This means it's easier to see flaws in prop design.

You notice it more easily during framerate drops because then you see a large change. Humans are made to get used to their environment. Sit in a smelly room and in an hour you won't actually smell anything - you get used to it. However, enter such a room from clear air and you'll notice immediately. We get used to it; that does not mean it's as good, though.

A higher framerate will not make you more nauseous or give you more of a headache than a lower one. If that were true, then real life would have the same effect, since in reality you see as fast as your eyes allow.

As far as people being dicks about it: let's say you are enthusiastic about something, and then you see someone doing it much better than it was done before, but suddenly most people start telling you there is no difference. It's infuriating how some people seem to be trying to defy basic biology in arguments aimed specifically against your hopes. I can see how people can get dickish about it.

You used a music analogy, but forgot to mention that:
Vinyl actually got killed.
MP3 became the major trend in music.
It's almost impossible to buy lossless-quality audio online. For example, iTunes sells 256 kbps audio, which is horrible compared to lossless.
The MP3 trend came about because most people didn't have the equipment to hear the difference - something that's not true with movies or games nowadays (below-60 Hz TVs are obsolete and only exist in houses that haven't updated in a decade).
Getting back to lossless quality is only a recent trend in music, after over a decade of dropping quality.

This also has ties to resolution - we had Full HD back when TVs were CRTs. Flat panels came and SD became the standard. A decade later we found out that this was solely because of a price cartel among TV manufacturers. Now that the cartel has been legally disbanded, we see resolution come back up to what it was two decades ago.

So yes, quality-dip trends exist and are a real threat. And since it has already happened multiple times in the past, people who don't want it to happen may be very aggressive about it.
----------------------------

Delcast:

For example, I have a big Dell 1440p monitor, and yes, it looks fantastic, but often when playing games I turn the resolution down to 1080p, because it still looks impressive, but less jaggy.

Erm, don't do that.

Your monitor is designed for 1440p; anything else will look bad on it. That's the way flat-panel monitors work. Also, a lower resolution will look far more jaggy unless you're adding extra blurring AA (like FXAA, which basically hides bad aliasing by blurring it). A higher resolution is less jaggy than a lower one if you keep the settings the same; however, you would need many settings to compensate for the "monitor not at native resolution" problem.
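Here's a rough toy example of why (plain interpolation standing in for whatever the panel or GPU scaler actually does): 1440/1080 is not a whole number, so a 1080p frame can't map pixel-for-pixel onto a 1440p panel, and every hard edge gets smeared across neighbouring screen pixels.

import numpy as np

src = np.zeros(8)                 # pretend this is one row of a 1080p render
src[3] = 1.0                      # a single fully lit pixel - a hard edge

scale = 1440 / 1080               # = 4/3, a non-integer scale factor
dst_x = np.arange(round(len(src) * scale)) / scale     # where each screen pixel samples the source
upscaled = np.interp(dst_x, np.arange(len(src)), src)  # linear interpolation, standing in for the scaler

print(upscaled.round(2))          # the lone lit pixel now bleeds into its neighbours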

In this sense, if we simulate actual camera motion blur and all the more complex physical properties of the rendering itself, maybe we could use "lo-fi" to make our games more believable.

Motion blur is used to hide the low-framerate problem. It's not something we want; motion blur has no place in videogames. It's only blurring the info you see, which makes it worse, objectively.

@rhykker, you seem to be the only one reading the comments of any podcast, thank you for caring :)

Strazdas:
[snip]

Thanks for the commentary! Feels like you were part of the discussion :) The smelly room analogy is fantastic; humans have a tremendous capacity for adaptability.

Speaking of... I was on a CRT monitor for the longest time, and when I first switched to LCD (it was an older LCD), as I was playing an FPS game with lots of fast movement, I felt like I was on LSD. It seemed as though everything was leaving motion blur trails, which felt very disorienting, distracting, and was messing with my eyes. After a few days, I stopped noticing it, though. To this day, I wonder if that has had some negative impact on my vision :\

Rhykker:

Strazdas:
[snip]

Thanks for the commentary! Feels like you were part of the discussion :) The smelly room analogy is fantastic; humans have a tremendous capacity for adaptability.

Speaking of... I was on a CRT monitor for the longest time, and when I first switched to LCD (it was an older LCD), as I was playing an FPS game with lots of fast movement, I felt like I was on LSD. It seemed as though everything was leaving motion blur trails, which felt very disorienting, distracting, and was messing with my eyes. After a few days, I stopped noticing it, though. To this day, I wonder if that has had some negative impact on my vision :\

You're welcome. I'm glad there are some people who read my ramblings to begin with :)

Ah, the LCD ghosting. Indeed, some monitors, especially the earlier ones, had response lag (and on some, the response time added to that - back then it was more like 50ms, not the 5ms it is now), which meant they responded over 100ms late to input. This obviously created huge response-lag issues, as games felt sluggish, so manufacturers decided to add a feature where the monitor tries to guess the next few frames based on the previous frames. When it does this poorly, or when it tries to do that and the view changes suddenly, you see a ghosting effect of the monitor trying to display the "guess" and the new data at the same time. Even on new monitors, if you set the response time to 0 or high, you will notice this effect (it's also very noticeable when scrolling webpages). They have worked it out in part, or at least tuned the settings so that it stays hidden most of the time, and the actual response time has decreased to 5ms on modern TVs/monitors, which makes it much better (though the 1ms CRT-like response time can so far only be mimicked by TN panel monitors, which is also the type needed for those 144Hz ones - but CRTs are believed to have better image "quality" and don't have color distortion at angles).

Not sure about affecting your vision. I'd say that if you used a monitor without it for a few months and then came back, you would have the same effect, as you would stop being used to it. I used to game on hardware that ran most games at 20 fps for years and learned to not see the lag (funny story: when my friend tried to play on my PC and did terribly, he asked me how I can play so well with this much lag, and the response was akin to "dude, I always lag, it's my natural gaming state"). Now I have played for two months on my new machine that can run things at 60 fps fine, and then when a friend came over I let him use that and used the old one myself for a mini LAN party, and damn, it's so hard to play with this low framerate now. So I don't believe it's so much a permanent effect as that you're just used to this being everyday life and don't feel the problems, else it would drive you insane. A sort of "expect less and you will be less disappointed" situation, except subconsciously :)

Strazdas:

Erm, don't do that.

Your monitor is designed for 1440p; anything else will look bad on it. That's the way flat-panel monitors work. Also, a lower resolution will look far more jaggy unless you're adding extra blurring AA (like FXAA, which basically hides bad aliasing by blurring it). A higher resolution is less jaggy than a lower one if you keep the settings the same; however, you would need many settings to compensate for the "monitor not at native resolution" problem.

Thing is, my GPU has native handling of that upsampling and antialiasing, so it actually looks nicer to me, and a lot of the usual aliasing and texture jaggedness of native resolution is improved, resulting in a steadier framerate and a smoothed crispness that -I PREFER-...
(Too many people seem to have a hard time getting their head around the concept of people preferring something different from their own preferences. Having evaluated both, I picked one; the other is not terrible, but I'm partial to my choice.)

AND ABOUT THAT. Simulating lens physics to emulate cinematic aesthetics is NOT wrong or worse. It's an aesthetic choice. If I make a black-and-white 24 fps game emulating a film noir style, with heavy 8mm film blur and other such choices, that's fine; if the developer owns up to these decisions and considers them in the designed gameplay to enhance the resulting experience, all the power to them. "Worse" is subjective when you are dealing with experiences and aesthetics; art is not better because the canvas is bigger or the sculpture is made of more luxurious metals.

Also, "blurring the info" is in fact something that most current engines do for a variety of modern post-processes: ambient occlusion blurs the proximity and normal-direction samples of the scene geometry; sparse voxel octree global illumination "blurs" the scene's light and emission information, compositing it with the precomputed lighting; HDR lighting blurs the light information based on scene luminance. In fact, most "modern" effects in games are not "real" calculations but fast iterative approximations, which you could consider make the scene "objectively worse". There are games that even use JPEG compression artifacts to convey a certain aesthetic.
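As a concrete (and very simplified) illustration of the ambient occlusion case: AO is usually computed from a handful of noisy samples per pixel and then deliberately blurred to hide that noise. A toy version of that blur pass, assuming the AO term is already just a 2D array, could look like this:

import numpy as np

def blur_ao(ao, radius=2):
    # Separable box blur over a noisy screen-space AO buffer (HxW floats):
    # a horizontal pass followed by a vertical pass.
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    blurred = np.apply_along_axis(lambda row: np.convolve(row, kernel, mode="same"), 1, ao)
    blurred = np.apply_along_axis(lambda col: np.convolve(col, kernel, mode="same"), 0, blurred)
    return blurred

noisy_ao = np.clip(0.7 + 0.3 * np.random.rand(64, 64), 0.0, 1.0)  # stand-in for a noisy AO term
smooth_ao = blur_ao(noisy_ao)     # information is smeared on purpose, and the result looks better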

So, on the contrary, motion blur can indeed be used as an aesthetic decision to accomplish a desired expression and make a particular product more believable. It's not "objectively worse", and in some cases it can be desirable.

Delcast:

Strazdas:

Erm, don't do that.

Your monitor is designed for 1440p; anything else will look bad on it. That's the way flat-panel monitors work. Also, a lower resolution will look far more jaggy unless you're adding extra blurring AA (like FXAA, which basically hides bad aliasing by blurring it). A higher resolution is less jaggy than a lower one if you keep the settings the same; however, you would need many settings to compensate for the "monitor not at native resolution" problem.

Thing is, my GPU has native handling of that upsampling and antialiasing, so it actually looks nicer to me, and a lot of the usual aliasing and texture jaggedness of native resolution is improved, resulting in a steadier framerate and a smoothed crispness that -I PREFER-...
(Too many people seem to have a hard time getting their head around the concept of people preferring something different from their own preferences. Having evaluated both, I picked one; the other is not terrible, but I'm partial to my choice.)

AND ABOUT THAT. Simulating lens physics to emulate cinematic aesthetics is NOT wrong or worse. It's an aesthetic choice. If I make a black-and-white 24 fps game emulating a film noir style, with heavy 8mm film blur and other such choices, that's fine; if the developer owns up to these decisions and considers them in the designed gameplay to enhance the resulting experience, all the power to them. "Worse" is subjective when you are dealing with experiences and aesthetics; art is not better because the canvas is bigger or the sculpture is made of more luxurious metals.

Also, "blurring the info" is in fact something that most current engines do for a variety of modern post-processes: ambient occlusion blurs the proximity and normal-direction samples of the scene geometry; sparse voxel octree global illumination "blurs" the scene's light and emission information, compositing it with the precomputed lighting; HDR lighting blurs the light information based on scene luminance. In fact, most "modern" effects in games are not "real" calculations but fast iterative approximations, which you could consider make the scene "objectively worse". There are games that even use JPEG compression artifacts to convey a certain aesthetic.

So, on the contrary, motion blur can indeed be used as an aesthetic decision to accomplish a desired expression and make a particular product more believable. It's not "objectively worse", and in some cases it can be desirable.

All GPUs can upsample; some have it considered in their design, others need to be forced to do it. Neither results in as good quality as supersampling, though. So you're basically playing at a lower resolution, then making your GPU upsample it back up to the monitor's resolution. Well, that's better than nothing, I guess - that's how consoles do it. It does not make it the same as or nicer than real resolution, though.
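To make the difference concrete, a toy sketch (nothing vendor-specific; render(h, w) is just a stand-in for drawing the scene at a given resolution): supersampling starts from more pixels than the screen has and averages them down, while the upscaling you describe starts from fewer and has to stretch them back out.

import numpy as np

def supersample_2x(render, screen_h, screen_w):
    # SSAA: render at twice the screen resolution, then average each 2x2 block down to one screen pixel.
    big = render(screen_h * 2, screen_w * 2)
    return big.reshape(screen_h, 2, screen_w, 2).mean(axis=(1, 3))

def upscale_from_075x(render, screen_h, screen_w):
    # Console-style upscale: render at ~75% resolution, then stretch it to the screen
    # (nearest-neighbour here for brevity; real scalers interpolate, i.e. blur).
    small = render(int(screen_h * 0.75), int(screen_w * 0.75))
    ys = np.minimum(np.arange(screen_h) * small.shape[0] // screen_h, small.shape[0] - 1)
    xs = np.minimum(np.arange(screen_w) * small.shape[1] // screen_w, small.shape[1] - 1)
    return small[np.ix_(ys, xs)]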

You are right that this method allows a higher framerate, because all you do is blur out the problems of low resolution. If you're having framerate problems, sure, I can perfectly understand that, but if we compare the two without other improvements, higher resolution is objectively better.

And yes, plenty of people have a hard time grasping why people prefer objectively worse things. If you had to pick between two pairs of shoes and one of them would leave blisters on your feet while the other doesn't, I would have a very hard time understanding why people pick the ones that leave blisters.

You can make a game in that style; it would be objectively worse from a graphical standpoint. You could claim aesthetic design choices, but then you're trading objective quality for subjective looks. Sure, some people may like it, but don't pretend it's not worse.

Most of the effects you mention that do blurring were invented specifically to hide the faults of low processing power. For example, HDR lighting blurs are made to hide the inability to properly trace light rays, cutting corners with static lighting instead. It's less processor-heavy; it's also worse-looking. Maybe some day our hardware will allow us to get rid of it.

Motion blur is indeed just blurring out things that change between frames, and its sole intention is to hide low framerate. That is why it exists in movies even though we have the technology to get rid of it. This is why some games use it - to hide the flaws. And flaws are objectively worse.

 
