Nvidia Ends Screen Tearing With G-Sync Display Technology

G-Sync module

Nvidia's new G-Sync technology promises to end once and for all the ugliness and hassles of screen tearing and sluggish frame rates.

You know how it goes: In-game images look best with v-sync turned on, but games can get sluggish and laggy, and if the GPU can't keep up with the monitor's refresh rate, stuttering becomes an issue too. Turning v-sync off offers the best performance, but then you have to deal with "screen tearing," where parts of two different frames appear on screen at once. In short, v-sync on looks best, v-sync off plays best, but neither is ideal.

Enter G-Sync, the new technomancy from the folks at graphics company Nvidia, which is actually built around a fairly simple idea. Conventional LCD monitors have fixed refresh rates, typically 60Hz, that the GPU must work around, but with G-Sync, a module inside the monitor hands control of the refresh timing to the GPU. Because the display adapter controls the timing, the two are always synchronized, eliminating screen tearing without sacrificing performance.
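To picture the difference, here's a toy comparison (illustrative numbers only, not Nvidia's actual implementation): say the GPU finishes a frame 20ms after a 60Hz panel last refreshed. With v-sync on it waits for the next refresh, with v-sync off it flips in mid-scanout and tears, and with G-Sync the panel simply refreshes the moment the frame is ready.

import math

REFRESH_MS = 1000.0 / 60.0        # a 60Hz panel refreshes every ~16.7ms
frame_done_ms = 20.0              # the GPU finishes its frame 20ms in (made-up number)

# V-sync on: hold the frame until the next refresh. No tear, but added lag.
next_refresh = math.ceil(frame_done_ms / REFRESH_MS) * REFRESH_MS
print("v-sync on : shown at %.1fms, waited %.1fms, no tear"
      % (next_refresh, next_refresh - frame_done_ms))

# V-sync off: flip immediately. No waiting, but the flip lands mid-scanout.
offset = frame_done_ms % REFRESH_MS
print("v-sync off: shown at %.1fms, waited 0.0ms, tear ~%d%% down the screen"
      % (frame_done_ms, round(offset / REFRESH_MS * 100)))

# G-Sync-style: the panel refreshes when the frame is ready, so there is
# nothing to wait for and nothing to tear.
print("g-sync    : shown at %.1fms, waited 0.0ms, no tear" % frame_done_ms)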

Tom Petersen, Nvidia's director of technical marketing, said G-Sync will be compatible with most GeForce GTX GPUs, but while the potential benefits are obvious, it likely won't be an easy sell to mainstream consumers. Unless I'm completely misunderstanding how it works, G-Sync will require specific monitors in order to operate, and it's unlikely that "average" PC users will be willing to fork over extra bucks for technology that has no bearing on them anyway. Is the anti-screen-tearing segment of the enthusiast market sufficient to support proprietary monitors? It's a great idea, but some pretty big questions remain unanswered.

Source: Nvidia

I'd certainly shell out for it if it works; I can only play with V-Sync on if I'm using a controller. The input lag on a mouse renders it unplayable for me, even if it's set to render only one frame ahead. I do prefer higher sensitivity than average, so it may be more apparent for me.

Compatibility with HDTVs would be the icing on a very sweet cake.

I don't see myself investing in a new monitor just for this purpose. Maybe when I get a new one for some other reason.

In the meantime, Adaptive V-Sync and framerate limiting seem to do the job when V-Sync slows up the game.

It's a good piece of tech, but I'd rather wait until it, or whatever else that is like it but not exactly the same, becomes mainstream.
BTW, Nvidia would be able to use AMD Mantle eventually, right? It's an open-source thing?

Say goodbye to the evil whore known as input lag then... V-sync is off in every game where I'm able to stand the screen tearing.

Unfortunately, it won't fix the strange problem some games have where you need to have V-sync on to continue with the game. The most prominent example is The Darkness 2.

A step in the right direction. I get very annoyed when my fairly powerful PC doesn't deliver.
But then again it's mostly due to games like WoW and Minecraft: one is an old engine, the other is "cheap" code.
Props to Mojang for doing a very decent job of updating it, but there's still much to do.

Still, I have a low tolerance for both low framerate and especially screen tearing, so this is a sign of a promising future.
Get that baby out there and I'll start looking for a kickass monitor.

Well, according to Steam 51.98% use nVidia cards (32.66% for AMD), and according to Anandtech it works with most nVidia cards.
People buy 120 and 144 Hz screens to avoid tearing (and overclockable 60 Hz IPS screens), so I am sure people are willing to cash out on something that makes visible tearing a thing of the past.
Pricing is everything in this case though, and if they opt for IPS or OLED the price might be too high, at least initially.

Edit: they should be out Q1 2014.

If it is only available as an integral component of a new monitor, it will take a while to get off the ground. I would absolutely buy it based on this feature, but in my case only if it's available in a monitor that's top-end in every other regard too.

It's also tragic that I just bought a top-end gaming widescreen monitor less than 6 months ago, so I couldn't justify upgrading to a G-Sync one any time soon. Plus my brand new gaming rig (as of Aug/Sept) has no issues with performance/tearing in anything I've tried so far :-) With V-Sync on and Triple-Buffering where available I have no issues in any game with everything at maximum settings (GTX 780 Classified).

But my next monitor purchase, certainly. It's a great benefit that directly improves the gaming experience; I'm all for it and would gladly spend money on it.

KingsGambit:

It's also tragic that I just bought a top-end gaming widescreen monitor less than 6 months ago, so I couldn't justify upgrading to a G-Sync one any time soon. Plus my brand new gaming rig (as of Aug/Sept) has no issues with performance/tearing in anything I've tried so far :-) With V-Sync on and Triple-Buffering where available I have no issues in any game with everything at maximum settings (GTX 780 Classified).

What resolution is it?

Not very useful but good to have, I guess. This is why I use D3DOverrider. It lets you limit your FPS and use triple buffering even with DirectX games, as opposed to just OpenGL. That pretty much does the job.

This is good and all, but will it fix the tearing issues with Silverlight and Netflix? No way to turn on a v-sync option for those applications.

Unless G-Sync chips catch on and monitor manufacturers buy the modules from Nvidia for direct incorporation. It's like a Trojan horse approach for getting chips onto every product manufactured worldwide, and then building the video cards to go along with it.

Nvidia: Has all the FLOPs

thiosk:
Unless G-Sync chips catch on and monitor manufacturers buy the modules from Nvidia for direct incorporation. It's like a Trojan horse approach for getting chips onto every product manufactured worldwide, and then building the video cards to go along with it.

Nvidia: Has all the FLOPs

ASUS, BenQ, Philips, and ViewSonic have already signed up for it.

Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?

But I tend to stop noticing the framerate at 50+, so my 60Hz refresh rate isn't an issue... no sell over here...

Bad Jim:
Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?

No, because V-synch synchs the GPU to the monitor. Triple buffering doesn't change the hardware specs.

Thanks but no thanks, I already have triple buffering turned on by default and I use MSI Afterburner to limit my FPS when necessary.

lacktheknack:

Bad Jim:
Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?

No, because V-synch synchs the GPU to the monitor. Triple buffering doesn't change the hardware specs.

But why do we want to change the hardware specs? Triple buffering already eliminates tearing without hurting your fps, and you can use it right now. How does this G-Sync make our gaming experience better if we have already turned on triple buffering?

Bad Jim:

lacktheknack:

Bad Jim:
Why not triple buffering? Doesn't that solve the same problem, while also being possible on pretty much every video card released in the last two decades?

No, because V-synch synchs the GPU to the monitor. Triple buffering doesn't change the hardware specs.

But why do we want to change the hardware specs? Triple buffering already eliminates tearing without hurting your fps, and you can use it right now. How does this G-Sync make our gaming experience better if we have already turned on triple buffering?

You're misinformed.

Triple buffering does not fix screen tearing.

Tearing happens when (for example) a GPU fires 70 frames in one second at a screen that can only output 60 Hz. This means that before the screen is even done drawing the first frame, it's already started drawing a new one at the top. When things are moving, you're looking at two different frames at once (or more, if the frame rate goes to 120 FPS or higher). The place where the two frames are smushed together is a screen tear.
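Rough numbers for that 70-into-60 example, just to picture it (toy arithmetic, nothing more):

scan_ms = 1000.0 / 60.0     # one 60 Hz scanout takes ~16.67ms
frame_ms = 1000.0 / 70.0    # the GPU finishes a new frame every ~14.29ms

t = 0.0
for n in range(1, 7):
    t += frame_ms
    frac = (t % scan_ms) / scan_ms   # how far down the screen the flip lands
    print("frame %d: tear line about %d%% of the way down" % (n, round(frac * 100)))

That spot is different every refresh, which is also why the tear line crawls instead of sitting still.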

http://en.wikipedia.org/wiki/Multiple_buffering

What buffering does is allow the PROGRAM to do its thing however fast it wants while allowing the GPU to finish what it's doing before using the next set of instructions. This means that the software and GPU are separate. If you play an old game (particularly a DOS game) with no buffering on a modern computer, you may notice that the animations go insanely fast. That's from a lack of buffering.

Inversely, if the software is having momentary issues, and not sending out any drawing instructions, the GPU can take a previous set of instructions from the buffer and work on them instead while it waits.
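Side note on the old-games-running-insanely-fast thing: here's a toy sketch of a game loop that steps a fixed amount per frame versus one that steps by real elapsed time (illustration only, not any particular engine):

for fps in (30, 60, 300):
    dt = 1.0 / fps                # seconds per frame at this frame rate
    per_frame = 0.0
    time_based = 0.0
    for _ in range(fps):          # simulate one second of frames
        per_frame += 1.0          # fixed step per frame: speed scales with fps
        time_based += 100.0 * dt  # 100 units/second regardless of fps
    print("%3d FPS: per-frame step moved %5.0f units, time-based moved %5.0f"
          % (fps, per_frame, time_based))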

EDIT: Added extra.

As long as you don't use V-Sync and your FPS is higher than your screen output, you'll get screen tearing. The only fix was V-synch. Buffering doesn't even acknowledge the monitor's existence.

G-Synch is a chip (if I'm reading this right) that overrides the screen's refresh rate by handing control to the GPU. It... seems a bit dangerous, really. I imagine that you'd need specialized monitors to avoid explosions, but if that's the case, why not just invest in a 120Hz monitor and be done with it?

Well, I've only been saying refresh rates on LCDs are the dumbest idea in human history since they started selling them, but hey, better late than never.
Sadly, this being yet another proprietary standard makes for a very shit solution; hopefully this sparks some work on creating a proper standard for general use.

Oh, and triple buffering is a system to solve inherent problems with graphics cards; synchronizing with your screen is a completely separate matter.

lacktheknack:

G-Synch is a chip (if I'm reading this right) that overrides the screen's refresh rate by handing control to the GPU. It... seems a bit dangerous, really. I imagine that you'd need specialized monitors to avoid explosions, but if that's the case, why not just invest in a 120Hz monitor and be done with it?

You are reading it right; however, at 90+ Hz G-Sync will give other benefits as well according to Carmack, though he didn't specify what they are.

Boris Goodenough:

KingsGambit:

It's also tragic that I just bought a top-end gaming widescreen monitor less than 6 months ago, so I couldn't justify upgrading to a G-Sync one any time soon. Plus my brand new gaming rig (as of Aug/Sept) has no issues with performance/tearing in anything I've tried so far :-) With V-Sync on and Triple-Buffering where available I have no issues in any game with everything at maximum settings (GTX 780 Classified).

What resolution is it?

1080p on a 23" Eizo Foris. It's IPS, so better quality than standard TN displays, but with the 8ms response time I'd say is a bare minimum for gaming (fastest IPS I could find). I'm not sure if I would've been better off with a 120Hz TN screen with 2ms response or the like, but this one's great. One should hope so for the price :-)

Boris Goodenough:

lacktheknack:

G-Synch is a chip (if I'm reading this right) that overrides the screen's refresh rate by handing control to the GPU. It... seems a bit dangerous, really. I imagine that you'd need specialized monitors to avoid explosions, but if that's the case, why not just invest in a 120Hz monitor and be done with it?

You are reading it right; however, at 90+ Hz G-Sync will give other benefits as well according to Carmack, though he didn't specify what they are.

Well, between this and 3D-Vision still being a thing, they just might get me to buy a better monitor. They'll have to tell me what the other benefits are, though, and they'll have to be reaaaaaal good.

There's no reason this tech can't or wouldn't be "backwards compatible", i.e. monitors with G-Sync would still work with cards that don't support it, the same way they do now. This isn't a "revolution" in PC gaming, but it is a solution to an old problem. On that basis alone I think it's a great idea.

lacktheknack:

Bad Jim:

lacktheknack:

No, because V-synch synchs the GPU to the monitor. Triple buffering doesn't change the hardware specs.

But why do we want to change the hardware specs? Triple buffering already eliminates tearing without hurting your fps, and you can use it right now. How does this G-Sync make our gaming experience better if we have already turned on triple buffering?

You're misinformed.

Triple buffering does not fix screen tearing.

You're misinformed. Triple buffering does indeed fix tearing. The whole idea of having three buffers is that you have enough for one being displayed, another ready to be displayed pending a vertical retrace, while the third can be freely drawn on. It really does give you the best of both worlds, and you only need to enable it.

http://www.anandtech.com/show/2794/2
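Rough sketch of that rotation, if it helps (toy code, not an actual swap chain):

front = 0        # buffer currently being displayed
ready = None     # newest completed frame, waiting for the next retrace
spare = [1, 2]   # buffers the renderer may draw into

def frame_finished():
    # The renderer completed a frame; it becomes the pending 'ready' buffer.
    global ready
    buf = spare.pop(0)
    if ready is not None:
        spare.append(ready)      # an older pending frame simply gets dropped
    ready = buf

def vertical_retrace():
    # The monitor retraces; flip to the newest complete frame, if any.
    global front, ready
    if ready is not None:
        spare.append(front)
        front, ready = ready, None
    print("scanning out buffer %d" % front)

frame_finished(); frame_finished()   # GPU finishes two frames before one retrace
vertical_retrace()                   # only the newest one is shown - no mid-scanout flip
frame_finished()
vertical_retrace()

The renderer never waits and the panel never flips mid-scanout; the trade-off is that a frame can get thrown away without ever being shown.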

Charcharo:
It's a good piece of tech, but I'd rather wait until it, or whatever else that is like it but not exactly the same, becomes mainstream.
BTW, Nvidia would be able to use AMD Mantle eventually, right? It's an open-source thing?

Yes, they most certainly could make use of it and ditch DirectX altogether. Whether or not they will remains to be seen.

I really hope they do, as forcing Microsoft out of the scene and quasi-standardizing PC hardware would do great things for the market, but there's plenty of reason for Nvidia to want to stay away from it (or even just make their own).

Bad Jim:

lacktheknack:

Bad Jim:

But why do we want to change the hardware specs? Triple buffering already eliminates tearing without hurting your fps, and you can use it right now. How does this G-Sync make our gaming experience better if we have already turned on triple buffering?

You're misinformed.

Triple buffering does not fix screen tearing.

You're misinformed. Triple buffering does indeed fix tearing. The whole idea of having three buffers is that you have enough for one being displayed, another ready to be displayed pending a vertical retrace, while the third can be freely drawn on. It really does give you the best of both worlds, and you only need to enable it.

http://www.anandtech.com/show/2794/2

Then what the heck is Wikipedia going on about?

Lemme try this.

<opens Titan Quest>

<disables V-Sync and Triple Buffering>

Result: barely noticeable screen tearing, physics animations run immensely quick.

<re-enables Triple Buffering>

Result: No change.

?????????

<re-enables V-Sync>

Result: Physics speeding and screen tearing are gone.

Well, this just raises way more questions than it answers.

There was a time, before GPUs, when frames were simply rendered to coincide with screen refreshes, without wasting cycles on x intermediate frames that you'll never see in full anyway...

This will be a feature like Nvidia LightBoost for 3D gaming: monitors can come equipped with the technology, but it doesn't need to be used. I bought an Asus monitor because it had a built-in 3D IR emitter and LightBoost technology so I could game in 3D. It's just another little feature people can add to monitors, and this doesn't even seem like much of a chore to integrate into future products. Sounds easy enough to me that it'll just come with most gaming monitors, but it doesn't have to be used.

lacktheknack:
Lemme try this.

<opens Titan Quest>

<disables V-Sync and Triple Buffering>

Result: barely noticeable screen tearing, physics animations run immensely quick.

<re-enables Triple Buffering>

Result: No change.

?????????

<re-enables V-Sync>

Result: Physics speeding and screen tearing are gone.

Well, this just raises way more questions than it answers.

I just found out why.

"Enabling Triple Buffering for OpenGL-based games such as Doom 3, Quake 4, Prey or Enemy Territory: Quake Wars is very simple - go to your graphics card's control panel and enable it from there. However this won't work for enabling Triple Buffering in Direct3D-based games, which are the bulk of modern games. Instead, you will need to use a utility called Direct3D Overrider (D3DOverrider) which comes with free RivaTuner utility."

http://www.tweakguides.com/Graphics_10.html

I don't know why they make it so damn hard to use triple buffering, but that's how you do it.

Bad Jim:

lacktheknack:
Lemme try this.

<opens Titan Quest>

<disables V-Sync and Triple Buffering>

Result: barely noticeable screen tearing, physics animations run immensely quick.

<re-enables Triple Buffering>

Result: No change.

?????????

<re-enables V-Sync>

Result: Physics speeding and screen tearing are gone.

Well, this just raises way more questions than it answers.

I just found out why.

"Enabling Triple Buffering for OpenGL-based games such as Doom 3, Quake 4, Prey or Enemy Territory: Quake Wars is very simple - go to your graphics card's control panel and enable it from there. However this won't work for enabling Triple Buffering in Direct3D-based games, which are the bulk of modern games. Instead, you will need to use a utility called Direct3D Overrider (D3DOverrider) which comes with free RivaTuner utility."

http://www.tweakguides.com/Graphics_10.html

I don't know why they make it so damn hard to use triple buffering, but that's how you do it.

You're telling me that the in-game option for triple buffering doesn't even work because the game uses Direct3D? And I have to use a completely different tool just to get it working?

DAMMIT MICROSOFT!

The idea is great but the execution is lacking. One, I already have a high-end, IPS, 27" 2560x1440 monitor and I am not going to downgrade for this type of thing.
If it was something that could interface with any monitor, that would be better, but I understand why that may not work.
Two, I currently own an AMD card, so now I would have to buy two products, an Nvidia GPU and a new monitor. While I understand why this is Nvidia-only, Nvidia is the king of great ideas killed by their proprietary nature; it just limits the market for this technology so much as to render it a non-starter in gaming.

If nVidia are smart, they'll lease it out to any manufacturer for chump change to get it out there as something they can exploit over the long term. So long as they don't make it an overpriced proprietary technology like Apple would, and hobble it on the way out the door, this should be very good for nVidia.

I don't see why it shouldn't be worked into the norm. I'd certainly like the best of both worlds.

Just keep V-sync on and push your monitors to the best refresh rates they can handle (60 isn't the limit nowadays, more like the bare minimum) and you won't see any tearing and won't need any extra chips. Seriously, what exactly is the point of not having v-sync? Larger numbers you can boast about? There are no downsides to v-sync.

 
