SMiThaYe

G-Sync
Nvidia G-Sync News and Discussions

22 posts in this topic


 

Thought we'd already have a G-Sync thread but nope! Anyway, quick thread created; all gossip and news goes here. I know it's a monitor feature and not strictly a GPU, but it does work via the GPU, so there ;)

 


 

Links to check out:

  • VideoCardz post covering the announcement [18th Oct 2013]
  • Official Nvidia PR [18th Oct 2013]
  • Official Nvidia website
  • BlurBusters on backlight news [19th Oct 2013]
  • BlurBusters preview [12th Dec 2013]
  • Linus video preview [14th Dec 2013]
  • And as a bonus this is the detailed photo of the G-Sync PCB with SK Hynix modules

 

System Requirements for NVIDIA G-SYNC enabled monitors

  • GPU:
    - G-SYNC features require an NVIDIA GeForce GTX650Ti BOOST GPU or higher.
            - GTX TITAN
            - GTX 780 Ti
            - GTX 780
            - GTX 770
            - GTX 760
            - GTX 690
            - GTX 680
            - GTX 670
            - GTX 660 Ti
            - GTX 660
            - GTX 650 Ti Boost
  • Driver:
    - R331.58 or higher
  • Operating System:
    - Windows 8.1
    - Windows 8
    - Windows 7
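
Not an official Nvidia tool, but for anyone curious, here's a minimal sketch of checking a system against the list above using nvidia-smi. The GPU list and the R331.58 driver floor are copied from this post; everything else (the parsing and matching) is my own assumption.

```python
# Rough sketch (not an Nvidia tool): check the reported GPU and driver
# against the requirements listed above. Assumes nvidia-smi is on PATH;
# the supported-GPU list and driver floor are taken from this post.
import subprocess

SUPPORTED_GPUS = {
    "GTX TITAN", "GTX 780 Ti", "GTX 780", "GTX 770", "GTX 760", "GTX 690",
    "GTX 680", "GTX 670", "GTX 660 Ti", "GTX 660", "GTX 650 Ti Boost",
}
MIN_DRIVER = 331.58

def check_gsync_ready() -> bool:
    # Query the GPU name and driver version, e.g. "GeForce GTX 780 Ti, 331.65"
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,driver_version",
         "--format=csv,noheader"], text=True)
    name, driver = [field.strip() for field in out.splitlines()[0].split(",")]
    gpu_ok = any(model.lower() in name.lower() for model in SUPPORTED_GPUS)
    driver_ok = float(driver) >= MIN_DRIVER
    return gpu_ok and driver_ok

if __name__ == "__main__":
    print("G-Sync requirements met" if check_gsync_ready() else "Not supported")
```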

I've not posted everything on this because it's stupid o'clock in the morning but feel free to post what you have.

 

I would like to see a comparison of, say, the BenQ XL2420T 144Hz 1ms monitor pitted against the ASUS VG248QE (Linus says there is no G-Sync exclusivity deal with Asus, though I did read rumours saying otherwise), which also features 144Hz and 1ms response, to see if it's comfortable on the eyes for prolonged periods. In any case, without G-Sync I never see screen tear and can play most titles with vsync. Compared with my laggy 60Hz 6ms Dell IPS panel, I experience little to no mouse lag. I have always noticed these little changes, but then I am a seasoned gamer.

 

The tradeoff in IQ is worth it for shooters, where the biggest difference is having a fast panel in the first place, but that's not what everyone wants. We'll not see any in-depth reviews for at least a month, but even then it's hard to convey the differences G-Sync makes and impossible to prove in an article unless we figure out another clever FCAT alternative, which is very unlikely. Maybe you'll be lucky and able to test a panel on compatible hardware with a title you've regularly played, like Luke did with AC4.

 

Once we get more reviews, let us know your opinion of G-Sync.


I don't see why this couldn't be supported on any Nvidia GPU (or AMD GPU if AMD was willing to write drivers for this kind of thing, ofc).

Stupid shit to sell more high end cards.

The feature is good but, just like ShadowPlay etc., it is crippled in ways that force you to buy a high-end Nvidia card.


Cool tech, stupidly annoying limitations.

EDIT: Edited out PhysX as a "good feature but forced to buy a high-end Nvidia card" example, as any (new) Nvidia card can do PhysX, iirc.


I see this feature as a bonus, and like any bonus it's not for the majority of people.

IMO playing any game with V-Sync at 60fps on a 60Hz monitor is more than enough for a smooth gameplay experience, but it's 2013 and some people now use 120Hz and 144Hz monitors, which require high-end CFX or SLI setups to push frames above 60 in the latest titles; sometimes even two GPUs are not enough to achieve that.
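
Just to illustrate the mechanics being argued about: with double-buffered V-Sync, a frame that misses a refresh waits for the next tick, so delivery snaps to 16.7/33.3/50ms steps, whereas a variable-refresh display shows the frame as soon as it's done (within its supported range). This is a rough toy model with made-up numbers, not how either driver actually works:

```python
# Toy comparison (my own illustration): double-buffered V-Sync on a 60 Hz
# panel vs. a variable-refresh display in a 30-144 Hz window.
REFRESH_MS = 1000 / 60                                # 16.67 ms per 60 Hz tick
VRR_MIN_MS, VRR_MAX_MS = 1000 / 144, 1000 / 30        # assumed refresh window

render_times_ms = [14, 15, 18, 22, 17, 35, 16]        # made-up per-frame GPU times

def vsync_interval(render_ms: float) -> float:
    # The frame waits for the next refresh tick it can make: 16.7, 33.3, 50 ms...
    ticks = int(render_ms // REFRESH_MS) + 1
    return ticks * REFRESH_MS

def vrr_interval(render_ms: float) -> float:
    # Displayed as soon as it's ready, clamped to the panel's refresh window.
    return min(max(render_ms, VRR_MIN_MS), VRR_MAX_MS)

for r in render_times_ms:
    print(f"render {r:4.1f} ms -> vsync shows after {vsync_interval(r):5.1f} ms, "
          f"variable refresh after {vrr_interval(r):5.1f} ms")
```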

I believe that G-Sync is a good technology in the end, but just that, nothing too revolutionary, mainly due to the restrictions on supported GPUs and monitors.

A bit off topic: let's see what AMD will provide us with the Mantle API. Yes, it's a bit less of a bonus compared to G-Sync, because some current and future games will support the new API, and besides, I heard that Nvidia GPUs will also be compatible with it, which is really good for everyone. ;)


If AMD makes Mantle open and Nvidia decides to write code to support it, yes.

Expect this in 2015 at the earliest, if ever.

Anyhow, G-Sync is something that could be really useful in a large class of machines that are ignored... laptops. High-end gaming laptops would eat this up. If you are spending $2.5K+, the extra $200 or whatever it costs to ensure you always have playable framerates (well, unless it drops under 30fps, iirc) would be a no-brainer.

Of course, Nvidia has surely done more market research than me, so I assume they did what makes the best biz sense.

-Q


Not sure G-Sync will be as 'open' as Mantle. At least with Mantle it's possible for Nvidia to write code for and utilise it, though it remains restrictive: Mantle is built around AMD's architecture, so AMD keeps the better capabilities and potential and Nvidia would be way behind. As G-Sync relies on drivers and feedback from the monitor to control what you see, it makes things quite difficult for AMD. The focus could change if more of the market bought G-Sync capable monitors and there was far better choice than the single panel currently available.

 

Testbug, it doesn't require you to have a high-end GPU; the GTX 650 Ti Boost (March 2013) is a midrange GPU that can manage most games up to 1080p, albeit with settings a couple of notches down from maximum unless it's an older title or an undemanding indie title. The inherent highs and lows in framerate that create screen and mouse lag are what G-Sync is meant to solve, and it works just as well for midrange as it does for high-end GPUs.

 

It could be the case, if proven, that it works better on 144Hz 1ms panels with a couple of 780 Tis than with one, due to the higher variance between maximum and minimum fps (plus higher frametimes) affecting the feeling of smoothness, which a couple of GPUs can spoil. Even AMD's promised frame-pacing drivers didn't fix the high frametimes; expect a second update in February 2014. Mantle is meant to improve GPU efficiency and help with, although not solve, the frametimes and the inefficiency and extra overhead DX has versus the new API. Until we know for sure, the best bet is to wait a while longer yet to see the pros and cons of either tech.
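
On the variance point, the way reviewers usually quantify "feels smooth" is with percentile frametimes rather than average fps. A quick sketch with made-up numbers shows why the average hides the spikes:

```python
# Sketch of the smoothness metrics reviewers lean on (illustration only,
# made-up capture data): average fps hides spikes, so look at the
# 99th-percentile frametime and the frametime spread instead.
import statistics

frametimes_ms = [7, 7, 8, 7, 30, 7, 8, 7, 7, 28, 7, 8]   # made-up capture

avg_fps = 1000 / statistics.mean(frametimes_ms)
p99 = sorted(frametimes_ms)[int(0.99 * (len(frametimes_ms) - 1))]

print(f"average fps:           {avg_fps:.0f}")
print(f"99th percentile frame: {p99} ms (~{1000 / p99:.0f} fps felt during spikes)")
print(f"frametime stdev:       {statistics.pstdev(frametimes_ms):.1f} ms")
```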


I'm excited about this technology; I'm hoping it will come to high-resolution IPS panels as well.


is a GTX 650 ti faster than a 470, 480, 560ti, 570, 580?

Nope.

There are no changes from Fermi to Kepler that would enable G-Sync to work. The time needed to design the product plus Kepler's release date doesn't match up.

That, or Nvidia decided to delay G-sync until they needed a product announced, in which case Nvidia is being horrible towards the market.


is a GTX 650 ti faster than a 470, 480, 560ti, 570, 580?

Nope.

I was responding to you saying "It is crippled in ways that force you to buy a high end Nvidia card.", hence my mention of the GTX 650 Ti. I did not say this mid-range GPU was faster than any other GPU, and no, it isn't faster than those.

 

There are no changes from Fermi to Kepler that would enable G-Sync to work. The time needed to design the product plus Kepler's release date doesn't match up.

That, or Nvidia decided to delay G-sync until they needed a product announced, in which case Nvidia is being horrible towards the market.

Nvidia admits G-Sync was several years in the making, and it's wrong that only a small selection of GPUs has been approved to work with G-Sync. The fact that Nvidia or users can disable G-Sync for capable titles via the control panel must surely mean that older architectures will be added, and that the small selection is there to make it easier to roll out and troubleshoot. Otherwise it's about adding value to existing products to justify the high price tag (think EVGA or Asus for a second) and the dependence people have on trusting their drivers to work better, with fewer issues, than the competitor's (don't think Asus there, because their support is very patchy in my experience and that of others I know).

 

Comparative G-Sync performance via Fraps output is confirmed to work and can be read by FCAT systems :) Reviewers already have the expensive and complex FCAT setups in place, so it should be straightforward for those guys.
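
For anyone wanting to try it themselves, this is roughly how you'd turn a Fraps frametimes log into per-frame intervals for that kind of comparison. I'm assuming the usual Fraps "frametimes" CSV layout (a frame index column plus a cumulative time-in-ms column); the filename is made up:

```python
# Rough sketch: convert a Fraps "frametimes" log (cumulative timestamps in ms)
# into per-frame intervals, the same data FCAT-style analysis is built on.
# Assumes a two-column layout: Frame, Time (ms). The file name is made up.
import csv

def load_frame_intervals(path: str) -> list[float]:
    with open(path, newline="") as f:
        rows = list(csv.reader(f))
    timestamps = [float(row[1]) for row in rows[1:]]          # skip header row
    return [b - a for a, b in zip(timestamps, timestamps[1:])]

intervals = load_frame_intervals("bf4_gsync frametimes.csv")
print(f"frames: {len(intervals)}, "
      f"avg fps: {1000 * len(intervals) / sum(intervals):.1f}, "
      f"worst frame: {max(intervals):.1f} ms")
```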


IMO Nvidia brings something "new" to the market, and it needs to be polished :)


Read this earlier today: first impressions of G-Sync from pcper, which note that below 30fps V-Sync kicks in and 'may' be lowered to 20fps once released to the general public. There will be no modding options for panels other than the ASUS VG248QE for a long time, due to how unique each panel's internals are. Above 30fps the lag isn't noticeable at all and mouse movement is smooth with no screen tearing. I'd be sold on this, but my 1ms 144Hz screen doesn't feel laggy at all until the GPU struggles with BF4 on Ultra with the resolution scale at 200% (aka 4K), where at around 33fps it's not unplayable with one GPU. Performance is more of an issue with the 780 Ti than the screen, and SLI isn't without its own set of problems if you think that will solve it; it only acts as a plaster. Even if G-Sync above 30fps felt smoother, for a shooter you'd want to aim for a smooth 60fps minimum by reducing in-game options.

 

http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sync-DIY-Upgrade-Kit-Installation-and-Performance-Review


At CES 2014 (which has been great so far for new product announcements) BenQ's Bob Wudeck revealed two new 1080p gaming TN monitors with Nvidia G-Sync to reduce image tearing, stuttering and latency: the XL2420G and XL2720G.

They include the usual features such as Flicker Free (reduces headaches and eyestrain during prolonged gaming sessions) and Black Equalizer (lightens very dark areas only, without washing out the image, to reveal enemies). They are based on 1ms grey-to-grey 144Hz panels, and with the G-Sync tech the refresh rate now varies from 30Hz to 144Hz via the only port available apart from USB 2.0 and a headphone jack: DisplayPort 1.2.

  • "G-Sync Mode for smooth, low-latency gaming; Low Motion Blur Mode for CRT-like sharpness of moving objects; and a full-featured 3D Vision Mode to create more lifelike on-screen action."

Low Motion Blur Mode works using backlight strobing, and BlurBusters explains it very well; it's probably best used if you have enough GPU grunt. For 3D Vision you need to purchase the Nvidia 3D Vision kit separately.
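
Rough idea of why strobing helps, with back-of-envelope numbers (the pulse width is my own assumption, not a BenQ spec): on a normal sample-and-hold panel each frame stays lit for the whole refresh interval, while a strobed backlight only flashes it briefly.

```python
# Back-of-envelope persistence comparison (illustration only): sample-and-hold
# vs. a strobed backlight. The 1.5 ms strobe pulse is an assumed figure.
for refresh_hz in (60, 120, 144):
    hold_ms = 1000 / refresh_hz            # sample-and-hold persistence per frame
    strobe_ms = 1.5                        # assumed strobe pulse length
    print(f"{refresh_hz:>3} Hz: hold {hold_ms:4.1f} ms vs strobe {strobe_ms} ms "
          f"(~{hold_ms / strobe_ms:.0f}x less persistence blur)")
```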

If you are wondering why a 1440p model has not been announced too, G-Sync at that resolution won't work with DP 1.2 due to insufficient bandwidth, and we must wait until Q2 2014 for DP 1.3. Thanks to omega, who mentioned that Asus have a 1440p DisplayPort monitor planned for the coming months: the ROG Swift PG278Q. It doesn't say which DP version it uses, and I'll have a small bet that it will be 1.3. On paper it's very appealing for gaming, but you need to change from one of the gaming modes to sRGB for browsing, etc. I mention this because it's got the same features as my monitor apart from G-Sync, and you should get used to changing modes.


Only works with one monitor *yawns*

Cash grab by Nvidia.

-Q


Biased to the US region again, same with EVGA US... I can't get my $20 780 Ti Classy backplate from EVGA Europe at all, BUT they stock it in the US!

 

[Screenshot: Nvidia store listing showing a single CAD render of the G-Sync DIY kit]

 

This is the grand scheme of Nvidia: if this were eBay, would that single CAD render make you want to enter your card details? Where's the rest of the kit? :P

Joking aside, this is a review video of the box that I quickly found, from someone who unboxes with one hand and no tripod, doh!

 


ATH M50 ^^ :D

The pretension is nice, as expected of NVidia.

The module seems a bit larger than I thought it would be (of course, it goes into a monitor, so it doesn't matter).

The box could be a bit smaller. Also, what is the power supply for? Doesn't the module draw from the monitor's power?

If it needs its own PSU, well... I think G-Sync needs to die until it doesn't need a PSU for itself (I don't think it actually does).

-Q


Could be that, because of the Energy Star rating, the monitor doesn't provide enough power, but either way it's not sleek at all and is something you'd expect from 3rd party vendors.

 

It's not the case that this needs to disappear, but the rest (ASUS, BenQ, AOC, Philips, ViewSonic) need to come to market already with G-Sync ready monitors, because this kit is too expensive. Worst case, those who already have the monitor should put it on their wishlist and buy it when it's nearly half price. Ideally Nvidia should drop the huge premium and sell it at cost, because you need an Nvidia GPU anyway and it feeds back into their ecosystem. If a 1440p monitor sold for a $70 premium I could do that, then $40 for the huge selection of 1080p panels. It ties you into Nvidia with all their control, and they know you'll have it longer than any GPU. Remember, having this kit deactivates image quality settings, so it's not simply a case of not using G-Sync (third party apps will resolve this), plus you can only use DP for now.

 

How much does all this sound like $$$ Apple! :(



Here is a copy and paste of a comment I made on an article:

TL;DR: Nvidia is milking the market for money, for good biz reasons. As for others coming with G-Sync... why should they spend a few hundred dollars on something they can implement for free by complying with standards?

It doesn't make biz sense for them.

 

As for Nvidia not thinking of this... Well, Nvidia must have thought about DirectX support long and hard too (they don't fully support all the newest DirectX features: 11.2 for sure, and I think parts of 11.1 might not be supported).

They don't support them due to the building blocks and design decisions put into their GPUs.

They probably cannot support the VESA standard with Kepler GPUs.

Or they choose not to, which does make sense given their market position. The HPC/server/workstation market is growing, but their marketshare is falling rapidly (the new Mac Pro going AMD will make that even worse). Consoles, which generated a healthy 100-150 million a year for Nvidia, are on the way out. Tegra is losing money, and they don't appear to have enough customers to support it for more than a few more iterations. The consumer GPU market is shrinking (this affects AMD too, but APUs offset a bit of that).

The only growing markets are mid-to-high-end GPU sales and the enthusiast market. The decision to take as much money from that market as possible would allow them to stay profitable longer and more easily pay for R&D on GPUs, hopefully eventually subsidising them (currently the low-end market subsidises them, but APUs and Intel's strong iGPUs are eating that market) so they can continue on strong.

*shrugs* Nvidia has a far stronger near term than AMD, but look past 5 years and AMD will be far stronger (a year ago I would have said "if AMD survives that long", but now I am certain they will).

 

Nvidia is hitting troubles; they might consider making Tegra [NDA'd information here]. Well, not a real NDA, more of a "don't tell anyone or I die" from someone who has gotten me nice statistics (that I could actually use) from TSMC/Nvidia/AMD before.

Note: "Consoles" should have said "50 to 150 million [USD] a year."

-Q


Yo, I've got some videos comparing G-Sync with V-Sync On and V-Sync Off:

Enjoy! B)

 

http://www.youtube.com/watch?feature=player_detailpage&v=Sy257BQyDus

 

http://www.youtube.com/watch?feature=player_detailpage&v=ZSgHqImxQpE

 

 


Thanks Skr :)

 

The second video I've seen demoed before; it's designed by Nvidia to show the effect, because they can control the framerate in real time. It works for SLI setups, where tearing is more pronounced.

 

The first video nicely shows how some people have been playing games with poor IQ, which increases eye strain. A smooth image is vital for all games, like RTS and casual titles, not just online shooters where input lag means you completely lose frames. On my Dell I put up with tearing for a couple of years, then decided enough was enough and got this 144Hz panel. I would never go back to 60Hz ever again, and yes, that includes 4K monitors, which are a mixture of 30Hz (don't touch these, because they use tiling, which breaks the screen layout for apps/games, and they are very laggy) and 60Hz, which isn't too laggy.

