Report: Nvidia G-Sync Exclusive to Asus Until Q3 2014

November 1, 2013 6:20:31 AM

Actually, it's the other way around. As with many exclusive technologies such as PhysX and Mantle, customers have been reluctant to adopt it. Asus is the only one to get on board with the G-Sync technology; the other competitors in the monitor market want to wait and see whether it really makes a significant difference to the gaming experience. The market is tight, and no company wants to raise the price of its monitors for G-Sync unless it delivers as promised. My guess: it will fade away, just like 3D monitors, PhysX, etc.
Score
2
November 1, 2013 6:23:01 AM

They should license this tech to everybody and advance the state of the art.
Score
4
November 1, 2013 6:23:28 AM

Personally I don't have a problem with this as I prefer my Asus screens. On the other hand, if you want a new technology to be adopted into the tech world, you don't want to make it exclusive...
Score
17
November 1, 2013 6:25:09 AM

PhysX didn't fade away. Nvidia bought Ageia and then implemented PhysX on their GPUs.
Score
2
November 1, 2013 6:32:42 AM

Great, if people dig this idea, there's a bigger likelihood that the companies left behind (everyone besides Nvidia and Asus) will develop an alternative open standard.

Death to proprietary.
Score
5
November 1, 2013 7:22:59 AM

Or, you know, you could just use vsync. Input lag is non-existent if you're already using a 120hz screen.
Score
-13
November 1, 2013 7:24:35 AM

rwinches said:
They should license this tech to everybody and advance the state of the art.


You're talking about Nvidia.
Score
-4
November 1, 2013 7:45:12 AM

There is an incorrect statement in this article:

"G-Sync is a technology that fixes screen tearing in Kepler-based games."

The author probably meant to say "Kepler based video cards". There are no such things as kepler-based games. Kepler is the architecture of Nvidia's current line of video cards and has nothing to do with the games themselves. This isn't something that will have to be programmed into each game; it will work with all games as it will be built into the video card itself.

As it is, the article makes it seem like only certain games will support this, but in reality the games won't have to support it and it will regulate frame rate regardless of what you're doing.

G-sync will work with all games on Kepler video cards (GTX 600 and 700 series), and future cards as well probably.
Score
9
November 1, 2013 7:51:35 AM

@Steveymoo, for performance reasons, if you have v-sync on you probably want triple buffering, meaning you are always at least 25 ms behind. Assuming you want v-sync to avoid tearing, switching to G-Sync solves the tearing problem, stops the judder problem on unsynched frames, and reduces your input lag by 33% . . . What's not to like?
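
Rough back-of-the-envelope for where those numbers come from (my own arithmetic, assuming a 120 Hz panel, a three-frame-deep pipeline with triple-buffered v-sync, and a G-Sync-style path that drops one whole buffering stage; actual pipeline depth varies by game and driver):

# Latency sketch for the 25 ms / 33% figures above (illustrative assumptions only).
refresh_hz = 120
frame_ms = 1000 / refresh_hz                 # ~8.33 ms per refresh at 120 Hz

vsync_triple_ms = 3 * frame_ms               # render -> queued buffer -> scanout: ~25 ms
gsync_ms = 2 * frame_ms                      # one buffering stage removed: ~16.7 ms

reduction = 1 - gsync_ms / vsync_triple_ms   # ~0.33
print(f"triple-buffered v-sync: {vsync_triple_ms:.1f} ms")
print(f"g-sync-style path:      {gsync_ms:.1f} ms")
print(f"input-lag reduction:    {reduction:.0%}")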
Score
2
November 1, 2013 8:31:27 AM

Change my GPU every 2 years? OK. I can use Mantle and TrueAudio, or PhysX, if developers adopt them.
Change the 3 ASUS screens that I just bought, which are not compatible with G-Sync? Not the same goddamn ballpark. People stick with their screens. This is going to be hard to force adoption on the market, no matter how cool the technology is.
Score
0
November 1, 2013 8:38:28 AM

I agree with the last part of the article and most of the first comment. Besides, I'm tired of tech that ties you to specific hardware (be it Nvidia or AMD), and I'm still doubtful about the price they will charge for G-Sync monitors.
P.S. PhysX dedicated hardware was an unsuccessful experiment, but the acquisition by Nvidia meant both lots of money for Ageia and widespread adoption.
Score
0
November 1, 2013 9:06:34 AM

Steveymoo said:
Or, you know, you could just use vsync. Input lag is non-existent if you're already using a 120hz screen.

Traciatim said:
@Steveymoo, for performance reasons, if you have v-sync on you probably want triple buffering, meaning you are always at least 25 ms behind. Assuming you want v-sync to avoid tearing, switching to G-Sync solves the tearing problem, stops the judder problem on unsynched frames, and reduces your input lag by 33% . . . What's not to like?

With a card able to hold 120 fps you will not get page tearing, but as soon as it drops you will fall all the way to 60 fps without triple buffering.

The idea with G-Sync is that your monitor adjusts its refresh to match the video card (even as the frame rate changes) instead of the card trying to match the monitor (something that does not always go over well). This allows the timing to match ALL the time, even at an oddball frame rate like 50 or 90; even with a 60 fps frame cap, a 60 Hz screen can page tear because the card and screen do not refresh the image at the same time.
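
A toy model of that timing mismatch, just to illustrate the point (my own sketch with made-up frame times, not anything from Nvidia):

# On an unsynchronized fixed-rate panel, a buffer flip that lands partway
# through a scanout splits the screen between two frames; the tear line sits
# at that fraction of the screen height. Frame times below are invented.
refresh_ms = 1000 / 60   # fixed 60 Hz panel scans out every ~16.7 ms

def tear_height(flip_ms):
    return (flip_ms % refresh_ms) / refresh_ms

for flip in (5.0, 23.4, 41.0, 58.1, 76.9):   # even near-60 fps output drifts out of phase
    print(f"flip at {flip:5.1f} ms -> tear line {tear_height(flip):.0%} of the way down the screen")

# A variable-refresh (G-Sync-style) panel starts its scanout at the flip itself,
# so there is never a mid-scanout buffer swap and hence no tear line.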

I think this is GREAT, but we need an AMD and Nvidia solution.
Score
1
November 1, 2013 9:12:12 AM

As nifty an idea as this may be, it is fraught with potential pitfalls. They may get away with exclusivity for a brief time because this is a technology that will be adopted slowly anyway. Restricting use to one manufacturer helps identify intrinsic flaws, rather than troubleshooting a bunch of different flavors. But it's a bit of a tall hill to climb asking people to pair their GPU with specific monitors. The opportunity for mass confusion among consumers is fairly high. Furthermore, the market for this is already fairly niche (many may like the end results, but how many will be willing to accept the hassle and expense of achieving those ends for what is ultimately a limited number of games?). Nvidia will have a tough time making a robust profit from this for a while. Their potential profit may come from licensing this out to AMD, Intel, and monitor manufacturers, so that this benefit becomes more universal and a simpler choice for consumers.
Score
0
November 1, 2013 12:02:43 PM

Being exclusive to both Asus and NVidia....the adoption rate will be horrible.
Score
2
November 1, 2013 12:58:52 PM

Steveymoo said:
Or, you know, you could just use vsync. Input lag is non-existent if you're already using a 120hz screen.


What does input lag have to do with refresh rate?
Score
-1
November 1, 2013 1:00:07 PM

Nvidia just doesn't get it. Without nearly 100% market share, their technologies aren't going to be widely used unless they make them widely available.
Score
1
November 1, 2013 1:23:08 PM

Already like ASUS monitors, so wouldn't bother me much. However, it should be expanded to a few others.

For the people saying that V-Sync will fix their problems, you need to read up on what G-Sync does before posting.
Score
1
November 1, 2013 1:23:53 PM

G-Sync is so much more than a "nifty" idea for FPS gamers who get the visual tearing and stuttering but currently just have to suck it up. It's the sort of technology that is so beneficial that it *needs* to be rapidly adopted industry wide. A v-sync timed to the graphics card output... I mean, it makes you wonder why it wasn't a ubiquitous component in monitor manufacturing twenty years ago.
Score
1
November 1, 2013 3:20:24 PM

To everyone saying Nvidia's proprietary software should be open: it can't be. Their cards are designed specifically for their software. Hardware/software harmony. PhysX might be an exception, but ShadowPlay, G-Sync, TXAA, HBAO+, game streaming (Shield, Grid and TV), etc. use specific Nvidia architectures to support them. AMD cards don't have the required hardware on board to use these things. Just like Nvidia can't use AMD's audio software because Kepler cards don't have the processor.

One other thing to consider is how much more money Nvidia spends on its software department than AMD does. AMD would quit researching anything because they'd rely on Nvidia's advances and make none of their own.

P.S. PhysX isn't "dying," but it's still limited. More games are adopting it as time goes on.
Score
0
November 1, 2013 6:02:04 PM

Fierce Guppy said:
G-Sync is so much more than a "nifty" idea for FPS gamers who get the visual tearing and stuttering but currently just have to suck it up. It's the sort of technology that is so beneficial that it *needs* to be rapidly adopted industry wide. A v-sync timed to the graphics card output... I mean, it makes you wonder why it wasn't a ubiquitous component in monitor manufacturing twenty years ago.


Tearing and stuttering are two completely separate problems at opposite ends of the spectrum.

G-Sync won't affect screen tearing any more than v-sync already does, with the possible exception that it also reduces input lag. In that sense, it's a less terrible v-sync, but using the game's engine or third-party software to limit frame rate is already a good solution for screen tearing and doesn't require proprietary technology and a new monitor.
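
For what it's worth, a bare-bones frame limiter is not much more than this (my own sketch; the in-engine and third-party limiters people actually use handle timer resolution and frame pacing far more carefully):

import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS   # seconds per frame at the cap

def run_capped(render_frame, frames=300):
    # Sleep away whatever is left of each frame's budget so the GPU never
    # pushes frames faster than the panel can show them; that "too many fps"
    # case is exactly what produces tearing.
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        left = FRAME_BUDGET - (time.perf_counter() - start)
        if left > 0:
            time.sleep(left)

# usage: run_capped(lambda: None)   # stand-in for the game's render call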

And let's be clear, people without really nice setups aren't getting any screen tearing in cutting-edge games at high resolution -- screen tearing comes from getting TOO MANY fps. It's a problem when you play something like Portal 2 with good hardware at 1080p, not when you play Battlefield at 1600p with midrange hardware.

Stuttering is most commonly noticed with multi-GPU setups. It's because the frames aren't delivered in consistent timeframes, and humans are really good at noticing changes like this. A solution to stuttering is lowering your settings to raise fps above your monitor's refresh rate. "Uh, I get a lot of microstutter in BF3 at ultra settings with my multi-GPU setup and my minimum frame rate is 30 fps." Lower your settings to raise your fps, dummy -- problem solved.

G-Sync is one of those things I'll have to see to believe. I really doubt a dynamic monitor refresh rate will keep our eyes and brains from detecting drastic frame time differences. Furthermore, I'm not going to pay a premium on the video card AND a new monitor for G-Sync... I'd be better off putting that money towards a better GPU.
Score
0
November 1, 2013 9:45:05 PM

hapkido said:
Nvidia just doesn't get it. Without nearly 100% market share, their technologies aren't going to be widely used unless they make them widely available.


I think you have it backwards :) 

Anyhow, I can believe this, as they may have worked on the tech for a few years with Asus specifically. That module may be made for their monitors specifically, and it may take time to get it prepped for others. Under a year for dev on the rest, plus troubleshooting, bug fixes, etc., seems reasonable. Or ASUS paid them. Again, not NV's fault; they are a business, and if they don't think anyone else (AMD) can do it in a year and someone offers them millions for 10 months of exclusivity or something, they'll take the cash. It gives them time to get everyone else on board with QUALITY products, and ensures cash, while ensuring the first tested units are all working perfectly and won't receive some crap review due to malfunctions.

You want to put your best foot forward when putting out a brand new tech that is a huge game changer (by all accounts; no review site said it was NOT good, and all said they hope it becomes ubiquitous). You don't want half-a$$ed stuff getting reviewed. Like I said, it may have taken two years of working with Asus to figure out all the bugs. They had to make a friggen $100 card for this feature, so it's not some simple GPU fix. I'm sure they are capable of doing it faster on other systems now that they know what they're doing, but it may take a year to integrate with everyone, or they may not want to make 100 different cards for every model out there, etc. Who knows why, but there could easily be some VERY good reasoning behind the rest taking some time to get it integrated.

Also, AMD doesn't get it then either, right? Mantle doesn't work with anything but a VERY small set of AMD cards, with no backward compatibility even with their own stuff... So if NV doesn't get it, AMD gets it even less. NV owns 65% of the market; Mantle works with a small portion of the other 35%.

From their page:
G-SYNC features require an NVIDIA GeForce GTX650Ti BOOST GPU or higher
GTX TITAN
GTX 780
GTX 770
GTX 760
GTX 690
GTX 680
GTX 670
GTX 660 Ti
GTX 660
GTX 650 Ti Boost
Display:

G-SYNC DIY modification kit requires an ASUS VG248QE monitor.

Just from that, you can guess I'm right. It doesn't work with anything right now but one monitor, which I think is what they built and tested this tech with. Makes total sense to me. The rest will take time to test, and I think they want to get rid of the external crap and integrate the module before allowing others on board, just for customer simplicity.
Score
0
November 1, 2013 9:48:40 PM

So while Mantle/TrueAudio works with NOTHING old, with G-Sync a lot of people just need a monitor (as you can see from the list above, a 650 Ti or better works fine), and a lot of us are planning 1440p monitor purchases in the future anyway, so for a lot of people this works out. By the time I want it, I should have dozens to choose from next Xmas.
Score
0
November 1, 2013 10:55:59 PM

hapkido said:
Fierce Guppy said:
G-Sync is so much more than a "nifty" idea for FPS gamers who get the visual tearing and stuttering but currently just have to suck it up. It's the sort of technology that is so beneficial that it *needs* to be rapidly adopted industry wide. A v-sync timed to the graphics card output... I mean, it makes you wonder why it wasn't a ubiquitous component in monitor manufacturing twenty years ago.


Tearing and stuttering are two completely separate problems at opposite ends of the spectrum.

G-Sync won't affect screen tearing any more than v-sync already does, with the possible exception that it also reduces input lag. In that sense, it's a less terrible v-sync, but using the game's engine or third-party software to limit frame rate is already a good solution for screen tearing and doesn't require proprietary technology and a new monitor.

And let's be clear, people without really nice setups aren't getting any screen tearing in cutting-edge games at high resolution -- screen tearing comes from getting TOO MANY fps. It's a problem when you play something like Portal 2 with good hardware at 1080p, not when you play Battlefield at 1600p with midrange hardware.

Stuttering is most commonly noticed with multi-GPU setups. It's because the frames aren't delivered in consistent timeframes, and humans are really good at noticing changes like this. A solution to stuttering is lowering your settings to raise fps above your monitor's refresh rate. "Uh, I get a lot of microstutter in BF3 at ultra settings with my multi-GPU setup and my minimum frame rate is 30 fps." Lower your settings to raise your fps, dummy -- problem solved.

G-Sync is one of those things I'll have to see to believe. I really doubt a dynamic monitor refresh rate will keep our eyes and brains from detecting drastic frame time differences. Furthermore, I'm not going to pay a premium on the video card AND a new monitor for G-Sync... I'd be better off putting that money towards a better GPU.


http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-G-Sy...
"We can also eliminate the horizontal tearing of games that occurs when the refresh rate of the monitor does not match the frame rate produced by the graphics card. By only sending complete frames to the monitor and having the panel refresh at that time, you could maximize frame rate without distracting visual anomalies. If you are able to run your game at 40 FPS then your panel will display 40 FPS. If you can run the game at 160 FPS then you can display 160 FPS."

HE bolded it in the article, not me. :)
"G-Sync changes that by allowing a monitor to display refresh rates other than 60 Hz, 120 Hz or 144 Hz, etc. without the horizontal tearing normally associated with turning off Vsync. "

He seems pretty clear here, TWICE, and he discussed this at length with NV and with devs.

"Using a monitor with a variable refresh rates allows the game to display 55 FPS on the panel at 55 Hz without any horizontal tearing. It can also display 133 FPS at 133 Hz without tearing. Anything below the 144 Hz maximum refresh rate of this monitor will be running at full speed without the tearing associated with the lack of vertical sync."

Again, VERY CLEAR: WITHOUT ANY TEARING. Everyone who has seen it says THEY believe it, and claims it has to be seen!... LOL.

Your responses amount to "AMD is OK, it's OK to look like crap, lower your settings"... blah blah. The point is we don't want to accept looking like crap, or changing settings, or putting up with ANY junk in our image. This solves it all at once. It's better tech, get over it. It also allows devs to use the extra power to AMP up graphics when your GPU can pump out far more frames than needed. All the devs on stage loved this idea, as it frees them to do whatever they want on the fly.
"It is one thing to know that NVIDIA and the media are impressed by a technology, but when you get the top three game developers on stage at once to express their interest, that sells a lot. John Carmack, Johan Andersson and Tim Sweeney stood up with NVIDIA CEO Jen-Hsun Huang all raving about the benefits that G-Sync will bring to PC gaming. Mark Rein was standing next to me during a demonstration and was clearly excited about the potential for developers to increase visual quality without worrying about hitting a 60 FPS cap 100% of the time."

You do what you want. I only have a G-Sync monitor in my future; nothing else is acceptable ;) You keep jacking your settings around; I prefer letting NV do it on the fly and FIX it for me for good, while giving devs freedom to do what they want with my extra GPU power. I didn't see anyone on stage bragging about Mantle :) They all pretty much dogged it, here and elsewhere and while on stage. Let's be clear, you apparently will put up with things the majority of us would like to be rid of :) Even if I'd just bought a card that didn't support G-Sync, unless my monitor just died, I'd wait for a G-Sync monitor for my next purchase (hoping AMD licenses it, or comes up with a compatible deal, or I'd just go NV by default for the next card).

In a stock fight (money, stocks I mean), I'd bet on the guy who has the tech everyone WANTS, not the one nobody really NEEDS (die shrinks will get more perf for years to come, all the way to 5nm or so, with no extra DEV work on games). Mantle doesn't change the world; it just speeds up a few select cards and, I'd assume, every card they make from the next version on, though they've left it off a lot of cards already this time... WHY? Whatever, it's a failed idea, as it costs devs more programming for no extra return in money (can't charge more for Mantle games).

G-Sync changes the world, and in ways we all want to see happen, including devs; it makes their job easier in coding (freedom from things like 60 fps caps on consoles, etc.). Die shrinks, better perf, drivers, etc. don't fix what G-Sync fixes. It's a hardware solution, or deal with the problem forever. This is basically NV admitting it can't be done in drivers alone. Good luck to AMD funding research to resolve it their own way; I hope they just license it (and hopefully NV is open to that for mobile and everything else). Considering it's only working with ONE monitor currently, it's clear this took NV some work to get done (R&D: how long did it take, working with ASUS, for ONE monitor, and how fast can they roll it out to others?). How fast could AMD do this alone now that they either have to match it or license it? I vote license.
Score
0
November 1, 2013 11:14:12 PM

BranFlake5 said:
To everyone saying Nvidia's proprietary software should be open: it can't be. Their cards are designed specifically for their software. Hardware/software harmony. PhysX might be an exception, but ShadowPlay, G-Sync, TXAA, HBAO+, game streaming (Shield, Grid and TV), etc. use specific Nvidia architectures to support them. AMD cards don't have the required hardware on board to use these things. Just like Nvidia can't use AMD's audio software because Kepler cards don't have the processor.

One other thing to consider is how much more money Nvidia spends on its software department than AMD does. AMD would quit researching anything because they'd rely on Nvidia's advances and make none of their own.

P.S. PhysX isn't "dying," but it's still limited. More games are adopting it as time goes on.


But I'm sure they can license it, and AMD can then make compatible hardware (it may take another gen of cards or something, but it can be done, no different than AMD64 or ARM IP - you can take theirs or ROLL YOUR OWN like Apple, Qcom, NV Denver/Boulder, etc.). If they don't license it, it could be years before AMD gets anything like it, and it may cost monitor makers (mobile etc.) more to support two totally different techs. Even with a license, though, AMD would need a hardware rev to support it, so it's at least a year away while NV builds momentum. This will cost AMD market share unless they can come up with a card that is so much faster than NV's (no, not the 290X, that doesn't cut it, and the party is over next week with the 780 Ti) that G-Sync isn't worth it for some.

Clearly NV has been working on this during the 600-series designs, or it wouldn't already work on the 650 Ti and up; going that low, it's not about perf, it's about tech inside the chip. I'm thinking this means AMD is years away if no license happens. It took FCAT to make them realize they had a problem. NV seemed to already know, worked it out as best as possible (they created FCAT), found out drivers have limits, and moved to a $100 card to resolve it. See how far out that makes AMD?

Until the market gets more lopsided, I believe PhysX and Mantle will only be supported when NV/AMD subsidizes it for headlines. If one gets 90% of the market (which G-Sync could cause if AMD gets blocked and can't come up with it for years; NV doesn't have to give it out), then you'd see people writing for stuff like PhysX (or whatever the owner of that 90% offers). But yes, I'm sure G-Sync cost NV some R&D. I mean, it only works with ONE monitor and requires a card to do it. More R&D will need to be done for something that works on all the others (they're surely trying to avoid 100 cards for different models), and I don't think AMD has the funding to pull this off for a while. I only hope NV charges reasonable fees (sure, they need to recoup money, but I hope they don't completely gouge AMD; they get to license to all the other screens/products that want it).
Score
0
November 2, 2013 3:22:23 AM

Well, I'm not sure I can believe this, as during the Montreal event Nvidia announced quite a few partners. Asus was merely the first one to announce a product ready for purchase in the first quarter of next year.
Score
0
November 2, 2013 5:34:19 AM

DRosencraft said:
As nifty an idea as this may be, it is fraught with potential pitfalls. They may get away with exclusivity for a brief time because this is a technology that will be adopted slowly anyway. Restricting use to one manufacturer helps identify intrinsic flaws, rather than troubleshooting a bunch of different flavors. But it's a bit of a tall hill to climb asking people to pair their GPU with specific monitors. The opportunity for mass confusion among consumers is fairly high. Furthermore, the market for this is already fairly niche (many may like the end results, but how many will be willing to accept the hassle and expense of achieving those ends for what is ultimately a limited number of games?). Nvidia will have a tough time making a robust profit from this for a while. Their potential profit may come from licensing this out to AMD, Intel, and monitor manufacturers, so that this benefit becomes more universal and a simpler choice for consumers.


What do you mean, "ultimately a limited number of games"? Devs don't have to write specifically for this, though eventually they can, to take advantage of the freedom from worrying about fps and, in turn, of any extra power your GPU has.
Watch the video of the devs talking about it (it's only 2 minutes):
http://blogs.nvidia.com/blog/2013/10/18/montreal/
Carmack, Andersson, and Sweeney all say the same thing: it takes what is already there and makes it better, buttery smooth. They're not recoding games to work here.
https://www.youtube.com/watch?list=UUHuiy8bXnmK5nisYHUd...
You don't have to change your games. There's a direct link to the vid.

http://www.geforce.com/hardware/technology/g-sync/faq
"Q: Does NVIDIA G-SYNC work for all games?
A: NVIDIA G-SYNC works with all games. However, we have found some games that do not behave well and for those games we recommend that users take advantage of our control panel’s ability to disable G-SYNC per game. Games that NVIDIA discovers that have trouble with G-SYNC will be disabled by default in our driver."

I'm still waiting for a list of PROBLEM games, and whether the problems are permanent or temporary and fixable via driver updates, but generally speaking all games should work and not need changes (barring the occasional old game with issues, I guess). Not sure who is to blame for the issues in these cases, the drivers or the game itself having something that just causes trouble.

For anyone interested in a lot more gsync info and quotes from devs etc (including stuff not discussed on stage, backlight strobe mode etc etc):
http://www.blurbusters.com/
"John Carmack (@ID_AA_Carmack) tweeted:
“@GuerillaDawg the didn’t talk about it, but this includes an improved lightboost driver, but it is currently a choice — gsync or flashed.”
Score
0
November 2, 2013 12:44:52 PM

It would be nice if you could add the hardware yourself.

For example, I bought a VG248QE 144 Hz 24" monitor, so if I could open the back of the monitor and somehow add the hardware, that would be cool.
Score
0
November 2, 2013 1:27:30 PM

Quote:

@Steveymoo, for performance reasons, if you have v-sync on you probably want triple buffering, meaning you are always at least 25 ms behind. Assuming you want v-sync to avoid tearing, switching to G-Sync solves the tearing problem, stops the judder problem on unsynched frames, and reduces your input lag by 33% . . . What's not to like?

It's proprietary to both the GPU and the monitor...
Score
0
November 2, 2013 3:32:30 PM

smeezekitty said:
Quote:

@Steveymoo, for performance reasons, if you have v-sync on you probably want triple buffering, meaning you are always at least 25 ms behind. Assuming you want v-sync to avoid tearing, switching to G-Sync solves the tearing problem, stops the judder problem on unsynched frames, and reduces your input lag by 33% . . . What's not to like?

It's proprietary to both the GPU and the monitor...


But it solves the problem. You might as well get used to proprietary stuff. If I pay to solve a problem everyone hates (insert company name), I won't be giving it to the enemy for free... NEVER... LOL. Or at least not until I can't figure out how to milk it alone... ROFL. There, corrected myself.
Score
0
November 2, 2013 3:38:47 PM

awesomedude911 said:
It would be nice if you could add the hardware yourself.

For example, I bought a VG248QE 144 Hz 24" monitor, so if I could open the back of the monitor and somehow add the hardware, that would be cool.


"Q: When will I be able to purchase this?

A: The NVIDIA G-SYNC Do-it-yourself kits for the ASUS VG248QE monitor will be available for purchase later this year. We will have more information to come on how and when to get G-SYNC enabled monitors in the future."

"Q: Can I install G-SYNC modules for my current monitor?

A: For gaming enthusiasts, NVIDIA has made available a do-it-yourself monitor modification kit for an ASUS VG248QE monitor. The mod takes about 20 minutes. More details of the kit will be posted. "

Just wait a bit, you can DIY soon.
I suppose you can watch here:
http://www.geforce.com/hardware/technology/g-sync/faq
Or their blog.
http://blogs.nvidia.com

I'm sure tech sites will cover this the second it gets released anyway, but just in case... :) I don't think you open the monitor, but I could be wrong. Sounds like a dongle-type deal, but maybe not. It can't be that tough, or it would be a customer nightmare for anyone considered a NEWB :)

Score
0
November 2, 2013 3:44:04 PM

Quote:

But it solves the problem.

That's yet to be seen.

Quote:

You might as well get used to proprietary stuff. If I pay to solve a problem everyone hates (insert company name), I won't be giving it to the enemy for free... NEVER... LOL. Or at least not until I can't figure out how to milk it alone... ROFL. There, corrected myself.

There are many open standards developed by companies. Unfortunately, both Nvidia and AMD resort to proprietary gimmicks.

Realistically, when things are proprietary, the consumer gets screwed.
Score
0
November 2, 2013 5:16:15 PM

smeezekitty said:
Quote:

But it solves the problem.

That's yet to be seen.

Quote:

You might as well get used to proprietary stuff. If I pay to solve a problem everyone hates (insert company name), I won't be giving it to the enemy for free... NEVER... LOL. Or at least not until I can't figure out how to milk it alone... ROFL. There, corrected myself.

There are many open standards developed by companies. Unfortunately, both Nvidia and AMD resort to proprietary gimmicks.

Realistically, when things are proprietary, the consumer gets screwed.


Yet to be seen by you. Everyone who has seen it says it's awesome, including Sweeney, Andersson, Carmack, Rein, and every tech site out there. They all claim it's a total game changer. It is NOT a gimmick in this case. It works and doesn't require devs to go back and code everything again, or even do extra work to get it into their future games. It's a monitor/driver thing, not a "devs, please write more code" thing (like Mantle). Many great things start proprietary but become licensed, which is what I hope happens here, so we get it everywhere (mobile, built into TVs, etc.) and gaming is smooth everywhere.
Score
0
November 2, 2013 6:40:19 PM

Nobody cares about Carmack's opinion anymore. He has completely gone off the deep end. This is the same guy that said Unreal Engine 4 or whatever could only be fully utilized by a GTX 680, or by Xbox 360 or PS3 consoles. Give me an effing break. I can respect a guy like Gabe Newell sticking up for what he believes in and putting his checkbook to work to make it happen -- for example, porting all the popular Valve games to OpenGL. I can't respect Carmack for selling his name to the highest bidder.

The point is, as AMD continues to gain market share by providing better price/performance gaming GPUs, G-Sync is not going to become a game changer, because budget-savvy users aren't going to spend more money upgrading their monitors and accept less performance from their graphics card, or spend more money for the same performance. It's an ADD-ON like Eyefinity or PhysX (and that's assuming it does make games appear smoother, which I'm still skeptical about) -- not a defining feature.

As far as I'm concerned, right now it's a marketing ploy. These developers need to be concerned with pushing the envelope and forcing hardware vendors to advance monitor and GPU technologies, not with squeezing every last ms of frame time out of 1080p gameplay. We're over it -- it's old technology and it's time to move on to something better.
Score
1
November 4, 2013 3:07:12 PM

This will just make Asus monitors overpriced.
Score
0
November 4, 2013 4:21:13 PM

Vigilence said:
This will just make Asus monitors overpriced.

Chances are not all ASUS monitors will include it.

Off topic: I am using an ASUS monitor now and it has a very sharp picture. Better than most others I have used.
Score
0
November 7, 2013 7:49:28 AM

This is just another proprietary offering from Nvidia, like PhysX, that will ultimately fail when monitor makers realize that people are not going to rush out and buy them, as Hapkido has stated! Money is the bottom line for these companies, and if it doesn't make enough, then it will be scrapped. Only the few hardcore Nvidia fans will waste their money on a closed tech... Don't get me wrong, if they actually licensed it out then that could be a good thing, but it's Nvidia; they want to squeeze the competition out and rip us all off if they can. As for Mantle, if anyone actually paid attention, it is an open standard, meaning that Nvidia could use it if they so chose.
Score
0
November 7, 2013 2:51:07 PM

Haha, it's insane to me how many are just poo-pooing this. You clearly don't understand the technology or what it means.
Score
0
November 7, 2013 2:52:55 PM

Hmm. I have to wonder if the people who are poo-pooing this even understand the technology. If you don't think this is a major game changer, you simply don't understand the core concept of lag vs screen tearing, or how GPUs and monitors work.

(sorry for double post, first one wasn't showing for some reason)
Score
0
November 7, 2013 4:56:28 PM

Oh well I don't mind screen tearing anyway.
I still say proprietary = bad
Score
0
November 7, 2013 6:44:40 PM

My current Asus Z87 / Intel i7-4770 / GeForce GTX 770SC 4GB rig I just built was based around the future G-Sync. Within the next couple weeks I'll be buying three Asus VG248QE monitors to replace my existing three LCD monitors. Then once Nvidia releases the G-Sync adapters to VG248QE owners to install on their own, I'll get them too. The new Asus VG248QE G-Sync ready monitors won't be available for another 3-6 months yet. As for the rumor that Asus is the only company getting the new technology that is false. Asus only happens to be the first because it was the first VRR (variable refresh rate) monitor G-Sync was installed on and kits already exist. My understanding is within 10-months from now, at least 5 different companies will have G-Sync VRR ready monitors. But for now if you intend to be among the first adopters of this new technology, you must outfit your rig with the latest Nvidia video card(s), and for now the only monitor that can be used is the Asus VG248QE 144Hz VRR 24" LED/LCD monitor but you have to install the card yourself or take those monitors to a qualified person and they can install the G-Sync card. After Q1 next year, the card will come already installed in that monitor. Visit Nvidia and follow G-Sync developments and release dates for do-it-yourselfers. I know I will.
Score
0
November 8, 2013 1:40:51 AM

jab8283 said:
My current Asus Z87 / Intel i7-4770 / GeForce GTX 770SC 4GB rig I just built was based around the future G-Sync. Within the next couple weeks I'll be buying three Asus VG248QE monitors to replace my existing three LCD monitors. Then once Nvidia releases the G-Sync adapters to VG248QE owners to install on their own, I'll get them too. The new Asus VG248QE G-Sync ready monitors won't be available for another 3-6 months yet. As for the rumor that Asus is the only company getting the new technology that is false. Asus only happens to be the first because it was the first VRR (variable refresh rate) monitor G-Sync was installed on and kits already exist. My understanding is within 10-months from now, at least 5 different companies will have G-Sync VRR ready monitors. But for now if you intend to be among the first adopters of this new technology, you must outfit your rig with the latest Nvidia video card(s), and for now the only monitor that can be used is the Asus VG248QE 144Hz VRR 24" LED/LCD monitor but you have to install the card yourself or take those monitors to a qualified person and they can install the G-Sync card. After Q1 next year, the card will come already installed in that monitor. Visit Nvidia and follow G-Sync developments and release dates for do-it-yourselfers. I know I will.


You are correct, with the added point that once it's inside the monitor out of the box, costs will come down, because you will no longer be replacing existing hardware with an external card. There are chips inside the monitor currently that the card replaces when installed. A monitor company won't need to buy the chips this card replaces, so the premium should drop to $50-75 instead of the current $100 for the card. Of course, this should get even better over time, as the R&D to get it working with each monitor should get figured out faster now that they know what they are doing, unlike when they had to come up with the stuff on day one with Asus from a blank slate. All they knew at that point was which problems had to be solved. Now they know that, and how they fixed them. It will be easier for each new model than it was for Asus.
Score
0
November 8, 2013 1:51:32 AM

smeezekitty said:
Oh well I don't mind screen tearing anyway.
I still say proprietary = bad


Umm, something wrong with you if you like crap graphics :) 

Agreed on proprietary, but still recognize that it sometimes makes others up their game for a standard. Glide etc. made MS put out far better versions of DirectX (still proprietary I guess... LOL, but it got everyone on the same thing, at least back then). If someone doesn't pave the way, we never get anything. RIMM had enterprise email all to themselves. Then Apple came along (and a few others) with Exchange for mobile, and RIMM and its wallet-drenching pricing got screwed ;) RIMM's great stuff caused the world to catch up (which killed RIMM... LOL). See how that works? Qcom has been doing the same with modems, but we're now seeing the world catch up, and their margins just went to crap this Wednesday, from 60+% down to 54%. So a 10% drop due to competition all getting cheap modems into Asia. Watch as they drop further as this spreads everywhere, with everyone supporting multimode (well, all modes for all countries and types).

So it's not really that bad to me. Without people forking over dough for R&D on new stuff, we never get the world making it a commodity item eventually. Note NV licensed PhysX, and it's used in basically all engines and also all consoles, though that doesn't seem to have made it much more popular. It's only in ~60 games (even though it's included in Unreal Engine etc.; being in there doesn't mean used, just like Mantle). I predict the same or worse for Mantle, as it only works on a VERY small subset of the smaller player's market share (AMD only has ~35% of discrete, and only a portion of those will be R7/R9 cards for years). AMD will pay every time it gets used, and devs can't charge more for optimizing for this little niche.
Score
0
November 8, 2013 12:08:45 PM

somebodyspecial said:
smeezekitty said:
Oh well I don't mind screen tearing anyway.
I still say proprietary = bad


Umm, something wrong with you if you like crap graphics :) 

Oh well. There are a lot of things wrong with me, but that seems to be the case with a lot of people on the internet.

Aliasing shimmer is much more annoying to me. Perhaps the fact that I don't play FPSs has something to do with it.

Quote:

Agreed on proprietary, but still recognize that it sometimes makes others up their game for a standard. Glide etc. made MS put out far better versions of DirectX (still proprietary I guess... LOL, but it got everyone on the same thing, at least back then). If someone doesn't pave the way, we never get anything. RIMM had enterprise email all to themselves. Then Apple came along (and a few others) with Exchange for mobile, and RIMM and its wallet-drenching pricing got screwed ;) RIMM's great stuff caused the world to catch up (which killed RIMM... LOL). See how that works? Qcom has been doing the same with modems, but we're now seeing the world catch up, and their margins just went to crap this Wednesday, from 60+% down to 54%. So a 10% drop due to competition all getting cheap modems into Asia. Watch as they drop further as this spreads everywhere, with everyone supporting multimode (well, all modes for all countries and types).

Agreed to some extent. But it would be better if there were a standard created by the monitor manufacturers. That would pretty much force it to work with both brands.

Perhaps even a new DVI standard.
Score
0