AMD Clears the Air Around Project FreeSync
Tags:
- Graphics Cards
- VESA
- Monitors
- AMD
Last response: in News comments
It looks like Adaptive-Sync might just become standard fare on DisplayPort-enabled monitors a couple years down the road.
AMD Clears the Air Around Project FreeSync : Read more
More about : amd clears air project freesync
Bondfc11
July 14, 2014 12:22:34 PM
Score
7
Related resources
- AMD Project FreeSync FAQ - Forum
NightLight
July 14, 2014 1:18:49 PM
Quote:
Although ASync only works with 2 AMD cards at the moment. Not sure how that makes it more open to the public. I am sure AMD will try and expand the card list to support ASync in the future, but for now it is even more limited than Nvidia's solution.
A simple driver update could address this, as long as the graphics card supports the appropriate version of DisplayPort.
Score
7
hannibal
July 14, 2014 1:41:07 PM
falchard
July 14, 2014 3:35:45 PM
I am certain AMD wants this to work on AMD, NVidia, and Intel GPUs. Most proprietary solutions fail in the marketplace without the company shelling out cash for manufacturers/developers to adopt. After all no point limiting your market to 50% of users. More than likely we will see support for AMDs current architecture and NVidia since Kepler.
Score
7
Tem B
July 14, 2014 5:22:27 PM
Quote:
Although ASync only works with 2 AMD cards at the moment. Not sure how that makes it more open to the public. I am sure AMD will try and expand the card list to support ASync in the future, but for now it is even more limited than Nvidia's solution.
The R9 290X, R9 290, R7 260X and R7 260 graphics cards support it, and their APUs support it as well. A lot of people will already be able to benefit when it comes out.
Of course it is limited, since it's not out yet. When it is out, though, it's over for G-Sync. If Nvidia doesn't support this tech because of their stupid EGO, or tries to fight it, they will suffer - if reviews show it is essential across a wide range of gaming levels (low to high to ultra-high).
Score
7
Quote:
The best thing would be for Intel to support this feature with their next CPU/GPU upgrades. It should be possible to do it just by making new drivers, and those CPU/GPU solutions really need this feature, because they can have really poor frame rates.
Yeah, and Intel has so far been really good about trying to update features on their graphics chips. Granted, they only have a small handful of graphics chips worth using for anything, which makes updating easier than for the massive back catalogs of cards AMD and Nvidia would have to update. So I bet Intel would be happy about the technology and support it. It would be good to see them get into it.
Score
2
eklipz330
July 14, 2014 6:07:20 PM
Quote:
i love amd, but if it wasn't for nvidia's money-mongering, this would have never come to fruition
Not necessarily. This issue has been present for pretty much the full history of computer gaming; that is why V-Sync was originally created. It seems more likely that both companies were working on the same problem and came up with different solutions. Nvidia created a solution that would make them more money; AMD created a solution that would improve their performance. It probably helps that having this will prevent loss of sales from people wanting to use G-Sync, but I doubt that was their primary reason.
Score
5
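The mismatch the reply above describes is easy to see with a toy model: with V-Sync on a fixed-refresh panel, a frame that misses a refresh boundary has to wait for the next one, which is why frame rates snap down to divisors of the refresh rate. A minimal Python sketch, with a 60 Hz panel and the frame times purely illustrative (not figures from the thread):

```python
import math

# Toy model of why V-Sync quantizes frame rate on a fixed-refresh panel:
# a frame that misses a refresh boundary must wait for the next one, so a
# 50 fps render on a 60 Hz panel is shown at only 30 fps.
REFRESH_HZ = 60
SCANOUT = 1.0 / REFRESH_HZ  # one refresh period, ~16.7 ms

def vsync_display_interval(render_time):
    """Smallest whole number of refresh periods covering render_time."""
    return math.ceil(render_time / SCANOUT) * SCANOUT

for fps in (100, 60, 50, 40):
    shown = vsync_display_interval(1.0 / fps)
    print(f"render {fps:>3} fps -> displayed at {1.0 / shown:.0f} fps")
```

Running it shows 100 fps capped to 60 and both 50 fps and 40 fps collapsing to 30, which is the stutter both G-Sync and Adaptive-Sync set out to remove.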
H1itman_Actual
July 14, 2014 8:06:10 PM
The_Icon
July 14, 2014 10:10:29 PM
Truckinupga said:
Free is better, godspeed to AMD. Just please don't let this be another overly exaggerated minor performance improvement like Mantle. Mantle has been successful in broadening the market by giving weaker CPUs a sizable performance boost. Higher-end CPUs saw little to no performance gains, but they didn't really need them, being as strong as they tend to be. However, AMD marketing did seriously over-promise on Mantle's abilities. Someone really needs to rein them in, as very few products live up to their hype...
Score
2
Quote:
The best thing would be for Intel to support this feature with their next CPU/GPU upgrades. It should be possible to do it just by making new drivers, and those CPU/GPU solutions really need this feature, because they can have really poor frame rates.
Score
2
Martell1977 said:
Truckinupga said:
Free is better, godspeed to AMD. Just please don't let this be another overly exaggerated minor performance improvement like Mantle. Mantle has been successful in broadening the market by giving weaker CPUs a sizable performance boost. Higher-end CPUs saw little to no performance gains, but they didn't really need them, being as strong as they tend to be. However, AMD marketing did seriously over-promise on Mantle's abilities. Someone really needs to rein them in, as very few products live up to their hype...
I agree, Mantle did bring improvements, and the future is looking much brighter for graphics performance because of Mantle. That goes for AMD, Nvidia and Intel. But as you mentioned, we are all getting a little fed up with overblown promises only to be let down. Even if they do deliver to a certain extent, it is not good to raise our expectations too high. That will only create disappointment, even when there is a measurable improvement, as in the case of Mantle.
Score
1
N.Broekhuijsen said:
It looks like Adaptive-Sync might just become standard fare on DisplayPort-enabled monitors a couple years down the road.
AMD Clears the Air Around Project FreeSync : Read more
"AMD is actively working with these scaler vendors to bring Adaptive-Sync support into their higher-end scalers, and the company expects the mainstream scalers to gain support for the features very soon as well. The process isn’t progressing as slowly as we thought, although the AMD spokesman did make it clear that it was still too early to discuss exact commitments."
Let me know when that is ACTUALLY TRUE. Until then I say BS, NV tried and couldn't get them to budge, so created a way to do it themselves. Too early to discuss commitments because there are ZERO to discuss. If scaler companies don't make the hardware, how far are you going to get? About as far as NV did I suspect. If it was progressing quickly why would you put a couple of YEARS? Also since there are moles everywhere, I'm sure NV will know when they decide to cooperate and at that point they can choose to drop gsync or lower the price to force more sales (it does sell a gpu also). It's up to them to figure out which way is better for their business and I'm sure their bean counters will be hard at work...LOL.
http://www.blurbusters.com/gsync/list-of-gsync-monitors...
The list of announced G-Sync monitors is growing. Not sure when it was last updated, but I'm sure more will be announced in the next 5 months. The only reason we don't see more already is the amount of time it takes to tune it for each panel (hence the cost). It would sure speed things up if they got a rev2 out or something that could just be applied to all monitors easily. Considering the difficulty in even doing it NV's way, I'm still wondering how GOOD AMD's solution will really be when someone is finally allowed to test it in games across a dozen titles or so.
"couple years down the road"
This is a jump on Gsync that is ALREADY HERE? ROFL. "speed to market" means nothing if it takes years to actually GET to market. AMD said nothing here, no clearing the air, just more of what they HOPE will happen. Meanwhile monitors with gsync will be out in quite good numbers for xmas (meaning on a good number of monitors). Your comment is crazy. In order to have a jump on gsync you have to be FIRST don't you? Not in a COUPLE OF YEARS, right? Is it 6-12 months or a couple of years now?
"We can expect to see almost all mainstream and high-end monitors support Adaptive-Sync in the future."
Umm...There's that scaler problem that has to be worked out with the 4-5 vendors the AMD guy mentioned...Remember, NV said they tried and nobody would budge (that R&D costs money), so again, this is why they did it themselves. It is also why you DON'T give it away freely after doing that R&D. Make no mistake the scaler companies will charge the monitor people, and the monitor people will in turn charge YOU. The same thing happened with gsync. This is no different, it's just not AMD doing it to you, it's the scalers/monitor makers who will.
"If we wanted to do something over HDMI right now, it would have to be proprietary, and we would rather not do that."
ROFL...
So in other words "Nvidia had no choice but to do it proprietary because that is all they had available to work with, and since we don't want to be blamed for charging you, we'll wait for years maybe until scalers cooperate so it can be blamed on them or monitor makers"...LOL.
If scaler makers move at all it will be due to them losing sales because Gsync is included INSTEAD of their scaler. At that point (say xmas or so when all the monitors that we know are coming with gsync are out in great numbers), they may be willing to at least do the work and charge a minimal amount for it, but they won't go for FREE, just cheaper than NV probably to win back sales from Gsync monitors. You see, without vast numbers of gsync selling yet, they have no fears, but that ends at xmas. You could say, AMD's success at getting it into monitors is solely based on Nvidia's success at selling Gsync this xmas...ROFL. If NV succeeds you'll see scaler vendors ramp up some R&D to get new scalers out the door to stop gsync from taking all their sales. It's that simple. Then again, if NV can drive the cost down as sales ramp up they may lose anyway. That's how cuda got entrenched. By the time AMD actually did something they already had years in cuda and owned 90% of the market.
What air got cleared? No commitments discussed and no "it will be out on X day", so what got cleared up? The 4-5 scalers still haven't committed here either, or it would be in the post. All I see is "it's taking long, so we thought we'd make some more fluff noise and keep saying words like FREE when we know it isn't FREE for scalers or vendors".
Even the monitor makers have to pay some R&D to get their monitor to pass for the label. Why the heck would AMD not reveal a monitor that CAN be used today with Adaptive-Sync unless it, well, CAN'T? Are you unable to purchase it with this different firmware that can use it, because brand X wants to make you buy a new monitor? Fuzzy, fuzzy, fuzzy... Not clear at all.
Score
0
Truckinupga said:
Free is better, godspeed to AMD. Just please don't let this be another overly exaggerated minor performance improvement like Mantle.
Who do you think will pay for the scaler R&D and for monitors to pass Adaptive-Sync validation?
I'll give you ONE guess.
The 4-5 scaler makers aren't jumping because to them it isn't FREE R&D. Monitor makers will have some cost incurred to get the label slapped on them as approved also (that takes some kind of R&D and testing).
Free is better, but this isn't FREE to anyone but AMD (applying for something to be spec is one thing, getting it supported is quite another). First, if it was so easily implemented NV wouldn't have come up with Gsync, and second, it would be ALL OVER the place right now if it cost NOTHING to implement the "FREE" one. Why would ANY company hold up a product that "supposedly" (we haven't seen it in gaming yet on a review site, unlike gsync) helped gamers so much, if it was FREE and EASY to implement?
Score
0
IInuyasha74 said:
Quote:
i love amd, but if it wasn't for nvidia's money-mongering, this would have never come to fruition
Not necessarily. This issue has been present for pretty much the full history of computer gaming; that is why V-Sync was originally created. It seems more likely that both companies were working on the same problem and came up with different solutions. Nvidia created a solution that would make them more money; AMD created a solution that would improve their performance. It probably helps that having this will prevent loss of sales from people wanting to use G-Sync, but I doubt that was their primary reason.
That's incorrect. They created what scaler makers would NOT create, and that COST money, so they have to charge for it or simply lose money. AMD created a solution that OTHERS must pay to implement. Those 4-5 scaler makers and the monitor vendors have to update, and again that is NOT FREE. AMD's solution merely shifted the blame. Also, NV's solution VASTLY improves performance, as all reviews have shown. It remains to be seen, AFTER testing at review sites, whether AMD's solution is equal to or worse than NV's. It won't prevent my lost sale if it's second-rate. Nobody can say how that will turn out until it gets tested across a dozen games on some review site. Right now we don't have much more than AMD's word and a demo that isn't gameplay, right?
I also have a problem with the comment on their display saying it "virtually eliminates stutter". We learned in school that words like "virtually" are red flags...LOL. I really hope it's good, but they sure haven't put it in anyone's hands for testing yet. That's not good IMHO. Not to mention all 3 comments on the display have "helps" in front of them. So not ELIMINATES? Just helps? That doesn't scream confidence, does it?

http://www.pcper.com/news/Graphics-Cards/AMD-Demonstrat...
"What is not good news though is that this feature isn't going to be supported on the full range of AMD Radeon graphics cards. Only the Radeon R9 290/290X and R7 260/260X (and the R9 295X2 of course) will actually be able to support the "FreeSync" technology. Compare that to NVIDIA's G-Sync: it is supported by NVIDIA's entire GTX 700 and GTX 600 series of cards."
Conveniently left out of this article post...ROFL. Is it free if I still need a new monitor and a new AMD card? Sounds like the same price as NV, correct? PCPer made ZERO mention of hoping for other cards, though maybe they just overlooked that HOPE.
Score
0
iknowhowtofixit said:
Quote:
Although ASync only works with 2 AMD cards at the moment. Not sure how that makes it more open to the public. I am sure AMD will try and expand the card list to support ASync in the future, but for now it is even more limited than Nvidia's solution.
A simple driver update could address this, as long as the graphics card supports the appropriate version of DisplayPort.
Considering AMD still hasn't fixed DX9/DX10 for last-gen cards, I'm not sure how much luck you'll have with that, and PCPer doesn't make it sound like a possibility. It may actually require tech in the new cards AND in the monitor, as PCPer mentions (they said some can support it - well, at least ONE model, as shown).
"but AMD warns us: "...this does not guarantee that firmware alone can enable the feature, it does reveal that some scalar/LCD combinations are already sufficiently advanced that they can support some degree of DRR (dynamic refresh rate) and the full DPAS (DisplayPort Adaptive Sync) specification through software changes."
http://www.pcper.com/news/Graphics-Cards/AMD-Demonstrat...
Even if it's possible, how much is "some degree"? If AMD knows of other monitors why not name them too? Why not name the one they used or even name the tech that is working (the scaler model), or heck name anything instead of a total dodge leaving only questions?
Score
2
Quote:
Mantle has been successful in broadening the market by giving weaker CPUs a sizable performance boost. Higher-end CPUs saw little to no performance gains, but they didn't really need them, being as strong as they tend to be. However, AMD marketing did seriously over-promise on Mantle's abilities. Someone really needs to rein them in, as very few products live up to their hype...
From Tech Report:

10% increase in BF4 using Mantle with an i7-4770K and R9 290X.
Your propaganda and talking points have failed. Please phone Jen-Hsun Huang for new instructions ...
Score
3
H1itman_Actual
July 15, 2014 8:39:47 AM
nikolajj
July 15, 2014 9:53:06 AM
Thanks for actually explaining how the Sync/VBlank/Sync mechanism works. I was under the impression that the VBlank had to include the amount of time needed to refresh, which would have meant that AMD's system turned out to be a 'best guess' style of system that didn't eliminate the need for a triple buffer. I pretty much have my money set aside to switch to whichever variable-sync tech lands on a monitor I'd want when it comes to market, and I had pretty much counted FreeSync out as a 'me too' gimmick, but it seems like it could be just as effective at reducing latency and controlling sync issues.
Score
0
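The VBlank behavior discussed in that comment can be sketched with the same kind of toy model: under Adaptive-Sync, the monitor stretches its vertical blanking interval until the next frame is ready, so the display interval tracks the render interval instead of snapping to fixed refresh boundaries. A minimal sketch; the 30-144 Hz window is an assumed panel range for illustration, not a published FreeSync figure:

```python
# Toy model of Adaptive-Sync: instead of the GPU waiting for a fixed
# refresh boundary, the monitor holds VBlank until the frame is ready,
# clamped to the panel's supported min/max refresh window.
MIN_HZ, MAX_HZ = 30, 144        # assumed panel range, for illustration
MIN_INTERVAL = 1.0 / MAX_HZ     # panel cannot refresh faster than this
MAX_INTERVAL = 1.0 / MIN_HZ     # VBlank cannot be stretched past this

def adaptive_display_interval(render_time):
    """Display interval = render time, clamped to the panel's window."""
    return min(max(render_time, MIN_INTERVAL), MAX_INTERVAL)

for fps in (200, 60, 50, 40, 20):
    shown = adaptive_display_interval(1.0 / fps)
    print(f"render {fps:>3} fps -> displayed at {1.0 / shown:.0f} fps")
```

Inside the window the displayed rate simply equals the render rate (50 fps shows at 50 fps rather than dropping to 30), which is the whole point of the scheme; only outside the window does clamping kick in.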
Wisecracker said:
Quote:
Mantle has been successful in broadening the market by giving weaker CPUs a sizable performance boost. Higher-end CPUs saw little to no performance gains, but they didn't really need them, being as strong as they tend to be. However, AMD marketing did seriously over-promise on Mantle's abilities. Someone really needs to rein them in, as very few products live up to their hype...
From Tech Report:

10% increase in BF4 using Mantle with an i7-4770K and R9 290X.
Your propaganda and talking points have failed. Please phone Jen-Hsun Huang for new instructions ...
10% is a joke. That can be done regularly with driver updates; check the notes for all the drivers of the past. Many games get MORE than this from both sides. Mantle was built to help their lagging CPUs compete with Intel. All reviews have made the same comment regarding how little you get from high-end hardware (NOT MUCH). Is it a talking point when all the reviews say the same thing?
"The gains at the high end aren’t worth writing home about, but since we need the CPU to churn out a fairly high framerate regardless, there’s a much greater opportunity to benefit from Mantle on lower end Intel CPUs and AMD’s CPUs/APUs."
http://www.anandtech.com/show/7868/evaluating-amds-true...
They love AMD (portal and all, no NV portal etc), and they still said it. No different than anywhere else.
http://www.techspot.com/review/793-thief-battlefield-4-...
Less than 10% unless an APU is used.
"For the most part, when using the R9 290X we saw less than a 10% performance gain with Mantle, 5% being more accurate."
Again, a driver update can do this easily. Why write code for this kind of perf gain when gamers are not buying the low end in most cases? It was to help their CPUs do a better job of competing with Intel, especially the APUs, since they seem to be out of the CPU race. I wonder how long AMD can pay to get this supported; $8mil to DICE/EA shows it isn't that cheap.
http://www.guru3d.com/news-story/nvidias-tba-dx11-will-...
It's pretty clear what you can do even with DX11 and a driver update. Star Swarm was the Mantle showcase, massively spamming the screen with objects etc. But NV wins now?
Thief too? You can see they are showing 3 driver revs, and each gets a pretty massive increase, from the 331s at just over 40fps to well over 60fps with the betas, in the process topping Mantle on the 290X. Kind of pointless IMHO, as AMD's way requires EVERY game to be coded for it, while if they had just spent all that R&D on drivers that affect everything (like NV) running DX or OpenGL, they would have been better off.
Score
3
H1itman_Actual
July 15, 2014 6:11:57 PM
somebodyspecial said:
Truckinupga said:
Free is better, godspeed to AMD. Just please don't let this be another overly exaggerated minor performance improvement like Mantel. Who do you think will pay for the scaler R&D and monitors to pass Adaptive Sync validation?
I'll give you ONE guess.
The 4-5 scaler makers aren't jumping because to them it isn't FREE R&D. Monitor makers will have some cost incurred to get the label slapped on them as approved also (that takes some kind of R&D and testing).
Free is better, but this isn't FREE to anyone but AMD (applying for something to be spec is one thing, getting it supported is quite another). First, if it was so easily implemented NV wouldn't have come up with Gsync, and second, it would be ALL OVER the place right now if it cost NOTHING to implement the "FREE" one. Why would ANY company hold up a product that "supposedly" (we haven't seen it in gaming yet on a review site, unlike gsync) helped gamers so much, if it was FREE and EASY to implement?
You make a very good point. I never really believed it would be free in the end anyway. But if it were NV's tech alone, then it would be unaffordable to most. I love Nvidia GPUs, and that's all I use, but I'll admit that they really love to stick it to you on their prices.
Score
0
Truckinupga said:
somebodyspecial said:
Truckinupga said:
Free is better, godspeed to AMD. Just please don't let this be another overly exaggerated minor performance improvement like Mantel. Who do you think will pay for the scaler R&D and monitors to pass Adaptive Sync validation?
I'll give you ONE guess.
The 4-5 scaler makers aren't jumping because to them it isn't FREE R&D. Monitor makers will have some cost incurred to get the label slapped on them as approved also (that takes some kind of R&D and testing).
Free is better, but this isn't FREE to anyone but AMD (applying for something to be spec is one thing, getting it supported is quite another). First, if it was so easily implemented NV wouldn't have come up with Gsync, and second, it would be ALL OVER the place right now if it cost NOTHING to implement the "FREE" one. Why would ANY company hold up a product that "supposedly" (we haven't seen it in gaming yet on a review site, unlike gsync) helped gamers so much, if it was FREE and EASY to implement?
You make a very good point. I never really believed it would be free in the end anyway. But if it were NV's tech alone, then it would be unaffordable to most. I love Nvidia GPUs, and that's all I use, but I'll admit that they really love to stick it to you on their prices.
I like cheap stuff too, and am currently using a Radeon 5850, mostly due to the great temps of this particular card, and I got a smoking deal from Amazon: $260 when they were going for $310+ (Amazon just took 7 months to deliver them to 450+ backorders, with tons of complaints...LOL - priced too low at pre-order, so they switched the model # hoping we'd all give up). But if you look at their earnings, NV isn't screwing us at all and hasn't been since 2007.
http://investing.money.msn.com/investments/financial-st...
That's a 10yr summary of their earnings. Note the $800mil in Jan 2008 (which reflects the 2007 earnings year). Now they only make about half that, and haven't hit $800mil since 2007. Also note that of the ~$450mil they make now, $300mil is Intel's money, paid each year due to the lawsuit. So really they're making $150mil, which is a FAR cry from the income of 2007, right? So if they're screwing us all, how come they aren't making MORE than in 2007, especially with Intel giving them $300mil every year?
http://www.nvidia.fr/object/nvidia-intel-licensing-fees...
Details of the 5yrs of payments totaling 1.5Billion.
I say NEITHER side is charging NEAR what they should be, which is why both are basically making nothing. $150mil is not $800mil, correct? AMD has lost $6 billion in 10yrs.
http://investing.money.msn.com/investments/financial-st...
If we want these two to survive, they had better either stretch out cycles MUCH longer (say, new cards every 3yrs, giving them more time to recoup the expensive R&D) or just charge higher prices, period. The math doesn't lie, and neither do the financial reports. We are screwing them...LOL. Their price war is screwing each other. What you get today is phenomenal for the price at any range compared to 2007, R&D costs have gone through the roof in the years since then, and they aren't getting anywhere near what they used to, despite having a MUCH larger market of GPU users to sell to. Translation: far more products sold (NV set records last year) but 1/5 the earnings (or worse for AMD) means you aren't charging enough. I can say I hate high prices, but I can't say they are charging too much. The profits (or lack of them) say the complete opposite. They need to stop revving yearly, and maybe they'd do better without adding to our pricing. That sucks too, but better slow tech than not being able to afford a card, right?
Again, not an advocate for high pricing or anything, just saying neither makes what they used to so they can't be charging too much. Also note NV's R&D has gone up in the last 3yrs from ~850mil to 1.33B.
http://investing.money.msn.com/investments/stock-income...
AMD's has gone down, but that's expected when they're doing so poorly financially, forcing layoffs (30% of the workforce in the last 3-4yrs) etc. It's also why their drivers are going into phase 3, why they put out a bad 290X, etc. They clearly need more for R&D, as they are now behind NV in spending on it.
I hope they have a good quarter today, but it's doubtful even with Intel having such a good Q on the PC side (mobile lost $1.1B...LOL, but the PC side was very strong). Even if Intel had strong sales, I doubt AMD got any extra from it, as it was mostly business-related sales driven by the end of XP support pushing businesses to upgrade, and most of that market is INTEL only. Console sales have massively slowed (off more than 50% for Xbox One/PS4 at last check, probably worse now 3 months later; we're about to find out soon for this Q, either from AMD's news or Sony/MS sales numbers), and home users haven't exactly started a PC buying craze. It's just business, and AMD never steals any of that from Intel. The mining craze is over, and some channel checks rumored they were really overstocked on GPUs at retail (thus cutting production instead of price), so that won't be good. I would be surprised by a good AMD Q (meaning over $100mil NET) and actually expect a very bad Q, though they'll surely say "but the pain is over now and we're on track to do good things" yet again.
It's like Intel's Q, losing $1.1B on mobile (which could be profit if they just quit and fabbed everybody else's stuff). They made $50mil in mobile revenue and are "on track" to sell 40mil SoC chips for tablets. So they sold over $1B worth of mobile stuff but managed to lose $1.1B on it. So you're PAYING people to just take the chips - please, for the love of god, please use our crappy chips...LOL. That isn't a good sign, and most think they won't have anything near competitive until 2016 (so another $8B in losses over 2yrs). If the PC side (pc/server/datacenter) wasn't so strong, they'd really be hurting profit-wise. They got a temporary business pop, but I expect that to stop soon, and we'll see ARM do to desktops exactly what they did to notebooks (they already took 21% of ALL notebooks). When 64-bit and Android L hit by Xmas, some low-end desktop sales will surely go the same way as notebooks. Most people just browse, get mail and run casual games; you don't need Wintel for that at all, and MS just announced thousands of layoffs. Intel has an empty fab here in AZ. The writing is on the wall for anyone looking (and it isn't saying good stuff...ROFL). It's just a question of how much Android/SteamOS etc. can take from Wintel. Android is screwing both, while SteamOS just hurts MS over time, at least until it gets ported to ARM by NV or Valve (or both). At that point some gamers can leave too, as Steam games already run on Linux and are being ported as fast as Valve can manage, and a port to ARM isn't hard after that since both run OpenGL.
Score
0
"Our transformation strategy is on track." Just as I predicted they'd say, even though they lost another $38mil, with shares down $0.81, a 17.7% drop in stock price after hours. The conference call is going on now (must be bad - it was down 62 cents before they talked). I haven't figured out what was weak yet, but I suspect consoles/CPUs, while GPUs are a question mark. If the channel checks are right, those could be bad also.
Score
1