g92 and whatever ati will make..

:wahoo: does anyone know when the g92 will come out? i need to upgrade my system really bad. so how long will i have to wait? should i just buy an 8800 320 for now and buy the g92 when it comes out? how much will it cost? :heink:
  1. It's speculated that the G92 will come out in November, with prices around $400-500 for the GTS and $500-600 for the GTX. Wait for it if you can; it's the best thing you can do. Well, I couldn't wait and still grabbed an 8800GTS a while ago, mainly for Bioshock, but I hope to sell it, add some extra money, and get a 9800.
  2. It's also speculated that the G92 will be some sort of refresh that falls in between the 8600GTS and the 8800GTS. So... basically we know nothing at all.
  3. That was the original speculation; according to the newer ones, G92 is the new high-end chip. Regardless, what I was talking about here is the next high-end Nvidia GPU that is supposed to come out in November, G90 or G92 or whatever.
  4. Umm, what is your proof of this? (I have been out of the loop for about a week, so I'd like to see.) And please don't link me to the Inq or show me that retarded xbitlabs post with "specs" from some random guy.
  5. not true...the g92 is NOT a refresh. People really need to stop saying that. The g92 is said to be slightly above the 1 TFLOP barrier. Why would a midrange card be above 1 TFLOP when an 8800 GTX is between 500-550 GFLOPS? It is NOT midrange. Current DX10 cards cannot support DX10 games well at all; Nvidia will have to release the g92 in order to actually run DX10 games at decent frame rates. The 8800 is way too late into the generation to provide decent DX10 gameplay. The 8800 is just an extremely fast DX9 card with DX10 capabilities, but those DX10 capabilities will not get you a great framerate in DX10. After all, the card is almost a year old. Please also don't say it can play Lost Planet and all that jazz like most people would. These are all PATCH jobs, not native DX10 games. Lost Planet, Call of Juarez, and Supreme Commander are ALL DX9 with a DX10 feature here or there.

    Nvidia will follow the same exact course as they did with the 8800: the g92 will be released first (the high end), with the midrange line following a few months later during Q1/Q2. The R700 is slated for a Q1/Q2 2008 release. Also keep in mind that Nvidia is on a yearly release cycle for their graphics cards: the 8800 GTX/GTS was released on November 8th, 2006, so expect the g92 in November, or early December if they hit delays.
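For what it's worth, the 500-550 GFLOPS figure quoted for the 8800 GTX can be sanity-checked from its published specs (a rough back-of-the-envelope sketch; the 3-FLOPs-per-clock figure is how Nvidia counts the MADD + MUL each stream processor can issue per cycle):

```python
# Theoretical shader throughput from the 8800 GTX's published specs.
# Nvidia counts one MADD (2 FLOPs) + one MUL (1 FLOP) per SP per clock.
stream_processors = 128   # shader units on the G80 (8800 GTX)
shader_clock_ghz = 1.35   # shader-domain clock
flops_per_clock = 3       # MADD + MUL, as marketed

gflops = stream_processors * shader_clock_ghz * flops_per_clock
print(f"8800 GTX theoretical: {gflops:.0f} GFLOPS")  # ~518 GFLOPS

# How many SPs a hypothetical 1 TFLOP part would need at the same
# clock and FLOPs-per-clock counting (pure speculation, like the rumor):
required_sps = 1.0 * 1000 / (shader_clock_ghz * flops_per_clock)
print(f"SPs needed for 1 TFLOP at 1.35 GHz: {required_sps:.0f}")  # ~247
```

So a 1 TFLOP rumor implies roughly doubling the shader array (or a much higher clock), which is why people read it as a new high-end part rather than a midrange one.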
  6. Well, I was about to link you the xbitlabs specs from the random guy. Those are the only ones we have atm. They might not be true, but since they look so good, many hope they are. I guess you're right, we know almost nothing for certain right now. But it doesn't matter much whether the GPUs are called G90 or G92; people now usually refer to G92 as the next high-end GPU from Nvidia, so there isn't too much confusion created.

    Edit: Kamrooz, why is everyone saying that current DX10 cards won't be able to handle DX10 games? An 8600GT is in the recommended sys. reqs for Bioshock DX10, and it has been said that an 8800GTX can run Crysis on high. Both of them are different from the crappy-patched DX10 games like Lost Planet or CoH. I say we should wait for a true DX10 game so we can get a true picture of the performance current cards get in DX10.
  7. So you're telling me that these cards will be able to run DX10 titles, when these DX9 titles with just a few DX10 features implemented see a 2/3 drop in performance under the DX10 patch? Come on, man. The 8800 Ultra can't even run Lost Planet at 1600x1200 with everything on max WHILE having the DX10 features enabled. You'd have to lower the AA/AF, and even then you can't run the DX10 features on high.

    Current hardware can't play DX10. It's like this with every generation of new hardware when it comes to a new code path. DX10 is above current hardware; NEXT gen is the real DX10-capable graphics hardware. This happens every time a new DX format comes out. If memory serves me right, the same thing happened with the DX8 to DX9 transition. Don't expect incredible performance for a new standard on first-generation products. There are always flaws that need to be worked out, or in this case a lack of power to fully utilize DX10, and here we're only talking about DX10 PATCHES anyway.
  8. BTW Cpt_wuggles, sorry for the thread jacking; kinda de-railed. If you are to buy any card, buy an EVGA card, and try to do it mid/late September so you're still in range even if the g92 hits delays. The EVGA Step-Up program should work well. This way you'll at least be able to play everything till the time comes around. I'm building myself a new rig as well in the next few months. Can't stand this 3.2 Prescott and x700xt any longer. Going to dump around 3-4 grand on a new rig >_<...lol..
  9. I gotta side with vonwombat a bit here: we just DON'T know how these cards will perform under DX10 until we get a REAL DX10 title. Although I'm pretty sure that the 8600 won't be able to cut it either.

    I thought, and was almost certain, that what was said about the 1 teraflop GPU was in fact a reference to the nVidia Tesla, and people somehow got the impression that it was the G92. Correct me if I'm wrong.

    The truth is we speculate that there will be a new GPU, G90/G92, in November (and there probably will be), but we don't know much (if anything at all) about what it can do.
  10. Kamrooz:
    You want to provide a link stating that these G90 cards do not run DX10 games well?

    DX10.1 is not compatible with DX10 hardware, which has been stated in various articles. HOWEVER, they mention that the 10.1 update is insignificant and no big deal.
  11. Kamrooz, those games with DX10 patches are just that: crappily coded patches to implement some DX10 features on a game that was built on DX9. Lost Planet is just a ported game anyway, which is the major reason for its lack of performance. Hell... look at Rainbow Six Vegas and many other DX9 games that don't even use DX10 patches, and they can run like crap even on the highest-end hardware.

    Until there is a game built from the ground up on DX10, and Vista has the proper patches from the service pack, you are only speculating on something you can't prove. I'm not saying you're way off base; it certainly is very plausible that the current DX10 hardware may not run the games well. But until we have something to truly base that on, we can't just dismiss the current cards.
  12. I will agree with you that DX10 is in its early infancy. But the point is, don't expect a MASSIVE increase in performance just because it will be a native DX10 game. I don't expect Crytek to have many code-path problems, considering how passionate they are about their games. It's going to take companies a little time to fully work out DX10 in terms of creating impressive visuals with the least performance penalty; they may have taken routes that impact performance. BUT, one thing is certain: it's not only on the software side. Hardware is also a suspect. DX10 software and DX10 hardware both need to mature just like any other; both are in their infancy. We ran through this same situation with DX8 to DX9, and it's expected to repeat from DX10 to DX11. Hardware needs to evolve just like software.

    I will also state that a native DX10 game will be different from a DX9 game with a DX10 patch, but how vast a difference is the main question. We'll have to wait to find out. Personally, though, I find it false to say that there will be a huge performance increase. The current DX10 cards were introduced quite a while back (not in ATI's situation, but that mostly has to do with the ATI/AMD merger), and the 8800 is almost a year old.

    Time will tell, though, but imo don't expect wonders from the g80. I don't doubt that it's a damn good card; I'd trade up my card for one any day. But do keep in mind this is first gen, and as always, first gen has problems. Considering how many games out there bring the current high end to a crawl (Stalker, Oblivion, Medieval 2, etc... keep in mind I'm talking about HIGH settings), don't expect DX10, which can be MUCH more visually appealing and in some cases more GPU-intensive, to have INCREASED performance when these DX10 titles are supposed to push hardware even further.

    comptia: Have you been reading my posts at all?..lol..No offense man, but I'm talking about the g80 series, not the g92. The g92 will be the TRUE DX10 card. I'm talking about how, imo, I consider the g80 just a stinking fast DX9 card with DX10 implementation.
  13. Kamrooz said:
    comptia: Have you been reading my posts at all?..lol..No offense man, but I'm talking about the g80 series, not the g92. The g92 will be the TRUE DX10 card. I'm talking about how, imo, I consider the g80 just a stinking fast DX9 card with DX10 implementation.

    Explain how the G9x series will be any more 'true DX10' than any other card. So far everything about it says no new features, only more of the same. Unless they improve their geometry shader performance and some other aspects, it won't be able to keep up in certain DX10 situations; not that that alone precludes it from DX10 or makes it bad, it just doesn't make it as good as others. So what exactly makes it more of a 'true DX10 card' to you versus the two models we have now, other than brute force?
  14. When I say true DX10 card, I mean high performance. What good is being DX10-capable if you can't push the frames out for it?..

    Overall the 8800 will be able to play Crysis just fine, and so will the 2900, but you won't be playing it maxed out on AA/AF/settings. So far there have been no official requirements posted. Before anyone says there have been: they are all bogus. crysis-online has some requirements up, but they are not official; they are based upon watching the videos and judging frame rate and settings, as well as factoring in all the rumors and information they've gathered. Nice, but not official.

    In an interview regarding E3, the Crytek president did state that they were running the game on an 8800 at E3: a GeForce 8800 video card with 4 gigs of RAM and an Intel dual-core processor. He also stated the game was running on high settings, but not the highest, and that it ran smoothly "most of the time". (Just so you know, I added the quotation marks to get the point across..lol) Also keep in mind AA/AF wasn't mentioned (doesn't surprise me; most interviews wouldn't).

    It's true the g80 is a great card, but don't expect 1920x1200, max AA/AF, and ultra-high settings; chances are you'll get below 30 fps, probably below 20.

    Overall we just have to wait and see. But do keep in mind that "playable" to gaming companies is 30 frames per second; to enthusiasts who dish out tons of cash on hardware, playable means 60+. This is why I believe the g90 will be a true DX10 card. Remember, though, there are DX9 games that can bring an 8800 GTX and 2900XT down to a mere 30 fps. But after all, we'll have to wait and see.
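The 30-vs-60 fps split is really just a per-frame time budget, which a quick calculation makes concrete:

```python
# Time a GPU has to render each frame at a given frame rate.
def frame_budget_ms(fps):
    """Milliseconds available per frame at the given frames-per-second."""
    return 1000.0 / fps

print(f"30 fps -> {frame_budget_ms(30):.1f} ms per frame")  # 33.3 ms
print(f"60 fps -> {frame_budget_ms(60):.1f} ms per frame")  # 16.7 ms
```

Hitting the enthusiast 60 fps target means finishing every frame in half the time the "company playable" 30 fps target allows, which is why the gap between the two feels so large in practice.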
  15. Anyone know if G92/R700 will be PCI-E compliant?
  16. I take it you mean PCI-E 1.1?...They all should, theoretically, but there could be bottlenecking if a card saturates the link. PCI-E 1.1 runs at 2.5 GT/s per lane, while PCI-E 2.0 doubles that to 5 GT/s. ATM there is no information on what the g92/r700 will use. But considering the X38 boards are coming out with PCI-E 2.0, it probably means cards have hit and gone a bit over the 1.1 ceiling. Probably nowhere near the full 2.0 bandwidth, but chances are above 1.1's.

    But they should be compatible with PCI-E 1.1 if they are indeed PCI-E 2.0, with a small trade-off.
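To put actual numbers on those link speeds (a rough sketch; the 2.5 and 5 GT/s rates are per lane, and both PCI-E 1.x and 2.0 use 8b/10b encoding, so only 8 of every 10 bits on the wire are data):

```python
# Usable PCI-E bandwidth per direction for an x16 slot.
def pcie_bandwidth_gbs(transfer_rate_gts, lanes=16):
    """GB/s in one direction, after 8b/10b encoding overhead."""
    data_bits_per_sec = transfer_rate_gts * 1e9 * lanes * (8 / 10)
    return data_bits_per_sec / 8 / 1e9  # bits -> bytes, -> GB

print(f"PCI-E 1.1 x16: {pcie_bandwidth_gbs(2.5):.0f} GB/s")  # 4 GB/s
print(f"PCI-E 2.0 x16: {pcie_bandwidth_gbs(5.0):.0f} GB/s")  # 8 GB/s
```

So the real ceilings are about 4 GB/s and 8 GB/s per direction for an x16 slot, not 2.5 and 5; a 2.0 card dropped into a 1.1 slot just falls back to the 4 GB/s figure.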
  17. Sry 4 the hijack too... dunno WHEN the new NV cards will come out, but i want 1.

    But i do agree with Kamrooz

    Come on guys, on my 3.2GHz C2D sys with an oc'd 8800gts, even Test Drive Unlimited stutters sometimes. It only JUST handles the latest games 100%.
    What, is DX10 some MAGIC performance bullet?
    Only time will tell, but rly, ur kidding urself if u think ur 8800 will handle dx10 WELL, esp at 16x12 type rez.

  18. P35 boards have the PCI-E 2.0 feature.
  19. p35 boards unofficially support PCI-E 2.0, at least that's what has been said so far. I haven't heard anything else for a while regarding that, but we'll have to wait and see on that as well.

    Mrmez: It is indeed a magic bullet..lol. But as new games come out they push hardware farther and farther till it's time for the next gen, and it always gets more intense when making a codepath change (DX8 > 9 > 10 > 11 and so on). A transition must always be made ^_^..I'm just glad I was able to wait it out. Getting my new rig within 3 months; probably nab a g92/x38/penryn (or maybe a q6600 as a stopgap till the midrange Penryn processors come out). I have to admit, though, I'm a bit worried about nabbing a g92 and having the r700 come out 2-6 months later. Really curious how it will perform. I've had Nvidia cards and ATI cards in the past, but one thing I love about ATI is their drivers, which is the one thing Nvidia lacks.
  20. P35 boards do not support PCI-E 2.0. I believe they were launched before the PCI-E 2.0 standard was complete.
    And no, we don't have any official Crysis reqs, but it has run well on current high-end cards so far, and from what I've heard, the CEO of Crytek said that an 8800 could run the game on high. Supposedly he said this in an interview for PC Gamer. Also, keep in mind that apart from Crysis, there are no games which will be this demanding on hardware, and most games currently in development won't make use of DX10 as much as Crysis does.
    Current DX10-patched games add better shadows at a cost of 30 FPS or so, but I believe that is because of poor coding rather than poor hardware. Bioshock will support DX10 features, and it comes out in 8 days, so that should give us some insight into the problem.
    Don't get me wrong, G92 will probably be awesome; I'm saving for one and hope to grab it as soon as it lands, but current hardware should hold out for a while as well.
  21. Rumours and propaganda.... The makings of a good FUD war. Let it lie, it will happen when it happens. Until then just patiently wait for 3rd party tests to show up.

    I should just make this my sig, then all I'd have to do is post a blank message over half of the time.
  22. Kamrooz said:
    not true...the g92 is NOT a refresh. People really need to stop saying that. The g92 is said to be slightly above the 1 tflop barrier.

    LMAO. WHY should people stop saying this? There's no official news from Nvidia, only rumours posted on places like the Inquirer. In fact, it was the Inquirer that initially reported the G92 as being the top-end part, and the very same Inquirer later reported that the G92 was a sub-8800 refresh. If you're going to believe one report from a completely unsubstantiated source, why discount a more recent report from the very same unreliable source?

    This is the very same Inquirer that, around December last year, reported that an 8900GTX would be out by midsummer with clock speeds well above the 8800GTX... what actually came out was the 8800 Ultra, which was nowhere near the performance the Inquirer quoted.

    The Inquirer prides itself on posting "insider information" but never quotes its sources and is very, very frequently proved wrong.

    The truth of the matter is, no one will know for sure until the official press statement from Nvidia themselves, which will most likely not come until a few weeks before any product release. For example, the definite specs for the 8800GTX/GTS weren't known until mid-October last year, with an early November release. The rumour mill had "specs" out in September, but a month later they were proved wrong.
  23. The reason the current DX10 cards are so wonderfully potent is mainly that we're using them for DX9 apps. DX10 apps (and the attendant overhead of DX10's preferred working environment) are going to bring those cards to their knees.
  24. croc said:
    Rumours and propaganda.... The makings of a good FUD war. Let it lie, it will happen when it happens. Until then just patiently wait for 3rd party tests to show up.

    I should just make this my sig, then all I'd have to do is post a blank message over half of the time.

    Crikey, you ain't kidding! Let's see here:

    The G92 is absolute vaporware for now. Nvidia isn't saying ANYTHING. Either they're playing their cards close to the chest, or they are planning to sit and milk the 8000 series some more. In lieu of more absolute high-end competition from AMD, they have very little to gain from rolling out an entirely new series. If anything, and this is absolute, unadulterated speculation on my part, we will probably see a refresh before the end of the year.

    DX10 titles' poor performance to date has very little to do with the hardware and very much to do with developers' poor software implementation of DX10 features. In theory, DX10 can get you visual quality above that of DX9 at roughly the same hardware cost. Developers are going to push this, of course; they always have. That is what truly drives change in the industry, after all.

    The whole 1 TFLOP thing should really just be dropped. It's a neat figure and a noteworthy accomplishment (if it turns out to be credible), but it means very little when it comes to gaming. Unless you plan on using your video card for folding or some other math-intensive app, it's just a marketing point. There's no direct correlation between TFLOPS and FPS.

    Finally, PCI-E 2.0 came about for reasons other than necessity. PCI-E 1.0 certainly isn't swamped, at least as far as bandwidth is concerned; neither was AGP when it was dropped. The chances are high that AMD's and Nvidia's next cards will have at least one model supporting the new standard, if for no other reason than all the people who will say "Well, I have a PCI-E 2 slot, might as well put something in it!" And honestly, though it shames me, I am that kind of person myself.