ATI Releases More "R600" Details

  1. HEHHEE ill post the whole thing :lol:

    320 stream processors, named ATI Radeon HD 2900

    AMD has named the rest of its upcoming ATI Radeon DirectX 10 product lineup. The new DirectX 10 product family received the ATI Radeon HD 2000-series moniker. For the new generation, AMD has appended HD to the product names to designate the entire lineup’s Avivo HD technology. AMD has also dropped the X prefix from its product models.

    At the top of the DirectX 10 chain is the ATI Radeon HD 2900 XT. The AMD ATI Radeon HD 2900-series features 320 stream processors, over twice as many as NVIDIA’s GeForce 8800 GTX. AMD couples the 320 stream processors with a 512-bit, eight-channel memory interface. CrossFire is now natively supported by the AMD ATI Radeon HD 2900-series; the external CrossFire dongle is a thing of the past.

    The R600-based ATI Radeon HD 2900-series products also support 128-bit HDR rendering. AMD has also upped the ante on anti-aliasing support. The ATI Radeon HD 2900-series supports up to 24x anti-aliasing. NVIDIA’s GeForce 8800-series only supports up to 16x anti-aliasing. AMD’s ATI Radeon HD 2900-series also supports physics processing.

    New to the ATI Radeon HD 2900-series are integrated HDMI output capabilities with 5.1 surround sound. However, early images of AMD’s OEM R600 reveal dual dual-link DVI outputs, rendering the audio functions useless.

    AMD’s RV630-based products will carry the ATI Radeon HD 2600 moniker with Pro and XT models. The value-targeted RV610-based products will carry the ATI Radeon HD 2400 name with Pro and XT models as well.

    The entire AMD ATI Radeon HD 2000-family features the latest Avivo HD technology. AMD has upgraded Avivo with a new Universal Video Decoder, also known as UVD, and a new Advanced Video Processor, or AVP. UVD previously made its debut in the OEM-exclusive RV550 GPU core. UVD provides hardware acceleration of the H.264 and VC-1 high-definition video formats used by Blu-ray and HD DVD. The AVP allows the GPU to apply hardware acceleration and video processing functions while keeping power consumption low.

    Expect AMD to launch the ATI Radeon HD 2000-family in the upcoming weeks, if AMD doesn’t push back the launch dates further.
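
    Quick aside from me (not part of the article): the 512-bit, eight-channel memory interface is easy to put in perspective with a little back-of-envelope math. The memory clock below is just a placeholder guess, since the article doesn't give one.

        # Back-of-envelope bandwidth for a 512-bit, eight-channel memory interface.
        # The memory clock is a guess for illustration, not a confirmed spec.
        bus_width_bits = 512              # eight 64-bit channels
        memory_clock_mhz = 800            # hypothetical GDDR3 clock
        transfers_per_clock = 2           # DDR signalling

        bandwidth_gbps = (bus_width_bits / 8) * memory_clock_mhz * 1e6 * transfers_per_clock / 1e9
        print(f"512-bit bus: ~{bandwidth_gbps:.0f} GB/s")   # ~102 GB/s with these numbers

        # Same data rate on a 384-bit bus, for comparison:
        print(f"384-bit bus: ~{(384 / 8) * memory_clock_mhz * 1e6 * 2 / 1e9:.0f} GB/s")  # ~77 GB/s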
  2. all i can say is DAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAMIT!!!

    320! that's obscene! go figure a 6 month paper launch would get you 320 stream processors AND physics processing! my new rig is going to be a nuclear-reactor-slash-demi-god!!!

    anyone think the 8900 will pwn this? omg man if nvidia merely thought of doubling their own stream processors they are in SO MUCH TROUBLE! WOO!

    seriously though, all benchmark numbers to date on this card HAVE to be entirely false. Just the shaders alone should cripple any game this card takes on. Should make a mockery of Oblivion on its own.

    Havok FX here i come. Also, does anyone think the physics processing capabilities could replace Ageia? As in, could we play Ageia-based games with this? I guess a representative of the company needs to confirm that?

    EDIT: WHAT IF THE SHADERS ARE STILL VEC4!?!?!!?!?
  3. Quote:


    AH $CREW THAT !!

    I want Laptop Info !!

    Oh BTW, nice find. :trophy: :mrgreen: :trophy:


    HEhe, I like emoticons...... :P
  4. Quote:
    all i can say is DAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAMIT!!!

    320! that's obscene! go figure a 6 month paper launch would get you 320 stream processors AND physics processing! my new rig is going to be a nuclear-reactor-slash-demi-god!!! :lol: :lol: :lol: :lol:

    anyone think the 8900 will pwn this? omg man if nvidia merely thought of doubling their own stream processors they are in SO MUCH TROUBLE! WOO!

    seriously though, all benchmark numbers to date on this card HAVE to be entirely false. Just the shaders alone should cripple any game this card takes on. Should make a mockery of oblivion on it's own.

    Havok FX here i come. Also, does anyone think the physics processing capabilities could replace ageia? As in, could we play ageia based games with this? I guess a representative of the company needs to confirm that?

    EDIT: WHAT IF THE SHADERS ARE STILL VEC4!?!?!!?!?
  5. ATI previously confirmed this is a 720 million transistor part. The GeForce 8800 has over 680 million transistors. That's less than 40 million more transistors to work with.

    Either the current G80 cards have a lot of their functionality disabled, or the stream processors on the R600 are not exactly comparable to a stream processor on the Geforce 8800.

    Either way, I wish they'd just release this thing so we know how fast it goes. I could be interested in buying a more high end video card soon. :)
  6. features are nice but benchmarks are key. I am still holding my breath
  7. Quote:
    features are nice but benchmarks are key. I am still holding my breath


    same here :D
  8. Nice. 320 streaming processors! I can't wait to see how it performs against the G80.
  9. I knew it was gonna be a monster but this exceeds what I thought it was gonna have by a mile...
  10. it's going to be an exciting 10 days waiting for the press release. Things will get leaked here and there, and there will be a s**t load of rubbish posted. But according to this post it would appear I will have an R600 in my next build :)
  11. OMG if that's true it's going to wallop the 8800's
  12. I've got my money just waiting for the first chance to buy this beast, with Barcelona and the new 790G. (Long overdue for an overhaul).

    Did I mention beasts!

    Kinda reminds me of when ATI was quiet, then bam! The 9700 Pro just blew everything away. Way to go AMD/ATI!

    They truly understand where the market is going. Just look at the new UMPC's, and you'll understand that AMD/ATI are in a great position.

    Just give them the little time they deserve, and I'm certain we will not be disappointed.

    It will have been well worth the wait!

    Mind you, my Dothan at 300 FSB has been pretty good to me.
  13. I say... Release the damn thing for craps sake! :lol:

    so close, yet so far...

    I hope it does justice to its extra late release.
  14. From the info we just got in this article it should make up for it but no more damn delays...
  15. Nice find. :)

    Now let's see some non 3dmark benchies!
  16. You can't compare the stream processors on the ATi cards to those of the 8800GTX. Case in point... the R600XT scores 12K in 3DMark06. So how is it that it only scored slightly better, despite having over 200 more stream processors? Since everyone is new to this design, we don't know whether the stream processors are clocked lower or just have a smaller effect on performance than originally believed.
  17. Well, according to FUDZilla (not even going to link it, it's that credible :P )

    the HD 2900 XT (512 MB GDDR3, 800 MHz core) running the newest 8.361 drivers is 10%-17% faster than the 8800GTX

    take it with a dump truck load of salt; FUDZilla said it without proof
  18. Quote:
    You can't compare the stream processors the ATi cards to that of the 8800GTX's. Case in point...R600XT Scores 12K in 3DMark06. So how is that it only scored slightly better, but had over 200 more stream processors? Since everyone is new this design we don't know whether the stream processors are clocked lower or they have a smaller effect on performance than originally believed.


    That's because the ATI card has 320 stream shaders, but the G80 has 128 unified physical shaders and a higher core clock.

    What the R600 really has are 64 unified shaders that can each run up to 5 operations per clock, versus one per clock for each of Nvidia's unified shaders.

    The problem is, performance like that is too software/application dependent. What you will realistically see is some games running 100% faster on Nvidia and some games running 25% faster on ATI. Why? Because when an application doesn't utilize the wider shader architecture, Nvidia has the brute-force unified shaders combined with core clock to destroy ATI's 64 shaders. But even when applications DO utilize the extra operations per clock cycle, they can only yield about 17% more performance than the 8800GTX because of the 8800GTX's brute power.

    This is probably why ATI has so many non-disclosure agreements all over their R600s. Notice some people say, "We sometimes see a 17% performance gain over the 8800GTX," and they also sometimes say, "But in other games this gain isn't seen."

    Flip a coin: heads = your game will be Nvidia optimized, tails = your game will be ATI optimized.
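
    For what it's worth, here's a rough back-of-envelope sketch of why the raw shader counts don't compare directly. The clock speeds and the multiply-add assumption below are my own guesses for illustration, not confirmed specs; the point is only that the 5-wide units win on paper and then need the shader code to keep all five slots busy.

        # Rough peak-throughput comparison; clocks and the MADD assumption are illustrative guesses.
        def peak_shader_gflops(units, ops_per_unit_per_clock, clock_mhz, utilization=1.0):
            """Peak multiply-add throughput in GFLOPS (2 FLOPs per MADD)."""
            return units * ops_per_unit_per_clock * 2 * clock_mhz / 1000.0 * utilization

        # G80-style: 128 scalar shaders on a fast (hypothetical ~1350 MHz) shader clock.
        g80 = peak_shader_gflops(128, 1, 1350)

        # R600-style: 64 units issuing up to 5 ops per clock (the "320 stream processors")
        # on a hypothetical ~750 MHz core clock, at two different packing efficiencies.
        r600_packed = peak_shader_gflops(64, 5, 750, utilization=1.0)    # perfect 5-wide packing
        r600_scalarish = peak_shader_gflops(64, 5, 750, utilization=0.6) # mostly scalar/vec3 code

        print(f"G80-ish peak:           {g80:.0f} GFLOPS")            # ~346
        print(f"R600-ish, fully packed: {r600_packed:.0f} GFLOPS")    # ~480
        print(f"R600-ish, 60% packed:   {r600_scalarish:.0f} GFLOPS") # ~288

    So on paper the wide design pulls ahead, but a shader full of scalar math leaves most of those 320 slots idle, which is exactly the "application dependent" point above.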
  19. I personally don't even look at 3DMark scores; they are bogus. But I do agree with you. I doubt the stream processors will be as powerful/efficient as nVidia's. I personally think we will all really have to wait to see how these cards perform in DirectX 10 games. That will be the deciding factor for me. The unified shader architectures of both of these cards should really shine under DX10.
    Sorry for the grammar mistakes.... I went to bed at 1 and got up at 6..not fun.
  20. 8O WOW, all I can say is .... Curve ball!
  21. Quote:
    I personally dont even look at 3d mark scores, they are bogus. But I do agree with you. I doubt the stream processors will be as powerful/efficient as nVidias. I personally think that we will all really have to wait to see what these cards perform like in DirectX10 games. That will be the deciding factor for me. The unified shader archetectures of both of these cards should really shine under DX10.


    We're going to see Doom 3 engine/Oblivion all over again.

    Basically, because the two companies are using such contrasting architectures in their cards, which card performs better will depend on the game.

    Think of it this way, if a game favors neither and isn't optimized one way or another, you'll see the 8800GTX really shine because that's the kind of card it is, a brute power house.

    If a game is optimized to utilize complicated shader architecture, you'll see Oblivion all over again, ATI favored across the board.

    So we're in a situation where we won't see which cards are better for the next year or so, and by then why would you buy an 8800GTX or 2900XTX when you can get a 9000GTX or 3000XTX?
  22. and this is coming from someone running a pentiumD :lol:
  23. Quote:
    and this is coming from someone running a pentiumD :lol:


    What does that have to do with anything?

    I buy what's needed for me, not what's on the latest cover of PC gamer magazine.

    Wanna know what game I play? Counter Strike: Source.

    Oblivion is boring, Flight Simulators are for nerds, Neverwinter Nights sucks, Need for Speed is about as realistic as Mario Kart and about as fun as sparring with Bernard Hopkins, Crysis is going to be a flop.

    Why do I need a Core 2 Duo and an 8800GTS in order to discuss hardware?

    Fine, if it suits you better, here are the specs for me and my friends' tester rig. We each pitch in $150-200 to build a rig for benchmarking and having fun with:

    E6600 at 3.6
    Biostar TForce 965PT (We have a Commando sitting around but we really have no reason to use it, my friend bought it with his own money so we'll probably use it for his build)
    2 GB of G.Skill HK ram (We were gonna use HZ for Micron D9 but why when binned HK over-clock to 1100 all the same)
    2x WD 160 caviar in Raid 0
    1x Seagate Barracuda 7200.10 250
    EVGA 8800GTS 320 (We sold our 640 to get the 320 since our monitor is a 19 wide)
    Thermaltake Armor full tower (We had an Antec 900 but it was a complete pain to install with)

    All on Windows XP Professional SP2. We have a Vista on the Seagate partition but we don't ever use it. We've had nothing but problems with our 8800GTS and Vista.

    If you're interested, we already broke 10k on 3d06.
  24. It's quite ironic that you even mention this, because this is exactly what DX10 was supposed to get rid of. It was supposed to "unify" the shaders and whatnot so that games could be more easily optimized to run on these graphics cards, allowing programmers to take more advantage of the power they have. Unfortunately, if your comment holds true, this is just a continuation of the game-to-game specific performance of certain graphics cards.
  25. Quote:
    Its quite ironic that you even mention this because this is exactly what DX10 was supposed to get rid. It was supposed to "unify" the shaders and what not so that games could be more easily optimized to run on these graphic cards. Allowing for programmers to take more advantage of the power they have. Unfortunately if your comment holds true this is just a continuation of the the game to game specific performance of certain graphic cards.



    Which ironically brings up the much-debated question: are nVidia's "unified shaders" truly 'unified'? Some say no, others say yes... but no one questions ATi's. So the reason it's still game-to-game is that neither company can agree on what makes a "unified shader" unified, and thus the struggle lives on.
  26. Quote:
    Its quite ironic that you even mention this because this is exactly what DX10 was supposed to get rid. It was supposed to "unify" the shaders and what not so that games could be more easily optimized to run on these graphic cards. Allowing for programmers to take more advantage of the power they have. Unfortunately if your comment holds true this is just a continuation of the the game to game specific performance of certain graphic cards.


    It will... 3D mark is DX9.. under DX10 expect nearly 100% performance advantage for X2900XTX over the 8800GTX.

    ATi is trying to pull another R300, this time with DX10. X2900XTX will be the fastest DX9 card but not by much.. just like Orton claimed. But under DX10.. it's built for DX10 and VISTA. nVIDIA will need a new architecture to compete.
  27. Quote:
    Its quite ironic that you even mention this because this is exactly what DX10 was supposed to get rid. It was supposed to "unify" the shaders and what not so that games could be more easily optimized to run on these graphic cards. Allowing for programmers to take more advantage of the power they have. Unfortunately if your comment holds true this is just a continuation of the the game to game specific performance of certain graphic cards.


    It will... 3D mark is DX9.. under DX10 expect nearly 100% performance advantage for X2900XTX over the 8800GTX.

    ATi is trying to pull another R300, this time with DX10. X2900XTX will be the fastest DX9 card but not by much.. just like Orton claimed. But under DX10.. it's built for DX10 and VISTA. nVIDIA will need a new architecture to compete.
    That's what I was thinking... I doubt it will be too much faster in DX9. But in DX10, which the card was built for, it will be a bit faster... Hopefully better drivers in Vista too.

    Now I wonder how the onboard HD audio is going to work... it sounds interesting... I guess they'll have an adapter so I can plug my headphones in it?
  28. Quote:

    That's what I was thinking... I doubt it will be too much faster in DX9. But in DX10, where the card was built for, it will be a bit faster... Hopefully better drivers in Vista too.

    Now I wonder how the onboard HD audio is going to work... it sounds interesting... I guess they'll have an adapter so I can plug my headphones in it?



    it's being said that the onboard audio is strictly for HDMI use, meaning sound cards will still be needed if you aren't using HDMI.

    I don't see why an HDMI-to-DVI connector with a digital audio plug wouldn't work for an external set of speakers, feeding their digital input from the digital out on the adapter piece.
  29. Well, nVidia probably was a little smarter than we all think with their "not so unified" shaders.

    Knowing full well that DX10 along with Vista was on everyone's minds, they released their "DX10" graphics cards well before ATi but in reality had DX9 performance as the focus. They knew they would still have plenty of time before a solid group of DX10 games was even out, allowing them to ride the DX10 "wave" while still developing a much more "unified" shader architecture for the next generation that would be, shall we say, more direct competition to ATi's unified architecture.
  30. Quote:
    You can't compare the stream processors the ATi cards to that of the 8800GTX's. Case in point...R600XT Scores 12K in 3DMark06. So how is that it only scored slightly better, but had over 200 more stream processors? Since everyone is new this design we don't know whether the stream processors are clocked lower or they have a smaller effect on performance than originally believed.


    That's because the ATI has 320 stream shaders, but the G80 has 128 unified physical shaders and a higher core clock.

    What the R600 really has are 64 unified shaders that run up to 5 operations per clock faster than Nvidia's unified shaders.

    The problem is, performance like that is to software/application dependent. What you will realistically see is some games running 100% faster on Nvidia and some games running 25% faster on ATI. Why? Because when an application doesn't utilize the faster shader architecture, Nvidia has the brute force unified shaders combined with core clock to destroy ATI's 64 shaders. But even when applications DO utilize the more operations per clock cycle, they can only yield about 17% more performance than the 8800GTX because of the 8800GTX's brute power.

    This is probably why ATI has so many non disclosure acts all over their R600's. Notice some people say, "We sometimes see a 17% performance gain on the 8800GTX" and they also sometimes say "But in other games this gain isn't seen".

    Flip a coin, heads = your game will be Nvidia optimized, tails = your game will be ATI optimized.

    ATI's 'stream' shaders are unified shaders... they are second-generation shaders compared to Nvidia's unified shaders, which are technically Nvidia's first generation of unified shaders. ATI's complex-shader approach can yield just as much of a performance boost, if not more, than brute-force shaders clocked to high frequencies. So you can't immediately pass off ATI's R600 as weak next to Nvidia's: if you try to compare them directly rather than through actual benchmarks and game performance, it's impossible to see how ATI's method could beat Nvidia's. But considering ATI's 4-5 issue shaders can do a lot more per clock than Nvidia's shaders, and knowing ATI's experience with DX10-capable chipsets, it's best not to underestimate them.

    I'm sure the R600 won't knock the socks off the G80 in DX9, but once DX10 comes into full swing, I think we'll see the true power of both chips come to light, and then we'll be able to truly judge which one is more powerful. For one, I would like to see how geometry shaders will be implemented in games and how they improve performance, considering what Crysis will probably demand.

    So considering what you've said, I just can't agree with you completely. Sure, application dependency due to driver optimisation will still be an issue, but ATI have already shown how threaded GPU architectures do bring performance benefits over the old pipeline architectures, so I feel confident their complex-shader approach will yield similar results. Though the only real way to find out is to wait for the card to be released, tested, and benchmarked.
  31. you know, my question will be: are these unified shaders 100% programmable on the fly?
    I mean, imagine using 50 out of 64 unified shaders for video, and the rest for physics?

    also, if it has 320 streams, their use should be more for video editing or stream processing, such as the petaflop thing, right?
  32. Quote:
    you know, my question will be: are these unified shaders 100% programmable on the fly?
    I mean, imagine using 50 out of 64 unified shaders for video, and the rest for physics?

    also, if it has 320 streams, their use should be more for video editing or stream processing, such as the petaflop thing, right?



    entirely software (maybe driver) dependent
  33. While everyone goes "OMG 320 STREAM PROCESSORS, THE 8800GTX ONLY HAS 128!!1" remember that the shaders on the 8800GTX are at a much higher clockspeed, and it's possible that one of the R600 shaders at the same clockspeed does not equal an 8800 shader. If the X2900XT and XTX end up kicking the shit out of the 8800GTX, I'll just jump ship and go with the new kid on the block, as Oblivion can always use more graphics power. :wink:
  34. They never said how or what it is about the card that gives it 320 stream processors. Which is great (yay, we can F@H much faster now), but streams don't mean much for gaming unless they're doing physics. Then the question arises: at what cost does utilizing stream processors alongside unified shaders come? Do you lose some shaders to gain a few streams, or is nothing lost?
  35. Why exactly is crysis going to be a flop?!? JC!

    Best,

    3Ball
  36. Quote:

    That's what I was thinking... I doubt it will be too much faster in DX9. But in DX10, where the card was built for, it will be a bit faster... Hopefully better drivers in Vista too.

    Now I wonder how the onboard HD audio is going to work... it sounds interesting... I guess they'll have an adapter so I can plug my headphones in it?



    its being said that the onboard audio is strictly for HDMI use, meaning sound cards will still be needed if you aren't using HDMI.

    I don't see why an HDMI-to-DVI connector with digital audio plug wouldn't work to use it on a external set of speakers, via digital input from the digital out adapter on the adpater piece.

    Here is something that may fit the bill for you. A bit pricy but Gefen makes good hardware.

    http://www.gefen.com/kvm/product.jsp?prod_id=3939

    We'll see how long this type of device lasts because of the whole HDCP over HDMI nonsense. I could see this going away because Toslink is not protected.

    Ryan
  37. I have 2 Gefen HDMI splitters that I use to split an HD signal from 2 receivers to four high-definition TVs. Unfortunately Gefen's tech support is not the greatest, and their product didn't work very well either. Regardless, through numerous trials I was able to get it to work. It also seems as though Gefen is really the only quality maker of units like this dealing with DVI and HDMI splitters and whatnot.

    If ATi were smart they would make their own unit that could split the HDMI signal into audio and video, since obviously no computer monitors have HDMI connectors, nor would they ever have quality speakers. Even smarter would be to package it with the 2900XT.
  38. The majority of the problems I see with HDMI are usually related to devices not having good firmware. Many HDMI sources do not sync well with devices classified as repeaters, such as switchers and splitters. There is not much companies like Gefen or any AV receiver manufacturer with HDMI switching can do about it. For example, finding a cable box that will pass through a switcher is almost impossible, at least with the cable company in my area. (Crimecast)

    This is because when HDMI was developed (mainly by Silicon Image), 100% compliance with the standard was not required to get a license. In fact, compliance is still optional. Each company has the option to put a component through the SimPlay certification process, but is not required to do so. Not that I care much for Best Buy, but fortunately, with as much influence as they have in the industry, they are using it to help with this issue. At some point in the very near future they will begin requiring all HDMI products they carry to pass SimPlay certification. This hopefully will light a fire under a lot of companies to get their act together.

    I like your idea of AMD/ATI offering a solution of their own. If it was reasonably priced, I would buy it. It would be kind of like the DVI to component video adapter they sell that works on most Radeon cards.

    Of course none of this would be necessary if it were not for stupid DRM solutions like HDCP.

    Ryan
  39. Quote:
    and this is coming from someone running a pentiumD :lol:


    Wanna know what game I play? Counter Strike: Source.

    Flight Simulators are for nerds, Crysis is going to be a flop.


    Unless you were mucking about, that's just a tad daft.
  40. Seriously, you don't think Nvidia has got an answer to this? They've been kickin' back makin' a run with the 8800's (and a good one at that). Once the R600 shows up they're going to counter with the 8900's, and I'll sit back and watch the lemmings drop where they may. After the smoke clears, I'll pick up the best card for the buck, and you'll have whatever their marketing scheme sold you on. Think about it :D gl.
  41. Quote:
    Seriously,you don't think Nvidia has got an answer to this?They've been kickin back makin a run with the 8800's(and a good 1 at that). Once the r600 displays they're going to counter with the 8900's,


    What was their reply to the X1800? The lame GTX-512.

    Don't bet on any unreleased products, neither the R600 nor the GF8900. And with the potential for a 65nm R600 refresh there's another possible reply in the wings, so what will and will not be a worthy reply has yet to be seen.

    Quote:
    After the smoke clears ,I'll pick up the best card for the buck,and you'll have whatever their marketing scheme sold you on.Thinkaboutit


    Sounds like you're buying into the marketing. The first part makes sense, but to think that one or the other company isn't using a marketing scheme means you're just another lemming of a different colour.

    The only smart thing you said is to 'pick up the best card for the buck' PERIOD.
  42. 64 shaders that can each do 5 operations per clock have the same computing power as 320 simple shaders.
  43. The GTX 512 was a true powerhouse at its time of release; it swept the board against both the vanilla 7800GTX and the X1800XT. Unfortunately for Nvidia, because the X1800 had been delayed, ATI was already done with the X1900XTX. Nvidia also had the 7900GTX done by then, but for all of its muscle it still fell short of the X1900 IMHO.

    I also think last generation is being mirrored in this current generation: Nvidia got the first cards of the generation out first, and ATI's were delayed again and again. If the pattern is followed again, the HD2900 will compete with the 8800GTX, Nvidia will release a new card (Ultra or 8900?), then ATI releases its real powerhouse of the generation (XTX or 2950?), which will hold its own until Nvidia's 9800 release.
  44. The true Qualified card will be known in a couple of months as far as someone wanting to wait that long,don't try to act intellegent it insults what little you have.
  45. Quote:
    I agree, I'd like to see if nvidia is actually doing anything to the g80 core for the 8900gtx besides clock it up higher, since that's all they did pretty much with the 7900gtx


    Actually I was talking about the mythic GTX-512, you may be confusing numbers.

    Quote:
    WTF are you talking about? both the 512 7800GTX and 7900GTX were damn fine cards and traded blows with both the X1800XT and X1900XT.


    I think you need to look at them again, the GF7800GTX-512 was overpriced and underpowered in anything but the older games, and as time went on it was worse. The GF7900GTX was a great refresh and the one truely needed to deal with the X1800, but the X1900 did beat it too. So it's not unreasonable to think the same thing could happen again.

    Don't get me wrong, the GF7800 when it launched was awesome, but the GTX-512 overclock was still a weak card compared to the X1800XT it was trying to best, and at best it matched that card; then when the X1900 came along shortly after, nV never regained the throne until they launched the twin-card GF7950GX2. All I'm saying is that there is no guarantee as to whether a response will be worthwhile or not.
    Looking at another 2 examples, look at the X1600 it came out well before the GF7600GT, then the GT comes out and destroys the X1600, causing them to cripple the X1800GTO to try and help, and then ATi brings out their X1650Pro refresh and it still gets chumped by the GT, so they finally bring out a completely different card in the X1650XT which finally takes the crown back for the 7600/x16xx duel, even though by then the X1900GT was easily dominating that price range with it's low $115 price. Also look far enough back to the FX5600 rev1 and rev2, and even the FX5700, none of them were able to best the competition they were up against. Or even look at the GF7300 failure to reply to the X1300, leading nV to use crippled GF7600 cores to make the GF7300GT to compete. The main thing is with the GF8800 and R600, there is no higher card for the companies to cripple to help, so I don't trust the unsupported belief that whatever ATi launches nV will be ready. If the R600 were twice as fast would they be ready? It won't be, but that would test that blind faith.

    Here's the thing: I don't expect the R600XT to be overall better than the GF8800GTX in CURRENT games; the potential benefits are in the future. I suspect it launches below the GF8800GTX in performance in games like FEAR, BF2, COD2, COH, etc. The only game where I see a potential noticeable gain is Oblivion (especially with mods), but the overall performance picture should be 60/40 or worse in favour of the GTX. And while I understand people expect the GF8900 to reply to the R600, unless nV switches processes it is unlikely anything that amounts to no more than an OC will compete with the 65nm refresh of the R600, which should be cheaper to produce than even an 80nm G8x refresh. So unless nV has their 65nm part ready to launch, I think ATi's next refresh, which is expected by summer, gives them the speed and cost advantage.

    But that's speculation on my part, and I don't expect anyone to see it that way or wish to push it on anyone. However, I do want to point out that being given enough time to prepare your reply doesn't mean it will be a good one, just like the GF7800GTX-512 wasn't up to the job, either for performance or price. And both companies have that history; like I said, look at the X1600 for ATi's practice of that.

    That's just my 2 frames' worth as always.
  46. why...why do you have to post such long replies all the time. Im too lazy to read it all :P
  47. Quote:
    don't try to act intellegent it insults what little you have.


    I'd almost be tempted to reply, but I'll simply point out that you obviously lack intelligence when you misspell the word while trying to insult someone about theirs.

  48. Quote:
    why...why do you have to post such long replies all the time. Im too lazy to read it all :P


    Because I pay my ghost writers by the word and they have families to feed.

    So to boil down my above L o n g post: not all replies are successful. Like the GF7800GTX-512, the GF7300, the X1650, and come to think of it the X700, many fall short.