RV670 vs everything else

I just saw some numbers at http://forums.vr-zone.com/showthread.php?t=201700 , and it led to a small analysis. Let's compare the die areas of recent nVidia and ATi products:

G92: 315mm²
RV670: ~180mm²

The numbers say two things:
1. Theoretically, ATi can get 7 dies where nVidia is getting 4!
2. By die area, the RV670 doesn't even belong in a comparison with the G92, but with the GeForce 8600 series (169mm²) and the HD 2600 (153mm²)!

If the yields for the RV670 are good, AMD must be very happy: the competing product, the G92, cannot be cheap because of its large die size. So AMD has a chance not only to make a good profit in the performance segment, but also to use lower-quality chips to own the mainstream :) What do you think?
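A rough sanity check of that 7-to-4 claim (a sketch only: a 300mm wafer is assumed, and edge loss, scribe lines, and yield are all ignored):

```python
import math

# Naive dies-per-wafer estimate from die area alone.
wafer_area = math.pi * (300 / 2) ** 2    # 300 mm wafer -> ~70,686 mm^2

for name, die_area_mm2 in (("G92", 315), ("RV670", 180)):
    print(f"{name}: ~{wafer_area / die_area_mm2:.0f} candidate dies per wafer")
# -> ~224 vs ~393: almost exactly a 7:4 ratio, since 315/180 = 1.75
```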
  1. I think you're right! Anyone else?
  2. Why are you comparing the 8800gt to a chipset that will probably cost 2x more and is a high end piece if your comparing that to the gt then youve already lost the point the 8800gt isnt nvidias big release and if it even compares to the 3800 then amd/ati has already screwed up, all they have put out lately are loud slow and expensive gpu's........ when nvidia pumps out the next gtx and ultra we will compare the 3800. Dont beleive the hype.
  3. That's interesting. I guess using a smaller fabrication process and having fewer transistors makes the RV670 a good bit smaller than the G92. I suspect that these 38XX cards will be a great value.
  4. What good would die size do for the consumer if the card is not competitive? Of course, we won't know for sure until the benchmarks come in. What's the purpose of this post, anyway?
  5. vaker5, the R680 (not the RV670) is the competitor to Nvidia's next big card; the RV670 should be the competitor to the cards derived from the G92 (the 8800GT and whatever else comes out).
  6. vaker5 said:
    Why are you comparing the 8800gt to a chipset that will probably cost 2x more and is a high end piece if your comparing that to the gt then youve already lost the point the 8800gt isnt nvidias big release and if it even compares to the 3800 then amd/ati has already screwed up, all they have put out lately are loud slow and expensive gpu's........ when nvidia pumps out the next gtx and ultra we will compare the 3800. Dont beleive the hype.


    Sorry, no offense Vaker, but who taught you English/grammar? Because that was probably the most confusing and poorly phrased post I have ever seen in my life. And the 8800GT is not a precursor to a new high-end GPU, hate to burst your bubble. The new high-end nVidia makes will be a completely new line. Or at least, that is what we have been told. ATi made no mistake. With a die size like that, if the HD 3870 gets even close to the 8800GT in performance, it will massacre it in price/performance...
  7. Such a small die will definitely help AMD's profit margins, but if the rumours are true about it being ~20% slower than the 8800GT, IMO they could have earned even more if they had pumped in, say, 40% more transistors to reach 250mm². Faster than the GT and still cheaper to make, and they could have charged more for it than the GT.
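    A quick check of that 250mm² figure (a sketch only; it assumes transistor count scales linearly with die area):

    ```python
    # How much extra area (and, at constant density, how many extra
    # transistors) a 250 mm^2 die offers over the ~180 mm^2 RV670.
    rv670_area = 180
    target_area = 250
    print(f"Growth: {target_area / rv670_area - 1:.0%}")
    # -> ~39%, consistent with the "40% more transistors" estimate above
    ```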
  8. Die size helps with manufacturing costs, but not that much in raw performance. I don't think a die shrink alone is really going to help them that much.

    On a side note, it's interesting how AMD's GPU die is now smaller than their CPU die. Sorry, but I just couldn't help but laugh at that.
  9. Harrisson said:
    IMO they could have earned even more if they had pumped in, say, 40% more transistors to reach 250mm².


    Yeah I agree, I think they would've done better to improve the back end by adding ROPs and TMUs, but that increases development costs, while keeping the basic R600 design means less time, effort, and expense on a redesign.

    At some point they probably had a sit-down and decided the design direction; the question is whether that was before or after they knew the performance of the early R600s. If this design was set in early '07 it's not really surprising they didn't add much; otherwise they had to pick between two paths: improve R600 performance beyond clock-speed boosts, or reduce costs.

    Also, if you think of the spinoffs of this chip, it's probably more mobile-friendly than any iteration with more transistors; I wouldn't be surprised if this ends up in the high end of their mobile line, while to me the GT still looks questionable for a direct mobile port.
  10. TheGreatGrapeApe said:
    Yeah I agree, I think they would've done better to improve the back end by adding ROPs and TMUs, but that increases development costs, while keeping the basic R600 design means less time, effort, and expense on a redesign.

    At some point they probably had a sit-down and decided the design direction; the question is whether that was before or after they knew the performance of the early R600s. If this design was set in early '07 it's not really surprising they didn't add much; otherwise they had to pick between two paths: improve R600 performance beyond clock-speed boosts, or reduce costs.

    Also, if you think of the spinoffs of this chip, it's probably more mobile-friendly than any iteration with more transistors; I wouldn't be surprised if this ends up in the high end of their mobile line, while to me the GT still looks questionable for a direct mobile port.


    I agree with you; maybe a couple more ROPs would help. Is there any possibility that AMD could do something similar to Nvidia by clocking the shaders higher than the rest of the core? Could that play a future role in evening out the performance a bit between AMD and Nvidia?
  11. you guys only see ROPs and clocks - not the fact that this processor is a whole new design - most of it (90%+ in fact)

    wait and see - if it's good you can buy it and **** on those below you, or vice versa - as this seems to be the culture of intel, amd, ati, nvidia supporters - well, the majority at least
  12. YOU THINK YOU THINK BLA BLA BLUR BLUR BACK END - HA HA - aaaa HAHAHA - who are you my clever man - to think - to think with the master minds behind all of this -

    U think waayyy to simple me friend
  13. wernerdb2 said:
    YOU THINK YOU THINK BLA BLA BLUR BLUR BACK END - HA HA - aaaa HAHAHA - who are you my clever man - to think - to think with the master minds behind all of this -

    U think waayyy to simple me friend

    :heink:
  14. What a shame - is that how you looked when you flunked maths?
  15. ...what randomizer said...
  16. Is it just me, or did a good discussion come to a very bizarre end here? -_-
  17. LOL
  18. wernerdb2,

    Please just go away.

    Thanks
    Mike
  19. NA - what I would like to say, being decent for a change, is: have a bit of faith - everything's not about clocks and ROPs to get better performance - I have 8800 Ultras in SLI and I just know that ATi has something huge coming up - you can feel it in the air - with the GTX being nicked by an 8800GT, my Ultra SLI rig barely beats two 8800GTs in SLI - in bloody game performance, not synthetic testing - and this is pure clocks we have here
  20. na, sorry guys - mind my manners - like I said, I'm just 23 - sorry Mike
  21. wernerdb2 said:
    you can feel it in the air


    That little bit there reminded me of Lord of the Rings -_- :lol:


    And now I should probably stop spamming.
  22. I'm angry with both ATi and nVidia - it's like these companies don't have respect for their users when it comes to cards - I was a fool who bought two 7950GX2 cards for my machine, and it got beaten by a bloody 8800GTS 320MB in game performance - ATi on the other hand has driver issues in many new titles, with, I think, some angry or disappointed fans

    and now again with nVidia - 8800GT in SLI compares with 8800 Ultras - the price tag - it bites in the end - and each time we choose another team, or get treated OK for a while or so -

    Mike - you're 40 - is this how life will go on forever?
  23. The 8800GT will beat the HD 3800s on performance, but the 3800s will try to compete on price (and the silly DX10.1 thing). The G92 GTS with 128 shaders enabled will blow anything in the 3800 line away, starting around 300 USD for the 512MB and 400 USD for the 1GB versions. The G92 GTS will go on to be THE big enthusiast card for Christmas.
  24. I mark you on this one mithcellvii -- you picked your side, with your resources on the net - the final product will blow up in your face with 10.1 all over you -

    How can you be so sure - what - because of clocks and shaders? Have you seen the HD3870 core in action? They're so secretive, and this makes it a haven for fake reviews - it's simple

    Time shifted to ATi this time with their new superior tech - it's how it goes - and now ATi has their turn coming - remember this day if you're serious about your quote

    It's like AMD now, Intel later, or vice versa - we think we can be ahead, but in reality we were always behind (including in leaked specs)

    When you hear GPU, you think ohh OK, OK, MHz = 1000, when you don't know anything about the pros and cons of their designs
  25. The 8800GT beating the HD3870 - well, I assume the HD3800s you quote means all of them - no way in hell
  26. I guess we should find out in a couple of weeks. I will point out, though, that based on the recent past, AMD's/ATI's performance seems to be inversely proportional to the marketing hype preceding it.
  27. If ATI/AMD is releasing on 11/15... that's next Thursday! I'm holding off on a bunch of purchases till this situation is out on the table. Personally, I'm hoping for anything that will bring two things: lower prices across the board, and renewed competition. You know, like the olden days.
  28. Yes, competition is a good thing, but I can do without the floptimization scandals from that era, which almost reared their head again.

    If both compete in different ways it should be good for the consumer. Right now it's the low & mid-range, and maybe eventually there'll be some high-end competition again.
  29. TheGreatGrapeApe said:
    Yeah I agree, I think they would've done better to improve the back end by adding ROPs and TMUs, but that increases development costs, while keeping the basic R600 design means less time, effort, and expense on a redesign.

    At some point they probably had a sit-down and decided the design direction; the question is whether that was before or after they knew the performance of the early R600s. If this design was set in early '07 it's not really surprising they didn't add much; otherwise they had to pick between two paths: improve R600 performance beyond clock-speed boosts, or reduce costs.

    Definitely, it's easier to just shrink the die and add UVD than to do a major redesign of the core. Let's run our imagination a bit: with plenty of transistor space, it shouldn't be that hard to make a new refresh, RV680, that tackles all the original design's shortcomings. If the ATi design is anything like AMD's current "lego design" style, adding 2x the current ROPs and TMUs, though not trivial, is really doable, and they probably wouldn't even reach the 8800GT's die size, while such a video card (a 3900+ series) would blow the 8800 Ultra out of the water. The build cost would be 2x higher than the 3870's, plus the redesign cost, but AMD could charge as much as 4-5x more, plus get sweet publicity as performance king.

    Why isn't AMD doing this? IMO it's strictly a resource shortage: engineers are already spread thin over loads of projects, and AMD, swallowing its pride at being a permanent 2nd lately, is devoting 100% of its attention to the bread-and-butter segments and ignoring the 5% high-end market completely. Although for enthusiasts it's not so good, IMO it makes sense at this moment for AMD to strictly focus on survival instead of chasing the crown of the fastest badass ;)
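    For what it's worth, a toy version of that margin argument (the 2x cost and 4-5x price ratios come from the post above; the assumption that the 3870 sells for twice its build cost is invented purely for illustration):

    ```python
    # Hypothetical per-card profit of the imagined big-die "RV680".
    hd3870_price = 1.0             # normalize the 3870's selling price
    hd3870_cost = 0.5              # assumed: builds for half its price
    rv680_cost = 2 * hd3870_cost   # "build cost would be 2x higher"

    for mult in (4, 5):            # "could charge as much as 4-5x more"
        rv680_price = mult * hd3870_price
        ratio = (rv680_price - rv680_cost) / (hd3870_price - hd3870_cost)
        print(f"At {mult}x the price: {ratio:.0f}x the per-card profit")
    # -> 6x and 8x, before subtracting the one-off redesign cost
    ```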
  30. Ever since AMD took over, it seems like things have slowed down for their whole company. I mean the lack of Crossfire mobo selections, bad drivers, and super-hyped disappointments. If they do it again with the 3800, a bigger hit is likely to come. Nvidia has their head on straight!!!
  31. I think AMD is keeping the die size small because they are trying to make up lost ground with the R680 or R700. I think they are just trying to ship a great card that is cheap to make, so they can earn good money while working on their next major lineup. That's just my 2 cents though...
  32. By the way, the super-hyped disappointments were in NO way ATi/AMD's fault. Who created the hype? No one but ATi fanboys; not ATi. And if I remember, there has only been ONE super-hyped disappointment (from ATi), and now everyone is taking the let-downs as a constant track record for ATi when in reality we know this is not the case. So why don't we lay off them for a while, huh? (Bet you don't know where my loyalties lie ;))
  33. Or they are keeping it small so they can fit two on the same PCB while still keeping thermals at reasonable levels... here's a rumour about the R680:
    http://www.nordichardware.com/news,7026.html
  34. monsterrocks said:
    So why don't we lay off them for a while, huh? (Bet you don't know where my loyalties lie ;))


    Yahoo, another fellow MATROX fan !! :sol:
  35. Kari said:
    Or they are keeping it small so they can fit two on the same PCB while still keeping thermals at reasonable levels... here's a rumour about the R680


    We know about that, but there are limitations to a two-chip solution that are not present in a more robust single-chip solution. Adding 25% more transistors for more ROPs and TMUs, or components thereof, would not increase the thermals much and would greatly improve the performance per wafer (i.e., one chip doing the job of two chips each 75+% as large).

    The multi-chip solution works OK, but if the bus width is limited to 256-bit (memory)/512-bit (ring), then you are stuck on that end, and need much more expensive/exotic memory to make up for the shortfall at the higher settings/resolutions of these top-end cards.

    I appreciate the more modular plan, but like the GX2, it seems less than ideal, and shifts the complexity and cost from a single chip to two chips, faster memory, a likely equally complex PCB, and much more complex driver support.

    Anywhoo, I think everyone was just anticipating some fixes to the existing design, while early rumours see no such changes.
    I also think if AMD had made a fast and cheap 240-SPU variant, regardless of performance, it would be easier to talk about the efficiencies and other possibilities; but since it has very similar characteristics to the R600, it easily attracts such a comparison.
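    To put rough numbers on that one-big-chip vs. two-smaller-chips point (a crude sketch: 300mm wafer, edge loss and yield ignored; the 225mm² figure is just the RV670 plus the 25% mentioned above):

    ```python
    import math

    # Crude candidate-cards-per-wafer comparison.
    wafer_area = math.pi * (300 / 2) ** 2   # ~70,686 mm^2

    big_die = 180 * 1.25                    # single chip, +25% transistors
    dual_dies = 2 * 180                     # two RV670-class dies per card

    print(f"Single-chip cards per wafer: {wafer_area / big_die:.0f}")
    print(f"Dual-chip cards per wafer:   {wafer_area / dual_dies:.0f}")
    # -> ~314 vs ~196: ~60% more cards per wafer from the single chip
    ```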
  36. http://www.fudzilla.com/index.php?option=com_content&task=view&id=4086&Itemid=1

    Maybe it has already been posted somewhere, but if the HD3870 is going to list for $249, maybe it's better than we expect. Either that, or it's not, and that's why they will focus attention on putting sub-$200 3850s on the shelves instead.
  37. TheGreatGrapeApe said:
    Yahoo, another fellow MATROX fan !! :sol:


    Now that's not true; I am not a MATROX fan. I am more of a fan of people like you who post great information on these forums... I would not be a fan of a fanboy like myself. :)

    Edit: Wow, that sounded kinda weird, didn't it? I like people who post relevant and helpful information on forums, not people who bash everyone just because their CPU is 25MHz faster than everyone else's. :kaola:
  38. HD3870 is 10% faster than the HD2900XT

    HD3870: 775/2250MHz
    Power rated at 105W
    TSMC advanced 55nm process

    Estimated 3DMark06 score: 12789.7

    link http://translate.google.com/translate?u=http%3A%2F%2Fchiphell.com%2Fviewthread.php%3Ftid%3D11822%26extra%3Dpage%253D1&langpair=zh%7Cen&hl=de&ie=UTF-8&oe=UTF-8&prev=%2Flanguage_tools

    SRP US$200 to US$230
    Looking at price, power & performance, it's value for your pocket
    Tick tock, tick tock, one more week to go to test the performance of the HD3870
    Just wait till Nov 15
    Enjoy the new toy...
  39. I have to correct you - you mean 45% at minimum -
    that's faster than all cards
  40. Preliminary test from VR-Zone

    3DMark06 score 11,669
    Idle temp 47°C
    Can mix with a 3850 for Xfire
    3850 idle temp 30°C - it's pretty low

    http://forums.vr-zone.com/showthread.php?t=202510
  41. Summary: low temps, low power, low noise, low price, but allegedly ~7% lower performance than the 8800GT ("In comparison, 3870 scored 11669 while 8800GT scored 12495"). Lots of potential for Xfire.
  42. Overclocks to ~13K in 3DMark06 at 61°C on the stock cooler. This could be better bang-per-buck than the 8800GT whilst drawing less power and running quieter.
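    For reference, the gap implied by the two scores quoted above works out like this:

    ```python
    # Percentage gap between the quoted 3DMark06 scores.
    hd3870, gt8800 = 11669, 12495
    print(f"3870 trails the 8800GT by {(gt8800 - hd3870) / gt8800:.1%}")
    # -> ~6.6%, i.e. roughly the "7% lower" figure above
    ```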
  43. Should be an interesting card, but it doesn't look revolutionary based on this presentation/specs:

    (Presentation):
    http://www.madboxpc.com/contenido.php?id=5430

    Imageshack link to the specs, since that other site is kinda slow (click if trouble loading above):
    http://img225.imageshack.us/img225/1782/5uitxoe3.jpg

    At stock speed I don't see a dramatic enough difference from the HD2900 to even consider the possibility of a GT challenger in the areas where it's already losing noticeably, but higher clocks and Xfire may make the card interesting. And if they can get any kind of volume and truly hit at least their MSRP, if not lower, then it will also pose a price/performance question for the GT, given the GT's low availability and price-gouged parts.
    However, if the HD38xx and 8800GT end up at similar prices it will be tough for AMD, but if they price them right, then they could get some good sales (not going to guess which way at this point).
  44. Quadfire in action :)
    http://www.youtube.com/watch?v=A3jpG3rv4zI

    BTW, it was 4 x 3850; for the 3870 there is only one Quadfire-capable MSI board available atm for dual-slot cards.
  45. AMD's own HD38x0 internal benchmarks. It doesn't look like it will be a match for the 8800GT, but it should end up cheaper.

    For those suggesting that eight GPUs (four R680s) could be used together, this seems unlikely. It would be incredibly difficult to write drivers that take proper advantage of eight GPUs. I doubt that either Nvidia or ATI will go beyond four-GPU support.
  46. @Mandrake, there is no question it's very hard to write eight-GPU drivers; both nVidia and ATI have issues with dual GPUs, to say nothing of eight. But IMO if they make the drivers scalable (they have to while making Quadfire), they aren't that far off eight as well.

    No idea if they'll ever do it, but since the R680 is dual, and they are launching quad-slot motherboards as we speak, it makes little sense to use only half the capacity.
  47. Latest results from "Crysis" benching

    HD3870, HD3870 Crossfire & 8800GT

    HD3870 Crossfire 3DMark06 scores 20,383 - huge, more than 60%

    From 3DMark06 and Crysis, it looks like the nVidia 8800GT is better than the HD3870. However, there is one thing we need to tell you:

    we found the ATi HD3870 actually has better picture quality during the Crysis bench.

    Especially the light refraction from water: the HD3870 is very smooth, but the 8800GT is not. It is around frames 500 to 1000.

    If you have the chance, compare them. You will know what we are talking about.

    We are not sure of the price for the ATi HD3870 yet, but it should be less than the 8800GT.

    One more thing to wait for:
    the release of ATI Catalyst 7.11, which will get better results, I suppose

    Observation
    ATI 3870 vs Nvidia 8800GT
    The card favored is listed on the right, based on recent reviews:

    1. 3DMark06 - 8800GT
    2. Low temps - 3870
    3. More energy-efficient power consumption - 3870
    4. Crossfire - 3870
    5. Image quality based on the Crysis test result - 3870
    6. Smooth running of Crysis - 3870
    7. Recently announced selling price of $200 to $230 - 3870
    There are other features that are in favor of the 3870

    The 3870 is the clear winner & value for money at this moment.
    So, I would consider the 8800GT overpriced now.

    http://www.iax-tech.com/video/3870/38703.htm
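    As an aside, the "more than 60%" Crossfire gain can be read against either baseline quoted in this thread:

    ```python
    # Crossfire scaling from the 3DMark06 scores quoted in this thread.
    xfire = 20383
    single_3870 = 11669
    gt8800 = 12495

    print(f"vs. a single 3870:   +{xfire / single_3870 - 1:.0%}")  # ~ +75%
    print(f"vs. a single 8800GT: +{xfire / gt8800 - 1:.0%}")       # ~ +63%
    # Either way, comfortably "more than 60%"
    ```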
  48. Well, let's face it... Nvidia flooded the market with DX10-only cards, all the while knowing DX10.1 was only months away. Hundreds of thousands of consumers who didn't do their research bought them way before they were really even needed, or usable with all their benefits, on WinXP/DirectX 9 machines. ATI released way-below-standard cards that, admit it, were not suitable for comparison.
    In a sense they both screwed us... now both companies are going to release what they should have in the first place!!! I'd like to stick my finger in both ATI's and Nvidia's CEOs' eyes. If Crysis is what all the new games are going to require... and newer ones will require more GPU power... let's hope one of the manufacturers will give us a decent card for the money... but don't count on it.
    Admit it, they both gave the consumer the shaft. And don't go by the hype or projected benchmarks that are out there now from any site. The old saying "THE TRUTH IS IN THE PUDDING" will be the only way we will know. I know it's been a wait... no one is more PI**ED off than I am... but let the new cards come out, be tested on real games, and stop arguing about what one site or so-called expert is telling you. Believe me, running my current games slower and lower than I can stand is burning my knickers too... but I won't give either company hundreds again for a lame card. I was hoping Intel, vamping up their video staff, would release something around the same time... but alas, I have heard nothing more. I really don't trust or like either firm...
  49. Steady on, old chap! Don't forget the DX10 cards are oodles more powerful than the previous generation's top-of-the-line models. Even the lowly 320MB GTS knocks the socks off a 7900GTX or X1950XTX. Current DX10 failings are more down to the software side of things, I would tend to say; hence the collapse in frame rates and the small visual-quality improvements when enabling DX10 over DX9. Things like Lost Planet and Call of Juarez are, in this respect, a very poor joke.

    ...you mean "the PROOF is in the pudding". Good day to you, sir!