
G80 going to be outdated soon enough?

Tags:
  • Nvidia
  • Games
  • Graphics
December 5, 2006 8:11:34 PM

I know this article is from the Inq, but it still makes a valid point.

Article

Does this mean that people who bought the G80 now got pretty ripped off?

So far there are no DX10 games out. However, by the time the G84 is out, there will be.

Just creating some more controversy.


December 5, 2006 8:31:47 PM

Not really. The G80 is the high-end chip, while the G84 is the more mainstream, budget-oriented chip. It's like comparing the 7900 to the 7600: the 7900 will beat the 7600 and perform better, but that performance comes at a price (it costs more). So no: the G80 will outperform the G84, and once supplies stabilize you'll find that the situation is more akin to the 7900 vs. the 7600.
December 5, 2006 8:32:33 PM

The article says nothing about the G80 getting "outdated"; the Nvidia naming scheme just works like that. The G80 is high-end and the G84 is mainstream.
December 5, 2006 8:33:31 PM

Oh. I thought the G84 was a refresh of the G80, just with mainstream cards being made as well, not just high-end ones.
December 5, 2006 8:39:46 PM

Yeah, the Inq says the G84 is a mainstream part.
December 5, 2006 8:41:52 PM

Boo.
December 5, 2006 8:42:22 PM

So what happens when Nvidia reach the 10k naming mark?


New Nvidia 16000 and 16500 chips arrive Monday

ATI: "psht, we were there 4 years ago"
December 5, 2006 8:43:37 PM

Ha. I think people should wait for the R600. Just my opinion.
December 5, 2006 9:07:28 PM

Is there anything that makes you believe the R600 will be better? Besides hearing that ATI and Nvidia are approaching this in two different ways, I haven't heard anything about performance or benchmarks. I would love to hear your input, as I honestly haven't heard anything :p
December 5, 2006 9:23:28 PM

He didn't say R600 would be better. He said people should wait for it.

It's a good idea, too. If it isn't better, you can still buy an 8800. If it is, then you can get R600 instead. It's a little hasty to be buying such an expensive product when the competition's releasing their equivalent so soon.

Right now, the reasons to get an 8800 are:
1) ZOMG I must have the very best RIGHT NOW!
2) If you think it's better than the R600 anyway (no real evidence either way)

The reasons to wait for the R600 are:
1) I can't take full advantage of the hardware anyway (No DX10 games)
2) The R600 might be better or worse; it's good to be able to consider all the options, instead of having only one option to choose from
3) R600 will be out really soon. Waiting another month or two won't kill me
December 5, 2006 9:26:29 PM

Unless, say, your computer explodes and shrapnel that otherwise would have been blocked by the monstrosity that is the 8800 hits you in the head, killing you instantly.


Sometimes it's best to buy now, to save your life.
December 5, 2006 9:34:58 PM

So far no one really knows anything about the R600. However, because it is coming out after the G80, I think it can improve on its shortcomings.

The only thing I know the R600 has over the G80 is that the R600 uses GDDR4 RAM, not GDDR3.

Here is some info on the R600. You will have to find your own specs on the G80 to do the comparison.
December 5, 2006 9:48:40 PM

"the R600 combines the X2 technology we employ in our AMD processor line chips and effectively creates two x1950xtx in one chip. We plan on having dedicated ram busses supporting DDR4 at a maximum amount of 2GB for the enthusiast market, and 1GB standard. The watt consumption is 615watts per card and to prevent any necessary upgrades, they have an external power brick. More news to follow."
December 5, 2006 10:14:06 PM

Quote:
So what happens when Nvidia reach the 10k naming mark?


New Nvidia 16000 and 16500 chips arrive Monday

ATI: "psht, we were there 4 years ago"


Well at 10k, the counter is reset to 0.
December 5, 2006 10:20:45 PM

B.S.
December 5, 2006 10:41:39 PM

Quote:
So what happens when Nvidia reach the 10k naming mark?


New Nvidia 16000 and 16500 chips arrive Monday

ATI: "psht, we were there 4 years ago"


Well at 10k, the counter is reset to 0.

Nah, they'll follow ATI and throw a letter in there. X is taken, so I suppose we'll get Y from Nvidia.
December 5, 2006 11:04:00 PM

Quote:
So far no one really knows anything about the R600. However, because it is coming out after the G80, I think it can improve on its shortcomings.

The only thing I know the R600 has over the G80 is that the R600 uses GDDR4 RAM, not GDDR3.

Here is some info on the R600. You will have to find your own specs on the G80 to do the comparison.

The memory generation doesn't mean it's going to be faster; the X1950XTX uses GDDR4 and the 8800GTX is usually around twice as fast as it.
December 5, 2006 11:17:07 PM

Not sure I'd say this is a bad time to buy a high-end card in general. It depends on the person and their upgrade needs; maybe they're building a new system and want the current best.

There is always someone, especially in the GPU market, coming out with a one-up card or technology. To say there's no reason to get the G80 until the R600 arrives is almost half a generation's wait in GPU time, assuming the R600 doesn't come out until Feb-March, which most sites estimate is the expected arrival time.

I for one am not made of money by any stretch, but I may very well buy an 8800GTX before the R600. I won't feel ripped off if I make that choice, because I know how the market goes, and waiting for the next thing 2-3 months down the road will trap you in an endless cycle of waiting.

Anyway, there are certainly good reasons to wait and see for those not in a hurry right now. For those that are, knock yourself out and enjoy that new card, because the best reason to get one is that it blows everything else out of the water right now, DX10 games or not.
December 6, 2006 1:26:45 AM

My bad, I should actually read these things sometimes, hehe.
Yeah, I totally agree with waiting for it. At the very least, it should get cheaper once the R600 comes out. Ahhh... my new computer will be built in Feb. Should be an exciting time for me!

Quote:
He didn't say R600 would be better. He said people should wait for it.

It's a good idea, too. If it isn't better, you can still buy an 8800. If it is, then you can get R600 instead. It's a little hasty to be buying such an expensive product when the competition's releasing their equivalent so soon.

Right now, the reasons to get an 8800 are:
1) ZOMG I must have the very best RIGHT NOW!
2) If you think it's better than the R600 anyway (no real evidence either way)

The reasons to wait for the R600 are:
1) I can't take full advantage of the hardware anyway (No DX10 games)
2) The R600 might be better or worse; it's good to be able to consider all the options, instead of having only one option to choose from
3) R600 will be out really soon. Waiting another month or two won't kill me
December 9, 2006 3:25:56 PM

I agree with Talon; there's always gonna be something better just a few months away. Who knows, maybe when the R600 comes out in Feb there will be a GeForce 8900 coming in June. At some point you probably don't want to keep waiting for the next best thing, because there will always be a next best thing. I guess each person needs to make up his own mind.
December 9, 2006 3:38:23 PM

Actually, both companies said they will stop building video cards in May 2007. So really, there will not be anything better.
December 9, 2006 4:06:59 PM

Quote:
The memory generation doesn't mean it's going to be faster; the X1950XTX uses GDDR4 and the 8800GTX is usually around twice as fast as it.


GDDR4 does clock faster and is more energy efficient (always good in this day and age; a 1950 equipped with GDDR4 performs some 30% faster than one with GDDR3, or so I've read). I think what you're referring to is the GPU core, and the 8800 will be faster even with GDDR3, simply because it is a brand-new core architecture that does things a little differently, while the 1950 core is a revision of an architecture that has been out for a while now.
December 9, 2006 4:17:36 PM

Anyone who got the pre-Christmas nVidia GeForce 8800 cards will be quite annoyed by March/April/May 2007.

Imagine another +40 to +60% jump in performance (depending on settings).

Also imagine the power of 2x X1900 XT cards on one GPU, with GDDR4 at 2.4 GHz to boot, because that is basically the level of performance they'll be competing against.

This is why I am holding off.
December 9, 2006 4:45:11 PM

Aren't you guys the least bit excited over what's just been said? If it has the power of two X1950XTXs, has anyone thought it might be a dual-core X1950XTX with 64 4-way shaders each, with GDDR4? You can forget the arms race if ATI has even coughed at a revision of that core with speculative stats like these. Who knows, it could be what I just said, and ATI is holding back a huge surprise just like Nvidia did to us.
December 9, 2006 5:54:58 PM

I don't think it'll have 2 x 256-bit GDDR4 at 2.4 GHz for just 'one much more advanced core on a die shrink'.

A switched 512-bit interface, maybe.
256-bit x 2.4 GHz GDDR4 is far more likely, but that is still 76.8 GB/sec connected to a GPU with a 512-bit ring bus memory controller.

(ATI get better utility out of what VRAM throughput they have than nVidia, IMHO; this does not mean their cards are always faster, however.)

Considering it will not be long until nVidia have 105.6+ GB/sec (peak VRAM) cards on the market, if I were ATI I'd be planning something big for March/April '07.
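For anyone who wants to check those figures, here's a minimal sketch of the peak-bandwidth arithmetic (all bus widths and clocks are this thread's speculation, not confirmed specs; the 384-bit @ 2.2 GHz combination is just one assumed way to reach the 105.6 figure):

```python
# Peak VRAM bandwidth = (bus width in bytes) x (effective data rate in GT/s).
# Every width/clock pair below is speculation from this thread, not a
# confirmed R600 or GeForce 8800-refresh specification.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_ghz: float) -> float:
    """Peak memory bandwidth in GB/s for a given bus width and effective data rate."""
    return (bus_width_bits / 8) * data_rate_ghz

print(peak_bandwidth_gb_s(256, 2.4))  # 76.8  (256-bit GDDR4 @ 2.4 GHz effective)
print(peak_bandwidth_gb_s(512, 2.4))  # 153.6 (the 512-bit long shot)
print(peak_bandwidth_gb_s(384, 2.2))  # 105.6 (assumed 384-bit @ 2.2 GHz)
```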

Either way, ATI will beat the GeForce 8800 GTX by a good margin just 3 months after Christmas, or nVidia will with a refresh part.
December 9, 2006 8:07:27 PM

What scares me the most is the speculated power requirement for the thing (and the gods only know at what cost).
December 9, 2006 8:15:12 PM

AU$1,250 (approx.) initially, but it'll drop by 12% in quick order.

The PSU I'd be recommending for overclockers would be an 880 watt or greater unit (see the Power Supplies thread).

Performance will likely be about +24% higher than the GeForce 8800 GTX (the pre-Christmas one, as it'll be known in late Q1 2007).
December 9, 2006 8:53:35 PM

Quote:
The memory generation doesn't mean it's going to be faster; the X1950XTX uses GDDR4 and the 8800GTX is usually around twice as fast as it.


GDDR4 does clock faster and is more energy efficient (always good in this day and age; a 1950 equipped with GDDR4 performs some 30% faster than one with GDDR3, or so I've read). I think what you're referring to is the GPU core, and the 8800 will be faster even with GDDR3, simply because it is a brand-new core architecture that does things a little differently, while the 1950 core is a revision of an architecture that has been out for a while now.

I understand perfectly the benefits of GDDR4 and the performance improvements over GDDR3; however, the previous poster's reasoning for the R600 being faster was "GDDR4 support", when faster memory will not likely be the cause of the R600's superior performance.
December 9, 2006 10:38:41 PM

It'll be a combination of faster VRAM (be it GDDR3 at up to 1.8 GHz or, far more likely, GDDR4 at 2.2 to 2.4 GHz) and the new GPU.

The new GPU will have a much higher transistor count (and performance) than the existing X1950 XT.

Typically the aim is to release a flagship product with balanced GPU core performance and enough VRAM throughput to really feed it.

Just one or the other is rather pointless.

As we've already seen, the X1950 XT scaled quite well when going from GDDR3 to GDDR4 at just 2.0 GHz, did it not? ATI own many of the patents that went into GDDR4; I don't even know for sure if nVidia can cross-licence it. (Or if they can, it'll be through a GDDR4 manufacturer that needs to pass costs onto nVidia, which will put ATI in a better state price/performance-wise anyway, and they need to regain heaps of market share after AMD bought them out.)
In fact, effectively, AMD now own all the GDDR4 patents; that is not too shabby.
I'd be expecting about +76% more performance (if the chipset/CPU doesn't bottleneck) at high resolution with 4x FSAA from the R600 vs. the Radeon X1900 XT.

Of that (compounding) performance gain, +33% will come from the GDDR4; the majority of the boost will be the new GPU architecture. This has been the case historically once memory got fast enough (eg: the GeForce 2 MX-400 series vs. the Riva TNT2 Ultra), and has been ever since.
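Working that "compounding" arithmetic through as a quick sketch (both percentages are this thread's speculation, not benchmarks): speedups multiply rather than add, so the implied contribution of the new GPU core is 1.76 / 1.33 ≈ 1.32, i.e. about +32%.

```python
# Decomposing the compounding performance claim above.
# Both inputs are speculative figures from this thread, not measurements.
total_speedup  = 1.76  # claimed overall R600 vs X1900 XT gain (+76%)
memory_speedup = 1.33  # claimed portion attributable to GDDR4 alone (+33%)

# Speedups compound multiplicatively, so the remainder divides out:
gpu_speedup = total_speedup / memory_speedup
print(f"Implied gain from the new GPU core: +{gpu_speedup - 1:.0%}")  # ~+32%
```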

Frankly, I look forward to the card, and hope it has low power consumption at the desktop / while mostly idle, and is quiet (fan-noise-wise) under high load.

Any 'extra' VRAM throughput they get will be due to a better memory controller on the GPU itself and/or a wider-than-256-bit interconnect to the GDDR4. It'll have at least 76.8 GB/sec peak VRAM bandwidth anyway, but hopefully double that (153.6 GB/sec peak) using a 512-bit interface. (Which is a pipedream that only reinforces the fact that most of the gain will come from within the GPU itself, not without - unless you count the power source, that is - :lol: )
December 9, 2006 10:41:27 PM

Quote:
"the R600 combines the X2 technology we employ in our AMD processor line chips and effectively creates two x1950xtx in one chip. We plan on having dedicated ram busses supporting DDR4 at a maximum amount of 2GB for the enthusiast market, and 1GB standard. The watt consumption is 615watts per card and to prevent any necessary upgrades, they have an external power brick. More news to follow."


(If this is true) Maybe ATI merging with AMD isn't such a bad thing after all. Trading resources and technology may just give us a hell of a DX10 graphics card. And a dedicated external power brick for the GPU would be a real PSU saver.
December 11, 2006 1:24:37 PM

I want the GeForce 2 MX400 to come out in an anniversary edition. It would be great if both companies released older cards in special silver boxes for only $100.

As to the title of this thread and the OP... YES, ALL CARDS GET OUTDATED SOON ENOUGH. Spending 600 bucks, you would think it would last, but it doesn't. Get over it... nothing new.

I paid $850 for a Pentium III 733 MHz processor. Is it outdated? Yes. Worth $850? No. You paid $600 for the 8800. Is it outdated? Not yet. Worth $600? No.
December 11, 2006 5:54:16 PM

I'm not stupid enough to get a G80.
December 11, 2006 7:02:12 PM

Because it's not good enough when it comes to DX10.
December 11, 2006 7:05:12 PM

Hmm... way to pay 600-700 bucks to play games at 1280x1024.

Here's a task for you: list all the DX10 games that your "god-like" 8800GTX can play.

Please, in the future, don't waste my time by posting screenshots of nothing.
December 11, 2006 7:28:05 PM

Did you just reply 5 times? Wow, congrats...


YOU WIN THE TOTALLY USELESS POSTER AWARD :trophy: :trophy:


Btw, I hope it doesn't take 5 replies to get back to me.

Actually, my GeForce 6150 can play Oblivion at those settings, and it's an integrated chip. You just paid $600 for something that does exactly what mine can... albeit at 3 fps, but who cares, right? It's not DX10.
December 11, 2006 7:29:12 PM

Speaking of morons: you just posted 5 separate times (repeating a post), when you could have just posted once.

How much did you pay for your 8800GTX? How much did I pay for my X1900XT? What can your card do that mine cannot?

Exactly.

You're an inane fool who blatantly thinks you are far superior when in fact you are a worthless wazz.

LMAO. How about you eat your own words, buddy.

Back on topic, though: 1280x1024 is pathetic, especially when you buy a $700 video card. You should have used that money to buy a better monitor, you mongoloid.

It's not that I cannot see your pictures, you idiot; it's that they're worthless, inane pictures of nothing.
December 11, 2006 7:32:07 PM

The fact that the pictures are "worthless, inane pictures of nothing" is a problem with your computer, you stupid moron.
December 11, 2006 7:34:37 PM

No. They are pictures of rocks and grass.

Nothing redeeming about them, especially when they are in 800x640.

Enough said.
December 11, 2006 7:38:35 PM

Quote:
I know this article is from the Inq, but it still makes a valid point.

Article

Does this mean that people who bought the G80 now got pretty ripped off?

So far there are no DX10 games out. However, by the time the G84 is out, there will be.

Just creating some more controversy.


And btw, don't trust too much of what the Inq says :wink:
December 11, 2006 7:38:45 PM

:lol: Like I said, and he said... the content is based on your computer, not what he actually posted.
December 11, 2006 7:42:10 PM

Here are my thoughts, not that anyone will really care. I bought the 8800 GTS knowing that DirectX 10 games were not available yet. The reason I did is that I sold my X1900XT for a decent amount, and I only had to pay around $150 to upgrade to the GTS. I hope the R600 is going to be a good card, because if it isn't, then there is no competition.
December 11, 2006 7:42:18 PM

I'm glad we have gotten off topic.

He does have a better graphics card than me, though he is not using it to its full potential.

Also, remind me how many of the original DX9 cards are still being used today...
December 11, 2006 8:03:33 PM

Quote:
What can your card do that mine cannot?


1) Run games at maximum settings at over twice the framerate of the 1900XT

2) HDR + AA on every title that supports it

3) 'Quantum Effects' on-card GPU physics

4) 16xAA

5) Coverage AA

6) DX10 geometry shader capability


How's that for starters, chump?
December 11, 2006 8:05:12 PM

What was the first card to have DX9.0? The FX 5700? *shudder*
December 11, 2006 8:08:21 PM

Quote:
What can your card do that mine cannot?


1) Run games at maximum settings at over twice the framerate of the 1900XT

2) HDR + AA on every title that supports it

3) 'Quantum Effects' on-card GPU physics

4) 16xAA

5) Coverage AA

6) DX10 geometry shader capability


How's that for starters, chump?

Actually,
All games are capped at 30 fps, and most if not all modern cards can achieve that.

2) Only one card cannot support AA and HDR, and that's the 9800 Pro.
3) So? We don't need physics.
4) 16xAA has been around for 3 years...
5) So? Nothing new, just a new name.
6) The DX10 geometry shader is the same as the DX9 geometry shader.

Face it... you got had.
December 11, 2006 8:35:13 PM

so much anger in this thread

December 11, 2006 9:00:06 PM

Quote:

Actually,
All games are capped at 30 fps, and most if not all modern cards can achieve that.


Oh, you precious little thing. No, they're not; your graphics card is probably just not strong enough to push more than 30 FPS. Take a look at RobsX2's screenshots above, sporting one of the most graphically demanding games to date, Oblivion, and you'll see that he's running anywhere from 50-70 FPS in some intense places.

Quote:
2) Only one card cannot support AA and HDR, and that's the 9800 Pro.


Wrong again. HDR didn't even come along until the GeForce 6800 series of cards. The Radeon X850 XTs, for example, couldn't support HDR, and they came out over a year after the 9800 PRO did.

On top of that, the GeForce 6800 couldn't support both at one time, neither could the 7900GTX, and neither could the 1900XT without a hacked patch/driver, and even then it didn't work on every game, just some.

Quote:
3) So? We don't need physics.


*snicker*

Quote:
4) 16xAA has been around for 3 years...


Not on single cards, it hasn't.

Quote:
5) So? Nothing new, just a new name.


Do some more research and educate yourself; you just sound simple-minded.

Quote:
6) The DX10 geometry shader is the same as the DX9 geometry shader.


DX9 doesn't have a geometry shader. Again, do your research.

Quote:
Face it... you got had.


Face it. You don't know what the @#^$ you're talking about.
December 11, 2006 9:08:28 PM

I appreciate the time you took to reply to me. Unfortunately, you don't seem to notice the obvious banter... or you think your knowledge is well earned... well, it's not; it's very common knowledge, especially for most people in the forums discussing the G80.


So let me repeat myself:

ALL games, and I mean every last one, are capped in FPS (usually 30).
Only the 9800 Pro didn't support AA and HDR. ALL OTHER CARDS DO.
Physics are not needed in games today... graphics do all the work.
16xAA was invented 3 years ago.
Coverage AA is exactly the same thing, just with a new name.
And the DX10 geometry shader has been around for a few years, starting with DX9.0.

If that wasn't clear enough for you... I suggest you take a chill pill and whack your elbow a few times.
December 11, 2006 9:43:23 PM

After reading most of the last two pages, I am tempted to say, "All right children. Calm down. Arguing facts and opinions is - or can be - productive. Personal attacks are not."

Oh, wait. I did just say that.

I have on order from Newegg an eVGA 680i motherboard, an E6600, 2 GB of Crucial DDR RAM (uses Micron RAM chips), and an eVGA 8800GTS video card. I found an Antec Trio 650W PSU on sale.

I thought a lot about the video card. Do I buy an 8800-series card (GTS - $450), knowing that in six months the 8800 series will be superseded by something from ATI and later releases from nVidia? Do I buy a mid-to-upper-level DX9 card ($250-$300) and upgrade in a year or so? Do I buy a low-to-mid-level DX9 card ($100-$150) and upgrade in six months?

My problem is that I am home on vacation from working in Saudi Arabia, where upper-tier parts are both scarce and expensive. I opted for an 8800GTS. I will put the parts into a case, test them, then pack them in my suitcase when I leave.

I am certain that most of the people here buying 8800's went through similar thought processes.

john
1st computer - 1978, TRS-80, 1.77 MHz Z80, 16 KB RAM, 12 KB ROM
December 11, 2006 9:46:58 PM

Considering the Saudis are one of the richest oil states, why would it be hard to get good computer parts???

Joking, of course...
What do you do there?