Is Charlie right?

December 3, 2008 5:13:31 PM

Quote:
NVIDIA IS SET to trickle out the latest batch of 55nm parts. Expreview has some pictures and tidbits about the latest 55nm GT200/GT200b here, and some GX2 info here.

It looks like the on-again, off-again GT200GX2 is on again, and it is called the GTX295. Yay. The 55nm parts, internally code-named GT206, are finally trickling out like we said they would, with no speed increases and no power gains. What should have been a simple optical shrink is turning into a totally botched job, with the 'real' 55nm parts unlikely to come out until late January at the earliest, following yet another spin.

Given the lack of gains with the B2 stepping, the GX2/GTX295 still seems unmakable in volume, but such trifling concerns have never stopped Nvidia in the past. We hear they are going to launch it even though they can't make it, along with shipping the requisite 19 parts to Newegg so they can claim it is on sale. Real volume won't happen until (if?) they can fix the power problems.

We hear that the 'launch' is likely going to happen at a shindig on the 12th of December so they can claim the win they promised before the end of the year. One has to wonder if cherry picking parts in an attempt to use tame press to snow the public is the definition of 'Whoop-ass'? I am sure they will claim a stunning victory in any case.

One way you can tell how screwed up the chip is: the use of a heat spreader and a stiffener (the metal ring around the chip). If you have a big die, you need mechanical support for it, or it can crack or break bumps. A stiffening ring is usually the cheapest and most efficient way to go, but in many cases a heat spreader will do the same job.

The problem with a heat spreader is that it introduces two additional thermal barriers, the paste under the lid and the lid itself, to the cooling of the silicon. Each one makes cooling incrementally less efficient, not to mention material and assembly costs. You don't do this unless you have to.

If you are wondering why every modern CPU out there has one, the answer is simple: so ham-handed monkeys like most DIY people don't crack the die when they clamp the heatsink on. Think AMD K8 here. CPU makers think the cost of a spreader, and the reduction in performance it brings, is worth the protection it gives.

GPUs, however, come assembled. Factory robots don't break chips, so the mechanical protection is not an issue, but the costs remain. So, why did Nvidia do it on the GT200? They can't control hot spots. The lid is a heat spreader, and it helps keep chips with poor hot spot control alive and working.

When you see a heat spreader on a part that comes assembled, it is a pretty sure sign something is wrong thermally; it simply is not worth the cost and performance drop otherwise. Make no mistake, the spreader and stiffener combo on the GT200b is a bad, bad sign.

Why is the GT200b such a clustered filesystem check? We heard the reason, and it took us a long time to actually believe it: they used the wrong DFM (Design For Manufacturing) tools for making the chip. DFM tools are basically a set of rules from a fab that tell you how to make things on a given process.

These rules can be specific to a single process node, say TSMC 55nm, or they can cover a bunch of them. In this case, the rules basically said what you can or cannot do at 65nm in order to have a clean optical shrink to 55nm, and given the upcoming GT216, likely to 40nm as well. If you follow them, going from 65nm to 55nm is as simple as flipping a switch.

Nvidia is going to be about six months late with flipping that switch; after three jiggles (GT200-B0, -B1 and -B2), it still isn't turning on the requested light, but given the impending 55nm 'launch', it is now at least making sparking sounds.

The real question is, with all the constraints and checks in place, how the heck did Nvidia do such a boneheaded thing? Sources told us that the answer is quite simple: arrogance. Nvidia 'knew better', and no one is going to tell them differently. It seems incredible unless you know Nvidia; then it makes a lot of sense.

If it is indeed true, they will be chasing GT200 shrink bugs long after the supposed release of the 40nm/GT216. In fact, I doubt they will get it right without a full relayout, something that will not likely happen without severely impacting future product schedules. If you are thinking that this is a mess, you have the right idea.

The funniest part is what is happening to the derivative parts. Normally you get a high-end device, and shortly after, a mid-range variant comes out that is half of the previous part, and then a low-end SKU that is 1/4 of the big boy. Anyone notice that there are all of zero GT200 spinoffs on the roadmap? The mess has now officially bled over into the humor column.



Looks like Nvidia is having problems again.
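To put rough numbers on the thermal-barrier point in the quoted article, here is a minimal back-of-the-envelope sketch. The power figure and every thermal-resistance value in it are assumed, illustrative numbers rather than measured GT200 data; it only shows how extra interface layers (TIM under the lid, plus the lid itself) raise die temperature for the same cooler.

```python
# Illustrative only: all resistance and power values below are assumptions,
# not Nvidia specs. Adding an integrated heat spreader (IHS) inserts two
# extra thermal resistances between the die and the heatsink.

def junction_temp(power_w, ambient_c, resistances_c_per_w):
    """Steady-state die temperature for thermal resistances in series."""
    return ambient_c + power_w * sum(resistances_c_per_w)

power = 150.0    # assumed GPU power dissipation in watts
ambient = 35.0   # assumed in-case air temperature in C

# Bare-die package: TIM between die and heatsink, then the heatsink itself.
bare_die = [0.05, 0.15]                  # assumed C/W values
# Lidded package: extra TIM under the lid plus the lid, then TIM, then heatsink.
with_spreader = [0.05, 0.02] + bare_die  # two extra assumed barriers

print(junction_temp(power, ambient, bare_die))       # ~65 C
print(junction_temp(power, ambient, with_spreader))  # ~75 C with the same cooler
```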


December 3, 2008 5:20:21 PM

yes i did, get a life
December 3, 2008 5:28:31 PM

The very fact that nVidia hasn't released its 55nm G200s shows he's at least partially right. I remember rumors of both 65 and 55nm being taped out before the G200 release. I'm thinking that the G200's efficiency, or lack of it, in transistors relative to die size has caused some of this. It took ATI three shrinks to get theirs right, going from the 2xxx series to the 4xxx: the 2xxx series leaked badly, the 3xxx was better but brought no performance boost, and it took the 4xxx series to actually open things up.

Reading Anand's tribute to the engineers on the 4xxx team and the 4870 et al. in his article, he says we won't see midrange and lower cards using the G200 design until 2010, so maybe there's a lot more to it than what's being accepted. Shooting for the top spot with such a large die prohibits going to the lower end in a cost-vs-performance scenario when you're faced with a competitor that has already gone small and aimed its top chip not at the halo end but at the performance end. That leaves a huge gap, which nVidia was hoping the G9x series would fill, as the G200 just couldn't be competitive at the smaller, more cutthroat price points we have at the lower end of cards.
December 3, 2008 6:47:55 PM

Quote:
You don't see a problem with what you did?

It's not like I was calling it my own, the title does say Charlie, but yes, I forgot; being human, we do sometimes forget.
December 3, 2008 7:33:30 PM

There is no reason ATM for Nvidia to rush everything and make a mistake. They have the top single-GPU solution out, and seeing as their approach to dual GPUs is always as easy as a sandwich, we should see something from them soon.
As it stands for me, I'm never going quad-SLI again, just too many problems with it. Even with tri-SLI I'm a little jumpy. I'd say regular SLI, single GPU, or nothing in my case from now on... but we'll see what the drivers will have to say.
December 3, 2008 7:51:02 PM

@ rangers,
How dare you just copy and paste like that; without a link we are all denied a visit to the Inq and are robbed of a chance to bask in its myriad advertising :lol: 
No, seriously, you used a hell of a lot of space for something a simple link would have done. I know forums where they scream blue murder if you use a whole link and not a TinyURL.

Mactronix :) 
December 3, 2008 8:00:48 PM

In many ways I do feel he's right. As mentioned, there have been ZERO lower-end products based on the new design. They haven't managed to trickle down to lower price points at all. He and I can guess all day long as to why that is; it might be a problem or not. AMD has done a fantastic job moving the 4 series out to EVERY price point. They had to use CF in order to hit the very high end, but it's there and it can compete.

I'm very interested in the GX2 part. Because the GPU die is so large, it's been speculated that they will have to do another dual-PCB card, unlike the X2. This means increased costs due to the two PCBs. You also have to wonder a bit about power. The 4870X2 only needs a single 8-pin and a single 6-pin to work. Will the same be true of the GTX295? Or will each board need both, requiring a monster of a PSU? Last, the chunk of copper/other metal must cost more than a shiny penny as well.

In many ways I see little wrong with what he wrote. Nvidia misstepped with these cards. The question now is, what do they do?
December 3, 2008 8:03:59 PM

^ True.
December 3, 2008 9:28:07 PM

Looks like AMD is getting lucky this year with the introduction of the 4xxx cards and Nvidia's poor-performing 9xxx cards, plus the GTX 200s. Nvidia does have the best single-GPU solution, but with a few weaknesses (price/performance, power, and heat), which makes AMD's 4870 look the best overall.

I have little faith in Nvidia and its GTX295, or any next-gen GTX 300 cards. Nvidia had its 15 minutes of fame, but it's now ATI's turn for the spotlight. Hopefully Nvidia doesn't fall too far behind and can still compete with ATI.


Quote:
yes i did, get a life

This is really childish; you should have put a link, but you could have just said "sorry, here is the link" instead of arguing with strangestranger. You both should get a life.
December 3, 2008 9:55:58 PM

Can't see Nvidia standing still; they will pull every underhanded stroke in the book to regain the top spot. Either way, we're in for a pretty exciting six months.
December 3, 2008 10:30:49 PM

The way you speak of it, it's like AMD is an angel and Nvidia is a devil. I mean, both companies pull their own schemes, and both have made their own mistakes :) 

Either way, I think that as consumers we have only to gain.
December 3, 2008 10:46:12 PM

I feel sorry for anyone who holds Charlie in any relevance whatsoever. The guy has a deep hatred for Nvidia and loves to spread FUD. He is like a fanboi evolved into a tabloid writer.
December 3, 2008 10:50:47 PM

These are all previews, even if they are real. I mean, the 4870 was supposed to double the 9800 GX2 according to the previews, and the 9800 GX2 was supposed to more than double the 8800 GTX consistently, yet we saw the GX2 get crippled when resolutions were increased, while the 8800 GTX managed to survive because of the extra RAM.

We'll see where this goes. Be happy either way; all of the other cards will go down in price :D 
December 3, 2008 10:54:42 PM

Oh definitely. I do not desire any dual-GPU card at this point. I just find posts that circle back to Charlie about the same as posting an article from the Weekly World News or The National Enquirer: maybe a reference to something real, then smeared with crap.
December 4, 2008 1:40:16 AM

Well, Charlie can be a bit overzealous with his hatred of anything Nvidia, but he gets it right 9 times out of 10, and I find him quite insightful.
December 4, 2008 1:42:48 AM

This is the first I've heard of him, so I can't comment; people who speak from the soul usually have tainted information. So I dunno.
December 4, 2008 1:49:12 AM

He does not base things on facts, mostly conjecture, but he does seem to hit the nail on the head, most of the time anyway.
December 4, 2008 1:56:21 AM

Yeah, he seemed a bit harsh. I mean, I agree that the card is a bit absurd, but if it is a total flop, why would Nvidia release it?

Also, I think he is a bit too cruel to Nvidia's design team. I mean, these are the same people who created the G80. They aren't idiots. And that Anandtech article about AMD gives some insight into this very difficult process.
December 4, 2008 2:49:59 AM

I'd say the biggest problem is that Nvidia thought they could go to a bigger die like the GTX 200's, and they hit a wall a size/generation before they thought they would. In turn they have incredible power requirements, high heat, and low yields. Just seems like they really need to get something new out the door and try to cut the GTX 200 out of their history (maybe the 9k series, too).

I just hope their next design (not shrink) is radically different, even if it ends up slower and in need of tweaking (Thinking HD 2k here).
December 4, 2008 2:56:54 AM

You know what I still don't understand: it's the strongest single GPU, it almost doubles the 8800 GTX in some cases and beats out the GX2. I'm still at a loss when it comes to these heat levels I keep hearing about. I leave the fan at 33% speed, and the cards idle at 55 and go to 65 on heavy load.

So all this heat talk; not that I'm denying it, I just don't see it. Everyone is calling the G200 series a flop, yet we have a card that can keep up with the 4870 X2 with a single GPU...

I don't understand what people's expectations are. I mean, the 4870 X2 is just two 4870s in Crossfire... yet it results in up to a 30% gain compared to one 280 GTX.


Can someone explain?

I'm not defending either company, I just don't like it when words are just thrown out there. I mean, not even a couple of weeks ago (when I was consistently on the THG forums) everyone was bitching and complaining about the heat levels in the 4000 series. What's going on now? Nvidia launches the GX2, and all these heat problems spring up out of nowhere :S

Someone enlighten me.

280 GTX OC'd thermals: 50-55

http://www.guru3d.com/article/bfg-geforce-gtx-280-ocx-r...

4870 1 GB (after the fan fix): 72-85

http://www.guru3d.com/article/amd-ati-radeon-hd-4870-10...

These thermal points still affect the overall temps of the card.
December 4, 2008 3:17:49 AM

Yeah, people who whine about heat need to take a step back and look at what they are saying. Why? Because the truth is that, to date, the worst card when it comes to heat production has been the 4870X2. Now I own a 4870X2 so I can talk from experience. These cards dump out HUGE amounts of heat, and require constant fan variations to keep the heat on the actual core down.

Now, I love my X2; it kicks butt in games and I am a big fan of the underdogs (ATI/AMD). But the truth remains that this card heats up my room when it is IDLING.

The 9800GX2, on the other hand, which was my previous card and was touted as a "heat monster" and a "dual PCB failure" by the INQ, was a great card that actually ran VERY cool, overclocked well, and still holds the title in some games (think Crysis).

Simply put, ATI and Nvidia are not going to release dual-chip cards unless there is a place for them in the market. No matter what the INQ or anyone else says, I fully believe that the GX2 260 or whatever will be an attractive offering.

Heat is just a scapegoat for ulterior motives.
December 4, 2008 4:02:09 AM

L1qu1d said:
You know what I still don't understand: it's the strongest single GPU, it almost doubles the 8800 GTX in some cases and beats out the GX2. I'm still at a loss when it comes to these heat levels I keep hearing about. I leave the fan at 33% speed, and the cards idle at 55 and go to 65 on heavy load.

So all this heat talk; not that I'm denying it, I just don't see it. Everyone is calling the G200 series a flop, yet we have a card that can keep up with the 4870 X2 with a single GPU...

I don't understand what people's expectations are. I mean, the 4870 X2 is just two 4870s in Crossfire... yet it results in up to a 30% gain compared to one 280 GTX.


Can someone explain?

I'm not defending either company, I just don't like it when words are just thrown out there. I mean, not even a couple of weeks ago (when I was consistently on the THG forums) everyone was bitching and complaining about the heat levels in the 4000 series. What's going on now? Nvidia launches the GX2, and all these heat problems spring up out of nowhere :S

Someone enlighten me.

280 GTX OC'd thermals: 50-55

http://www.guru3d.com/article/bfg-geforce-gtx-280-ocx-r...

4870 1 GB (after the fan fix): 72-85

http://www.guru3d.com/article/amd-ati-radeon-hd-4870-10...

These thermal points still affect the overall temps of the card.


I believe that what people are referring to is that the GT200 generates more heat, regardless of what temperature it runs at. It is perfectly possible, after all, that Nvidia is installing a higher-quality cooler on its cards compared to the ones installed by ATI on the HD 4800 series. Remember that those power-hungry beasts need to dissipate all that thermal energy one way or another, and even if the current cooler is enough for a single GT200, I'd like to see what it will take for two of those sharing the same cooler.

To put it into perspective, it is my firm belief that with the current GT200 chip that we all know and love (sarcasm :p ) we would need something along the lines of a triple-slot cooler to keep the temps low enough for those alarmists scared of burning out their GPUs at 65 C idle.

Basically, what I'm trying to say is that while they might be installing decent enough heatsinks on their single-GPU cards, there's a physical limitation on how much they'll be able to cool with a dual-slot solution. That's just how it works.


EDIT@annisman: What people are referring to as a heat monster is not how hot the card runs, but how much heat it generates. If that doesn't make sense, then I'll try to explain a little better :D 
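A minimal sketch of that temperature-versus-heat distinction: the die temperature you read depends on the cooler, while the heat dumped into the room is simply the power the card draws. Both the power draws and the cooler thermal resistances below are assumed values, chosen so that two hypothetical cards read the same temperature while shedding very different amounts of heat.

```python
# Illustrative assumptions only; neither card below is a real product.

def gpu_temp(power_w, ambient_c, cooler_resistance_c_per_w):
    """Die temperature: ambient plus power times cooler thermal resistance."""
    return ambient_c + power_w * cooler_resistance_c_per_w

ambient = 30.0
card_a = {"power": 180.0, "cooler": 0.20}  # assumed hotter chip, beefier cooler
card_b = {"power": 120.0, "cooler": 0.30}  # assumed cooler chip, cheaper cooler

for name, card in (("A", card_a), ("B", card_b)):
    temp = gpu_temp(card["power"], ambient, card["cooler"])
    # Every watt drawn ends up as heat in the room, whatever the die temp reads.
    print(f"card {name}: {temp:.0f} C on the die, {card['power']:.0f} W into the room")
```

Both hypothetical cards read 66 C, but one is putting half again as much heat into the case and the room, which is the point being argued.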
December 4, 2008 4:03:24 AM

Thought competition brings evolutionary products? Anyone remember how successful the 9800GX2 was? How about the 7950GX2? Cough-sarcasm-cough.
December 4, 2008 4:15:40 AM

The 9800 GX2 was the most demanded card, and still is. Companies have that card back-ordered. It's a late bloomer. The 7950 GX2 was Nvidia's first dual card, when SLI was still something new to the public. Now, if games don't support SLI/Crossfire, it's a surprise.

The 7950 GX2 shouldn't be compared. That was the beginning of SLI and of how the market shifted because of it (of course it's NOT the literal first, but it's when this became mainstream like it is today, where every game supports it).

For my 280 GTX, BFG told me that 100 degrees should be the max the card could take, and if it stays around 90 degrees altogether it's fine, but not recommended. So this whole thing about the heat, I personally think it's BS. No triple fan, no special cooling. Stock cooling.

So please, no more of this heat nonsense; it's just not a factor this card falls under. My 280 GTXs make a three-way sandwich, and barely any air circulates in and out of my case because of the water cooling.

December 4, 2008 4:33:21 AM

You either have very selective reading or you're just not getting it. These GPUs are HUGE power hogs, and all that excess electrical energy is turned into thermal energy, which in turn needs to be effectively dissipated for the card not to overheat. Nvidia is probably installing some very good reference cooling to dissipate all the thermal energy and keep the heat in check, but there are physical limitations on how much heat (thermal energy) you can dissipate with a dual-slot cooler.

It was possible to cool the G92 even with a single-slot heatsink (albeit not effectively), so it was perfectly feasible to try for a dual-slot, dual-PCB card with a thermal envelope similar to two G92 cards. However, GT200s require dual-slot cooling to work well within specifications because their thermal envelope is just so much higher than any G92 chip, so a triple-slot cooler would be the logical step for a dual-PCB card to work without extreme heat issues.

You're basically thinking of the end result (GPU temperature) and you're not thinking about what it takes to keep it at those levels compared to previous solutions; all that power drawn from your PSU has to go somewhere.

On another note, I'd like to see how many power plugs this thing would need; I don't think at least two 8-pin PCI-E and one 6-pin PCI-E would be too far-fetched, even with a 55nm shrink.
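For reference, a quick sketch of the PCI Express power budget behind that connector guess. The per-source limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin) are the usual spec figures; the connector combinations are just the ones being discussed in this thread, not anything Nvidia has announced.

```python
# Standard PCIe power limits per source (spec values).
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCI-E connector
EIGHT_PIN_W = 150  # 8-pin PCI-E connector

def board_budget(six_pins, eight_pins):
    """Maximum in-spec board power for a given set of external connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(board_budget(1, 1))  # 300 W: the GTX 280 / 4870 X2 style connector layout
print(board_budget(1, 2))  # 450 W: the two 8-pin + one 6-pin guess made above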
December 4, 2008 4:48:25 AM

Let's see, around 750 watts for a whole system with triple-SLI 280 GTXs. Don't think those power levels are too out there :) 

http://www.guru3d.com/article/geforce-gtx-280-sli-tripl...

And that's peak :) 

So really not that much wattage is being pumped into the 280 GTX.


PC 100% usage (wattage gaming peak) = 373 watt load with the 280 GTX

PC 100% usage (wattage gaming peak) = 317 watt load with the 1 GB 4870

That's a 60-watt difference in electricity being pumped into the card. So really :p 
December 4, 2008 5:14:04 AM

Yeah, I think 3x 280 GTX is actually less demanding than 2x 4870X2; I might be wrong, but I'm pretty sure those require at least 900 watts.
December 4, 2008 5:37:38 AM

One thing no one who owns, or has owned, an nVidia card has mentioned is that when comparing an nVidia card to an ATI card, the fact that there's more heat coming out the back of your rig is a good thing.

Don't forget, nVidia's solution is to dump some of the heat back into the case, unlike ATI. Add in the heat out the back and it's similar, as the power requirements are similar, as seen in the 4870 vs. G260 power usage.

So, in the end, I'd prefer the heat on the outside of my case, whether at idle or under load, instead of some being left in my case.
December 4, 2008 9:46:03 AM

Can't imagine a GTX 260 X2 pumping your case full of heated air. If that's the case, then aftermarket liquid cooling is the only option.
December 4, 2008 9:46:09 AM

Can't imagine a gtx 260x2 pumping your case full of heated air. If thats the case then aftermarket liquid cooling is the only option.
December 4, 2008 1:01:19 PM

JAYDEEJOHN said:
One thing no one who owns, or has owned, an nVidia card has mentioned is that when comparing an nVidia card to an ATI card, the fact that there's more heat coming out the back of your rig is a good thing.

Don't forget, nVidia's solution is to dump some of the heat back into the case, unlike ATI. Add in the heat out the back and it's similar, as the power requirements are similar, as seen in the 4870 vs. G260 power usage.

So, in the end, I'd prefer the heat on the outside of my case, whether at idle or under load, instead of some being left in my case.


The method of cooling isn't an excuse, considering it can be changed; I mean, we have dual-fan 4870 X2s and regular turbine-fan 4870 X2s.

Cooling alone is a stupid reason to say Nvidia just pumps it into the case; I mean, that's not that hard to change. I can tell you this, though: when I put my hand on the 280 GTX, it pumps the heat outside the case, and very little inside the case. The 9800 GX2 I had pumped a lot more inside the case.

If you look at the X2 and the 280 GTX, they both run a turbine with air being pumped out the back. So I don't know about this heat in the case. Most of the argument I saw was about the 9800 GX2, which pumped out the top, and not really the sides.
December 5, 2008 5:19:01 AM

L1qu1d said:
The 9800 GX2 was the most demanded card, and still is. Companies have that card back-ordered. It's a late bloomer.


But it's as expensive to make as the GTX series, so it's pointless to keep making them at a loss. It's like saying that in order to help your losses at selling Ferraris at $30,000 apiece, you decided to sell Lambos for $20,000 apiece, while the competition has excess demand for their VWs and is selling them for $5,000 over value.

Quote:
The 7950 GX2 was Nvidia's first dual card,


No, the Gigabyte GV 3D1 66GT, Asus EN6800GT Dual, and GV 68GT were the first, and then came the humongous GF7900GX2.


Quote:
For my 280 GTX, BFG told me that 100 degrees should be the max the card could take, and if it stays around 90 degrees altogether it's fine, but not recommended. So this whole thing about the heat, I personally think it's BS. No triple fan, no special cooling. Stock cooling.

So please, no more of this heat nonsense; it's just not a factor this card falls under. My 280 GTXs make a three-way sandwich, and barely any air circulates in and out of my case because of the water cooling.


The GTX280 consumes more power; in ICs, power is turned into heat, so the GTX280 produces more heat than any other chip. That it runs cool on its card is a result of the HSF assembly, and that's mainly due to a lot more copper than the others.

However, put that on a single card or a dual card and now you have to keep the cooling effective enough; that's the problem. Put a huge HSF on an R600 and you can make its temp lower than a passively cooled HD4650's, but that doesn't mean the R600 doesn't have a heat problem that needs to be considered when designing the solutions the chips go into.

Quote:

PC 100% usage (wattage gaming peak) = 373 watt load with the 280 GTX

PC 100% usage (wattage gaming peak) = 317 watt load with the 1 GB 4870

That's a 60-watt difference in electricity being pumped into the card. So really :p 


The thing to remember is that that 60 watts is primarily the GPU, out of a system where the whole thing is drawing 373 W; that's a large amount of power. Say that of the ~320 watts in the 4870 system, half of it is the graphics card; then that's almost 40% more heat for the GTX280, so yeah, that's a big difference.
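The same estimate as quick arithmetic: the two system totals are the Guru3D figures quoted earlier in the thread, and the split between the card and the rest of the system is the rough halfway assumption made above, not a measured number.

```python
# System-load figures are from the Guru3D numbers quoted earlier in the thread;
# the rest-of-system share is an assumption (roughly half of the 4870 system).

gtx280_system_w = 373.0   # peak gaming load, whole system with a GTX 280
hd4870_system_w = 317.0   # peak gaming load, same system with a 1 GB HD 4870

rest_of_system_w = 160.0  # assumed CPU/board/drives share, identical in both runs

gtx280_card_w = gtx280_system_w - rest_of_system_w   # ~213 W
hd4870_card_w = hd4870_system_w - rest_of_system_w   # ~157 W

extra = (gtx280_card_w - hd4870_card_w) / hd4870_card_w
print(f"GTX 280 card draws ~{extra:.0%} more power, and sheds ~{extra:.0%} more heat")
```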
December 5, 2008 5:35:02 AM

BTW, Charlie definitely has the knives out for nVidia, but the thing is that even if he's guessing maliciously, he's been getting it right more often than not so far, including the extent of the GPU bonding issue and the delay of the 55nm part, which was expected in August, Sept, Oct, Nov, and Dec by most everyone else (Fuad was saying a fall release, which made sense to those of us who felt nV needed something ASAFP).

I wouldn't invest or bet money based on what Charlie says, but I also wouldn't bet against him right now, especially on something like this, where even a successful 55nm transition would still leave them far from the same production efficiency as the HD4K series (resulting in about 65% as many chips if it were a good transition; see the rough die-count sketch at the end of this post). The delay suggests it's far from an easy transition, so I suspect we're still seeing a 2:1 ratio even at 55nm, especially as the HD4K process matures.

PS: In the future, Ranger, just link to the articles; it's not only good policy, it's actually also part of the forum rules.
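A rough sketch of that chips-per-wafer comparison, using the standard gross-die approximation. The die areas are approximate public estimates rather than official figures (GT200b at 55nm around 470 mm², RV770 around 256 mm²), so treat the resulting ratio as illustrative only.

```python
# Die sizes below are assumed approximations, not official numbers.
import math

def gross_dies_per_wafer(die_area_mm2, wafer_diameter_mm=300.0):
    """Standard gross-die approximation: wafer area over die area, minus edge loss."""
    r = wafer_diameter_mm / 2.0
    return (math.pi * r * r / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2.0 * die_area_mm2))

gt200b = gross_dies_per_wafer(470.0)  # ~120 candidate dies per wafer
rv770 = gross_dies_per_wafer(256.0)   # ~235 candidate dies per wafer

print(f"GT200b: ~{gt200b:.0f} dies/wafer, RV770: ~{rv770:.0f} dies/wafer "
      f"(ratio ~{gt200b / rv770:.2f}, before yield)")
```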
December 5, 2008 1:33:33 PM

Speaking of knives out for Nvidia, I'm about to bring a gun. Anyone see the article on [H]ard about Nvidia renaming the 9 series again? Seeing as they haven't brought any low-end parts out based on the new arch, they will instead rename the current 9 series cards with GTX names. They seriously need to stop trying to hoodwink us and just work on better cards.

I mention this because, one, I'm pissed that Nvidia is renaming their cards yet again. Second, I wonder if this is a sign of serious problems. Anand's article said that it takes about 9 months to spin a new arch down to the lower price points. While the GTX hasn't been out that long, I take it that they are having issues with the lower-end parts. I assume this because they plan to rename current cards to give the appearance that they have a complete portfolio?
December 5, 2008 1:41:09 PM

How come you're mad? You know about the scheme ahead of time. I don't get how this can anger anyone. Do the research and see what's going on. If idiots blindly choose a card, that's good for the company. Life is built on ignorance, and I don't think Nvidia is to blame; I think it's society, to be honest. I mean, no one is mad that two months (or three) after the 4870, the 4870 1 GB came out, pretty much kicking the 4870 512 out.

Honestly, if my company had a high selling point by renaming the model... why not? Bring in profit for nothing. It's much like the car companies... but no one says anything about that. Same car, maybe a slight redesign, almost the same features... yet no one complains :) 

This is quickly turning into an Nvidia hate forum rather than a discussion; I mean, we're supposed to be finding the problems in both companies, yet we have all fingers pointed at one. Just like when the 8 series came out, we had our fingers pointed at ATI.

Move on and look for what you need; don't get mad because they'll trick a couple of people.
December 5, 2008 1:48:29 PM

That's old news; we've known for about two months that nVidia would be renaming their 9 series to GT 100 to sell to nvidiots around the world. Bet the grass doesn't look so green for the green team right about now...

One thing I hadn't considered that he mentioned is that Nvidia has yet to release, or even announce on the roadmap, any midrange GT200-based products... Bet that whole "we do huge single dies as opposed to scalable small GPUs" strategy isn't looking so juicy now either...

EDIT: That's simple: back then Nvidia had the lead and wasn't so deceitful, while AMD was letting us down left and right. Now AMD has come through with its claims, while Nvidia has been screwing with us; some people are just not happy with its business practices from the past 18 months.
December 5, 2008 2:01:17 PM

The past 18 months was the era where Nvidia had the 8800 series out... ATI had nothing. Don't forget that ATI was ATI before it became AMD. Now it's not ATI, it's AMD. Just because it has ATI's name doesn't make it the same company.

Look at the 8800 GTX: it's now two years running, and that card can still do current-gen games MAXED OUT, even better than the 2000s, the 3000s, and the 4850 here and there. I mean, really, that's impressive. I haven't seen a card put up that much of a fight since the X1950 Pro, which did last, but didn't max out a lot of games.

Instead of seeing both ATI and Nvidia making these stupid X2 and GX2 cards, I think both companies could benefit from a single-GPU solution. I'll give you this much: at least ATI put in the effort of creating a unique card for the dual GPU instead of sandwiching two cards together like Nvidia.

But really, if it were my choice: one card, one GPU, none of this "does it scale" crap.
Both companies should concentrate less on creating cards and more on mastering the drivers that are out now, because honestly the 4000 series drivers are just pathetic at this point, and the tri-SLI drivers from Nvidia are the same.

Waste of money.
December 5, 2008 2:19:40 PM

The past 18 months started with the release of the 8600, when the disappointment began, because let's face it, between the HD 2600 and the 8600 it was a crapfest in the midrange. (You also failed to realize I didn't refer to ATI at any point in my previous post; I referred to AMD.)

And even though the single-GPU route is optimal to me as well, I'd rather have smaller, powerful-enough GPUs any day over insanely power-hungry, heat-generating GPUs.


December 5, 2008 2:24:27 PM

That's where you and I differ; trust me, the 8600s were nothing compared to the $500, power-hungry, bulky, slow, and boiling-hot 2900 XT. I mean, both companies had their faults, yet one comes out on top; I don't get it :) 

And no, 18 months ago is when prices started dropping, and it was a month before the 8800 GT release... the 8600s were already pretty cheap then.

Although they didn't do justice in our desktops, they did in our laptops :) 
December 5, 2008 2:30:41 PM

I don't sway to either company, as you can see from most of my posts, but in this thread someone needs to play it down the middle.

I don't care if you criticize the living hell out of Nvidia. Just don't throw out useless comments such as "marketing schemes." I mean, both companies did it. Remember the ATI GTO, XL, XT, Pro, GTS, AGP, PCI.

It's just a shame to see anger being thrown in one direction and not the other.
December 5, 2008 2:39:47 PM

Well, I'm not out to prove anything for either company, just what someone needs. Do I prefer Nvidia over ATI? At the moment, yes, because as an enthusiast I take performance over price/performance.

Seeing as 280 GTX tri-SLI is the strongest setup to date.

Do I favor Nvidia when giving out opinions? No, because to each his own: I'm not going to suggest a 4870 to a tri-SLI board owner, just like I won't suggest a 260 GTX to a Crossfire board owner.

So please don't give reasons as to why I say what I say; I mean, most of the people bad-mouthing Nvidia are ATI owners, just like the opposite goes for Nvidia owners.

So really, threads like these will always spark up. I personally don't hold a grudge against either company. I switched to Nvidia because of the nonsense 2900 XT; if that had held its own in the market, I would probably still be an ATI holder ATM.
December 5, 2008 2:41:13 PM

18 months ago was June 2007 (the 8600s were released in Q2 2007, can't remember exactly when), not October-November 2007; gotta get that clock of yours checked out. And the HD 2900 XT was $380-420, and I don't know if you remember, but I was one of the biggest HD 2900 bashers back then (and it was justified).

I really hope you've just been playing dumb with me since the beginning of this thread, because I first explained really well why the GT200 is an insane heat generator and you didn't seem to get it, but now it's just too much.

The 8800s were good, but that's where it ended. They released the 8600, then paper-launched the 8800GT to hurt AMD's sales of the HD 3800s (sneaky, but a valid strategy), then the 8800GTS G92 was released to start the confusion, shortly after the renaming began with the 9 series, and last but not least came the release of the GT200 products at ludicrous prices... Reason? The "just because I can" strategy, which doesn't sit too well with me or many of its customers. And don't forget the crapfest of horrible chipsets released by Nvidia since the 680i, which haven't improved much, if at all.

Even though it's valid economics to sell products overpriced when there's no competition, if you treat your customers like that, they'll buy from you once, but they might not come back. AMD/ATI could've released the HD 4000s at $300 and $450, but they decided for their own reasons (be it honesty, screwing Nvidia, or whatever you can think of) to release their cards at a great price, and that's where they started seducing me to their side.

EDIT: I lean towards ATI now (and I accept it), but it's because of the reasons above, not because I'm just a brainless tool that likes red over green. There's really no price/performance point where Nvidia can compete at the moment, so my justified bias is a non-issue for the time being anyway.
December 5, 2008 2:52:32 PM

8600 series: April 17th, 2007. Moving on.

No, I haven't been playing dumb; it's just that everyone is complaining about the scheme, yet you all know about it. None of you fell for it. What's there to complain about?

Will you buy the next renamed series? No.

Then? Long live idiocracy, right?

I haven't been playing dumb, I've been playing devil's advocate. There's a difference. I mean, these are very minimal reasons not to go with a company.

Price/performance? The 260 GTX was the first to reach $200. The 9800 GX2 was sub-$300 even a couple of months ago and performed better than any card out.

Really, price/performance was started by ATI but is leaning towards Nvidia ATM.

So really? There is no confusion. You don't seem confused, do you? What's the problem?
December 5, 2008 3:20:40 PM

The issue, Liquid, is that instead of coming up with new cards, they CONTINUE to rename old ones. Aside from the GTX, which consists of two cards, they are still using the same G90/G92 that they've been using for how long now? By the time real new cards show up, that figure will probably be at least 24 months.

Anger/hate is the wrong word for me, but it conveys the feeling. I want innovation, not new names to F over the dumb. BTW, before anyone brands me one way or another, the new card in my computer is an 8800GS/9600GSO.
December 5, 2008 4:03:36 PM

Well, if the cards still hold their own, might as well carry them further and get new public interest. It's only been like a year that they've used it :) 

They are still good cards, right? Renaming them is just another way of saving money, something ATI needs to learn to do. They already have a shareholder that owns 20% of AMD altogether.

The only reason I'm not that hot on ATI anymore is that it's not the same company. AT ALL. It's just AMD. I used to have an AMD XP + ATI, nothing else. They fell behind too much; I needed to move forward. Good for them. I'm not jumping to ATI as my main card until I see consistency in performance, not just in GPU making.

I mean, these ones are good, but what about the 5000s? What guarantees are there that they will be awesome? None. Same goes for Nvidia :) 

We'll have to wait and see.

I could say the same thing and use the 4870 X2 in my secondary rig as an excuse for you not to judge me :) 

The only reason I bought it was because I was on the fence about which was better: 4870 X2 quad or 280 GTX tri. I got my answer, so I went tri :p 

Everyone who posted here knows it's a rename, which is amazing for us. All the old cards will be marked DOWN sooooooo low. I mean, ever since the 9800 GT I've been seeing 8800 GTs going for like $90 in stores. Honestly, you guys should be happy.

Out with the old, in with the newer :) 
December 5, 2008 6:21:46 PM

L1qu1d said:
How come you're mad? You know about the scheme ahead of time. I don't get how this can anger anyone. Do the research and see what's going on. If idiots blindly choose a card, that's good for the company. Life is built on ignorance, and I don't think Nvidia is to blame; I think it's society, to be honest. I mean, no one is mad that two months (or three) after the 4870, the 4870 1 GB came out, pretty much kicking the 4870 512 out.

Honestly, if my company had a high selling point by renaming the model... why not? Bring in profit for nothing. It's much like the car companies... but no one says anything about that. Same car, maybe a slight redesign, almost the same features... yet no one complains :) 

This is quickly turning into an Nvidia hate forum rather than a discussion; I mean, we're supposed to be finding the problems in both companies, yet we have all fingers pointed at one. Just like when the 8 series came out, we had our fingers pointed at ATI.

Move on and look for what you need; don't get mad because they'll trick a couple of people.



I've got to disagree with you there. I, for one, don't like to see anyone getting taken advantage of, whether they're Nvidia idiots or not. A company should not have to sink that low, and what does it say about said company and the way they treat their buying public?
December 5, 2008 7:23:12 PM

In June, when I was claiming the 8800GTX was only a midrange card, I came up against a lot of opposition to this thought. So, since June, nVidia has made nothing but midrange "new" cards by only renaming them, some for the third time. Fool me once... but again and again?

Tell me, when people come to forums asking, "I just bought a G150 (or whatever's equivalent to a 98GTX), and I previously owned an 88GTS512, and the performance difference is nil; why?", how is this good for nVidia? And why would anyone defend that? If I defended this, I'd hope NOT to run into anyone who's been taken in by this deception, as they may not understand why I'm defending nVidia's naming scheme. This is just a guess, though maybe they would understand. Accepting this behavior from nVidia isn't acceptable, even if it's your preferred company. There's no "them," or those that buy lesser cards, and "I'm not affected, as long as I've got the new top card, and it's truly new." This paints nVidia in a bad light and slows progress overall, just like not implementing DX10.1, which could have been done on nVidia's new cards. Just understand, things like this cut to the quick, leaving only the most die-hard fans standing fully behind nVidia, and that's not a good thing.
December 5, 2008 7:44:24 PM

cal8949 said:
I have little faith in Nvidia and its GTX295, or any next-gen GTX 300 cards. Nvidia had its 15 minutes of fame, but it's now ATI's turn for the spotlight. Hopefully Nvidia doesn't fall too far behind and can still compete with ATI.


Fifteen minutes of fame? :na:  Try 18 months of pretty much uncontested dominance (Nov 2006 to May 2008). I don't think you have to worry about Nvidia falling behind or being unable to compete with ATI. They are still very competitive in many categories and at many price points as it is. This isn't close to the situation ATI was in around the time the HD 2000 series was released during the 8000 series' dominance (and ATI had to "merge" to survive). ATI has pretty much spent this release, and it will most likely be another year or two before they have any sort of meaningful refresh or new release. Nvidia will counter before then. Back and forth between the two - nothing new for the last 8 or 9 years.
December 5, 2008 8:04:30 PM

JAYDEEJOHN said:
In June, when I was claiming the 8800GTX was only a midrange card, I came up against a lot of opposition to this thought. So, since June, nVidia has made nothing but midrange "new" cards by only renaming them, some for the third time. Fool me once... but again and again?

Tell me, when people come to forums asking, "I just bought a G150 (or whatever's equivalent to a 98GTX), and I previously owned an 88GTS512, and the performance difference is nil; why?", how is this good for nVidia? And why would anyone defend that? If I defended this, I'd hope NOT to run into anyone who's been taken in by this deception, as they may not understand why I'm defending nVidia's naming scheme. This is just a guess, though maybe they would understand. Accepting this behavior from nVidia isn't acceptable, even if it's your preferred company. There's no "them," or those that buy lesser cards, and "I'm not affected, as long as I've got the new top card, and it's truly new." This paints nVidia in a bad light and slows progress overall, just like not implementing DX10.1, which could have been done on nVidia's new cards. Just understand, things like this cut to the quick, leaving only the most die-hard fans standing fully behind nVidia, and that's not a good thing.



With today's knowledge, would you personally buy a 9800 GTX??? No. Why? Because you researched. People need to learn to do that. That closes everything, because, well, it's true :) 

Don't forget it's not really "nothing"; I mean, the 9800 GTX can do tri-SLI, while the 8800 GTS can't. So really... it's up to the consumer to notice these things, ask questions, and then find out what's right.

Not rush headfirst into something that looks like a curtain and then ends up being a wall :) 

If you argue this, then I'm sorry, we're never going to be on the same page.