GTX 295

December 3, 2008 6:23:08 PM

http://en.expreview.com/2008/12/03/dual-gpu-gtx260-gx2-...

Quote:
It's now effectively confirmed that the GT200 graphics core has been updated to 55nm. We published an image of the GT200 yesterday, though many people doubted its authenticity. Either way, everything will become clear sooner or later, and we believe our readers will reach their own correct judgments.

The dual-GPU GeForce GTX260 GX2 graphics card will officially be named the GeForce GTX295. Apparently, the GeForce GTX295 is coming to reclaim the performance crown taken by the AMD Radeon HD4870X2. It will use two 55nm GT200 cores with 216 stream processors each, and will probably carry over the dual-PCB design. As of now, clock frequencies and memory configuration are not yet known.


Our source indicates that the design of the GeForce GTX295 graphics card has been finished, with trial production and testing to follow. With the big day coming closer, it's time for the GTX295 to show itself.


Seems to be on target with my original expectations. Should be nice to see how it stacks up against the 4870X2...


December 3, 2008 6:31:27 PM

Is it going to be a triple-slot cooler like that one Palit card? Because I think that thing is going to produce some serious heat, and hence cause some serious issues.
December 3, 2008 6:39:37 PM

Can't be much worse than the GX2 was...
December 3, 2008 6:42:20 PM

as long as they properly support it with drivers it will be awesome

for those with lots of cash...

AND as long as it costs around $450-500... if it costs any more it won't be bought, just like the 4850 X2 isn't being bought, because it's $400 and two individual 4850s are $300...
December 3, 2008 6:43:35 PM

People, let's not hate on the GX2; it was a pretty good card when it came out. Also, heat was a non-issue if you turned up the fan speed a tad.
December 3, 2008 6:45:19 PM

O_O I wonder how much it will cost?
December 3, 2008 6:48:09 PM

Well, the GX2 was $600 when it launched.
December 3, 2008 6:49:34 PM

Still using the two-PCB design; I wonder how much that is going to limit their clocks/overclocking.
December 3, 2008 6:52:21 PM

Very nice. Competition at the upper tier of video cards is good for both ATI and NVidia, as well as the consumer. Now if they have resolved the price fixing issue, things should finally be priced at fair and competitive market value.
December 3, 2008 6:54:07 PM

Maybe heat will be down because of the 55nm process... let's wait and see. If it's good I'll buy one for sure.
December 3, 2008 7:06:58 PM

Why do they insist on two PCBs? Isn't it more difficult to produce and harder to cool, and doesn't it force lower clock speeds? Do they just keep going this way because they don't want to copy ATI?
December 3, 2008 7:08:48 PM

frozenlead said:
Why do they insist on two PCBs? Isn't it more difficult to produce and harder to cool, and doesn't it force lower clock speeds? Do they just keep going this way because they don't want to copy ATI?


I suspect it's mainly the chip's size; it can't all be placed on one PCB.
December 3, 2008 7:36:49 PM

Dual-GPU cards should be single-slot solutions and that's it, in my opinion. Reaching quad SLI brings too many problems. I wouldn't grab another GX2 or X2 card unless I had a single-slot mobo, or I saw that the drivers had fully matured.
December 3, 2008 8:05:34 PM

Nice to see I was right. The GTX280 is simply too big/hot/power-demanding to sandwich together. Using the 260 is a slightly better option. As for which is faster, at the moment I'm thinking the X2. The 4870 is "faster" than the GTX260+ (more like equals, if anything). If Nvidia has to lower the clock speeds at all to get two working together, then the X2 will be champ (unless you SLI the GTX280, that is...). The question now is what the clock speeds will be, and will it come out anywhere near $500? (I just checked Newegg; I saw some X2s for just over that amount.)
December 3, 2008 8:15:28 PM

Niiiice. Now slap 4 of these babies on MSI's quad-SLI X58 board for (octa?) 8-core GPU gaming. lol. If only games could take advantage of that much power... well, actually I guess I'd have to wire a new 20-amp circuit to my room to power a rig that would need like 2 kW, lol. That's more than my air conditioner. Though I think I could make up the cost by removing the heater in my house, as I'm sure I wouldn't need one with this, hahah.

And then eventually, when Intel releases their 6-core Nehalem that supports Hyper-Threading, that's 12 threads per chip. Stick two of those babies on a dual-processor motherboard (lol, like Skulltrail, when they release one for this) and you'd have 24 CPU threads + 8 GPU cores. 32-core desktop gaming!!!! Can't wait!!! Hahahaha, there's something wrong with me.
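For fun, the breaker math in that post roughly checks out. A minimal sketch, assuming ~300 W per dual-GPU card and 80% PSU efficiency (both guesses, since no GTX 295 specs had been published):

```python
# Back-of-the-envelope power check for the quad-SLI joke rig above.
# All wattages are rough assumptions, not official specs.
GPU_CARD_W = 300        # assumed draw per dual-GPU card
NUM_CARDS = 4           # quad SLI on the hypothetical X58 board
PLATFORM_W = 350        # assumed CPU + board + drives + fans
PSU_EFFICIENCY = 0.80   # typical PSU efficiency circa 2008

dc_load_w = GPU_CARD_W * NUM_CARDS + PLATFORM_W
wall_draw_w = dc_load_w / PSU_EFFICIENCY

# A 20 A breaker at 120 V, derated to 80% for continuous loads.
circuit_budget_w = 120 * 20 * 0.8

print(f"at the wall: {wall_draw_w:.0f} W vs. circuit budget {circuit_budget_w:.0f} W")
# ~1938 W vs. 1920 W -- so yes, that new 20 A circuit would be needed.
```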
December 3, 2008 8:28:07 PM

There is no excuse for 2 PCBs. ATI has successfully made several dual-GPU cards on a single PCB, and the 4850X2 is over an inch longer than the 4870X2.

Also, I predict $700 for this, and it won't be worth it, just like SLI'ing GTX 280s or tri-SLI'ing anything isn't worth it.
December 3, 2008 8:45:38 PM

Let's also keep in mind that they said "probably" on 2 PCBs... not definitely. Until more information is released, especially from a substantiated source like Nvidia, this is just speculation. None of it could be true, all of it could be true. Time will tell.
December 3, 2008 8:48:30 PM

I just can't shake off the feeling that this will be a triple slot behemoth. Good for nvidia I guess, but it's not practical for anyone.
December 3, 2008 9:06:50 PM

Wow, 2 underclocked GTX 260s together to go up against a 4870 X2?! The only reason the 4870 X2 is good is because it gets around 15% more performance than 2 4870s in Crossfire. I really hope that nVidia can pull this one off so that we can move on to some new architecture, but the 2-PCB thing is kind of looking like a 9800 GX2 flop (it couldn't beat 2 8800 GTS 512s in SLI at stock). Oh well...
December 3, 2008 9:22:00 PM

It would be cheaper and quicker for the company to use 2 PCBs than 1 dual-GPU PCB. I don't know if it would be cheaper for the consumer, but it would save the company money.

Remember that these are 2 GTX 260 revisions, which will probably be able to clock higher and pack a little more punch. So we'll see how they perform; don't jump the gun too soon.
December 3, 2008 9:41:04 PM

Why would they be able to clock higher? You now have two GPUs pumping heat into a single heatsink. Unless they do something "exotic" on the cooling, they might have to drop the clocks a bit.
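The downclocking argument can be made concrete with a first-order thermal model: core temperature is roughly ambient plus dissipated power times the sink's thermal resistance. A sketch with illustrative numbers; both constants are assumptions, not measurements:

```python
# First-order thermal model: T_core ~= T_ambient + P * R_sink.
# Ignores per-die contact resistance; all numbers are illustrative.
T_AMBIENT_C = 40.0       # assumed air temperature inside the case
R_SINK_C_PER_W = 0.25    # assumed sink-to-air thermal resistance

def core_temp_c(power_w: float) -> float:
    """Estimated steady-state core temperature for a given heat load."""
    return T_AMBIENT_C + power_w * R_SINK_C_PER_W

print(core_temp_c(180))        # one ~180 W GPU on the sink  -> 85.0
print(core_temp_c(2 * 180))    # two GPUs into the same sink -> 130.0
print(core_temp_c(2 * 130))    # two downclocked ~130 W GPUs -> 105.0
```

Either the shared sink gets a much lower thermal resistance, or the per-GPU power has to come down, which in practice means lower clocks and voltages.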
December 3, 2008 9:53:29 PM

Well, if they revise the core and lower the critical temp, they'll be able to clock it higher to reach the same temp. There's no baseline to compare it to, though, considering there isn't a 260X2.

That is, of course, barring the thought that the 295 will magically be faster clock-for-clock. ;) 
December 3, 2008 9:58:06 PM

Running all those traces for a 512-bit bus on one PCB would be a feat in itself, and possibly more expensive given the redesign and added layers.

I'm hoping nVidia is wise enough to be competitive with their pricing on this, as I don't see it killing the X2. It should win, or else why build it, but not by a huge margin.
December 3, 2008 10:06:24 PM

Why would they revise the core??? The GTX295 is born the same way as the GX2 was. They are behind (or perceived to be behind) and are reacting the same way: slap two PCBs together and put SLI on a stick. They aren't re-working anything. They can adjust clock speeds, but that's about it. Because you now have two GPUs dumping heat into the same heatsink, they'll either need to upgrade the sink or downclock the speeds.

I'll even bet this won't use the new 55nm parts, assuming it comes out "soon" (3-6 months). They'll use the old parts in the GTX295 while selling the new 55nm chips as single cards. This works out better for them profit-wise. If the GTX295 gets hung up for some reason, then this might change.
December 3, 2008 10:27:58 PM

I don't think it will be feasible without the 55nm parts... There's a reason why they haven't released this thing yet, heat issues more than anything.


December 3, 2008 10:51:37 PM

Heat and power draw. With 2 PCBs, two 512-bit buses, and 2 GB of RAM (or close to it), you'll be drawing more power than the 4870X2, comparatively.
December 3, 2008 10:53:02 PM

The 4000 series' single GPUs have been known to hit 80+ degrees, while the 260s and 280s manage around 60 degrees, yet we're talking about heat issues????

I think power draw would be a reason not to do it, but heat? I doubt it. I mean, the 4870 X2 does great with heat, and again, look at its single-GPU little brothers :)
December 3, 2008 11:24:26 PM

C'mon L1qu1d, you should know better. What don't you understand about what I just said?
December 3, 2008 11:25:40 PM

Try running those G200s at 80+ and see what happens
December 3, 2008 11:31:18 PM

So you're saying that the 9800 GX2 could do 90s and the G200 cannot do 80s or 90s?

I'll take you up on your offer: I'll turn the fan speed from 33% to 10% and see what happens :) I'll downclock them to stock 280 GTX speeds.

Now, I can run 3 280 GTXs on a 1000-watt PSU, and like I said, I don't think heat will be an issue so much as power draw, which will probably need a 1200-watt PSU. These are just estimates :)

I mean, 3 280 GTXs, which pretty much blow into each other, run at 65-70 degrees under load.

I don't think this is too far-fetched.

The price might be outrageous, maybe hitting the high $600s-700s, but with a die shrink I think it's doable.
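The PSU guess above is easy to sanity-check. A minimal sketch, using the GTX 280's published ~236 W TDP plus an assumed platform draw and safety margin (the latter two are guesses):

```python
# Rough PSU sizing for tri-SLI GTX 280s. Platform draw is an assumption.
GTX280_TDP_W = 236   # published TDP of the GTX 280
PLATFORM_W = 250     # assumed CPU + motherboard + drives + fans

def suggested_psu_w(num_cards: int, headroom: float = 0.2) -> float:
    """Worst-case draw plus a safety margin."""
    return (num_cards * GTX280_TDP_W + PLATFORM_W) * (1 + headroom)

print(f"{suggested_psu_w(3):.0f} W")  # ~1150 W, between the 1000 W and 1200 W figures above
```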
December 3, 2008 11:36:44 PM

The point being made here is that it's not the temps coming out, but the temps on the core. nVidia keeps their cards/cores at lower temps for a reason, not convenience, as we've all seen higher-temp nVidia cards, as per your example.

Each card/solution has an ideal TDP, operating temp, and spec.
December 3, 2008 11:45:56 PM

So you're saying even with a die shrink this is impossible? The 9800 GX2 is just two G92 chips pasted together.

I understand what you're going for, but really, it's not impossible. So you're saying that the core temperatures of the 4000 series are lower than the Nvidia series'? I said core temperature, not the temperature coming out :)

I really want to know; I mean, I might have gotten this all wrong. No offense or anything, but your posts are becoming very narrow-minded compared to what they were a couple of months ago. That's where I'll leave it.

I just want to point out that the operating temp of the GX2 is set higher than that of the other G92 cards, yet they are all the same GPU; no G92A or G92B.
December 3, 2008 11:49:40 PM

Sorry L1qu1d, the 4870 produces less heat than the GTX 260 by a decent margin; the fan speed is why ATI's temps seem so high. Run those GTX 280s at 2% fan speed, then tell me I'm wrong. The 9800 GX2 is a bad example again. The problem is that to beat the 4870 X2 they can't just keep those cards at STOCK settings; they'd have to OVERCLOCK them by a pretty large margin to match the 4870 X2's extra performance over 2 4870s.

That all said, I hope nVidia can manage to get this thing to compete with the 4870 X2 so that prices drop and both companies can be spurred into action on the next architecture. Besides, "GTX 295" sounds like a cool name!
December 3, 2008 11:56:41 PM

What I'm saying is, the G200 isn't a G92. It can't be treated the same as far as thermals etc. go. I never said it can't be done; if it can, it will be.

What I am saying is that between the power draw and the sandwich-style design, it may be even harder. Plus, the extras the 295 will need that the 4870X2 doesn't have will make it draw more power, and POSSIBLY make for a harder-to-cool solution due to the sandwich design, even at 55nm.

It's obvious nVidia had trouble going to 55nm. So far the assumption has been heat/leakage. If that's true, making these cards will carry those problems along. I'm looking at this from a technical aspect; no fanboyism here. I think nVidia needs this card, and they need to price it right. I'm hoping they do.
December 3, 2008 11:58:22 PM

I dunno... the 4870 does run extremely hot. The fan speed isn't everything; there is still the back of the PCB where the chip is, and whatnot. Just removing my 4870 and getting this 4850 IceQ4, my overall system temps dropped by a good 1-3°C, and that's NB and CPU. At 45% fan speed, I couldn't keep my finger on the back of my 4870's PCB for longer than 3 seconds without being stupid, and that was while CCC was reading the GPU temp at 55-60°C, idle.

I won't say that this card is impossible, but it might be stretching standards a bit. By that I mean it's probably not going to be usable without GPU support brackets, aka special cases or custom-made supports. Single cards like the 4870 and even my 8800GT bend enough; this new card will probably break the damn PCIe slot.
December 3, 2008 11:58:24 PM

L1qu1d said:
So you're saying even with a die shrink this is impossible? The 9800 GX2 is just two G92 chips pasted together.

I understand what you're going for, but really, it's not impossible. So you're saying that the core temperatures of the 4000 series are lower than the Nvidia series'? I said core temperature, not the temperature coming out :)

I really want to know; I mean, I might have gotten this all wrong. No offense or anything, but your posts are becoming very narrow-minded compared to what they were a couple of months ago. That's where I'll leave it.

I just want to point out that the operating temp of the GX2 is set higher than that of the other G92 cards, yet they are all the same GPU; no G92A or G92B.


You bring up a good point. The amount of heat a card is able to sustain depends on a lot of things. The GTX 295 is very possible, but it will most likely not outperform the 4870 X2 in any real way, kind of like the GTX 260 vs. the 4870. At this point I just can't find a good reason why it would be worth it, but hey, who knows; nVidia might surprise us all. You also have to understand this GTX 295 will almost certainly cost more to produce than the 4870 X2, even with the die shrink. Which raises the question: what does nVidia have to gain from this release? They already own the top-performing setup, albeit not by much, and at that price point the consumer isn't really conscious of any price/performance ratio, so why do they need to compete at EVERY price point? It just doesn't seem like a good idea on nVidia's part, but we will find out.
December 4, 2008 12:02:08 AM

What noticeable extra performance? The only thing I see is the gain of going from 512 MB to 1 GB. I mean, the 280 GTX is only 30% slower, tops, than the 4870 X2 (of course ATI-friendly games do the ATI card wonders, just like Nvidia games do for the Nvidia cards).

You're saying that another stock 280 GTX couldn't easily close a 30% gap? Come on, I think that can be done. The 280s run at a 33% stock fan speed, I believe; are you saying the 4870 ran at 2%, or what? I don't get your point. Isn't 2% basically barely rotating? I really doubt any card could survive at 2% fan speed.

I'm not defending any company at the moment, and I'm not even saying that a 280 GX2 would be better. All I'm saying is it's DOABLE; I don't understand where these comparisons come from, blood_raven. I mean, how is the 9800 GX2 a bad example? It's a dual-PCB card running two everyday G92 chips, yet its manual allows a higher temperature than the rest.

I'm not arguing who can beat whom; frankly I don't care, because I don't see myself ever buying a dual-GPU card again. The only one I bought was the 4870 X2, because I got suckered in by the whole "it performs like a single GPU" rumour. Also, in all fairness, I'm a StarCraft fan, and I'd like to see it in DX 10.1 :)
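Whether a second card closes a "30% gap" is purely a question of SLI scaling. A quick sketch, using the 30% figure from the post above and assumed scaling efficiencies (real scaling varied a lot per game):

```python
# Can a second GTX 280 close a 30% deficit vs. the 4870 X2?
# The scaling efficiencies below are assumptions for illustration.
X2_PERF = 1.00       # normalize the 4870 X2 to 1.0
GTX280_PERF = 0.70   # "only 30% slower, tops", per the post above

for scaling in (0.5, 0.7, 0.9):  # second card adds 50/70/90% of one card
    sli = GTX280_PERF * (1 + scaling)
    verdict = "ahead of" if sli > X2_PERF else "behind"
    print(f"{scaling:.0%} scaling: {sli:.2f} ({verdict} the X2)")
```

By this arithmetic even 50% scaling noses ahead, which is why the "doable" claim isn't crazy; the open question is whether sandwiched cards could hold stock clocks at all.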
December 4, 2008 12:05:17 AM

Remember, nVidia has to have the halo card, though. It's in their blood.
December 4, 2008 12:05:57 AM

2% fan speed is about 40 RPM; the card would be dead in less than a minute. Add to that the fact that heatsinks with fans aren't designed to be run passively.
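That 40 RPM figure follows if you assume the duty percentage maps linearly onto a roughly 2000 RPM maximum, which is an assumption; real fan curves aren't perfectly linear:

```python
# Fan duty cycle to RPM, assuming a linear curve and a ~2000 RPM max fan.
MAX_RPM = 2000  # assumed maximum speed of the stock cooler fan

def fan_rpm(duty_percent: float) -> float:
    return MAX_RPM * duty_percent / 100

print(fan_rpm(2))   # 40.0  -> effectively passive, as noted above
print(fan_rpm(33))  # 660.0 -> roughly a GTX 280 at its stock duty
```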
December 4, 2008 12:09:42 AM

It's always been like this: Nvidia has the best performance but at the highest cost, and ATI can almost match it at a lower cost. That's why I was with ATI up until the 8 series and ATI's horrible 2900 XT (which sold almost as well as the 8800 GTX while performing at 8800 GTS 320 levels).

Frankly, all I'm doing this Christmas, or possibly my birthday, is grabbing a mid-range i7 CPU + motherboard and seeing what comes this March. I mean, with the amount of power 3 280 GTXs have and the scaling they get, a quad-SLI 280 GTX setup isn't that appealing to me; it's just an extra card... IF it happens.

I think a single-GPU card solution from both companies would be more appealing to me.
December 4, 2008 12:12:49 AM

The G280 may be done, but all the speculation is pointing towards a 260, not a 280, and this changes things a lot as far as performance goes, as even the 216 isn't a killer against the 4870 1GB. Like I said, if it's done, it'll win, or why do it? But it won't win by huge margins. The 280 SLI solutions don't always win against the X2, not in all games. My guess is 10% faster, which is pretty good actually, but again, nVidia needs to price this thing right.

As to the relevance, I point to the 7950 GX2: it was a horrible card, early EOL, total lack of driver support, and best left forgotten. But nVidia didn't have a response to ATI's 1900 series, so they slapped it together. Why? Because a halo product IS that important to them, and that's why we'll probably see something like a 295.
December 4, 2008 12:12:56 AM

L1qu1d said:

I think a single-GPU card solution from both companies would be more appealing to me.


I agree. There are just too many hurdles in multi-card setups, on both the hardware and software end. You almost never get your money's worth; something is always there to screw things up, be it scaling, bus bandwidth, power requirements... Sometimes I feel like SLI/Crossfire are only around as a lazy technology. Why make a stronger single GPU when you can put 2 "weak" ones together and end up spending more money? I know that's not exactly how things are, but...


December 4, 2008 12:25:39 AM

Here's another thing to look at, which I pointed out and may have been mistaken about: the die size for the 4870 is 256 mm² vs. the G200's 576 mm². Reducing the G200 by thirty-some-odd percent isn't going to get it anywhere near 256, which means there's a disparity, a large one, even at the same node.

What does this mean? Costs, possible trouble going from one node to another (thus my references to possible heat issues), and lastly delays. Remember, when ATI moved on from the 2900, the 3xxx series offered NO improvement in performance, just like here. It took ATI 3 tries to get it right, as it may take nVidia. Just speculation, but seeing the delays, you never know.
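The die-size arithmetic behind that: area scales with the square of the linear feature size, so an ideal 65nm-to-55nm optical shrink yields (55/65)² ≈ 72% of the original area. A quick check using the figures quoted above:

```python
# Ideal optical shrink: die area scales with the node ratio squared.
G200_65NM_MM2 = 576   # die sizes as quoted in the post above
RV770_MM2 = 256

shrunk_mm2 = G200_65NM_MM2 * (55 / 65) ** 2
print(f"ideal 55nm G200: ~{shrunk_mm2:.0f} mm^2")  # ~412 mm^2
# Still ~60% larger than RV770's 256 mm^2, so the cost gap remains.
```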
December 4, 2008 12:41:20 AM

L1qu1d, you missed the point entirely... the GTX 295 should be able to match the 4870 X2, but it won't outperform it in any substantial way, at least I don't see how. No one is talking about a GTX 280 GX2; we are talking about a GTX 260 GX2. Can you honestly say 1 GTX 260 and 1 4870 are not almost exactly the same? I said the 9800 GX2 is a bad example since it couldn't match 2 8800 GTS 512s in SLI. If the GTX 295 were like the 9800 GX2, it could not compete with the 4870 X2, meaning nVidia would not be likely to release it. You need to stop getting so personal about this; no one is saying it's impossible, no one is saying it won't work, and no one is saying it won't at least match the 4870 X2. I am just wondering what nVidia thinks they can gain from this, as I can't see anything.

For the record, I meant 12%, which I think is about the same RPM as the 4870 at its stock 22-23% fan speed.

Regardless, it will be interesting to see what happens. Maybe nVidia will be able to make a decent dual-GPU card; the 9800 GX2 was not.

Off topic: is it just me, or is the 4850 X2 inferior to 2 4850s in Crossfire? It seems like it only slightly beats the GTX 280, but it should do better and beat the 9800 GX2, which it only does at times.
December 4, 2008 1:33:18 AM

Well, a very large part of the 4000 series' heat issues was the fan, at least for the 4850 (the 4870 does draw a bit more power). At the stupid stock settings, my 4850 (with the initial single-slot cooler) idled well up into the 70s (and topped out who knows where; I never let it try). With the fan at 50%, it now idles in the 40s and tops out in the 60s. It's also worth noting that the original coolers are horrible for the 4850. I'm not really sure they're even copper (they look almost like painted aluminum). Oh well, it will be interesting to see what clocks NVidia can pull off with the 295.
December 4, 2008 1:37:48 AM

EXT64, I haven't seen you on in ages lol.
December 4, 2008 1:49:11 AM

Yeah, I come and go. I post a bunch when I am getting ready for a new build, as I am now. My attention sort of floats between comps, other hobbies and stuff, and college. I see your comp's been upgraded a bit since I last saw it.
December 4, 2008 1:54:55 AM

It will be a sad day if they take back the crown from ATI.
December 4, 2008 1:57:38 AM

Why? It forces competition, which forces price drops.

Nvidia already has the crown for strongest setup. ATI has the single-card crown and price/performance, and Nvidia has the crown for single GPU.

Value I give to ATI, and performance I give to Nvidia (if you have the cash).