
*55nm* EVGA GTX 260 Core 216 Now In Stock at EVGA.com

December 22, 2008 11:27:21 PM

EVGA has just released their 55nm GTX 260 216SP in stock-speed and SuperClocked variants. You can pick one up directly at EVGA.com. Widespread availability is expected to hit tomorrow morning.


http://www.evga.com/products/moreInfo.asp?pn=896-P3-1255-AR&family=GeForce%20GTX%20200%20Series%20Family




http://www.evga.com/products/moreInfo.asp?pn=896-P3-1257-AR&family=GeForce%20GTX%20200%20Series%20Family




In addition, Step-Up from a 65nm GTX 260 to the 55nm vanilla model is confirmed:





To anyone looking to purchase or Step-Up to one of these, here are my two cents: you might want to wait a few weeks so NVIDIA can fine-tune its production of the 55nm G200-103-B2 (GTX 260) chips, which could leave you with a better-binned card that has more overclocking headroom. I don't know if this is true, but it's worth considering.

Besides, NVIDIA is still trying to capitalize on 65nm fabrication with all those consumers who don't know or care about the difference. That can only mean that 55nm production will get better over time. ;)

P.S. The B3 revision is the final revision for the GT206 chips.
December 22, 2008 11:47:26 PM

What do you mean "vanilla model"? How do you know which one is the "vanilla model"? Most of their models end with TR or A1 or AR. I have never seen anything with B.
December 23, 2008 12:00:37 AM

XSmax said:
What do you mean "vanilla model"? How do you know which one is the "vanilla model"? Most of their models end with TR or A1 or AR. I have never seen anything with B.


The term "vanilla" always refers to the stock model. In this case, it's part number 896-P3-1255-AR (first link).

Look at the four digits after 896-P3; the pattern is XXXX-AR, and those four digits are what change between models of the same type of card.
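
As a rough illustration of how those part numbers split up (a minimal sketch; the field meanings are my reading of the pattern described above, not an official EVGA spec):

```python
# Minimal sketch: split an EVGA part number into its pieces.
# Field meanings are assumptions based on the pattern described above,
# not an official EVGA specification.
def split_part_number(pn):
    prefix, series, rest = pn.split("-", 2)   # "896", "P3", "1255-AR"
    model, suffix = rest.rsplit("-", 1)       # "1255", "AR"
    return {"prefix": prefix, "series": series, "model": model, "suffix": suffix}

print(split_part_number("896-P3-1255-AR"))  # stock-clocked model (first link)
print(split_part_number("896-P3-1257-AR"))  # SuperClocked variant (second link)
```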
December 23, 2008 12:31:06 AM

Interesting, but I'm not thrilled about this. Is it me, or is Nvidia bumping the prices up again? The GTX260 was slower than the 4870, so they came out with the GTX260"+"/216, which put it above the 4870. Now we have the 55nm chips, which have higher clocks, and it's now a ~$300 card? The 4870 is starting to hit the sub-$200 price point, and Nvidia is charging even more now? If Nvidia knocked $75 off those prices, I'd be impressed/happy. Looks like they are going back to charging as much as they can again...
December 23, 2008 12:34:05 AM

Did you really expect any different from Nvidia?
December 23, 2008 12:44:12 AM

To be fair, this isn't Newegg, and it is intro pricing. If they aren't selling in a month, watch for the price drop.
December 23, 2008 12:47:10 AM

No, but one could hope. I also wonder if AMD has anything to strike back with. If they have some super clocked 4870s, the speed race might be back on. Even if they don't, we are getting to the point where it doesn't matter for most of us. I would bet most of us here game at 19x10/19x12 or less, which either card can handle. Seeing as the 4870 512MB is under $200, and the 1GB can be found for ~$225, I'm not sure why I should spend $280 and get the GTX260.
December 23, 2008 12:52:13 AM

4745454b said:
No, but one could hope. I also wonder if AMD has anything to strike back with. If they have some super clocked 4870s, the speed race might be back on. Even if they don't, we are getting to the point where it doesn't matter for most of us. I would bet most of us here game at 19x10/19x12 or less, which either card can handle. Seeing as the 4870 512MB is under $200, and the 1GB can be found for ~$225, I'm not sure why I should spend $280 and get the GTX260.


I game at 2048x1152 (16:9) on a 23" Samsung 2343BWX. The resolution is very similar to 1920x1200, only off by about 55,000 pixels. I use a single GTX 280 @ 670x2430, but to be honest, Crysis still suffers. If anyone's looking to play very GPU-intensive games like Crysis, Fallout 3, and NFS Undercover, I would suggest at minimum GTX 260 Core 216 SLI or a GTX 295 at 1920x1080 or higher.
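
For what it's worth, the pixel math checks out; a quick back-of-envelope:

```python
# Pixel-count comparison between the two resolutions mentioned above.
native  = 2048 * 1152   # 2,359,296 pixels (2343BWX native res)
compare = 1920 * 1200   # 2,304,000 pixels
print(native - compare)  # 55,296 -- roughly the "55,000 pixels" difference
```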



http://firingsquad.com/hardware/nvidia_geforce_gtx_295_...
December 23, 2008 12:55:56 AM

Yeah, it seems nVidia is still playing the high-price game when there's a much cheaper and very competitive alternative out. I'm wondering if they think everyone's sold on their BB2 drivers and their quad optimizations? And Badaboom, PhysX, et al. Sounds like they're selling a package and forgot about performance, and once again, about competition.
December 23, 2008 1:38:36 AM

Freak, how could you quote and not read what I wrote? As I said, MOST of us use that res or smaller. Crysis at high settings at that res with 0AA or 2AA would probably be around 30FPS. Again, MOST of us don't need the massive cards that are being put out. We've reached the limit of current displays' abilities; only more AA or newer games drive GPUs now.

What I'm trying to get at is that back in the day, it was a big thing if your computer could hit 1024x768, then it was 1600x1200. We are now at the point where many games can be played at 25x16; the question is how many layers of AA you can also do. I'm not sure how many uber-clocked GTX+++ cards we really need. Why spend $300 on a card that doesn't really allow new settings (because you've already maxed out your monitor's resolution) if a $200 card will still play that res? Nvidia is going down the wrong road.
December 23, 2008 2:37:44 AM

4745454b said:
Freak, how could you quote and not read what I wrote? As I said, MOST of us use that res or smaller. Crysis at high settings at that res with 0AA or 2AA would probably be around 30FPS. Again, MOST of us don't need the massive cards that are being put out. We've reached the limit of current displays' abilities; only more AA or newer games drive GPUs now.

What I'm trying to get at is that back in the day, it was a big thing if your computer could hit 1024x768, then it was 1600x1200. We are now at the point where many games can be played at 25x16; the question is how many layers of AA you can also do. I'm not sure how many uber-clocked GTX+++ cards we really need. Why spend $300 on a card that doesn't really allow new settings (because you've already maxed out your monitor's resolution) if a $200 card will still play that res? Nvidia is going down the wrong road.


On the contrary, can you play something maxed out at 2560x1600 with all shader / shadow / post-processing settings on max? That's where shaders come in. The GPU race isn't necessarily about adding AA layers; it's becoming more and more about fill rate. With the move toward the unified shader architecture in DX10 and beyond, the idea is to reach a point where games can display shadows, post-processing effects, and PhysX effects in unison at such high resolutions without being hammered by slow draw times.
December 23, 2008 2:51:22 AM

Huh? I can't because my system isn't the best, but others can.

http://www.hardocp.com/article.html?art=MTU2MiwzLCxoZW5...

Here is [H]ard testing the 1GB version of the 4870. Yes, not all the games are at 25x16, but 19x10 is the lowest res used. Crysis and Stalker are not maxed out in res or details, but GRID and CoD4 are. Age of Conan is maxed out in details, but has to use the lower 19x10 res. If most (90%? 95%?) of us use 19x10/19x12, we are reaching the point where only the newest games will need more GPU power. We are starting to reach the top edge of what current display technology can do. That's the point I was trying to make. If a $200 card can max out your 22" LCD, why spend $300 on a card that can do that also?
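
To put rough numbers on that question (a sketch only; the prices are the ones quoted in this thread and the relative-performance figure is a placeholder, not a benchmark result):

```python
# Rough performance-per-dollar framing. Prices are the figures quoted in this
# thread; the relative-performance values are placeholders, NOT benchmark data.
cards = {
    "HD 4870 1GB":      {"price": 225, "rel_perf": 1.00},  # baseline (placeholder)
    "GTX 260 Core 216": {"price": 280, "rel_perf": 1.05},  # placeholder estimate
}

for name, c in cards.items():
    print(f"{name}: {c['rel_perf'] / c['price'] * 100:.2f} relative perf per $100")
```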
December 23, 2008 2:58:49 AM

4745454b said:
Huh? I can't because my system isn't the best, but others can.

http://www.hardocp.com/article.html?art=MTU2MiwzLCxoZW5...

Here is [H]ard testing the 1GB version of the 4870. Yes, not all the games are at 25x16, but 19x10 is the lowest res used. Crysis and Stalker are not maxed out in res or details, but GRID and CoD4 are. Age of Conan is maxed out in details, but has to use the lower 19x10 res. If most (90%? 95%?) of us use 19x10/19x12, we are reaching the point where only the newest games will need more GPU power. We are starting to reach the top edge of what current display technology can do. That's the point I was trying to make. If a $200 card can max out your 22" LCD, why spend $300 on a card that can do that also?


Because with a $300 card you are paying for more image quality and eye candy over the $200 card. Of course both cards will be able to play at that res, but one of them is going to give you a few more options you can tick up to "Very High" settings.
December 23, 2008 3:12:17 AM

At first I couldn't understand why you can't seem to get this. Now I see you're the OP, so it's all clear to me now. Did you click and read ALL the pages in my [H]ard link? Yes, some games are not maxed, but others were. ALL detail settings were on high and were being played at 25x16; the only difference is one had 8AA and the other was "only" 4AA. Yes, there is some more work to go, but again, we are reaching the end of what displays can handle. I'm sorry if you can't seem to grasp this; I'll stop talking about it.

December 23, 2008 4:54:30 AM

I agree. And I've been talking about this for a while, mainly in the CPU section. I know PhysX etc. will try to be run on nVidia cards, but unless it's a multi-card setup, it's doomed to failure, as at least so far CPUs scale much better than GPUs, and the extra cores on a CPU can and will do PhysX.

Another thing that needs to be talked about here is DX11. Tessellation comes with it, and that's where GPUs may hit a challenge again, depending on the game and the devs' use of tessellation. Having programmable shaders included will be interesting as to which shaders each game and design ends up favoring, given their design/capabilities. Basically, GPUs have caught up to CPUs and have exceeded current resolutions for output, at least for the near future; we will be able to enjoy newer and better eye candy, at the cost of ever-increasing GPU functionality.
December 23, 2008 7:13:47 AM

JD, that's a moot point with the release of OpenCL... soon "PhysX", or to be specific, CUDA programming, will be available on both nvidia and ati chips. GPU parallel programming is here!

...just as soon as nvidia and ati get their act together
December 23, 2008 10:30:02 AM

There's definitely a difference between using PhysX and using CUDA; I'm making that distinction here. I agree, rendering etc. will be good using CUDA, but my reference is strictly to PhysX, or in-game physics. Since both AMD and Intel are going Havok, I'm thinking nVidia will get squeezed out. And I'm speaking from a CPU vs. GPU standpoint. The cost of PhysX on GPUs is too high vs. having unused cores on your CPU doing it. As more CPU cores become the norm, PhysX on GPUs will end.
December 24, 2008 2:40:03 AM

JAYDEEJOHN said:
There's definitely a difference between using PhysX and using CUDA; I'm making that distinction here. I agree, rendering etc. will be good using CUDA, but my reference is strictly to PhysX, or in-game physics. Since both AMD and Intel are going Havok, I'm thinking nVidia will get squeezed out. And I'm speaking from a CPU vs. GPU standpoint. The cost of PhysX on GPUs is too high vs. having unused cores on your CPU doing it. As more CPU cores become the norm, PhysX on GPUs will end.

This is absolutely an advantage ATI has atm; remember, Blizzard went Havok with Diablo and probably SC2.
December 24, 2008 2:57:05 AM

But Havok is software-based physics, isn't it? Like what's used in Oblivion?

And no matter how many cores are added to a CPU, they will never be as fast as 800 cores running at 750MHz...
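
Back-of-envelope, using theoretical peak numbers only (assumed multiply-add throughput per clock, not measured physics performance):

```python
# Back-of-envelope peak-FLOPS comparison, theoretical numbers only.
# Assumes each GPU shader does one multiply-add (2 FLOPs) per clock and the
# CPU does 4-wide SSE multiply-add (8 FLOPs) per core per clock -- both are
# idealized assumptions, not measured throughput.
gpu_peak = 800 * 750e6 * 2    # 800 shaders @ 750 MHz -> ~1.2 TFLOPS
cpu_peak = 4 * 3.0e9 * 8      # quad-core @ 3 GHz     -> ~96 GFLOPS
print(f"GPU peak: {gpu_peak / 1e12:.2f} TFLOPS")
print(f"CPU peak: {cpu_peak / 1e9:.0f} GFLOPS")
print(f"Ratio:    ~{gpu_peak / cpu_peak:.0f}x")
```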
December 24, 2008 10:34:50 AM

I agree, but what's the cost? If it's perfected, and if nVidia ever decides to go multi-core instead of monolithic, and there are some improvements in scaling with multi-core GPUs, then of course it's THE way to go. But currently you need another GPU to really make it shine, and it hasn't been perfected. Let's hope it eventually is, as I agree it's truly a faster approach.
January 5, 2009 1:46:01 PM

WOW... I just saw Newegg jacked the price of the GTX 260 cards up. What is with that? The 55nm GTX 260 from EVGA is now $284.99. I bought mine last week for $254.99. Good thing I jumped on it when I did; it's waiting for me to pick up at UPS right now.

I was looking to upgrade my card regardless, as I sold my 8800GTS/512 on eBay and needed a replacement. Funny how the price jumps. Heck, if I had waited I would have considered the 4870 1GB for the nice price difference.
January 5, 2009 1:46:54 PM

Well heck, did they jack up the price of the 4870 1GB too?
January 5, 2009 2:01:23 PM

I'm thinking the release of the 285s and 295s has a direct effect on the 260 pricing, so there isn't as large a gap from one to the other.
January 5, 2009 2:17:03 PM

EVGA's website is cheaper than Newegg; the price did not increase on their site. I do know that I purchased the 55nm GTX 260 and later that day they were sold out. So we shall see. I saw a few reviews that said the cooler on the 55nm is not as hearty as the 65nm version, but I have no basis for comparison. The stock coolers have been pretty heavy and hearty in my opinion; no need for aftermarket coolers these days.
January 5, 2009 2:24:03 PM

I've heard the same thing, though I'm sure it's adequate too, even if it's "downsized". Maybe a mix of less power and slightly higher clocks plus the shrink allowed for this, thus reducing costs.
January 5, 2009 2:28:44 PM

But not a reduced price. Which means Big Green is making.... Big Green. Oh well, such is the path we chose for playing PC games. I get crap on a daily basis from co-workers to go console.
January 5, 2009 2:37:52 PM

They had to do this. Look at die size alone, plus the PCB and wire tracing etc.; margins are guaranteed to be slim with the former solution. Now it at least gives them some room, but if ATI jumps first to 40nm, or has a surprise coming soon, they'll still be in for low margins, as nVidia simply can't lower their prices too much on the 200 series.
January 5, 2009 4:21:02 PM

Well, I was just home over my lunch hour; nice to live 2 miles from work. I stopped by UPS to pick up my new card. I had time to install this GIANT monster, install the drivers, and run 3DMark06. It got 13271. Hmmmm... which for some weird reason is right where my OCed 8800GTS/512 was hitting. What is the deal with that? Granted, I don't really go by 3DMark. I will see what happens when I run the Far Cry 2 bench. That is a drag.

I formerly had an 8800GTS/512, and this thing is about 1.5" longer and 2 slots thick the whole way back. It is surprisingly light for its size though. I plugged her in and sat for several minutes. It idled at about 46C, which is nice; my 8800GTS idled at about 50-52C in my room this time of year. I like a toasty warm apartment.

And guess what? Damn me if they didn't screw up my card and send me the OCed version. I triple-checked myself and checked in RivaTuner (which is just sitting idle for my OSD right now). It showed 626MHz core and 1058 memory. OH FREAKIN' DARN!

January 5, 2009 5:36:14 PM

SPAM!
January 6, 2009 11:06:22 AM

May I add that the cooling on the 55nm is not very good compared to the 65nm, and it gets HOT! At idle it's nice, but man, it will hit 80C in Crysis fast. I was wrestling with this card last night and the fan profile is HORRIBLE; it sits at 40% duty cycle even at 80C. I was creating my own profile in RivaTuner just to cool the thing.

My suggestion: stick with the 65nm! I should have stuck with my gut and gotten the XFX card.
January 6, 2009 11:41:15 AM

Hmmm, I hadn't heard about this. I do know that 40nm isn't expected to bring much higher clocks, but rather lower temps and power, sort of like the R600-670 move. Maybe nVidia, ramping clocks along with a die shrink, is experiencing at 55nm somewhat what ATI has?
January 6, 2009 12:12:43 PM

I am crossing my fingers that a new card comes out and I can use their Step-Up program. I am starting to consider RMAing my card because it was showing artifacts in Crysis once it got to high temps. I am taking it over to my buddy's house to re-goop the GPU. I've seen cards that have not had proper paste applied from the factory, and that caused issues. It would save me the shipping at least. If this doesn't work, I will RMA the thing. Plus it has a lifetime warranty, does it not?
January 6, 2009 12:23:29 PM

Yeah, EVGA is really good for this. Your options may be limited to the 285 (the die-shrunk 280) for the Step-Up program though. I know a lot of the ATI cards needed more goop on them, so it's possible. After ATI released their 4xxx series at those low prices, it put a new perspective on manufacturing costs, especially with slim margins and low sales.
January 6, 2009 12:36:53 PM

Yeah, my X1900XTX benefited from it several years ago. My buddy has some Arctic Silver ceramic paste that we are going to slap on and see what happens.

But what's the worst case? I have to RMA it; I have 30 days to do so with Newegg, ya know? The second worst case is I can't overclock the thing; frustrating, but oh well.
January 6, 2009 12:47:34 PM

Maybe the cream will be your solution, hopefully. I've heard of hot 280s, but not many (or any) hot 260s. It may be what's wrong, and it'll be fine. Good luck.
January 6, 2009 12:58:59 PM

I don't seem to be the only one. People are starting to post reviews on Newegg, and they all state good idle temps but 80C load temps, which is what I'm getting. But when you go to the SC version, everyone can OC the crap out of their cards. Not sure what the deal with that is. This is my first experience with eVGA; I have gone XFX before. I've always read that XFX has very good fan profiles, whereas eVGA leans toward quiet over performance. I can make my own in RivaTuner, which is no biggie, but it bothers me that at stock the card can't hold its own.

On a side note, my buddy just upgraded his MB, CPU, and memory to an i7. He has SLIed 8800GTXs and this thing ROCKS now. 20K in 3DMark06. He said it opens them up like mad.
January 6, 2009 1:06:42 PM

Well, Mike Rowe would own one too, so when're you getting yours? heheh

Anyways, this is news to me; I had no idea the new 55nm ran so hot. I'm wondering how hot the 285s are going to get? Or how loud?
January 6, 2009 1:44:37 PM

Yeah, I dunno. My thought is, weren't the 260 Core 216s just crippled GTX 280s? Maybe the new 55nm chips are stamped and binned as 260s, period, and therefore aren't designed to run at the higher clocks. I put mine to 700MHz and it artifacts to crap. At 675 it's not too bad, but the fan profiles and new goop might fix that. EVGA has the SSC version out at 675MHz. If I can achieve that speed I'll be happy. My 8800GTS was 650 stock and OCed to 750. I was happy with that, and it could do 800 if I ran the fans at 80%. Oh well, such is the OC world; you win some, you lose some.
January 6, 2009 10:44:05 PM

Profit had to come from somewhere. You didn't think it was going to come from the shareholders' pockets, did you?
January 6, 2009 11:05:12 PM

AuDioFreaK39 said:
Because with a $300 card you are paying for more image quality and eye candy over the $200 card. Of course both cards will be able to play at that res, but one of them is going to give you a few more options you can tick up to "Very High" settings.


No, you're not.

Your original post didn't have the single HD4870 1GB to illustrate 47's point, but look at Xbit's results and you'll see we're not talking about a BIG difference here:
http://www.xbitlabs.com/articles/video/display/radeon-h...

You're not getting 50% more from the GTX260-216 or even the GTX280, but you're paying that much more, and it's not like you magically get a significantly higher shader model or even feature access. Be it the old GTX260 or the new one, if the boost is minimal but there's a return to the original prices, what's the major benefit?

I'd say like everything it suffers from launch pricing, just like they all do. It's nice to see the new GTX260, and it's a good choice for many people looking for that class of card with some improved temps and power stats, but it still gets a lot of pressure from the existing market participants, even from nV herself.

You mention the GTX295, but that's an entirely different class, and no, the GTX260-216 cannot hang with that card. It truly does offer more, but it also costs significantly more, just like the X2, and as mentioned, for most people who tend to game at or below 19x12 it's a question of diminishing returns. Best to save the premium toward the next generation of cards coming later in the year.

Quote:
That's where shaders come in. The GPU race isn't necessarily about adding AA layers; it's becoming more and more about fill rate. With the move toward the unified shader architecture in DX10 and beyond, the idea is to reach a point where games can display shadows, post-processing effects, and PhysX effects in unison at such high resolutions without being hammered by slow draw times.


If that were the case, then you'd want to get the card that's better at complex pixel shader processing, which would be the HD4K. However, we're still not there yet; very, VERY few games involve overly complex shaders that push the envelope. GRiD is one of the few that has long, complex shaders, and there the HD4850 can outpace a GTX280, so the GTX260 would be a wash too. However, it's not the complex unified DX10 shaders that matter; it's the short and quick shaders that both can crunch like old-school DX9, even if they have DX10 instructions in them. Really, it's still a brute-force world of gaming out there, and the GTX260-216 isn't significantly better at it to justify a price increase to the same level. Heck, under 19x12, two cheap HD4850s or even GF9800GTX+s would probably destroy anything in the $300 range.

As for PhysX, c'mon, that's like someone mentioning DX10.1 as being worth extra money. For PhysX you'd be better off with a GTX260 65nm and an $85 dedicated Ageia PhysX PPU.
January 6, 2009 11:21:40 PM

V3NOM said:
JD, that's a moot point with the release of OpenCL... soon "PhysX", or to be specific, CUDA programming, will be available on both nvidia and ati chips. GPU parallel programming is here!

...just as soon as nvidia and ati get their act together


You might want to actually see what both of them have to say about that as well, because you're wrong about CUDA being on both. OpenCL will be on both, but ATi and nV want the landscape to remain CAL, CUDA, Brook+ and OpenCL, with OpenCL being the platform for those wanting to be on all platforms. nV and ATi might port their apps to OpenCL, but unless THEY specifically do it, it won't happen easily, and it will essentially need an inefficient wrapper to make it run on something else, even OpenCL.
January 6, 2009 11:43:57 PM

JAYDEEJOHN said:

Anyways, this is news to me; I had no idea the new 55nm ran so hot. I'm wondering how hot the 285s are going to get? Or how loud?


A few things to add to that. First, some observations:
A) The copper contact surface is also smaller. The original copper contact was only partially used, as you can see by the thermal grease left on it, but the extra copper would still aid in quickly dissipating heat away from the core.
B) The metal cap for the die is the same size as the 65nm one; how much of the chip is close to the surface around the edges? This gives it the same contact point with the HSF copper ingot, but it means the surface-area question/issue is happening inside the cap.
C) Notice both the new and old HSF had contact points for all the memory modules, but the 65nm version only used half of them, since it was double-sided and relied on air to cool the backside chips. In this single-sided scenario for the 55nm, all the memory is added to the cooling burden of the noticeably diminished/reduced HSF; this would impact the cooling negatively on its own, even if there were no other changes. The question in the development stage was likely: do the benefits of the 55nm step make up for all of these?

The other thing that I was interested in seeing (and obviously won't be an issue in the same way) would've been the contact-surface dissipation point and how much that would've changed (it's no longer fully visible); one of the problems of smaller chips, of course, is trying to dissipate all of your heat from a much smaller point on the HSF, concentrating everything more.

It's a strange choice to put all of these things together from a design perspective, but of course from a cost perspective it makes sense at every turn: reducing HSF size reduces cost, one-sided PCB memory mounting reduces costs, and keeping the old cap reduces costs (minimally, I would think). I doubt it's going to be a major issue, but just like with the HD4K series, we're going to see a lot of people quoting temperatures with great concern in their posts, regardless of whether things are working fine or not.

It looks like nV is following ATi's lead in this and I would think they are leveraging every production cost penny so that they can fight on price too.
Die size will not be the only factor in their margins.
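
To put rough numbers on the heat-density point (die areas and board power below are approximate/assumed figures, purely illustrative, not measurements):

```python
# Rough heat-density comparison for the 65nm vs 55nm GT200 parts.
# Die areas are approximate reported figures and the board power is an assumed
# ballpark held constant for both -- illustrative only, not measurements.
power_w = 180.0   # assumed similar board power for both parts
die_areas = {
    "65nm GT200":  576.0,  # approximate reported die area, mm^2
    "55nm GT200b": 470.0,  # approximate reported die area, mm^2
}

for name, area_mm2 in die_areas.items():
    print(f"{name}: ~{power_w / area_mm2:.2f} W/mm^2")
# At the same total power, the smaller die concentrates more heat per mm^2,
# which is one reason a trimmed-down cooler can run noticeably hotter.
```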
January 7, 2009 1:30:20 AM

This stunned me. The metal cap is the same size? Ummm, isn't that rather inefficient? Maybe they don't want to spend on tool-and-die for a new cap? That's just nuts.
Having the memory covered will more than likely keep the modules cooler, but yeah, like you said, it adds to the HSF's cooling responsibilities. Thanks TGGA for pointing these out, and yeah, going smaller is going hotter per mm, and all those who cried foul about the 4xxx series will find this out on the green side as well.
January 7, 2009 2:07:52 AM

I remember a thread over at B3D a while back where some of the guys were giving Wavey Davy a hard time about the 770 heat issues and high idle clocks and heat, and he mentioned he wouldn't be surprised if nVidia went this direction as well. It's taken all this time to figure out exactly what he was referring to, heheh.
January 7, 2009 3:19:40 AM

Yeah, we were talking about this well over a year ago when discussing reducing size and its three major issues: gate and wire density/proximity with increased leakage and cross-talk, memory interface size and reliability, and cooling surface area. I think it's really no surprise to many, but it's hard to explain to people who think you're just giving excuses at the time (edit: not when we were discussing it, but for Wavey), because they can't fathom how a smaller die with a lower transistor count on the smaller fab process could possibly be hotter. And you just see it in action here.

I think the people who will reap the greatest reward from the 55nm G206 will be those using their own cooling solutions, either 3rd-party or something more exotic, or those whose IHVs at least maintain or improve the stock cooling.

Another possible line of thinking with the GTX260 refresh would be to somewhat limit its cooling to minimize the potential for overclocking to intrude too much on the territory of the GTX280 65nm and to maintain its higher value, which is already dropping to nearer $300 to help get rid of old stock. I would be surprised if the GTX285 suffered the same limitations, since the GTX295 should have a healthy enough difference to not be worrisome.

It's strategically smart for this round, but a little tricky, and it'll be interesting to see the reaction: whether jay2tall is the majority or the 'hey, it's new, it's better, period' crowd is the majority, or maybe not the majority but the most influential, for this round and most importantly leading into the next one in the summer, which could have some interesting feature differences again.
January 7, 2009 3:35:04 AM

It reminds me of the 1900XT-XTX thing, very close in perf.
Ape, what do you think the OCing abilities of the 295 will be? 289 watts.... cutting it close there.
January 7, 2009 3:44:23 AM

I think the GTX295 will be a terrible overclocker out of the box, and likely voltage-limited to boot.

But I think the GTX '265' and '285' should be great overclockers when properly cooled.

The main thing we still don't really know is stability. Some could be good, others not; it's not easy to guess until a lot of people start testing them to the max.
January 7, 2009 3:49:59 AM

That's my thought as well on the 295, and people using nVidia usually like to OC, or a lot do. The 295 won't really offer alternative cooling, unless it's some weird and monstrous water-cooled kit/prebuild.
I agree, the 260/216 is so close to the 280 that it'll prevent nVidia from any premium pricing, which I have a feeling they want to get back with the 285, as the 280 prices have tanked.