Nvidia GT218 Card & Specs Possibly Surfacing

Status
Not open for further replies.

Mr_Man

Distinguished
Feb 17, 2008
202
0
18,680
I thought the GTX 2xx naming scheme was supposed to make things less confusing... where do they get the idea for GT218?
 

Tindytim

Distinguished
Sep 16, 2008
1,179
0
19,280
[citation][nom]Mr_Man[/nom]I thought the GTX 2xx naming scheme was supposed to make things less confusing... where do they get the idea for GT218?[/citation]

The GTX 280, 260, 285, and 295 are all based on the GT200 core. This is a new core, the GT218.
 

eklipz330

Distinguished
Jul 7, 2008
3,034
19
20,795
this just in:

nVidia's NOT renaming this time...

with all due respect, I understand taking the older generation, renaming it, and possibly lowering its place in the hierarchy and its price to fit the naming scheme, making room for newer cards... but renaming and raising the price just isn't nice
 

ubergeetar

Distinguished
Aug 13, 2008
18
0
18,510
I don't get it... It's a new core, with ZERO improvements to memory, core speed, etc. Even if it has more processors, wouldn't the 64-bit interface really slow things down? Why take such a huge step back from 512-bit GDDR3?
 

stridervm

Distinguished
Jan 28, 2008
645
0
19,010
I think the idea is for nVidia to take careful baby steps with its new video card... process.

If you remember, they only just recently released their 55nm video cards, and jumping straight to another, smaller process takes guts, as it's a real gamble. I think they're just playing it safe.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
Uber, this is a lower-end card. While very wide memory interfaces are nice, there are some extra costs they bring that cannot simply be taken away through revisions to smaller fabrication processes.

Basically, the wider the memory interface, the more pins your package needs and the more interconnects the GPU package has to have, all of which take up edge space on the die. Looking at the GPUs I've been able to gather data on, the lower end of the die size necessary for a given memory interface width is in the neighborhood of the following:
■128-bit: 100 mm²
■256-bit: 196 mm²
■512-bit: 420 mm²
Bigger dies mean a greater likelihood of any given chip being bad, and fewer chips cut from a wafer to begin with, resulting in greatly increased prices. This is why, in spite of the performance advantages it'd bring, no one moves their entire lineup to wider memory interfaces: it'd require bigger chips.
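Those die-size figures can be turned into a rough back-of-the-envelope estimate of why bigger dies cost so much more. This is a hypothetical sketch, not real foundry data: it assumes a 300 mm wafer, ignores edge loss and scribe lines, and uses a made-up defect density of 0.5 defects/cm² with the standard Poisson yield approximation Y = e^(-D·A).

```python
import math

# Hypothetical sketch: relate die area to good dies per wafer.
# Assumptions (not from the thread): 300 mm wafer, no edge loss,
# defect density 0.5 defects/cm^2, Poisson yield Y = exp(-D * A).

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Crude upper bound: wafer area divided by die area."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    return int(wafer_area // die_area_mm2)

def yield_rate(die_area_mm2, defects_per_cm2=0.5):
    """Poisson yield model: larger dies fail more often."""
    return math.exp(-defects_per_cm2 * die_area_mm2 / 100.0)

for width, area in [(128, 100), (256, 196), (512, 420)]:
    good = dies_per_wafer(area) * yield_rate(area)
    print(f"{width}-bit (~{area} mm^2): ~{good:.0f} good dies/wafer")
```

Even with these toy numbers, the 512-bit-class die yields only a small fraction of the good chips per wafer that the 128-bit-class die does, which is exactly the cost pressure being described.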

Furthermore, the wider the interface, the more RAM chips you need to actually use it. I believe a minimum of 1 DRAM chip per 32 bits of interface width is standard for video cards; hence, a 512-bit interface requires a whopping 16 DRAM chips; not good for prices.
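That chip-count rule is simple enough to spell out. A minimal sketch, assuming (as the post does) one 32-bit-wide DRAM chip per 32 bits of bus width; the 64-bit figure for the GT218 is the rumored spec from the article, not a confirmed one:

```python
def dram_chips_needed(bus_width_bits, chip_width_bits=32):
    """Minimum DRAM chips to populate a memory bus, assuming each
    chip contributes a 32-bit slice of the interface."""
    return bus_width_bits // chip_width_bits

print(dram_chips_needed(64))   # rumored GT218-class bus -> 2 chips
print(dram_chips_needed(512))  # GTX 280-class bus -> 16 chips
```

Two chips versus sixteen is a big difference in board cost, routing complexity, and PCB area for a card aimed at the low end.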

Basically, I'm guessing this will probably pack around 8 ROPs, 16 or 32 TMUs, and 32 or 64 stream processors; it will be a low-end part, probably designed, yes, to compete with the Radeon 4550. Looking at it, I'd say the reason behind these decisions is that their recent beatings have forced nVidia onto more conservative ground, where they're making their first test with a product that will cost them very, very little to make, will have a volume market that their traditional flagships do not, and could hopefully restore them to profitability.
 

nottheking

Distinguished
Jan 5, 2006
1,456
0
19,310
Bleh, I failed up there, assuming these comments followed the same code as the forums... And no "preview comment" option. Just ignore the ugly tags; it was supposed to be bullet-points, obviously.
 

hannibal

Distinguished
Actually, it really is time for Nvidia to make some new low-end cards as well! Their old very-low-end cards are based on a really old design. Nice to see some new competition in the low end too!
Just hope that this will be at least as good an HTPC card as ATI's! Nvidia's new 285 has proven to be reasonably energy efficient in the high-end sector. Maybe this time in the low end also.
 

ubergeetar

Distinguished
Aug 13, 2008
18
0
18,510
Okay, well, if it's a lower-end card, then more power to Nvidia for taking slow steps. This is probably a better approach than just slamming out a super-powered card at 40nm, where they could mess up.
 
Guest

Guest
[citation][nom]megamanx00[/nom]Guess this will have lower memory bandwidth than the 4550 so I suppose it may be a 4350 competitor. Either way it won't exactly be a gaming card ^_^.[/citation]
I'd have to say that a Radeon 4830 seems like a pretty good gaming card to me!
It can basically play every game out there except the latest, most advanced ones.
I remember when my Riva TNT2 32MB card was due for renewal, how different a Radeon 9500 was back then!
I can only imagine the 4300 to be more powerful, and perfectly suited for most games!
This newer Nvidia card with its 40nm design might be a good initiative to get a low-power-consumption card into the Nvidia lineup.
Though for now I'll stick with 55nm ATI cards!
For me, a casual gamer, a 4830 or 4670 seems like a good enough, cost-effective and power-saving card!
 