Nvidia Announces Three Versions of GeForce GT 730
Nvidia has quietly announced its new GeForce GT 730 graphics card. Well, we'd be wrong to call it a graphics card, as there will actually be three different versions of the card, all of which are wildly different from one another.
The first model comes with 96 CUDA cores and a 128-bit memory interface driving 1 GB of DDR3 memory. The GPU is clocked at 700 MHz, and the memory runs at an effective frequency of 1.8 GHz.
The second and third models will both carry 384 CUDA cores clocked at 902 MHz. One of these will carry 2 GB of DDR3 memory running over a 64-bit memory interface at a frequency of 1.8 GHz, while the other will address just 1 GB of GDDR5 memory, though it will still run over a narrow 64-bit bus.
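For reference, peak theoretical memory bandwidth follows directly from the bus width and the effective transfer rate. A quick sketch of the arithmetic for the two DDR3 configurations (the announcement doesn't state the GDDR5 variant's effective clock, so it is left out; `bandwidth_gbps` is just an illustrative helper, not anything from Nvidia's materials):

```python
def bandwidth_gbps(bus_bits: int, eff_rate_gtps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width in bytes) x (effective GT/s)."""
    return bus_bits / 8 * eff_rate_gtps

# 96-core model: 128-bit bus, 1.8 GT/s effective DDR3
print(bandwidth_gbps(128, 1.8))  # 28.8 GB/s

# 384-core DDR3 model: 64-bit bus, 1.8 GT/s effective
print(bandwidth_gbps(64, 1.8))   # 14.4 GB/s
```

Note that the 384-core DDR3 card, despite having four times the shader count, gets half the memory bandwidth of the 96-core card thanks to its narrower bus.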
We especially do not understand why the first model, with just 96 CUDA cores, is called a GT 730; this will only cause confusion among customers. All of these cards run over a PCI-Express 2.0 interface, and are probably geared towards customers who need a discrete graphics card for the most basic desktop tasks on older systems, as many on-die graphics solutions are likely to outperform these cards anyway.
Expect AIBs and AICs to come out with multiple versions of the card, in all shapes and sizes, including single-slot and low-profile cards. No word on exact pricing yet, though we'd be surprised if these cost more than a few tens of dollars, depending on the model, of course.
Follow Niels Broekhuijsen @NBroekhuijsen. Follow us @tomshardware, on Facebook and on Google+.

I know this is not what the cards are made for, but Nvidia is hoping grandma will pay an extra $30 for a "prettier" display.
And while underhanded, it does make sense economically.
What does not make sense is why give one of those cards 2 gigabytes of RAM when it has no reason to have 2 gigabytes of RAM. Any game you try to play besides DOS games will run at super low fps without anti-aliasing. Nvidia could shave $20 or so off the price, or even just pocket it, rather than give the card 2 gigabytes of RAM.
It looks exactly like a GeForce GT 620. According to Wikipedia the clocks and the cores are the same.
96 CUDA cores! That is as many cores as in my old GT 540M that I still use.
That being said, I can't wait for EVGA's FTW version with 2GB and the ACX cooler.
And please fix the DDR5 error.
This type of gig is pretty common on low-end SKUs, especially in mobile.
Maybe 3-4 years ago, when USB display adapters were hokey black magic, but in a world with $120 750s and $140 750 Tis, and even a few hot deals on something like an old-stock 7770, this is really just... profit bait?
I think the 2GB thing is because of people that still look at graphics cards only in terms of how much memory they carry... I'm sure they overcharge for it so they make some more money off the ignorance of some consumers.
It is relevant to those who do picture/movie editing (as a hobby, not pros), where more RAM is always welcome. In some situations you can also slightly accelerate the edit by utilizing the GPU to render.
It also somewhat lowers CPU usage (no iGPU).
It can also give life to old systems that can't handle HD content when using the proper codecs.
But no question: 2 GB and this tiny fan are unnecessary. 1 GB and a fanless design is the way to go with these little brothers.