
Report: Nvidia To Launch GK104-based GTX 660Ti in August

Source: WCCF Tech | 76 comments

The Internet rumor mill has turned up another new Nvidia card.

The GK104-based GTX 660Ti is apparently slated for a launch in the first half of August, debuting at a suggested price of $299.

According to WCCF Tech, the new card will arrive with seven active SMX units, 1,344 processing cores and 1.5 GB of memory. The clock speed will be below that of the GTX 670, but performance will, according to the site, land above the GTX 580 and compete directly with AMD's Radeon HD 7800 cards.

Wccf Tech also noted that the GTX 660Ti will be Nvidia's last GK104 card and the company will be transitioning to the 700 series going forward. At a price of $299, the 660Ti makes a lot of sense below the $399 670 and the $499 680 cards.

Top Comments
  • 21 Hide
    icemunk , July 14, 2012 7:17 PM
    Woo hoo! Cheaper 7800s! I can't wait!
  • 15 Hide
    Dangi , July 14, 2012 7:14 PM
    Great news !! With this we can expect to see HD7800 series becoming cheaper as occured before with HD7900 series when GTX670 and GTX680 appeared
  • 12 Hide
    blazorthon , July 14, 2012 10:37 PM
    kristoffe wrote: Checking scores, it's almost as if the cuda cores are really just getting slammed in these 104's and not accessed properly in design. If they were a sign of proper parallel architecture, they would KILL the 560Ti, which I have 2 x 2gb in each of my rendering systems. It is not the case. Nvidia is simply engineering marketing now to keep up with ati's 'streams' when in fact the 560Ti 2gb was killing it and the power draw was reasonable. 1344~1536 should show a parallel processing advantage of at least 4~5x that of the 560Ti and the scores on various sites are just pathetic. Hopefully someone comes out with a nice hack to enable or properly access the cores, otherwise, what is the point? And this new 660Ti with only 1.5gb, what they can't afford to put in parallel 2gb? ORLY? 4gb for a great custom 680 (which I have read about but never seen IRL) yawn


    That's not how it works. First off, these cores are not the same as the cores in the 560 Ti. These are optimized for single-precision math and aren't even capable of double-precision math. They are also only half as fast as the older cores (although much more power efficient, and not only because of the die shrink) due to the abandonment of the inefficient hot-clocking method used previously. The double-precision capability of the GK104 comes only from a small number of 64-bit Kepler CUDA cores that don't do single-precision math. Since games run on single-precision math, double precision was not prioritized. This is why the Kepler cards are somewhat more power efficient than AMD's GCN-based Radeon 7000 cards. They are purely designed for gaming performance, and that is what they excel at when their too-small memory bandwidth doesn't cause too severe a problem.

    Furthermore, there is 1.5GB because it has a 192-bit bus instead of a 256-bit bus... RAM chips have 32-bit buses. You do the math on how many chips a smaller bus can take. That's right, six. Six chips times 256MiB per chip means 1.5GB of VRAM. 512MiB chips are much more expensive than 256MiB chips. For example, 8GiB DDR3 memory modules use 512MiB chips, and although their prices have improved substantially in the last few months, they are still oftentimes much more expensive than a similar 2x4GiB memory kit. Also, there is a 4GiB GTX 670 at Newegg, not that it matters, because in any gaming situation where you can use that much VRAM, the memory bandwidth holds it back so badly that you'd hate to compare it to a multi-7950 OC or multi-7970 setup... There might be a 4GB GTX 680 out by now, but I don't really care to check and, like I said, it doesn't really matter.

    So, there's not a problem with CUDA cores being improperly accessed... The problem is that you don't know the situation. Beyond that, you ignore the other factors in performance... Increasing the core count linearly does not give a linear increase in performance, because other limits such as memory bandwidth come into play. And that's all ignoring any CPU bottlenecks and other bottlenecks not directly related to the graphics card that can hold back performance.
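    The bus-width arithmetic in the comment above can be sketched in a few lines. This assumes the 32-bit-per-chip interface and the 256 MiB chip density discussed in the comment; the function name is just for illustration.

    ```python
    def vram_mib(bus_width_bits: int, chip_mib: int = 256) -> int:
        """Total VRAM in MiB: one 32-bit channel per memory chip."""
        chips = bus_width_bits // 32      # 192-bit bus -> 6 chips, 256-bit -> 8
        return chips * chip_mib

    # 192-bit bus: 6 x 256 MiB = 1536 MiB (1.5 GB), the rumored 660 Ti configuration
    print(vram_mib(192))   # 1536
    # 256-bit bus: 8 x 256 MiB = 2048 MiB (2 GB), as on the GTX 670/680
    print(vram_mib(256))   # 2048
    ```

    Doubling the per-chip density to 512 MiB is how a 4 GB card on the same 256-bit bus is built, which is why such cards cost more without gaining any bandwidth.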
Other Comments
  • -4 Hide
    esrever , July 14, 2012 7:17 PM
    wonder how many times they can cut that die down
  • 3 Hide
    vmem , July 14, 2012 7:17 PM
    hmm, a pair of these should run like a dream for any single monitor needs. 7800 prices will finally drop a little. all in all good news :) 

    wonder how the 700 series will match up to AMD's 8000 series. if the 700 series are indeed GK110 based, then the folks at AMD will have a much better handle on what its performance might be like, hopefully they use this to their advantage
  • 5 Hide
    bawchicawawa , July 14, 2012 7:25 PM
    hurry up... I want cheaper AMD prices and more cards to compare....
  • 10 Hide
    freggo , July 14, 2012 7:33 PM
    Just building a new system, so maybe an ATI price cut coming soon? Would be perfect timing.
    Also, isn't there constantly something new or updated that gets followed by a price cut somewhere else ?

    God, I wish the car companies would do that too. I'd get myself an 'older' Mercedes convertible from the bargain bin :) 
  • 0 Hide
    dudewitbow , July 14, 2012 7:37 PM
    Skipping the gtx 650, now the release schedule for both sides' next-gen gpus is close at hand.
  • 7 Hide
    17seconds , July 14, 2012 7:49 PM
    No mention of the 650 Ti and 660 non-Ti also due to be released at the same time.
  • 5 Hide
    tmk221 , July 14, 2012 7:51 PM
    huh I guess there will be no 6xx series for mainstream and low end gamers
  • 3 Hide
    Derbixrace , July 14, 2012 7:57 PM
    well im already glad i got a 670 for 339€ on sale but good that prices are coming down :) 
  • -1 Hide
    blazorthon , July 14, 2012 7:57 PM
    esrever wrote: wonder how many times they can cut that die down


    Nvidia often does it up to three times. They could cut it down as far as they want to. Heck, if they really wanted to, they could make all of their dies full GK100 dies (had they made such a die) and cut them all the way down to the bottom Kepler cards. It would almost definitely be a very bad method, but it can be done.
  • -1 Hide
    blazorthon , July 14, 2012 7:59 PM
    john_4 wrote: Have used both AMD and Nvidia throughout the years in my gaming rig builds; that goes for AMD CPUs too. I think a lot of it comes down to who has what out at the time when you do your build and how much you're willing to pay. I build my rigs on the higher-end side without going extreme but still try to make sure to get as much bang for buck as possible. Usually spend around $300 - $350 each for the CPU and video. Last time I checked the old Athlon 64x2 with an ATI card with 128Mb on-board (before AMD bought them out). Was a good system for the time and is still running strong, no breakage. If I were to build one right now this new Nvidia 660Ti would be on the top of my list for consideration paired with an Intel i7 CPU. My Gigabyte 560Ti is still running strong with my older Q9650 CPU right now and until the new kiddie consoles release I see no reason to build a new rig.


    i7 ~= i5 in desktop gaming performance in all modern games and that probably won't change any time soon... Even if it did, I don't see any way that you could max out the i5 with even two GTX 660 Tis in SLI, so you probably wouldn't get any benefit out of it.
  • 4 Hide
    rocknrollz , July 14, 2012 8:08 PM
    Nvidia really stuck out this time, I must admit. They did great with the 670, but the 680 and 690 were nothing amazing. (They were awesome though)

    Now we hear that near the end of the year they are now releasing their mid ranged cards? IMO, they are too late, as AMD is already in the talks with the 8000 series. Hopefully Nvidia follows the same strategy as AMD in that they release their cards on a month by month basis.
  • 0 Hide
    atikkur , July 14, 2012 8:33 PM
    then next year (2013) will be 700's games... i can wait for a year. i just sense this 600 series are still not the optimal version. nvidia only best on their second/third cycle of revisions.
  • 2 Hide
    atikkur , July 14, 2012 8:41 PM
    matto17secs wrote: No mention of the 650 Ti and 660 non-Ti also due to be released at the same time.


    if i were nvidia, i just let 500 series to co-exist with 600 series to serve the performance under 660ti. then focus to develop 700 series for next year or next iteration.
  • 8 Hide
    blazorthon , July 14, 2012 8:44 PM
    atikkur wrote: if i were nvidia, i just let 500 series to co-exist with 600 series to serve the performance under 660ti. then focus to develop 700 series for next year or next iteration.


    That's not a very bad idea, but it means that Nvidia would leave older, much more power-hungry cards as their mid/low end lineup competing against AMD's comparatively power-sipping cards that also tend to be cheaper.
  • -6 Hide
    icemunk , July 14, 2012 8:51 PM
    blazorthon wrote: That's not a very bad idea, but it means that Nvidia would leave older, much more power-hungry cards as their mid/low end lineup competing against AMD's comparatively power-sipping cards that also tend to be cheaper.


    Nvidia makes nice cards but they really do need to work on their power and heat efficiencies. AMD has them beat hands-down in that area at the moment. I'm not a fan of having an extra air conditioner to cool my room, just because of my video card.
  • -6 Hide
    kristoffe , July 14, 2012 9:22 PM
    Checking scores, it's almost as if the cuda cores are really just getting slammed in these 104's and not accessed properly in design. If they were a sign of proper parallel architecture, they would KILL the 560Ti, which I have 2 x 2gb in each of my rendering systems. It is not the case. Nvidia is simply engineering marketing now to keep up with ati's 'streams' when in fact the 560Ti 2gb was killing it and the power draw was reasonable.

    1344~1536 should show a parallel processing advantage of at least 4~5x that of the 560Ti and the scores on various sites are just pathetic. Hopefully someone comes out with a nice hack to enable or properly access the cores, otherwise, what is the point?

    And this new 660Ti with only 1.5gb, what they can't afford to put in parallel 2gb? ORLY? 4gb for a great custom 680 (which I have read about but never seen IRL)

    yawn
  • -5 Hide
    kristoffe , July 14, 2012 9:25 PM
    http://www.tweaktown.com/reviews/4665/palit_jetstream_geforce_gtx_680_4gb_video_card_review/index.html
  • 3 Hide
    Unolocogringo , July 14, 2012 9:46 PM
    These most likely are not a cut down die.
    They are probably the same die as the 680.
    They just had to wait long enough to get enough defective chips to launch the 660-TI model.
    All chip fabricators do this to sell defective chips that could not otherwise be sold.