
Kepler Rumors

Last response: in Graphics & Displays
October 14, 2011 10:16:57 PM

I don't know much about Kepler, and I'm not sure anyone does at this point. I currently own dual EVGA GTX 570 SCs in SLI and run three monitors at 5760x1080. Bottom line: I don't have enough memory. I can run games, but not on the highest settings, and my memory usage is constantly bumping against the 1.25GB capacity. I've read rumors that the BF3 beta is swallowing up almost that much memory on a single screen. That raises the question: will the 3GB you can get with a GTX 580 even be enough within 18-24 months?

That brings me to the topic of my post: what are the rumors out there about Kepler? Is it supposed to be a big improvement over the 500 series? What are the big improvements? Are we going to see substantial memory increases to deal with the growing popularity of multi-monitor gaming? I'd be pretty disappointed if I waited to buy Kepler and Nvidia rolled out something like they did with the 400 series.

I want a finished product, not something buggy with poor power/thermal properties. The EVGA GTX 580 Classifieds really pique my interest because the 500 series is very mature by now (a finished product), and the Classified version is really beefed up to handle extra clocks, power, and heat. Would it be stupid to drop $1200 on a pair of 580 Classys with Kepler in the not-too-distant future? Opinions?
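For what it's worth, the memory pressure at 5760x1080 can be sketched with some back-of-envelope math. This is only a rough sketch: the buffer count, bytes per pixel, and MSAA multiplier below are illustrative assumptions, and real VRAM usage is dominated by textures, shadow maps, and driver overhead on top of the framebuffers.

```python
# Rough framebuffer math for surround gaming (a sketch; real VRAM usage
# also includes textures, shadow maps, render targets, and driver overhead).

def framebuffer_mb(width, height, bytes_per_pixel=4, msaa=1, buffers=3):
    """Approximate size of the color buffers alone, in MB (assumed triple-buffered)."""
    return width * height * bytes_per_pixel * msaa * buffers / (1024 ** 2)

single = framebuffer_mb(1920, 1080)            # one 1080p monitor
surround = framebuffer_mb(5760, 1080)          # 3x1 surround
surround_4x = framebuffer_mb(5760, 1080, msaa=4)

print(f"1080p, no AA:     {single:8.1f} MB")
print(f"surround, no AA:  {surround:8.1f} MB")
print(f"surround, 4x AA:  {surround_4x:8.1f} MB")
```

The framebuffers alone are small next to 1.25GB; it's the per-pixel render targets and assets scaling with three screens plus AA that eat the rest.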


October 14, 2011 10:40:22 PM

Meh, the AMD card would have to blow the living socks off Nvidia's before I'd buy it. I've always been an Nvidia guy, and specifically a big EVGA fan. Their products have always been great to me, and their customer support is second to none.

I have no problem waiting if "early next year" means January, maybe into February. But I know how these things go, and I wouldn't be a bit surprised if we don't see them until May, and that would really suck.

But that was my main question: if the 7000/600 series cards will make me cry, then I'd better wait. What do we typically see in performance gains from generation to generation? 20%?
October 14, 2011 10:46:38 PM

Nvidia has a habit of releasing cards with the minimum amount of VRAM (1.5GB, for example) when they could just go with 2GB like AMD. I do think the 3GB version will be good for what you want. I currently have three EVGA GTX 580 Hydro Copper 2 3GB cards, and I'm tempted by the new Classified, although in my case it would be $2200 to swap out. I guess it comes down to a waiting game and how long you can wait. If you buy now, you know you're going to want Kepler, and then it becomes very expensive.
October 14, 2011 11:00:14 PM

If I knew Kepler was going to be a fully finished product (like the 500 series) and not a rushed one (like the 400 series), then I'd have no problem waiting. Also, it seems that companies such as EVGA wait a while before coming out with their higher-memory and dolled-up versions (Classified).

If I knew that EVGA was going to come out with a GTX 680 Classified 4GB model right away, then I'd most certainly wait for Kepler. But will they come out with a 2GB version and then wait a few months before stacking on the memory?

It's not like my 570s WON'T run games. They run them just fine, but in my opinion, you don't invest several thousand dollars in a gaming PC and have it unable to run games on max settings. I also don't want to get into a situation where it's February, I've decided I'm waiting, the cards don't come out until March or April, and on top of that a higher-memory version doesn't arrive until July. That would really suck, as games like COD: MW3 and BF3 would have basically passed me by.
October 14, 2011 11:48:45 PM

According to information released by Nvidia so far, Kepler cards will triple the double-precision floating-point efficiency of Fermi and hit up to 6 DP GFLOPS per watt.

http://www.tomshardware.com/news/nvidia-gpu-kepler-ferm...

Kepler allegedly has roughly 16 times the performance of the current Fermi cards. You could play Crysis at 2560x1600 with 16x AA on just one of them!

http://www.tomshardware.com/forum/307044-33-info-nvidia...

Whether or not the rumors are accurate is anyone's guess.
October 14, 2011 11:52:28 PM

I don't know what double-precision floating-point performance means. I also find a 1500% increase in performance quite unbelievable. Is this even possible?
October 15, 2011 12:00:11 AM

You completely misread that article.

It says Kepler is expected to produce 6 DP GFLOPS per watt, and its successor Maxwell (a 2013 release) is projected to hit 16.

If Kepler is 3 times Fermi and Kepler is 6, then Fermi is 2. That makes the next-next-gen cards 8 times Fermi. Even in 2 years, I find that hard to believe.
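The arithmetic above checks out directly. All of these numbers are rumored roadmap figures from the linked article, not specs, so treat the sketch below as a sanity check on the ratios, nothing more:

```python
# Rumored DP-efficiency roadmap figures discussed in this thread
# (rumor, not spec): Fermi as baseline, Kepler = 3x Fermi, Maxwell for 2013.
fermi = 2.0            # implied baseline (GFLOPS/W)
kepler = 3 * fermi     # rumored 3x Fermi
maxwell = 16.0         # rumored 2013 figure

print(f"Kepler:  {kepler:.0f} (= {kepler / fermi:.0f}x Fermi)")
print(f"Maxwell: {maxwell:.0f} (= {maxwell / fermi:.0f}x Fermi)")
```

So the "16x" floating around is Maxwell vs. Fermi (8x) misread, or raw-performance hype on top of an efficiency figure.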
October 15, 2011 12:45:20 AM

^ This is all very interesting to ponder. Can't wait for the next gen.
October 15, 2011 1:44:22 AM

I heard a rumor that Bulldozer was going to be awesome. As it turns out, it's not so impressive, lol. *prepares for a second hype-fest gauntlet*

Still, I'm really interested in the memory upgrades in the new generation of cards from both companies. Either way, I'll be waiting for the revisions of the new-generation cards, as many of you have suggested. Not buying a rushed, overpriced prototype. Cheers.
October 15, 2011 2:17:29 AM

I do think they will follow the same pattern of releasing the lower-memory cards first and the exotic ones later on. I wish they would catch on to what AMD does by coming right out with 2GB, and put out a 3GB version right out of the gate. I'm sure they can tell that people are snapping up the exotic cards left and right. Just look at the EVGA Classified: the EVGA site as well as Newegg are always out of stock, and when they get some in, they sell right out.
October 15, 2011 2:24:34 AM

IMHO, the Nvidia 560s, and even the Tis, are just a marketing tool to fund the 6XX batch.
The 560 is an overpriced 460 with a new chip.
I'll wait to see what the 6XX brings us.
October 15, 2011 5:11:42 PM

thebski said:
If I knew Kepler was going to be a fully finished product (like the 500 series) and not a rushed one (like the 400 series), then I'd have no problem waiting. Also, it seems that companies such as EVGA wait a while before coming out with their higher-memory and dolled-up versions (Classified).

If I knew that EVGA was going to come out with a GTX 680 Classified 4GB model right away, then I'd most certainly wait for Kepler. But will they come out with a 2GB version and then wait a few months before stacking on the memory?

It's not like my 570s WON'T run games. They run them just fine, but in my opinion, you don't invest several thousand dollars in a gaming PC and have it unable to run games on max settings. I also don't want to get into a situation where it's February, I've decided I'm waiting, the cards don't come out until March or April, and on top of that a higher-memory version doesn't arrive until July. That would really suck, as games like COD: MW3 and BF3 would have basically passed me by.



If you decide to wait, then you'll want to wait a few months past the actual release date for all the bugs to be worked out, and that would fall around the approximate release time of the super cards. Meanwhile, you can play the games that come out at good settings, because two 570s are still good cards and will give you high settings; maybe not maxed out, but still high.
October 15, 2011 5:25:56 PM

if kepler is to be competitive with amd, it has to be more power efficient.
it needs better multi-monitor support like amd has.
power consumption figures for the 4xx and 5xx series are quite high imo.
the whole fermi thing seems rough around the edges.
and nvidia doesn't have good mid-range cards like amd does, e.g. 5770, 6850, 6870. i am not partial to any company though; i've used both nvidia and amd cards.
October 15, 2011 5:45:54 PM

I only use Nvidia because it seems like their high-end cards are better, and I always go for the high end.
October 15, 2011 6:44:07 PM

de5_Roy said:
if kepler is to be competitive with amd, it has to be more power efficient.
it needs better multi-monitor support like amd has.
power consumption figures for the 4xx and 5xx series are quite high imo.
the whole fermi thing seems rough around the edges.
and nvidia doesn't have good mid-range cards like amd does, e.g. 5770, 6850, 6870. i am not partial to any company though; i've used both nvidia and amd cards.


to me the GTX 460, GTX 560 and GTX 560 Ti are quite a good mid-range lineup for nvidia. power consumption may be high for the 400 series (GF100), but i think nvidia made a good improvement with GF104 and the 500 series, though they still can't compete with AMD on that front. on the multi-monitor front, no doubt AMD is ahead, but hopefully we'll see a better solution from nvidia with kepler.
October 15, 2011 7:02:45 PM

yeah, the gtx 460, gtx 560 and 560 ti are very good cards from nvidia.
they have superior driver support compared to amd imo, broader support for their 3d platform, and afaik those cards overclock very well.
i hope kepler keeps the good traits of the current cards and utilizes the fabrication process properly; then they'll give amd some serious competition, benefiting users.
if i find an nvidia card with a 90 w tdp, requiring one pcie connector and delivering better performance than amd's 7850, i'll get one myself :)
October 16, 2011 12:47:02 AM

Quote:
I agree with you on that. The 560 Ti just seems like a fully unlocked GTX 460, while the 560 doesn't just look like a 460 SOC, but performs like one as well.


It's posts like this that give an insight into how little you actually know. :pfff: 
October 16, 2011 3:14:41 AM

Quote:
Well, what little I know is based on what I've read.
The 560 non-Ti and the 460 have nearly identical specs apart from clocks: the same 7 SMs enabled, 1 disabled.


http://www.tomshardware.com/reviews/geforce-gtx-560-amp...
Quote:
The chart makes it easier to see that GeForce GTX 560 is essentially a GeForce GTX 460 overclocked to GeForce GTX 560 Ti levels.

There are no 'locked' cores in either the 460 or the 560; they never worked, are laser cut, and thus cannot be 'unlocked', as they were never 'locked' in the first place.
October 16, 2011 4:27:12 AM

Quote:
Not really. If you're lucky and flash the right BIOS, the 560 can unlock into a 560 Ti.
It's like CPU unlocking: if it's not cut from the die, it can be done.

Like this guy here, though the fact that it was a mistake and he can't reflash it back into a 560 makes me wonder if he's lucky.....

http://www.tomshardware.com/forum/328512-33-gigabyte-gt...
Quote:

I have a Gigabyte GTX 560 video card (GV-N560GOC-1GI), and by mistake I rewrote the BIOS with files from a GTX 560 Ti (GV-N560OC-1GI). The core clock went up from 830 to 900 MHz and the shader clock from 1660 to 1800 MHz; operating voltage under full load increased to 1.04 V. The card is stable, I didn't notice any errors, and under full load the temperature went up to 75 °C with the fan at 85%.

Whilst the clock speeds might have increased, I don't see any mention of the core count increasing.
October 16, 2011 5:53:48 AM

Quote:
That you've got to ask that guy.
6950 to 6970 is possible, so I don't see it being impossible with the GF114 as well.

Just like with the newer 6950s, it is impossible if the damaged cores have been laser cut.
October 16, 2011 6:05:53 AM

Quote:
That you've got to ask that guy.
6950 to 6970 is possible, so I don't see it being impossible with the GF114 as well.


As I understand it, the extra cores in the 6950 were disabled through software (the BIOS); that's why you can unlock it with a 6970 BIOS. The same goes for the GTX 465, which can be unlocked into a GTX 470. The difference between the two is that AMD themselves give you the option to unlock the 6950 into a 6970, hence the BIOS switch on the reference card, while with the GTX 465 you have to do it all yourself. In the case of GF114, the disabled cores were permanently disabled (laser cut), so even with a BIOS flash those cores will remain disabled.
November 7, 2011 11:45:06 PM

Nvidia claimed with Fermi that each ALU has a fully pipelined, fully capable 32-bit integer block alongside the floating-point one. This was misleading if one assumed from it that the chip could issue 544.32 GInstr/s for INT operands; in practice, the INT block appears to be half-rate, so an INT warp is processed across 4 hot clocks.
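The numbers above can be reproduced with simple arithmetic. This is a sketch under stated assumptions: 336 CUDA cores at a 1.62 GHz hot clock (GF104-class figures consistent with 544.32 GInstr/s), and a 16-lane issue path, which are my illustrative assumptions rather than anything stated in the post:

```python
# Peak 32-bit issue rate for a GF104-class GPU (assumed: GTX 460-like,
# 336 CUDA cores at a 1.62 GHz hot clock -- matches the 544.32 figure).
cores = 336
hot_clock_ghz = 1.62

peak = cores * hot_clock_ghz        # claimed full-rate figure, GInstr/s
int_rate = peak / 2                 # half-rate INT block

# A 32-thread warp on an assumed 16-lane pipeline takes 2 hot clocks at
# full rate, so at half rate it takes 4 -- matching the observation above.
lanes = 16
full_rate_clocks = 32 / lanes
half_rate_clocks = full_rate_clocks * 2

print(f"peak issue: {peak:.2f} GInstr/s, half-rate INT: {int_rate:.2f} GInstr/s")
print(f"INT warp occupies the pipe for {half_rate_clocks:.0f} hot clocks")
```

So the marketing claim is about the block existing, not about it sustaining full-rate INT issue.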