
Report: GTX 870 Put Through Its Paces

By Niels Broekhuijsen | Source: Coolaler | 39 comments

Are these benchmarks of the GTX 870? We would like to think so.

A member on the Coolaler forums known as soothepain has posted two screenshots that show an unknown graphics card running 3DMark11 benchmarks.

We can see that the graphics card name is noted as D17U-20, but GPU-Z doesn’t recognize it. The same goes for the GPU, which is noted as 13C2. The member writes that this graphics card is the GTX 870, and we’re tempted to believe him. The CUDA core count of the graphics card is shown as 1664 cores, which run at 1051 MHz stock, with a boost clock that goes up to 1178 MHz. The card carries 4 GB of GDDR5 memory, which runs over a 256-bit memory interface at an effective speed of 7012 MHz.
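As a sanity check on those memory figures, peak bandwidth is just the effective data rate multiplied by the bus width in bytes. A quick sketch (the GTX 780 numbers are its public reference spec, included here only for comparison, not from these screenshots):

```python
# Rough GDDR5 peak-bandwidth check for the rumored card's specs.
# "Effective MHz" is the data rate in MT/s; divide bus width by 8 for bytes.
def gddr5_bandwidth_gbps(effective_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: data rate (MT/s) * bus bytes / 1000."""
    return effective_mhz * (bus_width_bits / 8) / 1000

rumored = gddr5_bandwidth_gbps(7012, 256)   # rumored card: ~224 GB/s
gtx_780 = gddr5_bandwidth_gbps(6008, 384)   # GTX 780 reference: ~288 GB/s
print(f"Rumored card: {rumored:.0f} GB/s, GTX 780: {gtx_780:.0f} GB/s")
```

So despite the narrower bus, the very high 7 GHz effective memory clock keeps the rumored card within about 20 percent of the GTX 780's bandwidth, which is the context for the 256-bit vs. 384-bit discussion in the comments below.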

The test scores are 4625 points in the 3DMark11 extreme preset and 11919 points in the performance preset, which puts the card roughly on par with the GTX 780. The tests were run on a system with an Intel Core i7-4820K Ivy Bridge-E processor.

Unfortunately, the device ID and the BIOS version of the card are obscured in the images; these bits of information would help us confirm which card this is, given that GPU-Z doesn't recognize it. Even so, based on the performance level, the specifications, and what we know about the Maxwell architecture, it is a reasonably safe assumption that this is, in fact, the GTX 870. Of course, note that this is still an unconfirmed rumor. Hopefully we will know more in September, when we expect Nvidia to launch new graphics cards.

Follow Niels Broekhuijsen @NBroekhuijsen. Follow us @tomshardware, on Facebook and on Google+.

Other Comments
  • -3 Hide
    vipboy28 , August 12, 2014 8:41 AM
    Does anyone think that the 256-bit memory interface is fine? I feel like we don't need 384-bit to achieve similar results. Anyone feel like it should be 384-bit?
  • 4 Hide
    xenol , August 12, 2014 8:48 AM
    Let's just hope NVIDIA prices these new cards "correctly". It's jarring to see system builders selling $3500-$4000 systems that come with a $3000 Titan Z standard.
  • 9 Hide
    xenol , August 12, 2014 9:20 AM
    Quote:
    Does anyone think that the 256-bit memory interface is fine? I feel like we don't need 384-bit to achieve similar results. Anyone feel like it should be 384-bit?

    Maxwell has more cache, which is supposed to allow less of a need for higher memory bandwidth. It's part of the energy efficiency improvements they were doing.
  • 7 Hide
    oxiide , August 12, 2014 9:26 AM
    Quote:
    Does anyone think that the 256-bit memory interface is fine? I feel like we don't need 384-bit to achieve similar results. Anyone feel like it should be 384-bit?


    Well, it may be a 256-bit bus, but it's also 7 GHz memory. That's really, really high; in fact, I've heard it's the intended limit for the baseline GDDR5 specification. Especially when you consider that this card doesn't sound beefy enough for 4K gaming anyway, I really don't think it will be starved for memory bandwidth.
  • 3 Hide
    hardcore_player , August 12, 2014 9:28 AM
    The Maxwell architecture doesn't need more than a 256-bit bus to run effectively, unlike Kepler, which is bandwidth dependent. For instance, take a look at the GTX 760.
  • -6 Hide
    Doug Lord , August 12, 2014 10:18 AM
    So 870=780 and 880=780ti. What is the point???
  • 0 Hide
    ingtar33 , August 12, 2014 10:27 AM
    I doubt this is true. The fact that the only useful bit of information is obscured in the screenies just points to this being yet another internet hoax.
  • 17 Hide
    FunSurfer , August 12, 2014 10:32 AM
    Quote:
    So 870=780 and 880=780ti. What is the point???


    780 is 250W TDP, 870 is 190W. Captain Planet smiles.
  • -1 Hide
    hardcore_player , August 12, 2014 11:41 AM
    The Maxwell-architecture "GTX 800" series won't offer a huge step over the GTX 700 series; we're still waiting for Pascal to hit the shelves in 2016, or the 4th quarter of 2015. It introduces 3D memory that offers several times greater bandwidth, more than twice the memory capacity, and quadruple the energy efficiency of today's GPUs. It also has what is called NVLink, which puts a fatter pipe between the CPU and GPU; according to Nvidia, this allows data to flow at more than 80 GB per second, compared to the 16 GB per second available on current GPUs.
    So Maxwell is somewhat a refinement of the GTX 700 series: it adds to and sharpens some of the hardcore specs and functions of the Kepler "GTX 600" and Kepler Refresh "GTX 700" architectures.
  • 0 Hide
    hannibal , August 12, 2014 11:56 AM
    Of course they are priced correctly. About the same speed as the 780... about the same price as the 780... That has been the trend in recent years. Maybe they'll shave $10-20 off if they feel generous...
    All in all, this is a cheaper card to produce than the 780, so they can get better margins. (I think so, at least, because the chip is somewhat smaller.)
  • 5 Hide
    Memnarchon , August 12, 2014 12:50 PM
    Quote:
    Quote:
    So 870=780 and 880=780ti. What is the point???


    780 is 250W TDP, 870 is 190W. Captain Planet smiles.

    According to another rumor, the GTX 880 would be $449, unlike the 780 Ti's MSRP of $699. Scrooge McDuck smiles too :p .
  • 0 Hide
    dovah-chan , August 12, 2014 1:28 PM
    "GPU-Z doesn't recognizing it." neat little error I noticed
  • 8 Hide
    Chris Droste , August 12, 2014 1:43 PM
    So, will AMD push harder to get some Tonga to market in time for the holiday buying season? Or will the GTX 880 launch just in time to take away single-GPU share by besting a 290X in speed, efficiency, AND price? Stay tuned for the next "As the Wafer Yields!"
  • -5 Hide
    somebodyspecial , August 12, 2014 10:28 PM
    Quote:
    Let's just hope NVIDIA prices these new cards "correctly". It's jarring to see system builders selling $3500-$4000 systems that come with a $3000 Titan Z standard.


    If they price them where you want them (lower), they'd be making as much as AMD: just about nothing. They haven't made as much as in 2007 in 7 years. When will people start to understand that R&D costs a lot more today? They CLEARLY are not charging as much more as R&D has gone up, or they'd be making MORE money than in 2007, NOT less. Right? Simple math, people. If they were gouging us they would be making more money, but that's not the case, because they now spend more on R&D than AMD. Their R&D has increased ~50% in the last 4 years, but profits have been completely stagnant. They are PAYING to give you better stuff, but NOT reaping any benefits financially on the bottom line.

    Jarring to see such high prices, yes, but without the few who DO pay those prices (and can afford them easily), the rest of us would be looking at $1000 mid-range cards and $500 bottom rung. You are not the target market for the $1500-3000 cards if you're complaining about them... LOL. I'm not someone who can afford those crazy cards, but I'm sure glad there are enough people who laugh at those prices and buy immediately upon release so that NV can at least afford to give me a 780 Ti for $600-700. Without the truly rich buying the ridiculous stuff, that 780 Ti would surely be $1100+.

    With R&D costs skyrocketing, the only way we'll keep pricing the same is by putting more GPUs in other devices and expanding the market for GPUs/CPUs (i.e., mobile etc., where lower-income people and developing nations can get on the CPU/GPU train with cheaper devices). AMD made $80 million in the last 12 months. Do you think they're charging enough? Not enough to make you say "RAISE YOUR PRICES AMD!"??? OK, how about this: they lost $6 billion+ in the last 10 years. Do you think they charge enough, knowing that HUGE number?

    While NV hasn't lost billions in the last 10, they are nowhere near their historical earnings either, and R&D is blowing up for everyone. At some point they will have to PRICE to make more money, and AMD should too! Immediately!

    You want to talk gouging, start talking Intel, which makes $9 billion and is raising prices (Haswell is $350 now; the top-end Ivy was $320 last year). Even they are giving a pretty good deal, though, and their profits are down from $12.3B, to $11B, and now to $9B in the last 12 months (mobile losses are killing Intel's profits, at a $1.1B/quarter loss on mobile). Same trend as NV, basically: not making as much as before.
  • -1 Hide
    somebodyspecial , August 12, 2014 10:33 PM
    Quote:
    The Maxwell-architecture "GTX 800" series won't offer a huge step over the GTX 700 series; we're still waiting for Pascal to hit the shelves in 2016, or the 4th quarter of 2015. It introduces 3D memory that offers several times greater bandwidth, more than twice the memory capacity, and quadruple the energy efficiency of today's GPUs. It also has what is called NVLink, which puts a fatter pipe between the CPU and GPU; according to Nvidia, this allows data to flow at more than 80 GB per second, compared to the 16 GB per second available on current GPUs.
    So Maxwell is somewhat a refinement of the GTX 700 series: it adds to and sharpens some of the hardcore specs and functions of the Kepler "GTX 600" and Kepler Refresh "GTX 700" architectures.


    This is wrong; you're judging Maxwell on the 28nm versions, which are just meant to reduce power while giving the same performance. The real cards come soon on 20nm, and that will be a huge jump in power AND performance (Q1?). AMD will get a good jump from 20nm also. But yes, Pascal will be impressive; that doesn't mean 20nm Maxwell won't be impressive too, with most of the die shrink going to jacking up performance. Right now they're just dropping power with Maxwell's new characteristics while giving basically the same performance, or maybe a tad better (and higher margins, I hope, if the chips are smaller due to a redesign).
  • 2 Hide
    ra3tonite , August 12, 2014 10:41 PM
    I'd rather it be an 860
  • 0 Hide
    Drejeck , August 13, 2014 12:45 AM
    Most useless benchmark ever, lol.
    The 256-bit memory interface makes a lot of sense; the Maxwell architecture is really efficient in that regard.
    I have the 750 Ti and it outperforms the 6950 I had.
    Addressing 2 GB of RAM is faster than 4 GB; the 1GB version of the 6950 was slightly faster than the 2GB.
    Speaking of resolutions higher than 1920x1080, you need 256-bit to start without filters, though even 512-bit is not yet capable of 4K gaming.
    There are lots of limits today: any monitor above 1920x1080 increases frametime variance a lot (really a lot), and the monitor itself has a lot more latency. This goes beyond pure gamer needs. Triple monitors and 4K as well (not speaking of screen bezels and pricing).
    Maxwell's 128-bit seems as efficient as 192-bit to me; its 256-bit should be like 384-bit.
    The higher the bit count, the larger the chip, the more populated the PCB gets, and the higher the power requirements.
    My personal hope is that they never again take a chip made for HPC and use it as a gaming video card.
    They should simply build ad hoc gaming video cards, not megachips. To me, the 290s are a bad choice. Audio resources on a GPU? I have yet to hear that working, and I'd rather go for a $150-170 sound card. It's cool on a budget APU or a mainstream video card, though. I hope the 870 and 880 brutalize the GK110 spawns in gaming. Maybe when a Phantom comes out I'll think about selling my 770 Phantom.
  • -1 Hide
    Ninjawithagun , August 13, 2014 4:44 AM
    Why hasn't Nvidia advanced to a 512-bit memory bus yet?? I just don't understand!! I do understand that with specific GPU architectures and memory configurations, a 384-bit bus is not always ideal. Fine. So why not use a 512-bit bus instead of a 256-bit bus? It can't possibly add that much complexity to the fabrication process or to the overall materials for the card. And if either were true, then price would be the ultimate issue. Personally, I would be willing to pay for a wider memory interface just for the sake of having that much more headroom to overclock the GPU and memory :D 
  • 0 Hide
    xenol , August 13, 2014 7:28 AM
    Quote:
    Why hasn't Nvidia advanced to a 512-bit memory bus yet?? I just don't understand!!

    It has nothing to do with manufacturing complexity. Maxwell was designed to not need a wide bus interface because adding a wider bus adds more power consumption. Besides, adding a wider bus may not add much performance anyway, simply because of the way it's designed.
  • 0 Hide
    semitope , August 13, 2014 7:58 AM
    Quote:
    Quote:
    Let's just hope NVIDIA prices these new cards "correctly". It's jarring to see system builders selling $3500-$4000 systems that come with a $3000 Titan Z standard.


    If they price them where you want them (lower), they'd be making as much as AMD: just about nothing. They haven't made as much as in 2007 in 7 years. When will people start to understand that R&D costs a lot more today? They CLEARLY are not charging as much more as R&D has gone up, or they'd be making MORE money than in 2007, NOT less. Right? Simple math, people. If they were gouging us they would be making more money, but that's not the case, because they now spend more on R&D than AMD. Their R&D has increased ~50% in the last 4 years, but profits have been completely stagnant. They are PAYING to give you better stuff, but NOT reaping any benefits financially on the bottom line.


    Is this GPU R&D or just general R&D across all their products? Because they could be gouging us in the GPU segment to support the R&D for all the rest of their pet projects. As far as GPUs go, AMD spends enough. They clearly know how to do a lot with what could be a little.