Report: GTX 880 PCB Pictured

Tags:
  • Graphics Cards
  • Nvidia
July 4, 2014 3:41:55 PM

I guess by removing a zero, Nvidia can start over and recycle the 8800 moniker again... lol.
Score
9
July 4, 2014 3:47:08 PM

NSFW.
Score
10
July 4, 2014 4:29:00 PM

Well, if the GTX 880 is a GM104 chip and not a flagship GM100/110 chip, then it's obvious it will be cheaper, like the GTX 680 was. Around $500.
At $500 it would be cheaper than the GTX 780 Ti, and it should give more performance (around 30% I believe, which would come from the better perf per watt, more cores and, going by what the GTX 750 showed us, higher clocks).
Score
-12
July 4, 2014 4:58:32 PM

Must be a cream-of-the-crop edition card, since from the picture it will need 2x 8-pin connectors and a 1x 6-pin connector. Oversized PCB, like the EVGA KPE card.

Please pass the salt shaker!
Score
1
July 4, 2014 5:23:41 PM

4GB is entry for today's high GPU bandwidth games like Watch Dogs, thanks to console's unified 8GB memory, games are using more and more GPU memory. I expect nothing less than 6GB, most likely 8GB.
Score
-14
July 4, 2014 5:48:59 PM

Like hey, that's my next card... Stop staring, you peeping Tom's!!
Score
5
July 4, 2014 6:00:25 PM

Quote:
4GB is entry for today's high GPU bandwidth games like Watch Dogs, thanks to console's unified 8GB memory, games are using more and more GPU memory. I expect nothing less than 6GB, most likely 8GB.


That's... not right at all. Just because a console has 8GB of memory doesn't mean it has anything close to the power required to fill it up; most of it is used for other things. Remember, it's shared with system RAM.
Score
17
July 4, 2014 7:01:27 PM

I lost it at the salt picture...
Score
26
July 4, 2014 7:26:08 PM

Really hoping AMD will reduce power consumption to keep up with Nvidia.
Score
5
July 4, 2014 7:47:49 PM

Sir... if that is a "pinch of salt", I'd hate to see a spoonful. lol
Score
14
July 4, 2014 8:02:55 PM

That's a huge die for 20nm. This thing packs some serious cache or serious cores.
Score
1
July 4, 2014 8:06:08 PM

Quote:
4GB is entry for today's high GPU bandwidth games like Watch Dogs, thanks to console's unified 8GB memory, games are using more and more GPU memory. I expect nothing less than 6GB, most likely 8GB.


Sir, please don't equate a console's entire memory pool with just the GPU memory on a PC, nor how games run on a PC with how they run on a game console.

The Xbox One and PS4 use that same memory for both graphics AND the CPU, while modern high-end PCs keep the two separate.

Also, games on these two kinds of systems are programmed differently: on a PC they take memory from both VRAM and system RAM.

On top of that, just because you have "X" GB of GPU memory doesn't mean the system is able to utilize it effectively...

For example: GTX Titan Black vs. GTX 780 Ti. The Titan Black has 6GB of memory and yet shows hardly any performance improvement over the 780 Ti.
Score
14
July 5, 2014 2:48:47 AM

Quote:
4GB is entry for today's high GPU bandwidth games like Watch Dogs, thanks to console's unified 8GB memory, games are using more and more GPU memory. I expect nothing less than 6GB, most likely 8GB.


3GB is reserved for the operating system on the Xbox One, so you're already down to 5GB. Then there is regular RAM usage, about 1.5GB-2GB for a high-end game, which leaves roughly 3GB used as VRAM.
Score
8
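The memory budget the comment above walks through can be sketched out. The figures here (3GB OS reservation, 2GB of regular game RAM) are the commenter's own estimates, not official console specifications:

```python
# Rough console memory budget, using the comment's own estimates
# (these are the commenter's figures, not official specs).
total_gb = 8.0          # unified memory pool
os_reserved_gb = 3.0    # reserved for the operating system
game_ram_gb = 2.0       # regular (non-graphics) RAM for a high-end game

available_gb = total_gb - os_reserved_gb      # what the game can touch
vram_budget_gb = available_gb - game_ram_gb   # what's left for graphics

print(available_gb)     # -> 5.0
print(vram_budget_gb)   # -> 3.0
```

Under those assumptions a console game has roughly 3GB for graphics data, which is the comment's point about why 8GB of unified memory does not translate into 8GB of VRAM demand on PC.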
July 5, 2014 6:52:52 AM

Quote:
4GB is entry for today's high GPU bandwidth games like Watch Dogs, thanks to console's unified 8GB memory, games are using more and more GPU memory. I expect nothing less than 6GB, most likely 8GB.


Neither the PS4 nor the Xbox One can access more than 3.5GB of RAM for game-related operations. One of the two actually has 512MB less than the other, but makes up for it with virtual memory so it can market the same figure. I won't say which, because I don't want a console-fan flame war, but it isn't the one you are thinking of.

So no, no matter how unoptimized a "next-gen" port to PC is, you won't see 8GB of VRAM usage. At maximum you would be looking at 3.5GB. That said, I won't be buying Maxwell without at least 6GB of VRAM for higher-resolution gaming.
Score
11
July 5, 2014 9:50:46 AM

Yeah, people shouldn't really factor the consoles into this. The 8GB in them is total system RAM, and it doesn't all go to the GPU, so it will have less effect on desktop VRAM usage than massive RPG games do. Still, I wouldn't be surprised to see a high-end GPU with 6GB; the Titan already has that, so it's possible Nvidia will ship its top-end card with 6GB, though a lot of companies will probably make 4GB versions.

As for the pictured GPU above, I kind of wonder if it isn't a dual-GPU board. It has a lot of RAM and a lot of power flowing into it. Maybe all the blurring and the back side of the card are hiding a second GPU, and this is actually going to be something like a GTX 890.
Score
3
July 5, 2014 10:23:48 AM

Why are garbage consoles even being mentioned here with $700 graphics cards? [edited by mod]
Score
1
July 5, 2014 10:47:47 AM

With that much VRAM, it sounds more like a new Workstation card than anything else. There is no good reason to put that much memory on a gaming card at this time.
Score
0
July 5, 2014 5:41:56 PM

I just hope it costs less than $3000 and that, as a single card, it can outperform my aging 690. :-|
Score
7
July 5, 2014 6:56:40 PM

Looks like we might see this sooner than later :)  I'm hoping 2014! :D 
Score
2
July 6, 2014 3:04:46 AM

Getting the upgrade itch to SLI my 780. Must hold out until the 880 comes out, along with DDR4 motherboards and, finally, Windows 9 with DX12.
Score
2
July 6, 2014 3:35:26 AM

Quote:
I lost it at the salt picture...


He said a pinch. The rest is for sharing.
Score
0
July 6, 2014 4:32:18 AM

It should have as much VRAM as it can handle, as far as I'm concerned. I'm not paying $700 for a GPU that has less VRAM than it could have had, even if I don't need it.

We need hardware to be overkill so it can trickle down faster and force developers to make use of said overkill hardware. PC users always cry for more cores and more RAM and then claim they are some master race. Lol. Stop crying about better tech.
Score
0
July 6, 2014 9:59:40 AM

With AMD giving us 512-bit memory buses on their high end cards and even 384-bit on their [now] upper mid-range cards since the HD7xxx series, why would Nvidia still only offer up a 256-bit bus on what is supposed to be the flagship of the 8-series Maxwells? Why not at least match the 384-bit bus of the Titan (and its variants), the 780 and 780ti?
Score
1
July 6, 2014 12:32:34 PM

I really hope the 880 Ti has 8GB of RAM, and dedicated to the GPU only. I want to see optimal cooling. I badly need to replace my GTX 680s, and so far nothing seems tempting.
Score
-1
July 6, 2014 3:12:57 PM

The one thing I am not convinced of is the 256-bit memory bus. Unless the GDDR5 is faster, it will cut memory bandwidth. The only way I see this working is if the SPs are efficient enough not to need the extra bandwidth, in which case it's actually a good thing, since a narrower memory bus cuts power draw and temperatures.

That was one of the downsides of the HD 2900's 512-bit ring bus: it used a lot more power than other GPUs.

I guess only time will tell.
Score
1
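The bandwidth concern above can be made concrete. Peak memory bandwidth is bus width in bytes times the effective data rate; the 7 Gbps GDDR5 speed below is an illustrative assumption, not a figure from the article:

```python
def peak_bandwidth_gbs(bus_width_bits, effective_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bytes) * (data rate per pin)."""
    return bus_width_bits / 8 * effective_gbps

# 256-bit bus with 7 Gbps effective GDDR5 (illustrative speed)
print(peak_bandwidth_gbs(256, 7.0))   # -> 224.0 GB/s
# 384-bit bus at the same memory speed, 780 Ti / Titan class
print(peak_bandwidth_gbs(384, 7.0))   # -> 336.0 GB/s
```

At equal memory clocks, dropping from 384-bit to 256-bit cuts peak bandwidth by a third, which is why the comment argues the cores would need to be efficient enough not to miss it.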
July 7, 2014 12:03:25 AM

Excuse me, but isn't that supposed to be 16 Gb as in 16 giga bytes???? Essentially it would mean that it's a 2 GB card. AFAIK all memory modules will be in Gigabytes. Assuming that's the case, 8x1 Gb per side would mean 16 Gb on the card which would be 2 GB (16/8) of GDDR5 RAM.
Score
-1
July 7, 2014 12:04:42 AM

Ahhhh!! My mistake, it's gigabits: 16 gigabits. Essentially it would mean that it's a 2GB card. AFAIK all memory modules are rated in gigabits. Assuming that's the case, 8x 1Gb chips per side would mean 16Gb on the card, which is 2GB (16/8) of GDDR5 RAM.
Score
-1
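The gigabit-to-gigabyte arithmetic in that correction works out as follows (chip count and per-chip density taken from the comment, not confirmed by the article):

```python
def card_capacity_gb(chips, density_gbit):
    """Total VRAM in gigabytes from chip count and per-chip density in gigabits."""
    return chips * density_gbit / 8  # 8 bits per byte

# 8 chips of 1 Gb per side, two sides = 16 chips, per the comment
print(card_capacity_gb(16, 1))   # -> 2.0 GB
```

The same formula with 4 Gb chips would give 8 GB from 16 chips, which is why the chip density matters more than the chip count when reading PCB photos.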
July 7, 2014 2:58:03 AM

Your move, AMD. I'm waiting...
Score
0
July 7, 2014 3:57:39 AM

H.265 and HDMI 2.0 won't arrive until 20nm Maxwell. Guess my aging 5770 CrossFire has to last another 12 months.
Score
0
July 7, 2014 11:33:25 AM

Why shouldn't it be cheaper than a 780 Ti? That's a chip not born for gaming, and it could be overtaken by the next architecture, as we have seen many times in history. It doesn't always happen when a new architecture comes out, but the 780 Ti in particular may absolutely be overtaken by next-gen gaming-focused hardware that is even cheaper than the former flagship card. Of course, not if they pull the price up just for being alone in the competition.
Score
0
July 7, 2014 11:58:56 PM

If you are waiting for the 880, just go get a 780 Ti or 690. There is credible information suggesting the 880 Ti (which is the card worth waiting for, with 8GB VRAM) will launch around September-November 2015.
Score
0
July 20, 2014 5:49:53 PM

I was wondering if there is a way to extrapolate from a mobile GPU to a desktop one: some way of estimating what the desktop GPUs would be capable of by comparing and scaling from mobile, using previous generations as a reference.
Score
0