
MSI Introduces its GeForce GTX 770 4 GB Graphics Card

July 19, 2013 12:12:01 PM

Actually we want the 6GB GTX 780 cards :( 
July 19, 2013 12:31:37 PM

"a number of heat pipes"

Well that is certainly useful information
July 19, 2013 12:40:50 PM

With the upcoming next-gen consoles having effectively 8GB of VRAM (although obviously no game will use that much, since the vram is shared with the rest of the system), 4GB should be useful imo to play upcoming console ports. Titan Fall, for example, is slated to use 5GB of VRAM, so even this card won't be able to handle the full-resolution textures of that particular title.
July 19, 2013 1:32:55 PM

Seems the RAM in the Xbox will be 8GB of DDR3 while the PS4 will be 8GB of GDDR5. I can't seem to find an official bus width, but I see specs saying 176GB/s on the PS4, while a Tahiti-based video card runs at well over 260GB/s of memory bandwidth. Also keep in mind that part of the 8GB of RAM in these systems will be reserved for the OS, probably roughly 2GB to 3GB.
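
If you want to sanity-check those numbers, peak memory bandwidth is just bus width times effective data rate. A quick Python sketch (the clocks below are the commonly quoted figures, not official spec sheets):

# Peak theoretical bandwidth (GB/s) = bus width in bytes * data rate
def peak_bandwidth_gbs(bus_bits, mtps):
    # mtps: effective transfer rate in MT/s
    return bus_bits / 8 * mtps / 1000

print(peak_bandwidth_gbs(256, 5500))   # PS4, 8GB GDDR5           -> 176.0
print(peak_bandwidth_gbs(256, 2133))   # Xbox One, 8GB DDR3-2133  -> ~68.3 (plus 32MB ESRAM)
print(peak_bandwidth_gbs(384, 6000))   # Tahiti HD 7970, GDDR5    -> 288.0
print(peak_bandwidth_gbs(256, 7010))   # GTX 770, 7GT/s GDDR5     -> ~224.3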
July 19, 2013 1:46:47 PM

Wonder how much difference 4GB will make vs. the 2GB version and whether it's worth the extra cash. Most reviews of the normal 770 don't seem to indicate that RAM is a bottleneck.
July 19, 2013 1:52:31 PM

jrstriker12 said:
Wonder how much difference 4GB will make vs. the 2GB version and whether it's worth the extra cash. Most reviews of the normal 770 don't seem to indicate that RAM is a bottleneck.


4GB is for a three-screen setup, or the coming 4K monitors (using quad SLI ofc lol).

2GB is enough for one monitor running at 1920x1080.
July 19, 2013 1:58:18 PM

A GTX 770 with 4GB of VRAM is a waste; you get no benefit from having more than 2GB due to the 256-bit memory bus. The 8GB of VRAM in the consoles is probably as useless as tits on a boar, but we will have to wait and see.
While we are at it, a 780 with 6GB will give you the same as 3GB. Truth is, the Titan only needs 3GB as well; a 384-bit memory bus just can't use the 6GB effectively.
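
One way to put numbers on that intuition (a rough sketch; real frames re-read only a fraction of VRAM, so treat this as an upper-bound argument, not proof):

# How long does it take just to read the whole buffer once at peak bandwidth?
bandwidth_gbs = 224            # GTX 770: 256-bit bus, 7 GT/s GDDR5
frame_budget_ms = 1000 / 60    # ~16.7 ms per frame at 60fps

for vram_gb in (2, 4):
    ms = vram_gb / bandwidth_gbs * 1000
    print(f"{vram_gb}GB: {ms:.1f} ms vs a {frame_budget_ms:.1f} ms frame budget")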
July 19, 2013 2:12:17 PM

madogre said:
A GTX 770 with 4GB of VRAM is a waste; you get no benefit from having more than 2GB due to the 256-bit memory bus. The 8GB of VRAM in the consoles is probably as useless as tits on a boar, but we will have to wait and see.
While we are at it, a 780 with 6GB will give you the same as 3GB. Truth is, the Titan only needs 3GB as well; a 384-bit memory bus just can't use the 6GB effectively.


Prove what you just said.
July 19, 2013 2:30:33 PM

MSI has been getting great reviews on its video cards. I haven't really seen Asus or EVGA reviews for the 700 series.
July 19, 2013 2:55:16 PM

SNA3 said:
madogre said:
A GTX 770 with 4GB of VRAM is a waste; you get no benefit from having more than 2GB due to the 256-bit memory bus. The 8GB of VRAM in the consoles is probably as useless as tits on a boar, but we will have to wait and see.
While we are at it, a 780 with 6GB will give you the same as 3GB. Truth is, the Titan only needs 3GB as well; a 384-bit memory bus just can't use the 6GB effectively.


Prove what you just said.


Sigh, can you not use Google yourself?
#1 http://hexus.net/tech/reviews/graphics/43109-evga-gefor...
The Good
Cool, near-silent, and quick
Completely non-standard design begs to be pushed hard
Healthy factory-based GPU overclock
Can be made into a beast with over-voltage and better cooling
Remains power-efficient

The Bad
No out-of-the-box memory overclocking
Usefulness of 4GB memory buffer is questionable

#2 http://www.guru3d.com/articles_pages/palit_geforce_gtx_...
Final words and conclusion
The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference.

#3 http://hexus.net/tech/reviews/displays/57849-asus-pq321...
Having larger framebuffers remains more of a marketing tool than a real-world benefit for even enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings and with a semi-pointless 4x MSAA invoked. Should you really want to have pristine edges in games and aren't happy with the default render from an 8.3MP screen, we'll doubtless see other, more efficient techniques such as FXAA take over.
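
To put those resolutions in perspective, the render targets themselves are a small slice of the buffer; the rest is textures and geometry. A rough sketch (assuming one 32-bit color target plus a 32-bit depth buffer; real engines keep several):

# Raw buffer cost for one 32-bit color target plus a 32-bit depth buffer
def buffer_mb(width, height, msaa=1, bytes_per_pixel=8):  # 4B color + 4B depth
    return width * height * msaa * bytes_per_pixel / 2**20

print(buffer_mb(1920, 1080))          # ~15.8 MB
print(buffer_mb(2560, 1600))          # ~31.3 MB
print(buffer_mb(3840, 2160, msaa=4))  # ~253 MB even with 4x MSAA at 4K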

July 19, 2013 3:34:24 PM

madogre said:

The Bad
No out-of-the-box memory overclocking
Usefulness of 4GB memory buffer is questionable

#2 http://www.guru3d.com/articles_pages/palit_geforce_gtx_...
Final words and conclusion
The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference.

#3 http://hexus.net/tech/reviews/displays/57849-asus-pq321...
Having larger framebuffers remains more of a marketing tool than a real-world benefit for even enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings and with a semi-pointless 4x MSAA invoked. Should you really want to have pristine edges in games and aren't happy with the default render from an 8.3MP screen, we'll doubtless see other, more efficient techniques such as FXAA take over.

Are they equally sure it will be of NO USE DOWN THE ROAD? That imo is what having 4GB VRAM is about. NOT for current games, obviously, but for future next-gen titles (i.e. over the next few years). Massive amounts of textures specifically will surely effectively utilize more ram. The issue here is that there are currently ZERO titles on which to test this. That will change - you would argue that the 5GB of VRAM that Titan Fall is reportedly going to use won't actually work properly because the memory bandwidth will be too low. Can you prove that?

July 19, 2013 4:50:18 PM

nitrium said:
With the upcoming next-gen consoles having effectively 8GB of VRAM (although obviously no game will use that much, since the vram is shared with the rest of the system), 4GB should be useful imo to play upcoming console ports. Titan Fall, for example, is slated to use 5GB of VRAM, so even this card won't be able to handle the full-resolution textures of that particular title.


According to the developer, it technically uses more than 5GB when you factor in the cloud computing that Microsoft is doing with the Xbox One.
July 19, 2013 6:15:52 PM

nitrium said:
madogre said:

The Bad
No out-of-the-box memory overclocking
Usefulness of 4GB memory buffer is questionable

#2 http://www.guru3d.com/articles_pages/palit_geforce_gtx_...
Final words and conclusion
The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference.

#3 http://hexus.net/tech/reviews/displays/57849-asus-pq321...
Having larger framebuffers remains more of a marketing tool than a real-world benefit for even enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings and with a semi-pointless 4x MSAA invoked. Should you really want to have pristine edges in games and aren't happy with the default render from an 8.3MP screen, we'll doubtless see other, more efficient techniques such as FXAA take over.

Are they equally sure it will be of NO USE DOWN THE ROAD? That imo is what having 4GB VRAM is about. NOT for current games, obviously, but for future next-gen titles (i.e. over the next few years). Massive amounts of textures specifically will surely effectively utilize more ram. The issue here is that there are currently ZERO titles on which to test this. That will change - you would argue that the 5GB of VRAM that Titan Fall is reportedly going to use won't actually work properly because the memory bandwidth will be too low. Can you prove that?


Yes I am sure; when that happens, the GPU horsepower of the current card will not be able to keep up with it. There is a reason reference cards stick with memory amounts matched to the memory bus size. With GDDR5 you are able to run a 256-bit bus with 2GB of VRAM. Back in the old days you saw a 256-bit bus with 512MB because it was GDDR2, like in the GeForce 5900 cards; once they moved up to GDDR3 you saw a 256-bit bus with a massive 1GB of VRAM; and now we have GDDR5, which has the speed to work well with 2GB on a 256-bit bus. But once you move up to 3GB of GDDR5 you really need the 384-bit bus to utilize it, like with the GTX 780 and Titan.

The Titan was not really intended for gamers, and had AMD not dropped the ball with the 7970, letting Nvidia use the mid-range GK104 chip as the 680, we would have seen reference cards with 3GB instead of 6GB, like the workstation counterparts it's made from.
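
For what it's worth, capacity and bus width are set by different knobs; the 4GB cards simply double the chip count in clamshell mode on the same 256-bit bus. A sketch (assuming the 2Gb GDDR5 parts common in 2013):

# Each GDDR5 chip has a 32-bit interface, so a 256-bit bus needs 8 chips.
bus_bits, chip_bits = 256, 32
chips = bus_bits // chip_bits   # 8 chips
gb_per_chip = 0.25              # 2Gb (256MB) parts
print(chips * gb_per_chip)      # 2.0 GB -- the stock GTX 770
print(2 * chips * gb_per_chip)  # 4.0 GB -- clamshell: 16 chips, same bus, same bandwidth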

July 19, 2013 6:19:50 PM

madogre said:
SNA3 said:
madogre said:
A GTX 770 with 4GB of VRAM is a waste; you get no benefit from having more than 2GB due to the 256-bit memory bus. The 8GB of VRAM in the consoles is probably as useless as tits on a boar, but we will have to wait and see.
While we are at it, a 780 with 6GB will give you the same as 3GB. Truth is, the Titan only needs 3GB as well; a 384-bit memory bus just can't use the 6GB effectively.


Prove what you just said.


Sigh, can you not use Google yourself?
#1 http://hexus.net/tech/reviews/graphics/43109-evga-gefor...
The Good
Cool, near-silent, and quick
Completely non-standard design begs to be pushed hard
Healthy factory-based GPU overclock
Can be made into a beast with over-voltage and better cooling
Remains power-efficient

The Bad
No out-of-the-box memory overclocking
Usefulness of 4GB memory buffer is questionable

#2 http://www.guru3d.com/articles_pages/palit_geforce_gtx_...
Final words and conclusion
The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference.

#3 http://hexus.net/tech/reviews/displays/57849-asus-pq321...
Having larger framebuffers remains more of a marketing tool than a real-world benefit for even enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings and with a semi-pointless 4x MSAA invoked. Should you really want to have pristine edges in games and aren't happy with the default render from an 8.3MP screen, we'll doubtless see other, more efficient techniques such as FXAA take over.



Who said you will get a benefit today?

This is about the future. If the game buffer and data doesn't need more than 2GB, of course you won't see a difference, but when they start to need it, you will. And you can still use three monitors or higher-resolution ones, like 1600p or 1440p monitors.

Yes, most people won't need more than 2GB of GDDR5, BUT there are people who pay $10,000 to $20,000 for machines.. and they will make use of it.
July 19, 2013 6:30:17 PM

SNA3 said:
madogre said:
SNA3 said:
madogre said:
A GTX 770 with 4GB of VRAM is a waste; you get no benefit from having more than 2GB due to the 256-bit memory bus. The 8GB of VRAM in the consoles is probably as useless as tits on a boar, but we will have to wait and see.
While we are at it, a 780 with 6GB will give you the same as 3GB. Truth is, the Titan only needs 3GB as well; a 384-bit memory bus just can't use the 6GB effectively.


Prove what you just said.


Sigh, can you not use Google yourself?
#1 http://hexus.net/tech/reviews/graphics/43109-evga-gefor...
The Good
Cool, near-silent, and quick
Completely non-standard design begs to be pushed hard
Healthy factory-based GPU overclock
Can be made into a beast with over-voltage and better cooling
Remains power-efficient

The Bad
No out-of-the-box memory overclocking
Usefulness of 4GB memory buffer is questionable

#2 http://www.guru3d.com/articles_pages/palit_geforce_gtx_...
Final words and conclusion
The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference.

#3 http://hexus.net/tech/reviews/displays/57849-asus-pq321...
Having larger framebuffers remains more of a marketing tool than a real-world benefit for even enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings and with a semi-pointless 4x MSAA invoked. Should you really want to have pristine edges in games and aren't happy with the default render from an 8.3MP screen, we'll doubtless see other, more efficient techniques such as FXAA take over.



Who said you will get a benefit today?

This is about the future. If the game buffer and data doesn't need more than 2GB, of course you won't see a difference, but when they start to need it, you will. And you can still use three monitors or higher-resolution ones, like 1600p or 1440p monitors.

Yes, most people won't need more than 2GB of GDDR5, BUT there are people who pay $10,000 to $20,000 for machines.. and they will make use of it.


When you need more VRAM, you will need a stronger card anyway.
July 19, 2013 7:02:32 PM

yyk71200 said:
SNA3 said:
madogre said:
SNA3 said:
madogre said:
A GTX 770 with 4GB of VRAM is a waste; you get no benefit from having more than 2GB due to the 256-bit memory bus. The 8GB of VRAM in the consoles is probably as useless as tits on a boar, but we will have to wait and see.
While we are at it, a 780 with 6GB will give you the same as 3GB. Truth is, the Titan only needs 3GB as well; a 384-bit memory bus just can't use the 6GB effectively.


Prove what you just said.


Sigh, can you not use Google yourself?
#1 http://hexus.net/tech/reviews/graphics/43109-evga-gefor...
The Good
Cool, near-silent, and quick
Completely non-standard design begs to be pushed hard
Healthy factory-based GPU overclock
Can be made into a beast with over-voltage and better cooling
Remains power-efficient

The Bad
No out-of-the-box memory overclocking
Usefulness of 4GB memory buffer is questionable

#2 http://www.guru3d.com/articles_pages/palit_geforce_gtx_...
Final words and conclusion
The 4GB -- Realistically there was not one game that we tested that could benefit from the two extra GB's of graphics memory. Even at 2560x1600 (which is a massive 4 Mpixels resolution) there was just no measurable difference.

#3 http://hexus.net/tech/reviews/displays/57849-asus-pq321...
Having larger framebuffers remains more of a marketing tool than a real-world benefit for even enthusiast gamers... there simply won't be many harder cases than rendering to a 4K screen at ultra-quality settings and with a semi-pointless 4x MSAA invoked. Should you really want to have pristine edges in games and aren't happy with the default render from an 8.3MP screen, we'll doubtless see other, more efficient techniques such as FXAA take over.



Who said you will get a benefit today?

This is about the future. If the game buffer and data doesn't need more than 2GB, of course you won't see a difference, but when they start to need it, you will. And you can still use three monitors or higher-resolution ones, like 1600p or 1440p monitors.

Yes, most people won't need more than 2GB of GDDR5, BUT there are people who pay $10,000 to $20,000 for machines.. and they will make use of it.


When you need more VRAM, you will need a stronger card anyway.


lol don't worry, four GTX 770 4GB cards in SLI will be enough for that :) 

this stuff is for the rich =)
July 19, 2013 7:46:40 PM

And the tech illiterate, apparently.
July 20, 2013 12:25:43 AM

If I remember correctly, the 770 is a reworked 680, so the card itself is likely suffering from a lack of bandwidth at the PCIe slot... though the extra buffer will massively help dual- and triple-monitor users.
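
For context, the gap between slot and on-card bandwidth is exactly why a bigger resident buffer can help: once a texture spills out of VRAM, it comes back across a much thinner pipe. A sketch of the raw numbers:

# PCIe 3.0: 8 GT/s per lane, 128b/130b encoding
lanes, gts, encoding = 16, 8, 128 / 130
pcie_gbs = lanes * gts * encoding / 8   # bits -> bytes
print(round(pcie_gbs, 2))               # ~15.75 GB/s per direction for an x16 slot
print(round(224 / pcie_gbs, 1))         # the GTX 770's 224 GB/s VRAM is ~14x faster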
July 20, 2013 5:41:19 AM

Some selective quoting is being done on the usefulness of 4GB. I use 2GB for single-screen builds, 4GB for multi-monitor setups.

http://www.guru3d.com/articles_pages/palit_geforce_gtx_...

Quote:
Now the setup could benefit from triple monitor setups at 5760x1080 (which is a 6 Mpixels resolution), but even there I doubt if 4 GB is really something you'd need to spend money on. It might make a difference at 16xAA and the most stringent games, or if you game in 3D Stereo and triple monitor gaming -- I mean sure -- at any point graphics memory can and will run out. There's one exception to the rule, and that's Skyrim all beefed, tweaked and modded upwards. But the universal question remains, is it worth it investing in that extra memory?


http://hexus.net/tech/reviews/graphics/43109-evga-gefor...

Quote:
Of more interest is the 2,204MB framebuffer usage when running the EVGA card, suggesting that the game, set to Ultra quality, is stifled by the standard GTX 680's 2GB.
July 20, 2013 7:33:59 AM

JackNaylorPE said:
Some selective quoting is being done on the usefulness of 4GB. I use 2GB for single-screen builds, 4GB for multi-monitor setups.

http://www.guru3d.com/articles_pages/palit_geforce_gtx_...

Quote:
Now the setup could benefit from triple monitor setups at 5760x1080 (which is a 6 Mpixels resolution), but even there I doubt if 4 GB is really something you'd need to spend money on. It might make a difference at 16xAA and the most stringent games, or if you game in 3D Stereo and triple monitor gaming -- I mean sure -- at any point graphics memory can and will run out. There's one exception to the rule, and that's Skyrim all beefed, tweaked and modded upwards. But the universal question remains, is it worth it investing in that extra memory?


http://hexus.net/tech/reviews/graphics/43109-evga-gefor...

Quote:
Of more interest is the 2,204MB framebuffer usage when running the EVGA card, suggesting that the game, set to Ultra quality, is stifled by the standard GTX 680's 2GB.


You stopped on the Hexus.net article right before it explained this:

"This graph shows that while the average and per-second framerate of the EVGA card is better than the 6GB-equipped Sapphire TOXIC, both produce a near-identical number of sub-33ms frames. Of more interest is the 2,204MB framebuffer usage when running the EVGA card, suggesting that the game, set to Ultra quality, is stifled by the standard GTX 680's 2GB. We ran the game on both GTX 680s directly after one another and didn't feel the extra smoothness implied by the results of the 4GB-totin' card."

So the results were only noticeable in terms of FPS, not to the reviewer's eyes.

The argument that the consoles have 8GB of RAM and more RAM will be necessary in the future because of that is specious. The devices themselves have 8GB total system RAM - most systems today have 4, 8, or 16GB of RAM to go along with the 2, 3, 4, or 6GB of video RAM on their GPU. That's systems ranging from 6-22GB of total RAM, much higher than the consoles.

Both consoles are running a 256-bit system memory bus, and the PS4 has the most shaders at 1152, the Xbox at 768. Whilst shaders are not everything per se, the specifications for the GPU on the consoles' SoC are quite similar to the GTX 760. AMD's competing product in that arena is the 7950, which enjoys a higher shader count, a 384-bit memory bus, and 3GB of RAM. Both perform very similarly in real-world tests. All this is to say that there's no real reason to believe at this point that the 8GB of system RAM on a 256-bit memory bus in the consoles is going to drive VRAM requirements above 2GB on a 256-bit bus at 1080p.

Of course these are my theories - don't take it from me. Instead take it from this article:

http://www.pcper.com/news/Editorial/Epic-Games-disappoi...

Back in 2011, the Samaritan demo was created by Epic Games to persuade console manufacturers; it represented how Epic expected the next generation of consoles to perform. They said, back in 2011, that this demo would theoretically require 2.5 teraFLOPs of performance for 30FPS at true 1080p; ultimately their demo ran on the PC with a single GTX 680, approximately 3.09 teraFLOPs.

This required performance, (again) approximately 2.5 teraFLOPs, is higher than what is theoretically possible for the consoles, which is less than 2 teraFLOPs. The PC may have more overhead than consoles, but the PS4 and Xbox One would be too slow even with zero overhead.
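
Those teraFLOPs figures fall straight out of shader count times clock (times two for a fused multiply-add per cycle). A quick sketch with the commonly reported specs:

# Peak single-precision throughput: 2 FLOPs (FMA) per shader per cycle
def tflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1e6

print(tflops(1536, 1006))   # GTX 680  -> ~3.09
print(tflops(1152, 800))    # PS4      -> ~1.84
print(tflops(768, 853))     # Xbox One -> ~1.31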

Now, of course, this does not account for reducing quality where it will be the least noticeable and other cheats. Developers are able to reduce particle counts and texture resolutions in barely-noticeable places; they are also able to render below 1080p or even below 720p, as was the norm for our current console generation, to save performance for more important things. Perhaps developers might even use different algorithms which achieve the same, or better, quality for less computation at the expense of more sensitivity to RAM, bandwidth, or what-have-you.

But, in the end, Epic Games did not get the ~2.5 teraFLOPs they originally hoped for when they created the Samaritan Demo. This likely explains, at least in part, why the Elemental Demo looked a little sad at Sony's press conference: it was a little FLOP.
July 20, 2013 1:40:07 PM

It's an incremental step in the right direction. That being said, being able to load large texture packs would be useful. :)  As for having no benefit to the extra memory, well, that gets said every two or so years. Then a game pulls a GTA IV and starts gobbling up the VRAM. Fidelity increases are not necessarily keyed to polygon count or overall scene complexity. Tech luddites are always griping that we can't make use of this, that no, extra of that is pointless. Ignore it and go on.

I had a friend who griped around the time of GTA IV that quad-core CPUs were useless due to blah blah blah threading and bandwidth. I could play the game. He couldn't. :) 
July 20, 2013 2:19:43 PM

tajisi said:
It's an incremental step in the right direction. That being said, being able to load large texture packs would be useful. :)  As for having no benefit to the extra memory, well, that gets said every two or so years. Then a game pulls a GTA IV and starts gobbling up the VRAM. Fidelity increases are not necessarily keyed to polygon count or overall scene complexity. Tech luddites are always griping that we can't make use of this, that no, extra of that is pointless. Ignore it and go on.

I had a friend who griped around the time of GTA IV that quad-core CPUs were useless due to blah blah blah threading and bandwidth. I could play the game. He couldn't. :) 


Those are really good points. I'm not super technical with gaming; I prefer to read the experts and formulate my own opinion. That being said, the next generation of consoles going for that 60 FPS target is a good thing. So are large texture packs to go along with the possible 3-5GB of RAM for the GPU. From the article I posted, though, it looks like these targets are going to be achieved in the same manner as the current gen - by sacrificing AA or some trickery, read "optimization." That will leave PC gamers in the same position they've always been in, with superior control over their games' content and graphics quality.

The one "ace up the sleeve" may in fact be the Xbox One that everyone was complaining about. With their cloud infrastructure to support the platform, developers may be able to offload certain workloads into the "cloud" and keep up with PCs for longer. Unless something drastically changes with Intel, AMD, and Nvidia, CPU components today should be on par with, if not better than, what will be coming out for consoles this holiday season. As the article I posted stated, the demo was shown on one GTX 680 whereas the console version lacked all the effects. Who knows what kind of cards will be around in 7-10 years when they are retiring the PS4 and XB One.
July 20, 2013 4:54:21 PM

Useless unless you have a triple-monitor setup. For all these people saying to future-proof for the games coming out in the next few years: you will probably want a 20nm GPU by then, as they will have moved down from 28nm. GPU manufacturers have extracted all the performance they can from 28nm; they can only increase performance by increasing die area. So buying to future-proof is stupid; the 20nm GPUs are going to be a big boost in performance, and you will end up wanting one and replacing the card anyway. RAM prices will probably be cheaper too, so the price premium for the larger-RAM card may be smaller then as well. Always purchase for your immediate needs; that way you stay at the best price-to-performance ratio. By the time you need something better, that something better will have come down in price, allowing you to stay current at the cheapest possible prices.
July 20, 2013 11:27:54 PM

4GB VRAM useless? Maybe for gaming, but I use the extra VRAM for CUDA applications, not gaming. Not everybody can afford a Tesla or Quadro card.
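
For what it's worth, here's a minimal sketch of claiming that extra buffer from Python (assuming PyCUDA is installed; the 3GB figure is just an example allocation):

import pycuda.autoinit              # creates a context on the default GPU
import pycuda.driver as drv

free, total = drv.mem_get_info()    # free/total device memory in bytes
print(f"{free / 2**30:.2f} GB free of {total / 2**30:.2f} GB")

# A 3GB scratch buffer: fails on the 2GB card, fits on the 4GB one.
buf = drv.mem_alloc(3 * 2**30)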
July 21, 2013 6:39:07 AM

OcelotRex said:
You stopped on the Hexus.net article right before it explained this:

"This graph shows that while the average and per-second framerate of the EVGA card is better than the 6GB-equipped Sapphire TOXIC, both produce a near-identical number of sub-33ms frames. Of more interest is the 2,204MB framebuffer usage when running the EVGA card, suggesting that the game, set to Ultra quality, is stifled by the standard GTX 680's 2GB. We ran the game on both GTX 680s directly after one another and didn't feel the extra smoothness implied by the results of the 4GB-totin' card."


It explained that he didn't "see" a difference in a particular game on a particular screen. The 4k screens are just arriving and game manufacturers are just now beginning to design games to the hardware available.

July 21, 2013 7:58:17 AM

JackNaylorPE said:
OcelotRex said:
You stopped on the Hexus.net article right before it explained this:

"This graph shows that while the average and per-second framerate of the EVGA card is better than the 6GB-equipped Sapphire TOXIC, both produce a near-identical number of sub-33ms frames. Of more interest is the 2,204MB framebuffer usage when running the EVGA card, suggesting that the game, set to Ultra quality, is stifled by the standard GTX 680's 2GB. We ran the game on both GTX 680s directly after one another and didn't feel the extra smoothness implied by the results of the 4GB-totin' card."


It explained that he didn't "see" a difference in a particular game on a particular screen. The 4k screens are just arriving and game manufacturers are just now beginning to design games to the hardware available.



To bring this back to the console portion of the discussion -

The PS4 camp has been definitive on 4k gaming; it will not support it.

The Xbox One has less raw power on the console than the PS4, and some tech writers have speculated that Microsoft has been elusive in confirming no 4k gaming because it might be possible with their cloud servers.

Either way we're years (3-5?) away from 4k being common in the living room. In that time frame the extra money spent on cards with a 256-bit memory bus ends up being wasteful except in very small instances. The money doesn't justify the small performance increases and can be spent in other areas for more performance gains.
July 22, 2013 3:32:07 AM

The memory bus width is the same as the 2GB version, so you probably won't ever see any performance improvement :L
July 23, 2013 6:41:56 PM

OcelotRex said:
The PS4 camp has been definitive on 4k gaming; it will not support it.

Either way we're years (3-5?) away from 4k being common in the living room. In that time frame the extra money spent on cards with a 256-bit memory bus ends up being wasteful except in very small instances. The money doesn't justify the small performance increases and can be spent in other areas for more performance gains.

Bear in mind developers need to code the game natively for 4K resolution, so it's not just the TV to consider. The PS3 launched Holiday 2006, and it wasn't until 2009-2010 that a number of new games were actually coded for 1080p from the ground up (my first was Gran Turismo 5). Keep in mind there's a difference between natively coded 1080p and a game that plays at 1080p upscaled from native 720p. A lot of people have a misconception about that.

So point being, even if the PS4 could/will support 4K with a firmware upgrade or something, that doesn't mean we'll be seeing games at 4K native resolution any time soon.

July 23, 2013 7:24:05 PM

10tacle said:
OcelotRex said:
The PS4 camp has been definitive on 4k gaming; it will not support it.

Either way we're years (3-5?) away from 4k being common in the living room. In that time frame the extra money spent on cards with a 256-bit memory bus ends up being wasteful except in very small instances. The money doesn't justify the small performance increases and can be spent in other areas for more performance gains.

Bear in mind developers need to code the game natively for 4K resolution, so it's not just the TV to consider. The PS3 launched Holiday 2006, and it wasn't until 2009-2010 that a number of new games were actually coded for 1080p from the ground up (my first was Gran Turismo 5). Keep in mind there's a difference between natively coded 1080p and a game that plays at 1080p upscaled from native 720p. A lot of people have a misconception about that.

So point being, even if the PS4 could/will support 4K with a firmware upgrade or something, that doesn't mean we'll be seeing games at 4K native resolution any time soon.

Reading up a little on the matter, it confirms what you are saying: most current-gen games are 720p upscaled to 1080p. It also goes to prove my point in the discussion: the larger amounts of VRAM in cards today don't justify the extra money for higher resolutions.

My thoughts are to buy for today and spend money on valuable parts. The 4GB on the card does not offer value. If money is no object, buy it knowing that the extra $40-50 could be spent elsewhere for better performance gains.
August 3, 2013 2:29:06 PM

Despite all the hoopla about 4GB being overkill even at higher resolutions for current games, I'm not quite convinced. First, take into account Skyrim with texture mods, and Shogun 2. Both of these games are reported to use large swaths of VRAM, over 2GB. The Hexus review cited did not test Skyrim. It tested Shogun 2, but it did not seem to account for the fact that the game will downscale your settings when it runs out of VRAM. This could be the reason for the 4GB and 2GB 680s in their test running similarly in their 3x monitor setup, hence their reporting of the game using less than 2GB of VRAM. http://forums.totalwar.com/showthread.php/17730-Holy-VR...
December 24, 2013 1:31:53 AM

I game at 1920x1080 at the moment, and using high-res textures I often see VRAM usage of up to 2.3 or 2.4GB. For that reason 2GB isn't enough, so I bought the 4GB MSI 770 above. Incidentally, as well as higher clocks out of the box than the MSI 2GB Gaming model, it also comes with a nice backplate. I didn't keep it long, even though I found it to be a really nice, well-built card that performed well.
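
If anyone wants to log that usage themselves, a minimal sketch using nvidia-smi's query mode (assuming an NVIDIA driver with nvidia-smi on the PATH):

import subprocess, time

# Poll dedicated VRAM usage once a second; Ctrl+C to stop.
while True:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"], text=True)
    used, total = (int(x) for x in out.strip().split(", "))
    print(f"{used} MiB / {total} MiB used")
    time.sleep(1)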