Before you read, thank you for your input and suggestions, I appreciate it very much.
So, I am buying a new PC soon (first build), and I have two builds picked out. I'm new to the computer world, so I don't know these things. One is a sweet budget build some people on a forum helped me put together. For fun, and to save money, I wanted to see what I could do to make this build cheaper, so I took one step down in the graphics card. I also stumbled upon the claim that my computer will max out any game, apparently because of my resolution.
I quote: "At your 720p resolution you will probably never need two graphics cards. A single 6770 will max out any game, since it has fewer pixels than a 19-inch monitor.
A 1080p screen has 1920 x 1080 pixels that the graphics card has to calculate and draw each time it draws a frame.
That's 2,073,600 pixels.
A 720p screen is 1280 x 720.
That's 921,600 pixels.
The same graphics card will run a game more than twice as fast at 720p as it would at 1080p.
That means even a mid-range card like the 6770 will run really good detail and frame rates at that resolution."
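The arithmetic in that quote checks out. Here's a quick sketch of it in Python, just to show the numbers (the "more than twice as fast" part is the quoted poster's rule of thumb, not a benchmark):

```python
# Pixel math behind the 720p vs 1080p claim quoted above.

def pixels(width, height):
    """Pixels the GPU has to shade for every frame at this resolution."""
    return width * height

p1080 = pixels(1920, 1080)  # 2,073,600
p720 = pixels(1280, 720)    # 921,600

# 1080p pushes 2.25x the pixels of 720p each frame.
print(p1080, p720, p1080 / p720)
```

So per frame, a 1080p screen asks the card for 2.25 times the pixels of a 720p screen, which is where the "more than twice as fast" estimate comes from.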
I am going to run my games on a 32" 720p LCD. I didn't know any of this. I am SOMEWHAT new to computer stuff; I understand some, but not A LOT. (^ Interesting, huh?)
Anyways, back to the point: I got a build together and I just don't understand the 128-bit and 256-bit memory interfaces on cards.
What's the difference?
How big is it?
How much does it affect your computer?
And is it worth having the bigger one over the smaller one?
XFX Radeon HD 6870: http://www.newegg.com/Product/Product.aspx?Item=N82E168...
XFX Radeon HD 6770: http://www.newegg.com/Product/Product.aspx?Item=N82E168...
The clocks are the same. (Or they were... or close enough, anyway.) The card is basically the same. I just want to know how big of a difference it is and whether it's worth spending the extra money right now.
You can't compare graphics cards of different models based on clock rates.
I just don't know a whole lot about computers and such. Between reviews on YouTube and the point about resolution I mentioned above, the 6770 is probably the card for me, IDK. I just want to see what the best possible gaming PC I can get for under $600 is. I might just get my other build. IDK.
I don't really know; I just want to know how much of a difference 128-bit vs. 256-bit makes, and what it affects.
The bit width is to do with the memory on the card. A 256-bit card will process data twice as fast as a 128-bit card IF all other things remain the same.
I completely disagree. That could only be true if the GPU is starved for memory bandwidth. Let's have a refresher, shall we?
Bit width. Mystery to some. It is ONE factor that determines how fast your GPU can access data stored in its memory. Other factors that come into play include clock speed and the number of bits transferred per clock cycle. 1000MHz DDR3 on a 128-bit bus has the same bandwidth as 500MHz DDR3 on a 256-bit bus, and the same as 250MHz GDDR5 on a 256-bit bus. If you look at the 4870 and the 5770, you'd see that even though one has a 256-bit bus and the other only a 128-bit bus, the 4870 isn't even close to twice as fast. The reason is that those 800 SPs didn't need as much memory bandwidth as the 4870's 256-bit bus provided. Even the 6790, which has a 256-bit memory bus, isn't twice as fast.
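That bandwidth math is easy to check yourself. A minimal sketch, using the simplified transfers-per-clock figures from this thread (2 for DDR3, 4 for GDDR5):

```python
# Peak memory bandwidth = clock * transfers per clock * bus width in bytes.
# The transfers-per-clock values are the simplification used in this thread,
# not exact figures for any specific card.

def bandwidth_gbps(clock_mhz, transfers_per_clock, bus_bits):
    """Peak bandwidth in GB/s for a given clock, data rate, and bus width."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

a = bandwidth_gbps(1000, 2, 128)  # 1000MHz DDR3 on a 128-bit bus
b = bandwidth_gbps(500, 2, 256)   # 500MHz DDR3 on a 256-bit bus
c = bandwidth_gbps(250, 4, 256)   # 250MHz GDDR5 on a 256-bit bus

print(a, b, c)  # all three come out to 32.0 GB/s
```

All three configurations land on the same number, which is the point: bus width alone tells you nothing until you also know the clock and the data rate per clock.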
I agree that for 720p a 6770 will do fine. You might want to step up to a 6850 if you want to max everything out or are worried about future games. In a different thread I showed 3 games that a 6770 at 720p wouldn't be able to max out and still get >30 FPS. (I think the games were Crysis, Metro, and AvP.)
I must have been having a slow day... Get that palm off your face.
While what I wrote was true, yours can be true as well; I even showed it in my example. I was thinking of overall speed, or FPS. It's not like a card with a 256-bit bus will give you 100 FPS while a card with a 128-bit bus will give you only 50 FPS. As I tried to show, doubling the bus width won't double your speed. (We dealt with this mentality when the 5770 came out with only a 128-bit bus.)
To fix your quote:
The bit width is to do with the memory on the card. A 256-bit card will process data from its memory twice as fast as a 128-bit card IF all other things remain the same.
I totally agreed with you when I wrote:
500MHz DDR3 on a 256-bit bus, and the same as 250MHz GDDR5 on a 256-bit bus.
If you hold to "all other things remain the same", then 500MHz DDR3 has half the bandwidth of 500MHz GDDR5. GDDR5 is like Intel's quad-pumped FSB and transfers 4 bits per clock per data line instead of DDR3's two. I thought you were talking about doubling your overall speed; if you were talking about memory bandwidth, you'd be correct.
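To make that "all other things remain the same" case concrete, here's a small sketch: same clock, same bus width, and only the transfers per clock differ (2 vs. 4, again using this thread's simplification):

```python
# Same clock and bus width; only the data rate per clock changes.
# Transfers-per-clock values follow the simplification in this thread.

def bandwidth_gbps(clock_mhz, transfers_per_clock, bus_bits):
    """Peak bandwidth in GB/s for a given clock, data rate, and bus width."""
    return clock_mhz * 1e6 * transfers_per_clock * (bus_bits / 8) / 1e9

ddr3 = bandwidth_gbps(500, 2, 256)   # 500MHz DDR3, 256-bit
gddr5 = bandwidth_gbps(500, 4, 256)  # 500MHz GDDR5, 256-bit

print(gddr5 / ddr3)  # 2.0 -> at the same clock, GDDR5 moves twice the data
```

So with everything else held equal, the GDDR5 config has exactly twice the memory bandwidth; whether that doubles your FPS is a separate question, as discussed above.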
Some days I seriously need to get a 2-liter of soda in me before I get on these forums...