Will my new system be bottlenecked?

thepunshment

Distinguished
Oct 31, 2009
30
0
18,530
I'm building a new computer. I already have a CD/DVD drive and hard drives, so I don't need those.

I have an ATI HD 3650 graphics card. I'm going to use it until the ATI HD 5850s are back in stock.


This is my new build, but I'll have to use the HD 3650 with it for about 2 months max. How much of a bottleneck will the HD 3650 be on this build? :kaola:
here is the build:

http://secure.newegg.com/WishList/PublicWishDetail.aspx?WishListNumber=10995271


Don't worry, I know it's a bottleneck, but I need to know if it's worse than an HD 3650 with an AMD Athlon 64 X2 4800+ (that's what I'm running right now).

I know when I get the new graphics card everything will be really smooth. :D I'm using it for gaming... and whatever cool people do.
 
Solution



The bottleneck in BOTH systems is the HD3650 graphics card.

The AMD X2-4800+ can utilize up to an HD4770 (at which point some games are CPU limited and some games are GPU limited).

You are MASSIVELY bottlenecked in gaming by using the HD3650 with an i7-860; however, I would recommend keeping it until NVidia releases their DX11 cards. If you still want the HD5850 or HD5870, at least the prices will have dropped by then.

Things to compare for NVidia DX11 vs ATI DX11:

1. Idle Power
2. Physics support
3. Game benchmarks
4. Other application support

If I was to build a kick-ass system I'd use the i7-860 and Crossfire 2xHD5850 1GB. (I prefer the HD5850 because of price AND board length.)

The HD5850 performs, on average, at 84% the frame rate of the HD5870. The price is often 70% the cost of the HD5870 and again the board is shorter. Cooling MAY be slightly better too (I'm looking forward to the Sapphire Vapor-X version or similar design when they finally arrive.)
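To put those two percentages together, here's a quick back-of-the-envelope check in Python. Only the 84% and 70% figures come from above; the $379 HD5870 price is just an illustrative assumption, and the ratio doesn't depend on it:

```python
# Rough price/performance comparison of HD5850 vs HD5870,
# using the ~84% performance and ~70% price figures above.
# The $379 HD5870 price is an assumption for illustration only.

hd5870_price = 379.0                 # USD (assumed)
hd5870_fps = 100.0                   # normalize HD5870 performance to 100

hd5850_price = hd5870_price * 0.70   # ~70% of the HD5870's cost
hd5850_fps = hd5870_fps * 0.84       # ~84% of the HD5870's frame rate

perf_per_dollar_5870 = hd5870_fps / hd5870_price
perf_per_dollar_5850 = hd5850_fps / hd5850_price

# The HD5850 delivers about 20% more frames per dollar (0.84 / 0.70 = 1.2),
# regardless of what absolute price you assume.
print(f"HD5870: {perf_per_dollar_5870:.3f} fps/$")
print(f"HD5850: {perf_per_dollar_5850:.3f} fps/$")
print(f"Ratio:  {perf_per_dollar_5850 / perf_per_dollar_5870:.2f}x")
```

In other words, at these numbers the HD5850 is the better value per dollar even before counting the shorter board.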

Other considerations:
Drop in a Larrabee card as a second or third card in the future (wait and see in 2010).
 

overshocks

Distinguished
Aug 7, 2009
1,204
0
19,360


^Larrabee is not a card... you don't make sense at all.

"If I was to build a kick-ass system I'd use the i7-860 and Crossfire 2xHD5850 1GB. (I prefer the HD5850 because of price AND board length.)"

...talk about losing performance with a P55 mobo. Get an i7-920 in this case. That's not a "kick-ass system".

To the OP: talk about a bottleneck... sigh.
What's your monitor resolution?
Did you already order the parts in your link? Because that RAM is not good; switch it out for lower-latency RAM.

This RAM is 1333MHz 7-7-7-21:
http://www.newegg.com/Product/Product.aspx?Item=N82E16820231276
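For anyone comparing modules, here's a quick sketch of how to turn CAS latency cycles into absolute latency in nanoseconds (the CL9 line is an assumed comparison point, not a specific product):

```python
# Convert CAS latency in clock cycles to absolute latency in nanoseconds.
# DDR transfers twice per clock, so a DDR3-1333 module runs a 666.5 MHz clock.

def cas_latency_ns(cl_cycles, ddr_rate_mhz):
    """True CAS latency in ns: cycles divided by the memory clock (DDR rate / 2)."""
    clock_mhz = ddr_rate_mhz / 2
    return cl_cycles / clock_mhz * 1000

print(round(cas_latency_ns(7, 1333), 1))   # CL7 @ DDR3-1333: ~10.5 ns
print(round(cas_latency_ns(9, 1333), 1))   # CL9 @ DDR3-1333: ~13.5 ns
```

So a CL7 1333MHz kit really does shave a few nanoseconds off every access compared to a typical CL9 kit at the same speed.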
The mobo in the link doesn't support x8 x8 and doesn't support SLI. You've pretty much limited your option of running two video cards, because it will only run x16 x4.
 

thepunshment

Distinguished
Oct 31, 2009
30
0
18,530



Oh yeah, I'm just going for the 860 because it's cheaper and runs cooler. My parents have a 920 and it uses up a lot of power. The RAM you linked is good, but I already got my RAM, and I don't think it's that bad. The mobo is only going to be used for one card and one card only; I'm not a fan of two cards.
 
photonboy

to "overshocks":

"Larrabee" is the codename for Intel's upcoming card, which has the capability of acting as a graphics card or performing tasks normally done on the CPU. It's not certain how it will perform or how Intel will package it. There may be some benefit to getting one in addition to a regular graphics card, though it's a wait-and-see game. Intel also has to write the supporting software, as it acts more like an emulator (which is probably much more efficient than it sounds; "dedicated" cards often have unused portions consuming power, and Larrabee can be upgraded by software alone for ANY task, including DX12). Obvious uses aside from a full-fledged graphics card include physics support, video transcoding and compression. If I were Intel I wouldn't initially market it to compete with gaming video cards; I'd offer it as a supplemental card.

Losing performance?
I assume you are referring to the PCIe limit, which is HALF on the 1156 (1x16 PCIe v2.0 or 2x8, instead of 2x16 on the 1366)?

They have tested and confirmed that a Crossfire setup of 2xHD5870 1GB is NOT limited by this bandwidth reduction. You would have to have better cards to run into PCIe bandwidth limits but I'd be building a new system at that point anyway.

The only "advantage" of the 1366/i7-920 system is this theoretical bandwidth and the fact that it will support the upcoming 6-core 32nm Intel CPU (which should be $1000 or so). The 1156/i7-860 is slightly cheaper, has lower power consumption and can enter a TURBO mode automatically when needed.

There are LOTS of articles comparing the two.

The i7-920 is essentially identical to the 860 for performance but the 860, again, has the auto TURBO mode and uses less power.
 

jbakerlent

Distinguished


It's not a card; it's a chip with both a CPU and an IGP on it.



The 920 has turbo mode as well.

I do agree though that the 860 is a better option than the 920 in many cases.
 

jbakerlent

Distinguished
@OP, what HDD do you have? Also, in addition to the RAM issue mentioned above, you bought a crossfire PSU for a single card system (not a big deal, but you could have saved a little money). And I didn't see you mention anywhere what you will be using the system for so I can't say for sure how big the bottleneck will be.
 

overshocks

Distinguished
Aug 7, 2009
1,204
0
19,360



Aw, jbakerlent you beat me to it. I was laughing so hard when he said larrabee was a card for the second time

To photonboy: you said "Losing performance?
I assume you are referring to the PCIe limit which is HALF on the 1156 (1x16 PCIe v2.0 or 2x8 instead of 2x16 on the 1366)?

They have tested and confirmed that a Crossfire setup of 2xHD5870 1GB is NOT limited by this bandwidth reduction. You would have to have better cards to run into PCIe bandwidth limits but I'd be building a new system at that point anyway. "

x16 x16 is always better than x8 x8. Read articles; you do get slight performance benefits from x16 x16, meaning more FPS. Now you need to get your facts straight, buddy. Larrabee, a card? Lol. Start googling.


"The i7-920 is essentially identical to the 860 for performance but the 860, again, has the auto TURBO mode and uses less power."

No big deal; the OP can overclock manually and disable Turbo mode, for example getting 3.6GHz on ALL four cores instead of a high speed on just one core.
 
Solution
FYI, I have an Electronics Technician's diploma. Before recently retiring from the Navy at 39, I maintained military radar systems and the central command-and-control computers which communicate between the operations consoles, radars, missiles and other hardware. I was also "the guy" to ask when building a new PC, and now I keep track of new and future PC architectural changes for fun.

I don't wish to seem arrogant but I do get a little annoyed when responses border on being rude so I thought I'd partially state my credentials. Anyway, here's my last response to clear up some points:

Response about PCIe:
"x16 x16 is always better than x8 x8. Read articles; you do get slight performance benefits from x16 x16, meaning more FPS. Now you need to get your facts straight, buddy."

REPLY:
http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/25.html
This is an extensive article benchmarking specifically to see whether the HD5870 1GB is limited when running on x8 PCIe v2.0 instead of x16. It runs within 1 or 2% for a single card. With two cards (two x8 slots) there would be NO difference at all, because of the way Crossfire/SLI works: the bandwidth across the bus is less than twice that of a single card maxed out (you might also hit a CPU bottleneck preventing full use of two cards, but that's not my main point). Again, a Crossfire setup of 2xHD5870 1GB will not be restricted, but you should go with the full 2x16 if you know you will eventually be upgrading your graphics beyond this on the same motherboard (at which point you must also ensure you have no RAM or CPU bottlenecks).
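For reference, the theoretical numbers behind that test can be sketched like this (per the PCIe 2.0 spec: 5 GT/s per lane with 8b/10b encoding, giving 500 MB/s of usable throughput per lane per direction):

```python
# Theoretical one-way PCIe v2.0 bandwidth.
# Each lane runs 5 GT/s; 8b/10b encoding leaves 500 MB/s usable per lane.
MB_PER_LANE = 500

def pcie2_bandwidth_gb(lanes):
    """Theoretical one-way bandwidth in GB/s for a PCIe v2.0 link."""
    return lanes * MB_PER_LANE / 1000

print(pcie2_bandwidth_gb(16))  # x16 slot: 8.0 GB/s
print(pcie2_bandwidth_gb(8))   # x8 slot:  4.0 GB/s
print(pcie2_bandwidth_gb(4))   # x4 slot:  2.0 GB/s
```

So the article's finding is that even halving the link from 8 GB/s to 4 GB/s costs an HD5870 only a percent or two, because the card rarely saturates the bus.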

RESPONSES to Larrabee:
"I was laughing so hard when he said larrabee was a card for the second time" and
"larrabee is not a card...you don't make sense at all."

REPLY 2:
http://news.softpedia.com/news/Intel-Larrabee-Graphics-Cards-Integrate-16-Cores-48746.shtml
I've read several articles on Larrabee, including Intel's whitepaper. Larrabee is going to be a PCIe card that drops in to perform CPU and GPU tasks by utilizing a software emulation layer. There is NOTHING that Larrabee cannot do, provided the emulation software exists. That can include DX11, physics, DirectCompute, transcoding video or simply looking like a multi-core CPU. The big question is efficiency. The two largest advantages Larrabee has are:
1. The design is based around the well-understood x86 instruction set, vastly simplifying programming (which I believe will be in C++).
2. The generic design and software-emulation approach give it lots of flexibility and compatibility (update the emulation software and now you have a DX12 card!).
The largest potential disadvantages are price and efficiency, so it will be VERY interesting to see how it performs. What it loses in its generic emulation approach it will partially offset by using all of its architecture (graphics cards often can't fully use all their hardware at any point in time, but the unused portions often still generate heat).
Before ridiculing someone, at least spend some time googling the subject, and don't say anything online that you wouldn't say to someone's face.

*I mentioned Larrabee because I'm hoping that there will be some benefit to picking one up in addition to a normal NVidia or ATI card (if not, then why build it?). Driver support may be an issue, as NVidia especially doesn't wish to play nice with other "graphics cards." One option may be the LucidLogix chip. See my little blurb at the end.

RESPONSE about Larrabee #2:
"It's not a card, it's a chip with both a CPU and a IGP on it."

REPLY:
I believe you are confusing Larrabee with Westmere. Westmere is a CPU which also has an IGP within the same package.
http://www.pcstats.com/articleview.cfm?articleID=2370

RESPONSE to i7-860 Turbo mode:
i7-860 is superior->
"In our testing, as you saw on previous pages, the Core i7-860 did beat out the i7-920 in many cases thanks to the speed increases provided by a highly tuned Turbo Mode on the Lynnfield core reaching as high as 3.47 GHz."

http://www.pcper.com/article.php?aid=781&type=expert&pid=10

**Lucid Logix and MSI Big Bang:
You should check out this chip on the Big Bang MSI board; reviews should be out in a few weeks which will compare the LucidLogix approach to SLI and Crossfire. The approach is very different. It sits between the PCIe bus and the cards and interprets and allocates the instructions on the fly. It does not require two identical cards. If I'm correct then it can fully use two separate cards for a game that isn't even coded to use SLI or Crossfire. This approach also eliminates the constant need to upgrade Video drivers and should be able to almost fully use both cards. My biggest questions will be:
1. Will this approach work for games that normally can only use one graphics card?
2. Will we see almost 2x the frame rate for all games that aren't CPU bottlenecked?
3. Will I need this to match an NVIdia or ATI card and future Larrabee card?
4. Are there any potential drawbacks now or in the future?

Possible 2010/11 PC design I'm hoping for:
-LucidLogix chip (no SLI/Crossfire)
-2x Graphics cards
-addon Larrabee board
-six-core 32nm Intel CPU
-onboard graphics chip and complete power down of graphics cards when not needed
-ATOM CPU that can shut down the six-core CPU completely when not needed
-run two Operating Systems at the same time, independently or just the ATOM/IGP, 6-core/IGP or 6-core/addon Graphics.
-PCIe Extension allowing "modular" PC so the Graphics addon is a separate box allowing heatsinks and exposed air on top and bottom for minimal fan noise.
 

jbl91

Distinguished
Aug 4, 2009
116
0
18,690


Probably one of the strangest posts ever.
 

overshocks

Distinguished
Aug 7, 2009
1,204
0
19,360


Lol, Larrabee is a card? It's a chip. I love that he wrote an essay on something that I spent 2 secs skimming through, and I laughed my butt off again.

What's with the semi-essay on X58 x16 x16 vs P55 x8 x8? You said "no difference," correct? LOLLLL, X58 x16 x16 will ALWAYS be better than P55 x8 x8. Dude, there's no need to debate over this...

For Larrabee again...
Maybe Wikipedia gives you this info too. Trustable? I think so.
http://en.wikipedia.org/wiki/Larrabee_%28GPU%29

The article you linked is rather funny: "Larrabee cards are expected to make their debut in early 2009."
Telling me to google, eh? Why don't you google a more trustworthy and reliable article, huh? I find this debate quite pathetic.