FYI, I hold an Electronics Technician's Diploma; before recently retiring from the Navy at 39, I maintained military radar systems and the central command-and-control computers that communicate between the operations consoles, radars, missiles and other hardware. I was also "the guy" to ask when building a new PC, and I now keep track of new and upcoming PC architectural changes for fun.
I don't wish to seem arrogant, but I do get a little annoyed when responses border on being rude, so I thought I'd partially state my credentials. Anyway, here's my last response to clear up some points:
Response about PCIe:
"x16 x16 is always better than x8 x8. Read articles you do get slight performance benefits for x16 x16 meaning more fps. Now you need to get your facts straight buddy."
REPLY:
http://www.techpowerup.com/reviews/AMD/HD_5870_PCI-Express_Scaling/25.html
This is an extensive article benchmarking specifically whether the HD 5870 1GB is limited when running on a PCIe 2.0 x8 slot instead of x16. It runs within 1 or 2% for a single card. With two cards (two x8 slots) there would be NO difference at all: because of the way Crossfire/SLI works, the bandwidth across the bus is less than twice that of a single card maxed out (you might also hit a CPU bottleneck preventing full use of two cards, but that's not my main point). Again, a Crossfire of two HD 5870 1GB cards will not be restricted, but you should go with the full x16/x16 if you know you will eventually upgrade your graphics beyond this on this motherboard (at which point you must also ensure you have no RAM or CPU bottlenecks).
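For reference, here's the back-of-the-envelope bandwidth math behind the x8 vs. x16 argument (a sketch assuming PCIe 2.0's published specs: 5 GT/s per lane with 8b/10b encoding, giving 500 MB/s usable per lane, per direction):

```python
# PCIe 2.0 bandwidth per slot width (assumed published figures:
# 5 GT/s per lane, 8b/10b line code -> 80% efficiency, 1 bit/transfer).
GT_PER_S = 5.0
ENCODING_EFFICIENCY = 8 / 10   # 8b/10b line code
BYTES_PER_BIT = 1 / 8

def lane_bandwidth_gbs(gt_per_s=GT_PER_S):
    """Usable bandwidth of one PCIe 2.0 lane in GB/s, one direction."""
    return gt_per_s * ENCODING_EFFICIENCY * BYTES_PER_BIT

for lanes in (8, 16):
    print(f"x{lanes}: {lanes * lane_bandwidth_gbs():.1f} GB/s per direction")
# x8 -> 4.0 GB/s, x16 -> 8.0 GB/s
```

So x8 already offers 4 GB/s each way; the benchmarks in the article above suggest a single HD 5870 rarely saturates even that.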
RESPONSES to Larrabee:
"I was laughing so hard when he said larrabee was a card for the second time" and
"larrabee is not a card...you don't make sense at all."
REPLY 2:
http://news.softpedia.com/news/Intel-Larrabee-Graphics-Cards-Integrate-16-Cores-48746.shtml
I've read several articles on Larrabee, including Intel's whitepaper. Larrabee is going to be a PCIe card that drops in to perform CPU and GPU tasks by using a software emulation layer. There is NOTHING Larrabee cannot do, provided the emulation software exists: that can include DX11, physics, DirectCompute, transcoding video, or simply looking like a multi-core CPU. The big question is efficiency. The two largest advantages Larrabee has are:
1. The design is based around the well-understood x86 instruction set, vastly simplifying programming, which I believe will be done in C++.
2. The generic design and software emulation approach give it lots of flexibility and compatibility (update the emulation software and now you have a DX12 card!)
The largest potential disadvantages are price and efficiency, so it will be VERY interesting to see how it performs. What it loses to its generic emulation approach it will partially offset by being able to use all of its hardware at once (graphics cards often can't fully use all their hardware at any point in time, but the unused portions often still generate heat).
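To make the flexibility argument concrete, here's a toy analogy of my own (not Intel code, and the names are invented): the hardware is just generic compute, and each API is an interchangeable program running on it.

```python
# Toy analogy for Larrabee's software-pipeline flexibility: generic
# x86-style cores run whatever kernel you hand them, so supporting a
# new API is a software update, not new silicon.
def run_on_cores(workload, kernel):
    """Generic cores: apply the supplied 'emulation software' to each item."""
    return [kernel(item) for item in workload]

# Two different "APIs" emulated on the same hardware:
dx11_rasterize = lambda tri: f"shaded({tri})"
physics_step   = lambda body: f"integrated({body})"

print(run_on_cores(["tri0", "tri1"], dx11_rasterize))
print(run_on_cores(["body0"], physics_step))
# A hypothetical DX12 would be just another kernel; the cores don't change.
```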
Before ridiculing someone, at least spend some time Googling the subject, and don't say anything in an e-mail that you wouldn't say to someone's face.
*I mentioned Larrabee because I'm hoping there will be some benefit to picking one up in addition to a normal NVIDIA or ATI card (if not, then why build it?). Driver support may be an issue, as NVIDIA especially doesn't wish to play nice with other "graphics cards." One option may be the LucidLogix chip; see my little blurb at the end.
RESPONSE about Larrabee #2:
"It's not a card, it's a chip with both a CPU and a IGP on it."
REPLY:
I believe you are confusing Larrabee with Westmere. Westmere is a CPU which also has an IGP within the same package.
http://www.pcstats.com/articleview.cfm?articleID=2370
RESPONSE to i7-860 Turbo mode:
i7-860 is superior:
"In our testing, as you saw on previous pages, the Core i7-860 did beat out the i7-920 in many cases thanks to the speed increases provided by a highly tuned Turbo Mode on the Lynnfield core reaching as high as 3.47 GHz."
http://www.pcper.com/article.php?aid=781&type=expert&pid=10
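For anyone wondering where the 3.47 GHz figure comes from, here's the arithmetic (a sketch assuming Intel's published numbers for the i7-860: a 133.33 MHz base clock, a x21 stock multiplier, and up to 5 extra Turbo bins with few cores active):

```python
# Turbo Mode arithmetic for the i7-860 (assumed published specs:
# 133.33 MHz bclk, x21 base multiplier, up to +5 Turbo bins).
BCLK_MHZ = 133.33
BASE_MULT = 21
MAX_TURBO_BINS = 5

base_ghz = BCLK_MHZ * BASE_MULT / 1000
turbo_ghz = BCLK_MHZ * (BASE_MULT + MAX_TURBO_BINS) / 1000
print(f"base:  {base_ghz:.2f} GHz")   # ~2.80 GHz
print(f"turbo: {turbo_ghz:.2f} GHz")  # ~3.47 GHz
```

That lightly-loaded 26x bin is why the 2.80 GHz i7-860 can beat the 2.66 GHz i7-920 in many benchmarks.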
**Lucid Logix and MSI Big Bang:
You should check out this chip on MSI's Big Bang board; reviews should be out in a few weeks comparing the LucidLogix approach to SLI and Crossfire. The approach is very different: the chip sits between the PCIe bus and the cards and interprets and allocates the rendering work on the fly. It does not require two identical cards. If I'm right, it can fully use two different cards even in a game that isn't coded for SLI or Crossfire. This approach also eliminates the constant need to upgrade video drivers and should be able to use both cards almost fully. My biggest questions are:
1. Will this approach work for games that normally can only use one graphics card?
2. Will we see almost 2x the frame rate for all games that aren't CPU bottlenecked?
3. Will I need this chip to pair an NVIDIA or ATI card with a future Larrabee card?
4. Are there any potential drawbacks now or in the future?
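Conceptually, the chip's job is load balancing: intercept the stream of draw calls and hand each one to whichever GPU will finish it soonest. A toy sketch of that idea (the names and cost model are my own invention, not Lucid's actual design):

```python
import heapq

# Toy model of on-the-fly work allocation between two unequal GPUs.
# 'speed' stands in for relative throughput; each draw call carries a
# cost estimate and goes to the card that will finish it soonest.
class Gpu:
    def __init__(self, name, speed):
        self.name, self.speed, self.busy_until = name, speed, 0.0

def dispatch(draw_calls, gpus):
    heap = [(g.busy_until, i) for i, g in enumerate(gpus)]
    heapq.heapify(heap)
    schedule = []
    for call_cost in draw_calls:
        finish, i = heapq.heappop(heap)
        g = gpus[i]
        g.busy_until = finish + call_cost / g.speed
        schedule.append((g.name, g.busy_until))
        heapq.heappush(heap, (g.busy_until, i))
    return schedule

# Two mismatched cards, e.g. an old card at half the speed of a new one:
gpus = [Gpu("fast", 2.0), Gpu("slow", 1.0)]
plan = dispatch([4, 4, 4, 4, 4, 4], gpus)
print(plan)  # the fast card ends up with twice as many calls
```

Note this needs no cooperation from the game, which is exactly why the approach could work for titles that were never coded for SLI or Crossfire.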
Possible 2010/11 PC design I'm hoping for:
-LucidLogix chip (no SLI/Crossfire)
-2x Graphics cards
-add-on Larrabee board
-six-core 32nm Intel CPU
-onboard graphics chip, with complete power-down of the graphics cards when not needed
-Atom CPU that can shut down the six-core CPU completely when not needed
-run two operating systems at the same time, independently: just the Atom/IGP, the six-core/IGP, or the six-core/add-on graphics
-PCIe extension allowing a "modular" PC, so the graphics add-on is a separate box with heatsinks exposed to open air on top and bottom for minimal fan noise