Nvidia vs Intel - Intel gets revenge! help!

December 5, 2007 4:23:41 AM

OK, we all know now that Barcelona is doomed; even when the microcode is fixed it's a second-rate chip - old news. Next... ATI is back, the 3870 is a good card, and two in Crossfire give some nice results. ATI still looks better than Nvidia - the colour is just better!

We all know that Intel licensed Nvidia for roughly four years to make chipsets, and that relationship is ending. Nvidia was not invited to the 45nm launch and at this time may not have a 680i driver for 45nm chips. The new 780 chipset for 45nm may not even get made?

My old 975X mobo was designed for SLI, but Nvidia stuck their nose in the air, thumbed it at Intel, and claimed Intel did not know how to make chipsets. Well, yes, Nvidia chipsets are fast, but they are hot.

Now what is all this crap about - quad-core issues on Nvidia-chipset mobos and SLI issues this year? Is it possible that Intel has been waging a war on Nvidia? Will Intel decide not to buddy up with Nvidia?

I do not know; if you do, please tell me!

I need help: should I be making SLI prototypes or Crossfire?

How about the Maximus Extreme? Dual x16 Crossfire, DDR3, and the third PCIe slot for a RAID card - or even three cards?

Anyone think the three-card solution will work with the Maximus mobo?

Feel free to PM me if you'd like to talk in more detail, as I am building systems with all of the above parts.
December 5, 2007 7:28:29 AM

I don't know much about the situation between Nvidia and Intel, but I think it would only be worth having two ATI cards on the Maximus Extreme, as it only has two PCIe slots at full x16 each; with three cards the third one will be limited to x8 (I think), so that card would not be able to show its true potential.

I'm not an expert though, and if anyone would like to correct me, feel free to do so.
December 5, 2007 8:23:51 AM

ATI's new series of cards is still a touch behind and power hungry, and considering how late they came, I'm not impressed.
December 5, 2007 8:40:05 AM

ATI's cards are NOT power hungry, and not as hot as Nvidia's. Using x16/x16/x8 won't hurt CF, as x8 is still plenty at this point. Here's an example from an owner of how "power hungry" a 3850 is: http://www.tomshardware.com/forum/246761-33-just-receiv... I own an 8800 GTS 320 and can't even do that with that card, and the 3850 is fairly comparable with it.
December 5, 2007 11:03:46 AM

apache_lives said:
ATI's new series of cards is still a touch behind and power hungry, and considering how late they came, I'm not impressed.


How do you rate them as power hungry? Where have you seen that? I think they (the 3870s) share roughly the same load consumption as the 8800 GT, but their idle usage is a fair bit lower.

Sod it, since I got roughed up on another thread for not posting links, here is one:

http://www.techspot.com/review/76-asus-radeon-hd-3870/p...


The 3870's peak draw is more than 80 watts less than the 2900 XT's.
December 5, 2007 11:10:18 AM

On another note, I'd like to bring up the differences between the TechSpot review (and just about every other review I've seen of the 3870 and 2900) and the benchmarks from xbitlabs.com.

http://www.xbitlabs.com/articles/video/display/leadtek-...

http://www.techspot.com/review/76-asus-radeon-hd-3870/p...

Check out the differences for the 2900 XT in STALKER. Is it just me - and I'm an abject fan of the R600 and its derivatives, so feel free to flame - or is there something seriously wrong with the ATI results in Xbit Labs' benchmarks? I know this is deviating from the topic, and I apologise in advance and won't complain if you decide to rip large meaty chunks out of me :bounce:
December 5, 2007 11:25:35 AM

Xbit uses 16x AF + HDR, which would affect performance compared to TechSpot's runs without AA/AF/HDR.
December 5, 2007 11:39:36 AM

Hmmmmmm... thanks for that one! I would have thought the Vista factor in Xbit Labs' review would tear large holes in the framerate, but legionhardware.com used Vista in their review too.

http://www.legionhardware.com/document.php?id=703&p=5

TechSpot used XP in their review, but Quake Wars was done with 4x AA and 16x AF, the same as in Xbit Labs', and the results are still a fair bit different:

http://www.techspot.com/review/76-asus-radeon-hd-3870/p...

http://www.xbitlabs.com/articles/video/display/leadtek-...

December 5, 2007 12:04:46 PM

At this point I don't see why people are even bothering with 680is, P38s, whatever, or new 3870s or 8800s. So they can have them for six months, maybe? Then the new 9000-series graphics cards come out and Nehalem-socket mobos start to hit the streets. You will need a whole new everything by then or get left behind. Yes, it will happen again eventually, but it's better to get into new hardware toward the beginning of a whole new socket and GPU design, as opposed to the end of it.
December 5, 2007 12:19:58 PM

Yeah, totally. However, you can just trawl eBay for second-hand performance cards. It's a cheap upgrade path, lol. In May I got a superclocked EVGA 7900 GT, still in the cellophane, for 80 quid. See what pickings there are in February when the new Nvidias come out. :)
December 5, 2007 1:00:34 PM

Intel is the M$ of hardware.
December 5, 2007 6:36:39 PM

OK guys,

Anyone have any info on Intel vs Nvidia for 2008?

Three cards in the Asus mobo? Will ATI make drivers?

What about Nvidia and Intel licensing?
December 5, 2007 8:47:48 PM

Why are we still calling ATI... ATI??? I thought they were getting rid of the name... I've been calling ATI "AMD" for the last six months.

To my knowledge, Tri and Quad Crossfire will be based on AMD chipsets and won't be enabled on Intel's... meaning if you want Tri or Quad Crossfire you have to buy an AMD CPU (Spider platform).

In any case it's going to be interesting either way... Intel looks like it is against Nvidia, so no SLI on Intel chipsets. Intel and AMD/ATI won't want to talk as they are in direct competition, so this should mean no Tri/Quad Crossfire on Intel chipsets. AMD/ATI won't make chipsets to support Intel anymore. Nvidia can't really team up with AMD/ATI because they are in direct competition. Outcome:

Intel CPU / ATI or NVIDIA single card
AMD CPU / ATI cards (crossfire)
NVIDIA in the dark.
December 5, 2007 10:32:16 PM

Actually, you missed one thing, chookman: Larrabee. So soon it will probably be better to pair an Intel CPU with an Intel GPU, since they are planning on a GPU that runs a CPU-style instruction set. So we could see the very first 2GHz+ GPU coming from Intel.

So Nvidia may be left in an even darker dark, since Larrabee will also have dual+ card options, all running DX10.1 and PCIe 2.0+.

But Nvidia brought it on themselves, really. They should work with Intel instead. Also, Nvidia's chipsets are great for SLI but don't have the same memory performance as an Intel chipset.
December 5, 2007 11:00:19 PM

I'd prefer 128 RISC cores at 1.625 GHz rather than roughly 16 2 GHz ones.
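To put rough numbers on that (a back-of-the-envelope sketch only - it assumes each core does comparable work per clock, which ignores memory bandwidth, scheduling, and how different the cores really are):

    # Rough aggregate clock throughput for the two designs mentioned above.
    # Purely illustrative; real performance depends on work done per clock,
    # memory bandwidth, and how well the workload parallelises.
    many_small = 128 * 1.625e9   # 128 simple cores at 1.625 GHz
    few_big = 16 * 2.0e9         # roughly 16 cores at 2 GHz
    print(f"128 x 1.625 GHz = {many_small / 1e9:.0f} G cycles/s aggregate")
    print(f" 16 x 2.000 GHz = {few_big / 1e9:.0f} G cycles/s aggregate")
    print(f"Ratio: {many_small / few_big:.1f}x in favour of the many-core design")

On raw aggregate cycles the many-core design is ahead by about 6.5x, which is why the "lots of slower cores" approach is attractive for graphics work.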
December 5, 2007 11:05:03 PM

chookman said:
Why are we still calling ATI... ATI??? I thought they were getting rid of the name... I've been calling ATI "AMD" for the last six months.

To my knowledge, Tri and Quad Crossfire will be based on AMD chipsets and won't be enabled on Intel's... meaning if you want Tri or Quad Crossfire you have to buy an AMD CPU (Spider platform).

In any case it's going to be interesting either way... Intel looks like it is against Nvidia, so no SLI on Intel chipsets. Intel and AMD/ATI won't want to talk as they are in direct competition, so this should mean no Tri/Quad Crossfire on Intel chipsets. AMD/ATI won't make chipsets to support Intel anymore. Nvidia can't really team up with AMD/ATI because they are in direct competition. Outcome:

Intel CPU / ATI or NVIDIA single card
AMD CPU / ATI cards (crossfire)
NVIDIA in the dark.


Has there been anything showing the performance gains of using 3 or 4 GPUs over 1 or 2 GPUs?
If so, please give a link, because I haven't seen any.

I'm just interested, because everyone is touting quad GPU, but no one has any real benchmarks showing what kind of gains it gets, or what power/heat running 4 GPUs consumes/produces.

December 5, 2007 11:09:01 PM

As of right now, no, there is no real gain for quad GPU, and there haven't been any real benchmarks yet.

Also, I wonder if the 790FX will support an older X2.
December 5, 2007 11:24:05 PM

dragonsprayer said:
OK guys,

Anyone have any info on Intel vs Nvidia for 2008?

Three cards in the Asus mobo? Will ATI make drivers?

What about Nvidia and Intel licensing?


OK, my opinion, and it's only an opinion. I think Nvidia has blown it big time. After AMD bought out ATI, AMD no longer had to depend on Nvidia to produce compatible chipsets. AMD could produce its own, and Nvidia could ride off into the sunset.

This would have been the perfect moment for Nvidia to side with Intel and help squash AMD/ATI. Instead, Nvidia decided to stand out on its own, making its own chipsets and video cards. But wait: by not licensing Intel, it made Intel support its rival, AMD/ATI, if Intel wanted to make a motherboard that supported two cards. So Intel told Nvidia to ride off into the sunset alone. Intel has the further option of refusing to license Nvidia to make motherboards for future Intel chips.

This leaves Nvidia in a precarious position. It can make video cards, which can be used in single-card form, but no SLI except on its own motherboards. It might be able to make motherboards, but if it loses licensing from both AMD and Intel, whose chip will go into them?

Somehow, this reminds me of 3DFX long ago. 3DFX had a good card, but it slowly got left behind and got bought out by another company. I'm not saying that will happen to Nvidia, but their dominance in using two or more cards may be disappearing. And who knows, AMD may fail in its CPU division but end up doing well enough in its video card division to survive, maybe making only mid-range and budget chips on the side.

Just some thoughts.
December 6, 2007 12:07:20 AM

Intel needs to make a GPU that will make Nvidia cry, and Nvidia needs to make a processor that will make Intel cry.
December 6, 2007 12:19:45 AM

NMDante said:
Has there been anything showing the performance gains of using 3 or 4 GPUs over 1 or 2 GPUs?
If so, please give a link, because I haven't seen any.

I'm just interested, because everyone is touting quad GPU, but no one has any real benchmarks showing what kind of gains it gets, or what power/heat running 4 GPUs consumes/produces.


This is the exact reason why I never mentioned performance. I haven't seen anything to support the use of 3 or 4 GPUs. If they get it scaling half as well as the current second GPU in Crossfire, you can bet it'll be a hit with enthusiasts.

AdamJ
"intel needs to make a GPU that will make nvidia cry and nvidia needs to make a processor that will make intel cry"

The latter is not going to happen.

Interesting times we have ahead...
December 6, 2007 12:48:41 AM

chookman said:
This is the exact reason why I never mentioned performance. I haven't seen anything to support the use of 3 or 4 GPUs. If they get it scaling half as well as the current second GPU in Crossfire, you can bet it'll be a hit with enthusiasts.

AdamJ
"Intel needs to make a GPU that will make Nvidia cry, and Nvidia needs to make a processor that will make Intel cry."

The latter is not going to happen.

Interesting times we have ahead...


Fair enough.

I was just curious because a lot of people keep using quad GPU as a selling point, but I have yet to see anything showing what kind of performance numbers it generates.
December 6, 2007 2:54:52 AM

NMDante said:
Fair enough.

I was just curious because a lot of people keep using quad GPU as a selling point, but I have yet to see anything showing what kind of performance numbers it generates.


From what I last read, you won't. ATI has yet to write any drivers to allow 3 or 4 GPUs. No drivers, no performance tests, just fancy pictures from AMD/ATI.
December 6, 2007 3:27:13 AM

I agree, sailer.

But you guys missed the point of this thread - what is the future?

AMD makes GPUs for Intel and Nvidia is locked out of chipsets?

Or Nvidia kisses Intel's butt?

I think many of you do not understand the issues with quad cores and Nvidia chipsets. HP (the world's largest PC maker) bought Voodoo, and what did they make?

They made a 680i AMD/ATI hybrid system (two 2900 XTs on an SLI mobo with home-built BIOS code). Why? The Nvidia/Intel feud.

Here's little me, the guy that only sells overclocked PCs, wondering what to do. So help me out! Damn it, ask your people what's going on with this feud!


Yes, ATI is still ATI - AMD might change their name to ATI the way things are going!

December 6, 2007 3:29:45 AM

Are you saying that an Nvidia motherboard won't support Yorkfield?
December 6, 2007 3:49:47 AM

Yes - is that not the case? Are you running a 9650?
December 6, 2007 8:31:40 PM

Current Nvidia chipsets don't support Yorkfield yet.
December 6, 2007 8:55:02 PM

Well, I just finished replacing my 8800 GTX with two HD 3870s in Crossfire. All the cards are Asus-manufactured, and I used 3DMark06 default settings for the following results:

12127 - 8800 GTX
11588 - HD 3870
14273 - HD 3870 Crossfire

I was skeptical about switching from the 8800 to two slower cards, but so far I'm very impressed. The colour is so much more vibrant on the ATI cards, and I always wanted to use the Crossfire ability of my X38 mobo anyhow. Each driver was set for maximum performance, except for the 16x aniso setting for improved IQ. To the OP: maybe these results will change your mind. Looks like SLI support is dead unless Nvidia start making their own mobos, lol.
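Just to quantify the scaling from those scores (a quick back-of-the-envelope calculation using only the figures posted above; 3DMark06 numbers don't translate directly into game framerates, so treat it as a rough indicator):

    # Scaling calculation based on the 3DMark06 scores posted above.
    gtx_8800 = 12127    # single 8800 GTX
    hd3870 = 11588      # single HD 3870
    hd3870_cf = 14273   # two HD 3870s in Crossfire
    print(f"Crossfire gain over one 3870: {hd3870_cf / hd3870 - 1:.0%}")       # ~23%
    print(f"Crossfire gain over the 8800 GTX: {hd3870_cf / gtx_8800 - 1:.0%}") # ~18%

So the second card adds roughly a quarter on top of a single 3870 in this particular test.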
December 6, 2007 9:06:49 PM

Scaj said:
Looks like SLI support is dead unless Nvidia start making their own mobos, lol.


Even if Nvidia made its own mobos, it might not do any good. Nvidia would have to get licensing from Intel and AMD/ATI to do it. Since Nvidia wouldn't give SLI licensing to Intel, would Intel license Nvidia for Yorkfield, etc.? And since ATI motherboards are a competitor to Nvidia motherboards, is ATI going to license Nvidia to make its motherboards - motherboards which wouldn't support ATI cards?

I can't really say how all this will turn out, but SLI may disappear as far as future chipsets go. And who knows, ATI cards and Crossfire might end up saving AMD.
December 7, 2007 12:56:17 AM

Intel did do a cross-licensing agreement with Nvidia. How long it lasts I do not know. It did cover support for the 45nm Penryn processors.

Did Intel get SLI in the cross-licensing agreement? Nope.

Nvidia's driver looks at what motherboard it is on, and if it detects an Intel chipset it will not load the SLI-enabled drivers. Will SLI work on Skulltrail? Yes. How is this done? Intel has included two Nvidia bridge chips on the board, which the Nvidia drivers detect and then allow SLI support.
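For anyone curious, the gating described above amounts to something like the sketch below. To be clear, this is purely illustrative - it is not Nvidia's actual driver code, and the two helper functions are hypothetical stand-ins for whatever hardware probing the real driver does:

    # Illustrative sketch only - NOT Nvidia's real driver logic.
    def detect_chipset():
        # Stand-in: a real driver would read this from the platform/PCI IDs.
        return "Intel X38"

    def has_nvidia_bridge_chips():
        # Stand-in: Skulltrail carries two Nvidia bridge chips the driver can see.
        return False

    def sli_allowed():
        chipset = detect_chipset()
        if chipset.startswith("nForce"):
            return True      # Nvidia's own chipsets: SLI enabled
        if has_nvidia_bridge_chips():
            return True      # Skulltrail case: Intel board, but Nvidia bridge chips present
        return False         # plain Intel/AMD chipset: no SLI

    print(sli_allowed())     # False with the stand-in values above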

December 7, 2007 2:19:08 AM

It ends with the 45nm parts, but I think quad core is exempt - the Nvidia quad-core issues may be due to a lack of support from Intel and not a lack of a licence. Regardless, there are far more issues with quad-core chips than dual-core, while Intel chipset mobos do not have these issues.

Skulltrail - wow, good stuff, something for me to look into. Thank you!
December 7, 2007 6:24:38 AM

My friend that works at the Hawthorne Farms campus says they are trying to pilot Skulltrail by early February. (This could change, of course.) It is going to be a beast as long as you use the DDR2-800 FB-DIMMs with low latency. They cost an arm and a leg, but I heard that the 3DMark06 CPU score was around 8000. I think the processors were around 4GHz. Of course, that is using the 150W TDP QX9775 processors. Those things will not come cheap.

I also heard that they will have a full complement of board and processor overclocking features in the BIOS.

Of course, you can still run any of the earlier Xeons based on the Core 2 Duo core along with the new 45nm cores (Harpertown). You can buy one processor and then upgrade to two at a later date.

The caveat is that you must have the same processors in both CPU sockets, otherwise the board won't boot.
December 18, 2007 8:26:44 PM

pausert20 said:
My friend that works at the Hawthorne Farms campus says they are trying to pilot Skulltrail by early February. (This could change, of course.) It is going to be a beast as long as you use the DDR2-800 FB-DIMMs with low latency. They cost an arm and a leg, but I heard that the 3DMark06 CPU score was around 8000. I think the processors were around 4GHz. Of course, that is using the 150W TDP QX9775 processors. Those things will not come cheap.


Hmm, doesn't really sound very beastly considering its very high-speed quad cores, or dual quads for that matter, and only 8000 in 3DMark06. Putting that in perspective, even a good 8800 GT or GTS can hit 10,000 to 11,000 with a fast quad core.
December 18, 2007 9:23:20 PM

warezme said:
Hmm, doesn't really sound very beastly considering its very high-speed quad cores, or dual quads for that matter, and only 8000 in 3DMark06. Putting that in perspective, even a good 8800 GT or GTS can hit 10,000 to 11,000 with a fast quad core.


He stated it was the CPU score alone, not the overall system score. That's roughly 2x the CPU score of my Q6600. If I remember correctly, there are three scores at the end: CPU, graphics, and overall. The CPU one is just the game running using the CPU only and not the GPU. Normally it is one of the game demos, and you can tell, since you get only like 1-10 FPS.

Skulltrail will be for the uber-enthusiasts. I am sure they will make a more affordable version for high-end users, but then again Nehalem will come out, and depending on what Nehalem can do, it may destroy Skulltrail with just one CPU. And then there may be a Skulltrail for Nehalem, which would be devastatingly powerful.

Here is a link for a picture of the mobo for it:
http://www.techpowerup.com/img/07-09-19/idf01_03.jpg

I think the northbridge (which might be X48) is between the CPUs and above the memory, the southbridge (ICH9R or maybe ICH10) is to the bottom left of the second CPU, and the two chips near the PCIe 2.0 slots are most likely the SLI chips, so it can support either CFX or Quad SLI.

There "may" be support for Yorkfeild on the 780 NVidia chipset but we can't be sure since NVidia is being stubborn and wants to have SLI on their chipsets only. I think its going to get worse from here. I feel sorry for those who bought a 680i being told it will support Yorkfeild and here they are unable to.

BTW, correct me if I am wrong, but Crossfire is not done via the chipset, is it? I think I remember reading it was done via the Crossfire-capable cards, but I might be wrong.
December 18, 2007 10:00:36 PM

jimmysmitty said:
BTW, correct me if I am wrong, but Crossfire is not done via the chipset, is it? I think I remember reading it was done via the Crossfire-capable cards, but I might be wrong.


I believe the answer is that the chipset must support Crossfire, or SLI in the case of Nvidia, while the cards themselves work together on the given chipset. I think what you're referring to is an earlier method where one ATI card was a "Master" card and the other card was a regular card. The cards are the same now. Of course, the driver package must be there with the cards.

This is part of what the Intel-Nvidia disagreement involved. Nvidia would not license Intel to make a chipset for their SLI technology. So Intel made their chipset to support Crossfire. It does seem a little weird that Intel is helping AMD in this, but it was Intel's only choice if it wanted to give support for two or more video cards.

Then again, I might be mistaken in my understanding of how it all works.
December 18, 2007 10:18:54 PM

I forgot that ATI changed it, although I thought their system was better since it didn't need chipset support. It just meant buying two cards, one master and one slave, putting them in any dual-AGP (at the time) or PCIe mobo, and bam, it would work.

But I guess everyone thought Nvidia's way was easier, although I don't see how that is....
December 18, 2007 11:48:49 PM

The 3870 is showing good DX10 results, and I have the tri-slot mobo.

I can only pray Asus does not let us down again with the Maximus Extreme and that the three-slot solution will work!

I hope 2x 3870s and another card will take on 2x 98xx's (unknown Nvidia G92 monsters) and the 780 chipset.

I am betting about 20k on it!


Note: any small computer builders looking to sell their systems nationally, PM me (you must have a site or store and an established track record) - site is under construction and national advertising in mid-2008 if all goes according to .......
December 19, 2007 12:19:47 AM

spoonboy said:
How do you rate them as power hungry? Where have you seen that? I think they (the 3870s) share roughly the same load consumption as the 8800 GT, but their idle usage is a fair bit lower.

Sod it, since I got roughed up on another thread for not posting links, here is one:

http://www.techspot.com/review/76-asus-radeon-hd-3870/p...


The 3870's peak draw is more than 80 watts less than the 2900 XT's.


Performance per watt; although they ain't that bad, the first gen (2900s) were crap.