Nvidia vs Intel - Intel gets revenge! Help!

dragonsprayer

Splendid
Jan 3, 2007
3,809
0
22,780
OK, we all know now that Barcelona is doomed; even when the microcode is fixed it's a second-rate chip - old news. Next...... ATI is back, the 3870 is a good card, and you can get two in Crossfire with some nice results. ATI still looks better than Nvidia - the color is just better!

We all know that Intel licensed Nvidia for roughly four years to make chipsets, and that relationship is ending. Nvidia was not invited to the 45nm launch, and this time there may not be a 680i driver for 45nm chips. The new 780i chipset for 45nm may not even get made?

My old 975X mobo was designed for SLI, but Nvidia stuck their nose in the air, thumbed it at Intel, and claimed Intel did not know how to make chipsets. Well, yes, Nvidia chipsets are fast, but they are hot.

Now what is all this crap about quad-core issues on Nvidia chipset mobos and SLI issues this year? Is it possible that Intel has been waging a war on Nvidia? Will Intel decide not to buddy up with Nvidia?

I do not know; if you do, please tell me!

I need help: should I be building SLI prototypes or Crossfire?

How about the Maximus Extreme! Dual x16 Crossfire, DDR3, and the third PCIe slot for a RAID card - or even three cards????

Anyone think the three-card solution will work with the Maximus mobo?

Feel free to PM me if you'd like to talk in more detail, as I am building systems with all of the above parts.
 

David345

Distinguished
Nov 20, 2007
3
0
18,510
I don't know much about the situation between Nvidia and Intel, but I think it would only be worth having two ATI cards on the Maximus Extreme, as it only has two PCIe slots at a full x16 each. If you have three cards, the third one will be limited to x8 (I think), so that card would not be able to show its true potential (see the rough bandwidth sketch below).

I’m not an expert though and if anyone would like to correct me feel free to do so.
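For a rough sense of what that lane limit would mean, here is a minimal back-of-the-envelope sketch of theoretical one-way PCIe bandwidth. It assumes the usual ~250 MB/s per lane for PCIe 1.x and ~500 MB/s per lane for PCIe 2.0; the actual electrical width of the Maximus Extreme's third slot is an assumption here, not a checked spec.

```python
# Back-of-the-envelope PCIe bandwidth, one direction, after 8b/10b encoding.
# Assumed per-lane figures: ~250 MB/s (PCIe 1.x), ~500 MB/s (PCIe 2.0).
def pcie_bandwidth_gb_s(lanes, mb_per_lane=250):
    """Approximate one-way slot bandwidth in GB/s."""
    return lanes * mb_per_lane / 1000.0

print(pcie_bandwidth_gb_s(16))       # x16 gen 1: ~4.0 GB/s
print(pcie_bandwidth_gb_s(8))        # x8 slot:   ~2.0 GB/s, half the headroom
print(pcie_bandwidth_gb_s(16, 500))  # x16 gen 2: ~8.0 GB/s
```

Whether that halved bandwidth actually costs frames depends on the game, but it is the reason a third card in a narrower slot may not show its full potential.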
 

spoonboy

Distinguished
Oct 31, 2007
1,053
0
19,280


How do you rate them as power hungry? Where have you seen that? I think the 3870s roughly match the 8800GT's load consumption, but their idle usage is a fair bit lower.

Sod it, since I got roughed up on another thread for not posting links, here is one:

http://www.techspot.com/review/76-asus-radeon-hd-3870/page8.html


The 3870's peak draw is more than 80 watts less than the 2900 XT's.
 

spoonboy

Distinguished
Oct 31, 2007
1,053
0
19,280
On another note, I'd like to bring up the differences between TechSpot (and just about every other review I've seen of the 3870 and 2900) and the benchmarks from xbitlabs.com.

http://www.xbitlabs.com/articles/video/display/leadtek-8800gt-extreme_15.html#sect1

http://www.techspot.com/review/76-asus-radeon-hd-3870/page6.html

Check out the differences in the 2900 XT's STALKER results. Is it me - and I'm an abject fan of the R600 and its derivatives, so feel free to flame - or is there something seriously wrong with the ATI results in Xbit Labs' benchmarks? I know this is deviating from the topic; I apologise in advance and won't complain if you decide to rip large meaty chunks out of me :bounce:
 

spoonboy

Distinguished
Oct 31, 2007
1,053
0
19,280
Hmmmmmm... thanks for that one! I would have thought the Vista factor in Xbit Labs' review would tear large holes in the framerate, but legionhardware.com used Vista in their review too.

http://www.legionhardware.com/document.php?id=703&p=5

TechSpot used XP in their review, but Quake Wars was run with 4xAA and 16xAF, the same as in Xbit Labs', and the results are a fair bit different too:

http://www.techspot.com/review/76-asus-radeon-hd-3870/page5.html

http://www.xbitlabs.com/articles/video/display/leadtek-8800gt-extreme_14.html#sect0

 

warezme

Distinguished
Dec 18, 2006
2,450
56
19,890
At this point I don't see why people are even bothering with 680is, P38s, whatever, or new 3870s or 8800s. So they can have them for six months, maybe? Then the new 9000-series GFX comes out and Nehalem-socket mobos start to hit the streets. You will need a whole new everything by then or get left behind. Yes, it will happen again eventually, but it's better to get into new hardware toward the beginning of a whole new socket and GPU design as opposed to the end of it.
 

spoonboy

Distinguished
Oct 31, 2007
1,053
0
19,280
Yeah, totally. However, you can just trawl eBay for second-hand performance cards. It's a cheap upgrade path, lol. In May I got a superclocked EVGA 7900GT, still in the cellophane, for 80 quid. See what pickings there are in February when the new Nvidias come out. :)
 


chookman

Distinguished
Mar 23, 2007
3,319
0
20,790
Why are we still calling ATI... ATI??? I thought they were getting rid of the name... I've been calling ATI... AMD for the last six months.

To my knowledge, Tri and Quad Crossfire will be based off AMD chipsets and won't be enabled on Intel's... meaning if you want Tri or Quad Crossfire you have to buy an AMD CPU (Spider platform).

In any case, it's going to be interesting either way... Intel looks like they are against NVIDIA, so no SLI on Intel chipsets. Intel and AMD/ATI won't want to talk as they are in direct competition, so this should mean no Tri/Quad Crossfire on Intel chipsets. AMD/ATI won't make chipsets to support Intel anymore. NVIDIA can't really team up with AMD/ATI because they are in direct competition. Outcome:

Intel CPU / ATI or NVIDIA single card
AMD CPU / ATI cards (crossfire)
NVIDIA in the dark.
 
Actually, you missed one thing, chookman: Larrabee. So soon it will probably be better to pair up an Intel CPU with an Intel GPU, since they are planning on using a CPU-style instruction set for the GPU. So we could see the very first 2GHz+ GPU coming out from Intel.

So NVIDIA may be left in an even darker dark, since Larrabee will also have dual+ options, all running DX10.1 and PCIe 2.0+.

But Nvidia brought it on themselves, really. They should work with Intel instead. Also, Nvidia's chipsets are great for SLI, but they don't have the same memory performance as an Intel chipset.
 

NMDante

Distinguished
Oct 5, 2002
1,588
0
19,780


Has there been anything showing the performance gains from using 3 or 4 GPUs over 1 or 2 GPUs?
If so, please post a link, because I haven't seen any.

I'm just interested, because everyone is touting quad GPU, but no one has any real benchmarks showing what kind of gains it gets, or how much power/heat running 4 GPUs consumes/produces.

 

sailer

Splendid


OK, my opinion, and it's only an opinion: I think Nvidia has blown it big time. After AMD bought out ATI, AMD no longer had to depend on Nvidia to produce compatible chipsets. AMD could produce its own, and Nvidia could ride off into the sunset.

This would have been the perfect moment for Nvidia to side with Intel and help squash AMD/ATI. Instead, Nvidia decided to stand on its own, making its own chipsets and video cards. But wait: by not licensing SLI to Intel, it forced Intel to support its rival, AMD/ATI, if it wanted to make a motherboard that supported two cards. So Intel told Nvidia to ride off into the sunset alone. Intel has the further option of refusing to license Nvidia to make chipsets for future Intel chips.

This leaves Nvidia in a precarious position. It can make video cards, which can be used in single-card form, but not in SLI except on its own motherboards. It might be able to keep making chipsets, but if it loses licensing from both AMD and Intel, whose CPUs will they support?

Somehow, this reminds me of 3dfx long ago. 3dfx had a good card, but it slowly got left behind and was bought out by another company. I'm not saying the same will happen to Nvidia, but its dominance in two-or-more-card setups may be disappearing. And who knows, AMD may fail in its CPU division but end up doing well enough in its video card division to survive, maybe making only midrange and budget chips on the side.

Just some thoughts.
 

AdamJ

Distinguished
Aug 21, 2004
198
0
18,680
Intel needs to make a GPU that will make Nvidia cry, and Nvidia needs to make a processor that will make Intel cry.
 

chookman

Distinguished
Mar 23, 2007
3,319
0
20,790


This is the exact reason why I never mentioned performance. I haven't seen anything to support the use of 3 or 4 GPUs. If they get it scaling half as well as the current second GPU in Crossfire, you can bet it'll be a hit with enthusiasts.

AdamJ
"intel needs to make a GPU that will make nvidia cry and nvidia needs to make a processor that will make intel cry"

The latter is not going to happen.

Interesting times we have ahead...
 

NMDante

Distinguished
Oct 5, 2002
1,588
0
19,780


Fair enough.

I was just curious because a lot of people keep using quad GPU as a selling point, but I have yet to see anything showing what kind of performance numbers it generates.
 

sailer

Splendid


From the last I read, you won't. ATI has yet to write any drivers that allow 3 or 4 GPUs. No drivers, no performance tests, just fancy pictures from AMD/ATI.
 

dragonsprayer

Splendid
Jan 3, 2007
3,809
0
22,780
I agree, sailer.

But you guys missed the point of this thread: what is the future?

AMD makes GPUs for Intel systems while Nvidia is locked out of chipsets?

Or Nvidia kisses Intel's butt?

I think many of you do not understand the issues with quad cores and Nvidia chipsets. HP (the world's largest PC maker) bought Voodoo, and what did they make?

They made a 680i AMD/ATI hybrid system (two 2900 XTs on an SLI mobo with home-built BIOS code)? Why? The Nvidia/Intel feud.

Here's little me, the guy who only sells overclocked PCs, wondering what to do. So help me out! Damn it, ask your people what's going on with this feud!


Yes, ATI is still ATI - AMD might change their name to ATI the way things are going!