
Nvidia has lost the SLI exclusivity war

July 15, 2008 12:45:20 PM

I'm not sure why this isn't big news, but according to this article, Nvidia boards will no longer be the only game in town for SLI once Nehalem launches. Unless Nvidia puts some BIG advantages and major performance and STABILITY into their new Nehalem-compatible boards, they are going to lose big time. The writing has been on the wall since their 680i series, which Intel made sure would NOT be compatible with Yorkfield quads, forcing Nvidia to release its 780i and 790i boards just to stay compatible.

http://www.digitimes.com/mobos/a20080715PD205.html
July 15, 2008 1:09:01 PM

Intel has not licensed any other company to use the LGA 1366 socket, meaning there will be no nVidia board; Intel is basically inviting a lawsuit on itself. nVidia tried to use SLI as leverage, but Intel wouldn't budge.

Personally, if it comes to SLI or Intel for me, as long as it's tri-SLI I'll take it.
July 15, 2008 1:09:22 PM

I don't think Nvidia really lost out on this deal - SLI will still only be supported through the BR04 chip on Intel chipsets, which pretty much means implementation is likely to be left to the discretion of the manufacturers, as Intel's default reference board is going to be designed with Crossfire in mind. It's already been stated that putting SLI on an X58 at this point would require a significant re-design.

I'm not into praising Nvidia motherboards, as I know a number have been trash (although I do own two that I'm perfectly happy with) - but I really have my doubts that a BR04-enabled X58 wouldn't suffer from worse problems.

Quote:
This is a significant licensing win for Intel, making it the only one of the three main x86 chipset makers who will be able to offer support for both SLI and AMD's Crossfire multi-GPU technologies (not to mention its own Larrabee technology when it eventually launches too), although it is possible not all board makers will choose to pay the extra cost for the nForce 200 chip.


I think this is being blown a bit out of proportion - as I said before, I really doubt SLI is going to end up being standard on Intel chipsets. My sincere recommendation for anyone even thinking about an Intel-chipset SLI motherboard: wait till the second generation, because the first generation is likely to be a rushed product.



That all having been said, I have a suspicion that this licensing deal is coming about because Nvidia might be trying to focus on GPUs instead of dividing its resources between GPUs and chipsets as much. With this SLI licensing deal being given to Intel, I'd expect one Nehalem-compliant chipset from Nvidia sometime next year, but that would likely be all - I doubt they will try to roll out a fleet of boards like 750i, 780i, 790i, 790i Ultra. Hopefully, anyway, it'll be just one good board.
July 15, 2008 1:10:03 PM



Then I saw that it would require an nForce 200 chip to work :( 
July 15, 2008 1:18:18 PM

Who cares?



Right now the sweet spot is an Intel CPU with an AMD GPU.


July 15, 2008 1:32:45 PM

Because maybe some people like to discuss technology news and if you have no interest in it then why are you posting here other than to (increase post count +1!!)
July 15, 2008 1:49:06 PM

I guess the bottom line is how well the implementation of SLI through the nForce chip will work on Intel boards. Personally, I have always viewed Intel boards as more stable with Intel chips, as it should be, but less feature-rich. However, Intel has awoken to the realization that unless it provides those rich features (overclocking), it loses out to the enthusiast-level boards provided by Nvidia. This has brought us great enthusiast-level boards like the P35 and X38 series and higher. I would not have minded having one of those boards if it hadn't been for their complete lack of SLI support.

Let's face it, until "very" recently AMD hasn't been able to keep up with a high-end SLI setup like dual 8800s or 9800s. Now they can, but that wasn't the case for a couple of years. I think the ballgame may change a lot if Intel manages to create a powerful, stable, high-performance board that supports both CF and SLI. Who wouldn't want a board that does it all? ... so long as it does it well, that is the question.

The ideal board for me is one that has three x16 PCIe 2.0 slots able to handle two high-end Nvidia cards, with the third slot for a PhysX-enabled card, helped along by an uber-fast overclocked Nehalem-class chip.
July 15, 2008 1:52:25 PM

ovaltineplease said:
Because maybe some people like to discuss technology news and if you have no interest in it then why are you posting here other than to (increase post count +1!!)


Who cares was probably a bad way of putting it...

"Who is this really gonna affect" would be a better way of saying it.



Nvidia are deep in the doo-doo right now with their big chip/big power/big heat method. They are gonna have to ditch their arrogance and move to a more AMD like approach - do they have too much pride to admit their mistake?


I do not see Nvidia striking back for a couple of years - especially given the work AMD has done on Crossfire and on putting two GPUs on one card. Although R700 does not have all the goodies we wished for, the fact that AMD/ATI have been mentioning them for a long time probably means they have been working towards them for even longer.

Nvidia have a lot of catching up to do in that respect.
July 15, 2008 1:54:35 PM

warezme said:
Lets face it, until "very" recently AMD hasn't been able to keep up with a high end SLI setup like dual 8800's or 9800's. Now they can but it wasn't the case for a couple of years.


Eh?


When it worked right, crossfire always scaled better than SLI. :??: 




(But it was much harder to get working right)
July 15, 2008 1:59:48 PM

Amiga500 said:
Who cares was probably a bad way of putting it...

"Who is this really gonna affect" would be a better way of saying it.



Nvidia are deep in the doo-doo right now with their big chip/big power/big heat method. They are gonna have to ditch their arrogance and move to a more AMD like approach - do they have too much pride to admit their mistake?


Don't be too quick to discount Nvidia. After all, aren't those the exact adjectives used to describe the 2900XT upon its release?

What has AMD done since then? Used the exact same architecture from that board, shrunk it (reducing thermals, increasing yields and performance), finished optimizing it, and simply put two in a package.

There is nothing stopping Nvidia from doing the exact same thing. Think about it: compared to the performance envelope of the original 2900XT, you could say Nvidia is actually ahead of them in that respect.
July 15, 2008 2:07:40 PM

This makes me think, hmmm. Intel wanted only ATI CrossFire to work on their boards; they want people to lean towards ATI. Since ATI wasn't doing better than Nvidia, Intel gave them permission to use Havok on their cards to make them more attractive. Makes me wonder... Intel and ATI allies?
July 15, 2008 2:37:44 PM

invisik said:
This makes me think, hmmm. Intel wanted only ATI CrossFire to work on their boards; they want people to lean towards ATI. Since ATI wasn't doing better than Nvidia, Intel gave them permission to use Havok on their cards to make them more attractive. Makes me wonder... Intel and ATI allies?


It's complicated - but deep down, Intel and AMD have always been both the closest of allies and the furthest of enemies... one cannot exist without the other, be it the complicated historical cross-licensing or the fact that without a reasonable competitor, Intel stands to be broken up by antitrust law. In the long run, Intel will not allow AMD to go under - better the devil you know.
July 15, 2008 2:38:05 PM

I commend Nvidia for holding off from giving Intel SLI. With SLI and CrossFireX, Intel looks waaaay better than AMD and Nvidia. With AMD being short of money, I hope they got something else for giving away CrossFireX; they basically have not much left.
July 15, 2008 2:59:25 PM

Intel X58 chipset to support SLI and CrossFire

Quote:
New chipset will feature Nvidia’s nForce 200 chip, allowing support for 3-way and 2-way SLI, as well as CrossFire support via the Intel chipset.

After denying Intel the use of its SLI technology in all but its top-end Tumwater and Skulltrail chipsets, Nvidia has finally given in and allowed Intel to make its nForce 200 chip a part of Intel’s forthcoming X58 ‘Tylersburg’ chipset for Nehalem CPUs.

Read more of Article at this link below.
http://www.custompc.co.uk/news/604443/intel-x58-chipset-to-support-sli-and-crossfire.html
July 15, 2008 3:21:19 PM

warezme said:

There is nothing stopping Nvidia from doing the exact same thing....



Apart from the fact that AMD has been heading down the road of heterogeneous CPU/GPU cores sharing resources ever since ATI was bought.


Obviously, shared resources impact multi-GPU as well as CPU/GPU.
July 15, 2008 3:46:31 PM

I see this as a way for nVidia to get in line. It allows them to have a competitive highest-end product. They can't continue doing the single huge chip and still have the lead. It's multi-core and/or SLI, no other choice. At the same time, AMD and Intel both have x86 licenses and want to go to Havok for the most part. AMD's reasoning is that taking work away from the GPU diminishes gaming quality, whereas physics can still be addressed by quad-core or greater CPUs, thus not affecting GPU performance. Intel wants this, as Larrabee will be an x86 product, which keeps everything within that realm.

Like I've said, nVidia has to quit sitting on its high horse and get it in gear. No more denying SLI, no more denying DX10.1, and it had better hope it can use its CUDA and Ageia abilities as quickly and strongly as possible, just to get a foothold against what's coming. I hope nVidia can do a great chipset for Nehalem and have great mobos; they need them, badly.
July 15, 2008 4:36:13 PM

JAYDEEJOHN said:
..... I hope nVidia can do a great chipset for Nehalem and have great mobos; they need them, badly.


Agreed,

I just don't see Larrabee being a big threat to either AMD or Nvidia, at least not for a long while.

Another thing I haven't really seen explored (oddly enough) is the distinct possibility of either AMD or Nvidia going the way of two cores on a single die, and eventually doing native multi-core GPUs, Phenom-style.

Doing this would greatly improve GPU performance, reduce package size, thermals, etc. It all comes down to who can shrink their GPUs the fastest. We saw that evolution (battle) on the CPU front, and I expect we will on the GPU front as well. This could help keep the GPU well ahead of Larrabee, but probably not forever.
July 15, 2008 4:45:48 PM

The 2900XT was only really bad with its release drivers, but since better ones were somewhat slow in coming, it was pretty much discounted. Go take a look at the GPU charts...

However, Nvidia is the company guilty of recycling the same GPU architecture throughout countless card launches with little or no performance gain and only minor technical changes.

The 2900XT was really the only bad launch ATI has had in a long while. The 9700/9800 started the ATI reign; the 6800 took the lead for a few months until the X800/X850 launched, which stayed competitive with the 7800 until it was stomped by the X1800XL/XT, which also outperformed the 7900 until the X1900XT/XTX came out and stomped it, with the X1950XT/XTX stomping it further.

The 8800 is really the only clear lead Nvidia has sustained through an ATI launch in several generations.

ATI has obviously adopted the AMD credo that design innovation is more important than the performance crown. Innovation leads to better performance anyway, which is evident with the 4800 series launch. ATI gained ground with the 3800 series for its price bracket, and has slaughtered Nvidia with the 4800s, as Nvidia has just been putting out rewashed G80 cores for nearly two years - the same thing they did with the 6000/7000 series of cards, where the architecture changed very little.
July 15, 2008 5:01:21 PM

iocedmyself said:
...
...ATI has obviously adopted the AMD credo that design innovation is more important than the performance crown. Innovation leads to better performance anyway, which is evident with the 4800 series launch. ATI gained ground with the 3800 series for its price bracket, and has slaughtered Nvidia with the 4800s, as Nvidia has just been putting out rewashed G80 cores for nearly two years - the same thing they did with the 6000/7000 series of cards, where the architecture changed very little.


Come now, both ATI and Nvidia are equally guilty of, as you put it, "rewashing" their technology. The 3800 series and 4800 series are all derivatives of the original 2900.

I believe Nvidia has just been more blatant about rebadging some of their old tech with new numbers and offering little to no compelling performance difference from their older models. This is a DISTINCT NO-NO in my book. It is a good thing that AMD is giving them an eyeful and bringing them down to earth. It is good for ALL of us.
July 15, 2008 5:23:48 PM

warezme said:
Come now, both ATI and Nvidia are equally guilty of, as you put it, "rewashing" their technology. The 3800 series and 4800 series are all derivatives of the original 2900.

I believe Nvidia has just been more blatant about rebadging some of their old tech with new numbers and offering little to no compelling performance difference from their older models. This is a DISTINCT NO-NO in my book. It is a good thing that AMD is giving them an eyeful and bringing them down to earth. It is good for ALL of us.


Are you sure the 3800 series and 4800 series came from the 2900 series?

I thought both generations involved major reworking of the TMUs and shaders (particularly the 4800 series). Also, don't they all use different memory? GDDR3, GDDR4, GDDR5.

I think it stands that ATI has done a good job with using new technology in recent launches.
July 15, 2008 5:25:02 PM

warezme said:
I guess the bottom line is how well the implementation of SLI through the nForce chip will work on Intel boards.... I think the ballgame may change a lot if Intel manages to create a powerful, stable, high-performance board that supports both CF and SLI. Who wouldn't want a board that does it all? ... so long as it does it well, that is the question.



Quoted for Truth



Regarding the original post/news - Thank you for posting it here, and I'm happy your post count has increased as a result :D 


**

Kind of a foregone conclusion though - there were some bits in the news a number of weeks ago about Intel not granting the necessary licences to nVidia for Nehalem-related technologies. As pointed out in the discussions at that time, the non-granting of the licence was perfectly predictable given the amount of garbage nV flings in Intel's direction. Not to mention it was a Great Big Crowbar to get Intel an SLI licence, by forcing further nVidia boards to be limited to AMD processors... and to be limited to such in the face of highly competitive Crossfire implementations and ATi products.

I can only imagine the twinge felt across the management types when the new ATi releases were found to be as good as they are, and the connection made that... "Hmmm... Intel told us to F off... ATi is driving the revenues at Advanced Micro Devices, instead of the processor folks... You can buy two ATi cards for the price of one of our 2**'s... Wow... If the graphics guys at AMD force AMD to reconsider their licence grant, our motherboard division would be up Sh*t Creek with no paddle... and no boat... and no water..."

nVidia may do some dumb things from time to time (such as engaging in a very public E-Peen contest with Intel...), but they're not *that* stupid.
July 15, 2008 5:26:01 PM

I would say ATI did less (came out with fewer products) and did more (went from R2xxx to R4xxx) than nVidia did going from the 8800 GTX to the GTX 280, with everything in between!