
X38 no SLi?

October 19, 2007 8:37:22 AM

WTF? Why would Intel support Xfire but not SLI? So does this mean that we will have to wait for Nvidia to come up with a new chipset to really match the new Intel line-up? This seems like a huge gaffe to me.

Was wondering if someone could fill me in on what I am missing... I just do not get why Intel would stiff such a large market segment, especially while supporting a CPU competitor.


October 19, 2007 9:37:42 AM

It is not Intel that is to blame for this, it is Nvidia. The X38 hardware is completely compatible with SLI, but Nvidia writes its drivers in such a way that they detect the presence or absence of Nvidia chips on the motherboard and, if there are none there, SLI is disabled. Nvidia figures they can thereby force you to buy an Nvidia-chipset motherboard if you want to use SLI and they can make more money out of you.

The question you should be asking is: "WTF? Why would Nvidia not allow SLI on Intel hardware?"
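For the curious, the lock-out is nothing exotic: conceptually it is just a vendor-ID check inside the driver. The C sketch below is purely illustrative (the structure and function names are hypothetical, not Nvidia's actual code; only the PCI vendor ID 0x10DE for Nvidia is real), but it shows the kind of gate that "hacked" drivers patch out:

/* Illustrative sketch only -- not Nvidia's actual driver code.
 * The pci_dev structure and function names are hypothetical;
 * 0x10DE is Nvidia's real PCI vendor ID. */
#include <stdbool.h>
#include <stddef.h>

#define PCI_VENDOR_NVIDIA 0x10DEu

struct pci_dev {
    unsigned short vendor_id;
    unsigned char  class_code;   /* 0x06 = bridge (chipset) device */
};

/* Walk the list of detected PCI bridge devices and report whether
 * any of them belongs to Nvidia, i.e. whether the driver is running
 * on an Nvidia-chipset motherboard. */
static bool on_nvidia_chipset(const struct pci_dev *devs, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        if (devs[i].class_code == 0x06 &&
            devs[i].vendor_id == PCI_VENDOR_NVIDIA)
            return true;
    }
    return false;
}

/* SLI is only offered when the chipset check passes. This one branch
 * is effectively what the hacked drivers remove or short-circuit. */
bool sli_allowed(const struct pci_dev *devs, size_t n)
{
    return on_nvidia_chipset(devs, n);
}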
October 19, 2007 11:53:18 AM

Ahh, see, now that is something I haven't seen in the articles I have read... so the slap upside the head goes to Nvidia :) 
October 19, 2007 12:55:45 PM

If the slots and bandwidth are there, couldn't someone (and I believe I've heard of this already for dual x8 platforms) write hacked drivers to allow SLI support on an Intel config?
October 19, 2007 1:25:02 PM

I've heard of it being done on other platforms, but not for the X38.
October 19, 2007 1:30:06 PM

R0B0T said:
If the slots and bandwidth are there, couldn't someone (and I believe I've heard of this already for dual x8 platforms) write hacked drivers to allow SLI support on an Intel config?
Well, yes and no: hacked drivers are available for SLI on (say) the 975X platform, but they're not good hacked drivers. I'm not sure they even support G80-series cards at all (they certainly didn't for a long time, and may well still not). The hacking is also likely to introduce bugs, and they will be based on older driver versions so they won't have all of the latest bugfixes and performance improvements from Nvidia. Using hacked drivers is very much an "at your own risk" phenomenon - I wouldn't want to take that risk myself.

On the plus side, if the rumours are correct, RV670 vs G92 will be a much more equal struggle than R600 vs G80 was: performance similar this time round and ATI's chip possibly being cheaper.

So the inability to support SLI may only matter to people who already own a pair of Nvidia cards - other people may not have a problem with going Crossfire instead. In fact, it's not impossible that Nvidia will actually lose money because of this decision: it's possible that there will be more X38 owners who don't buy Nvidia GPUs than there will be Nvidia GPU owners who don't buy X38 motherboards. I really hope that happens!
October 19, 2007 3:15:40 PM

Nvidia seems to be going farther and farther out on a limb. They won't support Intel chipsets because Intel is their rival. They won't support AMD/ATI chipsets because AMD/ATI is their rival. One of these days, both Intel and AMD/ATI may decide they don't need Nvidia at all for their CPUs. Then Nvidia may find itself with video cards that only fit Nvidia chipsets, but Nvidia chipsets don't support CPUs from either of the two major companies.

Something similar to that happened to 3DFX many years ago, when 3DFX supported Glide only, while the rest of the world was going to Microsoft's DX. 3DFX went from being a premier video card company to being, well, has anyone seen any 3DFX video cards for sale recently? Nvidia needs to be careful, in my opinion.
October 19, 2007 3:30:33 PM

immagikman said:
WTF? Why would Intel support Xfire but not SLI? So does this mean that we will have to wait for Nvidia to come up with a new chipset to really match the new Intel line-up? This seems like a huge gaffe to me.

Was wondering if someone could fill me in on what I am missing... I just do not get why Intel would stiff such a large market segment, especially while supporting a CPU competitor.

I read some articles where people testing the boards were able to run SLI, but they had special drivers and such, just for testing; they mentioned that we probably won't see it as consumers.
October 19, 2007 3:44:49 PM

Quote:
It's a joke, isn't it? When Intel released C2D, the only mobo available for it was Intel's own, which they charged a premium for ($260+), and it didn't support SLI either. Makes you wonder why they need to gouge people like that, knowing that the majority of them will want to upgrade that overpriced mobo to SLI eventually.
Kinda funny how the Intel X38 supports AMD/ATI Crossfire but not Nvidia SLI. I guess that dispels the rumor of Intel buying Nvidia.



MB - It has nothing to do with Intel not wanting to provide SLI. It's nVidia refusing to license SLI to Intel. Why? Because then they sell more nVidia chipsets to go with the video cards instead of merely collecting a smaller licensing fee. nVidia (probably rightly) are asserting their #1 performance position in the marketplace by refusing to play along with Intel. We can all be sure that Intel would rather pay someone *other* than AMD/ATI for the ability to run dual (or more) cards. However, Intel is taking the pragmatic approach of paying AMD/ATI for Crossfire rights because it's still cheaper than having to develop their own and thereby having to support both nVidia *and* ATI cards by themselves.
October 19, 2007 4:16:31 PM

All this only matters to those running two video cards. Most builders on a budget of some type can choose the Intel X38 and have a fine system with one card, be it NV or ATI.

As to hacked drivers, I would say watch out. I have had a number of friends get burned when they had modded drivers running and then had to do any type of XP re-install: the OS install just hangs when it gets to the point of installing the drivers.
October 19, 2007 4:43:14 PM

sailer said:
Something similar to that happened to 3DFX many years ago, when 3DFX supported Glide only, while the rest of the world was going to Microsoft's DX. 3DFX went from being a premier video card company to being, well, has anyone seen any 3DFX video dards for sale recently? Nvidia needs to be careful, in my opinion.
3dfx did a lot of things wrong, but an over-dependence on Glide was certainly not one of them. The games industry focused on Glide for a long time because 3dfx hardware was so much better than anybody else's that no serious gamer ever used anything else, and so there was no point in using an "open" API: the only effect would have been to make the games run slower. This situation persisted for quite some time, partly because the Unreal engine was so widely used and worked very well with Glide.

3dfx came unglued for a number of reasons, including:

- Rash involvement in the abortive "Voodoo Rush" project.
- Banshee underperforming in 3D when compared to Voodoo 2.
- Ill-advised decision to purchase STB (AIB manufacturer) and stop selling chips to anyone else.
- Delayed product releases.
- Misreading what consumers wanted. They did this first when Voodoo 3 didn't support 32-bit colour, and then (much more seriously) when Voodoo 5 didn't support geometry acceleration.

It was actually missed deadlines that did the most damage. Voodoo 5 was originally intended to go against the original GeForce 256, and could have been a very serious rival to it: Nvidia cocked up their design and had to drop the clock speed from 200MHz to 120, and the geometry acceleration was too weak to be of any practical use. But, as it was, Voodoo 5 didn't ship until it was going up against GeForce 2, which was very powerful and had useful geometry acceleration, which people decided they wanted a lot more than they wanted a T-Buffer.
October 19, 2007 4:44:07 PM

Dogsnake said:
All this only matters to those running two video cards. Most builders on a budget of some type can choose the Intel X38 and have a fine system with one card, be it NV or ATI.
On the other hand, if you're not using 2 video cards, it may be hard to justify choosing X38 over P35.
October 19, 2007 5:12:57 PM

nicolasb said:
3dfx did a lot of things wrong, but an over-dependence on Glide was certainly not one of them. [...]


I realize that there were more problems with 3DFX than just Glide, but I mainly remember having 3DFX cards and finding fewer and fewer games that supported them. As I recall, and my memory may not be perfect, Sierra was the last game maker that supported Glide. Glide was great at the time, but it got left behind.

Most assuredly, the initial lack of support for 32-bit color and the later lack of geometry acceleration hurt as well, along with the high prices being asked. Those were what drove me to buying Nvidia cards. The point I was trying to make was not about Glide itself, but the possibility that Nvidia could dig itself into a hole similar to the one 3DFX did. For those, including myself, who only use one card, this is no problem of course. Then again, I'm saving for a bigger monitor which will need two cards, especially using Vista, so where does that leave me except in ATI's camp?
October 19, 2007 5:16:27 PM

nicolasb said:
It is not Intel that is to blame for this, it is Nvidia. The X38 hardware is completely compatible with SLI, but Nvidia writes its drivers in such a way that they detect the presence or absence of Nvidia chips on the motherboard and, if there are none there, SLI is disabled. Nvidia figures they can thereby force you to buy an Nvidia-chipset motherboard if you want to use SLI and they can make more money out of you.

The question you should be asking is: "WTF? Why would Nvidia not allow SLI on Intel hardware?"


Hear hear... Nvidia, your chipsets suck. Give the people what they want!!!! :non: 
October 19, 2007 5:23:21 PM

SiriusStarr said:
Hear hear... Nvidia, your chipsets suck. Give the people what they want!!!! :non: 


Exactly, and this is what nicolasb wrote in reference to 3DFX: "Misreading what consumers wanted". If a consumer, like myself, wants two cards and Nvidia won't allow SLI on Intel hardware, then I have to buy two ATI cards. It's all well and good to stand on principle, but if they shoot themselves in the foot, it gets hard to stand.
October 20, 2007 1:46:26 AM

No point beating around the bush: Nvidia is certainly doing much better in the GPU war right now. So, does Nvidia have a new chipset coming out soon? I'd hate to choose the X38 and not be able to use SLI with Nvidia. I want to get a new computer when Penryn comes out, so I'm stuck with the 680i unless something brand new from Nvidia comes out soon.

Suggestions from anyone? I won't be going dual graphics right away, but I'm going to have to choose my alliance by the new year or so.
October 20, 2007 3:53:59 AM

Nvidia is supposed to come out with the 780i chipset, but I would still rather have the X38...
October 20, 2007 6:21:10 AM

Well, the 680i isn't a bad board; my favorite builder uses them :) 

More important to me, though, is three full x16 PCIe slots: two high-end cards in SLI and a third mid-range card for multi-monitor support.
I'll be running the SLI cards to a 40" 1920x1080 display and the third card driving two 30" side monitors. I should be well set for years to come. :) I'll have to look at the specs of the 7xx-series chips from Nvidia... ATI got on my bad side years ago and I just don't have an interest in their Crossfire product. (When I say years ago, Matrox was a big player in the video market... yes, I'm old.) Age and eyesight being what they are :) big monitors are more important than higher resolutions on smaller screens.
October 20, 2007 9:05:20 AM

immagikman said:
Well, the 680i isn't a bad board; my favorite builder uses them :) [...]


my hero :pt1cable: 
October 20, 2007 2:00:41 PM

sailer said:
I realize that there were more problems with 3DFX than just Glide, but I mainly remember having 3DFX cards and finding fewer and fewer games that supported them. As I recall, and my memory may not be perfect, Sierra was the last game maker that supported Glide. Glide was great at the time, but it got left behind.
What the hell are you talking about? Are you trying to suggest that 3dfx cards couldn't run Direct3D or OpenGL? Of course they could! And a great deal better than any other card could at least up until the Riva TNT2 came out.
October 20, 2007 2:03:05 PM

dashbarron said:
No point beating around the bush that Nvidia is certainly doing much better in the GPU war right now.
Yeah, but wait till the end of November, and things may look very different.

dashbarron said:
So, does Nvidia have a new chipset coming out soon?
Yes, but it's pretty much exactly the same as the old one: the north bridge chip is identical.
October 20, 2007 2:47:07 PM

nicolasb said:
What the hell are you talking about? Are you trying to suggest that 3dfx cards couldn't run Direct3D or OpenGL? Of course they could! And a great deal better than any other card could at least up until the Riva TNT2 came out.


I know they could run Direct3D and OpenGL. Glide was simply considered better at one time. And they were a good card for a time. I owned a couple of them. Even did their version of SLI. But time moved on and they didn't. So I eventually bought Nvidia cards. And I've had a couple ATI cards in the past, when ATI was at the top of the heap.

My main wish is that both Nvidia and ATI could be used on the same motherboards so that a person didn't have to choose the motherboard based on which video card he was planning to buy, at least if he's planning on buying two cards. This proprietary stuff when it comes to using two cards is a total pain.
October 20, 2007 3:14:57 PM

Doesn't really matter; the X38 was DOA. No real advantages over the G35, it isn't any better an overclocker, and it costs more. Intel will be changing sockets for next-gen chips, and DDR3 is not worth the money. I waited on the X38 before buying a Blitz Formula, which has every feature under the sun at around $250. It will support a 500MHz FSB on air (chipset), and companies are almost giving away DDR2 RAM.

Besides, Intel was clear from the beginning that it would be Crossfire only. All the rumors about SLI support came from the usual sources. Let's see what happens next year.
October 20, 2007 3:31:27 PM

I read on TechPowerUp and a couple of other sites that an X48 chipset is being introduced in a couple of months, and that is one reason there have not been many X38 boards produced. For those who aren't in a hurry, the X48 might well be worth waiting for. From what I understand, it's mainly an updated X38, with a couple of the bugs fixed.
November 8, 2007 2:08:52 PM

Quote:
Something similar to that happened to 3DFX many years ago, when 3DFX supported Glide only, while the rest of the world was going to Microsoft's DX. 3DFX went from being a premier video card company to being, well, has anyone seen any 3DFX video cards for sale recently? Nvidia needs to be careful, in my opinion.


nVidia bought out 3dfx. Now does this make more sense?
November 8, 2007 3:15:49 PM

nicolasb said:
Yeah, but wait till the end of November, and things may look very different.

Yes, but it's pretty much exactly the same as the old one: the north bridge chip is identical.


If the northbridge chip is identical to the 680i's, wouldn't it be possible to enable PCIe x16 2.0 on existing 680i motherboards via a firmware or BIOS flash? The reason I ask is that I am currently in the market for an SLI 8800GT PCIe 2.0 motherboard; of course the Intel chipsets support it, but there are no Nvidia chipsets that support it as of yet, right?
November 8, 2007 3:55:53 PM

tvh said:
If the northbridge chip is identical to the 680i's, wouldn't it be possible to enable PCIe x16 2.0 on existing 680i motherboards via a firmware or BIOS flash? The reason I ask is that I am currently in the market for an SLI 8800GT PCIe 2.0 motherboard; of course the Intel chipsets support it, but there are no Nvidia chipsets that support it as of yet, right?


Probably would not work.

No, there are no current Nvidia chipsets that support PCIe 2.0.

Also, there is no reason to buy a PCIe 2.0 motherboard at the moment: the 8800GT works fine in PCIe 1.1 and does not come close to using up the PCIe 1.x bandwidth...
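To put rough numbers on the bandwidth point (back-of-envelope only, using the commonly quoted effective per-lane rates of 250 MB/s for PCIe 1.1 and 500 MB/s for PCIe 2.0, after 8b/10b encoding overhead):

/* Back-of-envelope PCIe x16 bandwidth comparison. */
#include <stdio.h>

int main(void)
{
    const int lanes = 16;
    const double gen1 = 250.0;  /* MB/s per lane, PCIe 1.1 */
    const double gen2 = 500.0;  /* MB/s per lane, PCIe 2.0 */

    printf("PCIe 1.1 x16: %.1f GB/s\n", lanes * gen1 / 1000.0);
    printf("PCIe 2.0 x16: %.1f GB/s\n", lanes * gen2 / 1000.0);
    /* Prints 4.0 GB/s vs 8.0 GB/s. A single 8800GT does not come
     * close to saturating even the 4 GB/s of a Gen-1 x16 slot,
     * which is why PCIe 2.0 buys you little with today's cards. */
    return 0;
}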
November 8, 2007 4:20:27 PM

I only scanned this thread, and hopefully I did not miss something... but what about the Intel Skulltrail platform? That will have SLI and Crossfire, etc., and is due Q1 2008.
November 9, 2007 2:05:17 AM

jwolf24601 said:
Probably would not work.

No, there are no current Nvidia chipsets that support PCIe 2.0.

Also, there is no reason to buy a PCIe 2.0 motherboard at the moment: the 8800GT works fine in PCIe 1.1 and does not come close to using up the PCIe 1.x bandwidth...


I don't have an old Socket 775 processor to use to flash the 680i board... I assumed the 780i would support Penryn out of the box.
November 9, 2007 9:14:35 AM

zingers said:
I only scanned this thread, and hopefully I did not miss something... but what about the Intel Skulltrail platform? That will have SLI and Crossfire, etc., and is due Q1 2008.


Intel have bought the rights to use SLI on their Skulltrail boards only. But seriously, who wants to buy Intel boards? (Maybe this will change with these boards, though?)

As a general point, you have to lay the blame at both parties' doors (though to what extent, I don't think anyone knows). Intel are as much to blame for not giving (enough) money to nVidia for the rights to use their brand (SLI) and incorporate their technology. Intel will make lots of money at nVidia's expense, therefore it's not unreasonable for nVidia to want to get in on that action. Who knows what actually happened at the negotiating table, though.