Nvidia to make chipsets for Via CPUs???!!!

March 25, 2008 7:27:42 PM

Take this with a shaker of salt; it's from The Inquirer.

http://www.theinquirer.net/gb/inquirer/news/2008/03/25/...

So, will we see nice and cool low-power Via CPUs on Nvidia chipsets with quad SLI? Will Nvidia basically lend engineers to Via to build a CPU that can compete with Intel, sort of the way IBM does research for AMD? What is the world coming to?

Well, it's coming to a merger of GPU cores onto multicore CPUs. That won't make discrete GPUs obsolete at the mainstream and performance end, but it's the future of notebooks and of hybrid CrossFire and SLI. Will the only CPU choices for Nvidia card fans be those from AMD and Via? How will Larrabee compete against both the ATI and Nvidia brands on the discrete GPU front? Will Intel's GPU-on-a-CPU compete against Swift, or will it be a bust as news articles speculate?

http://www.dailytech.com/Intel%20Discusses%20GPU%20Hybr...

Quote:

Auburndale and Havendale will have two Nehalem cores paired with a graphics subsystem. The twin cores will share 4MB of L2 cache and feature an integrated dual-channel memory controller that supports memory configurations up to DDR3-1333.

The graphics subsystem will be initially derived from Intel's G45 integrated graphics. This indicates that neither Auburndale nor Havendale will be for heavy graphics processing, but will be more of an integrated graphics replacement.
According to Intel roadmaps, the new processors are expected to enter the market in the first half of 2009. This beats out the expected time of arrival of AMD's Fusion processors, which are planned to debut in the second half of 2009.


Though Larrabee might work out well, I don't expect Nehalem with a G45 core to be a hit compared to Deneb with a 4xxx core. Intel is aiming for a better CPU with a weaker GPU, while AMD is headed in the opposite direction, unless Deneb surprises us more than we expect.

Where is Nvidia headed? Will they work to make Via CPUs competitive so people will buy their chipsets? Or is this simply a low-power notebook and business desktop market that won't have an impact on gamer and video-oriented enthusiast desktops?
March 25, 2008 8:38:57 PM

I'm not really surprised. More than likely they will make processors, and only their processors will work with Nvidia chipsets, which means no more SLI with Intel processors. PREPARE FOR THE GREATEST RIP-OFF THE WORLD HAS SEEN.
March 25, 2008 9:28:46 PM

Nice find. I had a good laugh at the Inquirer one.
March 25, 2008 10:54:45 PM

Well, I doubt VIA will get anywhere. They might go for ultra-low-power mobile machines like Intel's Atom is targeting. But that's the thing: how well can what VIA has compete against Atom in the price/performance-per-watt segment? From what I have seen, Atom is very low in power usage with great performance for that segment.

As for Larrabee, considering that a lot of game design companies are going towards ray tracing and even Intel has a ray tracing engine called IntelRT, I think Larrabee will be set for the next generation in gaming/PC graphics. But there's no way to tell what it will do before it's out.

As for what the media/analysts think of Intel's CPU/GPU combo, it's just garbage. I used to visit bloomberg.com at work all the time. Every time AMD had something come out, the article would talk about how great it was. Same with Intel or NVidia.

The only problem I have with what they are saying is that they cannot prejudge Intel. Considering the R&D budget Intel has and the size of their software department, I wouldn't be surprised if their CPU/GPU combo does very well. Of course, we have to wait and see.
March 26, 2008 2:44:29 AM

iMAKEtheDEVIL CRY said:
I'm not really surprised. More than likely they will make processors, and only their processors will work with Nvidia chipsets, which means no more SLI with Intel processors. PREPARE FOR THE GREATEST RIP-OFF THE WORLD HAS SEEN.


I doubt that NVIDIA will make processors; it sounds like they just agreed to make chipsets for VIA's CPUs. That is good for both sides, as it lets VIA concentrate its resources on making only CPUs (which apparently is their goal) and gives NVIDIA another market for its already-existing chipset business. It wouldn't be logical for NVIDIA to make general-purpose desktop/notebook processors for a whole host of reasons:

1. The CPU would have to support 32-bit x86 operation in order to be anything but a niche product today. This would require licensing directly from Intel, as the existing x86 agreements are IIRC non-transferable. Perhaps NVIDIA could make a CPU that *only* used x86_64 long mode, which might bypass Intel's agreements, but it would take several years after the first x86_64-only version of Windows ships before that could happen; either that, or something like Linux that is more ISA-independent pushing out x86 Windows as the dominant general-purpose desktop/laptop OS.

2. Economies of scale. It takes a ton of money to create a competitive desktop/laptop CPU microarchitecture and a ton of money to develop the cutting-edge fabs to produce those CPUs. Unit prices have dropped as uarch and fab complexity have gone up, leading to the few-large-companies situation we have today. You could try going fabless to cut it as a smaller company, but then you'll usually be behind the fab-holding CPU companies (particularly Intel) in bringing a CPU on a given process node online, and you'll also carry overhead those companies do not have in paying the foundry.

3. It's not their area of expertise, and they don't critically need that market to sell their core products. NVIDIA's core products are GPUs, not chipsets. NVIDIA can always count on there being a market for discrete GPUs, as discrete GPUs are peripherals and use standard bus interfaces. I'd be willing to say they'd be able to produce chipsets for AMD boards as well, considering that the onboard interfaces needed are public spec (HyperTransport). Some may say that NVIDIA's current predicament looks like the counterpart of AMD's predicament after the demise of Super Socket 7, but it's different. A CPU *has* to have specific chipset support in order to function, so AMD *had* to have chipset support it could count on 100% of the time to assure itself of a market for its CPU sales. GPUs only care that they have the correct type of public-specification expansion slot to be plugged into.

I personally think that if NVIDIA were to get into CPUs, it'd be embedded/ultra-mobile units. Those are relatively simple in design and construction compared to desktop units and also do not need the latest fab technology to be produced profitably and effectively. There is no single overwhelmingly-used OS in embedded and ultra-mobile devices like routers, cell phones and PDAs the way there is in desktop and notebook computers, so you can get away with using a non-x86 ISA. The segment is also very fast-growing and there are many smaller companies making ICs, so NVIDIA would not be outclassed the way it would be if it instantly burst onto the x86 scene.
March 26, 2008 3:17:12 AM

MU_Engineer, nice theories. Although even trying to bypass Intel's agreements would still be hard if the chip uses any part of the x86 architecture.

As for them going through Linux with a different ISA: until it becomes as standardized as Windows, it won't push Windows out as the dominant OS. The only reason Windows will remain the dominant OS is compatibility, ease of use and standardization. It's easier for developers and people to have only one OS to develop for and use/learn instead of 3, 4 or 5.

I for one can see NVidia getting into the embedded/ultra-mobile space, but that's still hard to crack with Intel, VIA and a few others already dominating. They will probably do like ATI did and start developing GPUs for that arena, like cell phones and PDAs, but I very much doubt they'll get into the desktop/server/laptop CPU arena. Too much money to invest, and it would take too much time.

I am guessing that AMD's move to buy ATI was mainly for the chipsets rather than Fusion, as having their own standardized chipsets will benefit them in the CPU market, whereas relying on VIA, ATI and others for chipsets was causing problems. I remember VIA chipsets used to have problems correctly ID'ing Athlon XP CPUs and would require manual BIOS tweaking.

But now AMD has their ATI chipsets and will probably phase NVidia's chipsets out of practical use for AMD chips, especially with hybrid CF. That might be why NVidia is going with VIA, so they will have more than just Intel CPUs and of course to make more money, right?
March 26, 2008 4:16:28 AM

The VIA Isaiah chips will do fine vs. Intel.
Yes, they will.
March 26, 2008 5:06:02 AM

Isaiah has been demonstrated playing Crysis in a UMPC; it will compete fine. I really hope we see them become a strong desktop competitor in the future.
March 26, 2008 5:57:19 AM

jimmysmitty said:

But now AMD has their ATI chipsets and will probably phase NVidia's chipsets out of practical use for AMD chips, especially with hybrid CF. That might be why NVidia is going with VIA, so they will have more than just Intel CPUs and of course to make more money, right?


I'm guessing that AMD wanted ATI chipsets and GPUs. Swift looks to be the future of notebooks, so having chipsets alone would not work, because AMD chipsets with "old fashioned" IGPs won't be competitive vs. GPU-on-CPU in the future.

Nvidia might be forced to give up SLI and go Crossfire. I can't see Intel ditching a free Crossfire standard for their discrete cards. Nvidia might not get an Intel license for Nehalem chipsets; if they lose their AMD license, then their chipset business is only Via. More shaker of salt from The Inquirer:

Quote:

License stand-off

That's where we touch the sensitive spot: the rumours circulate - bear in mind, just rumours - that Nvidia might not get the QPI license after all. It didn't deliver on the Xeon side, and its value add is far less in the Nehalem era: so, why give it to them at all?

Well, QPI block would cut Nvidia off ALL future Intel chipset business, and, quite possibly, future high-end tightly-coupled QPI-based GPU market where this connection may enable important performance wins.

http://www.theinquirer.net/gb/inquirer/news/2008/03/24/...

I would like to see Via put out some decent CPUs. It would bring back the days when a Cyrix 486DLC on a 386 board was a viable upgrade from a 386SX CPU/motherboard. Back then, Intel CPUs were expensive, as were AMD's, so I went Cyrix for a year or so. I still have the CPU in a drawer, though the motherboard got lost years ago.
March 26, 2008 7:58:51 AM

Hopefully Via will merge with AMD/ATI and AMD will finally be able to compete in the CPU market. Those VIA people are good at what they do. But that's probably not going to happen, and we're back to reality.
March 26, 2008 12:35:10 PM

yipsl said:
I'm guessing that AMD wanted ATI chipsets and GPUs. Swift looks to be the future of notebooks, so having chipsets alone would not work, because AMD chipsets with "old fashioned" IGPs won't be competitive vs. GPU-on-CPU in the future.

Nvidia might be forced to give up SLI and go Crossfire. I can't see Intel ditching a free Crossfire standard for their discrete cards. Nvidia might not get an Intel license for Nehalem chipsets; if they lose their AMD license, then their chipset business is only Via. More shaker of salt from The Inquirer:

Quote:

License stand-off

That's where we touch the sensitive spot: the rumours circulate - bear in mind, just rumours - that Nvidia might not get the QPI license after all. It didn't deliver on the Xeon side, and its value add is far less in the Nehalem era: so, why give it to them at all?

Well, QPI block would cut Nvidia off ALL future Intel chipset business, and, quite possibly, future high-end tightly-coupled QPI-based GPU market where this connection may enable important performance wins.

http://www.theinquirer.net/gb/inquirer/news/2008/03/24/...

I would like to see Via put out some decent CPUs. It would bring back the days when a Cyrix 486DLC on a 386 board was a viable upgrade from a 386SX CPU/motherboard. Back then, Intel CPUs were expensive, as were AMD's, so I went Cyrix for a year or so. I still have the CPU in a drawer, though the motherboard got lost years ago.



I just think that Fusion was an afterthought that AMD decided to go after. Of course an IGP cannot compete with a discrete GPU, but in the business world an IGP works just fine and is much cheaper than a GPU. But Nehalem's CPU/GPU combo and AMD's Fusion (unless they are only focusing on the laptop market with Swift) might be cheaper. Either way, they will both be competing.

As for NVidia dropping SLI, probably not. Even in the GPU market there can't be just one standard, and CF-only would mean AMD would technically have a monopoly on multi-GPU setups, since NVidia would have to license it from AMD/ATI. But NVidia is causing their own problems since they won't license SLI to Intel. If more Intel chipsets had SLI support, NVidia would make more money from being able to sell more GPUs. Some people are willing to go ATI because they want an Intel chipset over NVidia, and those who want multi-GPU can then only do CF, hence NVidia loses sales that way.
March 27, 2008 4:27:30 AM

jimmysmitty said:
...unless they are only focusing on the laptop market with Swift....


Swift makes sense on both notebooks and OEM PCs. That's a money-making market where performance matters. So I can see them ditching IGP solutions for GPU-on-CPU.

jimmysmitty said:
...
As for NVidia dropping SLI, probably not. Even in the GPU market there can't be just one standard, and CF-only would mean AMD would technically have a monopoly on multi-GPU setups, since NVidia would have to license it from AMD/ATI. But NVidia is causing their own problems since they won't license SLI to Intel. If more Intel chipsets had SLI support, NVidia would make more money from being able to sell more GPUs. Some people are willing to go ATI because they want an Intel chipset over NVidia, and those who want multi-GPU can then only do CF, hence NVidia loses sales that way.


Nvidia charges for their license. AMD did not charge Intel for Crossfire. That's why Crossfire is on Intel platforms. I can't see AMD giving Nvidia a free license, but perhaps they could swap licenses where AMD gets PhysX and Nvidia gets Crossfire?

At any rate, the only people going SLI will be those interested in Nvidia chipsets. That's got to hurt Nvidia's sales.
March 27, 2008 4:39:32 PM

Some people will want SLI but know that Intel chipsets OC better and end up going ATI. Kinda like that guy Hipor(?). He mods the living daylights out of his 2900XT 1GB's GPU core. Last I saw, he was going for a 1500MHz GPU core.

But either way, AMD/NVidia won't do that. NVidia will either break or, when Larrabee hits, do something. AMD will need to acquire their own physics company. Intel got Havok (who is probably the best), NVidia got Ageia. Who's left, I have no idea. But we shall see.
March 27, 2008 9:19:32 PM

AMD can just license Havok from Intel. They share so much already. Both benefit from a single standard for physics and multi-GPU that allows their CPUs to be compared in an enthusiast environment. It wouldn't be a bad idea for Intel to allow AMD to make chipsets for Nehalem in the future.

Nvidia probably should just come up with their own console to compete against the Xbox 2 and PS3. They could give it some decent PC capabilities to woo some Nvidia fanboys away from the desktop.
March 27, 2008 9:40:37 PM

yipsl said:
AMD can just license Havok from Intel. They share so much already. Both benefit from a single standard for physics and multi-GPU that allows their CPUs to be compared in an enthusiast environment. It wouldn't be a bad idea for Intel to allow AMD to make chipsets for Nehalem in the future.

Nvidia probably should just come up with their own console to compete against the Xbox 2 and PS3. They could give it some decent PC capabilities to woo some Nvidia fanboys away from the desktop.


Meh, consoles suck. They just take technology from PCs, then turn around and give us the finger. If NVidia went that route, I would never ever buy them again.