No more discrete chipset from NVIDIA

Category: Graphics & Displays
March 29, 2008 1:09:53 AM

Just found this on Expreview.

Official: No more discrete chipset from NVIDIA
http://en.expreview.com/2008/03/27/official-no-more-dis...

Good or Bad?

Your thoughts?
March 29, 2008 1:29:44 AM

I think it's a better idea to have discrete cards. Motherboards will generate more heat, and overclocking is going to be limited; other components might be limited too because of the heat. The only way past this is to make water cooling standard, so the price of computers would increase. I'd rather stick to discrete cards.
March 29, 2008 1:30:29 AM

pchoi04 said:
Just found this on Expreview.

Official: No more discrete chipset from NVIDIA
http://en.expreview.com/2008/03/27/official-no-more-dis...

Good or Bad?

Your thoughts?


Disappointing for me personally, but since I'm not too big into SLI it isn't that big of a deal. Hopefully AMD's chipsets get better, because I usually only buy nVidia chipsets when I purchase an AMD processor; for Intel processors I generally stick to Intel chipsets.

Best,

3Ball
March 29, 2008 1:48:13 AM

What's the difference between a discrete GPU and an integrated GPU? I'm not stupid, but this is just a little confusing.

Edit: I looked it up, and it says any graphics card other than an integrated GPU would be considered a discrete card. But by that definition the 9800GX2 would be a discrete card, and I would hardly call that monster "discrete".
March 29, 2008 1:54:10 AM

gamecrazychris said:
What's the difference between a discrete GPU and an integrated GPU? I'm not stupid, but this is just a little confusing.


Discrete is an actual graphics card that you purchase and install in a slot on the motherboard, such as an 8800GTS or a 3870 512MB video card. Whereas an integrated graphics chip is onboard video that comes with the motherboard: it is physically part of the board itself, you cannot remove it, and it is generally not very powerful or overclockable. An example would be something like an Intel Extreme Graphics processor. Hope this helps!

Best,

3Ball
March 29, 2008 2:14:28 AM

All they are saying is that new mobos will come with integrated graphics. There will still be discrete graphics and SLI, but now there will be an integrated chip as well. With this, nVidia can enable Hybrid Power, which can turn off the discrete cards when not in use to save power and reduce heat. Normal gameplay won't change.
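The switching idea can be sketched in a few lines of Python. This is purely illustrative: the real Hybrid Power logic lives in the driver, and the function name and the 30% load threshold here are made up for the example.

```python
# Illustrative sketch of a hybrid-graphics policy: route light work to the
# integrated GPU (IGP) and wake the discrete card only for demanding work.
# The threshold and names are invented; the actual driver logic differs.

IGP = "integrated"
DISCRETE = "discrete"

def pick_gpu(gpu_load_percent, fullscreen_3d):
    """Return which GPU should render, powering the discrete card
    only when the workload actually needs it."""
    if fullscreen_3d or gpu_load_percent > 30:
        return DISCRETE   # heavy 3D: wake the big card
    return IGP            # desktop/browsing: stay on onboard video

print(pick_gpu(5, False))   # light desktop use -> integrated
print(pick_gpu(10, True))   # a game launches   -> discrete
```

The point is just that the discrete card can sit fully powered off, not merely downclocked, whenever the IGP is enough.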
March 29, 2008 2:40:59 AM

Nvidia is smart. This is a two-pronged reply to Intel making a discrete graphics card and to AMD's 780G. But surely they aren't dumping it forever; maybe a year or two before they resume discrete manufacture, I'd say. And I can see GeForce prices rising as a remote consequence. Better hasten my purchase.
March 29, 2008 2:45:08 AM

NVidia is NOT getting rid of their discrete graphics cards, only their discrete chipsets (i.e., northbridges without integrated graphics). Now all their mobos will have integrated graphics for power saving, plus the ability to add discrete graphics (and SLI) for the enthusiast segment. There is no need to worry.

Edit: I think the confusion arises from how badly written that article is. It is barely English.
March 29, 2008 3:18:05 AM

That article must've been Babelfished into English from Japanese.
March 29, 2008 4:44:55 AM

3Ball said:
Discrete is an actual graphics card that you purchase and install in a slot on the motherboard, such as an 8800GTS or a 3870 512MB video card. Whereas an integrated graphics chip is onboard video that comes with the motherboard: it is physically part of the board itself, you cannot remove it, and it is generally not very powerful or overclockable. An example would be something like an Intel Extreme Graphics processor. Hope this helps!

Best,

3Ball
Don't forget the overclocking being done on the 780G, which looked like almost 60% in the Tom's review.
March 29, 2008 4:51:40 AM

nukemaster said:
I think it's great. Think of the power and heat you save when your high-end card shuts off and you just run onboard for surfing the net, and if they make one like AMD's, even Blu-ray will only need onboard.

http://www.tomshardware.com/2008/03/26/no_more_discrete...

Yes, I think so. It was only a matter of time; discrete audio barely exists anymore, after all.
March 29, 2008 5:45:00 AM

JAYDEEJOHN said:
Don't forget the overclocking being done on the 780G, which looked like almost 60% in the Tom's review.


I would say that my "generally" covered that! :ange: 
March 29, 2008 8:29:34 AM

JAYDEEJOHN said:
Don't forget the overclocking being done on the 780G, which looked like almost 60% in the Tom's review.


I like having an integrated GPU for board setup purposes. In fact, I have an ASUS 780G board at home just waiting for a Phenom 9850. I usually get a new build stable with the IGP before setting up my graphics card.

That said, the new power-saving modes that both AMD and Nvidia are bringing out are great. I'd like to see an IGP on the upcoming AMD 890 boards next summer; all market segments should have motherboards with power saving. The 780G's 10 watts at idle sure beats simply clocking down a discrete GPU.
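A rough back-of-the-envelope check shows why idling on the IGP matters. The 10 W figure is from the 780G review mentioned above; the ~50 W discrete idle draw and the $0.10/kWh electricity rate are assumptions picked for illustration, not measured numbers.

```python
# Rough annual savings from idling on a 10 W IGP instead of a discrete card.
# Assumptions (illustrative only): discrete card idles at ~50 W, machine is
# on 24/7, electricity costs $0.10 per kWh.
discrete_idle_w = 50
igp_idle_w = 10
hours_per_year = 24 * 365
rate_per_kwh = 0.10

saved_kwh = (discrete_idle_w - igp_idle_w) * hours_per_year / 1000
print(f"{saved_kwh:.0f} kWh/year saved, about ${saved_kwh * rate_per_kwh:.2f}")
```

Even with modest assumptions, fully shutting the discrete card off saves a few hundred kWh a year on an always-on box.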

The only Nvidia chipset I've owned was on an MSI board, and I really didn't like the budget 405 chipset with 8x PCIe. So I switched the 4600+ to the 690V in my sig, and that was just a stopgap till the 780G and B3.

The only thing an Nvidia chipset brings is SLI at the high end; they don't perform as well as ATI chipsets for AMD or Intel chipsets for Intel. Nvidia chipsets beat Via's, and that's all. So it's no wonder they're doing chipsets for Via right now.

hsetir said:
Nvidia is smart.


If Nvidia were smart, they'd not have nixed a merger with AMD, which is why AMD bought ATI instead. If Nvidia were smart, they'd have made the 9800GX2 on one PCB, and they'd be working toward competing with Intel and AMD in the CPU field with a Fusion-style product.

If they were smart, they'd not have tried to charge Intel for SLI, but instead ensured their dual-graphics-card solution was available on Intel motherboards. If they were smart, they'd have traded SLI for either a new x86 license or Intel's permission to buy Via along with their old Cyrix license. IMHO, Nvidia's not smart right now because their huge discrete GPU market share has made them complacent.

People here bashed Ruiz to no end, but the guy who's really made bad business decisions for his company is Huang. IMHO, Nvidia's engineering is usually smart, but their overall approach is complacency with dinosaur GPUs. That can hurt them just as it hurt AMD; the rumors say it already has, with regard to a chipset license for Nehalem.
March 29, 2008 8:44:52 AM

yipsl said:
If Nvidia were smart, they'd not have nixed a merger with AMD, which is why AMD bought ATI instead. If Nvidia were smart, they'd have made the 9800GX2 on one PCB, and they'd be working toward competing with Intel and AMD in the CPU field with a Fusion-style product.

If they were smart, they'd not have tried to charge Intel for SLI, but instead ensured their dual-graphics-card solution was available on Intel motherboards. If they were smart, they'd have traded SLI for either a new x86 license or Intel's permission to buy Via along with their old Cyrix license. IMHO, Nvidia's not smart right now because their huge discrete GPU market share has made them complacent.

People here bashed Ruiz to no end, but the guy who's really made bad business decisions for his company is Huang. IMHO, Nvidia's engineering is usually smart, but their overall approach is complacency with dinosaur GPUs. That can hurt them just as it hurt AMD; the rumors say it already has, with regard to a chipset license for Nehalem.


Man... it's usually annoying when people rant on the forums, but you just said everything I was thinking.

Well done. I too am disappointed with the direction Nvidia has decided to go. Although AMD hasn't been pumping out top performance lately, I'm still rooting for them, and I've been pretty impressed with what they have to offer.
March 29, 2008 2:43:11 PM

Power savings FTW :D 