NVidia 8300 & 8600 details released!

January 15, 2007 3:07:05 PM

It has been a while since I decided to click on a link to the Inquirer.

I just hope someone comes out with an 8600 that supports HDCP when I decide to build a new HTPC.
January 15, 2007 3:31:17 PM

Excited to see the performance of the 8600. Wonder how it will stack up against a 7950GT or the crippled version of the 8800GTS.
January 15, 2007 3:34:43 PM

If you look at the specs of the 8600 (regular), it doesn't seem that much better than a 7600GT. It has the wider memory bus (256-bit) and extra shaders, but on the other hand it has to deal with a far lower core clock (maybe better design?) and lower-clocked memory. Also the "up to 256MB" message doesn't sound too good... (128MB 8600s, anyone?)
January 15, 2007 3:39:17 PM

Any news of a much-needed AGP card from Nvidia?
ATI is already talking about an AGP X1950XT!
There is some dude on 3DGuru who wants to dump his AGP X1950 Pro!
They seem to have some overheating and PSU issues.
January 15, 2007 3:51:56 PM

Looks like the GT might be something worth getting now, with a proposed 256-bit memory bus. Thankfully, 128-bit will be left for the low end.
The Ultra would be my choice. 512MB of GDDR3 - it would be 5x more useful down the road, as games are maxing out 256MB quickly. Too bad about its release date, anyhow.

My WAG would be 20-24 amps :? Of course the process has shrunk: the 8800GTS's 90nm compared to the 8600's 80nm.

With that price guesstimate, it makes you wonder if nV is planning a large unified price drop - or if the 8600's performance just isn't there. 8O Of course its release is 3 months in the future; then again, the 7600GT was $200+ upon its appearance. *scratch* Can't wait for some benchmarks.
January 15, 2007 4:03:33 PM

Yeah. Can't wait.
It's all about the bars. :wink:

If the GTS drops below $300 with the mid-range debut, it'll be mine.
January 15, 2007 4:09:36 PM

Agreed. That 8600 Ultra model looks like it will be the sweet spot. Correct me if I'm wrong here, but 64 programmable shaders and up to 512MB of GDDR3 1400MHz memory puts it in league with X1950 Pros and X1900XTs. All that for around $180, plus you get DX10 capability.
January 15, 2007 4:21:44 PM

Healthy competition.
Interested in ATI's new wave. Hopefully they've broken through their power consumption tunnel. Though, as they need to stay on the competition bus, who knows which way their cards will be going.
A blip on the Inq. a while back hinted at the R600 having a separate PSU. :lol: 
January 15, 2007 4:47:01 PM

Gotta do something to get hits, Tacos.
I agree. It's looking like the G8x and Rxxx will arrive with DirectX 10 newly born and in its first stages. Gotta crawl before it walks.
January 15, 2007 4:59:41 PM

I wonder if we will have photo-realistic rendering by the GeForce 9xxx series?
That would be hot....
-cm
January 15, 2007 5:01:18 PM

There's no real way to tell right now, as this is all just speculation. I think the lower clock speeds of the 8600 Ultra compared to the higher clock speeds of the X1950 series are a bit of a moot point, as the architectures are entirely different, and we won't know anything until the cards are out and have been benchmarked in real-world testing.
January 15, 2007 5:08:02 PM

<shivers in expectation>

Every time I look out the window, I can see the huge amount of data flowing into my eyes. On second thought, I don't believe a computer will ever achieve complete photorealism. There is just too much data.
-cm
January 15, 2007 5:12:31 PM

Of course it will; it's only a matter of time. Who knows, maybe by then computer graphics will have had a complete overhaul, utilizing never-before-heard-of techniques allowing photorealism.
January 15, 2007 5:16:05 PM

Hmm, wondering how the 8600 Ultra will compare in DX9 games against my 1-month-old X1950 Pro? That should be interesting.
January 15, 2007 5:19:04 PM

It will involve laser beams shooting directly into your eyeballs. Of course, the initial product will be incredibly painful and will probably kill a few people, but you gotta break a few eggs to make an omelette, right?
January 15, 2007 5:21:07 PM

Exactly :lol: 
January 15, 2007 5:46:25 PM

Kind of bizarre how they play with these naming schemes, huh? The Ultra was top of the line two generations ago, then it disappeared, now it's midrange. The GT was second best, now it's second worst.
January 15, 2007 6:38:31 PM

Oh please, when are people gonna start to realize that clock rate means nothing on its own? It can only be used to compare 2 cards with the exact same architecture.
E.g. a 7600GT clocked at 600/750 is faster than a 7600GT clocked at 560/700.

I'm sure the 8600 Ultra will kill the 7600GT, just like the 7600GT kills the 6600GT.

Let's see: 7600GT vs 8600 Ultra
256MB vs 512MB
128-bit bus vs 256-bit bus (this means the memory bandwidth will double - see the quick math below)
This one is BIG! 12 pixel shaders vs 64
DirectX 9.0c GPU vs all-new DirectX 10 GPU (which will also run DX9 better)
The cores are running at similar speeds: 560 vs 500

By the way, these are Nvidia's reference specifications. Be sure that various manufacturers will clock them higher, both GPU and RAM.
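
To sanity-check that bandwidth claim, here's a minimal back-of-the-envelope sketch in Python. The 1400MHz effective memory clock is just the rumored figure from earlier in this thread, not a confirmed spec, and the function is purely illustrative:

# A rough check of the "bandwidth will double" claim. The clocks here
# are the rumored specs from this thread, not confirmed by Nvidia.

def mem_bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    # Theoretical peak = (bus width in bytes) * effective data rate.
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

print(mem_bandwidth_gb_s(128, 1400))  # 7600GT: 22.4 GB/s
print(mem_bandwidth_gb_s(256, 1400))  # rumored 8600 Ultra: 44.8 GB/s

At the same effective memory clock, going from a 128-bit to a 256-bit bus doubles the theoretical peak, which is where the claim comes from.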
January 15, 2007 7:28:03 PM

Right, Speedy.
Would you rather have a Pentium 4 @ 3.4GHz or a C2D E6600 running at 1GHz less?
Architecture.
nVidia's mid-range 8xxx cards look to be powerhouses for DX9 and great cheap solutions for DX10. Cannot wait to see a top-end X1950xxx shoot it out with these cards. Itching for some bars :p 
January 16, 2007 12:55:08 AM

Won't they already?
January 16, 2007 4:14:05 AM

Quote:
True, the amount of pipelines makes more of a difference, but still, the 8800GTX is clocked pretty high at 575/1800, and when you OC it, the performance is insane


Keep in mind that technically they are no longer "pipelines" in the sense that Nvidia cards before the 8 series had pipelines. They are stream processors, and there is a large difference. Stream processors are floating-point units that can be assigned to a variety of jobs, ranging from geometry processing to pixel shading. Pipelines were fixed-function, meaning they could only do what they were created to do (pixel pipelines could only process pixels, etc.). This means the new architecture is much more efficient and able to fully utilize its core on rendering an image, instead of some pipelines standing around unused when a particular scene does not require them (e.g. a heavily shaded scene needs lots of pixel shading but little vertex processing).
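
Here's a crude way to see that utilization argument in Python - a toy model only, not how G80 actually schedules work; every unit count and workload number below is made up for illustration:

# Toy model: fixed-function pools vs a unified pool of shader units.
# All numbers are invented; this only illustrates the utilization idea.

def fixed_function_cycles(vertex_work, pixel_work,
                          vertex_units=8, pixel_units=24):
    # Each pool can only run its own kind of work, so the slower pool
    # sets the pace while the other sits partly idle.
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def unified_cycles(vertex_work, pixel_work, units=32):
    # Any unit can take any job, so the whole pool shares the total load.
    return (vertex_work + pixel_work) / units

# A pixel-heavy scene: very little geometry, lots of shading.
print(fixed_function_cycles(vertex_work=10, pixel_work=960))  # 40.0
print(unified_cycles(vertex_work=10, pixel_work=960))         # ~30.3

With the same 32 total units, the unified pool finishes this lopsided scene roughly a quarter sooner, because no unit is stuck waiting for a kind of work it isn't allowed to do.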
January 16, 2007 4:24:23 AM

Quote:
I agree, but what I'm saying is that if Nvidia kept the same clocks, then those cards would be demolishing anything on the market


But this is simply not possible in every scenario. Certain (special) NetBurst processors could clock up to 5GHz (and I heard of one at 7GHz), while the max I've seen for a Core 2 is 5.5GHz (a la Coolaler), and that's nearly impossible (not to mention unstable, except for SuperPi :wink: ). New technology needs time to develop and mature. I'm sure nVidia would set their clocks higher if they thought it was safe for a majority of the cards.
January 17, 2007 11:29:28 AM

The reasons for Nvidia's conservative clocks (IMO):

#1 Power consumption. To keep it down, they set the clocks a bit lower; the 8800GTX and GTS are already the most powerful cards on the market, so there's no need to push them any further.

#2 See what ATI comes up with. If Nvidia's lead is threatened, they will just release an 8850GTX with higher clocks. They are playing a card game with ATI - never put all of your cards on the table.

This is the exact same thing Intel has done with its Core 2 CPUs: see what AMD comes up with, and if threatened, just release higher clock rates.
January 17, 2007 10:00:24 PM

Quote:
This one is BIG! 12 pixel shaders vs 64


That one is big, but the fact that under DX10 those are unified shader paths, not just pixel shaders, is the most important aspect - NOT the fact that there are more of them.
January 17, 2007 10:06:24 PM

Quote:
True, but half the stuff the Inquirer says is false, especially information on the R600 and QFX. AMD did a pretty good job of keeping that information quiet, and a lot of the stuff the Inquirer says has been proven wrong - such as the Sapphire dual-GPU thing? That thing looked so fake that I bet they simply put a weird cooler on a regular Sapphire card

http://www.dailytech.com/article.aspx?newsid=5436
The Inquirer isn't the only place reporting on that.
January 17, 2007 10:25:03 PM

Quote:
Bah, and even if more than one site is reporting on that, who would want to buy a dual-X1950 Pro GPU?


http://www.steampowered.com/status/survey.html

I know it isn't a representative survey, but ATI have their work cut out if they truly want to compete in the multi-GPU market. Among this group of users, they have less than 4% market share.

Mind you, again among this group of users only, multi-GPU isn't big yet anyway - less than 1.5%.