NVidia 8300 & 8600 details released!


Yeah, I know it's the INQ, but it seems realistic enough.
  1. It has been a while since I last clicked on a link to the Inquirer.

    I just hope someone comes out with an 8600 that supports HDCP when I decide to build a new HTPC.
  2. Quote:

    Yeah, I know it's the INQ, but it seems realistic enough.

    I wonder what sort of PSU you will need to run it.
  3. Excited to see the performance of the 8600. Wonder how it will stack up against a 7950GT or the crippled version of the 8800GTS.
  4. If you look at the specs of the 8600 (regular), it doesn't seem that much better than a 7600GT. It has the wider memory bus (256-bit) and extra shaders, but on the other hand it has a far lower core clock (maybe a better design?) and lower-clocked memory. Also, the "up to 256MB" wording doesn't sound too good... (128MB 8600s, anyone?)
  5. Any news of a much-needed AGP card from nvidia?
    ATI is already talking about an AGP X1950 XT!
    There is some dude on 3dguru who wants to dump his AGP X1950 Pro!!
    They seem to have some overheating and PSU issues.
  6. Looks like the GT might be something worth getting now, with a proposed 256bit memory bus. Thankfully, 128bit will be left for the low end.
    The Ultra would be my choice. 512MB of GDDR3 - it would be 5x more useful down the road, as games are quickly maxing out 256MB. Too bad about its release date, anyhow.

    My WAG would be 20-24 amps :? Of course the process has shrunk: the 8800GTS is 90nm compared to the 8600's 80nm process.

    With that price guesstimate, it makes you wonder if nV is planning a large across-the-board price drop - or if the 8600's performance just isn't there. 8O Of course its release is 3 months in the future, though; the 7600GT was $200+ upon its appearance. *scratch* Can't wait for some benchmarks.
  7. Yeah. Can't wait.
    It's all about the bars. :wink:

    If the GTS drops below $300 with the mid-range debut, it'll be mine.
  8. Agreed. That 8600 Ultra model looks like it will be the sweet spot. Correct me if I'm wrong here, but 64 programmable shaders and up to 512MB of GDDR3 1400MHz memory puts it in league with the X1950 Pro and X1900 XT. All that for around $180, plus you get DX10 capability.
  9. Healthy competition.
    Interested in ATI's new wave. Hopefully they've gotten past their power consumption problems. Though, as they need to stay on the competition bus, who knows which way their cards will be going.
    A blip on the Inq. a while back hinted at the R600 having a separate PSU. :lol:
  10. Gotta do something to get hits, Tacos.
    I agree. It's looking like the G8x and Rxxx will arrive while DirectX 10 is being born and in its first stages. Gotta crawl before it walks.
  11. I wonder if we will have photo-realistic rendering by the GeForce 9xxx series?
    That would be hot....
  12. There's no real way to tell right now, as this is all just speculation. I think comparing the lower clock speeds of the 8600 Ultra to the higher clock speeds of the X1950 series is a bit of a moot point, as the architectures are entirely different, and we won't know anything until the cards are out and have been benchmarked in real-world testing.
  13. <shivers in expectation>

    Every time I look out the window, I can see the huge amount of data flowing into my eyes. On second thought, I don't believe a computer will ever achieve complete photorealism. There is just too much data.
  14. Of course it will; it's only a matter of time. Who knows, maybe by then computer graphics will have had a complete overhaul, utilizing never-before-heard-of techniques allowing photorealism.
  15. Hmm, wondering how the 8600 Ultra will compare in DX9 games against my 1-month-old X1950 Pro? That should be interesting.
  16. It will involve laser beams shooting directly into your eyeballs. Of course, the initial product will be incredibly painful and probably kill a few people, but you gotta break a few eggs to make an omelette, right?
  17. Exactly :lol:
  18. Kind of bizarre how they play with these naming schemes, huh? The Ultra was top of the line two generations ago, then it disappeared, now it's midrange. The GT was second best, now it's second worst.
  19. Oh please, when are people gonna start to realize that clock rate means nothing on its own? It can only be used to compare two cards with the exact same architecture.
    E.g. a 7600GT clocked at 600/750 is faster than a 7600GT clocked at 560/700.

    I'm sure the 8600 Ultra will kill the 7600GT, just like the 7600GT kills the 6600GT.

    Let's see: 7600GT vs 8600 Ultra
    256MB vs 512MB
    128bit bus vs 256bit bus (this means that the memory bandwidth will double)
    This one is BIG! 12 pixel shaders vs 64
    DirectX 9.0c GPU vs All new DirectX 10 GPU (which will also run DX9 better)
    The cores are running at a similar speed: 560 vs 500

    By the way, these are the nvidia reference specifications. Rest assured that various manufacturers will clock them higher, both GPU and RAM.
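The bandwidth-doubling point in that list is just arithmetic: bandwidth = (bus width in bytes) x (effective memory clock). A quick sketch, using the 7600GT's stock 1400MHz effective GDDR3 clock and assuming (as a guess, like everything else in this thread) that the rumored 8600 Ultra runs its memory at a similar speed:

```python
# Memory bandwidth in GB/s from bus width (bits) and effective
# (double-data-rate) memory clock in MHz.
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    return (bus_width_bits / 8) * effective_clock_mhz / 1000

# 7600GT: 128-bit bus, 1400MHz effective GDDR3 (stock spec).
print(bandwidth_gb_s(128, 1400))  # -> 22.4
# Rumored 8600 Ultra: 256-bit bus at an assumed similar 1400MHz clock.
print(bandwidth_gb_s(256, 1400))  # -> 44.8
```

Same clock, twice the bus width, twice the bandwidth.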
  20. Right, Speedy.
    Would you rather have a Pentium 4 @ 3.4GHz or a C2D E6600 running at 1GHz less?
    nVidia's mid-range 8xxx cards look to be powerhouses for DX9 and great cheap solutions for DX10. Cannot wait to see a top-end X1950xxx shoot it out with these cards. Itching for some bars :P
  21. Won't they already?
  22. Quote:
    True, the amount of pipelines makes more of a difference, but still, the 8800gtx is clocked pretty high with 575, 1800, and when you oc it, the performance is insane

    Keep in mind that technically they are no longer "pipelines" in the sense that pre-8-series nvidia cards had pipelines. They are stream processors, and there is a large difference. Stream processors are floating-point units that can be assigned to a variety of jobs, ranging from geometry processing to pixel shading. Pipelines were fixed-function, meaning they could only do what they were created to do (pixel pipelines could only process pixels, etc.). This means that the new architecture is much more efficient and able to fully utilize its core on rendering an image, instead of some pipelines standing around unused if a particular scene does not require them (e.g. a heavily shaded scene requires more pixel shading but less vertex processing).
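The fixed-vs-unified point can be sketched with a toy model (not real GPU behavior; the unit counts and workload numbers are made up for illustration). Fixed-function units can only service their own kind of work, so they idle when a frame's workload is skewed; unified units pick up whatever work is pending:

```python
# Toy model: work items issued per cycle for a pixel-heavy frame.

def fixed_throughput(vertex_work, pixel_work, vertex_units, pixel_units):
    # Each fixed-function unit type can only service its own kind of work.
    return min(vertex_work, vertex_units) + min(pixel_work, pixel_units)

def unified_throughput(vertex_work, pixel_work, total_units):
    # Any unified unit can service any kind of work.
    return min(vertex_work + pixel_work, total_units)

# Pixel-heavy frame: 2 units of vertex work, 14 of pixel work pending.
vertex_work, pixel_work = 2, 14

# Fixed design: 8 vertex + 8 pixel units; 6 vertex units sit idle.
print(fixed_throughput(vertex_work, pixel_work, 8, 8))    # -> 10
# Unified design: the same 16 units all pick up pending work.
print(unified_throughput(vertex_work, pixel_work, 16))    # -> 16
```

With the same total unit count, the unified design keeps everything busy on this skewed frame while the fixed design leaves capacity stranded.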
  23. Quote:
    I agree, but what I'm saying is that if nvidia kept the same clocks, then those cards will be demolishing anything on the market

    But this is simply not possible in every scenario. Certain (special) Netburst processors could clock up to 5GHz (and I heard of one at 7GHz), while the max I've seen for Core 2 is 5.5GHz (a la Coolaler), and that's nearly impossible (not to mention unstable, except for SuperPi :wink: ). New technology needs time to develop and mature. I'm sure nVidia would set their clocks higher if they thought it was safe for a majority of the cards.
  24. The reasons for nvidia's conservative clocks: (IMO)

    #1 Power consumption. To keep it down, they set the clocks a bit lower; both the 8800GTX and GTS are already the most powerful cards on the market, so there is no need to push them any further.

    #2 See what ATI comes up with. If nvidia's lead is threatened, they will just release an 8850GTX with higher clocks. They are playing a card game with ATI - never put all of your cards on the table.

    This is the exact same thing Intel has done with its Core 2 CPUs: see what AMD comes up with and, if threatened, release higher clock rates.
  25. Quote:
    This one is BIG! 12 pixel shaders vs 64

    That one is big, but the fact that under DX10 those are unified shader paths, not pixel shaders, is the most important aspect - NOT just that there are more of them.
  26. Quote:
    true, but half the stuff the Inquirer says is false, especially information on the R600 and QFX. AMD did a pretty good job of keeping that information quiet, and a lot of the stuff the Inquirer says is proven to be wrong, such as the Sapphire dual-GPU thing? That thing looked so fake that I bet they simply put a weird cooler on a regular Sapphire card

    The Inquirer isn't the only place reporting on that.
  27. The 7900GT performs about the same, and the 7950GT beats it by a little in pretty much everything.
  28. Quote:
    bah, and even if more than one site is reporting on that, who would want to buy a dual-GPU X1950 Pro?


    I know it isn't a representative survey, but ATI have their work cut out if they truly want to compete in the multi-GPU market. Among this group of users, they have less than 4% market share.

    Mind you, again among this group only, multi-GPU isn't big yet anyway - less than 1.5%.