New GeForce 3 (NV20) Info

<A HREF="http://www.digit-life.com/articles/nv20/index.html" target="_new">Here</A> is some up-to-date info. Sorry if this has been posted already, but a quick search didn't turn up the link.
  1. Well, I hope the data in the second link is wrong, as it doesn't match my expectations after reading the Xbox specs... I suppose it's probably right, though :( but I'm waiting for the official reviews (from Tom's) to roll in before I allow myself too much disappointment.

    Cheers,
    Warden
  2. Those are probably the specs for a "basic" version of the GeForce3; I'd expect the "Ultra" version to have higher specs and to be as powerful as the Xbox.
  3. OK I'm depressed now.... In case you missed it here is a bit from the tech news section on Tom's:

    Both NVidia Corp. and Microsoft Corp. are reportedly saying that PCs will not be able to compete with Xbox game console speed for some time. The reason, according to a Wired writer, is that NVidia's 3-D graphics chip and audio and I/O controller work together directly. Though the XGPU will use the same NVidia NV20 chip that will be used in PCs, its performance will be much faster because it will lack an AGP video bus and north bridge. The XGPU and CPU will share 64MB of 200MHz DDR memory with no buses to slow down transfer rates.

    :-(
    Warden
  4. I guarantee that the NV20 will not be the bottleneck of your system for at least 6 months to a year. The NV20 will be plenty; no need for the Xbox unless you have nothing better in life (i.e. no family, no life, no nothing).

    Someday I will stop asking all the questions!
  5. More like almost 1 year.

    -----------------
    "648kb is all the space anyone would ever need!"

    Bill Gates, 1980s
  6. I was trying to be safe; personally I think 18 months at least, but I don't want to put my foot in my mouth.

    Someday I will stop asking all the questions!
  7. Yeah, I read that. Apparently there will be no buses in the Xbox, just the chips wired directly together, with interfaces designed solely for each other and no compatibility overhead to worry about, so no bottlenecks.

    Maybe it's about time we had AGP 8x+++ appearing on the market. It may be more expensive, but no one thought people would pay for the GeForce 2 Ultra. They were wrong. People will pay big if they want to play games.

    ~ I'm not AMD biased, I just think their chips are better ~
  8. This thing is so powerful that when I get it (<i>if</i> I get it) I won't be upgrading for a long time! :smile:

    -----------------
    "648kb is all the space anyone would ever need!"

    Bill Gates, 1980s
  9. Yeah, good call. AGP 8x would probably help. Still, I wonder if there is a limit to how much faster you can make it by simply increasing bandwidth. I bet there is additional latency involved with the bus protocols that the Xbox simply won't have. Sorta like how DDR RAM is only about 20% faster overall; maybe the AGP "x" factor will only help so much. I'm far from being an expert though. :)

    You other guys are only sorta right about it not being the bottleneck for many months, IMO. The Xbox does NOT have an impressive processor, but it is capable of VERY impressive graphics thanks to the GPU. By traditional thinking, this would be a very poorly balanced setup, as the slow CPU would just hold back the graphics processor. I think the reason it works is that so MUCH of the stuff that is done by the CPU in current PC games will be done by the GPU. We have yet to get our hands on a game that is as GPU-dependent as Xbox games will be. When we do, we may find that it really changes the rules on where the bottlenecks are.

    Of course, it will probably be 18 months before PC games catch up with Xbox games in that regard, so I guess you guys are still right. LOL

    Cheers,
    Warden
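The bandwidth guesswork in the post above can be put into rough numbers. Here's a toy Python sketch; the figures (a 66 MHz, 32-bit AGP base bus, and the Xbox's reported 200 MHz, 128-bit DDR pool) are assumptions pulled from period spec sheets, so take them with a grain of salt:

```python
# A back-of-the-envelope peak-bandwidth calculator. The clock and bus-width
# figures below are assumptions from period spec sheets and may be off;
# real-world throughput is lower, and (as the post says) extra bandwidth
# does nothing about bus-protocol latency.

def peak_bandwidth_mb_s(clock_mhz, bus_bits, transfers_per_clock=1):
    """Theoretical peak in MB/s: clock x transfers-per-clock x bytes-per-transfer."""
    return clock_mhz * transfers_per_clock * (bus_bits // 8)

# AGP: 32-bit bus at a 66 MHz base clock; the "x" factor is transfers per clock.
agp_4x = peak_bandwidth_mb_s(66, 32, 4)          # 1056 MB/s
agp_8x = peak_bandwidth_mb_s(66, 32, 8)          # 2112 MB/s

# Xbox's reported shared pool: 200 MHz DDR (2 transfers/clock), 128-bit bus.
xbox_unified = peak_bandwidth_mb_s(200, 128, 2)  # 6400 MB/s

print(agp_4x, agp_8x, xbox_unified)
```

Even on these optimistic numbers, doubling the AGP multiplier still leaves a PC's graphics bus well short of a bus-less unified-memory design, which fits the latency point above.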
  10. Not sure I understand the difference between a CPU and a GPU. I thought they were the same, except that the GPU is on the video card and is only used for the graphics, not the whole system. Is this right? Now, if I'm right, here is a question. Have you ever played Falcon 4.0? Well, in the campaign, when you get over the "FLOT" (if you've played it you know what I'm talking about), your FPS drops badly because the CPU cannot keep up. The AI in this game is huge and eats up CPU power like mad. Now, why do the new video cards with GPUs still seem to be affected when the CPU gets bogged down? Is it because the AI engine is using the CPU, and the GPU can only support the graphics engine, not the AI engine? Just my curiosity wanting to know :-)

    Someday I will stop asking all the questions!
  11. Hehe, no doubt about that. I'm only getting the GF2 Ultra; I can't afford $599 for a video card, and I think that will last me until I build a whole new PC in 2 years or so.

    Someday I will stop asking all the questions!
  12. It's TRUE:

    But one thing

    IBM PowerPC 450 MHz vs. Intel P3 733 MHz

    Who would win ???

    :cool: First person to get a topic banned. :cool: ABIT BP6 Lives FOREVER!!! :cool: VIA SUCKS !!! :cool:
  13. I read that it will be between $350 and $400. That will seriously lower the cost of the Ultra, and if it really is that price, I might consider buying it!

    - "I don't write Tom's Hardware Guide, I just preach it"
  14. Booky,
    You are right: CPU = Central Processing Unit; this is the processing unit for the whole computer. GPU = Graphics Processing Unit; this is the processor that sits on the video card and processes only graphics, with the most emphasis on 3D graphics. The GeForce 3 is the latest GPU to come out, and it will be the processor around which a number of graphics cards are built, like the "ASUS AGP-V8200 GF3."

    Now, no I haven't played Falcon 4.0 (wish I could but it would totally bomb my system) but I understand that in campaign mode, the AI involved is beyond complex. And it just so happens that your example here is a perfect illustration of what I was saying about needing to re-evaluate the traditional bottlenecks in a gaming system.

    In your example, the CPU is a bottleneck because of the complex AI, so would adding a faster video card help? No. This is because up until the GeForce 256, and now much more so with the GeForce 3, newer, faster graphics cards did not take any new load off the processor, but simply did their part faster. If the CPU was the bottleneck, then they didn't help.

    But go with me for a second and consider a future game with intense AI like Falcon has, running on a computer like yours but with a GeForce 3. With the modern code in this game, the GeForce 3 could take a very large chunk of work off the CPU. This would then make a traditionally CPU limited game run faster even though the only hardware change was in the graphics card.

    Now don't misunderstand my point. Adding a GeForce 3 to your current system wouldn't help Falcon 4.0, because it wasn't written to take advantage of the GeForce 3's new features. And it will take a frustratingly long time for games to come out that DO support these features, as was stated earlier in this thread. My only point is this: the GeForce 3 is a powerful enough GPU that when the software DOES catch up, the CPU will be way less of an issue, and very rarely a bottleneck. This is in stark contrast to today's systems, where the CPU is often the limiting factor, and it will require an adjustment on our parts when building gaming systems.

    Once again this is beautifully illustrated by the Xbox, which has a ho-hum CPU but produces phenomenal graphics, better by far than what the latest Athlon DDR systems can do. The major difference between the Xbox and a PC equipped with a GeForce 3 is that every piece of Xbox software will take full advantage of the GPU. Once this happens with PC games, I don't think we will see much in the way of CPU bottlenecks. I also hold on to the vain hope that since developing for the Xbox and the PC will be so similar, game engines won't take as long as usual to catch up to the latest hardware. But I suppose it is a VAIN hope.

    Wow that was long :)

    Cheers,
    Warden
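The offload argument in the post above can be sketched in a few lines of Python. This is purely a toy illustration (not real driver or GPU code): the function names and the 3x3 transform are made up for the example.

```python
# Toy illustration of hardware T&L offload: the per-vertex transform work
# either runs in a CPU loop (software T&L) or is handed off wholesale to
# the graphics chip (hardware T&L), freeing the CPU for things like
# Falcon 4.0's campaign AI. All names here are invented for the sketch.

def transform_vertex(matrix, v):
    """One vertex's worth of work: multiply a 3-vector by a 3x3 matrix."""
    return [sum(matrix[row][i] * v[i] for i in range(3)) for row in range(3)]

def software_tnl(matrix, vertices):
    # Pre-GeForce pipeline: the CPU burns cycles on every single vertex.
    return [transform_vertex(matrix, v) for v in vertices]

def hardware_tnl(submit_to_gpu, matrix, vertices):
    # Hardware T&L pipeline: the CPU's cost is one submission call;
    # the per-vertex loop runs on the graphics chip instead.
    return submit_to_gpu(matrix, vertices)

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
verts = [[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]]

# A plain function stands in for the GPU here; on real hardware this
# per-vertex work would leave the CPU entirely.
print(hardware_tnl(software_tnl, identity, verts))
```

The point of the sketch: in the second pipeline, the CPU's cost no longer grows with the vertex count, which is why a "ho-hum" CPU can still drive impressive graphics once the software uses the GPU this way.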
  15. You're right on the nose.

    And game development should be enhanced greatly by the XBox. Since the platform and code are going to be very similar to a PC, this will allow game developers to write games that will run on both the XBox and the PC with hardly any code changes. So when this happens, I bet we'll see a LOT of games launched for both platforms that will use the full potential of the PC's GeForce3 cards.

    But until that time, I doubt software support for the features of the GeForce3 will happen much (if at all).

    And for the general discussion: Yes, the XBox will probably outperform a PC in a lot of areas because it simply doesn't have the buses that a PC has, so data transfers a LOT faster between components. But I'd like to see someone try to upgrade an XBox.

    PCs will never die. They will always exist because they simply can't be replaced. They're too multi-functional and upgradable.

    Meanwhile, things like PDAs, internet TVs, and game consoles will always be fun. But because they will always be designed to fill only one specific role well, and because they can hardly be upgraded, they will never replace the role of the PC.

    If you want state-of-the-art games with graphics that nothing can match, go with the latest console. If you want to wait a little while until the games aren't brand-new or aren't as graphically stunning, just use your PC.

    And if you want both, get both. :)

    Toast will always toast faster in a toaster than it would in an oven. But you would never try to cook a turkey in a toaster.

    - Sanity is purely based on point-of-view.
  16. Doom3 is supposedly going to use the GeForce3; this is a screenshot. I couldn't find more, but I remember them being on this site, as well as a link showing the GeForce3 in the new Mac G4 (a streaming video). This is some serious, no-joke 3D power!!
    <A HREF="http://www.doomworld.com/news/postpic.php?picture=/gfx/sidepicfull.jpg" target="_new">http://www.doomworld.com/news/postpic.php?picture=/gfx/sidepicfull.jpg</A>

    <font color=white>This new forum still sucks</font>
  17. Interesting. I'd have thought that they would just give up on Doom. I mean talk about an over-killed storyline.

    But hey, at least it looks graphically interesting.

    Not that you couldn't do the same level of graphics with just about any video card for a video clip though. **shrug**

    So why do you say that this forum still sucks?

    - Sanity is purely based on point-of-view.
  18. Yeah, Slvr, I agree. I think PCs and consoles will always co-exist just fine, and I don't know why everybody worries so much about this every time a new console comes out. And consoles only dominate the graphics department for a short while precisely because they aren't upgradable. In 2 years the Xbox will still be the same old Xbox... but the PC will have moved on to bigger and better things. Consoles generally go in about 5-year cycles. PC graphics chips go in 6-month cycles. Do the math. The bus-less, unified memory architecture of the Xbox will keep it ahead of the PC for longer than other consoles have managed in the past, but it still won't last long. And like you said, I think the shared game architecture between the Xbox and the PC (DX8) could provide many benefits for both sides.

    Regards,
    Warden
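"Do the math" in the post above works out like this, assuming the roughly 5-year console cycle and 6-month graphics-chip cycle the post mentions (both round figures, not exact release data):

```python
# Rough cycle arithmetic from the post: how many PC graphics-chip
# generations fit inside one console generation.

CONSOLE_CYCLE_MONTHS = 5 * 12   # ~5 years per console generation
GPU_CYCLE_MONTHS = 6            # ~6 months per PC graphics-chip generation

gpu_gens_per_console = CONSOLE_CYCLE_MONTHS // GPU_CYCLE_MONTHS
print(gpu_gens_per_console)  # 10 PC GPU generations per console cycle
```

So even if a console launches a generation or two ahead of PC graphics hardware, the PC laps it many times over before the next console arrives.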
  19. Doomworld.com seems to be down right now and I couldn't access the picture, but I have read elsewhere that Doom 3 is supposed to be optimized for the GeForce 3. What I do not know is when Doom 3 is supposed to come out.

    As for the Doom story never being that gripping... hehe, yeah, what Id game HAS had a good story? But in Doom 3 they are supposed to be "focusing on the single-player game." Hmmmmm... guess we'll wait and see if they can turn Doom into Half-Life. :)

    Cheers,
    Warden
  20. I think they made a new one. Completely new engine, I'm pretty sure.

    -----------------
    "648kb is all the space anyone would ever need!"

    Bill Gates, 1980s
  21. Righto Griz. Id Software has stated that Doom 3 will use an all new engine. Here is a quote from John Carmack's .plan where he first announced the next Doom:

    "It wasn't planned to announce this soon, but here it is: We are working on a new DOOM game, focusing on the single player game experience, >>> and using brandnew technology in almost every aspect of it.<<<" (http://www.webdog.org/cgi-bin/finger.pl?id=1&time=20000601040557)

    In his latest .plan he refers to Doom 3 by calling it "our new engine." (http://www.webdog.org/plans/1/)

    At the end of this same .plan, after giving all his thoughts on the pros and cons of the GeForce 3 (a very worthwhile read), Carmack says:

    "I think things are going to really clean up in the next couple years. All of my advocacy is focused on making sure that there will be a completely clean and flexible interface for me to target in the engine after DOOM, and I think it is going to happen."

    Since the above comment is talking about the future beyond the GeForce 3, saying that his next engine beyond Doom 3 will be targeted at such a future... does this then mean that the "current" Doom 3 engine is targeted at the "current" GeForce 3? Yeah, I know that's stretching it, but hey...

    Of course, once again, I don't know when Doom 3 is coming out. Maybe by the time it does come out, everything will support the GeForce 3.

    Cheers,
    Warden

    (mass typos)<P ID="edit"><FONT SIZE=-1><EM>Edited by warden on 03/02/01 06:54 AM.</EM></FONT></P>
  22. Now, is the Doom 3 engine optimized just for the GeForce 3, or for DirectX 8.0? Cuz I thought all this fancy-dancy stuff that the GeForce can do, in terms of on-the-fly lighting, etc., is because of DirectX 8.0 features, right? To me, it seems pretty farked up to make a game specifically designed for one video card, as it makes the situation bad for those without the card. For instance, the Radeon 2 will probably have DirectX 8 compatibility (full compatibility, I mean) and be a kick-ass card, which means that everything special about Doom 3 (which does have some pretty special stuff in it) will work with the Radeon 2, or any other competing card that supports the new DirectX 8 functionality, right?
  23. I have read many times of game developers saying that a game is developed for a particular kind of card, but they all seem to mean cards LIKE the ones they mention. For example, most hardware transform and lighting enabled games were written "for" the GeForce, because it was the first T&L card to come out. But those games still run great on the Radeon. And yes, this is basically why APIs like DX8 are used. For what it's worth though, I think that Doom 3 will probably use OpenGL instead of DX8, but that shouldn't make any difference to the end user, as long as all the card companies make good OpenGL drivers.

    Regards,
    Warden
  24. Well if Doom3 doesn't support anything except GF3 then I certainly won't buy it. Unless I happen to own a GF3 which currently I am not planning to get.