NV-30 Will Use 3dfx Technology

Nvidia will have 3dfx technology in it, meaning Fusion, Rampage, and Sage. This means it's over: Nvidia will monopolize the market. Game over for ATI and Matrox.


RIP 3dfx and Rampage
  1. lol...
    They will make a card that will be soooooooooo good... that they will ask $20,000 US for that [-peep-]...
  2. Where is the link showing this? Honestly, the 3dfx technology was not all that. If it were, they would still be in the game. People buy great products. Products that get better and better live; products that get worse and worse don't. Look at how Nvidia is doing FSAA, and it hardly impacts their performance. And they don't need to put multiple processors on their card. Personally, I hope that they steer clear of legacy video methodologies or technology.

    It worked yesterday! :lol:
  3. they don't use many chips on board but they like the brute force approach..
    more transistors please !!!
    nvidia [-peep-]
  4. Well, the additional transistors are attached to features that lesser cards only dream about.
  5. lesser cards ?
    what lesser cards ?

    nv30 is damn good !
    where can I buy one ?

    man, I was only saying that brute force is not close to smart design...
    i.e. with a similar transistor count, the smarter design will win against the brute force approach !
  6. gorilla against man

    who will win the fight ?
    obvious answer..
  7. Where's the link?
  8. Quote:
    "lesser cards ?
    what lesser cards ?"

    GF3s and down, buddy. We are talking about the NV-30 here, not the NV-20.

    Quote:
    "gorilla against man
    who will win the fight ?
    obvious answer.."

    The man, with a much more complex brain (the NV-30) compared to the gorilla's simple brain (the Kyro2), will design a rifle with a laser scope, take aim, and blast the gorilla out of the brush before the gorilla even knows what's happening. That is called evolution, man. Evolution. The gorilla gets put in a cage for kids to poke at and make fun of, just like the Kyro2 gets put in a museum to be poked at and made fun of by kids.
    Edited by noko on 04/27/01 11:38 PM.
  9. lol good post noko !!!
    but I was referring to kyro 4
  10. if nvidia used tile based rendering they wouldn't have the damn problems with memory bottlenecking that they do. the kyro2 is way ahead of the geforce2 in that respect. do any of you people know what technology is inside the kyro? because it sure doesn't sound like it.

    <A HREF="http://synthetic.bsd-fan.com/pika.swf" target="_new">Hyakugojyuuichi!!</A>
  11. TBR is great technology; it is the rest of the design that is being looked at.
  12. I sure hope a Kyro4 will be designed, meaning that STMicro was successful here. The NV-20 is an evolution in video cards; it can do things that other cards could never do, even with the fastest CPUs. Now we are talking about the NV-30. I don't even know where to begin.
  13. noko, you will always find something negative about the kyro 3 or kyro 4!
    a perfect world or a perfect card does not exist!

    but if we start to count the pros and cons, even with a kyro 2, we will find that in performance/price the kyro 2 beats everything on the market now (and I mean everything !!! )
  14. so you are conveniently saying "oh yeah, there's some great technology, but we won't mention that and will just blast them for the negative aspects." you're not too great in debates, are you?

    <A HREF="http://www.512productions.com/lobstermagnet/" target="_new">Hyakugojyuuichi!!</A>
  15. There is no reason to hype the good. Sometimes you can have a huge list of small good things, but one bad aspect of a design can ruin the entire chip. No one is saying that the kyro II sucks. The kyro is just a mid-range card that is at the top of the mid-range list right now. Also, for the first time, this mid-ranger can beat the best of the best in some benchmarks. Sure, this is great, but when the kyro II sucks, it sucks hard. I won't make any final judgment on this card until I see the final board design with the official release drivers. If the kyro continues to perform poorly in some of the benchmarks that it does, then I wouldn't buy it.

    Last month, when the kyro II was first previewed, we all got hyped about the price/performance. Since then, the release has been pushed back a month and we are now seeing areas where the kyro fails to perform at a decent rate in certain tests. Nvidia is pushing down its prices for the gts/pro cards and I suspect ati will follow in this price cut trend. This will diminish the price/performance ratio that the kyro II had last month. Even the ultras will be coming down in price.

    Unless the kyro II comes in at a lower price than the $150 suggested retail price, it won't have the appeal that it could have. Remember, the kyro II has a few things going against it:
    1. Relatively new brand name to get people to know (will have to fight back the radeon/geforce market strong hold let alone the performance of those chips)
    2. New technology is always scrutinized. Any flaw in the design of the chip will be harshly scrutinized by the industry.

    The card isn't even out yet and problems with it are already being found. We've all seen the arguments about the life of a 3d card and the price range. Now that the kyro II is being attacked for possibly having a shorter life than it should, and its intended market is already being filled by equal or better performing cards, I question the impact it will actually have besides being the card that debuts tile rendering. If STM were smart, they would have brought two cards to market, mid-range and high-performance. Their motto not to overprice more than what is needed is nice, but it won't have too much of an effect if they don't address the entire market for game cards.

    We’ve already seen that the chip can’t be run any faster than it is now, so we won’t see an ‘ultra’ version. Looks like we’ll have to wait and see if the kyro III can perform at the high end. It is sad, but 90% of the people out there only care about the highest frame rate. That is why nvidia has the market share they do. The kyro II is nice, but when it comes down to it, people will pat STM on the back, say good job, and then turn around and buy a geforce2 or radeon instead.
  16. Rather, do an analysis, find facts, and make conclusions. Now if you want to enlighten us, please do: what do you find so great about this chip? Is it only one feature that sticks out in the design, or is the design complete in most aspects? Do excuses have to be made for why the chip doesn't have key features, or are those features totally unnecessary or overcome by the strengths of the card? Who would benefit from this card and who wouldn't? Besides, this thread is about the NV-30 and 3dfx, not about the antiquated Kyro2. Now if you want to debate this, go to the Kyro2 thread.
    <P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 04/29/01 03:08 AM.</EM></FONT></P>
  17. okie both of u break it off... =)

    This is getting further and further from the main topic but since u all are at it, well, seriously speaking, there's nothing to amuse over Kyro 2...

    Why??? First, it doesn't support T&L, and that is a major setback already...T&L is the standard now for games. Can u imagine playing realistic games like "Quake 4", that is if Quake 4 were to come out, without T&L support??? I can tell u it sux!!! Am I not right??? =)

    If Quake 4 were to come out any time this year or next, GeForce 2 owners will be able to play at decent fps provided they have a minimum of 64MB DDR-RAM. For Kyro 2 users, that will be too bad!!! GeForce 2 owners also don't have to worry too much because in the near future, NVidia will come out with better drivers. Everyone has seen how the GeForce 2 held its position with the new Detonator3 drivers when the Radeon challenged it. In fact, the newer drivers utterly trashed the Radeon, beating it by more than 10fps. This will happen again, as the drivers are not final yet.

    Be ready for surprises!!! =) This is one reason why NVidia is not in the slightest threatened by the Radeon, and maybe not even by the Kyro 2.

    One last thing: the Kyro 2 will not live long. It doesn't have hardware T&L, while the GeForce 2 has 2nd-generation hardware T&L support, which is already enough to blow the Kyro 2 away...

    lol... =)

    P.S. If the NV30 were to import 3dfx technology, I can imagine how fast the NV30 will be in terms of speed and features... Brute force has always been NVidia's philosophy in making the market's fastest cards... : )
  18. Are you a fortune teller ?

    pushing more T&L (more polys) than today's games offer will saturate the "low" bandwidth (even with DDR) of the geforces and drain the brute force right out of them !

    if games start using more layers (i.e. like Serious Sam), that will also kill the "low" bandwidth of the geforce...

    the only video card capable of handling future games at present is the geforce 3...
    the kyro is maybe ( I am not sure ) more future-proof than even the geforce 2 GTS!

    but if you want to talk about kyro go to the kyro topic..
    ok ???

    will a new detonator offer like 100 % more performance to the geforces?
    then all the geforces of today are running a lousy driver, performance-wise !

    I have taken this from Ace's Hardware forum, posted by Wlaote:

    Quote:
    i would like to point out what (very) high polygon counts mean for the kyro1/2 and its "sorting unit".

    in a game played at 1024x786, the kyro divides the picture into 1536 tiles (32x16). if we take quake3 with its 10,000 to 15,000 polys per frame on average, every tile would be filled with roughly 10 polys. but because there are polys which span several tiles and cover other polys, we have to multiply that by 3 and take a number of 30. as has been mentioned, the kyro2 at 175MHz incorporates a 5.6 GigaPixel (32 pixels x 175MHz) unit which "sorts" the polygons in each tile. with those numbers we can do some nice maths. :)

    we have a "sorting unit" capable of 5.6 GPixel;
    we have to compute each pixel 30 times (15,000/1536 x 3):
    5,600,000,000 / 30
    and we have a screen resolution of 1024x786:
    5,600,000,000 / 30 / 1024 / 786 = ~230fps

    what comes out is that the kyro2's "sorting unit" at 175MHz could push 15,000 polys 230 times per second.

    the same thing with 100,000 polys:

    5,600,000,000 / (100,000/1536 x 3) / 1024 / 786 = ~35fps

    it is interesting what happens if we change the resolution to 1600x1200, which would be 3750 32x16 tiles:

    5,600,000,000 / (100,000/3750 x 3) / 1600 / 1200 = ~35fps

    that's because we have more tiles but on average there are fewer polys in each tile.

    So where is the issue of not having T&L ?
    more T&L (more polys) will give the kyro a boost if it is coupled with a decent cpu ...
    <P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/29/01 10:11 AM.</EM></FONT></P>
  19. waheyyyy! "My dad's better than your dad" type bickering in full effect. :D Beautifully off-topic.

    Much as I dislike most of naziVidia's business practices (yeah, I know other big companies do the same, so what, that makes it alright?), the GeForce has proven itself to be a somewhat decent and successful series I would say, and it certainly looks to carry on this way into the 3rd GF generation. nV wouldn't be the big bad monster it is now if the GeForce hadn't taken them there, and it wouldn't have taken them there if it hadn't been such a good product range in the first place.

    As for Kyro II, well I can't say I have many complaints with it. Tile based rendering is a fantastic idea. It does seem funny seeing all the rockets continually being strapped onto brute force rendering, then watching (or not watching as the case may be) all that power being thrown at polys/textures that can't even be seen, especially when we see now that it is just not necessary to do that, and we can see what a difference it makes.

    Anyone who dismisses the ideas that the K2 and TBR bring to the scene with casual, laughing abandon, deeming its technology as having no value, is only fooling themselves; the same goes for the rose-tint-bespectacled GeForce luddites, who simply can't stand the concept that someone other than nVidia just might come up with a good idea. It does happen, you know.

    So please stop the arguing. Thank you.

    Back to the topic, 3DFX technology in an nVidia chipset can only be a good thing. I think someone commented that if their tech was so worthy of note, they wouldn't have gone down, but not so. Their technology certainly is good. What took 3DFX down was the farcical manufacturing problems of their flagship, the Voodoo5. Instead of manufacturing 500,000 or so chips, I think they managed to manufacture about... oh, 6 or 7. V4/5 hit the market ludicrously behind schedule, and in this brutal business that is fatal.
    I wouldn't say it's game over for Matrox though, they seem to target and win favour with a different audience.

    So yeah, NV-30=good. NV-30+3DFX=very good. But NV-30+3DFX+TBR ...? NOW ya got me reaching into my wallet, ready to fork out the 50,000 USD price tag nVidia will slap on it. ;)
  21. I forgot that I even wrote that stuff; why did you dig this thread up? The Kyro2 is an outstanding card, but it doesn't compare that well to the GF3; like all the other video cards, it is a generation behind. Now, this thread was about the NV30, something beyond the GF3 that very few people know anything about, and those who do are not talking, at least yet. Maybe a lot of speculation, but no hard facts. Even more frustrating is the lack of information about the Radeon2 when it is supposed to be out in a few months.
  22. yeh yeh I know, just catchin' up. :P

    Speculation but no facts, sounds par for the course... No doubt some very impressive-looking documents will be "leaked" at some point. "Oh, whoops-a-daisy, we accidentally allowed this document full of impressive features/specs to be viewed." I guess amongst the myriad of dreaming and speculation, the design is still probably largely up in the air, so I suppose there may not be that many solid facts yet. Except that we all want one, of course.

    And yes, nV's competition seem to be confidently beavering away, but they're sure doing it quietly. I guess ATI will begin their song 'n' dance campaign when they see fit. In the meantime, we just sit 'n' twiddle our thumbs.

    I wonder what 3DFX's next super uber-card would have been like, had they not utterly shot themselves in the foot... :(
  23. I have a better handle on the Radeon 2 specs; The Inquirer had info about the NV25 in my other thread. Looks like the NV30 is way off in the future. That could be good if it incorporates the newest, greatest stuff.

    <P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 07/06/01 10:47 PM.</EM></FONT></P>