
Kyro 2: the killer of Nvidia???

Anonymous
April 4, 2001 9:06:06 PM

Hum...
this new chip is very, very good...
look at this...
http://www.geocities.com/johnrreynolds2000/the_truth.ht...

and this:
http://www.voodoosource.net/
or this:
http://www.theregister.co.uk/content/3/18092.html

Nvidia is afraid...
So this is a great product...
I will buy a new Kyro 2!!!
maybe a Hercules-based card...
;) 


April 4, 2001 9:21:51 PM

Well, I've read a few reviews about it and it doesn't look SO good. Of course, it's a lot cheaper, but if you want top quality you've gotta stay with Nvidia. It all depends on your needs.

Fuzzy Wuzzy was a bear, Fuzzy Wuzzy had no hair...
Anonymous
April 4, 2001 9:31:28 PM

A promising card, as far as the technology principle goes. More like a proof of concept; there's still a way to go before it beats the GeForce. Kyro III, maybe? The Kyro II is not that impressive yet: it only succeeded in one or two benchmarks and was mediocre in the others.

Leo
Anonymous
April 4, 2001 10:54:26 PM

It looks pretty decent in terms of price/performance. The technology behind it is kinda weird/cool, but it's nothing for Nvidia to be piss-pantsing themselves about.

It's better to be pissed off than pissed on :) 
Anonymous
April 4, 2001 11:52:17 PM

Well, yes. I, for one, agree that having competition will only make things better. I'll be on the lookout for developments in the Kyro family.

Leo
April 5, 2001 12:49:46 AM

I am waiting for this cool chip to hit the streets. The Geocities article and its breakdown (supposedly by nVidia) is immaterial; the chipset will prove itself, or not, with the market and game developers. The benchmarks for this very intelligently designed gem, with only some 15 million transistors compared to the 50-something million the GF3 sports, are really funny, especially when comparing reviewers' real-world benchmarks against the GF3. I'm not saying the Kyro II beats the GF3, but the question becomes: is the GF3's performance worth the extra $200-$300? I hope other card makers give their customers, current and future, a choice: not only the Kyro II design but also S3. What would a supercharged S3 Savage 2000 chip do? And if ATI would sell more chipsets to card makers, an even broader choice would be available to us, and the days of $400-$500 video cards would be over. :cool:
Anonymous
April 5, 2001 1:27:44 AM

Not good????
:)
For $150 it's good for me...

This card can achieve better results even with a lower average fps...

Let's focus on what this Kyro does that a traditional card doesn't:

it does not draw any pixel that will be obscured by something else. So while the others drop to 10-15 fps in more complex scenes (even if their average fps is above 50; that is one of the reasons to want more than 100 fps with a traditional card), on this card the drop will not become visible when the scene requires more fill rate. This card will achieve better results, at least in my eyes (a higher minimum fps). I don't care about average benchmarks...

By the way, even on the average benchmarks the Kyro wins some against a card that is $400 more expensive...
I must say this:
with the money I would spend on a GeForce 3
I could buy 3 generations of Kyro...
So if you want the best card out there, go ahead, buy the GeForce 3.
I will buy the best price/performance card out there...
;)
Sorry about my poor English...

P.S.
If developers take this Kyro into account, they can also apply optimizations like...
NOT DOING IN SOFTWARE WHAT THIS CARD DOES IN HARDWARE (culling "some" of the triangles that will not be visible in software...)
IMAGINE THE PROCESSING POWER UNLEASHED BY THIS OPTIMIZATION!!!
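
A minimal toy model of the overdraw argument above (made-up depth values, not the actual Kyro pipeline): an immediate-mode renderer shades every draw that passes the depth test, while a deferred tile renderer resolves visibility first and shades each pixel once.

```cpp
// Toy model: fragments covering one pixel, drawn back-to-front
// (the worst case for an immediate-mode renderer).
#include <cstdio>
#include <vector>

int main()
{
    // Depth of each opaque surface covering the pixel, in draw order.
    std::vector<float> draws = {0.9f, 0.7f, 0.5f, 0.3f, 0.1f};

    // Immediate mode: every draw that passes the depth test gets shaded,
    // even if a later draw ends up covering it.
    int immediateShades = 0;
    float zBuffer = 1.0f;
    for (float z : draws)
        if (z < zBuffer) { zBuffer = z; ++immediateShades; }

    // Deferred tile renderer: visibility is resolved for the whole tile
    // first, so only the surface that is actually on top gets shaded.
    int deferredShades = 1;

    printf("immediate-mode shades: %d\n", immediateShades); // prints 5
    printf("deferred shades:       %d\n", deferredShades);  // prints 1
    return 0;
}
```
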
April 5, 2001 4:06:18 AM

The reviews make the Kyro II look like a mixed bag, with some good attributes and some less good. However, a very interesting feature is that the Kyro II's FSAA comes with NO performance penalty, meaning pretty good performance and excellent video quality, all in a budget-oriented card.
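
A rough back-of-the-envelope sketch of why that can be (assumed numbers, ignoring z and texture traffic): a traditional card pushes every FSAA sample through off-chip framebuffer memory, while a tiler keeps the samples in its on-chip tile buffer and writes out only the resolved pixel.

```cpp
// Back-of-the-envelope FSAA color traffic per frame (assumed numbers).
#include <cstdio>

int main()
{
    const long long pixels  = 1024LL * 768;  // screen resolution
    const int samples       = 4;             // 4x FSAA
    const int bytesPerPixel = 4;             // 32-bit color

    // Traditional renderer: every sample lives in off-chip memory.
    long long traditional = pixels * samples * bytesPerPixel;
    // Tile renderer: samples stay on-chip; one resolved write per pixel.
    long long tiler = pixels * bytesPerPixel;

    printf("traditional: %lld bytes/frame\n", traditional); // 12582912
    printf("tiler:       %lld bytes/frame\n", tiler);       //  3145728
    return 0;
}
```
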
Anonymous
April 5, 2001 5:42:34 AM

Agreed, the Kyro II has many good things about it, but also some questionable attributes that can only be tested when we see a final version of the PCB and official drivers. Mid-April seems too far away :)
April 5, 2001 6:54:11 AM

I'm anxious to hear about personal experiences with the Kyro II when it becomes available. I've been looking for a budget card, and unless the Kyro II is significantly better I will probably go with the Radeon LE. Price will make a difference.
April 5, 2001 9:18:30 AM

I expect the Kyro II to go the way of the S3 Savage4. It will probably be only a matter of time before ATI, NVIDIA, and Matrox adapt tile-based rendering to their cards. Not now, but maybe for the next generation of graphics cards. (Radeon LE2, GeForce 3 MX, G600, or whatever they will call them.)
Let's hope Kyro can keep up with (or stay ahead of) the competition so that they (NVIDIA, ATI, Matrox) are forced to put out better, faster, and cheaper cards.

Believe me, if it ain't broke, don't fix it.
Anonymous
April 5, 2001 10:59:56 AM

I doubt nvidia is seriously frightened by the new Kyro card or plans on aggressively pursuing tile-based rendering (although I believe that was in the 3dfx bag of goodies).

"Beyond3D : Can traditional renderers overcome the problems that Tile-Based solutions solve through their structure? Like the rendering of 8-layer multitextured pixels that will never be visible, the horrible memory access patterns with smaller and smaller polygons (where many OD algorithms like early Z and hierarchical systems also fail to work effectively), the huge memory abuse when doing AA, costly stencil procedures, expensive memory-readback when doing multi-pass effects? Its easy to nag about the buffering issue that tilers might or might not have in the future, what about all these bottlenecks in traditional systems? Not to mention the issues when using more accurate frame buffers, 64bit floats?

Croteam : Tile-Based rendering can be a very good solution for the present time. But I don't think that it will hold much longer. Brute-force approach with its power, has already hit the limits of the monitor resolution. It all comes to two things: either developers will completely embrace tile-based rendering and adopt their engines to that, or we'll all stick to simplier brute force solutions. Tile based rendering will always be faster than brute-force, but who needs (potential) complications of TBR, when brute-force approach is already fast enough.

MadOnion : The technology is not as important as the end result. Currently best results have been achieved with the "traditional" 3D accelerator types. Both can be made to work, but a tile based system may probably be more cost-effective in the long run.

Basically, game developers could not care less how it's made if it renders fast, has good feature set and they don't need to think about any special cases.

NVIDIA : There are pros and cons to any architecture. I believe in the future, we are moving toward more and more geometry, as well as more and more per-pixel shading and computation. Both of these directions require more muscular and powerful pipelines. Tile-based renderers don't address these needs. The optimization provided by a tile-based renderer is that occluded pixels that don't contribute to the final picture also don't contribute to the bandwidth consumption at the memory. In the limit, a tile-based renderer optimizes out that redundance, and provides only a minimal impact to buffering and re-scanning of command streams and geometry. In the limit, a conventional renderer with occlusion culling has exactly the same performance. In each case, we have separated the visibility (what is on top) from the shading. "

from http://www.beyond3d.com/interviews/croteammonv/index1.p...
Anonymous
April 5, 2001 11:56:06 AM

Well, did you know that most of the patents are in the hands of the guys who produced the Kyro
(Imagination Technologies)?
so..
yeah..
the others could produce a tile-based rendering chip, but:
1. they would have to pay royalties to Imagination
2. it took years of research to produce a tile chip this good...
3. we all know that in marketing, specs are almost everything (MHz etc...)
and this card achieves better results than chips clocked way higher, with speedier memory.. and that sells too...
If you see one card clocked at 175 and another at 250, which card would you choose???
the better one, or the one with the better specs???
Many would choose the better specs...
and Nvidia knows that...
so...
And about the Kyro: do not compare the Kyro with the Savage...
that makes me laugh...
Did you ever play a Dreamcast or a Sega arcade machine???
The newer ones are based on older Imagination tile chips, with transform and lighting added...
so...
Anonymous
April 5, 2001 12:05:34 PM

Yep, it is significantly better....
at least at higher resolutions..
1024x768 and up
The GeForce GTS cannot do better in Quake 3, for example...
look at this:
http://www.anandtech.com/showdoc.html?i=1435
that will answer you.
Note that this was a pre-production chip with a pre-production driver (very stable nevertheless)
it will get even better...
that is why Nvidia is getting afraid..
please visit the site that I told you about..
In May or late April we will get the newer cards, unless Nvidia achieves what it is trying to do (in the PDF that I told you about....)
http://www.geocities.com/johnrreynolds2000/nvidia_on_ky...
and we get few cards in the shops...
few means higher prices...
:(
April 5, 2001 3:04:22 PM

And that's the shape of things to come. All 3D games from now on will utilise T&L to some extent, and the Kyro will be crippled by its weaknesses. Come on, it's just a faster Kyro, with no technological improvements.

As far as I know, 3dfx was researching hidden surface removal. nVidia now owns all of that and will perhaps introduce it in the next product cycle. If they do, Imagination Tech won't get a penny off them!

It's the same problem all over again: the PowerVR couldn't stand up to the 3dfx Voodoo, and the PowerVR2-based Neon was again far weaker than the competition. They need to include T&L at a minimum on the next card. I don't think the Kyro 2 is going anywhere big.


"2 is not equal to 3, not even for large values of 2"
Anonymous
April 5, 2001 4:39:32 PM

Do you think T&L will get implemented in future games?
Yeah!!!
but for the GeForce 3...
the T&L of the GeForce 3 is way different from the fixed T&L of the GeForce 1 & 2...
:)
your GeForces will have to do that T&L in software...
lol
Evolva is functioning great with my Kyro I... in Windows 9x, of course...
maybe there's a bug in the game under Windows 2000
or in the drivers... who cares...
99% of games run well on Windows 2000
;)
on a KYRO I...
I think the Kyro II will be better, don't you think??
;)
April 5, 2001 4:47:41 PM

Well, the ATi Radeon has:

a form of tile-based rendering (not as good as the Kyro's)
HyperZ (the best solution for a z-buffer)
T&L (nice to have around)

The Radeon 32MB DDR is on the same level as the GeForce2 GTS and costs $130. Cheap. The new drivers rule.

!!! Leader of the Anti-via army !!!
April 5, 2001 4:52:59 PM

You cannot do T&L in software, as it is just the name given to performing the geometry and lighting calculations in hardware. You can perform these in software, but only at a fraction of the performance.

Quote:
the T&L of the GeForce 3 is way different from the fixed T&L of the GeForce 1 & 2...
:)
your GeForces will have to do that T&L in software...

You mean nfiniteFX. Those features won't be used on anything below the GF3, but the T&L will run just the same. So the performance and quality won't be anywhere near the GF3's, but it will be completely playable and way faster than the Kyro.

"2 is not equal to 3, not even for large values of 2"
Anonymous
April 5, 2001 5:56:51 PM

I'm glad to see another option in the video board market. I also hope the Kyro II is just good enough to get Nvidia to start thinking about pricing. Their GF3 at $549 is ludicrous, as 3 months later you won't be cutting edge anymore.
Between ATi's 64MB Radeon now priced at $199 and the new Kyro also at a much lower retail cost, maybe the same thing we see AMD doing to Intel will happen here: force pricing to be competitive. Just my 2 cents =)
Anonymous
April 5, 2001 6:41:19 PM

The GeForce 3 and Radeon incorporate the same technology, i.e. the ability to ignore surfaces hidden by others.

~ I'm not AMD biased, I just think their chips are better. ~
Anonymous
April 5, 2001 8:39:47 PM

Maybe T&L is a great thing...
like 8-layer multitexturing... did you know that Doom 3 will support 8-layer multitexturing?
I wonder if the GeForce 3, 2, etc. will be that great in an 8-layer multitexturing game
;)

By the way, I wonder why the Kyro with its new drivers beats the GeForces in Evolva??
Is it because of T&L??
I WONDER??
;)
Anonymous
April 5, 2001 8:48:43 PM

Yep..
look at these benchmarks of this new game...

http://www.anandtech.com/showdoc.html?i=1435&p=14

not bad for a $140 card!!!
April 5, 2001 9:27:23 PM

The damn game is being designed to be optimal on the GF3 platform. The Doom 3 preview is being used as the primary graphics demonstration for the Apple/GF3 combination.


"2 is not equal to 3, not even for large values of 2"
April 5, 2001 9:34:19 PM

That's what sucks about nVidia cards: each chip is a more advanced version of the one before. It's like taking a 1972 Belvedere and adding a 426 Hemi, then a Dana 60, then a Lenco transmission, then a blower, then nitrous, then a bigger racing fuel cell so you can go cruising, then a big aluminum radiator so it doesn't overheat when you go cruising, then trying it out at a Trans-Am racing event. The GeForce series never ditches the inefficient stuff from its previous incarnation, so it is big, uses too much power, and runs hot. Eventually you get to the point where you need to start with a clean sheet. BTW, even with its older parts left in, it's still the most powerful gaming card. But I think they could get the same performance from a new, efficient, simplified design.

Suicide is painless...........
Anonymous
April 5, 2001 9:46:35 PM

"looks like kyro get ripped by radeon and geforce mx...
on evolva..."
(that was with an earlier driver)

look at what happens now
this is with T&L enabled...

I don't know german but look..
at these benchmarks:

http://www.rivastation.com/review/3dprophet_4500/3dprop...

;) 
it beats the geforce 2 GTS in 32 bits on a game that have T&L support funny !!!
I wonder why Tom are avoiding to talk about this new cheap new product..
<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/05/01 06:18 PM.</EM></FONT></P>
April 5, 2001 10:27:10 PM

The GeForce 256 DDR needs 18W of power; the GeForce 2 GTS needs just over 6W.

The GeForce 3 is new technology.

The Kyro 2 is just a beefed-up Kyro, like the GF2 Ultra is to the GTS.


"2 is not equal to 3, not even for large values of 2"
Anonymous
April 5, 2001 10:29:55 PM

Who cares if Tom's ignores the Kyro 2...
it will be a great product...
in fact IT IS!!!
Anonymous
April 5, 2001 10:34:48 PM

I think the Kyro uses only 4 watts or less..
;)
Yep, the GeForce 3 is new technology...
yep it is...
traditional rendering is so "new" that it was already in use in the 1950s...
Anonymous
April 5, 2001 10:46:30 PM

:)
I could buy 4 Kyro IIs for the price of 1 GeForce 3.

Damn!!! Nvidia needs its ass kicked hard enough to stop them from bringing out another $600 card...
and I think the Kyro 2 will do that!!!

even if Nvidia could bring us the fifth dimension..
LOL
$600
LOL
April 6, 2001 12:21:47 AM

Maybe, but wouldn't that be because of a transition to the .18 micron process? (I think the original 256 was .25 micron maybe?)

Suicide is painless...........
Anonymous
April 6, 2001 12:39:10 AM

Beta???
Imagine: the Kyro 2 will get even better...
Anonymous
April 6, 2001 12:44:54 AM

I saw this on a forum at Beyond3D...
this guy is from Croteam:

"
------------------------------------------------------------
Originally posted by Reverend:
I've asked one of the Croteam programmers to join in on this thread and he said he would. Specifically to address this TBR "controversy" and Serious Sam (which basically started with his answer to one of the questions in the Croteam/MadOnion/NVIDIA joint interview here at Beyond3D). He may also post something about TBR and Serious Sam on Croteam's homepage.
Let's hope we don't have to wait too long (hear that, Dean?)
------------------------------------------------------------

Gee, Rev, you don't know when to quit, eh?

Anyway, I was thinking of writing something about this issue in the news section at our site (www.croteam.com).
In short: when the PVR guys sent me the latest internal driver, I was AMAZED. You see, I always thought the driver had to have some overhead in the glSwapBuffers() portion because of this controversial deferred rendering method. And it used to. A lot of overhead (30% of all rendering). Until the latest driver (soon to be out, I hope).
The point is... my statement about how TBR can be very complicated to efficiently implement (from the driver side) has gone down the drain. And I'm glad about it!
Welcome Kyro (not just Kyro2!).
Just my 2 cents...

bye
DEN

P.S. I don't know whether AnandTech tested with the latest internal driver (the one I have) or with the latest public one. If the latter, then the Kyro guys can expect a further speed boost, especially at lower resolutions!"
April 6, 2001 10:58:40 AM

The Kyro 2 is a decent card, like the 3dfx cards were. But these days, without T&L a card can be deemed almost nothing. Remember when 3dfx first came out: all the developers were writing for the Glide API, not PowerVR or even Direct3D, because it was much better than the other two.

Same here: hardware T&L is way better than traditional geometry and lighting. With the programmable shaders and renderers, once those have been programmed, doing the same for non-T&L hardware will be secondary and will not receive as much attention.

When the Xbox is released, any game designed for it will be using the same API as the GF3's DX8 on the PC. Any needed conversions will only be for other cards, which again won't receive the same attention.

The Kyro II is stuck in the DX7 world. When games start asking for DX8, it will be in deep [-peep-], taking all the Kyro users with it.

The way I see it, you saw one benchmark where it is doing well, on AnandTech, where the reviewers couldn't do anything but praise the card even though it only did well in one game out of so many. And you think it is capable of wiping the floor with the competition. Even the AnandTech people could only say it is a good 'low budget' card.

What I think is, you are either just like Fugger and AMDMeltdown, or you are probably an Imagination Tech employee trying to anonymously create some product awareness. Why else would you choose a name like powervr2? It does restrict you to one area with one opinion!



"2 is not equal to 3, not even for large values of 2"
Anonymous
April 6, 2001 6:17:48 PM

I am not an Imagination employee...
I would like to be :)
I am António Carlos Vitor.
I live in Lisbon, Portugal, and my email is antonio.vitor@teleweb.pt
I am not an anonymous person...

I just like cheap, good products..
like AMD and SDR/DDR etc...

DO YOU know that the only card that supports every DirectX 8 feature is the GeForce 3???

Do you know that the transform and lighting task is easily ported to the CPU (the transform and lighting of the older GeForces, not the T&L of the GeForce 3)?
On a 1 GHz Athlon, T&L in software would take at most about 20% of the CPU's power.
The T&L of the older GeForces is easily emulated by a good CPU (I am not talking about software emulation of the new GeForce; that's possible, but... 1 frame per second is not viable)
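
To make concrete what "T&L in software" means, a minimal sketch (hypothetical matrix and light values): the CPU transforms each vertex and evaluates one directional diffuse light, which is the fixed-function work a GeForce 1/2 does in hardware.

```cpp
// Minimal software T&L for one vertex (hypothetical values).
#include <algorithm>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Transform a position by a row-major 4x4 matrix (w assumed to be 1).
static Vec3 Transform(const float m[16], Vec3 v)
{
    return { m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3],
             m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7],
             m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11] };
}

int main()
{
    // Translate by (2, 0, -5); identity rotation.
    const float world[16] = { 1,0,0,2,  0,1,0,0,  0,0,1,-5,  0,0,0,1 };
    Vec3 pos     = {0.0f, 1.0f, 0.0f};
    Vec3 normal  = {0.0f, 1.0f, 0.0f};   // unit normal
    Vec3 toLight = {0.0f, 1.0f, 0.0f};   // unit vector toward the light

    Vec3 p = Transform(world, pos);                       // the "T"
    float diffuse = std::max(0.0f, Dot(normal, toLight)); // the "L"

    printf("transformed: (%g, %g, %g)  diffuse: %g\n", p.x, p.y, p.z, diffuse);
    return 0;
}
```
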

Do you know that if they implement the GeForce 3's T&L in an application/game, that will require all the other cards (GeForce 1, 2, Radeon etc..) to run that T&L in software?

DO YOU KNOW THAT THERE ARE MANY PEOPLE OUT THERE WHO DON'T WANT TO SPEND $600 ON A CARD?
YEAH, THE GEFORCE 3 IS GOOD... BUT AT WHAT PRICE???
April 6, 2001 7:11:41 PM

Don't you think there are advantages to the tile-based architecture? The Kyro II does not have to waste time rendering hidden surfaces as other cards do. Granted, the CPU must spend more time handling lighting and transforms, but graphics cards are already the limiting factor.

The Kyro II doesn't have T&L support, but it renders efficiently, so who's to say what the real-world performance will be?

Besides, no one is trying to compare the Kyro II to a GeForce 3 or even an Ultra, with the Kyro II likely to be priced closer to a GeForce 2 GTS, possibly even as low as a GeForce 2 MX.
April 6, 2001 7:59:36 PM

The GF3 does have hidden surface removal. That is why it can be about 2-7 times faster than the GF2 Ultra (nVidia claims): 2x in simple scenes, 7x in complex ones.
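
Continuing the toy overdraw model from earlier in the thread: with early-Z style rejection and geometry arriving roughly front-to-back, an immediate-mode renderer also shades each pixel about once, which is the sense in which occlusion culling narrows the gap to a tiler.

```cpp
// Toy model: the same five opaque surfaces, now submitted front-to-back.
#include <cstdio>
#include <vector>

int main()
{
    std::vector<float> draws = {0.1f, 0.3f, 0.5f, 0.7f, 0.9f};

    int shades = 0;
    float zBuffer = 1.0f;
    for (float z : draws)
        if (z < zBuffer) { zBuffer = z; ++shades; }  // later draws fail the z-test

    printf("front-to-back shades with early Z: %d\n", shades); // prints 1
    return 0;
}
```
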

What I am saying is that the Kyro II is a good card now. But in the near future it is going to be dead. What happens when the screen contains a huge number of polygons with huge textures and bump mapping? Tile-based rendering cannot rescue the card.

It has only one good feature, and they are trying to sell it on that one good feature. There are two things that can happen now.

1. People will see sense and not buy it. Imagination Tech in deep [-peep-]!

2. People will actually buy it and trash it six months later. The customers in deep [-peep-]!

Unless they keep upgrading their CPUs to keep up with ever more complex games. And trust me, there will be complex games, i.e. Xbox ported games. Like I've said a hundred times before, developers won't like removing features to accommodate inferior platforms such as the Kyro II.

powervr2:
Go find out what transform and lighting actually is before trying to reply to my post.

"How much wood would a wood chuck chuck if a wood chuck could chuck wood?"
Anonymous
April 6, 2001 8:58:05 PM

Nice post....

but I liked the old sig better.
April 6, 2001 9:22:13 PM

The Kyro II is smart technology that will get the job done elegantly. If T&L is not available in hardware, isn't it automatically done in software by the CPU? Some benchmarks actually improve by using software T&L instead of hardware.
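
For what it's worth, this is how DirectX 8 exposes that choice. A hedged sketch assuming the DX8 SDK (d3d8.h; ChooseVertexProcessing is a made-up helper name): the game checks the device caps and falls back to software vertex processing when the card reports no hardware T&L.

```cpp
// Sketch (assumes the DirectX 8 SDK, d3d8.h / d3d8.lib): pick hardware
// vs software vertex processing for IDirect3D8::CreateDevice.
#include <d3d8.h>

DWORD ChooseVertexProcessing(IDirect3D8* d3d)  // hypothetical helper
{
    D3DCAPS8 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return D3DCREATE_SOFTWARE_VERTEXPROCESSING;

    // Cards with hardware T&L (GeForce, Radeon) set this cap bit;
    // a Kyro-class card does not, so D3D transforms and lights on the CPU.
    if (caps.DevCaps & D3DDEVCAPS_HWTRANSFORMANDLIGHT)
        return D3DCREATE_HARDWARE_VERTEXPROCESSING;

    return D3DCREATE_SOFTWARE_VERTEXPROCESSING;
}
```
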

<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 04/06/01 08:02 PM.</EM></FONT></P>
April 6, 2001 9:59:46 PM

OK...


"2 is not equal to 3, not even for large values of 2"
April 6, 2001 10:50:24 PM

Quote:
What I am saying is that the Kyro II is a good card now. But in the near future it is going to be dead. What happens when the screen contains a huge number of polygons with huge textures and bump mapping? Tile-based rendering cannot rescue the card.

It has only one good feature, and they are trying to sell it on that one good feature. There are two things that can happen now.

1. People will see sense and not buy it. Imagination Tech in deep [-peep-]!

2. People will actually buy it and trash it six months later. The customers in deep [-peep-]!


I see your point, but I do think the Kyro II has two things going for it: tile-based architecture and no-penalty FSAA.

I just have a question. When (not counting the days when there was only one 3D card) has the highest-performing card also been the number-one-selling card, sitting in the majority of gamers' computers?

It has never happened. The majority of sales have always been older-generation or budget-minded current-generation cards.

Now, if you were a developer planning on a 2-3 year cycle, would you develop exclusively for technology that is still on the drawing board, hoping it will be in the majority of users' computers at release time, or would you develop for technology that is already available?

What I am saying is that the games being released over the next year will work fine. In a year, maybe. In two years, probably not. However, at that point I will be looking for another graphics card anyway. This means that with the card I buy today, I can enjoy the games I can buy today. I don't need to buy the best technology, because the best technology will always be ahead of what games demand.

How frequently do you upgrade your 3D cards? I mean, the GeForce 256 was announced in July 1999 and released in November 1999. It's now 17 months later and we are about to see the third generation. That's what, an 8-1/2 month cycle, is it not? At that pace, 4-6 hardware generations can come and go within the time frame of one software cycle.

It just doesn't seem at all practical to try to buy for the future.
April 6, 2001 11:39:08 PM

Game developers sometimes get samples of the cards before they are released. The manufacturer is only too keen to provide samples if they think it will demonstrate the potential of the card. Look at Doom 3: the game is miles from completion, yet there are demos available to the manufacturers (nVidia and Apple), and they look good too!

There are three games already on release that support the GF3 using DX8. Do you remember when 3D games were brand new and nobody had 3D hardware? The same happened again when 3dfx came out: all these games supported Glide but no other hardware API. Some games did allow software modes, but they were excruciatingly slow even on the fastest CPU. Games these days have huge budgets; the lower the development time, the lower the cost. There are so many games of the same genre out there that companies fight to release games early, often with several bugs. So they have to make choices, and something tells me they'll choose to develop the T&L and programmable-shader paths before the CPU-based geometry and lighting calculations. There will be more time and attention spent on the hardware-based features; the bugs get left in the CPU (software) based geometry and lighting.

I still have my GeForce 1, and I don't plan to buy the GeForce 3 unless I feel like doing some development work on it, or Doom 3 gets the better of me. But if I had to buy a graphics card soon, I'd get the GeForce 3 if I had the budget, or a GeForce 2 or Radeon if my budget were limited. Personally, I'll be waiting for different flavours of the GeForce 3 and for what ATI has to offer in the same area.

I would also like to see other companies, even Imagination Technologies, get something out using the same type of technology, such as programmable shaders.

One potential problem is that it is highly likely the programmable shaders of different companies will have different standards and command sets. I'm not sure if DX8 takes that problem into account (or if it is anything to do with DX8 at all).

Anyway, if different manufacturers used the same type of tech, it can only improve the market and the competition. Of course there should be innovation, but if it means struggling to release a product on time while ignoring the current state of technology and the future-proofing of the product, it can only damage the company itself. Imagination Tech has been repeatedly making the same mistake since the Neon (PowerVR SG).

"2 is not equal to 3, not even for large values of 2"
April 7, 2001 7:44:30 AM

Very eloquently spoken.

You may be right, and it may also be possible that the future of 3D gaming becomes very dependent on T&L. On the other hand, some other feature may be invented that overshadows all we have seen. That too would tend to leave the Kyro II based cards (and any other current card) behind, but that doesn't matter. If a card works well enough now, then I am happy. Tomorrow I will just have to buy something else. That something else may be one of those "flavors" of the GeForce 3 you mentioned, if it's cheap enough, or something not yet seen, but in the meantime I have to use something. I might just stick with my GeForce SDR, or I might buy a Radeon LE, or I might go with the Kyro II, but I don't think I will ever spend $550 on a graphics card. I will probably always be on the trailing edge. What do I care if I enjoy things 1 or 2 generations behind everyone else? I will still enjoy them (eventually).
Anonymous
April 7, 2001 9:40:12 AM

None of us writes games for a living? Well, let's see what some actual game programmers ARE saying:

"For Unreal and UT it helped to have a good video card, but we weren't able to take full advantage of it because of the software renderer--though so many more people were able to play the games because of that. But in Unreal II we can increase the polygon count by a factor of 100 because there's no software mode." --Tim Sweeney, lead programmer at Epic Games

"There is a video card gap right now between high and low end. Basically, as soon as you cross the hardware T&L barrier, you literally can triple frame rates. To make a game that scales across both is very hard, and frankly, by the time we come out, if we don't use hardware T&L we'll look like crap." --Raph Koster, Star Wars Galaxies' Creative Director for Sony Entertainment Online (and former lead designer of Ultima Online)

Will T&L be mandatory in future games? Looks like it. Will it be mandatory in games coming out in the next year? Not sure, but less likely, I would say. Either way, by the end of this year you'd better have T&L.

Cheers,
Warden
Anonymous
April 7, 2001 11:31:28 AM

The T&L of the older GeForces (1 & 2) is crap, easily emulated by a CPU... if you use more than one light, it turns out worse than T&L in software: fewer fps...

The one that is good is the T&L of the GeForce 3...
So in a year's time we will see only games that require the GeForce 3???
that is curious...
In a year's time I will buy the next Kyro, with programmable T&L,
cheap...
say $150???
;)
Why does the Kyro beat the GeForces without T&L (at least the MX) in games that use T&L?
figure it out...


By the way, if they increase the polygon counts, that will benefit the Kyro..
why?
More complex scenes have more overdraw, so less effective fill rate for a "traditional" card and more fill rate left for the Kyro...
maybe I will not need a new video card in a year's time
;)

P.S.
He is talking about software rendering, for those who don't have a 3D card (not even a Voodoo 1)
Anonymous
April 7, 2001 11:47:47 AM

Hercules' response to Nvidia:

"They are right to be scared, 3D Prophet 4500 will really be a great product. Street Date: May 16th, 2001"
- Claude Guillemot, President, Hercules Technologies

sweet...
on May 16th I will get one...
April 7, 2001 3:22:03 PM

Who told you software lighting is quicker than hardware lighting?


"2 is not equal to 3, not even for large values of 2"
April 7, 2001 3:29:16 PM

Is the Kyro I that you have really so bad that you have to get the Kyro II? If the Kyro II had been released half a year ago, it could have stood a chance against the competition. Most gamers, if not all, who have something like a GeForce, ATi Radeon, or even a G400 wouldn't dare downgrade to a Kyro II. There is more to a card than just frames per second, ya know.
Now, to get back on topic: is the Kyro II the killer of Nvidia? The answer is no. The only thing STMicroelectronics has to show off in this card is tile-based rendering. "Whoopee"
Nvidia's GeForce 3 has hidden surface removal, lossless data compression, Quincunx antialiasing, and DirectX 8 support. I could go on about Nvidia's nfiniteFX engine, but I'm sure you get the idea.

I wonder if I should consider getting an Xbox....

Believe me, if it ain't broke, don't fix it.