Kyro 2 - Part 2
Tags: Graphics Cards, Graphics, Product
Last response: in Graphics & Displays
phsstpok
April 25, 2001 7:03:38 PM
(continuation from Kyro 2 the Killer of Nvidia ???)
<b>I hope the discussion will continue here as it was taking nearly 2 minutes to load the old thread!!!</b>
In my last post I posed the following to Teasy.
Hello, Teasy. I noticed that the Kyro II is already available in your country. Do you own one? It sounds like you don't, because you keep mentioning the Kyro 1. In any case, do the performance problems you describe regarding DX8 vs DX7 also apply to the Kyro II? (Sorry if I am asking for information that is already posted, but this thread is now too long to scan through in its entirety.) Have you tried the beta DirectX 8.1 drivers? You can find them at <A HREF="http://www.3dchipset.com" target="_new">http://www.3dchipset.com</A>
<b>Update:</b> It looks like the links for DirectX 8.1 downloads have been removed. Sorry.
<font color=red>I found another link for DirectX 8.1.</font color=red>
<A HREF="http://www.Gamers-Ammo.com" target="_new">http://www.Gamers-Ammo.com</A>.
<P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 04/25/01 03:42 PM.</EM></FONT></P>
Anonymous
April 25, 2001 9:16:44 PM
The Kyro II based 32MB Vivid!XS and 64MB Hercules Prophet 4500 are now listed for sale online in the U.K., but the first Kyro II cards still aren't in stock. Yes, the DX8 problem will also affect the Kyro II, but hopefully MS will fix it very soon. I have tried DX8.1 and it doesn't fix the problem, but then DX8.1 isn't finished yet, so it might be fixed by the time it's final. Hopefully I'll have more info on this soon.
I also think this DX8 problem is why the Kyro II doesn't do well in Aquanox. There doesn't seem to be any other logical explanation for it doing worse than the other cards, since they all use SW T&L in Aquanox because of the vertex shaders.
I just got this info from Matt of IMGTEC:
"This is exactly correct. We do support render to texture but DX7 does not have a flag for that so you need a hardcoded list. MS have a hardcoded list for non DX7 features which DX7 level drivers do support. So the solution is to update that list hopefully this will have occured by the time DX8.1 is officially released. We could also update our driver to export the render to texture cap but I have no info at present as to whether this will occur. Currently we are working with Microsoft to find a solution by the time DX8.1 is released."
So it looks like there will be a solution one way or the other by May or June, when DX8.1 is officially released. The Kyro II is released on May 16th in the U.S. (I think that's the right date), so hopefully this problem won't affect many people buying a Kyro II, since it should be fixed by then or shortly after.
Anonymous
April 25, 2001 9:58:21 PM
On that Tom's review:
<font color=red> "The benchmarks show that the Kyro II on the Hercules 3D Prophet 4500 is a thoroughly competitive product when compared to GeForce2 MX/GTS or the ATI Radeon, as long as you prefer 32-bit color. A real highlight is the price - for a mere $149, you get a 64Mbyte graphics card that gives its more expensive competitors a real run for their money" </font color=red>
Well, any card, even a TNT2, is able to deliver playable frame rates in 16-bit color below 1024x768, so I prefer better performance in 32-bit color and at higher resolutions, where the other cards start to deliver unplayable frame rates.
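To put the 16-bit vs. 32-bit trade-off in concrete terms, here is a small Python sketch (an editorial illustration, not from the original posts) that quantizes an 8-bit-per-channel gray ramp down to the 16-bit RGB565 format cards use for 16-bit color, and counts how many distinct shades survive the round trip:

```python
def to_rgb565(r, g, b):
    """Pack an 8-bit-per-channel color into 16-bit 5-6-5 format."""
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def from_rgb565(c):
    """Expand 5-6-5 back to 8 bits per channel, replicating the
    high bits so the range still spans 0-255."""
    r, g, b = (c >> 11) & 0x1F, (c >> 5) & 0x3F, c & 0x1F
    return ((r << 3) | (r >> 2), (g << 2) | (g >> 4), (b << 3) | (b >> 2))

# A smooth 256-step gray ramp collapses to far fewer distinct shades
# after a round trip through 16-bit color -- that loss is the banding
# you see on screen.
survivors = {from_rgb565(to_rgb565(v, v, v)) for v in range(256)}
print(len(survivors))
```

Only 64 of the 256 gray levels survive, which is why gradients band visibly in 16-bit modes.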
HolyGrenade
April 25, 2001 10:45:39 PM
The Kyro 2 may be a decent card, but it does not compete with the GF2 Ultra or the GF3. It was the suddenness of the comment they made; it was just unexpected, and it did make me laugh out loud. No, really! Not in a mad scientist way, though.
As for the evolution of graphics, the comment I made was from a technological point of view. I say this because, if people buy this card, game designers may again have to cater to it by restricting T&L operations. This will restrict the number of polygons and thus the game detail.
In a perfect world we would have fixed T&L, programmable shaders, tile-based rendering and some newer but essential features such as vertex processors and physics engines on board the current graphics cards. All of this technology is available, but the cost is far too high; putting all of it into one card could be rather expensive. But if it were to be done, it would take a while to catch on, as game developers would have to accommodate users of the older cards. Also, there always seem to be three groups of users: one group would try to get it at the first possible chance, some would rather get something that will run current-day apps at a decent rate, and the final group are the ones standing on the thin line in between.
A near Catch-22 situation is created when a product comes along and says it will run the current apps really well, in fact better than anything else in its cost category, but future apps are likely to be somewhat of a different story. This stops developers from jumping onto the new tech, which in turn makes the undecided group of people choose to buy something of the older generation. It does not create a deadlock, but it does slow everything down to a crawl.
This is what happened to the GeForce line of cards. The Kyro could have been designed with a T&L engine, which would have made it a far more credible card in my book. Of course, nVidia should take some action to include the Gigapixel technology in their chips ASAP. But then again, moving a technology from the drawing board to the production line usually takes about 18-24 months.
The GeForce DDR... I got that for £150 on pre-order. Budget or what! It was going for £220 inc. VAT at the time. I got it from a vendor known to me (an acquaintance of mine). I only got a 10-year warranty; I hear the Yanks got a lifetime warranty. Maybe in 2008 I should call up nVidia and say my GeForce card isn't compatible with the Super-Ultra-Tera HoloGraphics port and I need a replacement. ;-)
When I bought this card it was unanimously THE BEST card on the market, something which is not the case with the Kyro 2.
I read the Kyro2 review on THG. It seems to be one of the more flattering reviews; I thought you would be happy. Also, there is no point blaming Tom, because he only writes the English reviews. I think this is a translation of the German review.
I noticed I stopped using capitalisation in my sentences. I should stop now.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and dispair!"</font color=red>
phsstpok
April 25, 2001 11:18:49 PM
This is to everyone who has joined this new thread.
I thank you. The old thread was just too slow.
I wish I had something meaningful to add to the discussion but I don't. I will just sit back and continue to read. I'll ask questions when I wish to understand something.
I do find it fascinating that there has been such a heated debate over a video card to which none of us has yet had access.
I guess I do have a question to ask at this time. It is clear that the Kyro II is not the end-all graphics card and that, as with all graphics cards, it has a limited service life. With that in mind, what would each of you think is a good price point, if any, if one were to purchase the Kyro II at the expected release date in mid-May? Feel free to qualify your answer.
<P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 04/25/01 07:20 PM.</EM></FONT></P>
lhgpoobaa
April 25, 2001 11:47:27 PM
Anonymous
April 26, 2001 1:43:15 AM
noko
April 26, 2001 2:17:03 AM
Anonymous
April 26, 2001 4:22:42 AM
OzzieBloke
April 26, 2001 4:33:47 AM
No worries, Warden, but it didn't really matter... I'm on dial-up, but my settings list every ten messages, and each batch of ten gets a number... I just load up from where I left off and voila! But never mind.
To Teasy: As for the overdraw problem, I realised I kind of repeated myself with the pipeline statement after I posted, but I was too tired to bother changing it before going to bed.
Still, what I meant was, possibly a hardware solution that would force either front-to-back or back-to-front sorting of objects/layers/whatever to maximise use of the pipelines... Kyro's 8 is sufficient for maybe the next 2 years, but once games start coming out that begin using more passes, more textures, etc., you will still need a way to reduce the overdraw... or am I getting myself confused again? Bugger! That's what you get when a vet has a hobby interest in computers.
Cow with legs spread wide either dead or playing 'cello.
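The overdraw point above can be sketched in a few lines of Python (an editorial illustration under simplified assumptions: opaque surfaces, one pixel, a standard z-buffer). A z-buffered immediate-mode renderer shades every fragment that passes the depth test at draw time, so draw order matters, while a tile-based deferred renderer like the Kyro resolves visibility first and shades each pixel once:

```python
import random

def immediate_mode_shades(draw_order_depths):
    """Fragments shaded by a z-buffered immediate-mode renderer for one
    pixel: a fragment is textured/shaded whenever it passes the depth
    test at the moment it is drawn, so draw order matters."""
    shaded, zbuf = 0, float("inf")
    for z in draw_order_depths:
        if z < zbuf:          # passes the depth test -> gets shaded
            zbuf = z
            shaded += 1
    return shaded

def deferred_shades(depths):
    """A tile-based deferred renderer resolves visibility first and
    shades only the one surviving fragment per pixel."""
    return 1 if depths else 0

random.seed(1)
layers = [random.random() for _ in range(4)]   # 4 overlapping opaque surfaces

print(immediate_mode_shades(sorted(layers, reverse=True)))  # back-to-front: shades all 4
print(immediate_mode_shades(sorted(layers)))                # front-to-back: shades just 1
print(deferred_shades(layers))                              # always 1, order-independent
```

Which is why forcing front-to-back sorting helps an immediate-mode card, while a deferred design gets the benefit regardless of submission order.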
Anonymous
April 26, 2001 5:51:23 AM
Teasy,
Quote:
<font color=blue> I also think this DX8 problem is why the Kyro II doesn't do well in Aquanox. There doesn't seem to be any other logical explanation for it doing worse than the other cards, since they all use SW T&L in Aquanox because of the vertex shaders. </font color=blue>Are you sure about every card (except the GF3, of course) using software T&L in Aquanox? I understand your reasoning: since they don't have vertex shaders, they can't do the native T&L that Aquanox is written for. But I am unclear whether this requires ALL T&L duties to be done by the CPU. I have read rumors that the CPU only has to do part of the work (the vertex shader modifications) but that the hardwired T&L unit still does some or most of it. Do you have any more info on this?
Quote:
<font color=blue> When I said that performance would gain it acceptance I was saying that overall its the most important thing. Because your comment seemed to say that politics was the most important thing, which IMO it isn't. </font color=blue>Agreed. Performance is usually (hopefully) the most important thing, especially in the situation of the Kyro II, where big incompatibilities don't exist. We had just spent 35+ pages discussing the performance side, so I thought I would branch things off to include politics too. Never meant to say politics was THE FINAL WORD in the matter. :cool:
Quote:
<font color=blue> Also something to the Kyro II's advantage is 3dfx going down. </font color=blue>Also agreed. This created a hole in the market for another third player, one that probably does look better than NVIDIA to ex-Voodoo fans.
You also made some remarks about NVIDIA being the killers of 3dfx. I realize you may have been speaking more of the common public perspective rather than of your own opinions, but I wanted to get my $.02 in about it anyway. :smile:
3dfx killed themselves off. NVIDIA did not come along and gobble them up kicking and screaming--they sold to NVIDIA by choice to cut their losses, a process that took several weeks of negotiations. 3dfx had stopped making money for several reasons. Some of it was bad political moves on their part, when they alienated all their card manufacturers by deciding to make the cards themselves. Later they realized they couldn't pull off the card-making side of things, but all the card manufacturers had lost trust in them and were happily making NVIDIA cards by then anyway. Some was marketing: NVIDIA did a better job of making their cards sound like you HAD to have them. Some was the market decline that has been hitting the whole industry. And a pretty large part was performance too: 3dfx sat around on their 16-bit Voodoo 2 technology for too long instead of bringing their Voodoo 5 technology to market. Anyway, in the end 3dfx was dead long before NVIDIA bought them. Many people saw their demise coming a good year in advance.
Well, I was going to say a bunch more about the NVIDIA flame thing but I think we understand each other. I am glad to hear that IMGTEC is good to work with, though I am not surprised. Underdog companies are almost always friendly because they are trying to impress. After they become successful... well too often their tone changes, but I hope that doesn't happen with IMGTEC. We could use more companies that manage to be friendly <i> and </i> highly successful at the same time. :cool:
Regards,
Warden
chrisojeda
April 26, 2001 7:45:14 AM
Anonymous
April 26, 2001 8:04:39 AM
Anonymous
April 26, 2001 8:26:38 AM
If the GeForce 3 MX is bandwidth-limited like the present GeForce 2 MX, then it will not be a great product...
Holygrenade, what Teasy and I were saying all along was that the Kyro 2 can beat the Ultra in some games at higher resolutions and in 32-bit color...
Because of that, we can say that the Kyro 2 is competing with the GeForce 2 Ultra...
Anonymous
April 26, 2001 8:39:35 AM
Noko said in part 1 that:
<font color=red> Thanks for your outstanding reply (from Teasy). It's kinda funny that the Kyro2 does so well with a crippled DX8 hurting the Kyro2's performance. Which just means it will do even better when DX8 is fixed and addresses the problem that ties up the CPU with textures when the Kyro2 can do it many times faster on the card. I believe virtually all the benchmarks in all the reviews were done using DX8 alone, which means the true performance of the Kyro2 card will be much, much better.
If W2k drivers are as good as Win9x drivers then I highly recommend the Kyro2. Even the Radeon, after almost a year, has crappy DX7 drivers in W2k, as DX7 games run at like 60% of Win9x performance. So far the only DX8 item and test besides AquaNox is 3DMark2001, which the Radeon runs in W2k just as well as in Win9x, which is even stranger considering the lackluster performance of 3DMark2000 in W2k.
Still, the lack of T&L will hurt the Kyro2's performance with lower-end CPUs, which is indicated when an MX card could keep up with and surpass a Kyro2 in a Duron 700 machine. With higher-end CPUs, hardware T&L hurts the performance of those cards at high resolutions when there is a bandwidth problem. Sounds like smarter technology to me for the Kyro, because even with SDRAM it doesn't have severe bandwidth problems at high resolutions. Still, there are exceptions with T&L cards at high resolutions where a T&L card can add significant performance increases in rendering, i.e. 3D modellers.
Really, the low prices on GF2s are maybe clearance of older technology while the GF2 Pro takes its place. I believe the GF2 MX is a dead-end card with a short expected life. The MX doesn't compare to the Kyro2 nor to the lower-priced DDR Radeons. The Kyro2 is really a blessing because it contributed to the price reductions on the nVidia chipset cards and will probably contribute to ATI stepping up their new releases and even further price reductions. After 8 months I think the Radeon is reaching not only its maturity but also the need for an update. </font color=red>
Yap!!
Good point, noko!
But I don't know which games use that crippled feature of the Kyro 2.
Maybe most of them, I don't know...
I'm on a crappy 33.6 modem and I only read the last pages...
No problems here...
You can always post here on the part 2 thread...
GrahamD
April 26, 2001 12:49:27 PM
I think CPU scaling is an important issue that was missed in this review. After all, most people buying a budget video card will not have the fastest processors. How would this run on a Celeron or a slowish Duron? Without hardware T&L it would take more of a hit than the GeForce and Radeon cards (in games that support this feature).
I would also like to see a performance comparison with the 32 MB Videologic Vivid XS. This is cheaper still, but I have no idea what difference the missing 32 MB makes. Since Kyro does not use an external Z buffer, it does not need (quite) as much memory.
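The CPU-scaling concern can be made concrete with a rough sketch (an editorial illustration with illustrative numbers, not benchmarks): without hardware T&L, every vertex has to be pushed through a 4x4 transform on the CPU, so the per-frame cost grows linearly with scene detail:

```python
def transform(vertex, m):
    """The core per-vertex work of a software T&L pipeline: multiply a
    homogeneous vertex (x, y, z, w) by a 4x4 row-major matrix on the CPU."""
    return tuple(sum(vertex[k] * m[k][col] for k in range(4))
                 for col in range(4))

# Identity transform leaves a vertex unchanged -- quick sanity check.
IDENTITY = [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]
assert transform((1.0, 2.0, 3.0, 1.0), IDENTITY) == (1.0, 2.0, 3.0, 1.0)

# Rough cost model: a 4x4 transform is 16 multiplies + 12 adds = 28 FLOPs
# per vertex, so the CPU-side cost grows linearly with scene complexity --
# which is why a slow Celeron or Duron feels it first.
FLOPS_PER_VERTEX = 28
for vertices in (10_000, 50_000, 100_000):
    mflops = vertices * FLOPS_PER_VERTEX / 1e6
    print(f"{vertices:>7} vertices/frame -> ~{mflops:.1f} MFLOPs of transform work")
```

A hardware T&L unit takes exactly this per-vertex work off the CPU, which is why T&L cards pull ahead on slower processors.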
Anonymous
April 26, 2001 2:08:08 PM
Be careful with what you are saying here. The KYRO II is not in competition with the GeForce 2 Ultra board, even if it can beat the Ultra in one or two games in benchmarks. The KYRO II is competing with the new GeForce 2 MX 400 boards and possibly the GTS boards and Radeon boards.
Just because the KYRO II is able to beat out the Ultra in a benchmark or two does not make this an Ultra competitor as in 99% of other benchmarks the KYRO II will lose.
The KYRO II will beat all MX boards in benchmarks. It will beat the GTS in most benchmarks when looking at 32-bit color and high resolutions. The KYRO II will beat the GTS most of the time in 4X FSAA performance in 32-bit color. The KYRO II may beat the Ultra in Serious Sam (so far we have only seen ONE benchmark, AnandTech's, that has found this to be true). So let's be careful not to paint a picture of the KYRO II that is not accurate. In terms of price/performance, the KYRO II is an excellent buy in its price class. With the board's ability to scale well with CPU power and to play increasingly complex games (in terms of overdraw etc.), the KYRO II will continue to move ahead of both the MX and GTS boards with future games (even games that use the DX7 T&L feature -- this is MY personal opinion, as I believe a T&L unit will do squat for performance if the board hits its bandwidth limitation).
Rich
<A HREF="http://pvr.gamestats.com/start.shtml" target="_new">PowerVR Revolution</A>
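The bandwidth-limitation argument can be put in rough numbers. The sketch below (an editorial illustration; all figures are assumptions, not measurements) estimates external frame-buffer traffic for a conventional z-buffered renderer versus a tile-based design that keeps the Z-buffer on chip:

```python
def framebuffer_traffic_gbs(width, height, fps, overdraw,
                            color_bytes=4, z_bytes=4, tile_based=False):
    """Very rough external-memory traffic estimate for rasterization.

    Conventional: every drawn fragment does a Z read + Z write + a color
    write to card memory.  Tile-based (Kyro-style): Z lives on chip, so
    only the final color of each pixel goes out to memory.  All numbers
    here are illustrative assumptions, not measurements."""
    pixels = width * height
    if tile_based:
        bytes_per_frame = pixels * color_bytes
    else:
        bytes_per_frame = pixels * overdraw * (color_bytes + 2 * z_bytes)
    return bytes_per_frame * fps / 1e9

# 1024x768, 32-bit color, 60 fps, overdraw factor of 3 (all assumed)
conv = framebuffer_traffic_gbs(1024, 768, 60, overdraw=3)
tbdr = framebuffer_traffic_gbs(1024, 768, 60, overdraw=3, tile_based=True)
print(f"conventional: ~{conv:.2f} GB/s   tile-based: ~{tbdr:.2f} GB/s")
```

Under these assumptions the tile-based design needs roughly an order of magnitude less frame-buffer bandwidth, which is the intuition behind the Kyro II doing well with plain SDRAM while a T&L unit does little once a conventional board is bandwidth-bound.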
Anonymous
April 26, 2001 2:16:08 PM
If you want to see a review that includes CPU scaling (Duron 600, Athlon 800 and a Athlon 1200) check out this review:
<A HREF="http://www.hardware.fr/html/articles/lire.php3?article=..." target="_new">http://www.hardware.fr/html/articles/lire.php3?article=...</A>
It shows the KYRO II is still very competitive on low-end PCs and scales very nicely to high-end PCs. This is especially evident in games where T&L is used. The only problem I saw in this review is the Giant results. I do know performance is much better than what is shown in this review.
phsstpok
April 26, 2001 6:38:47 PM
Now that we have exhaustively discussed the technical benefits and drawbacks of the Kyro II, I was wondering if we might move on to practicality.
Until the Kyro II came along (and maybe afterward as well), CPU scaling has not been much of a factor, being greatly hampered by video card bandwidth problems. It seems to me that we are still at a point where even a GeForce 3 is not sufficient to play most games at 1600x1200x32. Maybe I am wrong here, but if this is the case then we are stuck at 1280x1024x32 and 1024x768x32 much of the time. Most video cards on the shelf today can play most games at least at the lower of these two resolutions. So my question is, knowing that these are the resolutions we will be using, which video cards will still be useful going forward from here, and for how long?
Sure, it sounds as if cards without T&L would be left useless if and when the software industry switches to hardware T&L-only games. I contend that by the time that actually happens all current cards with the possible exception of the Geforce 3 will be comparatively too slow for the more complex games in either case. I also contend that there will be some portion of the software industry that won't completely forget about the installed base of video cards. These are owned by millions of potential, new-game customers, after all.
There will be a transition period. This transition period has already begun but it will be a very long time before even a Geforce256 will be completely useless. A better card will make the transition more enjoyable, and longer, and a Geforce 3 better still. The choice is up to the individual consumer. It always has been.
My point is this: there will never be a time when everyone has to stop, all at once, and buy the latest and greatest video card. People can buy whatever they want, whenever they want, and use it for however long it feels comfortable to do so. Some people are still using Voodoo Graphics video cards, and using them with new titles like the Need for Speed series. Those cards are about 5 years old now, yet some are still using and enjoying them.
By the same reasoning, someone might buy a Kyro II today (or whenever it is actually released). Sure, it will be surpassed within a year or sooner by other budget cards, but it will still play today's games quite well, most of the games during the coming year, and some of the games later on. Not only that, but it will play them as well as or better than some of the competing budget cards. This gives the Kyro II value, and more value the longer one continues to use it. As I mentioned before, it depends on the individual.
<P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 04/26/01 02:44 PM.</EM></FONT></P>
noko
April 26, 2001 8:38:31 PM
Anonymous
April 26, 2001 9:53:16 PM
<<<<<To Teasy: As for the overdraw problem, I realised I kind of repeated myself with the pipeline statement after I posted, but was too tired to bother changing it before going to bed
Still, what I meant was, possibly a hardware solution that would force either front to back or back to front sorting of objects/layers/whatever to maximise use of the pipelines... Kyro's 8 is sufficient for maybe the next 2 years, but once games start coming out that beging using more passes, more textures, etc, then you will still need a way to reduce the overdraw... or am I getting myself confused again... Bugger! That's what you get when a vet has a hobby interest in computers
>>>>>
Yeah, you are getting confused between overdraw and texture layers.
The Kyro II can put 8 texture layers on each pixel in one pass, which means pixels with up to 8 textures each only have to be sent from the chip to RAM once, even though it has only one texture mapping unit on each pixel pipe.
Usually, if a card has only one texture mapping unit on each pixel pipe and wants more than 1 texture layer on the pixel it's working on, it first puts 1 texture on the pixel being rendered, and that pixel passes to the framebuffer in RAM. Then, in the next clock cycle, the card needs to make a second pass over the same pixel to add another texture. This also means the poly being worked on needs to be resent over the AGP port to the card for each extra texture layer. So for 8 layers of textures on each pixel, a normal card with one texture mapping unit per pipe would need to do what I just described 4 times (8 passes to the framebuffer altogether), which would kill memory bandwidth completely and could also clog the AGP port, depending on how many polys are in the scene being rendered. Now, the Kyro II does have only one texture mapping unit on each pipe, but what it does is this: it adds the first texture layer to the pixel and, instead of sending it out to RAM, uses its small on-chip cache to hold that pixel inside the chip. In the next clock cycle it adds the second texture layer, still keeping the pixel inside the chip, and then it can add another layer, and another, and so on until it has all 8 layers on the pixel (at this point the pixel has never left the chip). Only then does it send the 8-layered pixel out to the framebuffer in RAM, just once, so there's no wasted memory bandwidth at all.
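A rough back-of-the-envelope sketch of the framebuffer traffic this saves (the frame size and byte counts below are assumed purely for illustration; the one-write-per-layer vs. one-write-total behaviour is as described above):

```python
# Illustrative comparison of framebuffer write traffic for an 8-layer
# multitextured frame. Assumed figures: 1024x768, 32-bit colour.
WIDTH, HEIGHT = 1024, 768
BYTES_PER_PIXEL = 4
LAYERS = 8

pixels = WIDTH * HEIGHT

# Conventional one-TMU-per-pipe card: one framebuffer write per layer.
conventional_bytes = pixels * BYTES_PER_PIXEL * LAYERS

# On-chip blending: the pixel leaves the chip once, fully textured.
on_chip_bytes = pixels * BYTES_PER_PIXEL

print(conventional_bytes // 2**20, "MB vs", on_chip_bytes // 2**20, "MB")
print("traffic ratio:", conventional_bytes // on_chip_bytes)  # 8
```

This ignores AGP retransmission of the geometry, which only makes the conventional case worse.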
Now, overdraw is a different thing. Overdraw is when a card renders pixels over pixels it has already rendered in the framebuffer. The standard way of doing things is this: the card is sent each poly in turn and renders every pixel. Since the card doesn't know which pixels will actually be seen when the full frame is rendered, and only checks after the pixel has been rendered (depth testing), this leads to lots of pixels being overdrawn. What the Kyro II does is first collect all the polys in the scene and then cut the scene up into tiles. The tiles are sent to an on-chip cache one tile at a time, where the Kyro II checks which polys will be seen on the monitor when the frame is finished, and it fully renders that tile after each pixel has been depth tested. Once the tile has been fully rendered, it's sent to the framebuffer in RAM and the next tile is sent to the chip. Because the Kyro II checks which pixels will be seen before rendering, it never renders over pixels it has already rendered. This saves a massive amount of fillrate and also a massive amount of memory bandwidth. In case anyone wants to know, the on-chip z-buffer can check 32 pixels in 1 clock cycle. Each tile is 32x16 pixels in size, so each tile needs 16 clock cycles to test fully. The Kyro II can render the scene at the same time as depth testing, and since it can test 32 pixels per clock cycle but can only render 2 pixels per clock cycle, the depth testing is 16 times faster than the rendering, so it never slows rendering down. There don't need to be any optimisations for this method of rendering; it gets rid of 100% of overdraw. Most newish games have an overdraw average of at least 2 (an average, not a constant, because each frame will have different amounts of overdraw), which means that on average each pixel in the frames you're seeing on screen has been drawn three times.
A game like Serious Sam has an overdraw level significantly higher than that, as do Tribes 2 and many other games.
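The clock-cycle figures quoted above check out arithmetically; here is the arithmetic as a quick sketch (the per-clock rates and tile size are the figures as quoted in the post, not independently verified specs):

```python
# Tile depth-testing vs. rendering throughput, using the quoted figures.
TILE_W, TILE_H = 32, 16
Z_TESTS_PER_CLOCK = 32
PIXELS_RENDERED_PER_CLOCK = 2

tile_pixels = TILE_W * TILE_H                                # 512 pixels
clocks_to_test = tile_pixels // Z_TESTS_PER_CLOCK            # 16 clocks
clocks_to_render = tile_pixels // PIXELS_RENDERED_PER_CLOCK  # 256 clocks

# Testing finishes 16x sooner than rendering, so it never stalls the pipe.
print(clocks_to_test, clocks_to_render, clocks_to_render // clocks_to_test)

# With an average overdraw of 2 (each visible pixel drawn ~3 times by a
# brute-force renderer), a deferred renderer needs roughly a third of the
# raw fillrate for the same visible output.
overdraw_extra = 2
print("effective fillrate advantage: ~%dx" % (1 + overdraw_extra))
```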
Warden:
<<<<<Are you sure about every card (except the GF3 of course) using software T&L in Aquanox? I understand your reasoning, in that since they don't have Vertex Shaders they can't do the native T&L that Aquanox is written for. But I am unclear if this requires ALL T&L duties to be done by the CPU. I have read rumors that the CPU only has to do part of the work (the vertex shader modifications) but that the hardwired T&L unit still does some or most of it. Do you have some more info on this?>>>>>
Anand's and others' Aquanox tests show that each card, including the Kyro II, delivers the same poly throughput. The only card that showed a much higher poly throughput was the GeForce 3. 3DMark2001 uses a custom skinning technique for all the characters when used with a DX7 HW T&L unit. On a GeForce 3 or a Kyro II it uses normal vertex shader skinning, done totally in hardware on the GeForce 3 or totally in software with the Kyro II. But when using a card with a DX7 HW T&L unit, the skinning is done by the CPU and the HW T&L unit transforms and illuminates the skinned vertices.
This method is not used in Aquanox at this point, though. Whether they will use it in future I have no idea.
Something I have to say about the Aquanox benchmark is that the final game will not be as slow as the benches shown. The benchmark (Aquamark) is made specifically to stress the graphics card, and the final game will be a lot faster. I got this info from Massive, the people making the game, when I signed an NDA to get a copy of the benchmark. Yep, I had to sign an NDA just to get the bench; they're really keeping it under wraps at the moment.<P ID="edit"><FONT SIZE=-1><EM>Edited by Teasy on 04/26/01 10:41 PM.</EM></FONT></P>
phsstpok
April 26, 2001 10:03:30 PM
HolyGrenade
April 26, 2001 11:17:35 PM
Quote:
Be careful with what you are saying here. The KYRO II is not in competition with the GeForce Ultra board, even if it can beat the Ultra in one or two games in benchmarks. The KYRO II is competing with the new GeForce 2 MX 400 boards, and possibly the GTS and Radeon boards. That is what I was trying to tell them all along. I guess they'll believe it more coming from a PowerVR supporter. Also, I agree with the rest of your post too. Beating the MX (200 or 400) will not be very difficult, but beating the GTS will depend very much on the design of the game engine and the CPU. Beating the Ultra will be difficult in any given situation.
---------
Further to my evolution piece, I believe that most of the new technology in the GeForce 3 will be taken up more quickly, because it is possible to have a game that uses pixel and vertex shaders AND fixed T&L functions without huge amounts of duplication, even if the shaders are to perform the same functions as the T&L unit. Developing this is much easier than developing something that will run on both a T&L unit and on the CPU. That is why T&L uptake has been very slow. Also, the X-Box factor will aid the uptake of GeForce 3 technology.
One part of the GeForce 3 that will take a while to catch on is tessellation. This is where a smooth (i.e. spline-based) 3D surface is broken into polygons. Normally this is done on the CPU. Doing it in the GPU will reduce bandwidth by huge amounts, as well as further accelerating the game and allowing far more complex scenes and far more "curves" in a game. But it will need large amounts of duplication in the code.
<font color=red>"My name is Ozymandias, King of Kings:
Look on my works, ye Mighty, and despair!"</font color=red>
Anonymous
April 27, 2001 12:01:32 AM
<<<<That is what I was trying to tell them all along. I guess they'll believe it more coming from a Power VR Supporter. Also, I agree with the rest of your post too. Beating the MX (200 or 400) will not be vey difficult, but beating the GTS will depend very much on the design of the Game Engine and the CPU. Beating the Ultra will be difficult in any given situation.>>>>
Not everyone in this thread has been saying that the Kyro II is an Ultra competitor. In fact, not many have been saying that, AFAICS. I certainly haven't. I've always said it's an MX and GTS competitor that sometimes beats the Ultra, which is just a bonus. What I will say is that, when the price is taken into consideration, it's a better buy than the Ultra for people on a budget.
<P ID="edit"><FONT SIZE=-1><EM>Edited by Teasy on 04/26/01 08:03 PM.</EM></FONT></P>
OzzieBloke
April 27, 2001 1:50:27 AM
Ah, thanks mate
You could get a nifty job as a professor, or something of the sort, at a university with that clarity of explanation... good to see on a discussion board.
Well, I'm clear as a bell now on that... my only thought is, will tile-based rendering always be applicable, or useful? I seem to remember reading somewhere that as polygon counts get higher, the benefit of tile rendering starts to decrease... personally, I am not sure how that is supposed to work, but it has been floating around somewhere.
Here's hoping Kyro 3 can compete with the big boys. That would get the market going again.
Cow with legs spread wide either dead or playing 'cello.
kurokaze
April 27, 2001 2:07:46 AM
OzzieBloke
April 27, 2001 2:19:49 AM
Anonymous
April 27, 2001 2:21:48 AM
I want to start giving more freedom to my imagination.
I know that they (another VideoLogic division, "Metagence") are going to introduce DSPs (processors) very soon. These DSPs can be used in a vast range of applications (almost everything, from digital radios to graphics). DSP probably means digital signal processor... not sure :) 
<A HREF="http://www.metagence.com/" target="_new">http://www.metagence.com/</A>
These DSPs can be put inside a graphics chip (we would have one chip on a board, no multiple-chip solutions...). Because 3D is massively parallel, it will help to have lots of DSPs working in parallel,
and since these "Imagination" DSPs will be programmable, they could modify the feature set of a given board via drivers...
I think that Kyro 3 will be a Kyro 2 with more pipelines and many DSP cores...
So Kyro 3 would have all the DirectX 8 features,
maybe even DirectX 9, DirectX 10, etc...
but this is only my imagination working...
Damn, I make too many errors in English... Portuguese word order is sometimes reversed; grammatically it's different from the other Latin languages, and very different from the Germanic languages.
<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/26/01 10:49 PM.</EM></FONT></P>
Anonymous
April 27, 2001 2:26:02 AM
what ?
go here:
<A HREF="http://www.xbitlabs.com/news/" target="_new">http://www.xbitlabs.com/news/</A>
<font color=red> <i>"Although the new STM’s chip, KYRO II, can boast a really good price-to-performance ratio, the manufacturers don’t hurry to announce anything built on this chip. Alongside with Hercules, STM’s partner #1, there is currently only one more company, which is about to manufacture graphics cards based on KYRO II - InnoVision. The behavior of other companies is just puzzling. For example, PowerColor refused to launch KYRO II based products shortly after it had actually announced them. Creative was talked to support the new solution of STM, but now it denies intentions like that.
What’s the matter? As we have learned from a number of Taiwanese graphics cards manufacturers, NVIDIA still produces an indirect pressure to make the companies use only its chips. And the manufacturers have nothing to say against. NVIDIA offers a full line of graphics chips ranging from the Low-End to the High-End solutions. If the manufacturers risk to seek for another shipper, they are likely to find no alternative. GeForce2 MX series has some equivalents, while the elder GeForce2 and GeForce3 are unique. Hereat most graphics cards manufacturers have to indulge all NVIDIA’s whims. Hercules doesn’t depend much on NVIDIA. Firstly, this company doesn’t concentrate on graphics cards only, producing a wide range of computer hardware. Secondly, it occupies a big share of the European graphics card market and NVIDIA can’t quarrel with Hercules that easily. So, Hercules has a good chance to get good profits from KYRO II cards, since their manufacturing costs are lower than those of GeForce2 MX cards. Yet it doesn’t work for other manufacturers, that are humble to NVIDIA’s will. But the situation can be twisted by the brand new STM chip of the next generation, KYRO III. Presumably, compared with KYRO II, it is rumored to give a 150-200% performance gain. Should the new chip arrive in time, within the third quarter, STM will be armed with a complete line of products to tilt someone else in its favor. For the time being, NVIDIA enjoys its supremacy over the graphics chipsets sector."
</i> </font color=red>
Nvidia is indeed giving "evolution" a big push!!
Preventing others from competing, thus enabling it to sell crap for lots of $$$.
noko
April 27, 2001 3:10:27 AM
I thought of something: a lonely pixel is going to be shaded 6 times (6 texel layers) on a GF2 card. The pixel churns along the GF2 pipeline begging to be colorized, but is let down by the limit of two textures per pass. The pixel says, "Oh well, I will wait in line again, but first I'll make a pit stop in the frame buffer." From the frame buffer the pixel is escorted back into the inner chambers and is dressed (set up) again to make the long path, the path of color, the pixel-making line. Still only two more textures are added. "Oh, I am right here, finish me now, I don't want to wait any longer," says the angry little pixel. Then a deep voice responds, "Send him back into the frame buffer." Finally the last escort occurs: a pleased and mighty shiny pixel, maybe the best ever to dazzle the old master. But wait???? Zorro the Z king comes along and slashes him to pieces. Poor little pixel, poor little pixel (sob, sob, cry, cry). Feel sorry for all those mighty pixels that are never seen or heard from again.
A story by a saved lost pixel.
Anonymous
April 27, 2001 3:22:18 AM
A lot of people think this thread (including part 1) is one of the best threads in a while here at Tom's, and there's a lot of interesting info in it. The technology being discussed is something quite a few people didn't even know existed, so it's quite an interesting subject compared to most.
OzzieBloke, thanks, that's very kind of you; I'm glad my explanation was clear. I didn't know a thing about graphics cards a year ago (and I really mean not a thing; I was the sort of person that bought a card if it had more RAM and thought that was all that mattered :smile: ), so I think I'm doing OK just from studying the Beyond3D forums and a few articles and reviews.
Powervr2:
<<<<These DSP can be put inside a graphic chip(we will have one chip in a board, no multiple chips solutions... ), because 3D is massively parallel it will help to have lots of dsp working in parallel>>>>
For a nice little article on this subject go here: http://pvr.gamestats.com/Dynamic/Standard.shtml?/articl...
Its interesting stuff. Apparently metagence (the name for the DSP tech) would effectively allow IMGTEC to design say two (or more) chips like Kyro II and say one (or more) HW T&L unit all into 1 chip which would be controlled by the DSP. The DSP would be completely programable allowing it to intelegently control all three (or more) cores at the same time varying the resources each core gets depending on the task at hand allowing for incredible efficiency (which I suppose is what PowerVR chips are all about really).
I'd also just like to add that Hercules have just put all their Prophet 4000XT (Kyro 1) / Prophet 4500 (Kyro II) drivers on their public FTP, including W98/ME, W2K, and WNT4, so they're obviously ready to release the cards very soon. Tom can also now change his review so it no longer says there are no Win2K or NT drivers for the Prophet 4500.
Anonymous
April 27, 2001 3:56:08 AM
Yep, all that hard work that little pixel went through to look its best, only to be discarded. So sad. It must feel so used :frown:
Those are the horrors of post-rendering HSR and clumsy texturing, caused by traditional immediate-mode rendering, that so many poor little pixels have to suffer every day. Please... give £1, or whatever you can, to The Home For Discarded Pixels, PO BOX 84959. Just £1 (or $1.40) can feed a family of 786,432 32-bit pixels (an average 1024x768x32 pixel family) for an entire week. Just £85 can buy a Kyro II, utilising intelligent deferred tile-based rendering, which would save so many future pixels from this terrible, undignified existence. Please don't let these tiny lost souls be on your conscience... bring some happiness into a life today.
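The charity appeal is really describing deferred tile-based rendering. A toy sketch (hypothetical data layout, assuming simple `(x, y, z, colour)` fragments; not PowerVR's actual hardware algorithm) of how a tile-based renderer resolves visibility per tile before shading, so overdrawn pixels never cost texturing or frame-buffer bandwidth:

```python
def render_tile_deferred(fragments):
    """Toy tile-based deferred renderer: 'fragments' are (x, y, z, colour)
    tuples for one tile. All geometry covering the tile is gathered first,
    the nearest fragment per pixel is found, and only that survivor is
    shaded; hidden fragments are discarded before any texturing happens."""
    nearest = {}
    for x, y, z, colour in fragments:
        if (x, y) not in nearest or z < nearest[(x, y)][0]:
            nearest[(x, y)] = (z, colour)
    # Shade exactly one fragment per covered pixel.
    return {pixel: colour for pixel, (z, colour) in nearest.items()}

# A wall fragment behind the player never gets shaded at all:
tile = [(0, 0, 0.9, "wall"), (0, 0, 0.2, "player"), (1, 0, 0.5, "floor")]
print(render_tile_deferred(tile))
```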
Anonymous
April 27, 2001 6:25:39 AM
Thanks for the <A HREF="http://pvr.gamestats.com/Dynamic/Standard.shtml?/articl..." target="_new">link</A>, Teasy...
<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/27/01 02:29 AM.</EM></FONT></P>
Negaverse23
April 27, 2001 7:08:35 AM
<i><<go here:
<A HREF="http://www.xbitlabs.com/news/" target="_new">http://www.xbitlabs.com/news/</A>>></i>
I couldn't resist... <font color=red>"Hercules doesn’t depend much on NVIDIA. Firstly, this company doesn’t concentrate on graphics cards only, producing a wide range of computer hardware."</font color=red>
LMAO! Creative is a Giant compared to Hercules! They also don't heavily depend on NVIDIA to exist. I can build half a PC with Creative products alone.
<font color=red><<...For example, PowerColor refused to launch KYRO II based products shortly after it had actually announced them.>></font color=red>
I too think Nvidia influenced PowerColor not to produce video cards based on the Kyro II. It's all about money in PowerColor's eyes right now. I guess Nvidia made some kind of offer they couldn't refuse.
=
<font color=green>Are you just lazy or incredibly stupid? -<i>King of the Earth</i></font color=green>
GrahamD
April 27, 2001 10:08:17 AM
On the subject of how long before games start to depend on T&L: developers are now working on X-Box games, and to get the most out of that console they will make heavy use of T&L. No doubt many of the same games will be ported to PC and will only look their best on GeForce 3-class graphics cards. The problem to date has been that not enough people have had T&L hardware to drive the developers. X-Box will change this, and probably quite rapidly.
Off topic I know (but related), I wonder how X-Box will affect AMD, if developers start spending more time optimising for SSE to get best X-Box performance?
Anonymous
April 27, 2001 10:46:05 AM
I'm pretty sure (not certain, but this is what I hear from a lot of people) the X-Box doesn't have a DX7 HW T&L unit; like the GeForce 3, it instead has two DX8 vertex shaders. DX8 vertex shaders are more flexible than DX7 HW T&L but slower, and having two of them allows programmability at roughly the speed of hardwired DX7 HW T&L. So yes, X-Box games will be written with HW T&L in mind, and HW T&L will catch on faster because of the X-Box. But it'll be DX8 programmable HW T&L that catches on because of the X-Box, not the DX7 HW T&L that the GeForce 2 and Radeon have.
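The fixed-vs-programmable distinction can be sketched like this (a toy illustration, not real hardware behaviour; the lighting recipe, vertex layout, and function names are made up): a hardwired T&L unit applies one fixed recipe to every vertex, while a programmable shader unit runs whatever per-vertex program the game supplies, including one that reproduces the fixed recipe.

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def fixed_function_tnl(position, normal, light_dir):
    """Hardwired DX7-style T&L: every vertex gets the same fixed recipe,
    here just a clamped diffuse lighting term (the transform is elided)."""
    return {"pos": position, "diffuse": max(0.0, dot(normal, light_dir))}

def programmable_vs(vertex, shader):
    """DX8-style vertex shader: the game supplies the per-vertex program,
    so effects the fixed pipe cannot express (skinning, procedural
    deformation) become possible, at some speed cost per unit."""
    return shader(vertex)

# The fixed path expressed as one particular shader program:
v = {"pos": (1.0, 2.0, 3.0), "normal": (0.0, 1.0, 0.0)}
light = (0.0, 1.0, 0.0)
as_shader = lambda vert: fixed_function_tnl(vert["pos"], vert["normal"], light)
print(programmable_vs(v, as_shader))  # same result as the fixed pipe
```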
<P ID="edit"><FONT SIZE=-1><EM>Edited by Teasy on 04/27/01 06:48 AM.</EM></FONT></P>
Anonymous
April 27, 2001 1:01:12 PM
PowerColor will in fact be releasing a KYRO II board. Well, the company will be releasing the board, but NVIDIA has FORCED THEM to release it under a different trade name than PowerColor. What that name will be is anybody's guess at the moment. It is really sad, though, to see NVIDIA bully OEMs about releasing products from other companies that compete with theirs. It is funny how this company has changed in the last couple of years. I think they have been visiting Bill Gates too much lately.
noko
April 27, 2001 1:24:13 PM
That was good, I almost died laughing. I wish other people had more humour around here. It appears that nVidia, and probably ATI, have more to fear from the Kyro2, or more exactly from the refined technology of TBR, than they're letting on. First, TBR allows for much cheaper cards with great performance. I can see the Kyro II becoming very attractive to card builders due to better profit margins, not to mention OEMs. Second, for nVidia and ATI to ditch their hardware designs and start from scratch, voiding all their driver work, would set them back years. Yet those inefficient hardware designs are driving video card prices sky high to begin with.
<b>These are my best guesses or predictions:</b>
ATI: introduction of the Radeon SE, which will beat the Kyro2 in most benchmarks except FSAA. Nvidia already has two cards that show better performance in most benchmarks. What this shows is that the Kyro2 is not the best-performing card in the public's eyes, which means the companies that cater to the public, i.e. OEMs and card manufacturers, will be pressured into using ATI/Nvidia chips by general public perception. Even though what the OEMs will be installing are stripped-down Radeons or GTS MXs that don't even perform as well as the Kyro2, the general perception of ATI or nVidia being better does have influence. I think, as was mentioned earlier, politics does play a part. Just think: how many GF2 Ultras were sold? Very few compared to the rest of nVidia's lines, yet the perception that nVidia is better helped sell many MXs and TNT2s (to OEMs).
Second: rapid price reductions, mostly to get rid of merchandise in stock, but also to blur out any new designs and slow down their introduction rate, meaning less of a fan base and voice in the market. Just think of 10 million Kyro2 owners praising and recommending the Kyro2, not to mention the KyroHolics that would spring from this. That in itself establishes a brand name and prompts many other people to buy a Kyro2, resulting in lost sales for the other folks making video chips (a.k.a. ATI, nVidia). Look at the so-called ATIists and nVidiots on most computer hardware boards; regardless of facts, they just promote one company, worshipping whatever it produces.
Lastly, I think an earlier introduction of the Radeon II, and nVidia's upgrade to the GF3, will result.
The Kyro was contained pretty much by this method, and originally the Kyro had driver problems, which look solved now. So Imagination does have some work cut out for them, but if they persevere and introduce a killer Kyro, I think the technology will catch on big. ATI may have some stuff up their sleeves; for one thing, the Radeon drivers can do tiles with textures, buffers, and AGP textures with the onboard RAM. Move some of that onto the chip and you will see something similar to the Kyro's TBR.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 04/27/01 09:58 AM.</EM></FONT></P>
noko
April 27, 2001 2:08:01 PM
Here is a review of a RadeonSE:
<A HREF="http://www.pcpop.com/info/20010426/" target="_new">http://www.pcpop.com/info/20010426/</A>
It is in Chinese, so unless you read Chinese, looking at the test graphs and PowerStrip data gives you an idea of what's coming up. Note: it can overclock to 250MHz from its normal 230MHz speed.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 04/27/01 10:56 AM.</EM></FONT></P>
Anonymous
April 27, 2001 3:20:18 PM
I see a whole lot of posts on various message boards from people saying they love the idea of the Kyro II technology and of a small, lesser-known company coming out with such a nice card, and stuff like "did anyone see this card coming? It's awesome, I'll be getting one as soon as they're released". IMO there will be quite high retail sales for the Kyro II, and it will get a good user base, which will help the next PowerVR card. I don't really think the reason the Kyro 1 didn't do that well was politics from Nvidia or ATI. Nvidia AFAIK really didn't bother with the Kyro 1 and didn't see it as a threat. For proof of this, look at the PowerColor Evil Kyro. If Nvidia had seen the Kyro 1 as a threat, they'd have done the same thing as they're doing now with the Kyro II: persuading PowerColor not to use their name for the card, producing scandalous PDF files, and bullying board makers like InnO3D into not making the card, which AFAIK they didn't do with the Kyro 1. The reason the Kyro 1 didn't do that well was that it didn't have performance that would reach out from a review and smack you in the face like the Kyro II does. It was a card that started off slower than an MX at low res, and only slightly faster at high res and in 32-bit, because of unoptimised drivers. Then, after most of the reviews had already been done, loads of driver improvements were made, and now it's quite a bit faster than the MX, but not many people saw this. Also, it cost the same to produce as the Kyro II, so it wasn't cheaper than the MX when first released (£115 here in the U.K. at release for a plain 32MB Kyro 1, compared to £85 at release for the Kyro II 32MB TV-Out). Now the Kyro II comes out at a budget price and convincingly sweeps the MX aside, by as much as 70%+ at high res, challenges the GTS in 16-bit and beats it in 32-bit, beats the Radeon DDR too, and also has great FSAA. It's a very impressive card and really leaps out at you from a benchmark.
So the Kyro II will be a lot more successful than the Kyro 1. All the publicity about its great performance from sites like Anand has already seen to that. You should have seen the Anand forum after that first Kyro II preview. It was in a state of shock, and people couldn't believe what this card, which most of them had never heard of until that preview, could do. A lot of those people will be buying a Kyro II. I agree, though, that the Kyro II won't be some massive hit, but it will build on the minor success (compared to the Neon 250) of the Kyro 1 and make a user base and build awareness ready for NP2 and NP3 (Kyro 1 and 2 were NP1, their code name).
Also, the Radeon SE probably won't be much under $300 ($250 online) at release, if it's released at all (at the moment it's still up in the air whether ATI will release it or not); it certainly won't be in the Kyro II's price range.
noko
April 27, 2001 4:19:02 PM
The Radeon SE, as far as I see it, is an image card, meaning something that performs well but probably won't sell that much; kinda like the GF2 Ultra, a status symbol for the company. Anyway, we will soon see how happy people are, or not, with the Kyro2 when it is placed in tens of thousands of machines and conflicts start to arise. Hopefully not. The Radeon SE may still be canned, with the Radeon2 coming out shortly instead. Still, price is the driving factor here, and the Kyro2 has that potential advantage. The Kyro2 is only a chip for a video card, and it has limitations like all the other video chips produced. Plus it is competing with the GF2s and Radeon DDRs to start off with. ATI does claim games around Christmas time will take advantage of the third texture unit of the Radeon. We will see by then how all of this stacks up.
Plus, do you really believe fixing the DX8 bug with CPU writes to textures on the Kyro2 will dramatically increase the Kyro2's benchmarks? Greater than 30%? More likely a 10% increase, which would mean it would continue to score lowest in the 3DMark2001 benchmark, a benchmark possibly indicating future game potential: a hardware T&L-optimized benchmark.
Plus, the thousands and thousands of hardware configurations and varying degrees of operating system clutter and updates could all cause significant Kyro II problems. This could get pretty ugly. So we will all be eager to see how effectively Imagination updates the drivers.
A lot of games haven't been tested on the Kyro2, but shortly they will be. What problems are lurking around the corner among the thousands and thousands of games available today?
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 04/27/01 01:13 PM.</EM></FONT></P>
Anonymous
April 27, 2001 5:57:27 PM
Well, in Giants, where rendering into textures is used for advanced water effects and shadows (among other things), the framerate drops by 10fps when those two features are enabled with DX8. I haven't got any way of knowing exactly what increase the Kyro II will get in every game, because different games use different features, but what I do know is that 3DMark2001 uses rendering into textures in the low-detail tests and then adds dynamic shadows (again, more rendering into textures) for the high-detail tests, so I can see a big increase in speed when the bug's fixed. Also, let's face it, the Kyro II is closer to the GTS in benchmarks seen so far than the Radeon DDR is. Any speed boost will put it close to the GTS in 16-bit (which is where it's still a little slower than the GTS; in 32-bit it's faster).
Serious Sam already uses all 3 TMUs on the Radeon cards. It fully supports the 3 TMUs per pipe on the Radeon, and it still doesn't impress in that game, falling behind the GTS and well behind the Kyro II. Don't get me wrong, I like the Radeon and have one myself (it's not in my system at the moment though), but it's not quite up to Kyro II speeds in most benches out so far.
On 3DMark2001: when any game is anything like a 3DMark series test, I'll then believe that non-HW T&L cards are in trouble. I've said it before: the high-detail tests in 3DMark2000 are still much, much slower on my PC than any game I've ever played (and I play all the new big games). And why? Because 3DMark2000 is supposed to run like crap, that's why. Think about it: does 3DMark2000's high-detail helicopter test look anywhere near as good as most games out now? It looks rubbish compared to Giants, and that should show you the difference between a benchmark and a game. A game developer wants their game to look great and run great to sell copies; a benchmark developer wants their bench to look great and run like crap on anything but the best hardware. The engine of Max Payne might be in that test, but it's not Max Payne. It's completely made by MadOnion, and it's in no way an indication of the real game performance of Max Payne. For a start, it deliberately uses an incredibly wasteful reflective floor that doubles the poly count in the high-detail test and also doubles the fillrate and bandwidth hit that the low-detail test takes; apart from the reflective floor, not much is added to the high-detail test over the low-detail one. Do you think Remedy will make a game that uses reflective floors all over the place and halves the speed of the game? If they do, I'll turn the reflective floor off and double the framerate, and so will most others, because the reflective floor simply isn't worth the trouble.
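The reflective-floor point is simple arithmetic: a planar reflection done by re-rendering the mirrored scene roughly doubles both the polygon and fill cost. A hedged sketch with made-up numbers (this is a toy model, not a measurement of 3DMark2001):

```python
def frame_cost(base_polys, base_fill, reflective_floor=False):
    """Toy cost model: a planar reflection rendered by drawing the whole
    scene again, mirrored about the floor plane, roughly doubles both the
    polygon count and the fillrate/bandwidth hit, which on a fill-limited
    card roughly halves the framerate."""
    factor = 2 if reflective_floor else 1
    return {"polys": base_polys * factor, "fill": base_fill * factor}

low = frame_cost(20000, 1.0)                          # low-detail test
high = frame_cost(20000, 1.0, reflective_floor=True)  # high-detail test
print(high["polys"] // low["polys"])  # the floor alone doubles the work
```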
<<<<<A lot of games haven't been tested on the Kyro2, but shortly they will be. What problems are lurking around the corner among the thousands and thousands of games available today?>>>>>
I think you're really underestimating the number of people who have Kyro cards and report problems to all the PowerVR forums. I myself have tested close to 60 games on this card, with 3 problems, and those problems were only when TC was forced (one was the Giants TC problem, which is now fixed; I haven't tried Rune or Deus Ex to see if they're fixed yet). Over the last months I've had 4 new drivers from IMGTEC. They don't release all of these to the public like Nvidia does; they wait until a decent number of problems are reported as fixed by webmasters like me and PVR-REV and Paraknowya etc., and then they release a much-improved driver to the public. If I don't find a problem with the drivers, then they're good drivers, because I don't just play standard PC games. I play PSX emulators, arcade emulators, N64 emulators, and all sorts of stuff, and the Kyro does fine with all of them, which shows a general maturity in the drivers. When I got my Radeon I installed it (with the latest drivers at that time) and had a lot more problems with it than I've had with the Kyro. In conclusion, IMO the Kyro II drivers I'm currently using on my Kyro 1 (which are now publicly available) are very stable, much more stable than my Radeon's. I don't have a problem with any games I play, and I download a lot of warez, so I play a lot of games. The only problem I have is with DX8, that's it. The 2D stability is great too. Anyway, as you say, we'll all see the quality of the drivers soon when the Kyro II is released, but I'll say with absolute certainty that there will be no big problem with these drivers when more people start getting the Kyro II.
Something lots of those PCI-only mobo owners out there might be happy to hear: a Herc 4500 PCI will be released along with the AGP version, so all those people stuck with only the option of a PCI MX now have a better, faster option.
Check it out here: http://www.acidhardware.com/reviews/3dprophet4500/index...
Anonymous
April 27, 2001 8:08:31 PM
<font color=red>"Plus do you really believe the DX8 bug on making the cpu write to textures on the Kyro2 will dramatically increase the Kyro2 benchmarks when fixed? Greater than 30%? Or more likely a 10% increase"</font color=red>
Hmm... I think the increase in performance will be greater on lower-end CPUs:
a greater boost on a Duron 700 than on an Athlon 1.2GHz...
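That intuition falls out of a simple frame-time model (a toy sketch with made-up millisecond figures, assuming CPU and GPU work overlap so the slower side sets the frame time): removing a fixed CPU-side stall, like the slow CPU-write-to-texture path, gains more on a CPU-bound machine and may gain nothing on a GPU-bound one.

```python
def frame_time_ms(cpu_ms, gpu_ms):
    # CPU and GPU work overlap; whichever side is slower sets the frame time.
    return max(cpu_ms, gpu_ms)

def speedup_from_cpu_fix(cpu_ms, gpu_ms, stall_ms):
    """Relative speedup from removing a fixed CPU-side stall: a slow CPU
    is hurt more by the stall, so it gains more when the stall goes away."""
    before = frame_time_ms(cpu_ms + stall_ms, gpu_ms)
    after = frame_time_ms(cpu_ms, gpu_ms)
    return before / after

# Hypothetical figures: slow CPU needs 20 ms of work per frame, fast CPU
# needs 8 ms, the GPU needs 14 ms, and the fix removes a 6 ms stall.
print(round(speedup_from_cpu_fix(20.0, 14.0, 6.0), 2))  # 1.3  (slow CPU)
print(round(speedup_from_cpu_fix(8.0, 14.0, 6.0), 2))   # 1.0  (GPU-bound)
```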
Anonymous
April 27, 2001 8:45:22 PM
I found this interview with someone from Nvidia; here is the link:
<A HREF="http://www.hwzone.it/html/text.php?id=205" target="_new">http://www.hwzone.it/html/text.php?id=205</A>
<font color=red> [Marco] What about ATi and Kyro boards? Do you see them as challengers for the GeForce 3?
[Dan] Not at all! We think that even a GeForce 2 GTS can do better than those products, so GeForce 3 is really way ahead of them </font color=red>
Yeah right; if you play at 16-bit then the GeForce 2 GTS is a little better, but not at 32-bit, at higher resolutions, or with FSAA. If the GeForce 2 GTS does do better than the Kyro 2 in the next months (in sales), then being evil pays off...
<font color=red>Though, one may view them as competitors for our MX range of boards, but we feel that we have the best solutions at all price points. Especially with our new line of faster MX products...</font color=red>
Your new line of faster MXs (faster???? lol).
<font color=red>[Marco] The present prices for a top-level 3d card are much higher when compared to those from some years ago. How will your price politics change, now that there are not many competitors left, at the same level of NVIDIA?
[Dan] The high end has been pretty consistently priced in the $399 to $499 range for the last two years. Price politics won't change much, mainly because the biggest part of the price you have to pay for a cutting-edge 3D card is due to the fast DDR memory used. Anyway, we'll be able to cover all the price segments, from workstation (top high end) to desktop. The desktop segment is then divided into performance (GeForce 3) and value (GeForce 2 GTS and MX) products.</font color=red>
Or because you only know the brute-force way!!!
nVidia doesn't (or doesn't want to) make smarter-designed cards;
they just want to take our money!
Anonymous
April 27, 2001 9:03:01 PM
Hey, what the xx?
Comparing a GeForce 3 with a Kyro 2???
lol
Comparing it against a card that is 1/4 of the price of a GeForce 3?
It must be better; it had better be better than the Kyro 2!!!
<A HREF="http://www.aceshardware.com/Spades/read.php?article_id=..." target="_new">http://www.aceshardware.com/Spades/read.php?article_id=...</A>
<P ID="edit"><FONT SIZE=-1><EM>Edited by powervr2 on 04/27/01 05:07 PM.</EM></FONT></P>
noko
April 27, 2001 10:07:03 PM
Well, I hope you are right, that the Kyro2 makes a very big splash with great success, and that the DX8 issue is resolved by Microsoft promptly. Since it is a DX8 issue, comparing the different cards on OpenGL games would probably show the real potential of the Kyro2. In that case Serious Sam (an OpenGL game) and also QuakeIII show outstanding performance from the Kyro2.
Looks like 9 months after the Radeon was officially launched (11 months from the time of announcement), the Kyro2 has caught up to it in some benchmarks and games, except for the video stuff. The T&L issue will also be seen in the coming months when new games like Max Payne are officially launched and benchmarks are conducted, that is, if the results differ from the 3DMark2001 benchmark that used the same engine. In the meantime I will be enjoying my Radeon in all my games at max settings.
Ace's Hardware is coming up shortly with a video card guide including the Kyro2 and an overclocked Radeon at 225MHz, representing what the Radeon SE could do if it is launched. Frankly, I think ATI should just concentrate on the Radeon2 and start making announcements so as not to lose potential buyers. I am sure ATI will respond to the new competition. Nvidia is not standing still either; look at the dramatic price reductions going on. Now the question is when the Radeon2 will be delivered and in what variations. Comparing the Radeon2 and GF3 against the Kyro2 will make the Kyro2 look like a budget and rather low-end card. Still, all the new cards available now will play the new games just fine for the next 6-12 months.
The current video cards surpass what the developers are delivering. Taking 18 to 32 months to develop new high-end games, while the video cards are significantly updated at 6-month intervals, has caused a significant gap between the cards' capability and their actual usage.
Do you know if a mobile Kyro chip solution is in the making? It seems like an ideal chip for a mobile unit due to its low overhead.
Are there any official roadmaps for the Kyro line of graphics chips? Not the unofficial versions on the web sites.
noko
April 27, 2001 10:09:44 PM
noko
April 27, 2001 11:28:58 PM
Looks like STMicro is getting all they can from the Kyro2. This is fresh from Ace's Hardware, which today will be publishing a video card guide on the web:
<i>Until then, a few words about the "overclockability" of the Kyro II. It is true that the review samples come with 5 ns memory (you can see "-50" on the chips) and that retail samples will probably ship with 5.5 ns. <b><font color=purple>However, I can honestly say that the memory chips are not limiting the overclockability. The core does not get hot, but even with improved cooling (a gigantic fan blowing air on both the chip and the memory chips) we could not get it past 185 MHz. IMHO, the Kyro II is limited by its architecture, and that might also explain why the Kyro I was such a bad overclocker (1-2% overclockable).</font color=purple></b></i>
Looks like what you get will be a slower-RAM version of the Kyro2 when it hits the street, with no chance of overclockability. My Radeon easily goes to 200MHz/200MHz stably, which does indeed improve the benchmarks. With some RAM heat sinks I am sure I will be able to go to 205-210MHz, which would smoke the Kyro2 in most benchmarks except for FSAA. With the core of the Kyro2 being pushed to its maximum speed, I see real potential for a high failure rate of Kyro2 boards. So now I think the Kyro2s that were given out for reviews were hand picked and may not represent what we buyers would be paying for. That wouldn't be the first time this has occurred. Well, we will see how a Kyro2 stacks up at Ace's Hardware against an overclocked Radeon: a 9-month-old video card competing against STMicroelectronics' best still-to-be-released chip overclocked to its maximum speed.
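For perspective, the headroom being argued about here is easy to put in percentage terms. A minimal sketch, assuming the commonly cited stock clocks (175 MHz for the Kyro II and 183 MHz for the retail Radeon DDR; neither figure is stated in the thread, so treat both as assumptions):

```python
def headroom_pct(stock_mhz, max_mhz):
    """Overclocking headroom as a percentage above stock clock."""
    return round((max_mhz / stock_mhz - 1) * 100, 1)

# Kyro II: assumed 175 MHz stock, 185 MHz ceiling per the Ace's quote
print(headroom_pct(175, 185))  # ~5.7%

# Radeon DDR: assumed 183 MHz stock, 200 MHz stable per noko's post
print(headroom_pct(183, 200))  # ~9.3%
```

Either way, both numbers are small compared to the 1-2% the quote attributes to the Kyro I, which is the point being debated.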
Anonymous
April 28, 2001 12:23:08 AM
noko
April 28, 2001 1:45:48 AM
That was a good review of the GF3 and Kyro2, which included other cards as well. Giants at 1024x768x32 (GamePlay) had some interesting results:
<A HREF="http://www.aceshardware.com/Spades/read.php?article_id=..." target="_new">http://www.aceshardware.com/Spades/read.php?article_id=...</A>
. . . GF3 - 38 FPS
. . . Radeon - 37 FPS, just 3% slower than a GF3!
. . . GTS 2 - 29 FPS, Radeon 28% faster :smile:
. . . Kyro 2 - 26 FPS, <b>Radeon 42% faster</b> (oops, the Kyro2 couldn't beat this 9-month-old card here :frown: )
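The percentages next to those FPS numbers are just relative-speed arithmetic; a quick sketch of how they fall out of the quoted frame rates (card names and FPS values taken from the list above, rounded to whole percent):

```python
# Giants 1024x768x32 results as quoted in this thread
results = {"GF3": 38, "Radeon": 37, "GTS 2": 29, "Kyro 2": 26}

def percent_faster(a_fps, b_fps):
    """How much faster card A is than card B, as a whole percent."""
    return round((a_fps / b_fps - 1) * 100)

print(percent_faster(results["GF3"], results["Radeon"]))     # GF3 over Radeon: 3
print(percent_faster(results["Radeon"], results["GTS 2"]))   # Radeon over GTS 2: 28
print(percent_faster(results["Radeon"], results["Kyro 2"]))  # Radeon over Kyro 2: 42
```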
Even in 16-bit mode the Radeon was able to beat the GTS 2 as well as the Kyro2. Once again, this shows everybody that the card you buy should depend on what you do and sometimes on the games you play.
Now, the Serious Sam benchmark shows a very awesome result for the Kyro2, and my hat is off to the Hercules 3D Prophet 4500. The only thing faster is a GF3, period. Excellent!!!
With QuakeIII at 800x600x32 with FSAA 4x, the GTS 2 barely beats the Kyro2, while the Kyro2 beats the Radeon hands down. In NASCAR Racing, though, even the Radeon beats the Kyro2 with FSAA 4x at 800x600x16??
The Formula 1 Grand Prix tests using different filtering methods show a rather steep degradation on the Kyro2 from bilinear to anisotropic filtering, from 120 FPS to ~21 FPS!
Meanwhile the Radeon maintains a rather decent, consistent 70 FPS :smile: . I use not only 16x but also 128x anisotropic filtering on my Radeon, which is just plain awesome for improving visual quality. Even a GeForce MX beats the Kyro2 easily when anisotropic filtering is used. What gives?
<A HREF="http://www.aceshardware.com/Spades/read.php?article_id=..." target="_new">http://www.aceshardware.com/Spades/read.php?article_id=...</A>
I really don't see the Kyro2 clearly beating the Radeon or the GTS 2. And once the more advanced games that are soon to hit the shelves arrive, the GTS and Radeon will look even better.
<P ID="edit"><FONT SIZE=-1><EM>Edited by noko on 04/27/01 10:24 PM.</EM></FONT></P>