*** Parhelia vs GF4 Ti4600 benchmarks - Oops huh?

June 25, 2002 2:51:32 AM

Well, looks like THG in Germany let the cat out of the bag early: the Parhelia flops in current benchmarks.

The site quickly pulled the article and bar charts, but you can still see them here:

http://arstechnica.infopop.net/OpenTopic/page?a=tpc&s=5...

Amazing that it even lost the FSAA tests to the GF4's FSAA in every benchmark!!!

or

http://www.3dgpu.com/yabb_se/index.php?board=2;action=display;threadid=716

"Earlier today a well-known German website disclosed their Matrox Parhelia numbers. The pages seem to have been removed at this time. While we cannot legally give you the content, as it is copyright material, we can certainly discuss what we have seen.

Aquanox - Parhelia was beaten by ATI's 128MB 8500 and the Ti4600 nearly doubled the score.

Comanche 4 - Parhelia not breaking the 30FPS barrier at 1024x768 while the Ti4600 broke 40.

Jedi Knight 2 - At 1024x768 the Parhelia was about 30% behind both the 8500 and the Ti4600.

3DMark2001 SE (build 330) - Just breaking into the 7000s while their test system was breaking 10K with the Ti4600. The 8500 dusted it again as well.

Quake 3 Arena - Parhelia lagging way behind both cards and not even giving deathmatch playable frame rates at 1600x1200 in my opinion.

At this point I am really wondering what Matrox was thinking. I know full well they have explained that the Parhelia will be the card for tomorrow, but when it can't keep pace with the current generation of GPUs across the board, you have to wonder. Triple-head gaming is not going to save Matrox this time round if what we saw is correct. I can certainly understand their reasons for not wanting to give the [H] a card at this point."
June 25, 2002 3:01:20 AM

LOL, yeah, I just saw the news over at hardocp.com. Looks like a misfire on Matrox's part... too bad, since they've been tooting their own horn about this card all this time. I didn't realize that the "well-known" German site was our own Tom LOL
June 25, 2002 3:01:46 AM

Yes I also read it and was about to post something here.

I simply don't understand why it was beaten so easily. It has plenty of bandwidth to spare, provides a new FAA method, has tons of pipeline stages and a 4x4 shader setup, as well as a lot of memory AND high core and memory clocks. How can it flop? I don't know, but I think it could've done MUCH better, something along the lines of 100% better.

And if this is what Matrox has been working on for two years now, I think they mistimed its release by a lot...


--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile:
June 25, 2002 3:08:38 AM

It's really quite a shame. I had hoped this card would be competitive with the next generation of cards expected from nVidia and ATi, but obviously it can't even compete with the older generation. It would have been nice to see some more competition in this market. Let's hope 3DLabs' card fares better.
June 25, 2002 4:32:29 AM

Don't give up just yet. First, the drivers may be part of the issue; second, it seems the Parhelia has a lot of optimizations yet to come. Maybe it'll be something like the GF3: it was good when it came out and shined later, once games written for it arrived. We'll see.

My frog asked me for a straw... dunno what happened, his ass went all over the place :eek: 
June 25, 2002 4:48:03 AM

Better yet, compare it to the original Radeon 8500...give the drivers some time to mature, and we may have a winner in a while

"When there's a will, there's a way."
June 25, 2002 5:36:37 AM

There's just no way with those specs that the Parhelia should be getting smacked around by the Ti4600 and R8500 this badly. No, it HAS to be a driver issue. But I'm amazed that Matrox released a card with such crappy drivers. Didn't they learn anything from ATi?

AMD Athlon XP 1900+, Asus A7V333, 512mb DDR RAM
PNY Geforce4 Ti4400, Win2k
June 25, 2002 6:21:19 AM

Yeah, it's sad, 'cause I had such high hopes for this card. But you may be right, Quetzacoatl.

So now... guess it's back to you, nvidia.

This little cathode light of mine, I'm gonna let it shine!
June 25, 2002 7:02:13 AM

*shakes head in disagreement* But wouldn't it be surprising if Matrox recovered and wiped the floor with Nvidia and ATi? On paper, the Parhelia should do exactly that, and I'm almost positive the results are down to poor drivers. They shouldn't be pressured, though, considering they have more than enough time to modify their design or improve the drivers. It's a very, very bad time to release the Parhelia now. Gods above, have mercy on them.

"When there's a will, there's a way."
June 25, 2002 7:23:37 AM

Well, it is true that most games have card-specific optimizations, especially the newer ones like Comanche 4, JK2, and the Unreal Performance Test, so it could be not only poor drivers but also a lack of software support. I remember reading at Anandtech that the latest builds of the UPT had ATi and Nvidia optimizations built in. I would love to see Anand's review, since he has access to the UPT. If it is indeed the card of tomorrow, then that engine should make it shine, or prove that it's going to fall flat on its face.

Too bad really, as we could use a third company to shake things up. PowerVR managed it for a while, but without a fab to make the KyroIII (or Series4, since STM owns the Kyro name), PowerVR is out of the game until it's sold. 3DLabs' P10 workstation <A HREF="http://www.extremetech.com/article2/0,3973,264433,00.as..." target="_new">benches</A> are nothing to be proud of either, so it still looks like ATi and Nvidia are the only games in town. Maybe someone will surprise us.

-SammyBoy
June 25, 2002 2:26:28 PM

I was most amazed by Matrox's own Quad Vertex Shader test. THIS, my friends, is what the Parhelia should be able to do. I also think the drivers are a big issue. It might be like the GF3 and 8500 were back when they were released, and in the future it might shine above the Ti4600, but those would have to be the most amazing drivers ever to make it jump nearly 100% faster.

And I don't think game optimizations alone will do it. I mean, that would take time, and it would STILL be left in the dust in current games.
The specs say there's more in it. Even if the GF4 has LMA II, which improves performance by maybe 30%, the Parhelia has so much more raw bandwidth that it should still hold a solid lead overall. I may be miscalculating, but I feel the Parhelia was supposed to shine. Higher clock speeds would help too, but that's not the issue here; it has so much on offer that the work done per clock should have counted for more, like the Radeon 8500.
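
For reference, here's the back-of-envelope math on raw bandwidth. The clocks are my assumption (the commonly quoted Parhelia 256-bit DDR at 275 MHz and Ti4600 128-bit DDR at 325 MHz), not figures from this thread, so treat this as a sketch:

    /* Peak memory bandwidth sketch. Bus and clock figures are assumed,
       not taken from the benchmarks discussed above. */
    #include <stdio.h>

    static double peak_gbps(int bus_bits, double mem_mhz)
    {
        /* bytes per transfer * transfers per second * 2 (DDR) */
        return bus_bits / 8.0 * mem_mhz * 2.0 / 1000.0;
    }

    int main(void)
    {
        double parhelia = peak_gbps(256, 275.0);  /* ~17.6 GB/s */
        double ti4600   = peak_gbps(128, 325.0);  /* ~10.4 GB/s */
        printf("Parhelia %.1f GB/s vs Ti4600 %.1f GB/s (%.0f%% more)\n",
               parhelia, ti4600, (parhelia / ti4600 - 1.0) * 100.0);
        return 0;
    }

If those clocks are right, that's roughly 70% more raw bandwidth, which makes the benchmark losses look even stranger.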

You're a good graphics editor, man. What do you think of the specs? Shouldn't the card live up to them?

--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile:
June 25, 2002 2:48:25 PM

I was surprised to find that it did as poorly as it did. Its performance was around the level of the 8500, and for double the MSRP, that's not a good position for Matrox.

The 3-monitor setup for gaming seems limited to a low resolution of 2400 x 600 (3 x 800 x 600), so that is a bit of a disappointment as well.

Drivers might help it a bit, maybe even bring it within striking distance of the 4600, but until the price goes down, it won't sell too well to the gaming public.

The biggest problem is that there is no compression architecture, so the huge bandwidth is wasted moving uncompressed data.
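
To illustrate the point (a toy sketch only; this is NOT Matrox's, nVidia's, or ATI's actual hardware scheme): depth values across a screen tile are often identical or form long runs, so even a dumb run-length encoding moves a fraction of the bytes. That's roughly the win hardware Z-compression buys, and what the Parhelia leaves on the table:

    /* Toy run-length "compression" of a cleared Z-buffer tile. */
    #include <stdio.h>

    /* Bytes needed to store the tile as (value, run-length) pairs. */
    static unsigned rle_bytes(const unsigned *z, unsigned n)
    {
        unsigned i, pairs = 1;
        for (i = 1; i < n; i++)
            if (z[i] != z[i - 1])
                pairs++;
        return pairs * 8;  /* 4-byte value + 4-byte count per run */
    }

    int main(void)
    {
        unsigned tile[64];  /* an 8x8 tile of 32-bit depth values */
        unsigned i;
        for (i = 0; i < 64; i++)
            tile[i] = 0x7FFFFFFF;  /* e.g. just cleared to "far" */
        printf("raw: %u bytes, rle: %u bytes\n",
               (unsigned)sizeof(tile), rle_bytes(tile, 64));  /* 256 vs 8 */
        return 0;
    }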

It can be said that smoking is one of the leading causes of statistics.
June 25, 2002 3:55:24 PM

My first response to the article: ouch! With that amount of bandwidth and power behind it... deary me :/  But here I am hoping that the Parhelia is a bull whose balls just haven't dropped yet... and I think the proverbial gubernaculum will be in updating the drivers and getting some sort of compression in there.

-

I plugged my ram into my motherboard, but unplugged it when I smelled cooked mutton.
June 25, 2002 4:06:50 PM

It doesn't "Have" to be a driver issue. Many predicted it would have problems due to it's lack of data compression at the hardware level, seriously crimping it's memory usage.

Some of it will be customization at the game level too, like the SharkMark test. Or Jedi Knight 2, which likely has some optimizations for the 8500, along the lines of how Carmack was praising the 8500.

English is phun.
June 25, 2002 4:15:00 PM

ATI and Nvidia have hardware-level compression, not driver-level compression. Driver-level compression would be inefficient and heavily CPU-intensive, defeating the purpose of a high-end GPU.

English is phun.
June 25, 2002 4:45:47 PM

Well, by the time this hits the shops, I think ATi and Nvidia will have released the RV250/R300 and NV30 respectively. I doubt the Parhelia will be anywhere near them performance-wise, no matter how much the drivers improve... it's a shame, but Matrox have failed to deliver on the gaming front.

isit alan?
June 25, 2002 5:14:21 PM

Like the Kyro II, this card would be a nice alternative to ATI and Nvidia if it were priced reasonably. However, with that price tag, who is going to buy it?

I like the fact that Matrox is stressing visual quality, which sometimes gets lost in the framerate race. However, when comparing the Parhelia with the Ti4600, I don't see much of an improvement.

I'm glad Matrox is back in the 3D video card market. They have always made good stuff. I was really hoping this card would bring Nvidia to its knees and, above all, send it back to the drawing board. If anything, it will keep Nvidia looking over its shoulder. Also, I'm sure you'll see improvements with subsequent driver releases, and perhaps in the near future a GPU that runs faster. Until then, it seems like a pretty good mid-range card with a whopper of a price tag.

To start press any key. Where's the "any" key? --Homer Simpson.
June 25, 2002 10:30:43 PM

Well, the difference is that the KyroII performed beyond its specs (who'd have thought a GPU at 175MHz with <i>SDR</i> RAM could outperform the GF2 GTS at high res and 32-bit, especially for half the price?). And it still managed playable frame rates in the Unreal Performance Test up to 1024x768x32. The problem is that PowerVR was unable to follow up on the success of the KyroII. The SE version was supposed to have 200MHz and a hardware/software T&L solution. It never came to be. STM, with the uncertainty surrounding the future of its graphics division, is offering PowerVR for sale, and until someone buys it, nothing more will be seen from the Kyro/PVR series. On paper, the Series4 should have destroyed the Ti4200 while remaining in the sub-$200 price range. But alas, there will probably not be a Series4 until they are sold off. VIA backed off, so no suitors remain. PowerVR remains hopeful that Series4 will be out by Christmas and Series5 will debut next year, but it's all hope. They have no fab, so they need someone with access, like STM.

Oh yeah, Anandtech has their review up. The Parhelia was able to best the Ti4600 twice, but those were in very niche categories.

-SammyBoy
June 25, 2002 11:32:17 PM

HAH, watching that mediocre performance and anisotropic-filtering suckage from Matrox makes me even prouder of my GF4 4400
June 26, 2002 12:32:17 AM

I'll agree with Phob... feels good to own a ti4400 :) 
June 26, 2002 12:36:47 AM

Anandtech's preview using UT2K3 at least showed some encouraging signs. However, in the frames-per-second-drooling benchmarking world we inhabit, they've got their work cut out...
June 26, 2002 1:19:34 AM

Matrox has been weighed.....measured.....and found lacking.

:wink: The Cash Left In My Pocket, The BEST Benchmark :wink:
June 26, 2002 1:21:56 AM

Quote:
I was most amazed by Matrox's own Quad Vertex Shader test. THIS, my friends, is what the Parhelia should be able to do. I also think the drivers are a big issue. It might be like the GF3 and 8500 were back when they were released, and in the future it might shine above the Ti4600, but those would have to be the most amazing drivers ever to make it jump nearly 100% faster.


If nvidia released a card where the only benchmark it did well in was one nvidia released itself, I would think something was fishy with that benchmark.

:wink: The Cash Left In My Pocket, The BEST Benchmark :wink:
June 26, 2002 2:03:30 AM

I KNEW IT! They have done dual-channel DDR RAM. Most interesting. It has to be, with the chips in sets of two and the 256-bit bus.

hope the card matures well. nvidia needs more competition.

Proud member of THG's Den Of Thieves :lol: 
June 26, 2002 2:38:51 AM

Funny, the key word here is "fishy". I hope that wasn't a pun intended for Matrox's benchmark name!

--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile:
June 26, 2002 2:50:19 AM

If I remember well, many here bashed Anandtech because of their conclusion on Matrox.

Once again, many were wrong. The number 1 trouble for all of you: you're always looking for the underdog who will knock Intel, Nvidia, or Rambus from their place.

Cheap, cheap. Think cheap, and you'll always be cheap. The AMD version of the semiconductor industry.
June 26, 2002 3:45:20 AM

To build my conclusion, I would like to see one last test.

Test this card with real applications, and I mean CAD applications.
Maybe this card is not intended (only) for gaming graphics.

Test it with a model that has a couple of million polygons in an application like 3DS Max or AutoCAD 2000, and look at the framerate.

I've seen a test like that with the GF4 and a FireGL, and the GF4 didn't do all that well.
June 26, 2002 8:11:31 AM

Jeez, why can't card manufacturers release their cards in a "complete" state?

Such a failure. If you release a card today, it has to be able to compete with or beat the cards that are currently on the market.

It doesn't matter what fancy new technology it supports. I won't buy such a card anyway; I want to be able to play today's games with the same framerates and features as today's cards are capable of, while knowing that I have a card that will work better with future games.

It's as if they released a car with three wheels today. Yeah... sure, people will buy the car and then wait for the fourth wheel. Not likely.

Just the fact that people are debating whether this card has a future or not is bad for Matrox.

I am disappointed.

Japanese Telecom
June 26, 2002 12:24:42 PM

Like Sammyboy was saying, the PowerVR series still has a lot of promise. It's a pity that things didn't turn out well because if Matrox was a decent gaming competitor, I sincerely believe the graphics industry would advance even more quickly than it is doing now.

Everyone knows that computer games give incentive for computer technology to improve. Even in the future when I will probably not be playing computer games at all, I still think that they serve a good function of encouraging upgrades to existing technology and preventing us from becoming content with just what we have.

And you know, the psychology behind this baffles me, because it's about collective psychology. Not just some big guy like Bill Gates pulling the strings, but entire companies and the gaming and technical communities reacting to this, and all the different voices and opinions people have. Out of the many voices, mine is one of disdain. I think the graphics card sector could use some more competition. I'm sick of hearing every day about monopoly hearings, about Micron and Hynix, and about Microsoft and the RIAA doing things to strangle consumers. I think nvidia could use a good competitor that doesn't compete on the basis of price, like ATi does, but competes on the same level of speed, features, and technology.

In time, will the drivers get 'better'? Who knows what better even is? One can only surmise (from what little we know) that the Parhelia could very well be a mediocre attempt at something great, one that simply failed.

This little cathode light of mine, I'm gonna let it shine!
June 26, 2002 2:59:08 PM

Quote:
I think nvidia could use a good competitor that doesn't compete on the basis of price, like ATi does, but competes on the same level of speed, features, and technology.

Remember, the 8500 was brought out to compete with the GF3, which it did very nicely, with speeds close to a GF3 and extra features that weren't available on a GF3.

Also, realize that the high-end market, where the GF3 Ti500, R8500, and Ti4600 were initially aimed, is a smaller percentage of the market than the midrange, where the MX, 7500, and 8500 are aimed now. Without competition there, you'll get two high-priced, albeit fast, cards, and anything below that won't be fast enough to make it worth it.

Matrox should have aimed this product at the 8500/Ti4200 price range, or at least the Ti4400's, as that's a price range where the features might be considered over performance by some users.

English is phun.
June 26, 2002 6:57:23 PM

The worst was how it got 150 FPS in Q3; a GF2 would do that. I find it rather sad...

--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile:
June 26, 2002 7:20:50 PM

Quote:
Matrox should have aimed this product at the 8500/Ti4200 price range, or at least the Ti4400's, as that's a price range where the features might be considered over performance by some users.

Dunno about it competing too well with the Ti4400, as most Ti4400s can be overclocked with Coolbits to Ti4600 levels. As for the 8500 and Ti4200, Anandtech shows that in most cases the 8500 can trounce the Parhelia, and the Ti4200 is close, performance-wise, to the 8500. I'd think Matrox's best chance is to go after the $100-200 market until they can prove they are able to compete in the mainstream on the level of performance and support. PowerVR proved that they can, and their driver updates up to this point have yielded good performance gains, as well as fixing bugs with games that don't handle TBR well. They would have had a good spring refresh in the KyroII SE (sound like "Ultra" to anyone? :wink: ), and the idea, before STM announced its intention to sell PowerVR, was to release the KyroIII this fall, which, if it lived up to its paper specs (which the previous Kyros exceeded), would probably have trounced anything in the sub-$200 range (after the R300 and NV30 release, that is). Then the $h!t hit the fan, and needless to say, PowerVR is out of the game until they are sold (development is still going on; PVR has said the KyroIII is ready for production fabbing, and Series5 (KyroIV) is in advanced development). Matrox would have been wise to follow that plan instead of going for the gusto with the first release.

If Anandtech is right, the Matrox roadmap shows revisions and refreshes coming, one maybe as soon as Christmas time, but unless the drivers improve (there were "issues" with UT2003 according to Anand), it won't matter. I applaud Matrox for making such a bold step, but being as maligned as they are in terms of support and performance, this definitely does not help them.

And yes, I was, and still am, a PowerVR fanboy, though I own a Ti4400 right now. I love competition, as anyone can see it keeps prices down (look at gas/petrol stations for proof) and innovation high (look at how <i>little</i> OSes for x86 have changed since Win95, though XP was, admittedly, a much larger step than in the past). I wish PowerVR had been sold quickly so the KyroIII could have come out, and I was really hoping that the P10 and Parhelia would put heat on Nvidia. The former isn't doing so hot in workstation environments, and the latter, well, we're discussing the shortcomings of the latter.

-SammyBoy
June 26, 2002 9:38:31 PM

But considering current performance and drivers, and the time it takes for decent drivers, is this card worth $399 when it runs at GF3-ish levels with extra features (the only two worth mentioning being the beautiful image quality and 3-monitor support)? Personally, I'd take an R8500 right now: better performance, $250 less, and drivers finally getting close to Detonator levels (my estimate; I haven't actually seen Catalyst in action).

Need Money!! Accepting Donations to help better my future. Thanks!
June 26, 2002 10:06:00 PM

Here's a thought. IF the Parhelia gets driver fixes that make its anisotropic filtering faster and allow its FAA to work with the stencil buffer, then we'd possibly have the best 3D image quality ever rendered, period. That might be worth paying for, even at $400. Now, since practically everyone is saying the GPU is just too slow, I would really like to see what would happen if someone overclocked this board. I'm sure Matrox told all reviewers not to post OC benchmarks of their cards, and I have the feeling they said this for technical reasons. Perhaps the Parhelia is not a very good overclocker? Then again, I hope someone proves me wrong.

Anandtech: "On the flip side of the board you'll see all of the solder pads for an extra 128MB of memory and all of the pads for the termination resistors that go along with the added memory. Matrox will be offering a higher clocked version of the Parhelia with 256MB of memory later this year; a 64MB card will also follow."

I'm considering buying this card when the price goes down, especially since by then I'm hoping we'll see some overclocking tests, a driver fix for improved anisotropic filtering performance, an overall 3D performance increase, and an FAA-with-stencil-buffer fix. You'll notice that in the UT2003 pictures in the Anandtech article, the "covering" over the top of the tunnel entrance is aliased. That's because FAA doesn't work on stencil buffers right now. Finally, the prospect of an all-in-one card with a TV tuner chip and VIVO is nice. I should take back what I said earlier; this card actually has a lot of promise. Not bad for a two-year hideaway.
June 27, 2002 2:38:07 AM

Well look, the features sell, for sure. I am sure most of us think it has awesome, interesting specs. It just turned out like the early P4s: disappointing results, with only clock speed helping until more cache and higher FSBs came in.
I hope they improve it through drivers. What I always wanted is for them to one day integrate AA into the chip, completely on all the time. Then, starting from the performance lost, they improve it so that in the end, when it's released, you get AA plus performance better than all other cards. Now THAT would be nice. I am a lazy guy and rarely use AA nowadays; I dunno, I guess I don't like enabling/disabling it.

BTW, what is stencil buffering? I never exactly knew what it was, nor what it looks like in an example image.

--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile:
June 27, 2002 7:42:13 AM

I never used AA much; if things looked too jagged, I'd up the res. Works out well for me, and in newer games, which are more CPU-limited than anything, upping the res has little effect on performance.

-SammyBoy
June 27, 2002 9:30:41 AM

Upping the res only helps so much. Also, 2x AA usually doesn't put as much stress on the video card as going up two resolutions (which IMO look similar).

Also, at high res many monitors have refresh rate issues. (I'd rather have AA at 1024x768@125Hz than no AA at 1600x1200@65-80Hz.)
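
Here's the rough arithmetic behind that (my numbers; I'm treating 2x AA as two samples per pixel, supersampling-style, so real multisampling would cost even less):

    /* Fill cost comparison: 2x AA at a moderate res vs a big res jump. */
    #include <stdio.h>

    int main(void)
    {
        long aa = 1024L * 768 * 2;  /* 1024x768 with 2x AA: 1,572,864 samples */
        long hi = 1600L * 1200;     /* 1600x1200, no AA:    1,920,000 pixels  */
        printf("2xAA@1024x768: %ld samples, 1600x1200: %ld pixels\n", aa, hi);
        return 0;
    }

So the AA case actually pushes fewer samples, and you keep the higher refresh rate.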

:wink: The Cash Left In My Pocket, The BEST Benchmark :wink:
June 27, 2002 2:24:50 PM

Now THAT is a good idea. Full, always-on AA, fully optimized to work with the memory interface.

Stencil buffering can be seen here:

<A HREF="http://opengl.org/developers/code/features/StencilTalk/..." target="_new">http://opengl.org/developers/code/features/StencilTalk/...;/A>
<A HREF="http://www.gamasutra.com/features/20000807/kovach_pfv.h..." target="_new">http://www.gamasutra.com/features/20000807/kovach_pfv.h...;/A>

If you don't want to scroll through and read it all, here's a description of the type of stencil buffering used in the screenshot below:

"Composites
You can use stencil buffers for compositing 2D or 3D images onto a 3D scene. By using a mask in the stencil buffer to occlude a portion of the render-target surface, you can write stored 2D information (such as text or bitmaps). You can also render 3D primitives -- or for that matter a complete scene -- to the area of the render-target surface that you specify in a stencil mask.
Developers often use this effect to composite several scenes in simulations and games. Many driving games feature a rear view mirror that displays the scene behind the driver. You can composite this second 3D scene with the driver's view forward by using a stencil to block the portion to which you want the mirror image rendered. You can also use composites to create 2D "cockpits" for vehicle simulations by combining a 2D, bitmapped image of the cockpit with the final, rendered 3D scene."
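
To make that concrete, here's a minimal sketch of the rear-view-mirror trick in classic OpenGL. The three draw_* helpers are placeholders I made up; the gl* calls themselves are standard GL 1.x:

    #include <GL/gl.h>

    void draw_mirror_quad(void);    /* assumed: the mirror's shape */
    void draw_rear_scene(void);     /* assumed: scene behind the driver */
    void draw_forward_scene(void);  /* assumed: scene ahead */

    void render_frame(void)
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT |
                GL_STENCIL_BUFFER_BIT);

        /* 1. Write 1s into the stencil buffer where the mirror sits,
              touching neither color nor depth. */
        glEnable(GL_STENCIL_TEST);
        glStencilFunc(GL_ALWAYS, 1, 0xFF);
        glStencilOp(GL_REPLACE, GL_REPLACE, GL_REPLACE);
        glColorMask(GL_FALSE, GL_FALSE, GL_FALSE, GL_FALSE);
        glDepthMask(GL_FALSE);
        draw_mirror_quad();
        glColorMask(GL_TRUE, GL_TRUE, GL_TRUE, GL_TRUE);
        glDepthMask(GL_TRUE);

        /* 2. Render the rear view only where stencil == 1. */
        glStencilFunc(GL_EQUAL, 1, 0xFF);
        glStencilOp(GL_KEEP, GL_KEEP, GL_KEEP);
        draw_rear_scene();

        /* 3. Render the forward scene everywhere else. */
        glStencilFunc(GL_NOTEQUAL, 1, 0xFF);
        draw_forward_scene();

        glDisable(GL_STENCIL_TEST);
    }

The stencil mask is what lets the two scenes share one render target without bleeding into each other, and it's exactly this kind of pass that the Parhelia's FAA currently has to skip.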

Here's an example:

<A HREF="http://www.anandtech.com/video/showdoc.html?i=1645&p=14" target="_new">http://www.anandtech.com/video/showdoc.html?i=1645&p=14...;/A> - Look at the picture captioned "Matrox Parhelia". Now look closely. Can you see the part that is aliased? This is of course because Matrox released a patch that makes the Parhelia skip using FAA on stencil buffers so that it doesn't crash. The aliased part is the work of a stencil buffer.

The Parhelia's 16X FAA is not based on supersampling the whole frame but on algorithms that detect and smooth only edge fragments, so it would be interesting to see whether adding stencil buffer + 16X FAA support to the Parhelia's drivers would cause a performance hit.

On the topic of image quality, here's an excerpt from tech-report.com:

"Matrox has claimed Parhelia offers "the world's most advanced texture filtering units," capable of delivering 64 texture supersamples per clock.
The best sort of texture filtering we tend to see is anisotropic filtering. Unfortunately, with current drivers, the "most advanced texture filtering units" can't do better than 2X (16-sample) anisotropic filtering. I noticed this limitation and asked Matrox about it, and they confirmed to me that current drivers are limited to 2X aniso for <b>performance reasons</b>. The hardware can do 8X (64-sample) aniso, and Matrox is considering enabling that capability in future drivers. Given that the GF4 Ti can do 8X aniso and the Radeon 8500 can (with some caveats) handle 16X aniso, I think enabling stronger forms of anisotropic filtering would be wise.
That said, Parhelia can do 2X aniso and trilinear filtering simultaneously, which gives it a leg up on the Radeon 8500."
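
As an aside, the sample counts in that quote line up if each anisotropic "tap" is a trilinear probe of 8 texels, i.e. samples = 8 x degree. That reading is my assumption, not something the article spells out:

    #include <stdio.h>

    int main(void)
    {
        int degrees[] = { 2, 8, 16 };  /* 2X, 8X, 16X aniso */
        int i;
        /* assumption: 8 texels per trilinear probe, degree probes per pixel */
        for (i = 0; i < 3; i++)
            printf("%2dX aniso -> %3d samples\n", degrees[i], 8 * degrees[i]);
        return 0;
    }

That matches the article's "2X (16-sample)" and "8X (64-sample)" figures.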

This leaves us with two final questions:

1. Reviewers have commented on how good Matrox's 4X FSAA looks. How good is it, really? From the screenshots at <A HREF="http://www17.tomshardware.com/graphic/02q2/020625/parhe..." target="_new">tom's</A>, it looks better than any other hardware setting. Seeing it in motion, of course, is imperative before deciding. In particular, see how the word "Entry" and the very back of the subway station are clearest with 4X FSAA.
2. With both trilinear filtering and 16-tap anisotropic filtering turned on, how does texture quality compare to nvidia's with 64 taps? Also, exactly how large is the 'performance hit' Matrox refers to with regard to anisotropic filtering?

This little cathode light of mine, I'm gonna let it shine!
June 27, 2002 6:49:36 PM

I totally agree with Juin here. Everyone (especially AMD fans) is always looking for an underdog to go crush Nvidia, Intel, or Rambus. OK, we can forget Rambus right now, since they are really working slowly, but why does everyone always want the underdog to crush Nvidia and Intel?

I agree competition is good, but that's the only thing we need: lots of competition. We don't need everyone always thinking or hoping a small company will come along, crush Nvidia or Intel, and then sell great products at cheap prices. Everything has to remain balanced. Companies like Nvidia should definitely exist, and so should competition. But neither should be wiped out. Imagine if Intel didn't exist and only AMD did. You think AMD would be selling their CPUs at cheap prices? Probably not. AMD has kept prices low for years for one main reason: to compete with Intel.

And anyway, Nvidia doesn't need a lot of competition, because even though Nvidia is the king of video cards right now, they are working hard and keeping up with their promise of releasing a new card every six months. Nvidia is a great company. They are just like Intel, only a bit more aggressive. Nvidia doesn't just want to beat the competition, they want to blow them away. ATI has tried their hardest to compete, and they're doing well. But Nvidia has taken over the high-end sector of the market. We saw PowerVR try to knock Nvidia off in the high end, but they failed. Matrox and 3DLabs are trying to do the same, and so far, Matrox has failed. Right now, Nvidia can't be beaten in the high end. In midrange and low end, the competition has a greater chance: ATi is doing well there, but so is Nvidia. Nvidia's profits have been climbing consistently for almost 3 years now, and right now their stock is holding steady.

Now, back on topic. I doubt it's "poor" drivers causing <b>all</b> the problems for the Parhelia right now. I admit it looked amazing on paper, but I believe the main reason it's not performing well is that the card doesn't have a lot of hardware optimizations. Poor drivers are one thing limiting performance, but it's mainly the hardware that's actually limiting it. As Bront said, the Parhelia doesn't have hardware-level compression, or any other similar optimization for that matter. There are several reasons why the GF4 does so well. First of all, Nvidia optimized the GF4's hardware to the limit. They optimized the memory crossbars and tweaked the LMA; after optimizing and revising, it's now called LMA II. The GF4 manages memory extremely well, and its GPU has mind-boggling power. If you want to know more, read the long review of the GF4 here on Tom's. Another reason it performs so well is the amazing Detonator drivers. The driver team at Nvidia is simply amazing; they've managed to squeeze additional performance out of not only the new cards but the old ones too. The Detonator drivers are very mature, and soon Nvidia will release the "Detonator 5" series, which is rumoured to bring additional performance improvements for all existing Geforce cards, along with support for the NV30 and for Nvidia's new Cg programming language. With Cg, Nvidia has taken yet another step to ensure their cards perform well: it gives programmers and developers an easy way to talk to the Geforce cards and fully utilize all the features they possess, like rasterizers, vertex shaders, and pixel shaders. Rumour is that next-gen games like Quake 4, Doom III, and Deus Ex II will all fully utilize the features of the Geforce cards.

Justiceissweet, that's real smart comparing a mainstream GF4 with a workstation-class, professional-level FireGL. A better comparison would be an Nvidia Quadro 4 XGL compared to the latest FireGL.

------------------------------------------------
Montecito & Chivano; Intel's Big Guns.
June 27, 2002 7:09:14 PM

You're telling me the Detonator 5s are coming soon and will squeeze even more performance out of my GF3 Ti200?
Man, is Nvidia awesome or what!

--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile:
June 27, 2002 8:00:28 PM

DAMN... I have a R8500.... :tongue:

:smile: Falling down stairs saves time :smile:
June 28, 2002 2:13:59 AM

Feels even better to own a Ti4600 ;) 

Money talks, so they say, but mine only says
"Goodbye"
June 28, 2002 2:15:32 AM

OK, I admit I was being a bit of a fanboy... I should've said that only if the drivers were out and proved to be THIS promising!

--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile: