Parhelia can O/C soon

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
Currently no software supports overclocking the Parhelia, but this will change in a few weeks. Straight from matroxusers.com:

Thursday July 11, 2002
Parhelia - Overclocking And Buying Custom Resolutions
News item posted at 16:44 GMT by Ant
There have been a lot of people asking about overclocking the Parhelia and how to do it. The answer is yes: the Parhelia is overclockable, and it will be made possible by the second release of the new MTS Tweak utility.

A lot of people here and on mIRC are curious about o/c'ing the Parhelia, and I have posts scattered pretty much all over the place. To save people going back and forth, here's what you should all know.

All retail and bulk versions can be o/c'ed by a few percent.

Since not all electronic components are made equal, not everyone will be able to get the same percentage, even if you mod the HSF.

We'll be adding an o/c'er in the 2nd release of the tech support tweaker with Parhelia support. The 1st release will be out in a few weeks.

This little cathode light of mine, I'm gonna let it shine!
 

OzzieBloke

Distinguished
Mar 29, 2001
167
0
18,680
Whoop-dee-f***ing-doo... let's overclock by 3% a card that is so far behind the GeForce 4, and even the Radeon 8500, in performance... whoopee.

By the way, I was being sarcastic.

-

I plugged my ram into my motherboard, but unplugged it when I smelled cooked mutton.
 

eden

Champion
I am excited to see this, to be honest.

But slightly off-topic, here is what I expect that can make the Parhelia kick ass:

-At a 220MHz core clock, it is 160MHz (DDR) below the Ti4600. 160MHz is a pretty big gap, so if it can get there with a new revision or through OCing, it will rival it. Has anyone checked whether, clock for clock, a Parhelia is as good as a Ti4600? If it is as good or better, then at 300MHz it CAN kick ass. Now this isn't over yet.

-Drivers are a big issue so far, according to many review sites, including Anandtech. I think Matrox can manage as much as a 10% performance improvement even when lazy. If they had Nvidia's team, I am betting they could deliver a sure 50% improvement, like the Detonator XPs did. However, we must be realistic and rein things in, so the hype won't get so big that expectations get destroyed. We'll be conservative and say a 10% improvement from drivers at most. SO, so far, with Ti4600 clock speeds and improved drivers, we can expect the Parhelia to be 15% above the Ti4600. Not bad, but at its price, value has never mattered more.

-Next, the Parhelia is expected to get a 256MB version; maybe its bandwidth needs that, who knows, so we'll be strict and say a 3% improvement at most.

-Finally, this one can account for a lot, as most of the features in the card are hardly being used efficiently, such as those nice 4 vertex shaders and the 36-stage pipeline, wow! To make this happen, I think they need to apply occlusion culling techniques, as well as an extremely efficient memory controller, like Nvidia's crossbar memory controller, which in the GF4 now delivers well over a 30% per-clock improvement against the GF2. With that in place, you also have the HUGE bandwidth of the Parhelia, so you can picture a GF4 Ti4600 on steroids at 18GB/sec! Roughly twice the bandwidth. With that, a nice 50% could be done, but to be strict, 35%, so as not to sound too ecstatic.
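For anyone wanting to sanity-check those bandwidth figures, peak memory bandwidth is just bus width times effective memory clock. A quick sketch, using the commonly published clock specs for both cards as assumptions:

```python
# Rough sketch: peak memory bandwidth from bus width and effective (DDR) clock.
# The clock figures below are approximate published specs, not measured values.

def peak_bandwidth_gb_s(bus_width_bits: int, effective_clock_mhz: float) -> float:
    """Bytes per second = (bus width in bytes) * effective transfers per second."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

parhelia = peak_bandwidth_gb_s(256, 550)  # 256-bit bus, ~275 MHz DDR memory
ti4600 = peak_bandwidth_gb_s(128, 650)    # 128-bit bus, ~325 MHz DDR memory

print(f"Parhelia: {parhelia:.1f} GB/s")  # ~17.6 GB/s
print(f"Ti4600:   {ti4600:.1f} GB/s")    # ~10.4 GB/s
```

So the post's "18GB/sec" is close to the Parhelia's theoretical peak, though against the Ti4600's ~10.4 GB/s the ratio is nearer 1.7x than "roughly twice".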

Alright, summing things up, the Parhelia CAN get as much as 65% better performance than a Ti4600 once its clock is at 300MHz. Am I reaching too far? I suggest you review the possible improvements I listed. Add to that the fact that Fragment AA is actually very efficient and barely drains strength, and you can get AA'd gaming at still about 45% better performance than a Ti4600. Although it is flawed at times, as with stencil buffers, it still creates a beautiful scene most of the time. Now I ask you: is an average of 65% improved performance worth the $400, which gets you partial DX9 support, Surround Gaming, Fragment AA, excellent 2D image quality, and very nice 3D image quality as well?
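One way to combine estimates like these is to compound the individual factors multiplicatively. A rough sketch using the post's own speculative figures (300/220 clock scaling, 10% from drivers, 3% from memory, 35% from efficiency) — all guesses from the post, not measurements:

```python
# Sketch: compounding the post's speculative per-factor gains multiplicatively.
# Every percentage here is the post's guess, not a measurement.

factors = {
    "core clock 220 -> 300 MHz": 300 / 220,  # ~1.36x
    "driver maturation": 1.10,               # conservative 10%
    "256MB frame buffer": 1.03,              # strict 3%
    "occlusion culling / memory efficiency": 1.35,
}

combined = 1.0
for name, factor in factors.items():
    combined *= factor
    print(f"{name}: x{factor:.2f} (running total x{combined:.2f})")

print(f"Combined speculative gain: {(combined - 1) * 100:.0f}%")  # ~109%
```

Multiplying the factors gives noticeably more than adding the percentages, which is one reason back-of-envelope stacking like this should be taken loosely.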

To me, it is, and I sure hope these possibilities are implemented. This is a true card with potential, but it comes from a serious underdog in gaming right now.

--
:smile: Intel and AMD sitting under a tree, P-R-O-C-E-S-S-I-N-G! :smile:
Edited by Eden on 07/12/02 06:11 PM.
 

OzzieBloke

Distinguished
Mar 29, 2001
167
0
18,680
Without hardware texture compression, it is going to crawl since a good chunk of the bandwidth is taken up by textures... that is a big problem, I would think.

And to be honest, I think you are fantasizing a bit about the Parhelia's overclockability... yes, better drivers will help, but even with the other bits and pieces you mention, that lack of texture compression is going to make it chug :/ Which is a bit of a pity... it looked like such a promising card.

-

I plugged my ram into my motherboard, but unplugged it when I smelled cooked mutton.
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
I'm also wondering about the hidden potential of this card. But I do have to remain true to my roots and be a pessimist.

However, as I so loquaciously put it in my other post, and partly hinted at, I am one of those people who value image quality over speed in every case, unless the speed gets to a point where it is impractical. This means that if I have a chance to run 4X FSAA at 20-30 fps average, I will run that instead of having no anti-aliasing at 80 fps.

It's also why it upsets me to see so many unfair reviews of the Parhelia emphasizing only speed and benchmarks. Some of you may disagree with me on this one, but you must admit that most of the discussion generated by the reviews, and even a few early conclusions reached about the card, were based largely on benchmarks, pure numbers, without giving image quality as much credit as it deserves.

The Omega drivers have really opened my eyes. I mean, look with your own eyes at what we can do to increase image quality. So there's aliasing, you say? Moire? No problem: on TOP of increased LOD settings, let's add a filtering algorithm that will slow performance by another 20% but make things look so perfect it will be nearly impossible to notice any rendering defects at all. We've been doing this kind of "filtering algorithm" for a while now anyway, so it should be easy: it's called anisotropic filtering. So let's make a modified version of anisotropic filtering that works with the increased LOD. I also have a hunch. You know why I think the 4X FSAA mode on the Parhelia had such crap performance? Because it was doing what I just described. If you look at the pictures in the reviews, the textures in 4X FSAA mode were the clearest. Aliasing in 4X mode obviously won't be as good as 16X FAA, but still, it's a definite step in the right direction, and much cleaner and better looking than anything out there right now.

Now all someone has to do is combine fragment anti-aliasing, greatly increased texture LOD settings, and a filtering algorithm to filter the moire that occurs with increased texture LOD. IMO this combination would project the most perfect-looking 3D image anyone could ever come up with. And it's very doable, perhaps even with the Parhelia. (Matrox, however, has alluded to the fact that it has disabled certain graphical combinations because performance would drop too much--we already know that 2X anisotropic filtering is the max its drivers will allow you to set, for this very reason.) The reason I suggest combining these features is that we've already seen what anisotropic filtering can do, we've already seen what fragment AA can do, and we've already seen what increased texture LOD settings can do. Now all we have to do is combine them. It may run at 3/10 of the speed of another graphics card rendering with no AA, no texture filtering, and no enhanced textures, but is the improved quality worth it? Once gamers see the level of image quality that is possible, I think half of them will say yes. The other half will ride the clock speed/fps bus, because we're only now realizing that both are equally important. nVidia is picking up on this with their next-gen cards, and ATi has given us a small sampling of what is to come with Truform.
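That "3/10 of the speed" estimate can be sketched the same way as the performance estimates earlier in the thread: each quality feature multiplies frame time by some cost factor. The factors below are illustrative assumptions chosen to show the idea, not measured numbers:

```python
# Illustrative sketch: per-feature performance costs multiply together.
# The cost factors below are assumptions for illustration, not measurements.

costs = {
    "fragment AA": 0.85,                  # FAA only touches edge pixels, so cheap
    "raised texture LOD": 0.70,           # sharper textures mean more texel traffic
    "aniso-style moire filtering": 0.50,  # extra samples per texture fetch
}

speed = 1.0
for feature, factor in costs.items():
    speed *= factor

print(f"Relative speed with all features on: {speed:.2f}")  # ~0.30 of baseline
```

With those assumed factors the features stack to roughly 3/10 of baseline speed, which is why the whole combination is a quality-versus-speed trade rather than a free lunch.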

Too many gamers these days don't even realize that their eyes cannot actually see 150 frames per second. It may take a while for the industry to adjust, but very soon more and more attention will be paid to adding graphical detail and image quality enhancements, as well as more speed. Definitely a brighter and more aesthetic future ahead for graphics.