What does "Truform" do?

MrPanther0

Distinguished
Sep 12, 2003
178
0
18,680
I'm just wondering about this feature in ATI's driver. They seem to think it's not a good idea, since its default is always off no matter what performance/quality setting is chosen. Does it really improve image quality or something?
 

GeneticWeapon

Splendid
Jan 13, 2003
5,795
0
25,780
Google is wonderful.
<A HREF="http://www.digit-life.com/articles/atitruform/" target="_new">http://www.digit-life.com/articles/atitruform/</A>

3DMark03: http://service.futuremark.com/compare?2k3=1790928
3DMark 2001SE: http://service.futuremark.com/compare?2k1=7374242
 

Crashman

Polypheme
Former Staff
TruForm can make an octagonal ball look like a spherical one. It's designed to overcome the fact that a lot of game graphics use too few triangles. It's a really good technology that unfortunately came out when nVidia was the undisputed champ of the graphics industry, hence it was mostly ignored.
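For anyone wondering how it actually gets triggered: TruForm is ATI's implementation of N-patches (PN triangles), which Direct3D 8 exposes through a caps bit and a render state. A minimal sketch, assuming the application already has a created device (the tessellation level of 4 is just an example value):

    #include <d3d8.h>

    // Enable TruForm-style tessellation (N-patches) if the card supports it.
    void EnableNPatches(IDirect3DDevice8* device)
    {
        D3DCAPS8 caps;
        device->GetDeviceCaps(&caps);
        if (caps.DevCaps & D3DDEVCAPS_NPATCHES)
        {
            float segments = 4.0f;  // subdivide each triangle edge into 4 segments
            device->SetRenderState(D3DRS_PATCHSEGMENTS,
                                   *reinterpret_cast<const DWORD*>(&segments));
        }
    }

The driver then bulges each triangle outward using its vertex normals, which is how a chunky low-poly ball ends up looking round without the game shipping more geometry.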

Only a place as big as the internet could be home to a hero as big as Crashman!
Only a place as big as the internet could be home to an ego as large as Crashman's!
 

silverpig

Splendid
Dec 31, 2007
5,068
0
25,780
It's a cool thing for sure, but I find it slows down my games way too much at the default settings. Even in CS my 9800 Pro becomes unplayable with TruForm enabled. Turn it off and I'm back to a constant 99 fps.

Some day I'll be rich and famous for inventing a device that allows you to stab people in the face over the internet.
 

mrblah

Distinguished
Dec 28, 2003
4
0
18,510
I had it enabled on my 8500LE with an 800MHz P3 and it still ran so smoothly in CS that I got tearing at high res.

The thing about TruForm is that it doesn't work with everything. For some game models it makes them look like balloon animals. In CS I had it enabled along with a high-res weapon skin pack (Fusion), and TruForm made the guns puff out and look really bad; the character models also had weird distortions.

I think a game has to be made for TruForm for it to actually work and look normal, which is probably why it's off by default.
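That matches how the API is meant to be used: in Direct3D 8 the application turns N-patch tessellation on and off per draw call, so when the driver forces it globally, meshes that were never authored for it (view-model guns, player models) get rounded off too. A rough sketch of per-mesh control, where Mesh and its useTruform flag are made-up names for illustration:

    #include <d3d8.h>

    struct Mesh {            // hypothetical mesh record, for illustration only
        bool useTruform;     // set only for meshes authored with tessellation in mind
        UINT  numVerts;
        UINT  numTris;
    };

    void DrawMesh(IDirect3DDevice8* dev, const Mesh& mesh)
    {
        // 1 segment per edge = no subdivision; values above 1 curve the surface
        float level = mesh.useTruform ? 4.0f : 1.0f;
        dev->SetRenderState(D3DRS_PATCHSEGMENTS,
                            *reinterpret_cast<const DWORD*>(&level));
        // Vertex/index buffers are assumed to have been bound elsewhere.
        dev->DrawIndexedPrimitive(D3DPT_TRIANGLELIST, 0, mesh.numVerts, 0, mesh.numTris);
    }

A driver override can't make that per-mesh distinction, which is why forcing it on in the control panel inflates everything equally.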
 

SoDNighthawk

Splendid
Nov 6, 2003
3,542
0
22,780
Most game servers can run at (or provide) up to 500 FPS. As an admin for the CanadianForces server, I can log into the server through the back door and watch everyone play, like watching TV. Using a DOS window logged into the server from my desktop, I can also read how many FPS the server is currently running at. Most of the time our server provides over 600 FPS, but even on my home computer I get 100 FPS in game. The reason is that we have the player default set to a 100 FPS max, and in the config for your own copy of Day of Defeat at home you can raise the cap from the 75 FPS default up to 200 FPS; that way, if your card can produce those numbers, you will get the 100 FPS in game.

If I watch the game server in HLTV mode, like watching TV, I can get up to 500 FPS on my current system, as displayed by a console command called net_graph 3.

That command in Day of Defeat shows the FPS in the lower right-hand corner, as well as connection information.
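For reference, both of those are just Half-Life engine console variables; something like the following in your config file (or typed into the console) is all it takes, though the exact default cap varies by engine build:

    // Client frame-rate cap; raise it from the stock default (low 70s) toward what your card can hold
    fps_max 100
    // Overlay in the lower-right corner showing FPS and connection info
    net_graph 3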

The reason ATI must have invented the TruForm driver option was to eliminate the rips you can see in some DirectX games, where adjoining walls show pixelation or tearing (gaps).

Since Nvidia works closely with the engineering group at DirectX, they have insight into the software DX provides, and on the new FX cards they have been able to eliminate most of the tearing or pixelation in DirectX games, and supposedly in the new DX9 games as well... yet to be proven.

Nvidia uses vertex shader units, an FP array, and an extreme timing multiplier in place of the old vertex standard ATI still uses to eliminate the graphics glitches. That is what Nvidia claims, but on my current card I can still see the gaps in the walls in DoD, just like I did on my old card.

So either it doesn't work like they claim, or I won't see a difference until I get into Half-Life 2, which will use DX9.

The other thing that occurs to me is that the games are simply written in such a way that those gaps are normal at the programming level, and no amount of graphics technology can remove them unless the games are designed around the new DX9 engineering specs.

Have to wait and see how HL2 runs.

Barton 3200+ 400MHz
A7N8X Deluxe
Liquid
2x512 Kingston HyperX PC3200
GeForce FX5900
Maxtor DiamondMaxPlus9@80Gig
SONY CD 52x
SONY RW 52x/24x/52x
SONY DVD 16x/40x
 

silverpig

Splendid
Dec 31, 2007
5,068
0
25,780
Yeah, it was all fine until I got quite a few players on the screen at once and viewed a large open part of the map. The extra polys were just too much.

Some day I'll be rich and famous for inventing a device that allows you to stab people in the face over the internet.
 
Nothing, it's a stupid useless feature from a stupid useless company.

Much like this is a stupid useless post from a stupid useless idiot? :smile:

Really guys... if you have to make a company look good by constantly bashing the other guy... what does that say about the company you support?

If you design software that is fool-proof, only a fool will want to use it.
 

Captain_BS

Distinguished
Dec 29, 2003
7
0
18,510
I love you! The way you spread bullsh!t is excellent; the way you provide the information makes what you say completely believable to the layperson. It's... it's... almost enough to make me cry. You really are my hero. It has only been since your arrival here that I've had enough blatant bullsh!t to become a cohesive personality with! Thank you! Thank you very much! Please continue!
 

SoDNighthawk

Splendid
Nov 6, 2003
3,542
0
22,780
I think we should have Captain_BS work with the homeless during the New Year; he has some real personality problems. Only a fook like him could find a reason to be nasty on a help message board.

It is not anyone's fault here that he's more bent than an 8-inch penile implant, but we could all wish him a better New Year and hope he climbs down off that lump of coal he's stuck on top of.

Barton 3200+ 400MHz
A7N8X Deluxe
Liquid
2x512 Kingston HyperX PC3200
GeForce FX5900
Maxtor DiamondMaxPlus9@80Gig
SONY CD 52x
SONY RW 52x/24x/52x
SONY DVD 16x/40x
 
Ok... on paper, the nVidia FX-series looks better... higher clock speed, higher transistor count, all that kind of BS.

Now, let's look at CPUs for a moment. Taking SoD's logic (or rather lack thereof) and using it against him, we find the following:

Intel's Pentium 4 processor runs at a much higher clock rate and has a much higher transistor count than Athlon XP. Therefore, (again, using SoD's 'logic') the Pentium 4 must absolutely CRUSH the Athlon XP. Yet, here is SoD using an Athlon XP.

Please provide a reasonable explanation, SoD. You use nVidia because it looks better on paper (rather than in real-world apps, like the rest of us), yet you use a processor that is quite inferior on paper. As you say, look at the numbers: the P4 has a higher transistor count, a much higher clock rate, and a faster FSB. Why aren't you using a P4?

Admit you're wrong just once, will you?

If you design software that is fool-proof, only a fool will want to use it.