geforce image quality??

jchedley

Distinguished
Nov 14, 2001
37
0
18,530
Hi,
today at work my Oxygen VX1 (in my opinion a great card, with very nice 2D quality and OpenGL accuracy, but not so great for DirectX these days) was swapped for a Winfast GeForce2 MX.

After two years with the VX1, I instantly noticed a degradation in 2D quality: murkier, fuzzier. It's not bad by any means, but I noticed it straight away.

Are newer GeForce cards better at 2D, or is this just how it is?

thanks!
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
The poor quality is due to the RFI filters installed on Geforce video cards. Some cards are worse than others.

There is a modification that can be done to improve the quality. It involves cutting out 3, 6, or 9 capacitors (depending on your video card) and bypassing (or cutting and re-tracing) 3 or 6 inductors.

You can also bypass the whole filter circuit by soldering 3 wires. There is one filter for each of the RGB signals.

The first method is easier because it can be done without soldering. You pop off the capacitors and use conductive paint to bypass the inductors (actually you just carefully paint across the top of the inductors). If you want, you can also cut out the inductors, but this leaves the circuit open and you have to close the connection (you can still use conductive paint). You can even skip the painting step, since cutting out the capacitors provides about 80% of the improvement, and doing just this much makes for a zero-cost modification.

The second method involves soldering, but it is reversible and probably less risky, depending on your soldering skills. My skills aren't very good, so I just clipped off the capacitors and painted the inductors on a Geforce256, a Geforce2 GTS-V, and also an old ATI All-in-Wonder.

Be warned: there is a risk. Removing the capacitors could damage the underlying traces. If that were to happen, the only fix would be moving on to the second method and soldering in a little bypass.

Here is a link to the modification procedure.

<A HREF="http://www.geocities.com/porotuner/imagequality.html#26sep2000" target="_new">http://www.geocities.com/porotuner/imagequality.html#26sep2000</A>

Really, it's not that bad. When I did the mod, two capacitors just popped right off. The rest just disintegrated. Knowing this, in the future, I would just use a Dremel tool and a cutting disk and cut right through the capacitors. This should eliminate the risk to anything else on the video card.
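If it helps to see why pulling the capacitors sharpens things, the filter on each color line can be sketched as a simple LC low-pass stage. The component values below are hypothetical (I never measured a real card); the point is just that less capacitance pushes the cutoff frequency well above the video bandwidth:

```python
import math

def lc_cutoff_hz(inductance_h, capacitance_f):
    """Resonant cutoff of an idealized LC low-pass stage: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Hypothetical values for one RGB channel's filter (not measured from a real card)
L = 100e-9   # 100 nH series inductor
C = 22e-12   # 22 pF shunt capacitor

stock = lc_cutoff_hz(L, C)        # with the capacitor in place
modded = lc_cutoff_hz(L, 2e-12)   # capacitor clipped off; ~2 pF stray capacitance remains

print(f"stock cutoff:  {stock / 1e6:.0f} MHz")
print(f"modded cutoff: {modded / 1e6:.0f} MHz")
```

With these made-up values the cutoff jumps from roughly 100 MHz to several hundred MHz, which is why the pixel clock (and your fine text detail) gets through untouched after the mod.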




<b>We are all beta testers!</b>
 

Stiffler

Distinguished
Nov 3, 2001
262
0
18,780
The resoldering is not entirely necessary, though :/ Although I don't have any experience with it, there is a piece of software that is supposed to be able to fool the PC into thinking your card is a Quadro!
<A HREF="http://www.nvworld.ru/downloads/SoftQuadro.ZIP" target="_new">http://www.nvworld.ru/downloads/SoftQuadro.ZIP</A>

For more info, this is a great link:
<A HREF="http://www.geforcefaq.com/faq.cgi" target="_new">http://www.geforcefaq.com/faq.cgi</A>

Tim

I am Homer of Borg ! Prepare to be... MMMhhhhh Doughnuts
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
For a second I thought I provided the wrong link.

Quadro mods are a completely different thing. Those mods, software and hardware, unlock features that are built into the nVidia GPUs. These are features that are useful to graphic designers and the like, but not necessarily to gamers.

The mod I was speaking of is one that fixes blurry video output. (Click the link I provided in my earlier post; you will see a diagram of the problem.) The mod sharpens up the output of your graphics card, and it does this for all video, 3D and 2D. This cannot be achieved by software.

<b>We are all beta testers!</b><P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 01/15/02 10:09 PM.</EM></FONT></P>
 

reptilej

Distinguished
May 3, 2001
301
0
18,780
OK, I downloaded the software, but for some reason it is asking for some VXD file that I can't find, so I am unable to install it.

Second of all, will this software improve my 2D text quality on my Asus GTS if I do get it to work?

repeat after me, we are all individuals!
 

jchedley

Distinguished
Nov 14, 2001
37
0
18,530
Thanks for the info. I haven't done any electronics work for some years, and although I'd like to do this, work might not be so pleased.

Does the problem still persist in modern GF3 cards? E.g., if I were to buy a Leadtek GF3 Ti200?

cheers,
john.
 

bikeman

Distinguished
Jan 16, 2002
233
0
18,680
Hi!

Hmmm... this seems like an interesting thread, and actually it's something that has been 'bothering' me for a while too. At home we use (okay, I know it's outdated, but still...) a Diamond Viper V770 (TNT2 Ultra based). It gives quite sharp images on our Iiyama Vision Master Pro 450 (19") at all resolutions up to 1280x1024. Only at the resolution we actually use, 1600x1200, does it get a little blurry. Before the Diamond we had a no-brand TNT2 Ultra card, and there the quality was even worse. Although it wasn't a real bother, it made me wonder...

Nowadays everybody is talking about the tremendous speeds of the most recent video cards, and sometimes even benchmark results for 'image quality' are shown. And that, I think, is weird. First of all, your monitor is a very crucial factor, and second, there is the video card's RAMDAC and accompanying circuitry (what you are talking about), which determine a lot. But still, all video cards with the same chipset score the same number of points. Okay, I am not stupid. I know that those benchmarks point out what features the chipset supports (FSAA, hardware motion compensation, ...), but they don't tell you anything about the crispness of the image on your screen. Now my question is where I can find reviews that look at those things too. Because I do. The quality of the picture determines a lot of the joy you get from working at a computer. If any of you could give me a URL or something alike? Thanks a lot!
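For what it's worth, the blur showing up only at 1600x1200 lines up with the analog bandwidth that mode needs. Here's a rough pixel-clock estimate; the blanking overheads are assumed round numbers, not your monitor's actual VESA timings:

```python
# Rough pixel-clock estimate for a display mode.
# The blanking fractions are assumed typical values, not exact VESA timings.
def pixel_clock_hz(h_active, v_active, refresh_hz, h_blank=0.30, v_blank=0.04):
    h_total = h_active * (1 + h_blank)   # visible pixels plus horizontal blanking
    v_total = v_active * (1 + v_blank)   # visible lines plus vertical blanking
    return h_total * v_total * refresh_hz

for mode in [(1280, 1024, 85), (1600, 1200, 75)]:
    clk = pixel_clock_hz(*mode)
    print(f"{mode[0]}x{mode[1]} @ {mode[2]} Hz -> ~{clk / 1e6:.0f} MHz pixel clock")
```

Close to 200 MHz, any low-pass filtering on the card's output stage starts eating into the signal, so a filter that's invisible at 1280x1024 can noticeably blur 1600x1200.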

Greetz,

Bikeman
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
If you are only experiencing a little blurriness I would suggest cleaning the pins on the VGA cable. A contact enhancer wouldn't hurt. This might be enough to fix your problem.

I know some Iiyama monitors have BNC connectors. This connection would be preferable to the standard VGA cable if you have a choice.


My problem was much worse than what you describe.
Although I could read text at 1024x768, it was noticeably blurry. My older Geforce256 was much better. When I came across the modification, I wasn't convinced to try it until I compared the Geforce2 card against the Geforce256. The filters were similar, but I noticed that one set of capacitors was missing from the Geforce256. There were places for them on the circuit board, but they were purposely left off. That convinced me.

Now text is sharp right up to the limits of my monitor (1600x1024, 60 Hz). Pictures and games look much better too.

The forums at <A HREF="http://www.hardocp.com" target="_new">www.HardOCP.com</A>, where I learned of the mod, had a message thread with several people discussing doing the mod to Geforce3 cards. I was sweating bullets doing it to a $65 Geforce2 card. I was amazed how many people were trying it with what, at the time, was a $500 card. Clearly, though, there are people who are unsatisfied with the Geforce3's visuals.

I'm not aware of any reviews where any emphasis is placed on visual quality.

<b>We are all beta testers!</b><P ID="edit"><FONT SIZE=-1><EM>Edited by phsstpok on 01/17/02 02:28 AM.</EM></FONT></P>
 

bikeman

Distinguished
Jan 16, 2002
233
0
18,680
Hmmm... I don't think it has anything to do with dirt inside the connector, although it's not impossible, since our computer is one big pile of dust on the inside. Only recently, after vacuuming it out entirely, did the cooler on our video card stop screaming for attention. I know, we are criminals. Anyway...
I still wonder how the mod works. Would manufacturers really put in place output circuitry that actually makes image quality worse? I don't even see any commercial use for that. The only thing I can think of is that those things have to conform to some regulations concerning electromagnetic radiation or something. But even then, would a sharp image be more polluting? I wonder... Anyway, thanks for the information, although you won't see me working on my video card with a screwdriver...

Greetz,

Bikeman
PS: I just read the thread 'Text quality', where people are talking about the same problem. And indeed, it is the 'FCC RF emission standards' that are the cause of all this. I suppose this ends our nice chat?
<P ID="edit"><FONT SIZE=-1><EM>Edited by bikeman on 01/19/02 04:38 PM.</EM></FONT></P>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Cleaning and using a contact treatment just makes for a better electrical connection, and it can help with a "fuzzy" display. If you modify a cotton swab by pulling most of the fuzz off of it, you can use the swab to clean the pins of the VGA connector. Dip the swab in rubbing alcohol first. You will see what I mean: the swab will come away nearly black. You should then see a small but noticeable difference in your display, because there will be less electrical resistance at the connection point.

As for video filtering, it's not a clean signal that is a worry for RFI emission but rather the high frequencies. The installed filters are low-pass filters, which limit the frequencies that can be transmitted. I believe the filters are installed to meet FCC regulations. I'm not sure of the details, but a home computing device is supposed to emit only so much noise at a given distance (something like 3 feet) to be classified as a Class B computing device. For industrial use, a computer is allowed to produce the same level of noise at something like 25 feet (which means the device can be a much stronger noise source); that device would be classified as Class A. In a home, the latter would interfere with radio and television reception (before the days of cable TV).

Oops, just saw your postscript, but the above paragraph does provide a little more information, even if it's not 100% factual. I think it's obvious I don't know the details of the FCC regulations involved.
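To put a rough number on what such a filter costs you: treating it as an ideal third-order Butterworth low-pass (real card filters only approximate this, and both frequencies below are assumed values, not measurements), the loss at the pixel clock is substantial:

```python
import math

def butterworth_attenuation_db(freq_hz, cutoff_hz, order=3):
    """Magnitude response of an ideal n-th order Butterworth low-pass filter, in dB."""
    magnitude = 1.0 / math.sqrt(1.0 + (freq_hz / cutoff_hz) ** (2 * order))
    return 20.0 * math.log10(magnitude)

# Assumed numbers: a ~135 MHz filter cutoff vs. a ~200 MHz pixel clock
loss = butterworth_attenuation_db(200e6, 135e6)
print(f"attenuation at the pixel clock: {loss:.1f} dB")
```

More than 10 dB down at the pixel clock means single-pixel detail (crisp text edges) gets smeared, while low-frequency content (large areas of color) passes through untouched, which is exactly the "murky text, okay pictures" symptom.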

<b>We are all beta testers!</b>
 

flamethrower205

Illustrious
Jun 26, 2001
13,105
0
40,780
Hmm, at 1280x1024 on my Quadro DCC, text is very sharp and 2D quality is great. Therefore, I don't believe the problem mentioned with GF2s exists (or is noticeable) with the GF3/Quadro DCC series.

My rice car will leave your R8500 in the dust!
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
the problem mentioned with GF2s I do not believe exists (or is noticeable) with the GF3/Quadro DCC series
Do you believe that there are RFI filters on video cards? Do you believe the quality of the parts used to make those filters can be different for different manufacturers? If you can believe this then you should be able to believe that not all video cards will necessarily have the same level of visual quality.

However, if you're happy that's great. Don't change anything. I can't imagine anyone risking a Quadro or a GF3, anyway.

All I was saying is there were people who were not happy and were willing to hack up a $500 GF3. I wouldn't have done it, myself. If I thought the quality was that bad I would just return it and buy something else.



<b>We are all beta testers!</b>
 

Pizzarro

Distinguished
Dec 7, 2001
19
0
18,510
I'm glad that I got to read this thread. I'm tossing up between the Gainward GF2 Ti VIVO and the Radeon DDR 64 VIVO. I know ATi's 2D quality, as I've been working on their cards for years, but I also know nVidia's 3D speed. I do a lot of graphics work (always on Trinitron monitors) and I'm going to be getting into DVD out (also a Trinitron TV) and video capture, so 2D is very important to me.
With the Radeon only $25 more than the Ti I'm starting to think that I should stick with ATi, even with somewhat lackluster driver support.
JJ