This is a nice budget video card!

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I just installed a Radeon 8500LE OEM with a 250 MHz GPU/275 MHz memory. (That's not a misprint: 3.3ns Hynix memory on an OEM card.) 3DMark2001 scores went up from 4000 (with a GeForce2 GTS) to about 6000 with the 8500LE at stock speeds. Overclocked to 275/300 it scores about 7200.

I haven't found my maximum overclock yet, but it ran at 300/346 and then showed artifacts when I dropped to the desktop. I'm guessing 290/330 might be the usable limit without cooling mods.
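
For a rough sense of the headroom, a memory chip's ns rating converts to a rated clock of roughly 1000 / ns. A minimal Python sketch (purely illustrative; actual headroom varies card to card):

```python
# Convert a DRAM chip's cycle-time rating (ns) to its rated clock (MHz).
def rated_clock_mhz(ns_rating: float) -> float:
    return 1000.0 / ns_rating

for ns in (4.0, 3.6, 3.3):
    print(f"{ns:.1f} ns chips -> rated for about {rated_clock_mhz(ns):.0f} MHz")

# 4.0 ns -> 250 MHz, 3.6 ns -> ~278 MHz, 3.3 ns -> ~303 MHz, which lines up
# with ~300 MHz running cleanly here while 346 MHz showed artifacts.
```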

Just for kicks, I underclocked my whole system to 600 MHz (100 MHz FSB, 100 MHz memory) with the 8500LE at stock. It scored 4300ish. That's better than my system at 1500 MHz with the GeForce2 GTS essentially at stock.

At $99 at www.newegg.com, I recommend this as the only budget card choice for any system, low-end included.

My system consists of a Tbird AYHJAR 1.0 @1.5 GHz, Epox 8KTA3PRO (KT133A), 256 MB OCZ PC150 SDRAM, Maxtor 40 GB D740X (w/o fluid bearing), Cyberdrive 36x12x48 CDRW, Turtle Beach 16-bit sound card, and other misc ancient hardware; Windows 98SE, 9031 W9x Radeon driver.

<b>Update:</b>
It seems I accidentally had anisotropic filtering forced when I did the above tests. My scores improved when I corrected this.

Tbird @600, 8500LE Stock, 3DMarks = 4579
Tbird @1500, 8500LE 275/300, 3DMarks = 7777

<b>I have so many cookies I now have a FAT problem!</b>
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
Agh, your scores are WAY WAY TOO LOW! At 275/300 on a 1.5GHz Athlon, you should be getting over 8000 points. I get 7600 on a 1.33GHz Athlon with SDRAM.

:wink: <b><i>"A penny saved is a penny earned!"</i></b> :wink:
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I too was expecting 8000 but was happy to be over 7000.

I gave my settings another look and found in Display Properties that anisotropic filtering was forced for Direct3D. I just turned it off. My impromptu score is now 7777. That's while I was sitting online, running some shareware with adware-type stuff going, and whatnot. I'll retest later.

One other thing I noticed: the OpenGL driver had "Convert 32 bit textures to 16 bit" enabled by default. This is a cheat, so I turned it off. I don't think it applies to Direct3D apps.

I have to revise my original post with updated numbers.

<b>I have so many cookies I now have a FAT problem!</b>
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
That's kinda weird. From what I saw, all the OEM R8500LEs are clocked at either 250/230 or 230/230, with either 4ns or 3.6ns RAM. Maybe you got a lucky card. Perhaps someone else who bought the same card can help us out!!

My firewall tastes like burning. :eek:
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I don't think it's a fluke. Here's the <A HREF="http://www.newegg.com/app/viewproduct.asp?description=14-102-226" target="_new">link at Newegg.com</A>. See the customer reviews. 45 people have received the same video card. Most paid the same low $99 price.

Quite a bargain! I've tested mine to 305/325 without visible artifacts. However, as I sort of indicated in one of my other posts (I forget which thread), I don't get much gain beyond 275/300. Am I limited by my Tbird @1.5 GHz with only SDRAM?
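
A quick back-of-the-envelope check in Python (assuming R8500-class figures of 4 pixel pipelines and a 128-bit DDR memory bus, purely illustrative): if the card's theoretical numbers keep climbing with the overclock but the benchmark score doesn't, the bottleneck is probably the CPU and system RAM rather than the card.

```python
# Theoretical fill rate and memory bandwidth at a few clock pairs.
# Assumes 4 pixel pipes and a 128-bit DDR bus (illustrative only).
def fillrate_mpixels(core_mhz: float, pipes: int = 4) -> float:
    return core_mhz * pipes

def mem_bandwidth_gbs(mem_mhz: float, bus_bits: int = 128) -> float:
    return mem_mhz * 2 * (bus_bits / 8) / 1000  # DDR: two transfers per clock

for core, mem in ((250, 275), (275, 300), (305, 325)):
    print(f"{core}/{mem}: {fillrate_mpixels(core):.0f} Mpixel/s, "
          f"{mem_bandwidth_gbs(mem):.1f} GB/s")
```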

<b>I have so many cookies I now have a FAT problem!</b>
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
The OpenGL driver had "Convert 32 bit textures to 16 bit" enabled by default. This is a cheat, so I turned it off. I don't think it applies to Direct3D apps.
No, it doesn't apply to DirectX, it's not a cheat, and it significantly improves performance. You see, ATI processes the entire frame in 32-bit along with the textures and stores them in 16-bit only at the final stage to save texture space. The final output is indistinguishable from pure 32-bit and gives up to a 10% boost in performance in texture-intensive games.

It has less effect on the 128 MB R8500, however. I highly recommend you turn that option on. I guarantee you won't notice the difference in quality, but you'll notice the difference in performance in the latest games.
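
For a sense of scale, here's a minimal Python sketch of the storage saving from keeping textures at 16-bit (uncompressed, no mipmaps; purely illustrative):

```python
# Memory footprint of a single uncompressed texture at 32-bit vs 16-bit.
def texture_bytes(width: int, height: int, bits_per_pixel: int) -> int:
    return width * height * bits_per_pixel // 8

MB = 1024 * 1024
print(f"1024x1024 at 32-bit: {texture_bytes(1024, 1024, 32) / MB:.1f} MB")  # 4.0 MB
print(f"1024x1024 at 16-bit: {texture_bytes(1024, 1024, 16) / MB:.1f} MB")  # 2.0 MB
```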

:wink: <b><i>"A penny saved is a penny earned!"</i></b> :wink:
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
Quite a bargain! I've tested mine to 305/325 without visible artifacts. However, as I sort of indicated in one of my other posts (I forget which thread), I don't get much gain beyond 275/300. Am I limited by my Tbird @1.5 GHz with only SDRAM?
Hmm, turn off Texture Compression for DirectX and rerun 3DMark2001. You'll probably notice artifacts. That's because, with Texture Compression on, 3DMark2001 uses less texture space and is therefore less likely to reach the unstable segments of RAM.


:wink: <b><i>"A penny saved is a penny earned!"</i></b> :wink:
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
I don't think it's a fluke.
It's no fluke. NewEgg sells all their OEM cards with 3.3ns DDR RAM, and they run at 250/275 by default. At the moment, of course; the next batch NewEgg receives may not be the same.

:wink: <b><i>"A penny saved is a penny earned!"</i></b> :wink:
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I agree. I often used 16-bit textures with my GeForce2. It does improve performance, but more importantly it saves video memory, which is very important on a 32 MB video card.

I still think that as a default it's a cheat. I prefer to make the choice per individual game. Of course, maybe I'll change my tune later.

<b>I have so many cookies I now have a FAT problem!</b>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Hmm, turn off Texture Compression for DirectX and rerun 3DMark2001. You'll probably notice artifacts. That's because, with Texture Compression on, 3DMark2001 uses less texture space and is therefore less likely to reach the unstable segments of RAM.
I already tried that (by accident). It generated an error: 3DMark2001 reported that texture compression is needed for the chosen options (the default test) and that my video card doesn't support it.

I'm definitely hitting a brick wall at just over 275/300. Quake III Demo002 (1024x768x32, high quality) hits 180 fps and won't go higher, and 3DMark2001 stops at 8000 and change. Is there a benchmark built into the Aquanox demo? I'll check Max Payne and Serious Sam next.
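
That flat ceiling is the classic sign of being CPU-bound. A toy Python model (numbers made up for illustration): each frame takes roughly max(CPU time, GPU time), so once the GPU finishes faster than the CPU can feed it, extra GPU clock buys nothing.

```python
# Toy frame-time model: fps stops scaling once the GPU outruns the CPU.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 5.5  # assumed: ~180 fps worth of CPU work per frame
for gpu_ms in (8.0, 6.0, 5.0, 4.0):
    print(f"GPU frame time {gpu_ms:.1f} ms -> {fps(cpu_ms, gpu_ms):.0f} fps")
# Once GPU time drops below ~5.5 ms, the frame rate stays pinned near 180.
```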

<b>I have so many cookies I now have a FAT problem!</b>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Yes, but 250/275 is the spec for an OEM 8500. We customers at Newegg ordered an OEM 8500LE. Supposedly the retail 8500LEs are spec'd at 250/250, while the OEM 8500LEs are usually at 230/230.

If we all keep quiet, maybe Newegg will keep selling these jewels. Shhhh. Pass the word. <b>Too late! Everybody is screaming about this deal!</b>

<b>I have so many cookies I now have a FAT problem!</b>
 

AMD_Man

Splendid
Jul 3, 2001
7,376
2
25,780
I agree. I often used 16-bit textures with my GeForce2. It does improve performance, but more importantly it saves video memory, which is very important on a 32 MB video card.
ATI's method is different, though. All the multitexturing is done in 32-bit and only the final texture is stored in 16-bit. It's similar to what the Voodoo3 used to do, which is why it had the best "16-bit" colour quality.

:wink: <b><i>"A penny saved is a penny earned!"</i></b> :wink:
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
I saw that too... that ATI card looks tasty. But I have to wait for the Parhelia because I like anti-aliasing, and I like Matrox. Seriously, think about how good 16x fragment anti-aliasing will be when it comes out! If it works, which we'll see in about a month, then it will be the biggest graphical breakthrough since the Voodoo2 SLI.

Censorship makes us so much more creative.
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I saw that too... that ATI card looks tasty. But I have to wait for the Parhelia because I like anti-aliasing, and I like Matrox. Seriously, think about how good 16x fragment anti-aliasing will be when it comes out! If it works, which we'll see in about a month, then it will be the biggest graphical breakthrough since the Voodoo2 SLI.
I must be the only person who isn't sensitive to "jaggies". I never use anti-aliasing if higher resolution is an option. Still, if AA weren't so costly, I might change my opinion.

"The biggest breakthrough since the Voodoo2 SLI"? What about hardware T&L and DDR SDRAM?

<b>I have so many cookies I now have a FAT problem!</b>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
Don't worry about it. It turns out you were partly right.

I was re-reading the customer reviews for this card at Newegg. I noticed one person mentioned that his card is a Radeon 8500, not an 8500LE, so I looked at the back of my card. Sure enough, there is a Radeon 8500 sticker. You were right; the specs are too good to be an 8500LE. It's an OEM 8500.

I'm still puzzled about the 3.3ns memory. I was almost positive that this memory was reserved for the full retail 8500s.

Whatever, I'm happy.

<b>I have so many cookies I now have a FAT problem!</b>
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I forgot to mention that this particular card from Newegg includes a DVI port and an S-Video Out connector which supports up to 1024x768 resolution. (Do any TVs support 1024x768?)

<b>I have so many cookies I now have a FAT problem!</b>
 

cakecake

Distinguished
Apr 29, 2002
741
0
18,980
On regular TVs, I don't believe going past 1024x768 makes any difference. That's why the max for the cards I've seen has always been 1024x768. Resolution makes a much bigger difference on TVs than on monitors, though, so basically 1024x768 looks a lot better than 640x480. I remember playing StarCraft on my TV and it didn't look that great.

Censorship makes us so much more creative.
 

phsstpok

Splendid
Dec 31, 2007
5,600
1
25,780
I thought NTSC wouldn't support anything beyond 800x600, maybe even less. It's hard to tell with my old television and the convoluted way I get graphics to it. The Radeon 8500 only has S-Video Out, and my TV only has RF In. Right now, I run the S-Video Out to a second computer which has the old All-in-Wonder. This, in turn, sends composite video to my old Dolby Digital receiver. The signal is further passed to an RF modulator and finally on to the TV. It works, but 640x480 doesn't look any different from 1024x768. One thing that works well is that between the two video cards I can size the image to properly fit the TV screen.
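
That matches how NTSC works: the set only has about 480 visible scan lines (of 525 total), so anything above roughly 640x480 gets scaled back down by the TV encoder anyway. A tiny Python illustration of the downscale:

```python
# NTSC has ~480 active scan lines, so higher desktop resolutions are
# squeezed down to fit regardless of what the card renders.
NTSC_ACTIVE_LINES = 480  # of 525 total scan lines

for desktop_height in (480, 600, 768):
    scale = NTSC_ACTIVE_LINES / desktop_height
    print(f"{desktop_height} lines -> squeezed to {NTSC_ACTIVE_LINES} (x{scale:.2f})")
```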

I just need to get an S-Video to composite video adapter. I think it's possible to make a cheap breakout circuit to get composite video, so maybe I'll try to make my own adapter.

<b>I have so many cookies I now have a FAT problem!</b>
 

chuck232

Distinguished
Mar 25, 2002
3,430
0
20,780
I wonder if Newegg knows what they're selling. I also wonder if it's just a mistake by Newegg selling it under the wrong name, or ATI's doing by telling them to sell it under the wrong name and price... Great deal though!

My firewall tastes like burning. :eek: