
Ugh, I am so dumb... maybe

March 25, 2006 7:24:09 AM

When I upgraded my Radeon 9600 XT 128MB AGP card to a GeForce FX 5500 256MB AGP card, did I actually downgrade?

I was reading the Oblivion forums and some dude with a 9600 can play the game fine without tweaking ini files. Is it just a problem with the FX series, or is the 9600 > 5500?

The only reason I bought the FX 5500 was because one day I was using the ATI card, playing World of Warcraft, and I died. I slammed my desk in frustration and all of a sudden a bunch of red lines went across my screen; as I moved they got worse and worse, and eventually it was just a huge scramble of color. When I exited the game the desktop had a few red lines going across it, and WoW just wouldn't stop scrambling. I made sure I had new drivers for it (interesting that when I restarted the computer with no drivers the red lines were gone, but I didn't try to play any games) but it didn't help; as soon as the drivers were installed it got all screwy.

What do you think happened to the card? I doubt it overheated; my house is freezing. Our heater has to be manually turned on in the basement and no one bothers, so it is usually very cold (blankets and sweaters > heat). Power surge or something? I banged the desk when it happened, but it's not like I took a sledgehammer to the tower or anything.

Basically I just want to know if that card is screwed, what most likely happened to it, and if the FX 5500 was actually a downgrade purchase.

At the time I didn't know anything about video cards; I saw that 256MB was greater than 128MB, and the price was like 60 bucks, so I bought it =/


March 25, 2006 8:11:49 AM

When you slammed your desk, did you check that the card was still properly in the AGP slot, along with the VGA cable properly attached to the socket on the back of the card?
March 25, 2006 8:33:53 AM

Sorry, but you completely wasted $60, unfortunately. The 9600 XT is ahead of even the 5700 Ultra (if I remember correctly), and as for your 5500... yeah, huge downgrade.

Your other card is probably fried - it happens. But check whether the cooler on it is still attached as it should be, just in case it's been jarred loose and your old card is now overheating.

If you're on a budget, I suggest a 6600 GT - it outperforms a 9800 XT and can be had for around $120 if not less.
March 25, 2006 8:45:26 AM

Yes, I am afraid you downgraded.

Bigtime.

Perhaps your card was overclocked?
I always get those same lines on my 9600XT if I go past 500/300.
Overclocks can mess up even if you live in the Arctic.

I would ditch the card and get a GeForce 6. Besides, it's OK to blow $60 now and then :lol: 
March 25, 2006 9:24:52 AM

When I had my Asus 9600XT in, its standard clocks were 500/600 and it OCed very nicely. Something like 550/700 when pushed, a consistent 530/650 with standard cooling all day.

Trenchtoaster, did you try reseating the card again? Or maybe you fried it with the jolt?

Snorkius, yeah, they were about even, the 9600XT and the 5700 Ultra.
March 25, 2006 10:29:48 AM

The entire GeForce 5000 series (5200 to 5950) should be avoided.
They only used a 128-bit wide interface to video memory on their high-end cards, when they should have used a 256-bit wide path. :cry: 

The Radeon 9800 Pro/XT had a 256-bit wide path to video memory, and kicked the entire GeForce 5000 series' ass before they even started manufacturing the 5000 series cards.

The GeForce 5800 model is the most hated video card ever made (it is like 60 dB when idle). People don't like their PCs as loud as their motorbikes. :p 

The GeForce 6800 GT/GS made up for their past mistakes though, and provided good competition with the Radeon X800 XL/XT series.

The last 3 digits (eg: _600, _800) are indicative of card performance.
You went from an upper-end _600 to a vanilla _500.

However, going from a 6800 GS/GT to a 7800 GS/GT is a fair jump. As is going from a 9800 Pro/XT to an X800 XL/XT.
March 25, 2006 2:45:41 PM

Haha, crap, that sort of sucks then =/ Just to be clear, the 6800GS is better than the 9600XT, right? I would hate to downgrade again in case the card can be repaired.

Hmm, yeah, I took the card out and put it back in - made sure it was in right and all that. There were still red lines. It wouldn't be a driver problem, would it? When I didn't have the driver installed the desktop looked fine; it was only after my computer restarted with the new Catalyst installed that the red lines appeared again.


"When you slammed your desk, did you check that the card was still properly in the AGP slot, along with the VGA cable properly attached to the socket on the back of the card?"
- Actually, I know I didn't plug any wire into it besides my monitor cable when I tried the card again. Is there supposed to be? The only other wire on the card is a small one from the fan to the card. Haha, I guess it would be pretty sweet if it still worked; I never understood what happened to it, or why.

And my 6800GS should be in later today hopefully.
March 25, 2006 2:54:30 PM

The 6800GS is an upgrade that you'll notice. Enjoy that one.
March 25, 2006 3:27:39 PM

The 6800GS should be a large improvement over the 9600XT.
March 25, 2006 3:37:01 PM

I just wondered if maybe the VGA cable connection might've been loose/misaligned, causing a red line - but then you would've seen just red completely - plus the chances of that happening are very slim.

Next time, do like I do - kick the cat, or insult the neighbour, or take your trusted hammer and kill a router or modem (my personal favourite - went through 3 modems in one day - luckily just dial-up at $15 a pop) :lol: 
March 25, 2006 3:41:54 PM

Haha nice ^^

Well I guess now I know to research my 'upgrades' before I order them =/

The worst part is that back then I had lots of extra money to spend; I was buying 3 movies from the Hollywood Video previously-viewed bin every other day. Hundreds of dollars that I could have spent upgrading my computer so I wouldn't have to play at 13 fps in World of Warcraft with all the visual quality turned off -_-
March 25, 2006 5:12:32 PM

nvidia 5500 <--- 15 fps on pong

:?
March 25, 2006 6:00:00 PM

:lol:  I also have lots o' money - it's just that after I've spent it, I realize that I have obligations that somehow went unnoticed...
March 25, 2006 6:48:34 PM

If it makes you feel any better, I'm still running a GeForce4 MX440. Who needs more than 256 colors anyway?
March 25, 2006 7:05:14 PM

I have 2 GeForce MX4000s, 2 GeForce MX400s, 1 GeForce 5500 (5200 OC) and 1 GeForce 5700LE... can't wait to upgrade.
March 25, 2006 7:39:36 PM

Heh, lol, well there is currently a 256MB AGP X700 Pro and a 32MB PCI Voodoo 5 in my computer :D 
March 25, 2006 9:05:00 PM

Quote:
I have 2 GeForce MX4000s, 2 GeForce MX400s, 1 GeForce 5500 (5200 OC) and 1 GeForce 5700LE... can't wait to upgrade.


You could like build a little house out of all that crap.
March 25, 2006 9:54:24 PM

lol

That was not even a complete list - I have more!


If anyone is looking for ancient + obsolete hardware, let me know.
March 25, 2006 9:55:11 PM

I wouldn't go for a high-end AGP card. The price is too high when you'll probably want to move to PCIe in the next six months or so. The X1600s are good at low res, do all the eye candy, and are at a good price right now.
March 25, 2006 9:59:06 PM

X1600s are easily the best cheap option; they are about £70, and that's with 256MB of 1.6GHz memory, enough for Doom 3 at Ultra High on the highest res your monitor will do (about 1600x1200, probably)
March 25, 2006 10:32:19 PM

No thanks. Just last week I sold my Pentium II 400MHz, 32MB GF2, 64MB RAM, 2GB HDD, 15" monitor for like $80. I'm thinking about selling this one too.
March 25, 2006 10:38:26 PM

Ehhm, 1.6GHz memory???

I think you missed something and are thinking of the X1800 or X1900.

The X1600 might be OK at 1024x768, but 1280x1024 is a little high...
...and 1600x1200? Yeah, if you've got the time to wait for the frames to drop in from time to time in a 3D chess game, perhaps...

The X1600 is comparable with the 6600-6600GT in performance:
http://www.anandtech.com/video/showdoc.aspx?i=2552&p=9
March 26, 2006 1:35:47 AM

For pure fps the X1600 gets beaten by the X700 Pro (also a good card) in some games. When it comes to features the X1600 is on top in its price range. Newegg does list it with GDDR2 at 800MHz (1.6GHz effective). If you're looking for something fast it's not the best; like I said, good for lower res.
March 26, 2006 1:39:46 AM

Here are some screenshots of what my 9600XT card looks like; hopefully you guys can tell what is wrong.

This is in Guild Wars:
http://i75.photobucket.com/albums/i311/Trenchtoaster/Gw...

Oblivion (same in WoW):
http://i75.photobucket.com/albums/i311/Trenchtoaster/Ob...

Windows Media Player video of Oblivion:
http://i75.photobucket.com/albums/i311/Trenchtoaster/wm...

Is the speed right? It seems weird; I am used to the core being lower and the memory being higher. Did it get screwed up somehow? Can I fix it by overclocking or something?
500MHz core
300MHz memory

Catalyst 6.3 Windows XP - Driver Download
March 26, 2006 2:09:45 AM

DO NOT BUY the X1600 Pro!!! I have one and I am very disappointed with its performance; it does well in some games and in others it totally sucks. BTW, the mem speed on this card is 400MHz DDR, making it 800MHz effective and NOT 1.6GHz. You guys are giving out wrong information.

If you're on a budget get a 6600GT; they are cheap nowadays and their performance is more stable than the X1600 Pro's.
March 26, 2006 2:18:34 AM

The memory looks to be corrupted. See if it's still under warranty; otherwise take a hammer to it :) 
March 26, 2006 2:30:08 AM

Haha, the hammer sounds more fun, I guess. I got this video card in summer 2004; I have no idea if it is still under warranty - doubt it. I don't have the box or anything anyway =/
March 26, 2006 5:37:40 AM

The GeForce 6800 GS kicks the Radeon X1600 square in the balls, while being sold at 'similar enough' prices.

He is far better off with the GeForce 6800 GS, IMHO.

As for the GDDR3 800 MHz comment: that's 1600 MHz :p  (the 800 MHz was 'post DDR'; the X1600 is a poor effort from ATI - even compared to their last generation's X700 Pro it isn't a great card).
March 26, 2006 6:19:24 AM

Your 9600XT is either dead or overheating badly.

1. Check whether the fan is working.

2. If it's working, check that it's seated properly.

3. If it's seated properly, it's probably toast... but try underclocking it 100 MHz on core and mem.

4. If it still doesn't work, it's toast for sure.
March 26, 2006 7:35:57 AM

The X700 is a bad card, comparatively, and the X1600 does not exist in the high-clock XT version for AGP. The Pro version has its memory clock cut in half and is not worth getting over the 6600GT.
March 26, 2006 1:54:31 PM

Quote:

The X1600 is comparable with the 6600-6600GT in performance:
http://www.anandtech.com/video/showdoc.aspx?i=2552&p=9


Not for shader-intensive games like Oblivion, where it would be more comparable to the AGP GS. Just like the XT is comparable to the PCIe GS in F.E.A.R.:
http://www.xbitlabs.com/articles/video/display/powercolor-x800gto16_9.html

For other games it's quite different - the GS is well out front - but even then there are very few where the X1600 Pro loses top playable settings to the GF6600GT, thanks to the GT's own low ROP and vertex engine count; it's the plain X1600 that tangles more erratically with the GF6600GT.
March 26, 2006 2:03:40 PM

Doesn't the X1600 throttle frequencies as well? Maybe the 400 (800) is just the throttled equivalent, and in game it changes to 800 (1600)? Although the amount it would be throttling (if at all) is quite huge...
March 26, 2006 3:13:33 PM

Try underclocking the GPU to around 400 MHz, and the VRAM as well. (Most technical applications will show the VRAM's 'pre-DDR' speed; the 'post-DDR' speed might be what you are thinking about.)

Underclock the VRAM as well, say to 80% of what it should be.
March 26, 2006 3:48:26 PM

The red lines and patterns shown in your screenshots are very typical of a core failure, and possibly a memory chip failure. I doubt underclocking would correct or lessen what is being seen; even if it does, continuing degradation will follow.

Check the brand of card, and check their website. Some cards from that generation were guaranteed for 3 years to a lifetime. Even ATI's cards carried 3 years for that generation. What I am saying is that cards of your 9600's generation carried longer warranties than they do today. If you registered your card, the original paperwork should not be needed.
March 26, 2006 4:04:52 PM

I'm not too good with ATI stuff, so I had to do some homework.

The X1900XT has 1450MHz memory.

The X1600 XT was announced at launch to have 590MHz core and 1.38GHz memory clock speeds and 256MB of RAM.
And this is the XT.

The X1600 Pro seems to have 780 or 800 MHz as you stated, BUT it's 800MHz effective (200*4=800) and nothing else, and it's also a 128-bit bus.

So comparable with the mem speed of the plain 6600 (not the 6600GT).

Seems the X1600XT has performance that is slightly better than the 6600GT and on par with the plain 6800, with the 6800GS beating it down.
March 27, 2006 4:47:04 PM

Quote:
Seems the X1600XT has performance that is slightly better than the 6600GT and on par with the plain 6800, with the 6800GS beating it down.


Except, like I said, in shader-heavy games (like F.E.A.R. and Oblivion), where that is not the case - see the link up above. If he's looking specifically at that game, the GF6600GT will not play it well at all, the GF6800 will be below the X1600 Pro, and the crippled AGP GF6800GS will pull up right alongside the X1600 Pro and be under the XT.

These are the only two games I know of that play to the unbalanced pixel/texture formula, but if that's what you're looking at, then the X1600 Pro isn't such a bad deal like it is in most other games.
March 27, 2006 4:59:41 PM

Too late now I guess =P

Already ordered a 6800GS; it should be here soon, hopefully. I skipped class today just to make sure I didn't miss UPS.

It arrived in my city Friday night, but they don't deliver on weekends so I had to wait until today -_-
March 27, 2006 5:06:28 PM

By the way, I noticed Cleeve's sig, and it says:
Radeon X1800 XL (o/c 725 core/635 mem)

Isn't it normally a lower core speed and higher memory speed? Or is that just for Nvidia cards? I noticed the same thing with my 9600XT - the core speed was much higher than the memory, when I thought it was supposed to be the other way around.

What exactly do the numbers mean anyway? The 6800GS I ordered was only 350 core / 1000 memory. The core seemed pretty low, but the memory seemed like a decent amount; I have no idea what each number actually means, other than higher is better.
March 27, 2006 6:23:52 PM

Quote:
Too late now I guess =P

Already ordered a 6800GS; it should be here soon, hopefully. I skipped class today just to make sure I didn't miss UPS.


Well, it'll be fine. But from what I've seen, Oblivion is one of the exceptions to the normal rules of dominance - there are many examples of that for both sides. It should perform OK, but expect to still have to deal with 800x600-1024x768, as this is a very graphically and CPU intense game.

Quote:
It arrived in my city Friday night, but they don't deliver on weekends so I had to wait until today -_-


Well hopefully it's in your hands soon enough and you can enjoy a pretty good game.

Quote:
Isn't it normally a lower core speed and higher memory speed? Or is that just for Nvidia cards? I noticed the same thing with my 9600XT - the core speed was much higher than the memory, when I thought it was supposed to be the other way around.


Well, it all depends on how far you can push it; the core on Cleeve's card is insanely high (compared to stock), but that happens, just like with my R9600Pro. Usually for both nV and ATi the clocks are close to synchronous, but being able to push that much further beyond stock gives you serious power, especially in older games that rely more on the ROPs - and now Cleeve's core is pushing well beyond, to 100MHz above the X1800XT/X1900XT speeds (75MHz above the XTX). The problem with the memory is that nowadays some of the cards try to put on memory closer to spec than the next level card. Likely Cleeve's memory is 1.4ns and thus hits a wall earlier than the 1.26ns of the XT or X1900XT. But still, it's 135MHz (actual, not effective [which is 270MHz]) higher than stock. So that's a pretty dang nice boost: 225+ core, 135+ memory.

Quote:
What exactly do the numbers mean anyway? The 6800GS I ordered was only 350 core / 1000 memory. The core seemed pretty low, but the memory seemed like a decent amount; I have no idea what each number actually means, other than higher is better.


Well, in this case nV, or whoever is marketing the cards, is giving you the effective memory speed to make it sound bigger; Cleeve is giving you the actual speed. Your card is actually 350/500, or as we like to write it, 350/500(1000), whereas his is 725/635(1270).
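
To make the actual-vs-effective convention concrete, here's a minimal sketch of the conversion (a hypothetical helper, not anyone's real tool; the numbers are the ones quoted above):

```python
# DDR memory transfers data on both clock edges, so marketing quotes
# double the actual clock. "350/500(1000)" means 350 MHz core,
# 500 MHz actual memory, 1000 MHz effective memory.

def effective_mem_mhz(actual_mhz, pumps=2):
    """Effective (marketing) memory speed from the actual clock."""
    return actual_mhz * pumps

print(effective_mem_mhz(500))  # 1000 - stock GF6800GS memory
print(effective_mem_mhz(635))  # 1270 - Cleeve's overclocked X1800 XL
```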

The most important thing to remember is that these numbers are factors, so in your case it's 350MHz x 12 pipelines x 1 shader unit = 4.2 giga 'shader ops' per second of pushing power, which is called fill rate and is measured in MPixels/sec (soon to be Gigapixels, I'm sure); for Cleeve it's 725 x 16 x 1 = 11,600 MP/s. Now the TMUs also play a part, and tell you the texture fill rate. Now, the X1600 and X1900 have unbalanced processors, and can do 3 times as many pixels as textures per clock, meaning they have insane processing power but can't use it all if the work is simple, because the ROPs (the things that send the pixels to the memory buffer/display) can only draw so many pixels. However, in intense new games, the requirement of long shaders, loops, etc. means that the ROPs are left starving without this kind of power, so having this massive amount is a good thing for future games (which is why I recommended the X1600 only for those types of games - otherwise it sucks, due to having only 1/3 the TMUs and ROPs of the GS, even though it has a small speed advantage).
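
A minimal sketch of that fill-rate arithmetic (hypothetical helper name; only numbers from this thread):

```python
# Theoretical pixel fill rate = core clock x pixel pipelines x shader
# units per pipe; with the clock in MHz the result is in MPixels/sec.

def fill_rate_mpixels(core_mhz, pipelines, shader_units=1):
    return core_mhz * pipelines * shader_units

print(fill_rate_mpixels(350, 12))  # 4200 MP/s  - stock GF6800GS
print(fill_rate_mpixels(725, 16))  # 11600 MP/s - Cleeve's o/c X1800 XL
```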

For memory it's speed x bit-width, which gives you the overall bandwidth/throughput of the memory. Both the GS and XL have 256-bit and therefore scale evenly, so faster = faster. However, the X1600 Pro has a 128-bit external memory interface, and as such, with the same speed memory it gets only 1/2 the bandwidth - or, if the memory is say 50% faster on the 128-bit card, it still only has 75% of the overall bandwidth.
This matters for very memory-intensive situations like high resolutions and high AA levels. However, considering the low resolutions you need to play Oblivion in, I doubt this would've posed much of an issue for either the GF6800GS or the X1600 Pro; for the GF7900 and X1800/X1900 series cards, though, this would matter a lot, since they usually will be starved for memory. Even if it's not using it 100% of the time, it's always nice to have more bandwidth than you need, because even cards like the R9600Pro can be memory starved, and it's one of the biggest problems for cards running at mid to high resolutions and using AA.
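
And a matching sketch of the bandwidth math (again a hypothetical helper; the bus widths and effective clocks are the ones quoted in this thread):

```python
# Memory bandwidth = effective memory clock x bus width in bytes.
# A 128-bit card needs twice the memory clock to match a 256-bit one.

def bandwidth_gb_per_s(effective_mhz, bus_bits):
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

print(bandwidth_gb_per_s(1000, 256))  # 32.0 GB/s - GF6800GS, 500(1000) MHz
print(bandwidth_gb_per_s(800, 128))   # 12.8 GB/s - X1600 Pro, 400(800) MHz
```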

Here's an OK quick chart (the only drawback is they've misnumbered the pixel pipes on the GS); it should help you 'see' the differences easier:
http://techreport.com/reviews/2006q1/geforce-7600-7900/index.x?pg=7
March 27, 2006 6:54:02 PM

M8, the video memory on that card has had its day... 8O Unless you try clocking down the mem and the core.
March 27, 2006 6:58:52 PM

Ahh, alright. Well, if I can get it to 1024x768 then I would be more than happy. Oblivion is definitely one of the more storyline-driven/immersive games that I play, so graphics come second anyway, as long as it is playable with decent quality.

And thanks for the explanation about the clock speeds. So it is sort of like AMD saying their processor is a 4200+ when it is really only 2.xx GHz? I know that is because AMD processors do more work per cycle, so although their number is actually lower, they make it higher to correspond with Intel processors?

Or is it just sort of a made-up number, just to look good? I noticed the two examples you gave were half of what the advertised memory speed said they were.

My FX5500 was 270MHz core, 400MHz memory, and that doesn't seem too much lower than a 350/500 6800GS. Will my new card be better just due to the fact that it is a 6800, or are the clock speeds the most important aspect? Also, that link you gave me showed charts with the 6800GS being pretty low compared to most other cards. I hope I made the right choice for an upgrade (cost/performance value).

Anyways, thanks for the information, and sorry if I interpreted it wrong; I am pretty new to this stuff, but I am trying to learn as much as possible.
March 27, 2006 7:11:41 PM

Yeah, never judge a card by its numbers (in some cases, lol). A good example is the Nvidia 5950: it had 1000MHz DDR mem and a 450 core, and it fell flat on its face - only scores 900-odd in 3DMark05. Well, the whole FX series was a load of... :roll: 
March 27, 2006 8:16:12 PM

Quote:

Or is it just sort of a made-up number, just to look good?


Well, while their number is 'made up', it is meant to correspond with the competition's 'pure GHz' method. Even Intel has now moved away from MHz because they ran into a speed barrier, so they promote their chips with numbers too, but those aren't really related to anything the way the AMD ones are supposed to be. It's still subjective since they are two different architectures, but they are close (overestimating video processing and such, and underestimating gaming performance, IMO).

Quote:
I noticed the two examples you gave were half of what the advertised memory speed said they were.


Yes, because if you think about it, the memory is DDR2; it's really running at one frequency, but since it's double pumped it's as if it were running twice as fast - and if you're a PR guy, which one are you gonna put in the fine print? It's like selling 4-cylinder engines by pimping the 16-valve aspect.

Quote:
My FX5500 was 270MHz core, 400MHz memory, and that doesn't seem too much lower than a 350/500 6800GS. Will my new card be better just due to the fact that it is a 6800, or are the clock speeds the most important aspect?


Like I said, it's about the shader ops, however they are laid out within the pipeline. The GF6800GS has 12 FULL pipelines that do 1 shader op per cycle; the FX series is messed up and has more TMUs than pixel-pushing power. The FX is said to be 4 pipelines, but it's really 2x1+1. So for complex pixel shader ops it can only really do 2 per cycle, whereas for simple pixel ops it can take advantage of its parallel ALUs to do 4 pixels per cycle (I think they also need to be FP16, not FP32, IIRC [it's been a long time since I've thought about the FX]). This was a big issue for the FX series, in that it was always sold as full pipelines whereas it was really a 1+0 pipeline that could do 2 at once, but only very basic ops. With Oblivion these are complex shaders, so the card performs terribly. Even if it were a full 4 pipelines you'd still have that imbalance of 4 pipes on the FX5500 versus 12 on the GF6800GS, so even if you could get the FX5500 to twice the speed of the GS, you'd still wind up with only 66% of the power. And to add insult to injury, those 4 'pipelines' aren't 4 true full-fledged pipes, so really the FX5500 is at a terrible disadvantage.
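
A trivial sketch of that 66% figure (the double-speed FX5500 is hypothetical, just to make the point):

```python
# Even at double the GF6800GS's 350 MHz core, 4 (nominal) FX5500
# pipelines can't catch the GS's 12 full pipelines.
fx_hypothetical = (2 * 350) * 4    # 2800 Mpixel-ops/s
gs_stock = 350 * 12                # 4200 Mpixel-ops/s
print(fx_hypothetical / gs_stock)  # 0.666... -> only ~66% of the power
```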

Quote:
Also, that link you gave me showed charts with the 6800GS being pretty low compared to most other cards. I hope I made the right choice for an upgrade (cost/performance value).


Remember, that chart was wrong with its pixel-pipe number, like I mentioned - it's really 12, so the fill rate is really 50% more than the 8-pipe amount they show. In relationship to the other cards posted there it is kinda middle of the road to low, but you had no choice - you're on AGP. Rest assured that while your available choices weren't as varied or favourable as PCIe offers, you made a solid choice: the GF6800GS is a good card, bang/buck. It won't perform as well as a GF7800GT-7900GTX, and the GF7600GT is a great option too, but those are PCIe only; yours'll be rock solid and take you into the next generation of games.

Quote:
Anyways, thanks for the information, and sorry if I interpreted it wrong; I am pretty new to this stuff, but I am trying to learn as much as possible.


No worries, no apologies - your interpretation is part of learning. You weren't challenging our information, you were inquiring about it. It's all good, man, and that's the nice part of the forum: sharing information, not just 'is card/chip A >=< B?'

If you want to learn more, search out our threads where we talk about future cards or dissect the performance of a card/series based on its design and the game; those threads should give you a lot of info.

Also check GeneticWeapon's pixel sticky at the top of the page; it should be somewhat more enlightening.
March 27, 2006 9:08:02 PM

Ahh, much clearer now, thanks. Yeah, I wish I knew all of this stuff when I first bought the FX5500; I guess I can't rely on Newegg reviews to get all the technical info, haha. Well, at least it wasn't a complete waste - now my brother is using it, and it is a big upgrade from his VIA integrated 64MB card. At least he can play WoW now without the ground looking like crap and his mouse flickering.

I just got the 6800GS and installed it; using RivaTuner I was able to unlock the extra pipelines and vertex shader. I guess I might try overclocking it a bit if I can find a guide for the PNY Verto version of the card - I really don't want to mess anything up on my first day =x

Yeah, I realized that the card wouldn't be as nice as the 7 series, but I really couldn't afford to upgrade my entire system, and the cheapest AGP 7-series card I could find was at least 300 dollars. I couldn't really justify spending an extra hundred dollars on a card that I am planning to upgrade again (as opposed to keeping it until it breaks/until games can't be played on it).

Good to know that it will at least allow me to play some of the next-gen games that come out. I definitely plan on upgrading to PCIe cards sometime, but I am hoping not until DX10 cards come out and are bug-free.
March 27, 2006 9:44:22 PM

Quote:
Ahh, much clearer now, thanks. Yeah, I wish I knew all of this stuff when I first bought the FX5500; I guess I can't rely on Newegg reviews to get all the technical info, haha.


Yeah, too many people there are like, "It's great cause I bought it, so it must be great. Yay me, and my card, which, if I didn't mention it, is great!" Ask them what their settings are and half the time they'll say, "What are you talking about?" In the future always come to us - we are your new family now, BART! :twisted:

Quote:
I just got the 6800GS and installed it; using RivaTuner I was able to unlock the extra pipelines and vertex shader.


That's GREAT news. That essentially makes it almost a GF6800GT, thus making it a darn good card again; it will be equal to or above the PCIe GF6800GS now in most things (some things still prefer the higher clock speeds just due to the nature of looping and target dependency), but overall unlocking the card (which I never mention in a recommendation because it's a gamble and should be thought of as a free bonus) has made it a very solid card. Now you have the 16 pipelines, so basically a 33% performance boost in that one act. Be sure to watch out for artifacts, but likely you'll be fine. Nice to have the extra power for Oblivion.

Quote:
I guess I might try overclocking it a bit if I can find a guide for the PNY Verto version of the card - I really don't want to mess anything up on my first day =x


Understood. Don't worry about cranking its nuts off; overclock it a little bit (keep it reasonable, keep it cool) - the unlocking has done a heck of a lot already. Test your overclocks with things like 3DMark. And you should really run a loop of 3DMark at stock speed for a few hours to warm up the card and make sure everything is properly burned in, so your overclocking doesn't spike the card from the start. Most OCs are much better after a good 4-24 hour burn-in period.

Quote:
Yeah, I realized that the card wouldn't be as nice as the 7 series...


Yeah, but you know, you did good; unlockable GS cards are getting rarer (most are laser-cut now), but now your card is great until the next generation, IMO.

Quote:
Good to know that it will at least allow me to play some of the next-gen games that come out. I definitely plan on upgrading to PCIe cards sometime, but I am hoping not until DX10 cards come out and are bug-free.


By the time you NEED to upgrade, they should be - and Vista should be fine too.
March 28, 2006 1:47:26 AM

Decided to hold off on overclocking for a bit, but here are some oblivion screenshots comparing the two cards. Both of these are on the same AMD 2000+, 1280 computer.

My FX5500, all low settings, 640x480, with every ini file tweak I could find to get the game to play:
http://i75.photobucket.com/albums/i311/Tre...18-53-12-9...
http://i75.photobucket.com/albums/i311/Tre...18-12-53-9...
http://i75.photobucket.com/albums/i311/Tre...18-46-10-5...


Here is my 6800GS, 1024x768, all max settings besides grass shadows and self shadows, HDR, no ini files tweaked (I am thinking about upping the resolution to 1280x1024).
Avg FPS: 24.625 outdoors in dense forest; much higher indoors and in sparse areas. Definitely sweet as far as I am concerned =P

Indoors:
http://i75.photobucket.com/albums/i311/Tre...18-26-35-7...
http://i75.photobucket.com/albums/i311/Tre...18-27-53-3...
http://i75.photobucket.com/albums/i311/Tre...18-43-54-6...
http://i75.photobucket.com/albums/i311/Tre...18-12-29-3...

Outdoors:
http://i75.photobucket.com/albums/i311/Tre...20-35-26-6...
http://i75.photobucket.com/albums/i311/Tre...20-37-46-3...
http://i75.photobucket.com/albums/i311/Tre...20-39-04-0...
http://i75.photobucket.com/albums/i311/Tre...19-57-49-4...
http://i75.photobucket.com/albums/i311/Tre...20-42-13-5...

Very happy - I am glad I spent under $200 on the card rather than $500+ on an HDTV and $400+ on an Xbox 360.
March 28, 2006 5:53:31 PM

Very sweet. 8)

The pic links don't work anymore. Nice choice of soundtrack for the video - 'Sober' is right! :mrgreen:

But I bet the difference is amazing, eh!?!

Enjoy!