FPS on my EVGA 7900GT KO (512mb) seem sort of dismal...

Tags:
  • Graphics Cards
  • Games
  • Graphics
August 25, 2006 5:58:11 PM

Here's the situation. Typically I don't play very graphically intensive games -- I like World of Warcraft, Warcraft 3, Baldur's Gate, the Diablo Games...older stuff sorta.

I do however enjoy Oblivion, the graphics hog.

Recently I replaced my 6600GT with an EVGA 7900GT KO, 512mb version which I purchased at newegg.com for a great deal in my opinion.

I am running in single card mode, not in SLI.

My system components are as follows (it was a premade I upgraded):

3.0GHz Pentium 4 processor
2GB PC3200 RAM (Kingston)
Enermax Noisetaker 495 (485W, combined 36A across separate 12V rails)


I was told I should be able to run Oblivion darn near max settings with great fps and no stuttering.

However, this isn't true. I fired up Oblivion and the game detected "Medium graphics" for me by default. I adjusted things, and outdoors at medium settings with no lighting effects (not even bloom), I am lucky to stay above 35 fps.

Is the card not good enough? Is my processor bottlenecking me? Or is the game really just that demanding? I'm trying to figure out the best way to get good fps and still retain good visual quality.

Would switching to an x1900xt maybe make that huge a difference, or is it not worth all the hassle (not to mention restocking fees)?

Anyone else play with a setup similar to mine?


August 25, 2006 6:10:27 PM

The X1900XT performs much better than the 7900GT in Oblivion; if you're really that into the game I suggest getting at least an X1900XT.
August 25, 2006 6:30:32 PM

You didn't mention what resolution you're playing at. But Oblivion is a VERY demanding game no matter what hardware you have; it's not an FPS, so frames shouldn't matter too much anyway, right? Unless you have an SLI or CrossFire setup, don't expect more than 40 fps outdoors with max settings at resolutions greater than 1280x1024.
August 25, 2006 6:32:13 PM

It's your P4 in my opinion, not the 7900. It's gotta be what, 2 or 3 year old tech? I owned a P4 3.0 800 and was never that impressed with it; the damn thing wouldn't even record video without locking up. Hopefully you have a decent motherboard like Asus or MSI, or you're not going to get the most out of the P4.
August 25, 2006 6:35:12 PM

I have a 7800GTX (see the rest of my rig in my signature).

I also had to knock some of the settings down; it's the most power-hungry game I've ever played, and I have played hundreds of games in my 31 years.

Don't worry about having all settings on high; read the tweak guide that's available:
http://www.tweakguides.com/Oblivion_1.html

I found that by reducing a few options I was able to get a nice-looking game with a great, playable frame rate... the gameplay itself makes up for any shortfalls in eye candy.

I've even seen on some forums that with an SLI setup you still can't max the graphics out!

You have a great graphics card; it should play everything else really well, so don't worry. By the way, the only other game I've had a few problems with is FEAR, it's a power-hungry pig too.
August 25, 2006 6:41:01 PM

That looks fine to me. What res?
August 25, 2006 6:44:38 PM

I'd also say it's the processor, slow FSB, etc.

Also, a lot of people buy these cards and think they can max things out. That's not always the case.

I know F.E.A.R. has them, but I don't know if your game does... check for soft shadows and turn them off if they're there.

Forgot to mention... try using a slightly older driver. The newer ones have enhancements for things only top-end cards will use. Try 77.77 and up, and stop at the 90s.
August 25, 2006 7:28:03 PM

Quote:
Would switching to an x1900xt maybe make that huge a difference, or is it not worth all the hassle (not to mention restocking fees)?

I play oblivion with a X1900XT and I can say it's absolutely flawless even in the outdoors. I think I've got Bloom and 6x AA turned on. It's good enough for me, I'll try HDR one of these days.

I don't know the actual framerates but it's probably above 30 or 40 most of the time.
August 25, 2006 8:00:47 PM

Quote:
I play oblivion with a X1900XT and I can say it's absolutely flawless even in the outdoors. I think I've got Bloom and 6x AA turned on.

Yeah, the XT really shows how many balls it has in Oblivion.
August 25, 2006 8:22:31 PM

What res are you people running at??? 35 fps is actually pretty good for Oblivion. You should check out some of the THG benchmarks... Oblivion doesn't run very well on any single-GPU setup: http://www.tomshardware.com/2006/07/17/summer_2006_gefo....

I think it must have been optimized for the XB360, because it's way too much of a resource hog for how the graphics look (IMO). Anyway, I think 35 fps is good for medium settings.
August 25, 2006 10:39:40 PM

Actually, I experimented with many resolutions, up to and including the smallest size (what is that, like 640x480 or something?), and it didn't seem to make much difference. It's funny, though: my friend's X800 got detected as HIGH settings by Oblivion and my 7900GT gets detected as MEDIUM? What gives?

Also, I ran 3DMark05 today and ended up with a 6879 score without overclocking. However, it is the free trial version. What does this score mean, and is it good?

Like I said, I'm a 3D noob trying to learn more about how to get the most out of my games.
August 25, 2006 11:16:39 PM

Quote:


I play oblivion with a X1900XT and I can say it's absolutely flawless even in the outdoors. I think I've got Bloom and 6x AA turned on. It's good enough for me, I'll try HDR one of these days.


Damn, son, you've got an x1900xt and don't even play with HDR on? Get with the program!

"Bloom" lighting goes down in the history books with Windows ME and Atari's "E.T." as one of the worst tech inventions of all time.
August 25, 2006 11:26:40 PM

I really think the bottleneck is your CPU. If you really want to make sure, you could always do it the hard way and get a friend with a newer CPU to try your 7900GT in his machine. Just a thought.
August 25, 2006 11:34:17 PM

My x800 Pro got detected as high too :D  but it has an arctic cooler on it and is clocked from stock 475/450 to 550/550 (that cooler was the best 20 bucks anyone with an x800 or x850 can buy, got an extra 7-10 fps across the board in all games, makes the stuttery run buttery!).

I turned off the grass and I have a good looking game that runs well and is very playable.
August 26, 2006 1:36:57 AM

Quote:
I really think the bottleneck is in your CPU, if you really want to make sure you could always do it the hard way and get a friend with a newer CPU to install your 7900GT on it. Just a thought.



I'm really thinking it's the CPU as well. Not 100% sure, but I'd put my money on it. Oh well, at least when I upgrade computers I can strip the good parts out of this one and put them into the new machine. Maybe run 7900GTs in SLI later or something.
August 26, 2006 4:10:29 AM

Quote:
I was told I should be able to run Oblivion darn near max settings with great fps and no stuttering.


Sorry to say, there lies your whole problem. You were told wrong. The 7900GT can't come close to max details, even at 1024x768 running NV's default driver optimizations. I ditched my 7800GT because I found its Oblivion performance to be very disappointing. An X1900XT, or on a budget an X1800XT, is the way to go for Oblivion.

Don't take my word for it; check out FiringSquad's review: at 1024x768 the 7900GT averaged 23.9 fps in the demanding outdoor areas, and at 1280x1024 it averaged 17.2 fps.

http://www.firingsquad.com/hardware/oblivion_high-end_p...

Anyway, that should end all arguments and close the case: the 7900GT is just no match for Oblivion. It's a good card, you just expected too much based on someone's careless exaggerations. To be honest, no single card has conquered Oblivion, but like Far Cry, it will happen someday.
August 26, 2006 4:13:55 AM

Quote:
I'm really thinking it's the CPU as well. Not 100% sure but I'd put my money on it.

In Oblivion, more than anything, it's the 7900GT. The P4 will struggle in gate/town locations with lots of NPCs. Looking at the review I linked above, they had an FX-57 and you can see how poorly the 7900GT did outdoors.
August 26, 2006 4:28:45 AM

Quote:

Anyway, that should end all arguements and close the case, the 7900GT is just no match for Oblivion. It's a good card, you just expected too much based on someones careless exagerations. To be honest, no single card has conquered Oblivion, but like farcry, it will happen someday.


Thank you for saying it :)  I was thinking the exact same thing as I was reading through the thread. I remember when you had to have a sweet PC to run StarCraft. Oblivion is simply ahead of the software/hardware curve. Overall, the 7900GT KO is a badass card and you should be satisfied with what it gives you!
August 26, 2006 7:03:13 AM

You never had to have a sweet PC to run StarCraft, lol. It ran fine on my old HP 133MHz Pentium, and flawlessly on my 1.5GHz P4 with onboard video. If your system can run the Windows pinball game, you can run StarCraft.
August 26, 2006 9:04:28 AM

Wow, about 6800 on a 7900GT KO 512?

I'm running a P4 2.8 with a BFG 6800 GS AGP at 412/1100 on the stock cooler.

I get like 5500-5600 in 3DMark05. That's a pretty crappy score for a 7900GT KO; go read some benchmarks. A lot of people are getting about a 9k score in 3DMark05, though of course that's with a dual core =x. So maybe just upgrading your CPU will do it! AMD CPUs are at an all-time low =D
August 26, 2006 11:04:40 AM

It seems that, according to that link you posted, no single card really runs Oblivion that well at higher settings on its own. Even the X1900XT's results are a touch unimpressive (in my opinion). Guess Oblivion is just one of those games where SLI is the way to go.

DAMN YOU ONLY HAVING ONE PCI-E SLOT!

:) 
August 26, 2006 1:29:36 PM

Quote:
It seems that, according to that link you posted, no single card really runs Oblivion at higher settings on its own that greatly. Even the X1900XT's results are a touch unimpressive (in my opinion). Guess Oblivion is just one of those games where SLI is the way to go.


Actually, Crossfire X1900XTs are the way to go for Oblivion; a lot better than SLI in that game. But those were not high settings, they were MAX settings, and you're right that at higher res even an X1900XT struggles with those. http://www.firingsquad.com/hardware/oblivion_high-end_p...

Even Anand used Crossfire X1900XTs for their Oblivion CPU tests, calling them the clear winner in their Oblivion GPU test. But if you go with high-end Crossfire, then you need a massive CPU to keep up with it in Oblivion; this Anand CPU review shows that quite clearly. http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=274...

But all things considered, a single X1900XT does very well in Oblivion, especially compared to its NV equivalent. FS, in that top link, showed a single X1900XT is as good as SLI 7900GTs in Oblivion. Anand's review shows the single X1900XT offering a higher average, but also WAY higher low fps, compared to SLI 7900GTs. http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4

In general, it's the lows where NV's cards hurt the most in Oblivion, which affects the best playable settings more than the average framerates. That's also why you are going to see your 7900GT struggle in that game. But look at the X1900XT and its lows aren't much lower than its average, allowing for better playable settings.
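The lows-versus-averages point is easy to see with a toy calculation. Here is a minimal sketch (the fps numbers are invented for illustration, not taken from any review) of two hypothetical cards with nearly identical averages but very different minimums:

```python
# Two hypothetical per-second fps traces. The values are made up for
# illustration only; they are not benchmark data from any review.

def summarize(samples):
    """Return (average fps, minimum fps) for a list of fps samples."""
    return sum(samples) / len(samples), min(samples)

card_a = [42, 40, 38, 6, 41, 43, 8, 44]    # decent average, ugly dips
card_b = [36, 34, 33, 30, 35, 32, 31, 33]  # similar average, steady lows

avg_a, low_a = summarize(card_a)
avg_b, low_b = summarize(card_b)

# The averages are within a fraction of a frame of each other, but card A
# bottoms out in the single digits while card B never drops below 30.
print(f"card A: avg {avg_a:.1f} fps, low {low_a} fps")
print(f"card B: avg {avg_b:.1f} fps, low {low_b} fps")
```

Average framerate alone hides exactly the dips being described here, which is why reviews that report minimum fps tell you more about playable settings.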
August 26, 2006 2:49:15 PM

I have seen many people state that for Oblivion, ATI is the way to go. Are there any games in which the 7900GT excels?
August 26, 2006 3:02:48 PM

Yes, in general OpenGL games like Quake 4, Chronicles of Riddick, and Doom 3 favor the 7900GT. But really, it does very well in most games; Oblivion just hammers all GPUs, more so the GF7s than the Radeons. Your 7900GT is a very good card, enjoy it.
August 26, 2006 3:05:30 PM

Great, I was starting to wonder if I should've gone with the x1900 instead. ><

I was looking at a 15% restocking fee through newegg for RMA. 8O
August 26, 2006 3:27:17 PM

Oh, one more thing...originally I ordered the 256mb 7900GT KO, and a few hours later was notified that the 512mb 7900GT KO was in stock. So now I have two of them. I actually have two PCI-E slots in my computer, but one of them has an ethernet card in it.

Would it be possible to take that ethernet card out and SLI the two cards I have together?

The computer is a Gateway 500GR, if that helps.
August 26, 2006 3:56:29 PM

Well, if priced the same, yes, you should have bought the X1900XT, as it's about equal to the 7900GTX. The 7900GT is about equal to the X1800XT, which can be found for as low as $200. But still, it's a good card and I would just enjoy it rather than pay a restocking fee. How much was the 512MB GT? Is it clocked at 560 core / 1500 mem?
August 26, 2006 4:02:46 PM

Quote:
Anyway, that should end all arguements and close the case, the 7900GT is just no match for Oblivion. It's a good card, you just expected too much based on someones careless exagerations. To be honest, no single card has conquered Oblivion, but like farcry, it will happen someday.
I think that day will come with the G80 and R600.
August 26, 2006 4:03:52 PM

You bought both versions! 8O You could have cancelled your Newegg order as long as it hadn't shipped yet. And for the price of those two you could have had a 7950GX2, or an X1900XT and a wad of cash.

Just for starters, you need two of the large 16x PCI-e slots for SLI. Even then there are other hurdles to overcome (SLI-compatible mobo, SLI bridge connector, 512MB vs 256MB card, etc.).
August 26, 2006 4:14:13 PM

Quote:
I think that day will come with the G80 and R600.

I hope you are right. I can still see myself playing Oblivion in post-R600 days. Maybe even when prices drop into my cheapskate league. :roll:

Playing Far Cry on a 9800 Pro, a 6800U, and then an X1800XT just freaked me out at the X1800's power. The Far Cry demo was the first game I played that truly humbled my 9800 Pro. And retail Far Cry made the 6800U struggle, forcing me to switch between no AA, 2xAA, and 4xAA at 12x10 when I wanted 4xAA all the time. HDR was basically a no-go for the 6800U back then. Then the X1800XT had no problem with 16x12 at 6x/16x, which just blew me away. Seeing 16x12 with HDR, and even 2xAA + HDR at 12x10 very playable, was icing on the cake. Man, to see a single card max out Oblivion and stay playable would be just incredible. :D


Oh, and speaking of the G80, the OP may even be able to do the eVGA Step-Up and grab a G80. 90 days, right?
August 26, 2006 4:30:53 PM

I didn't mean to buy both versions; I purchased the 256MB, and my free rush processing on my Newegg Preferred line of credit made the order ship before I was notified the 512MB was in stock.

I have not opened the 256MB and have been told I can RMA it for full value, waiving the 15% restocking fee.
August 26, 2006 4:36:00 PM

I also hope to see FP16 with antialiasing on the G80 or I'll make sure ATI gets my money.
August 26, 2006 4:42:03 PM

I'd do that and return the 256MB if there is no restocking fee. Either 7900 is a good card, and shoot, if the G80 comes out within 90 days, that could prove to be a nice deal for you if you wanted to step up. Latest rumors place the G80 at the end of 2006, which would be beyond your 90-day step-up period. I wonder what the 7900GTX and 7950GX2 will cost within those 90 days. Anyway, it's nice to have the option with eVGA.
August 26, 2006 5:44:40 PM

Quote:
I also hope to see FP16 with antialiasing on the G80 or I'll make sure ATI gets my money.

Yup, assuming the number of games with support (and especially official support) for that will grow, I totally agree. If the G80 doesn't support it, chances are many TWIMTBP games won't officially support it. :evil: 

To be honest, IMO, without that support the G80 would be a letdown even with single-core 7950GX2 or above levels of performance.
August 26, 2006 6:25:19 PM

Does the EVGA Step-Up program work for the 7900GTX and the 7950GX2 or whatever as well, or just for stepping up to a new generation?
August 26, 2006 6:41:54 PM

Quote:
The cards, X1900XTX and GT KO, are pretty close.
http://www.pureoverclock.com/review.php?id=635&page=1
Check that review; not so big of a difference.
I also agree it's definitely the P4.


First of all, I don't think he has the superclock. Second of all, he is complaining about Oblivion, which your own link shows the X1900XT destroys the superclock in. Take a look:

Quote from review: "EVGA's 7900 GT cannot compete with the X1900 XT in Oblivion."

The X1900XT had higher resolution and HQAF, and still managed more than triple the low framerates. A pitiful 6 fps for the 7900 GT superclock and 19 fps for the X1900XT.

Again, it's your own review: http://www.pureoverclock.com/review.php?id=635&page=4

Anyway, in Oblivion, it is the 7900GT, NOT the P4, as even top-notch CPUs show the 7900GT hurting in that game. No offense, but I feel like I'm talking to walls here. I give link after link of proof and yet we still keep hearing it's the CPU. :roll: He isn't running SLI or Crossfire, it's "a single 7900GT in Oblivion"... enough said.
August 26, 2006 7:19:57 PM

I tweaked Oblivion's settings and got rid of grass entirely; although it looks a touch more barren, I can average 35ish fps outdoors and 60+ indoors.

In other good news, I downloaded the single-player demo of FEAR and was averaging 70-80 fps on High settings while not fighting, and even during gunfights against multiple targets the lowest I hit was 39 fps.

The 7900GT just screamed through that game, and even though I take it with a grain of salt since it was the demo and not the full version, I have no doubts it will perform well on the full version too.
August 26, 2006 8:24:34 PM

Yup, for Oblivion, TweakGuides is your friend. Grass height and shadows are the biggest framerate killers to tweak for performance gains.

Good deal on the FEAR gameplay; you should do very well in FEAR with your card. FEAR is tough, but Oblivion and COD2 are more demanding. Try Far Cry, HL2, and BF2 and you'll be thrilled.
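For reference, the tweaks discussed here mostly come down to a few Oblivion.ini edits. The sketch below shows the kind of lines involved; the variable names and values are from memory and should be treated as assumptions, so double-check them against the TweakGuides article before editing:

```ini
; Hypothetical Oblivion.ini tweak sketch - values are examples,
; not the guide's exact recommendations.
iMinGrassSize=120        ; bigger number = sparser grass
fGrassEndDistance=3000.0 ; draw grass over a shorter distance
iActorShadowCountExt=2   ; fewer actor shadows outdoors
iActorShadowCountInt=2   ; fewer actor shadows indoors
```

Grass and shadows are the two biggest outdoor framerate killers, so these are the usual first candidates.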
August 27, 2006 1:05:24 PM

Wasn't the KO the superclocked version? Anyway, it's close... ?
The performance differences between the EVGA e-GeForce 7900 GT KO Superclock and ATI Radeon X1900XT are reasonably small, with a slight advantage to the ATI part. However, the purchasing decision ultimately comes down to what's on offer.


fps, 7900 GT KO: 39.1 at 1280x1024
fps, X1900XTX: 42.1 at 1280x1024

:S That's not a big difference...
There are parts of FEAR where ATI cards simply crash and burn, and the built-in time demo that many other sites use will not show this.

The EVGA KO Superclocked pulls a huge lead over the X1900 XT almost to an embarrassingly high level. In fact, even the stock 7900 GT is close on the heels of the X1900 XT. HQAF has little use in FEAR and its box-like scenery, so I see no advantage held by the X1900 XT at all.

But the standard 7900 GT can't compete; the superclocked does very well. Depends which one you have.
August 27, 2006 1:35:16 PM

Pretty sure the KO is just the reloaded version unless it says Superclocked on it. Mine is factory clocked 500/1500.
August 27, 2006 2:55:15 PM

Quote:
It's your P4 in my opinion, not the 7900. It's gotta be what, 2 or 3 year old tech? I owned a P4 3.0 800 and was never that impressed with it; the damn thing wouldn't even record video without locking up.


You're so wrong, my friend. It's not the CPU, it's the GPU. Maybe you should Google some more Oblivion benchmarks; that game kills the 7900GT. 34 frames? Ha, that's about right for a 7900GT.

You said you weren't impressed with your old P4, but you forgot to mention what video card you were using back then. A GF4, maybe?
August 27, 2006 3:08:49 PM

Oh wait, or you can dump the old P4 and use a top-of-the-line Conroe or similar stuff. For that you'll have to sell everything you have in the house, and the bike too. And the result?

Let's see: from your old setup, P4 + 7900GT, at 34 frames, you'll go to Conroe + 7900GT... and congratulations, you have just achieved 35 frames.

How many trillion times do I have to tell you: Oblivion kills the GPU, not the CPU.

If you want a few frames more, swap that no-good 7900GT for an X1900XTX or better and perhaps achieve 50 fps. But it all comes at a price; the X1900XTX is pretty damn expensive.

Or try another experiment: put the old 6600GT back in your system and see what happens; you'll probably get some 2 or 3 fps in Oblivion. Then put in the 7900GT and see your fps increase from almost zero to some 30. And finally put in a mighty X1900 and see your frames increase more and more. So don't blame a perfectly working CPU for a no-good video system.

August 27, 2006 3:35:10 PM

Quote:
The performance differences between the EVGA e-GeForce 7900 GT KO Superclock and ATI Radeon X1900XT are reasonably small, with a slight advantage to the ATI part. However, the purchasing decision ultimately comes down to what's on offer.
The 7900GT's performance cannot compete with the X1900XT in any title. The 7900GTX can compete because of its extremely high clock speed, but even then it's not able to use a lot of the X1900XT's nice features, such as OpenEXR FP16 HDR with antialiasing, or high-quality anisotropic filtering.
August 27, 2006 3:38:42 PM

Quote:
But the standard 7900 GT can't compete; the superclocked does very well. Depends which one you have.


Man... did you take any medication today? Get real, man, you must be smoking dust. 8O
August 27, 2006 4:03:08 PM

Quote:
The performance differences between the EVGA e-GeForce 7900 GT KO Superclock and ATI Radeon X1900XT are reasonably small...

fps, 7900 GT KO: 39.1 at 1280x1024
fps, X1900XTX: 42.1 at 1280x1024

:S That's not a big difference...

How can you even say that based on their results? You are looking at 12x10 average framerates in making that statement, which mean diddly in Oblivion if one card's low framerates are a third of the other's. It's a fact that the GF7s have a much wider range of framerates in Oblivion than the Radeons, which hurts overall gameplay and playable settings. IMO, and in your reviewer's opinion, it's about best playable settings.

Again, with the X1900XT they 1) increased the resolution, 2) enabled HQAF, which does make a visual difference and also hurts performance in large outdoor areas, and 3) still managed three times the low framerates. You really call lows of 6 fps at lower resolution and with lower eye candy a small difference against lows of 19 fps at higher res and higher image quality? That means way more than average framerates at 12x10. :roll:


Quote:
The EVGA KO Superclocked pulls a huge lead over the X1900 XT almost to an embarrassingly high level.
Yet lows of 6 fps at lower resolution and lower IQ are not embarrassing, but in your words "reasonably small and not a big difference"? :lol:  :lol:  :lol: 


Shoot, the X1900XT also embarrasses the 7900GT Superclock in COD2, but regardless, this is all beside the point, as we aren't talking about FEAR or COD2, nor even the Superclock GT. Plus, I am not saying the 7900GT reference or Superclock is a bad card. The topic at hand is that the OP complained about disappointing Oblivion performance, and you (as well as others) say it's his Pentium 4 and not the 7900GT, which in your own review dropped into single-digit lows. I think I have more than proved that "it's the Pentium 4" is wrong with both your link and other sites' results. Again, it's a 7900GT and Oblivion, and someone set the OP up into falsely believing this card would max out Oblivion, which is not even close to being the case. FS's review showed it can't max out Oblivion at 10x7, even with IQ-compromising driver optimizations on, (edit:) AND paired with an FX-57. It's not the CPU!
August 27, 2006 4:07:24 PM

Quote:
Again, it's a 7900GT and Oblivion, and someone set the OP up into falsely believing this card would max out Oblivion, which is not even close to being the case.


Good, settled then... end of story.
August 27, 2006 4:18:21 PM

Yeah, should have been end of story long ago. :evil: 

But it didn't end, and I left out one important part in closing about the FS review - It was paired with an FX-57 and still couldn't manage 10x7 max. :tongue:
August 27, 2006 4:52:06 PM

I just wanted to know what the deal was, not have two pages of "OMFG NO WAI" screaming back and forth, guys. ><
August 27, 2006 4:57:32 PM

As an added note, I am beginning to feel like I made the wrong choice going with the 7900GT and should get myself an X1900XT. :( 