
X1650 gives low performance

in Graphics & Displays
March 21, 2010 2:20:12 PM

Hello, I have an ATI X1650 Pro DDR2 with 256 MB of RAM for AGP and it gives me lag and low performance in games. I have the ATI 8.33 drivers and can't install newer ones because the GFX won't accept them.
Every game lags (old Postal 2, for example), not to mention newer games like Infernal, Wolfenstein (2009), Call of Juarez (the first one), Jade Empire and many popular games that demand only SM 3.0 and a single-core CPU. Even with everything on low and at the lowest possible resolution, everything lags.
Rest of specs:
mobo: P4PE2-X
ram: 1.5 GB DDR1 400 MHz
proc: Intel Celeron 2.4 GHz
hdd: WD 1600AAJB 160 GB ATA
psu: Chieftec 550 W ABA
The card is plugged into the additional 4-pin connector and seated properly in the mobo.
The fan is dust-free, so it's not overheating (although I can't check the temperature because the card has no sensor for it).

Can you help me resolve this problem somehow?
Thanks


March 21, 2010 2:26:17 PM

The reason for the lag is that

1. You have a single core CPU.

2. That GPU wasn't the fastest even when it was made, never mind years later.
March 21, 2010 2:42:30 PM

Your entire PC is old/very outdated.

You need to build yourself a new PC if you want to play remotely modern games.
March 21, 2010 5:37:33 PM

It is true it's very old, but I watched videos on YouTube and talked to the folks who made them (they have the same card, a single-core CPU, and even less RAM than me), and they can play these games better than I can, with a normal fps around 20 and lag-free.
So I'm thinking it's some issue with the card or some setting in CCC.
Also, some older games I used to play with the ATI 9250 get lower fps with the X1650 Pro (CS 1.6, Postal 2, WoW).
And how can I check my GFX for damage or a malfunction?

March 21, 2010 5:39:44 PM

Download GPU-Z and check the clock speeds. It may be downclocking itself to idle speeds.
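One way to check that objectively is GPU-Z's Sensors tab, which can log readings to a file over time. A minimal sketch of how you might scan such a log for a stuck clock is below; the CSV layout and the column name "GPU Core Clock [MHz]" are assumptions, so check the header row of your own log file (here a tiny fabricated log stands in for a real capture).

```python
import csv

# The column name is an assumption; check your own GPU-Z log's header row.
COLUMN = "GPU Core Clock [MHz]"

def clock_range(path, column=COLUMN):
    """Return (min, max) core clock seen in a GPU-Z-style sensor log."""
    clocks = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            value = (row.get(column) or "").strip()
            if value:
                clocks.append(float(value))
    return min(clocks), max(clocks)

# Tiny fabricated log standing in for a real GPU-Z capture:
with open("sensor_log.csv", "w") as f:
    f.write("Date,GPU Core Clock [MHz]\n")  # real header may differ
    f.write("10:00,290.0\n10:01,290.2\n10:02,289.8\n")

lo, hi = clock_range("sensor_log.csv")
print(f"core clock ranged {lo:.0f}-{hi:.0f} MHz")
if hi - lo < 10:
    print("clock never ramped up under load -- possibly stuck at idle/2D speed")
```

If the core clock never rises while a game is running, the card is stuck at its idle/2D speed, which would explain uniformly bad performance regardless of settings.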
March 22, 2010 5:03:49 PM

Here are the GPU-Z readings:

[GPU-Z screenshot]

Is it normal?
March 22, 2010 5:31:49 PM

They're playing with 20 fps? [:isamuelson]

Uh, I hate to break it to you, but 20 fps is just not playable under any circumstances these days except maybe if you're playing a slide-show game.

For RTS games, you want around 30-40 fps. For FPS games, anything below 45 fps gets to become unplayable, especially if you're playing online games with multiple players.

Just face the facts, your machine is old and is outdated and cannot handle modern day games. It's time for an upgrade no matter WHAT people say on YouTube (and I'd take what they say with a grain of salt).

Also, it's not just your AGP card or your single-core CPU. You're also running old, slow memory. Now, true, the game requires a Pentium 4 1.8 GHz, but that's the minimum. Celerons don't run as well as a true Pentium 4 due to their smaller cache size, and that's what limits them. They were budget CPUs meant more for business PCs, not gaming. Overall, P4s outpaced Celerons in intensive programs and games. So even though your Celeron has a higher GHz rating than what the game requires, it's not the same as a P4 rated at that speed.

I liken this to when I tried to run Doom (the first one) on a 286. :lol:  Now THAT was an experience to behold!
March 22, 2010 5:47:28 PM

isamuelson said:
They're playing with 20 fps? [:isamuelson]

Uh, I hate to break it to you, but 20 fps is just not playable under any circumstances these days except maybe if you're playing a slide-show game.

For RTS games, you want around 30-40 fps. For FPS games, anything below 45 fps gets to become unplayable, especially if you're playing online games with multiple players.


I'm doing just fine with Crysis at 20fps.

And memory speed (like 1333 vs 1600) isn't a major factor when gaming; the GPU/CPU is more likely the bottleneck.
March 22, 2010 5:52:39 PM

builderbobftw said:
I'm doing just fine with Crysis at 20fps.

And memory speed (like 1333 vs 1600) isn't a major factor when gaming; the GPU/CPU is more likely the bottleneck.


Okay. I know 20 fps will get you killed during an online game. FPS games really need 45+ fps to be completely playable.

Back in the Doom days, 20 fps might have been acceptable, but by today's standards, even 35 fps on a FPS game is considered below acceptable standards.

And memory can still be a factor. Maybe not as much as GPU/CPU, but he's running DDR-400 memory. That is MUCH slower than what you stated.
March 22, 2010 5:54:45 PM

isamuelson said:
Okay. I know 20 fps will get you killed during an online game. FPS games really need 45+ fps to be completely playable.

Back in the Doom days, 20 fps might have been acceptable, but by today's standards, even 35 fps on a FPS game is considered below acceptable standards.

And memory can still be a factor. Maybe not as much as GPU/CPU, but he's running DDR-400 memory. That is MUCH slower than what you stated.


The human eye stops addressing frames past 24.

(The small benefit is reducing the distance moved per frame, but that's very minor)

All movies, TV, everything runs at 24 fps.
March 22, 2010 5:58:52 PM

builderbobftw said:
The human eye stops addressing frames past 24.

(The small benefit is reducing the distance moved per frame, but that's very minor)

All movies, TV, everything runs at 24 fps.


I'm not talking about the human. I'm talking about the choppiness that ensues due to the low framerates. And, I'm talking specifically about FPS games. Ask anyone, the minimal accepted frame rate is 40-45. Anything less can deteriorate the game play when playing online.

Maybe playing single player is fine, but I can tell the difference and it drives me crazy when it gets below 30.

I guess maybe I'm spoiled? ;) 



March 22, 2010 6:00:58 PM

isamuelson said:
I'm not talking about the human. I'm talking about the choppiness that ensues due to the low framerates. And, I'm talking specifically about FPS games. Ask anyone, the minimal accepted frame rate is 40-45. Anything less can deteriorate the game play when playing online.

Maybe playing single player is fine, but I can tell the difference and it drives me crazy when it gets below 30.

I guess maybe I'm spoiled? ;) 


I don't know, I own pretty hard at Crysis at 18-25 fps lol.
March 22, 2010 6:05:51 PM

builderbobftw said:
The human eye stops addressing frames past 24.

(The small benefit is reducing the distance moved per frame, but that's very minor)

All movies, TV, everything runs at 24 fps.


TV does run at 24 FPS, but watching TV and playing a game are totally different. I don't consider 24 FPS playable; I want smooth gameplay, and if I'm playing an online FPS I want a consistent 60 FPS. You've got to remember there are going to be dips in frame rate, so if the minimum frame rate was 24 FPS it wouldn't be so bad, but IMO it's still not acceptable.

Everyone has different opinions on what they consider playable, but there is certainly a noticeable difference between, say, 30 FPS and 50 FPS.
March 22, 2010 6:14:20 PM

You might want (but you don't need) 60 frames, but there is no way in hell you can see the difference between 50 and 60 frames.

No matter what you say.
March 22, 2010 6:28:57 PM

builderbobftw said:
You might want (but you don't need) 60 frames, but there is no way in hell you can see the difference between 50 and 60 frames.

No matter what you say.

Oh I didn't realise you have the exact same eyes as me :lol: 
March 22, 2010 6:35:38 PM

No way ANYONE can tell the difference between 50 frames and 60 frames.

Maybe something mental, who knows. You think what you see is better, so you convince yourself you can see a difference.

And also, we did "eye testing" at my high school, and I scored in the top "1%"
March 22, 2010 6:58:26 PM

builderbobftw said:
No way ANYONE can tell the difference between 50 frames and 60 frames.

Maybe something mental, who knows. You think what you see is better, so you convince yourself you can see a difference.

And also, we did "eye testing" at my high school, and I scored in the top "1%"


No need to be so aggressive :pfff:  All I was saying was that I think it's important to have 60 FPS when playing an online FPS. That's my opinion, I'm not mental :pt1cable: 

Congratulations on your eye test at school :lol: 
March 22, 2010 7:37:56 PM

builderbobftw said:
No way ANYONE can tell the difference between 50 frames and 60 frames.

Maybe something mental, who knows. You think what you see is better, so you convince yourself you can see a difference.

And also, we did "eye testing" at my high school, and I scored in the top "1%"


The human eye is not a camera.

We stop perceiving something as flickering anywhere from 40-60 Hz (it varies from person to person, and with the application). Flicker, though, is a different beast from frame rate. The jerkiness we see and complain about is due to the difference between frames. Motion will often look jerky if the frame rate is not high enough to minimize the difference between frames. What you want is for the target to blur so it looks normal and doesn't jump; some people are much more sensitive to this than others, and will even get sick watching a low-frame-rate movie/game.

Generally, frame rate stops looking jerky at about 30 fps for anything but fast motion, at which point you get obvious strobing (the same effect that makes tires look like they spin backwards on TV), which is a real pain in the ass in an FPS. When the flickering goes away, we just perceive it as a dimming of intensity instead of as bright and low points. Mind you, LCDs can't flicker, as the light is always on in one way or another; this is an issue for CRT and theatre displays.

With an action-packed scene in an FPS it is perfectly reasonable for a human to notice the strobing at up to 60 fps. Flickering is caught by our persistence of vision, which is also what lets us notice motion looking a tad 'silly' in a frame-by-frame representation of the motion.
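For what it's worth, the frame rates being argued about translate into per-frame times like this (plain arithmetic, frame time = 1000 / fps, no other assumptions):

```python
# Frame time in milliseconds for a given frame rate.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (20, 24, 30, 45, 50, 60):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# The whole 50-vs-60 argument comes down to the gap per frame:
gap = frame_time_ms(50) - frame_time_ms(60)
print(f"50 vs 60 fps differ by {gap:.1f} ms per frame")
```

So 20 fps means 50 ms between frames, while 50 vs 60 fps is a difference of only about 3.3 ms per frame, which is why the two sides here can both feel right: the jump from 20 to 30 fps is far larger in frame-time terms than the jump from 50 to 60.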
March 22, 2010 7:59:49 PM

Rustyy117 said:

Congratulations on your eye test at school :lol: 


Thanks, I was pretty pumped when I heard that!