My heart is broken in pieces

coylter

Distinguished
Sep 12, 2003
1,322
0
19,280
:(

I hate the feeling you get when your top-end card (9800 Pro) suddenly hits the dirt against the new card .... i am gonna cry


LOL GeForce 6800 is kicking ass... wow

Athlon 2700xp+ (oc: 3200xp+ with 200fsb) , Radeon 9800pro (oc: 410/370) , 512mb pc3200 (3-3-3-2), Asus A7N8X-X<P ID="edit"><FONT SIZE=-1><EM>Edited by coylter on 04/15/04 11:31 AM.</EM></FONT></P>
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
My <b>hearth</b> is broken in piece
I fail to see how a new GPU's release has broken your fireplace. :wink:
<pre>sorry.... couldn't resist that one..</pre>
---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

coylter

Distinguished
Sep 12, 2003
1,322
0
19,280
Did I make an error? ..... Sorry, I am French and there are some English words I don't know correctly

Athlon 2700xp+ (oc: 3200xp+ with 200fsb) , Radeon 9800pro (oc: 410/370) , 512mb pc3200 (3-3-3-2), Asus A7N8X-X
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
It's an acceptable typo... I assume you meant <b>heart</b> (which is the red squishy thing that pumps blood around the body), rather than <b>hearth</b>, which is actually the floor in/around a fireplace.

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
No offence meant, of course - my French is a helluva lot worse than your English :smile: ...

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
I hate the feeling you get when your top-end card (9800 Pro) suddenly hits the dirt against the new card .... i am gonna cry
Friend, there is no reason at all to feel bad. The truth of the matter is simple: while the nV 6800 is MUCH faster than the ATI 9800, the 9800 is still VERY fast.

Think about it: at 1280x1024 + 4xAA and 8xAF it still gets you 40+ fps in every game you throw at it. Moreover, 6800 AA and AF look worse than 9800 in most titles (at least with current Detonators). Don’t believe me? Read THG review carefully.

What does this tell you? Your ATI has <font color=red>at least 12</font> months of very productive life left in it. And remember the truest of truths about frame rates: MOTION PICTURES (as in *FILMS*) are shot at 24 (<font color=red>ONLY</font> twenty-four) frames per second. Seen Terminator 2? Looks VERY good. Right? 24 frames per second!

Hey, 99% of the most demanding games of today and the immediate future are VERY playable even at 20-25 fps. As long as you are not dealing with a 10-15 fps slide show, YOUR VIDEO CARD RULES. Forget about 250 fps benchmarks. Forever. End of story.

Last but not least, if 12 months from now you feel that your ATI 9800 is suddenly a bit sluggish, reduce your AF from 8x to 4x and you are back in business. Believe me, unless you examine your monitor with a magnifying glass you will NOT see any noticeable deterioration of image quality. Just play your games and have fun instead of fixating on all this bull$hit hype.

Yes, I am an nVIDIA fan, but it does not at all mean your card is bad.

I have an nVIDIA FX5700 Ultra with GDDR2 overclocked to 550/1.03. Your card is 50-60% faster than mine, but my 5700 gives me around 30 FPS in all games I have (including some of the latest ones) with most options enabled and 2xAA / 4xAF. And you know? Everything looks pretty damn good. I see no pressing need to upgrade my card. Do you really feel bad about yours?


<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font> :cool:
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
Unless you're very masochistic, FPS (as in, first-person-shooter) games are horrible to play at 20-25FPS.

30FPS is fine when watching a film, but when you're actually controlling the camera with a mouse you <i>do</i> notice the sluggish response.
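The difference is easy to put in numbers: every frame adds its whole duration to the delay between moving the mouse and seeing the result. A minimal sketch of that arithmetic (not from anyone in this thread, just the standard 1000/fps conversion):

```python
# Frame time is the gap between two rendered frames: 1000 ms / fps.
# At 25 fps a mouse movement can wait up to 40 ms before it shows on
# screen; at 60 fps that worst case drops below 17 ms.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (25, 30, 60):
    print(f"{fps:>2} fps -> {frame_time_ms(fps):.1f} ms per frame")
```

A passive film viewer never notices the 33-40 ms gap, but a player steering the camera feels it on every flick of the mouse.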

That said though, I won't be upgrading my 9800Pro until I buy a game that doesn't run as well as I want it to. That's why I upgraded from my Ti4600 - not to enlarge my e-penis with a big 3dMark score.

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
I won't be upgrading my 9800Pro until I buy a game that doesn't run as well as I want it to. That's why I upgraded from my Ti4600 - not to enlarge my e-penis with a big 3dMark score

Exactly my point!

FPS (as in, first-person-shooter) games are horrible to play at 20-25FPS.


You are exaggerating. 25 fps is fine even for a shooter game. I agree that anything less than 25 fps is not perfect, especially if you need to do a quick 180 once you hear someone sneaking up on you from behind. But such perfect game/fps response would only really matter in LAN multiplayer deathmatches. The truth is that for most of the gamers out there who play on the internet, the advantage of high fps is severely limited by internet lag (even with broadband) and server synchronization lag (when the server has to harmonize what multiple players connected at different speeds see on their screens).

The only gaming genre that does require perfectly fluid fps rates is flight simulation. But if you are willing to spend $1500 to upgrade to a 250 fps system just for that, you gotta be either filthy rich or horribly stupid. :lol: Just fly your F-16 at 1024x768 instead of 1600x1200 and you get your fluid fps rate and save your extra $1100 for something more useful...

<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font> :cool:
 

ChipDeath

Splendid
May 16, 2002
4,307
0
22,790
I'm not exaggerating - I personally <i>hate</i> playing at lower FPS, especially badly-programmed crap like HALO, which introduces some horrible mouse lag if you're playing at less than ~60FPS.

---
Epox 8RDA+ rev1.1 w/ Custom NB HS
XP1700+ @205x11 (~2.26Ghz), 1.575Vcore
2x256Mb Corsair PC3200LL 2-2-2-4
Sapphire 9800Pro 420/744
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
Like I said in the above post: most of the fps *problems* can be very easily solved by changing some driver settings - go one notch down on AA, AF, or resolution, and voila!

No, of course, I am not saying that you should enjoy playing at 800x600 with no AA, no AF, and dynamic lights and shadows all turned off. There are limits to the fps/IQ compromise, but any experienced gamer knows damn well that minor driver adjustments often yield big FPS improvements, thus seriously extending the useful life of your card without much IQ sacrifice. Anyhow. ‘nuff said.


<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font> :cool:
 

jiffy

Distinguished
Oct 2, 2001
1,951
0
19,780
I would disagree and say 25 fps is unacceptable, at least from my experience with Far Cry at that frame rate. Sure it plays, but the enjoyment is hindered by the low fps. Sure, I could kill some eye candy, but again that cuts into the enjoyment in the first place. I play the UTs and run most of those games at high fps, and I would hate to imagine a low 25 fps. In fact, when high settings are on I do get some stutter from time to time, and I bet that happens even higher than 25 fps - but to play at 25 fps always? No way. I for one want the game to be smooth at ALL times, and I also want ALL the eye candy.

The 9800 Pro is already being pushed by some games, where to get both smooth play and eye candy you have to make some sacrifice somewhere. Off the top of my head, UT2004 gives the 9800 Pro a run for its money, but Far Cry spanks the h*ll out of it. So I disagree there too that the 9800 Pro is on its way out. I suppose it depends on the size of your pocketbook, because I for one will jump at the chance to play the latest games instead of playing with the settings for hours on end. I hate it when the game isn’t always smooth or I can’t have all the eye candy. I think ATI did pretty well when the 9600 came out, and I hope the newer cards will last just as long.

So I’m going to go buy an Nvidia to finish playing my Far Cry. The frame rate looks playable; I’m just wondering if the frame rate will stay in the 70s or whatever it is, because the late 30s suck. Naa, I’ll see what ATI does first before deciding. Just hope they don’t have plans for a new card six months later that beats the ones that aren’t out yet, because that will suck and I don’t want to upgrade that much, lol.
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
Naa, I’ll see what ATI does first before deciding. Just hope they don’t have plans for a new card six months later that beats the ones that aren’t out yet, because that will suck and I don’t want to upgrade that much, lol.
So you DO see my point after all! Hehe. That's exactly the thing. You will never ever get all games to play at insane fps with all the eye candy on for longer than 2-3 months before you have to play with driver settings. Hey, once you get your 100 fps at 1600x1200 with 8xAA and 16xAF, you will complain that you have to turn AF from 16x to 8x because the new game you just bought is not perfectly smooth and does not let you turn on all the eye candy. So why obsess about the absolute latest and greatest?


<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font> :cool:
 

CaptainNemo

Distinguished
Jun 19, 2002
245
0
18,680
Heh - all good points...

There is no need to be upset by NV40; the whole gfx card thang has been stagnant for months, but now tongues are wagging again.

Axis of Stupid = coop, Kanavit, FUGGER, and SoDNighthawk
 

Spitfire_x86

Splendid
Jun 26, 2002
7,248
0
25,780
No reason to be heartbroken. I have a Radeon 9000 (non-Pro), which is nowhere near the top of the benchmark charts, but it plays most of my games well enough

------------
<A HREF="http://geocities.com/spitfire_x86" target="_new">My Website</A>

<A HREF="http://geocities.com/spitfire_x86/myrig.html" target="_new">My Rig</A> & <A HREF="http://geocities.com/spitfire_x86/benchmark.html" target="_new">3DMark score</A>
 

jiffy

Distinguished
Oct 2, 2001
1,951
0
19,780
“2-3 months before you have to start making changes to the settings.” You could be right; I guess it depends on the future games. Would it be safe to say that the ATI 9600 held its own for a little longer than two months - like a couple of years - where you didn’t have to sacrifice too many settings? Whatever the time frame was, it sure beat what Nv was doing with the GF, GF Ultra, GF3 and then GF4, and yeah, I’d hate to see that repeat itself. You make a point that you might have to sacrifice, and using Far Cry as an example the sacrifice would be severe - even Nv looks to barely pull it off. Maybe the next card isn’t the answer, but the second or third one, where it might have some staying power.
 

Slava

Distinguished
Mar 6, 2002
914
0
18,980
I don't have time to go into details now, but the video card is FAR from the only component that determines your machine's longevity. If the system is built and configured right, you can get BETTER stats with a WEAKER video card than the next guy with a top-of-the-line card and a messed-up, poorly designed system.

When components are chosen, care must be taken that they work in perfect harmony with each other, maximizing each other's capabilities.

That said, tell you what: I only felt the need to build a new monster rig about 2 months ago. Why? Because Knights of the Old Republic (and it IS very demanding) refused to run smoothly (faster than 25 fps, that is) at 1280x960 with 4xAA and 4xAF with all graphics options enabled. And guess what? The system I am talking about is a 2.5-year-old P4 1.8 GHz with an ASUS GeForce3 Ti-500 Deluxe 64MB card!

Later.

<font color=green>Stingy people end up paying double. One kick-ass rig that will go strong for three years or one half-decent one every year?</font> :cool:
 

jiffy

Distinguished
Oct 2, 2001
1,951
0
19,780
I didn't just walk in off the street. At the moment I have a P4 2.8 clocked at 3.3; I could get more if I wanted to spend the time. I also have 1 gig of Corsair RAM - 512 of 2700 DDR and 512 of 3200 DDR - which I'm thinking of upgrading, by the way. A W/D 120 gig HD on an Asus P4800 mobo. Before that I had an Abit 266 mobo with an XP1800, the DDR 2700, the same HD and a GF4 4400. Before that, an Abit 133 mobo with a 1200 TB, an IBM HD, Corsair RAM and a GF2 Ultra card. A couple more systems down and that would sport the first GF card I bought.

Every item I ever bought was top-shelf stuff and scored high, and I used to tweak my systems so much I learned to format the OS like it was second nature to me. The only tweaking I didn't do was with video drivers. I did at first and gained performance, but lost eye candy in the process; besides that, I believe I've done about as much as one could do. I also believe I was current most of the time, updating my XP1800 when they had just started coming out with faster chips, but I chose the P4 2.8 because it seemed like the right thing to do when it was priced just under $300. My RAM is close to running at 200 MHz and is about the only thing I might be able to benefit from a little. Maybe a HD. I run my services as lean as I can without having to turn stuff back on when I'm through playing games. In fact I even thought of getting another HD and maybe just loading Win98 on it, just to try to get that little extra in games. Also, I tweaked my video card as high as I could get it, and to tell the truth I have yet to see any difference, because the game that gave me the problem still did after overclocking.

I guess the difference between you and me is I'm a serious gamer, or hardcore, who must have the latest to be happy. You're still using an outdated GF3 and I'm ready to trade in my 9800 Pro - what does that tell you? You settle for less. I have a GF4, and to be honest it would be torture to even think of using that card again. Sure it plays games, but so does an MX card.

Oops, my bad - you did upgrade your GF3 two months ago. Didn't mean to sound harsh if you were still using the GF3.

<P ID="edit"><FONT SIZE=-1><EM>Edited by jiffy on 04/15/04 06:38 PM.</EM></FONT></P>