
THE 8600GT IS FINE FOR GAMING, DON'T BELIEVE ANYTHING ELSE!!

December 17, 2007 1:45:34 AM

Sorry to all of you, but this has to be done.

First off, I can play Crysis. That's right, Crysis. All settings on high except shader quality, no AA, and I get a steady 30 fps. My machine is a $500 USD Gateway with an XFX 8600GT XXX Edition (OC'ed to 750 MHz core, 951 MHz memory {1902 effective}). Everyone says that you can only game with an 8600GTS and up, but it is not true. I can play any game I please at 30+ fps without lag, and at a nice res of 1152x864, which again is not bad. People say anything under 1280x1024 is a waste of time, but that's another rant. Granted, not all 8600GTs will OC to what I have, but mine is all stock components with a $10 Antec 88mm fan at 1/2 speed blowing on my card to get cool air into the "stock HSF". So say what you will about me posting this, but as I have stated before, it had to be done.

Again, I'm sorry.


December 17, 2007 2:12:22 AM

I play Crysis on medium and Source engine games at full settings, all at 1440x900, with my 7600GT. People underestimate these "low power cards".
December 17, 2007 2:16:53 AM

Thank you! They really do. Love the use of quotes.
December 17, 2007 2:20:27 AM

30 fps is the MINIMUM for an FPS... you need 60 fps for real playing, man. :p
December 17, 2007 2:23:02 AM

Still, I believe it's highly playable at 30 fps.
December 17, 2007 2:27:36 AM

I agree, you can play a lot of games with playable settings, you just have to work with it.
December 17, 2007 2:33:33 AM

Right. 8600gt is fine for gaming. But for the price? Heck no. Thankfully it's dropping...
December 17, 2007 2:37:24 AM

Bear with me...
1152x864 ≈ 995K pixels
1280x1024 ≈ 1310K pixels
That's ~131% of the pixels going up, or ~76% going back down.

What does that mean, you ask?
If you were to play at 1280x1024 (the minimum that most people play at), your 30 fps becomes ~23 fps, which IS unplayable to me.

I'm not calling BS, but without seeing your settings I'd find it hard to believe you can run 30 fps MIN. Maybe 30 fps max, which would be ~20 during heavy action.

Bottom line: if the card works for you, that's fantastic. However, IMHO, I always aim for ~60 fps, so when the heat comes and things start to crunch it doesn't get below 30.
If someone running 12x10 thinks they can happily run Crysis with an 8600GT, I'd suggest they would be disappointed.
I'm not that happy with my GTS 320, OC'ed.
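(If anyone wants to redo the arithmetic above themselves, here's a minimal Python sketch of the same pixel-count estimate. It assumes frame rate scales roughly inversely with resolution, which is only a rule of thumb since games are also CPU and shader bound, and the function name is made up purely for illustration.)

def estimate_fps(current_fps, current_res, target_res):
    # Assumes FPS scales inversely with total pixel count (rough estimate only).
    cur_pixels = current_res[0] * current_res[1]
    tgt_pixels = target_res[0] * target_res[1]
    return current_fps * cur_pixels / tgt_pixels

# 1152x864 ~= 995K pixels, 1280x1024 ~= 1310K pixels
print(round(estimate_fps(30, (1152, 864), (1280, 1024)), 1))  # ~22.8, i.e. the ~23 fps above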
December 17, 2007 2:49:01 AM

I can play Crysis on my laptop w/ATI x1400 256mb, but just at stock settings. Not sure what the FPS is, but it looks pretty good. The beauty is in the eye of the beholder, so if it works for you, go with it!
December 17, 2007 2:51:42 AM

Your last part confused me... Also, I was just saying you can game. Have you looked at screenshots of what I play at compared to what you say most play at, with the same settings? You won't find that much of a difference, and you can't say that you can see a difference between even 45 and 60 fps, because I know your brain can't process the frames that fast, so you'd be OK with 45 fps, and my 30 fps with my settings suits me just fine. Plus, you're straying from my original post: I'm saying you can game with an 8600GT, I'm not saying you can run it on a 62-inch plasma with a refresh of 150 Hz, OK...?

P.S. I just brought Crysis up as an example of a game I can play at 30 fps. However, my main point was, again, that it's possible to game.
December 17, 2007 2:53:04 AM

I can only get 30fps on low-medium with my x1950 pro, methinks you measured the framerate with your brain.
December 17, 2007 2:53:23 AM

Thank you, lunyone. I also want to thank all of you for not making this a heated fight but rather a well-worded debate.

December 17, 2007 2:53:46 AM

lol, I used FRAPS...
December 17, 2007 2:54:36 AM

BTW, I overclocked my shader clock to 1700 MHz...
December 17, 2007 2:55:13 AM

Well then that is excellent. I always knew my x1950 pro was a piece of poop.
December 17, 2007 2:59:36 AM

lol, well if it works for you I'm glad =) Lovin' the profile picture.
December 17, 2007 3:01:59 AM

By the way mrmez, I believe 23 fps is still playable... just me though.
December 17, 2007 3:09:17 AM

I agree with mrmez, partially.
My OC'd X1650XT AGP can play at 1280x1024 on medium/low settings with an average framerate of 24 fps.
If my weaker card can push 24 frames, I'm sure his 8600 can push 30.
But considering mine is an average, and we don't know what his is... I'm not so sure. Is 30 the max? I'll bet on it dipping in gameplay too.
December 17, 2007 3:21:10 AM

But he said 30 on high except shader quality. Considering shaders have the biggest impact, this makes sense. I missed that part before, and now it seems much more likely that he can get 30 fps. I am able to as well, as long as shaders aren't on high.
December 17, 2007 3:30:21 AM

Playable is one thing ... enjoyable is another ...

It's nice if it works fine for you. I personally prefer to keep my FPS higher than 50. But hey, we're talking about Crysis, so nice job anyway ;)
December 17, 2007 3:45:46 AM

randomizer said:
methinks you measured the framerate with your brain.


ROFL! :lol: 
Quote of the month!

Anyway, as I said, if it works for you, that's awesome. If everyone was happy with the same things, we wouldn't have so many acceptable game platforms out at any given time.

Most modern games are also very 'scalable' (if you like), meaning they can punish a quad core GTX SLI system, or fly on a single core with an 8600.
December 17, 2007 3:54:40 AM

x_2fast4u_x said:
Sorry to all of you, but this has to be done.

First off, I can play Crysis. That's right, Crysis. All settings on high except shader quality, no AA, and I get a steady 30 fps. My machine is a $500 USD Gateway with an XFX 8600GT XXX Edition (OC'ed to 750 MHz core, 951 MHz memory {1902 effective}). Everyone says that you can only game with an 8600GTS and up, but it is not true. I can play any game I please at 30+ fps without lag, and at a nice res of 1152x864, which again is not bad. People say anything under 1280x1024 is a waste of time, but that's another rant. Granted, not all 8600GTs will OC to what I have, but mine is all stock components with a $10 Antec 88mm fan at 1/2 speed blowing on my card to get cool air into the "stock HSF". So say what you will about me posting this, but as I have stated before, it had to be done.

Again, I'm sorry.


You do realize that your 8600GT is clocked higher than a GTS, yes? Except for memory, of course.

8600GT = 540 core, 1400 memory
8600GTS = 675 core, 2000 memory
you = 750 core, 1900 memory

so really, you do need a GTS.
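(Just to put numbers on that, here's a minimal Python sketch comparing those clocks as percentages. The only inputs are the reference and overclocked figures listed in this post, so nothing here is official spec data beyond those numbers.)

reference = {
    "8600GT":  {"core": 540, "memory": 1400},
    "8600GTS": {"core": 675, "memory": 2000},
}
op_card = {"core": 750, "memory": 1900}  # the OP's overclock, as stated above

for name, clocks in reference.items():
    core_gain = (op_card["core"] / clocks["core"] - 1) * 100
    mem_gain = (op_card["memory"] / clocks["memory"] - 1) * 100
    print(f"vs {name}: core {core_gain:+.0f}%, memory {mem_gain:+.0f}%")
# vs 8600GT:  core +39%, memory +36%
# vs 8600GTS: core +11%, memory -5%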
December 17, 2007 4:21:13 AM

"you wont find that much of a differnce and you cant say that you can see a difference between even 45 up to 60 fps beacuse i know your brain cant prosess the flashs that fast so ud be ok with 45 fps"

WHAT!?!?.. Anyone who plays first person shooters knows that there is a difference. I can easily tell when my frames drop below 50 fps in many games. I must admit you don't notice it until you play the game at or above 60 fps. Example: my computer couldn't play Oblivion very well, but I liked how it looked and the frames were between 20-45. I upgraded, and now it runs 60+ and looks so much better because it runs smooth. Bioshock, Hellgate: London and FEAR are just a few games where sacrificing some settings to run 60+ will make it so much better to play.
December 17, 2007 4:22:43 AM

Then again, if you're like me, an 8600GT is great for gaming, considering all I play is Risk, emulators, and Bejeweled 2 now and then. As some have said, what's playable is up to the user, but is there really a HUGE difference between 100 FPS and 200 FPS? As randomizer said, playable is measured by the brain.
December 17, 2007 4:35:06 AM

blotch said:
WHAT!?!?.. Anyone who plays first person shooters knows that there is a difference. I can easily tell when my frames drop below 50 fps in many games. I must admit you don't notice it until you play the game at or above 60 fps. Example: my computer couldn't play Oblivion very well, but I liked how it looked and the frames were between 20-45. I upgraded, and now it runs 60+ and looks so much better because it runs smooth. Bioshock, Hellgate: London and FEAR are just a few games where sacrificing some settings to run 60+ will make it so much better to play.

Bah your brain is n00b! :kaola: 

I can tell when my FPS drops below 200 :D 
December 17, 2007 4:38:07 AM

:kaola:  Randomizer, you silly cat.
December 17, 2007 4:38:54 AM

I wasn't joking. 100, 200, 1000 all have a different feel to them. Same as 20, 30 and 60. <15 doesn't have a feel, only a look.
December 17, 2007 4:44:58 AM

randomizer said:
I wasn't joking. 100, 200, 1000 all have a different feel to them. Same as 20, 30 and 60. <15 doesn't have a feel, only a look.

Would that be a slideshow? Lol!!!! Nice!
December 17, 2007 4:49:00 AM

randomizer, I got an X1950 Pro and an X2 4000, and I play Crysis on mostly high settings and get 42 avg, 56 high, and 26 low at 1024x768. Why is yours so low?.... Running stock, by the way.
December 17, 2007 4:58:51 AM

First, as was mentioned, you really have a GTS, not a GT. Clock it down to a normal GT and tell us how it is.

The other problem is the res you are gaming at. 1152x864 is not very large. Take a look at Newegg's list of monitors. LCDs normally are bigger than what you are playing at, and unlike CRTs, you don't want to play at anything other than the native res. Let me list the resolutions of all the 19" and smaller monitors on the first Newegg page: 1440x900 and 1440x900; one is a 17" and the other is a 19". Both of these are larger than the 1152x864 which you play at. (The monitors larger than 19" start at 1680x1050, which is even larger.)

The 8600GT isn't quite good enough for gaming. You'd have to disable too much to make it work, unless you game around 1024x768. If you have a modern monitor, meaning you are gaming around 1440x900 or larger, you really need at least an 8600GTS, and the 3850 is even better.
December 17, 2007 5:39:02 AM

vegie said:
randomizer, I got an X1950 Pro and an X2 4000, and I play Crysis on mostly high settings and get 42 avg, 56 high, and 26 low at 1024x768. Why is yours so low?.... Running stock, by the way.

Ok, for one, it's Vista, so I lose 10 FPS there :lol:  I will need to retest it on XP when my PC gets built (running it in my dad's at the moment as my PC is dead, using a T-bird system :lol: ).
December 17, 2007 5:49:51 AM

While the lower end cards are often underestimated, do not underestimate the power of the dar-, er, I mean the higher end cards!

An 8800GT is far beyond the capabilities of an 8600 anything!

I feel like the point of this thread is to tell people that it's ok to buy these low end cards...... For some reason that makes my stomach hurt =\

Gah.... Where's my 9800GTS!!!!
December 17, 2007 6:01:24 AM

Funny how the whole "you can't tell *blah resolution from blah resolution*" thing gets tossed in there.......

If you can run at an average 60 fps or an average 30 fps, then yes, you will notice a few things....

What you will notice is the minimum FPS (which is your card dropping below a smooth frame rate)... It's not a difference of your brain being able to distinguish between 30 and 60 (which is something I refuse to argue about).

Anyhow, to the OP: I game with my 8600GT (normal clocks) just fine, and there hasn't been one game I've played that didn't play well. I almost considered getting an ATI 3850 for a little more punch, but my wallet wouldn't let me. Besides, I am not unhappy with this card. Hell, I got it several months ago (now that I think about it, it was nearly a year ago) when everyone was yappin' about how expensive they were; with a little shopping around I found mine on sale for $100. At the time it was $5 more than a 7600GT, so I went for it. Still chuggin' along with my 3800+ X2 quite happily at 1024x768.
December 17, 2007 6:05:13 AM

256MB 8800GT > HD3850 > X1950pro > 8600gts = 7900gs > 8600GT = 7600GT for around the same price range

/thread

This is why you hear people bashing the 8600 series: they do give reasonable performance, but price/performance-wise they lag behind several much better deals, and there is no real gain over the last generation.
December 17, 2007 6:45:18 AM

Again IR_Efrem, that's because you are gaming at 1024x768. That is such a small res that many cards look good. If you tried to bump that up to 1440x900, you would see your FPS start to drop. If you had an LCD at 1680x1050, it would be almost unplayable without dropping all of the detail levels down. Any card will work if you don't ask too much of it. Show me a graph that shows a stock clocked 8600GT playing Crysis at 1680x1050 with a minimum FPS of 60, and I'll show you a graph that has been faked.
December 17, 2007 8:54:02 AM

turboflame said:
256MB 8800GT > HD3850 > X1950pro > 8600gts = 7900gs > 8600GT = 7600GT for around the same price range

/thread

This is why you hear people bashing the 8600 series: they do give reasonable performance, but price/performance-wise they lag behind several much better deals, and there is no real gain over the last generation.


That would depend on the game. In older games the 8600GT isn't much faster than a 7600GT, but in modern games it is faster than a 7950GT. People bash because they are ignorant and clueless.

http://images.anandtech.com/graphs/unreal%20tournament%...

http://www.gamespot.com/features/6183967/p-4.html

http://www.gamespot.com/features/6182806/p-5.html
December 17, 2007 8:56:23 AM

4745454b said:
Again IR_Efrem, that's because you are gaming at 1024x768. That is such a small res that many cards look good. If you tried to bump that up to 1440x900, you would see your FPS start to drop. If you had an LCD at 1680x1050, it would be almost unplayable without dropping all of the detail levels down. Any card will work if you don't ask too much of it. Show me a graph that shows a stock clocked 8600GT playing Crysis at 1680x1050 with a minimum FPS of 60, and I'll show you a graph that has been faked.



I don't think any card can play Crysis with a minimum of 60 fps @ 1680x1050 unless you lower settings.
December 17, 2007 9:01:18 AM

I can play Crysis with everything on medium at 1440x900 without any problems too, with my system running Vista x86. No major slowdowns until I fight the last battle on the flight deck. Everything else I can play with full detail.
December 17, 2007 9:04:16 AM

If you are a person who gets unhappy playing at medium settings, buying lower cards is even more of a waste. They are already "old". I mean man, what can you do with an 8600GT a year later? It's trash. Yeah, buy an 8600GT and play Crysis at 1152x864 at 30 fps, but accept the fact that that's going to be the most beautiful thing you're ever gonna see on your computer until you upgrade again.

Personally, my strategy of upgrading would be to buy one or two levels before the "HOLY S%& THAT'S EXPENSIVE!!" level so that it lasts at least 2 years and hopefully more.

Though I wouldn't want to demoralise lower card buyers. With a cheap price, you can easily play any game at mediumish settings, as mentioned above. It's just a matter of choice... and wallet.
December 17, 2007 9:36:34 AM

I believe fully in what the OP is saying. My old 7600GT could do fine in any game except the most recent ones like Crysis etc. I played through Crysis first on my 7600GT at 1024x768 with med-low settings and got around 25 fps in most areas, and would then lower the res to 800x600 in large open areas. Still the best looking game I had ever played.

I agree the 8600s are not suitable for new games with max settings and high res, but you will still get the experience, just on a budget.

Anyway, if you are building a new PC for gaming, the minimum card you should get is a 3850, which will play all games at 1280x1024 at max very well, compared to an 8600 which will struggle with some.
December 17, 2007 9:38:53 AM

Hehe
Well, it's also true that if you ask too much from any card it will be unplayable, even for Ultras.

When I got my GTS 320 I wasn't even planning on going higher than 1280x1024 (my native screen res.); everyone said get the 640+ for higher resolutions, and I agreed with that.
But in every thread I read, people kept saying the 320 wasn't good enough for playing at 16x10 or 19x12 resolutions...
Somehow people kept leaving the resolution part out altogether, so it sounded like the 320 couldn't handle games, period.
Benchmarks told me it held its own even at higher resolutions, but the "truth" was it wasn't good enough.
Soon people who played at 1280x1024 were being recommended GTS 640s and better cards.

YES, the 8600 can play games, but not as well as a GTX and certainly not with GTX settings...
People keep forgetting the "eye of the beholder" part and just keep pushing for whatever they think is good enough.
I think the very vast majority can live with JUST playing on medium settings with decent frames; very high with insane frames isn't required by all.

My friend and I each own an X1600XT, but I downgraded from my former card and he hasn't; let's just say our views on CoD4 differ.
Now I'm gonna wait for Santa, since I sold my GTS 320 for bigger, better things to come.

/sorry for rambling
December 17, 2007 11:42:46 AM

Playable and 30 average I can believe, but 30 minimum is hard to swallow. Crysis is a beast. Is that just in the beginning of the game where it isn't as stressful? There are cutscenes and parts later in the game where fps dip real low.

At all medium settings, in Win XP, Legion averages 34 fps at 1280x800. That's pretty close to the same res mentioned in the OP, yet he mentioned higher details - all high except shaders (were shaders med or low?)
http://www.legionhardware.com/document.php?id=698&p=2

In their HD3850 vs 8600GTS comparison, it's easy to see there is no contest. Again it's win xp, and although they don't go as low in res, the 8600GTS tanks at their lowest 14x9. Averages 12 at high details and 21 at medium details. http://www.legionhardware.com/document.php?id=704&p=2

Anand's charts are not working, but they show this same spanking.
http://www.anandtech.com/video/showdoc.aspx?i=3151&p=8

Firingsquad show high details under Vista 64 bit where the 8600GTS averages 11.5 fps at 12x10. The X1950 Pro did worse in this one. Sorry Randomizer, Vista isn't your friend. (But it did better in Legion's Win XP medium test.) http://www.firingsquad.com/hardware/amd_rv670_performan...


Anyway, I wasn't happy with my Crysis performance at native 16x10 with an 8800GTS 320MB. But glad you are happy with it, that's what matters.


December 17, 2007 11:55:45 AM

For me averaging 12 FPS or 15 FPS is awesome... for slideshows. If I was demoing a game at 15 FPS for people to buy the card or game, I would be shown the door.
December 17, 2007 2:29:09 PM

pauldh said:
Playable and 30 average I can believe, but 30 minimum is hard to swallow. Crysis is a beast. Is that just in the beginning of the game where it isn't as stressful? There are cutscenes and parts later in the game where fps dip real low.

At all medium settings, in Win XP, Legion averages 34 fps at 1280x800. That's pretty close to the same res mentioned in the OP, yet he mentioned higher details - all high except shaders (were shaders med or low?)
http://www.legionhardware.com/document.php?id=698&p=2

In their HD3850 vs 8600GTS comparison, it's easy to see there is no contest. Again it's win xp, and although they don't go as low in res, the 8600GTS tanks at their lowest 14x9. Averages 12 at high details and 21 at medium details. http://www.legionhardware.com/document.php?id=704&p=2

Anand's charts are not working, but they show this same spanking.
http://www.anandtech.com/video/showdoc.aspx?i=3151&p=8

Firingsquad show high details under Vista 64 bit where the 8600GTS averages 11.5 fps at 12x10. The X1950 Pro did worse in this one. Sorry Randomizer, Vista isn't your friend. (But it did better in Legion's Win XP medium test.) http://www.firingsquad.com/hardware/amd_rv670_performan...


Anyway, I wasn't happy with my Crysis performance at native 16x10 with an 8800GTS 320MB. But glad you are happy with it, that's what matters.


Those benchmarks from Legionhardware are so messed up I don't know where to begin. Their whole FEAR benchmarks, not to mention Crysis.

FEAR 45 fps @ 1440x900? This is awfully low. On major hardware websites the 8600GTS averages around 80 fps. I get 87 fps running on Vista x86 at the same resolution.

As for Crysis, I get 30 fps @ 1440x900 in the included benchmark with medium settings. How did Legionhardware get 24 fps, and 21.8 fps in the other test, using the same hardware and operating system? I have to question some of these benchmarks by certain websites, as the people testing are human and make mistakes or get free incentives.
December 17, 2007 2:36:12 PM

I'm thinking it's the latter of the two. Hard to find reputable and neutral benchmarking sites these days :(
December 17, 2007 8:00:19 PM

Wow, I didn't want this to become a "who has more balls in Crysis" contest (pardon my use of language). I just wanted to say that CS:S and all of the other big games with massive fan bases, such as Enemy Territory: Quake Wars, can all be played on my card. Sorry if I'm 15 with a tiny budget, but you can game with the card I have, even at stock speeds.
December 17, 2007 8:15:02 PM

marvelous211 said:
Those benchmarks from Legionhardware are so messed up I don't know where to begin. Their whole FEAR benchmarks, not to mention Crysis.

FEAR 45 fps @ 1440x900? This is awfully low. On major hardware websites the 8600GTS averages around 80 fps. I get 87 fps running on Vista x86 at the same resolution.

As for Crysis, I get 30 fps @ 1440x900 in the included benchmark with medium settings. How did Legionhardware get 24 fps, and 21.8 fps in the other test, using the same hardware and operating system? I have to question some of these benchmarks by certain websites, as the people testing are human and make mistakes or get free incentives.


Can you show me where a stock clocked 8600GTS averages 80 fps in FEAR at MAX Quality like Legion used? When you get 87 fps avg, are you at max details? Are soft shadows on? Have you tried reference clocks?

These fall in line with Legions results:
http://www.hothardware.com/printarticle.aspx?articleid=...
http://www.pcstats.com/articleview.cfm?articleid=2107&p...
http://www.firingsquad.com/hardware/amd_rv670_performan...

As far as Crysis, yours is an overclocked card, theirs is not. Have you tried reference clock speeds and their drivers to see if you beat 24 fps? Are they even running the built-in timedemo?

Look at [H] running an 8600GTS Superclock at 10x7 med/low as max playable. And they don't use timedemos, just actual gameplay, most likely in demanding areas. [H] quote: "The Radeon HD 3850 simply blows the GeForce 8600 GTS away playing Crysis. With the 8600 GTS Superclock we found that we had to lower the resolution to 1024x768. At 1280x1024 performance was so low that we would have had to set many in-game options, including shader quality, to “Low” which just looks downright ugly in Crysis."

I question their opinion of max playable settings often, but if they drop to 16 fps as a low at 10x7 med/low, how am I to believe someone claiming their card stays above 30 at much higher details and a higher res? http://hardocp.com/article.html?art=MTQxOSw0LCxoZW50aHV...

Shoot, look at Digit-Life's (iXBT) results:
http://www.ixbt.com/video3/rv670-part3-d.shtml#p216
http://www.digit-life.com/articles3/video/rv670-part3-p...

I don't think these sites are all botching their reviews or fudging numbers for the fun of it. But I'd love to see some links that show an 8600GTS or OC'ed GT maintaining 30 fps in Crysis at almost all high details like the OP specified. I haven't seen any review even close to that.
December 17, 2007 8:23:02 PM

Well, I'm not here to prove my results as true, I'm here to say that a system with a GPU like mine can game!!! My whole point, again, was not to challenge everyone to a "who has more balls in Crysis" deathmatch!
December 17, 2007 8:28:45 PM

How is 30 fps the minimum for Crysis?

Don't some of you guys actually play the game, or are you just too busy looking at detail and figures?


20 fps is easily enough for Crysis to be pretty smooth. Its gameplay is slow, so there is no need for high FPS. Unreal 3, however, has fast gameplay, so high FPS is a must.


But seriously, don't let numbers decide if it's "playable", try playing the damn game!
December 17, 2007 8:38:09 PM

Yeah, I'd agree that Crysis is very smooth at lower than normal fps. 20 fps doesn't really feel smooth to me though, but unlike most games, 25 fps isn't bad at all. It's the drops into the teens during cutscenes and the most demanding spots that drive me to adjust settings for higher fps and not be too happy with the 8800GTS 320MB for this game (at 16x10).