X800GT Benchmarked

HKEPC once again breaking new ground:

<A HREF="http://www.hkepc.com/hwdb/x800gt-1.htm" target="_new">http://www.hkepc.com/hwdb/x800gt-1.htm</A>

They were unsuccessful at unlocking pipelines.

Interesting difference in results between the 128MB and 256MB versions.


EDIT: The article has been put on hold; go to <A HREF="http://www.hkepc.com/" target="_new">HKEPC</A> and check in on occasion if you care. For the time being it says (Coming Soon).

  1. Yeah, I guess being an "enthusiast" gamer, I'm not too interested, but I guess it is good for the mainstream users out there. Where nVidia usually won with the mainstream users, due to ATI's lack, maybe this card could help solve that problem.

    I must admit, though, I'm all confused with the entire x850 line-up ATI has compiled. If I spent enough time, I'd probably understand it better, but it certainly doesn't compare to the 6200, 6600, 6800...lol. I can understand those...

    And I find it a little quirky that this low-end card just happens to use the suffix GT. My, oh my, it looks quite familiar... ;)

    All in all, this X800 GT looks to be a strong competitor against the compared cards. I guess the next thing that will play a major role is price. If this card ended up costing less than, let's say, a 6600, then I would have to view it as quite interesting, aside from the benchmark results.

    It'll also be important when it comes out, though. You know, by the time it does, in fact, come out, nVidia could have already released the mainstream and low-end cards for the GeForce 7000 series.

    <P ID="edit"><FONT SIZE=-1><EM>Edited by Gamer_369 on 07/30/05 00:03 AM.</EM></FONT></P>
  2. I don't think ATI needs to worry much. Nvidia didn't dominate anything; in the last fiscal year, Nvidia lost to ATI in the desktop sector, the integrated sector, AND the mobile sector.
  3. Nice card, but pointless, as the market is already being flooded with so many models which all offer the same performance or better for the money. A lot of people don't bother with 6800LEs/6800 vanillas and X800 Pro/X800GT models now, because for a few £/$ more we have the X800XT/X800XL/6800GT, which are all far better. And with the very competitive pricing of the 6800GT at £200 or below in the UK, there is not much of a market for a £180 X800GT when the XL is already that price and the 6600GT is at £100.

    IMO the X800GT is a waste of time unless it's CrossFire ready.

  4. Revenue.
  5. Why am I not surprised you'd pipe in with some nV sponsored blathering?

    I see your ignorance hasn't improved either.

    Quote:
    Where nVidia usually won with the mainstream users, due to ATI's lack, maybe this card could help solve that problem.

    As scottchen mentions, <A HREF="http://www.xbitlabs.com/news/video/display/20050728052358.html" target="_new">Mercury says otherwise</A>; guess you need another A$$umption to work with. So if you're saying this will simply improve ATi's position, I guess even more of the mainstream will be theirs.

    Quote:
    I must admit, though, I'm all confused

    That's not surprising really.

    Quote:
    the 6200, 6600, 6800...lol. I can understand those...

    Well of course, your nV Fanboi guidebook has the secret decoder ring for how the GF6800LE stacks up with the GF6600GT, and why an LE isn't always an LE.

    Quote:
    And I find it a little quirky that this low-end card just happens to use the suffix GT. My, oh my, it looks quite familiar...

    Yeah, similar to nV's FX line using XT; turnabout is fair play, I guess. Speaking of convoluted naming, those FX lines had more endings than a choose-your-own-adventure.

    Quote:
    You know, by the time it does, in fact, come out, nVidia could have already released the mainstream and low-end cards for the GeForce 7000 series.

    Sure, sure, of course, and then the next-generation SiS cards could blow them both out of the water.


  6. Quote:
    Nice card, but pointless, as the market is already being flooded with so many models which all offer the same performance or better for the money

    Companies release revised cards, like the R9800SE or FX5900XT, because they can be made at low cost, even if they compete against equally performing parts; if they are cast-offs, then getting +$1 for them makes them a good play for the company. Even if the X800GT takes away from X700XT or even plain X800 sales, if it costs less to produce (using chips that couldn't perform as X800 Pros or vanilla X800s and would've been tossed otherwise), then the benefit is there. In all likelihood the prices will come down just like all previous cards, but even if they don't (like many GF6800Us and X850XTs), they will still likely find a niche in the big-box market, and may take volume pressure off the e-tailers, which isn't good for the Mfr, but ends up being good for us.

    I'd prefer the vanilla X800 or GF6800 over the GT, but pricing will likely change the attractiveness within months or even weeks of release. As for UK prices, when are those ever a good gauge? You guys get $crewed over worse than Canadians on prices. :tongue:


  7. Chips. Too many stinking chips lying around with nothing better to do. Can't eat them. So make another vid card to get rid of them.
  8. That card looks very good. Anyone know the price??
  9. Quote:
    Why am I not surprised you'd pipe in with some nV sponsored blathering?

    Because, honestly, what's so exciting about this X800GT card? I'd choose SM3 over SM2 any day. Especially as more games are starting to utilize this technology.

    Quote:
    As scottchen mentions, Mercury says otherwise; guess you need another A$$umption to work with. So if you're saying this will simply improve ATi's position, I guess even more of the mainstream will be theirs.

    Sorry, but my assumptions were based on what I see around the boards about the GPU market, and other hardware sources.

    I guess I should have looked at the financial stats, but instead I went on what I was seeing a vast amount of people claiming, and therefore my assumptions are based solely on that.

    Quote:
    Well of course, your nV Fanboi guidebook has the secret decoder ring for how the GF6800LE stacks up with the GF6600GT, and why an LE isn't always an LE.

    What's so fanboish about liking a product, and speaking fondly of it? I do the same with toilet paper, so does that mean I'm a Charmon Ultra fanboy? 0_o

    <P ID="edit"><FONT SIZE=-1><EM>Edited by Gamer_369 on 07/30/05 06:41 PM.</EM></FONT></P>
  10. We get screwed hardcore; it's because of our taxes. Oh well, what can I say, we get paid more to compensate. lol

  11. Seems I may have <A HREF="http://forumz.tomshardware.com/hardware/modules.php?name=Forums&file=viewtopic&p=440041#440041" target="_new">called it right</A> after his third post. *pats own shoulder*

    The GT name doesn't surprise me either; XT/GT, same shot.

    I'd like to see an English review, but it looks to be a good $150 card. I wonder if that is the 128MB price? The 256MB does offer a big boost. It seems too big a cross-the-board performance difference to come just from the amount of mem (128MB/256MB).


    <A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
  12. The 6800 series naming scheme is much clearer for sure.

    6800UE, 6800U OC, 6800U, 6800GT OC, 6800GT, 6800GT 128MB, 6800 OC, 6800GE, 6800 Vanilla, 6800 128MB, 6800LE, 6800LMNOP, etc. Plain and simple, unlike the confusing XTpe, XT, XL, pro, vanilla, GT, and SE. :tongue:

    I'd say the only valid point you have is this: some people may be lost with the X850 series introduction: X850 vs X800, and where does the X850 XT fit in with the X800XTpe, or the X850 Pro vs X800XL vs X800 Pro. That gets a tad bit more difficult, but not much more so than NV's 6600GT-and-faster lineup.



    <A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
  13. You forgot the X850XTPE.
  14. I have to agree. The X800GT is a waste of time, unless ATI is now also marketing to the suckers.

  15. I didn't forget it, I just left it out since we all know that is ATI's current champ, but some people may mistakenly put the X850XT above the X800XTpe. I really didn't want to list the full ATI line as it would ruin my fun with Mr NV. :tongue:

    <A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
  16. Quote:
    I'd choose SM3 over SM2 any day. Especially as more games are starting to utilize this technology.

    Right, sounds like the same PR stance as people choosing an FX5200 over a GF4Ti or R8500 because it 'supports DX9'. Just goes to show people will get conned by checkbox features.
    No card in the X800GT range can play well with any serious SM3.0 features enabled; they already cripple the GF6800GT and Ultra, so how is an LE or GF6600GT supposed to cope with half the resources? Oh yes, you'll have pretty demos to watch, definitely a selling point. I'd prefer better speed in games to being able to say I can enable HDR in FartCry; so it's only 640x480, but it's enabled, right?

    I admit we are getting to the stage where SM3.0 is becoming important, as the games that will benefit most from it (and not just use SM3.0 as a tacked-on feature) are around the corner; but even then it's only for the high-end cards that it makes a difference. The low end can't use the features effectively, and the mid-range cards should be replaced by their owners with the midrange G7x and RV5xx cards by the time those games ship. Should someone buy a GF6600 over an X800XL simply because of SM3.0? I don't think anyone here would support that kind of thinking.

    Quote:
    I guess I should have looked at the financial stats, but instead I went on what I was seeing a vast amount of people claiming, and therefore my assumptions are based solely on that.

    And when you hang out in places where other Fanbois go unopposed (Anandtech), of course you'll make that assumption; however, when you look at the data instead of the 'group-think' mentality, then you can be more objective. I'll take the data, even over what people here buy, because most of the people here are in the top third of the buying market, so they wouldn't be representative of the overall market. Most people here buy nV product of late, and rightly so, as they've done a damn fine job with this line, but those choices don't match the overall market, since the overall market is the ignorant and cheap sheep that addiarmadar is speaking of, who buy X300SEs and GF6200TCs or X600SEs and FX/PCX5500s, etc.

    Quote:
    What's so fanboish about liking a product, and speaking fondly of it? I do the same with toilet paper, so does that mean I'm a Charmon Ultra fanboy? 0_o

    There's nothing wrong with speaking fondly of a product/mfr/line, but there is something wrong with ignoring all other logical thought or valid benefits of other products/mfrs. I like Matrox best, but even on the rare occasion where I can recommend one, I often temper any recommendation with the reminder that they cost a premium, because while I like them, I'm not about to throw away someone else's money when they may not see the benefit of the 200+% premium.
    Also, thread-crapping other people's threads because you're a fanboi isn't cool either. This thread wasn't to promote the card, it was an FYI, especially for people like WUSY who like to 'unlock' potential out of things. However you come in and bitch and moan about the nomenclature, and then write falsehoods for the umpteenth time to try and promote your favourite team. And when confronted, you either edit your statements to pretend you didn't say it, or you try and distort it and say you meant something else.

    Calling yourself an " <i>"enthusiast" gamer</i> " really is just laughable and does a disservice to all the true enthusiasts out there.

    Be a fan of Charmin (proper spelling, which you'd know if you were a fan :tongue: ) if you want, but it shouldn't involve misleading people about Northern or Cottonelle to validate your points.


  17. Oh, and PS, you forgot the two versions of the GF6800LE (16-pipe and 8-pipe) that got me into trouble early in the year. :lol:

    You could've also mentioned the Gigabyte X800 (faster memory clock than most X800s) and the GF6800 with 128MB of DDR or 128MB of GDDR3, etc.

    Now add in all the Gateway, Dell, etc. special versions of both of these companies' cards, and it becomes really messed up.

    Of course the Matrox cards are nice and easily numbered... but damn... the OEM Parhelia is slower than the retail! :wink:

    Forget XGI; I can't figure out what that card is doing from one week to the next because of those wonky drivers. It's a V8 one week and a 2-stroke 80cc the next. :evil:


  18. I totally understand where you're coming from, and agree that any biased thinking should be excluded when recommending or preferring a card.

    After reading my first reply, I don't think I was too bad in bitching about the x800GT. I think I was fair in saying it was a strong competitor with the 6600GT; something I originally posted and did not edit in, as I don't do that.

    Quote:
    There's nothing wrong with speaking fondly of a product/mfr/line, but there is something wrong with ignoring all other logical thought or valid benefits of other products/mfrs.

    Right, and I try to abstain from that as much as possible. But really, the only "other thought" on this X800GT was the benchmarks shown, and it does look like a strong competitor to nVidia.

    Overall, though, price will play a big role.

    Quote:
    Also, thread-crapping other people's threads because you're a fanboi isn't cool either. This thread wasn't to promote the card, it was an FYI, especially for people like WUSY who like to 'unlock' potential out of things. However you come in and bitch and moan about the nomenclature, and then write falsehoods for the umpteenth time to try and promote your favourite team.

    I'm sorry, but I don't think I've done such :(


    The truth is, I'm being criticized for speaking against the majority. Just like those many threads claiming Crossfire>SLI. I simply stated that's not necessarily the case, and ultimately time would tell, but for now, SLI still rocks. :p

    I'm sorry if I disagree with others, but I'm trying to give another point of view for people being bombarded with the same opinions.

    But overall, I think I'm being fair in my comments. I'm really not a fanboy, I assure you. I, for one, am anticipating the release of the r520, and have been for a while now. I also currently have a 9800 PRO, and am very happy with it. Lol, if anything, you can call me an ATI fanboy (but let's not get into that.) :D
  19. I didn't realize there was a 16-pipe 6800LE, besides the small number of successful 8-pipe unlocks people claimed at one time. I figured stock 8-pipe 6800LE < 6600GT < 6800.

    Yeah, Dell had that 6800GE, like Asus, that many folks mistook for 6800GTs. There are probably countless other examples through the years of Gateway/Dell exclusives.


    <A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
  20. All the suffixes make it confusing, imo... lol ^_^

    It'd be nice if there were just no suffix, GT, and Ultra. Otherwise, things are more confusing than they should be.
  21. Link is not working anymore???

  22. I do not agree with you on the SM3.0 vs SM2.0 issue. I have a 6800 128MB with 16 pipes unlocked and my card is not crushed when SM3.0 is enabled; in games there is a performance dip, but I average 30fps in Chronicles of Riddick, and the graphics look a bit crisper/smoother, better looking on the 6800 than the X800XT.

    Sometimes it is just a massive PR stunt (which both companies sh1t out), but then again there is a difference in the outcome (graphic visuals), and the performance decrease is not what you say it is. If you want an SM3.0 card, buy an nVidia, simple, and NOT an ATI card.

    This is all based on my own observations; I trust reviews about as much as I trust my neighbour: he's OK, but I would not leave him my car keys.


  23. Charmin Ultra Megaroll baby!

    :tongue:

  24. Not to mention the new stuff that's coming from ATI. Whatever day that is. :frown:
  25. Yep, link's been snuffed.

    Now says <A HREF="http://www.hkepc.com/" target="_new"><font color=red>(Coming Soon)</font color=red></A>

    LOL!

    Just change the link to page 2 and you're good to go for the rest of the review; they only locked the front door:

    <A HREF="http://www.hkepc.com/hwdb/x800gt-2.htm" target="_new">http://www.hkepc.com/hwdb/x800gt-2.htm</A>

    :evil: :cool: :evil:


  26. I saw it when you first linked it, but I still cannot read it any better.

  27. Quote:
    I have a 6800 128MB with 16 pipes unlocked and my card is not crushed when SM3.0 is enabled

    First of all, an unlocked card is not mid-range anymore; it's something around a de-tuned GT.

    Quote:
    in games there is a performance dip, but I average 30fps in Chronicles of Riddick,

    First, Riddick is OGL, but what resolution do you have to drop down to in order to get the same playable frames? Look at the GF7800GTX's performance and you don't see nearly the same drops. And be sure you're using OGL2.0++; that's where the performance hit is felt, not in the normal OGL2.0 setting;

    <A HREF="http://www.firingsquad.com/hardware/chronicles_of_riddick_performance/page6.asp" target="_new">http://www.firingsquad.com/hardware/chronicles_of_riddick_performance/page6.asp</A>

    To me that's just enormous, far worse than SM3.0 in FartCry and SplinterCell.

    Quote:
    and the graphics look a bit crisper/smoother, better looking on the 6800 than the X800XT.

    A bit doesn't make it a killer app, just in the same way that DX8.1 didn't make the R8500 a better choice than the GF4ti.

    Quote:
    and the performance decrease is not what you say it is

    The drop in FartCry is so large that doing HDR on the X850XT using 3 passes would bring it to about 80-90% of the GF6800U's HDR performance; the GF6800U falls from 51fps to 19 @ 16x12. To me that's a crippling impact, and the same goes for other resolutions.

    <A HREF="http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page13.asp" target="_new">http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page13.asp</A>
    <A HREF="http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page14.asp" target="_new">http://www.firingsquad.com/hardware/nvidia_geforce_7800_gtx/page14.asp</A>

    Quote:
    This is all based on my own observations; I trust reviews about as much as I trust my neighbour: he's OK, but I would not leave him my car keys.

    And my position is that I trust reviewers with reputations to lose over other people's perceptions. At least they use FRAPS, and not just a feel for what may or may not be playable, because everyone's acceptable levels are different.

    If you can show me a 'killer app' then I'd promote it, but right now it's a look-ahead feature, and the non-modded GF6800s can't handle it, let alone the GF6600GTs. Of course that's just my opinion, so you can disagree with it as you see fit.


  28. LOL! :cool:

    Use a translator. :tongue:


  29. Well, your version of crippling is far different from my version; a normal PAL TV runs at 25 FPS. I get 30-35 min FPS in CoR (at OGL2.0++ settings). I play a lot of games and I tinker with the settings to always get the best visuals, everything to max basically; if it can't run on max, then it's time to upgrade.

    As for FartCry, it runs in SM3.0 (fully patched) perfectly fine with a playable framerate of 25+. I hear a few people with no experience with an SM3.0 card talk about this performance drop too much; when I send my 6800 128MB to you, then you will see.

    I have a Hyundai L90D+, so my res is always at 1280x1024 with AA & AF at 4x/4x. I run CoR at 0x/0x.

    The only areas where this card sucks: it only has 128MB, so I cannot run Doom 3 on the Ultra settings meant for 512MB cards, and you need RivaTuner to unlock all the pipes. But all other settings are good.

    Oh, I never said SM3.0 cards are killer; I said they're better than SM2.0. They have that extra box with a tick which says enable, lol!

    No offence, GGA, but if you trust every review you read, you will never buy anything new. Are you saying you trust www.firingsquad.com to use every up-to-date driver etc. to get the best results? And not all hardware is the same.

    Just seen it: this review was done in February. New drivers work magic; you would be surprised what an increase of 5+ fps does to a game.


    It's kind of funny: my vanilla 6800 decimates a 6800 Ultra according to the results of this review. I have a 6800, but my card magically does better than this review says, lol.

    All of the above is from my actual experience with my own setup. (This is not an attack on you, GGA; I value your opinions and enjoy the discussions you bring up.)

    But I do agree with you: the mid-range 6600GTs and standard 6800 cannot handle SM3.0 games at all, unless they have at least 16 pipelines open with a 256-bit memory interface.

    Also, the specs of the machine need to be considered; my rig is not magic, but it performs well in games.

  30. As a 6800U owner who tends to think [H] paints a good playable-settings picture, you have me curious. If anything, I'd say other reviews giving average fps are almost misleading as to actual best playable settings. Yet you are getting better playable results than what FS painted?

    Quote:
    It's kind of funny: my vanilla 6800 decimates a 6800 Ultra according to the results of this review

    How is that? Please explain. At 1280x1024 4X/16X they averaged 70 fps in Training. What do you get at 4X/4X?

    It drops to under 33 fps average with HDR on. Are you saying you run FarCry every day at 1280x1024 with HDR on and stay above 25 fps all the time?

    I have not tried HDR in any game after my latest driver update. I tend to demo HDR for fun because I can, but a 6800U isn't up to the task of using it IMO. I'd rather keep the resolution up and the framerates up, with AA enabled.

    Have you tried this <A HREF="http://downloads.guru3d.com/download.php?det=830" target="_new">Farcry Benchmark Utility</A>? It's a fun little tool for comparisons, although in the end it will take FRAPS benchmarks of actual gameplay to get the real picture. 1280x1024 4X/16X would be my preferred gaming settings, but I feel that in some of the most demanding levels I need to reduce it to 2X AA in FarCry or the framerates take away from the gameplay. Even with 2X AA there are areas where it can't stay above 30fps. (I tend to always leave FRAPS running while gaming.) But in most maps I can get away with 4X AA.


    <A HREF="http://service.futuremark.com/compare?3dm05=658042" target="_new">3DMark05</A> <A HREF="http://service.futuremark.com/compare?2k3=3781954" target="_new">3DMark03</A>
  31. The "25FPS must be fine because that's what TV is" is commonly trotted out, but IMO it doesn't apply to games. In most games I can 'feel' Lag if they're running at sub-60FPS. It's obviously down to the way they have the game set up to take your inputs or something, as DOOMIII stays responsive even if it dips below 30FPS (although you can see it doesn't <i>look</i> smooth, it <i>feels</i> it). In most games the responsiveness of the controls seem to be tied into the FPS you're getting, and 30 simply isn't enough. HALO was absolutely terrible in this regard. But it was crap anyway, so who cares...

    Obviously this is just my own opinion, but it does lend support to GGA's argument that we all have different limits on what is acceptable.

  32. How can I explain the results of my system, apart from: they rock.

  33. Halo was pretty good on my FX5600, and that averaged 30fps. I still play Halo online today under the name 7 Sins with the 6800-spec machine. I still think Halo is an excellent game and I never experienced lag; it could be your processor which lets you down. I have noticed anything under 20fps gets too laggy; 25fps is the minimum to stop lag in games. That's just my opinion, and this is what I have noticed when playing every game on the PC under the sun.

    Mouse smoothing is a b1tch, so always switch that off; that reduces lag a lot. And start Windows with the bare minimum of processes running; this increases FPS a LOT.

  34. X800GT = 6600GT

    But GeForce still has the SM3.0 advantage...

  35. I had a Ti4600 at the time, and I found it frankly ridiculous how poorly the game performed. Though it was over 30FPS most of the time, it just felt horribly unresponsive. I tried loads of stuff, tweaks and drivers, before I suddenly realised how vastly over-rated the game is, and went and played something more worthwhile.

    If you like bland graphics, boring levels, stupid AI, unsatisfying weapons and overall woeful performance, then I can see how HALO would appeal :tongue:

    If they'd not completely fubar'd the conversion, maybe I would've got more into it, but it simply didn't seem worth the effort to me.

  36. Well, I disagree; it's not just FS that shows those drops in performance. I would never go on just one reviewer anyway; that makes less sense (almost like people who post Anandtech stuff here).

    As for what's fluid and acceptable, we've discussed this a few times, and my position is that it depends on the game. Creepers like SplinterCell and some parts of FartCry don't need high sustained framerates, but games like UT2K4 or HL2/CS:S need good elevated framerates to ensure fluidity in things like jump-strafing, 180-degree turns, etc.

    When playing single player I also think it matters less because you won't know what you're missing compared to the guy who's fraggin' yer a$$ in multiplayer.

    I don't disagree it's a nice feature, but if enabling it on even the GF6800U using FP16 is almost the same as running it on the X850XT with 3 passes using FP24, then really, the feature and its support in this generation are not that 'impressive', and that keeps it from being anything but a tie-breaker IMO.

    Now, as games/engines start appearing that were designed around SM3.0 features, we may see a true advantage; until then the scenes where it's used are limited, and the impact isn't worth it IMO.

    Only when you'd be willing to take a performance hit for it, like taking a GF6600GT over an X800XL or an X850XT over a GF7800GTX, does something become a must-have feature. Just like FP32 couldn't save the FX line when the performance just wasn't there.


  37. I disagree with you totally; it does not lag, it's just your machine setup or something you have done wrong somewhere, lol. I bet you have mouse smoothing switched on. Also, Halo relies heavily on the CPU, and to be honest the Ti4600 was not the cutting edge at the time of this game's release; that was the 5900 Ultra and the super cool 9800XT (which I used to dream about).

    Halo is a resource hog, and the single-player missions are OK but multiplayer is excellent; make sure you have a low ping and a good broadband supplier.

  38. What can I say, I agree with you on all points said there. But if you look at the 7800GTX benchmarks from that site (I forget the name), the results just do not look consistent enough, and I could not see a reason why; I would say there is a driver problem. Other sites which have reviewed it had some better results, but two systems never act the same.

  39. Quote:
    X800GT = 6600GT

    That's being generous to the GF6600 IMO.

    The X800GT 256MB > X800GT 128MB (256-bit) > GF6600GT.

    Only in D3 was there any change of positioning, and that's not enough to make it a tie.

    If you can explain the GF6600GT 'advantage' you speak of, then fine; otherwise it's the typical checkbox feature, because when enabled the GF6600GT would have trouble competing with an X600XT or plain X700, and the number of games that support it is minimal, and even then they are tacked on for show. Things may change with the advent of those games that are optimized for it from the ground up, but otherwise, there's no 'advantage' yet that falls outside the 'FX advantage' I mention above.

    To me, from the initial review, a better positioning of the X800GT would be (all unmodded/un-OC'd):

    GF6800LE < X700PRO < GF6600GT < X800GT < GF6800 < X800 < X800PRO < X800XL/GF6800GT, etc.

    From that +/- your preferences for features.

    Just my two frames' worth as always.


  40. The GF7800GTX isn't the issue, though I'd agree on immature drivers for that; I was really focusing on the hit to the GF6800 series, whose drivers should be mature by now even for relatively new games like Riddick.

  41. I agree the X800GT will easily outpace the 6600GT, and if ATI is marketing this kit at the same price as a 6600GT, nVidia will lose the mainstream crown.

  42. When I looked at the date on the 6800 review, it said Feb 2005 at the top; can you double-check for me please?

  43. I based my "X800GT = 6600GT" on the fact that in the HKEPC review, most benchmark numbers show a very close race between the two GPUs.

    But there are only 2 real-world benches in this review: D3 and HL2. I agree that in high-res/FSAA the X800GT seems to have a small lead, but I doubt we will really notice the difference while playing. I can't wait to read the HardOCP review of this new X800 flavor.

    And since this X800 is more like a stripped-down X850, will it be matched with an X800 or X850 CrossFire edition? Another confusing choice for buyers in prospect...

  44. Yeah, the Riddick review (one of the only ones with OGL 2.0 vs 2.0++) is that old, but I don't know of many other reviews that show the difference. Most review the OGL2.0 path alone, because they want to compare nV to ATi, not nV to nV.


  45. LOL!

    They've updated the benchmarks again, with the 128MB and 256MB on the same list (and page 4 leads to page 4 again).

    To get to the benchies now you must go directly:

    <A HREF="http://www.hkepc.com/hwdb/x800gt-5.htm" target="_new">http://www.hkepc.com/hwdb/x800gt-5.htm</A>

    And while you say they are close: in HL2 the X800GT has a 15-20+% lead, while the D3 scores only truly favour the GF6600GT by less than 10%, and only in one test is it above the margin of error. Max playable is the same for both in D3, unless people think avg ~40fps is playable. And max playable for HL2 goes to the X800GT. If experience is anything to go by, D3 will remain the exception to the rule of X800GT > GF6600GT. But as you say, the margin may be very little depending on the settings and application. For me the true test will be minimum FPS, since that's usually what's affected most by both fast memory bandwidth and added memory. That may be noticeable, but we'll have to wait and see what reviews like [H]'s reveal with their histograms.

    As for Crossfire, it'll probably be possible, heck it's possible with about every card now, but whether there would be any benefit is more of a question. 2 X800GTs will likely cost noticeably more than a single X800XL, and adding an X800GT to an XL would have no net effect. Even for plain X800 buyers I doubt there'd be the motivation. As for confusion, like was mentioned before, no more than already exists, and those interested in SLI and Crossfire should do their research first, otherwise they'll get burnt just like all those people who bought FX5200XTs and R9200SEs just because they had a whole kick-ass 256MB of memory.


  46. I'm not even going to bother to mention that whenever "ATI's" so-called crown-stealer mainstream card is out, we might already be looking at the mainstream and budget-end cards for the 7000 series. GGA, you mentioned something about SiS cards, previously?? If it was to make a point, it really didn't, because the implication that newer things always come out doesn't affect anyone unless they're to come out relatively soon. And to me, it would be stupid of nVidia to have their 7800 GTX card (and the GT) out competing with the entire r520 series.

    By that, I am implying that when ATI *does* release their r520, it will be with the other cards of the series, with the exception of perhaps the high-end card. Why do I think this? It just seems stupid that they'd *only* release the r520 card... I mean, nVidia could pull it off since it was before any competition, but now that the competition is there, I think ATI should go full force.

    But of course that's just my speculation. And despite everything I've seen, I personally wouldn't invest in either an x800XT or a 6600GT. Although investing in a 6600GT would be more feasible for me, since it contains SM3, which is the same feature as the "next gen." cards. Even if performance isn't as good, if I was looking for a card to last me longer, that's the one I would personally choose. And I want it to be clear that I'm not putting down the x800XT. I'm just simply stating that with SM2, it's not something I'm particularly interested in investing in now.

    You might want to, though, as well as many others. But I personally do not. So please don't tell me how my personal preferences are flawed.
  47. ATi's mentioned that the companion parts (RV530, and others) will appear close to the R520's launch. Expect the R520 to come in whatever flavours it comes in, and ATi will likely launch whatever it can to fill the gaps (although don't expect an R530-like card immediately, even if it were ready to ship).

    As for SM3.0, I go back to the FX statement. Features mean little if the accompanying power doesn't make them worthwhile. So far no one can show a single thing with SM3.0 that can't be replicated with ATi's feature set in any real-world app.

    Once again it's great in theory, just not as impressive until the future titles ship, and buying a card for that far into the future is a risky proposition.


  48. Quote:
    ATi's mentioned that the companion parts (RV530, and others) will appear close to the R520's launch. Expect the R520 to come in whatever flavours it comes in, and ATi will likely launch whatever it can to fill the gaps (although don't expect an R530-like card immediately, even if it were ready to ship).

    Yay, we finally agree! So once this takes place, I think we will be seeing more budget-oriented cards from nVidia, such as a 7800, 7600, 7200 (I don't think that's 100% confirmed yet, though, so don't quote me on it).

    In other words, I would probably wait to buy a mainstream card until after they come out with the 7000 series or r520; if not to get a better card of that series for around the same price as this gen., then at least to see this gen. decrease in price.

    Quote:
    As for SM3.0, I go back to the FX statement. Features mean little if the accompanying power doesn't make them worthwhile. So far no one can show a single thing with SM3.0 that can't be replicated with ATi's feature set in any real-world app.

    Doom 3 certainly seems to show the advantage...

    Quote:
    Once again it's great in theory, just not as impressive until the future titles ship, and buying a card for that far into the future is a risky proposition.

    What? The x800XT, the 6600GT, or both? Because yea, I agree with 'ya.

    But when even the next gen. cards are only going to be SM3, it somewhat shows that it will do us good for a while. Not necessarily years and years from now, as SM4 is to come out with LDDM for Vista late next year, but at least until then.

    Overall, price is another huge factor, or at least it would be for me. In fact, price is probably the most important factor for the budget and mainstream cards... if it wasn't, then you'd easily find yourself dishing out hundreds of dollars for the high-end. Now don't confuse my statement; I'm not saying performance isn't important. Just saying that price is heavily considered.
  49. Kinney, have you been going to anger management classes? No joke, you have come a long way in a short period of time. Keep up the good work!
