Give some comments about these GCs please


What should I get?

Total: 14 votes

  • PowerColor Radeon X800Pro [AGP]: 79 %
  • ASUS N6600GT/TOP/TD [AGP]: 22 %
January 3, 2006 8:08:46 AM

PowerColor Radeon X800Pro and ASUS N6600GT/TOP/TD. Both are AGP.

I want to upgrade my FX5200 Magic to one of these. Help me choose one too please. I'm a gamer, btw.

My current rig:

P4 2.4GHz
1.5GB RAM
550W PSU
GeForce FX5200 Magic [OC]


January 3, 2006 8:13:53 AM

I would say get neither, go with an X800GTO2 or GF6800GS because they'll beat those other two options you have.

However, from those you've picked, the X800Pro will be the top card in just about every game out there. If they're the same price it'd be the one to go with, but check the prices of those other two cards as well just to be sure. You might also be able to mod the X800Pro if it's a ViVo card.
January 3, 2006 8:48:34 AM

Quote:
I would say get neither, go with an X800GTO2 or GF6800GS because they'll beat those other two options you have.


He said AGP. The GTO2 is PCI Express only and the 6800GS AGP is pretty rare.
January 3, 2006 9:52:38 AM

Quote:
The GTO2 is PCI Express only


SapphireGTO or GTO2, same difference. Nit-picking really; in any case, SSDD.

As for the GF6800GSs, they aren't that rare anymore:

http://www.us.ncix.com/products/index.php?sku=17411


Either way both would still be better, even as a plain GTO, which is still more moddable than the X800Pro. Even if you have to wait, it's probably worth it; just ask the bunch of people here who've already modded their GTOs.
January 3, 2006 10:14:07 AM

Quote:
The GTO2 is PCI Express only


SapphireGTO or GTO2, same difference. Nit-picking really; in any case, SSDD.

As for the GF6800GSs, they aren't that rare anymore:

http://www.us.ncix.com/products/index.php?sku=17411


Either way both would still be better, even as a plain GTO, which is still more moddable than the X800Pro. Even if you have to wait, it's probably worth it; just ask the bunch of people here who've already modded their GTOs.

No. There is a BIG difference between a GTO and a GTO2, since the GTO2 has the R480 core with 16 pipelines. So by flashing the BIOS you get an X850XT PE for the price of a GTO2. 100% success rate.
January 3, 2006 10:51:28 AM

Quote:

No. There is a BIG difference between a GTO and a GTO2 since the GTO2 has the R480 core with 16 pipelines.



And the Sapphire GTO has an R420 with 16 pipes. The difference is minimal, and pointless since, like I said, BOTH modded or unmodded would be better than a plain X800Pro. So regardless of success rate, a plain GTO is still a better choice.


Quote:
So by flashing the BIOS you get a X850XT PE for the price of a GTO2. 100% success rate.


Actually NOT 100% success rate, since you can unlock them, but many people still experience artifacting. I love the 100% number because when people end up with a bum mod they then complain as if it were guaranteed by the MFR. Anywhoo, yes it's highly successful, but even without it, still better than the other two cards.
January 3, 2006 11:01:33 AM

Quote:

Actually NOT 100% success rate, since you can unlock them, but many people still experience artifacting. I love the 100% number because when people end up with a bum mod they then complain as if it were guaranteed by the MFR. Anywhoo, yes it's highly successful, but even without it, still better than the other two cards.


Well, show me someone who has failed to unlock a pipe on the GTO2.

The artifacts are because people OC the cards too high. All cards can be unlocked, but not all cards can handle the OC, for several reasons. The most common reason is poor case cooling.
January 3, 2006 11:56:24 AM

Quote:

Well, show me someone who has failed to unlock a pipe on the GTO2.

The artifacts are because people OC the cards too high.


And that's the point.

You said "So by flashing the BIOS you get a X850XT PE for the price of a GTO2. 100% success rate."

So unless you overclock it to the X850XTPE level then it's not a 100% success rate, a$$clown! Or am I not allowed to nit-pick like you? :roll:


EDIT: Instead of bringing this to the top of the pile again, not to draw attention to your nickname, as you told the teacher, I'll simply edit this post and re-state the fact that I said you CAN unlock them but people are experiencing artifacts. Therefore not 100% success to X850XTPE like you imply.
January 3, 2006 1:56:37 PM

Quote:

Well, show me someone who has failed to unlock a pipe on the GTO2.

The artifacts are because people OC the cards too high.


And that's the point.

You said "So by flashing the BIOS you get a X850XT PE for the price of a GTO2. 100% success rate."

So unless you overclock it to the X850XTPE level then it's not a 100% success rate, a$$clown! Or am I not allowed to nit-pick like you? :roll:
You were referring to the unlocking procedure:

Quote:
Actually NOT 100% success rate, since you can unlock them
January 5, 2006 3:10:43 PM

Well, in just about any benchmark, synthetic or real-world, that you can think of, the X800pro would hand the 6600GT its rear end.

However, that's clearly not the whole story of which card is a better thing to get. First off, the big "if" is how much video RAM each card has. Given that you're comparing two rather different cards, I would presume that you might've found the more expensive 256MB version of the 6600GT, as all X800pros (correct me if I missed one) have 256MB of video RAM. While the advantage of taking a 512MB video card might be debatable today, the advantages of a 256MB card are not. Most next-gen games we'll be seeing, as well as ones already released, will require 256MB of video RAM to properly run with maxed texture and detail settings. Otherwise, not enough texture data can be loaded onto the card itself, resulting in the card having to shuffle textures back and forth from the main memory, which can be especially painful for an AGP card.
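To put rough numbers on that texture-budget argument, here's a back-of-the-envelope sketch. The buffer layout, the 4x AA sample count, and the helper name `framebuffer_mb` are all illustrative assumptions, not measured figures for either card:

```python
# Rough VRAM budget at 1600x1200 with 4x AA (illustrative assumptions only).

def framebuffer_mb(width, height, bytes_per_pixel=4, aa_samples=4):
    """Front + back color buffers (AA-sized) plus an AA-sized depth buffer."""
    color = width * height * bytes_per_pixel * aa_samples * 2
    depth = width * height * 4 * aa_samples
    return (color + depth) / 2**20

card_mb = 256
fb = framebuffer_mb(1600, 1200)
print(f"framebuffer: {fb:.0f} MB, left for textures: {card_mb - fb:.0f} MB")
```

Under those assumptions a 256MB card keeps roughly 170MB free for texture data, while a 128MB card would be left with only about 40MB before any texture is loaded, which is exactly when the AGP bus starts shuffling textures from main memory.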

Then again, there's also the fact that the 6600GT comes with support for shader model 3.0, while it's only shader model 2.0 extended for the X800pro. Normally, the differences are fairly small in real-world terms (as far as I can tell, it's only that the X800pro can't use texture filtering and floating-point color at the same time, enough reason not to use floating-point color). If no one bothers to program for a feature, it's no better than an older technology; SM 2.0 extended has been ignored, so it's effectively as good as SM 2.0. Point for the GeForce.

Lastly, of course, there's the issue of price. Comparing a standard (128MB) GeForce 6600GT to a Radeon X800pro would be an insane idea, given that you'll see a massive gap in pricing, which may approach as much as 100%. The difference is smaller with the 256MB version, depending on the availability of each card at each particular store and time. But usually, the GeForce will be the cheaper card. So even if it's weaker, it's a matter of paying for what you get here.
January 5, 2006 6:18:19 PM

Quote:
(as far as I can tell, it's only that the X800pro can't use texture filtering and floating-point color at the same time, enough reason not to use floating-point color)


The thing about "FP-Blending" is that it can be reproduced using other techniques; for the ATis it would require 3 passes to achieve the same effect, and I'm pretty certain the X800Pro could do 3 passes in nearly the time it takes the GF6600GT to render a scene in FartCry with HDR enabled. Of course you still need to code for that too.

Quote:
if no one bothers to program for it, it's no better than an older technology; SM 2.0 extended has been ignored, so it's effectively as good as SM 2.0. Point for the GeForce.


FartCry does use SM2.0 extended; that's where you get geometric instancing support, and a lot of games are adding that since the benefits apply to all R3xx series cards and above and all GF6 series cards and above. The edge to the GeForce in that case is negligible at best.

Quote:
Lastly, of course, there's the issue of price. Comparing a standard (128MB) GeForce 6600GT to a Radeon X800pro would be an insane idea, given that you'll see a massive gap in pricing,


Except when the Sapphire X800Pros were selling for the same price as the GF6600s in the middle of last year. Now they're rare as sin, but second-hand, OEM, or refurb deals may create the same scenario, so 'insane' is a little strong.

Quote:
which may approach as much as 100%.


Now that's insane talk! :p 

Quote:
The difference is smaller with the 256MB version, depending on the availability of each card at each particular store and time. But usually, the GeForce will be the cheaper card. So even if it's weaker, it's a matter of paying for what you get here.


Yeah, but price/performance will likely still favour the X800Pro, and also give you the benefit of having playable framerates longer IMO.

Of course like I originally said, the X800GTO or GF6800GS would be the better choices for price/performance.
January 5, 2006 7:37:06 PM

The X800 PRO will absolutely RUIN the 6600 GT.
It would still ruin it if the 6600 GT had 1024 megs of ram and the X800 PRO had 128 megs of ram.

The interface (AGP or PCI-E) makes no difference, more RAM or no.

SM 3.0 is maybe worth less than a half-point to the 6600 GT, especially since my 6800 Ultra is not fast enough to run the SM 3.0 HDR in Far Cry. The 6600 GT doesn't have the horsepower to do squat with SM 3.0.
It's a checkbox feature, nothing more.

The X800 PRO has 12 pipelines and a 256-bit memory interface, the 6600 GT has 8 pipes and a 128-bit interface. No contest. Even if you were a total Nvidia fanboy, you'd be supremely retarded to choose a 6600 GT over an X800 PRO. The X800 GTO would also kill a 6600 GT.

I'd even be hard pressed to choose the 6600 GT over an X800 GT, but at least that's a much closer race. The X800 GT has the 256-bit memory interface (good for AA & other eye candy), but the 6600 GT has very efficient architecture and, in such a close race, SM 3.0 might come into play as a factor.
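The pipe-and-bus gap above can be turned into rough theoretical throughput numbers. The clocks below are the commonly cited reference clocks and should be treated as assumptions (actual boards, especially AGP variants, shipped at various speeds):

```python
# Theoretical fill rate and memory bandwidth (reference clocks assumed).
cards = {
    # name: (pixel pipes, core MHz, memory bus width in bits, effective mem MHz)
    "X800 Pro": (12, 475, 256, 900),
    "6600 GT":  (8,  500, 128, 900),
}
for name, (pipes, core, bus, mem) in cards.items():
    fill_gpix = pipes * core / 1000          # Gpixels/s
    bw_gb = (bus // 8) * mem * 1e6 / 1e9     # GB/s (decimal)
    print(f"{name}: {fill_gpix:.1f} Gpix/s fill, {bw_gb:.1f} GB/s bandwidth")
```

On paper that's roughly a 40% fill-rate edge and double the memory bandwidth for the X800 Pro, which is why the bandwidth-heavy settings (AA, high resolutions) separate the two cards so decisively.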
January 5, 2006 8:54:25 PM

Quote:
but the 6600 GT has very efficient architecture and, in such a close race, SM 3.0 might come into play as a factor.


Yeah, the only thing I can think of where the SM3.0 might play a factor is in per-pixel specular lighting like in SplinterCell and FartCry, where the code is gruesome, but then again we're talking about maybe a single-digit percentage benefit in an area where the X800GT and GF6600GT would still be chugging. So maybe 11fps versus 10fps for that benefit IMO.
January 6, 2006 10:50:22 AM

OK, so in real-world gaming the ATI is the better choice.
January 6, 2006 11:14:17 AM

the best AGP option is the Geforce 6800 GT, it's the counterpart of X800 pro which I think is much better.
January 6, 2006 11:59:21 AM

Quote:
the best AGP option is the Geforce 6800 GT, it's the counterpart of X800 pro which I think is much better.


How did you come up with that? The X850XTPE, X800XTPE, X850XT, X800XT and the 6800U are all better than the 6800GT performance-wise. Maybe prices are different where you live, but currently in North America anyway, better, cheaper, and more available than a 6800GT is an AIW X800XT. I ordered one yesterday from Buy.com for $259.99 less a $30 rebate. $230 shipped free, no tax. Close to the 6800GS price and much better performance. If priced the same as the GS or Pro, then sure, the 6800GT is a nice card. But the only 6800GT AGP on pricewatch now is $508. 8O
January 6, 2006 12:06:30 PM

Yes, in real world gaming the X800 pro is far superior to a 6600GT.
January 13, 2006 3:56:46 AM

Sorry to resurrect this topic, but I neglected to post to some things directed at myself. I’m not sure what the general atmosphere for resurrecting topics on THG’s Forumz is, but usually, one week isn’t all that bad if it’s brought up for good reason. Again, I apologize in advance.

Well, prices definitely have changed, at least in the USA. Unfortunately, this is AGP in question, so the only improvement of note would be the new GeForce 6800GS, which isn’t all that badly priced, at about $225US. (link)

And on another note, how is it that some are taking me to be an nVidia fanboy? I do know that my post regarding the two cards did perhaps give more credit to the 6600GT than it was due, but that would primarily be because the performance advantage of the X800pro over the 6600GT needed very little to be said on it. For any interested parties, I use a Radeon X800XT anyway.
Quote:
The thing about "FP-Blending" is that it can be reproduced using other techniques, for the ATis it would require 3 passes to achieve the same effect, and I'm pretty certain the X800Pro could do 3 passes in nearly the time it take the GF6600GT torender a scene in FartCry with HDR enabled. Of course you still need to code for that too.

“FartCry...” I’ll have to remember that one. I was fairly certain that the X series cards could handle filtering and blending in floating-point mode, as I’ve seen what are clearly examples in their own tech demos. (being the video card junkie I am, I have all of ATi’s tech demos for the Radeon 7200 through the X850, the highest that will run on my X800XT) However, as I said, if no one bothers to program using a feature, it’s worthless, and thus far, I’ve seen no one use it.

However, recent information regarding the upcoming Elder Scrolls IV: Oblivion found at Beyond3D suggests that the game will be an exception. Apparently, it will not only use floating-point color for its HDR method, it will possibly even work with MSAA on any SM 2.0-compliant card, though what was said makes it inconclusive. It was noted that, for the abovementioned reasons, “blending” was impossible to do without a significant performance hit, so it’s likely what we’ll see is simply a failure of some blending effects, such as alpha blending... And honestly, when’s the last time anyone really cared that any game got alpha blending correct? If memory serves me correctly, it was the acceptance of this problem that led to the development of special AA techniques like nVidia’s “transparency super-sampling” and ATi’s “selective AA.”

Quote:
FartCry does use SM2.0 extended; that's where you get geometric instancing support, and a lot of games are adding that since the benefits apply to all R3xx series cards and above and all GF6 series cards and above. The edge to the GeForce in that case is negligible at best.

It does? I honestly wasn’t aware of that; the game looked the same on a Radeon 9600XT as it does on an X8x0 card. Then again, I’ve heard some suggestions that the “XT” 9 series cards were SM 2.0 extended as well... And I have no other truly SM 2.0 cards; I think the “PCI” part of my old GeForce FX 5200 kinda nullifies things, as it won’t even run Far Cry, Halo, or any other game with a SM 2.0 mode.

Quote:
Except when the Sapphire X800Pros were selling for the same price as the GF6600s in the middle of last year. Now they're rare as sin, but second-hand, OEM, or refurb deals may create the same scenario, so 'insane' is a little strong.

And lo and behold, we now have X800GTOs (128MB version) from Sapphire selling for close to the same price as 6600GTs right now! :D  I find that to be a pleasant surprise, now if only they became available for AGP...

Quote:
Now that's insane talk! :p 

Well, it is what I’ve seen at points. It’s much better now, but primarily because the 6600GT seems to be drying up and rising in price. But at some points, I noted that the X800pro would cost in the $240US neighborhood while the 6600GT was around $120US.

Quote:
Yeah, but price/performance will likely still favour the X800Pro, and also give you the benefit of having playable framerates longer IMO.

Of course like I originally said, the X800GTO or GF6800GS would be the better choices for price/performance.

Indeed; my very first line was the comment that as far as performance goes, the X800pro can typically hand a 6600GT its rear end. I guess I didn’t mean to give the illusion of emphasis to the other qualities of the 6600GT so much, but in retrospect, it seems that I did considering how much space I devoted to them.

Quote:
The X800 PRO will absolutely RUIN the 6600 GT.
It would still ruin it if the 6600 GT had 1024 megs of ram and the X800 PRO had 128 megs of ram.

The interface (AGP or PCI-E) makes no difference, more RAM or no.

SM 3.0 is maybe worth less than a half-point to the 6600 GT, especially since my 6800 Ultra is not fast enough to run the SM 3.0 HDR in Far Cry. The 6600 GT doesn't have the horsepower to do squat with SM 3.0.
It's a checkbox feature, nothing more.

The X800 PRO has 12 pipelines and a 256-bit memory interface, the 6600 GT has 8 pipes and a 128-bit interface. No contest. Even if you were a total Nvidia fanboy, you'd be supremely retarded to choose a 6600 GT over an X800 PRO. The X800 GTO would also kill a 6600 GT.

I'd even be hard pressed to choose the 6600 GT over an X800 GT, but at least that's a much closer race. The X800 GT has the 256-bit memory interface (good for AA & other eye candy), but the 6600 GT has very efficient architecture and, in such a close race, SM 3.0 might come into play as a factor.

Well, as I commented, I do not disagree with the fact that the X800pro would slaughter the 6600GT in a fair performance fight. As for the interface comment, it can make a difference where texture reading is concerned; a PCI-e interface not only has about double the overall bandwidth, its multi-lane serial nature also makes it far more flexible at re-proportioning bandwidth on the fly, to accommodate reading textures from the main RAM, as would be the case if the game has to buffer more in textures than the video card can hold.
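A quick sanity check on those bus figures, using the published peak rates (AGP 8x is roughly 2.1 GB/s shared between directions; PCIe 1.0 is 250 MB/s per lane, per direction):

```python
# Peak bus bandwidth from the published interface specs.
agp8x_gb_s = 2.1                          # AGP 8x, shared up/down
pcie_lane_gb_s = 0.25                     # PCIe 1.0, per lane, per direction
pcie_x16_per_dir = 16 * pcie_lane_gb_s    # 4.0 GB/s each way
print(f"AGP 8x:   {agp8x_gb_s:.1f} GB/s total")
print(f"PCIe x16: {pcie_x16_per_dir:.1f} GB/s per direction, "
      f"{2 * pcie_x16_per_dir:.1f} GB/s aggregate")
```

So "about double" holds per direction, and the full-duplex nature is where the extra flexibility for texture traffic over the bus comes from.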

The SM 3.0 is a wild card, heavily dependent on the game. As far as I’ve actually been able to find out, it really offers nothing truly improved over SM 2.0 extended, and it merely makes me bitter that so many game developers have slighted those with SM 2.0 extended cards, making a SM 3.0 path instead. (and it’s not just that I use an X800XT) That path often only offers things that could’ve been accomplished simply with SM 2.0; from all information I’ve found, it seems that the abovementioned Oblivion game is a prime example; it will apparently incorporate all the soft-shadow and HDR technologies seen before and more, and it will all run in SM 2.0. (it will run much faster with SM 2.0b or SM 3.0 support, as SM 2.0 will require many of the things to be done in many passes instead of one) And again, what amount of power is really necessary to do something with SM 3.0, just as with how much power is necessary to use so much video RAM, is dependent on the game, not the card.
January 13, 2006 11:16:59 AM

If you can find an X800Pro (they're getting rarer all the time), get that; otherwise:
1. 6800GS
2. X800GTO
3. X800GT
4. 6600GT
in that order.
January 13, 2006 11:19:23 AM

Quote:
Well, prices definitely have changed, at least in the USA. Unfortunately, this is AGP in question, so the only improvement of note would be the new GeForce 6800GS, which isn’t all that badly priced, at about $225US.

The 6800GS in AGP is going to be clocked lower than the PCI-e version. We already have benchmarks for the 350/1000 12-pipe 6800 in the Asus V9999 Gamer Edition. In general, the X800 Pro still beat out that Asus, meaning an X800 Pro should still be able to defeat a 6800GS in more games than it loses. If priced the same, the X800 Pro would provide more bang for the buck. Neither competes with the similarly priced X800XT or X850XT when on sale. Your X800XT will be above any of these other cards. If the 6800GS feature set pushes you over the edge compared to an X800 Pro or X800GTO, nobody would blame you; they should be very close. It's good to see a 6800GS come out in AGP, but the clock speeds were a huge disappointment, which makes me believe that NV still has very little to offer the AGP upgrader unless they MUST have NV. Besides the 6600GT when found for $120-140, I'm thinking ATI still owns the AGP upgrade market.
January 14, 2006 1:30:46 AM

Quote:
If you can find an X800Pro (they're getting rarer all the time), get that; otherwise:
1. 6800GS
2. X800GTO
3. X800GT
4. 6600GT
in that order.


Before modding, the list is different: for AGP, X800GTO > GF6800GS because the GS is underclocked in AGP form. There is the advantage of unlocking pipes on the AGP version, but that's hit-or-miss depending on whether they are locked or cut. Also, AGP doesn't OC as well as PCIe because it's 130nm vs 110nm.
January 14, 2006 2:44:29 AM

Quote:

FartCry...” I’ll have to remember that one.


From an early nV application-detection cheat that was discovered by renaming the executable to FartCry.

Quote:
I was fairly certain that the X series cards could handle filtering and blending in floating-point mode, as I've seen what are clearly examples in their own tech demos.


The X8 series cards have FX16 alpha blending and FP16 texture and only FX16 in the TMU (I'm not sure, but I think that's the case with the TMUs even on the R520; not enough info on it). It's not a major limitation, just a technical difference. The alpha blending is the killer when trying to use OpenEXR-based HDR.
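The practical difference between a fixed-point and an FP16 color channel is easy to show: a [0,1] fixed-point format clamps overbright values, while half-precision floats keep the range HDR needs. A minimal sketch using NumPy's `float16` as a stand-in for the hardware format:

```python
import numpy as np

# FP16 keeps overbright HDR values; a [0,1] fixed-point channel clamps them.
bright = np.float16(250.0)                 # an overbright light intensity
print(float(np.finfo(np.float16).max))     # largest finite FP16 value: 65504.0
print(float(np.clip(bright, 0.0, 1.0)))    # fixed-point-style clamp: 1.0
```

That headroom is what OpenEXR-style HDR relies on, and why losing FP blending in particular hurts: the blend stage is where those overbright values get accumulated.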

Quote:
(being the video card junkie I am, I have all of ATi’s tech demos for the Radeon 7200 through the X850, the highest that will run on my X800XT) However, as I said, if no one bothers to program using a feature, it’s worthless, and thus far, I’ve seen no one use it.


ATI's tech demo is similar to Masa's Rthdribl. Valve used it for the first attempt at the Lost Coast demo, which allowed HDR on SM2.0 cards. The progression from there is detailed in this Ars Technica article:
http://arstechnica.com/articles/culture/lostcoast.ars/3

To the point where it's a combination of both FP16 and FX16.

Quote:
However, recent information regarding the upcoming Elder Scrolls IV: Oblivion found at Beyond3D suggests that the game will be an exception. Apparently, it will not only use floating-point color for its HDR method, it will possibly even work with MSAA on any SM 2.0-compliant card, though what was said makes it inconclusive.


Likely it's similar to Valve's eventual compromise. I should check out that thread at B3D; likely very enlightening, and being a Morrowind / Oblivion fan I've been keeping track of their developments (likely more raw new applications of technology than any other single game since FartCry).

Quote:
And lo and behold, we now have X800GTOs (128MB version) from Sapphire selling for close to the same price as 6600GTs right now! :D  I find that to be a pleasant surprise, now if only they became available for AGP...

They are available in AGP now, and have been for a while in my neck of the woods, but of course no GTO2s in AGP yet. And hey, how about those supposed GF7800GSs coming from eVGA and XFX? :lol:

Quote:
Oblivion game is a prime example; it will apparently incorporate all the soft-shadow and HDR technologies seen before and more,


Oh yeah, specular lighting, parallax mapping, HDR and more. It should be as fun as Morrowind in that respect.

Quote:
and it will all run in SM 2.0. (it will run much faster with SM 2.0b or SM 3.0 support, as SM 2.0 will require many of the things to be done in many passes instead of one) And again, as for what amount of power is really necessary to do something with SM 3.0, just as with how much power is necessary to use so much video RAM, is dependant on the game, not the card.


Well, that's true to some extent, except that some of the ways the GF6 series does its target-dependent calculations force excessive loops, whereas the GF7 is far more efficient. Just look at the enormous speed boost of the GF7 with HDR (beyond its boost in non-HDR), and in its OGL 2.0++ performance in Riddick. There's a lot more going on there than just the ability to use the video RAM texture buffer (a GF7800GT is still more efficient even at low res, and even at lower texture detail).

I would say it's a combination of both, but it all comes down to 'enough power to turn on the checkbox'. Many features are low impact and many features even improve performance, but to this date the only SM3.0 feature I think the GF6 can take full advantage of compared to the competition is specular lighting (used in FartCry and SplinterCell Chaos Theory), and the performance benefit from it is small. Also, the GF6 can run the specular lighting at default, whereas the X8 series cards do need to be properly coded for in the game, or else slight differences are noticeable.

It all comes down to what the developers are willing to code for, just like R2VB on the R520; it could be far faster, but without support it will be wasted.
January 14, 2006 3:50:49 AM

Quote:

Huh? Where? Where? WHERE???
I've been waiting for that for ages!


None for you while you're stuck in NZ with your new girlfriend

BTW, isn't GW's already modded and you still don't have one? :twisted:
January 14, 2006 10:14:56 AM

Quote:
Guru my arse. You're lucky you got pwned by GGA and not by me when it comes to pipeline unlocking. :twisted:

I'm the only real Guru here BTW. :tongue:


I didn't get "pwned" by GGA. The GTO2 has a 100% success rate in unlocking the pipes. If you were a guru you would know that.
January 14, 2006 11:43:15 PM

Quote:
ATI's tech demo is similar to Masa's Rthdribl. Valve used it for the first attempt at the Lost Coast demo, which allowed HDR on SM2.0 cards. The progression from there is detailed in this Ars Technica article:
http://arstechnica.com/articles/culture/lostcoast.ars/3

To the point where it's a combination of both FP16 and FX16.

Well, I did note how close to identical the variety of elements used in Masa's RTHDRIBL and ATi's Debevec HDR demo for the 9700 are.

Quote:
Likely it's similar to Valve's eventual compromise. I should check out that thread at B3D, likely very enlightening, and being a Morrowind / Oblivion fan I've ben keeping track of their developments (likely more raw new applicatoins of technology than any other single game since FartCry).

Indeed; the article primarily covers shadows, but it's big nonetheless; the most important thing to note is that the game casts off stencil shadowing entirely and uses shadow mapping instead. Hence, with any card that has support for MRTs, each shadow can be drawn in a single pass. Not to mention the fact that it's far more accurate to life, as well as providing a better-distributed workload.

Quote:
They are available in AGP now, have been for a while in my neck of the woods, but of course no GTO2s in AGP yet. And hey how about those supposed GF7800GSs coming from eVGA and XFX? :lol: 

I should've been clearer; I meant the 128MB versions; I haven't seen any for AGP, only 256MB versions. The price difference I saw was significant. Since the GTO²s seem to all use the R480 core, I have my doubts that they might be made available for AGP. Then again, I have no idea how much it costs to add the PCI-e to AGP bridge chip, so it may actually be feasible. And the 7800GS cards are indeed strange, though I'm not holding my breath for an AGP version; even if they do come, they will likely still cost a ton, and not be able to compete performance-wise with ATi's X850XT PE, which already comes close enough to the 7800GT in a fair fight.

Quote:
Oh yeah, specular lighting, parallax mapping, HDR and more. It should be as fun as Morrowind in that respect.

Ah, yes, the days when shaded water was all the rage, with all those cross Xbox/PC titles like Halo and Unreal Championship...

...Good thing that the game will certainly be more fun and less stiff than Morrowind was, and have some more of the gameplay goodness that we got in Daggerfall. I must say that it's my awaited game of 2006, and was my awaited game for 2005, though last year it shared the spot with Age of Empires III and Perfect Dark Zero. (both from Microsoft, coincidentally...)

Quote:
Well, that's true to some extent, except that some of the ways the GF6 series does its target-dependent calculations force excessive loops, whereas the GF7 is far more efficient. Just look at the enormous speed boost of the GF7 with HDR (beyond its boost in non-HDR), and in its OGL 2.0++ performance in Riddick. There's a lot more going on there than just the ability to use the video RAM texture buffer (a GF7800GT is still more efficient even at low res, and even at lower texture detail).

I would say it's a combination of both, but it all comes down to 'enough power to turn on the checkbox'. Many features are low impact and many features even improve performance, but to this date the only SM3.0 feature I think the GF6 can take full advantage of compared to the competition is specular lighting (used in FartCry and SplinterCell Chaos Theory), and the performance benefit from it is small. Also, the GF6 can run the specular lighting at default, whereas the X8 series cards do need to be properly coded for in the game, or else slight differences are noticeable.

It all comes down to what the developers are willing to code for, just like R2VB on the R520; it could be far faster, but without support it will be wasted.

Ah, yes, I'm aware that nVidia has definitely been making major changes to the whole GPU with each new generation; the G70 looks little like the NV45. I've been impressed with how much nVidia's been able to pull up their weak points; SM 3.0 modes in games did little good for the GeForce 6 cards given the massive performance loss that would come from them. I've seen this gap shrink, and it seems to be rather small in the X1k series cards, and hopefully will be in the GeForce 8 cards as well; it might even be viable to use such features on a low-end card, presuming you're also reducing the number of pixels to render those shaders across, of course. :p
January 15, 2006 2:08:41 AM

Quote:
The fact is it doesn't as GGA has already mentioned that some GTO2 comes with artifacts like mine when unlocked.

Indeed; I'm surprised at how effectively such a rumor managed to spread, in spite of the fact that, as far as I know, Sapphire said nothing on the issue; they merely made up that model. These sorts of rumors do annoy me, as they completely obscure the truth. At least they're usually close to part-true, in that the X800GTO² can be pushed higher in both pipelines and clock speeds with a higher degree of success than other cut-down X800 cards, but if it were 100%, and ATi needed to offload those R480s, they would've just cut the price on the X850XT PE cards, to better fight nVidia at the higher end.
January 15, 2006 1:09:12 PM

Quote:
Guru my arse. You're lucky you got pwned by GGA and not by me when it comes to pipeline unlocking. :twisted:

I'm the only real Guru here BTW. :tongue:


I didn't get "pwned" by GGA. The GTO2 has a 100% success rate in unlocking the pipes. If you were a guru you would know that.
The fact is it doesn't as GGA has already mentioned that some GTO2 comes with artifacts like mine when unlocked.
Well, I've only heard of people getting artifacts when they OC the GTO2 after unlocking, or people having poor case cooling. You are the first one I've heard of who is getting artifacts from just unlocking the card. Can you please give me a screenshot of the settings view in ATI Tool and a screenshot of the artifacts? Or report it directly to TechPowerUp.com; it's very important...
January 16, 2006 2:25:17 AM

Did you vote for the 6600GT? :o 
January 16, 2006 3:30:42 AM

Quote:
I've over 10 years of professional experience building custom-made computers, mostly gaming systems.


What a freakin' joke :roll:
January 16, 2006 12:37:35 PM

Someone did, maybe the Guru? :p 
January 16, 2006 6:51:10 PM

Quote:
I've seen the artifacts somewhere, but they weren't like mine, with the checkerboard effect on my 9800SE->Pro. The few that did have artifacts had snow, so they aren't all 100% unlockable.


Well, can you give me some screenshots? Or send them directly to TechPowerUp.com? It would be nice since they are still claiming there is a 100% success rate.
January 16, 2006 6:59:44 PM

Quote:
I've over 10 years of professional experience building custom-made computers, mostly gaming systems.


What a freakin' joke :roll:
Please explain yourself.

What really is a "freakin' joke" is that you base your statement on the little information you know about me.
January 16, 2006 10:05:12 PM

Quote:
Please explain yourself.


Ok... I'll explain. Your screen name is "The Guru", which is pretty much the same as claiming that you know it all. Then, reading your posts, you really don't seem to know what you're talking about when it comes to graphics cards.

You should find another forum to try to convince that you know something... okay?
January 16, 2006 11:32:11 PM

Quote:
Please explain yourself.


Ok... I'll explain. Your screen name is "The Guru", which is pretty much the same as claiming that you know it all. Then, reading your posts, you really don't seem to know what you're talking about when it comes to graphics cards.

You should find another forum to try to convince that you know something... okay?

I've been called "the guru" for many years. I got the name because I'm a good system developer.

You shouldn't judge people by their name, race, or color; that's childish behavior. If I were called Osama Bin Laden, would you report me to the CIA? If I were called Omar Abd el Hamid, would you refuse to hire me because I had a Muslim name?

What posts are you referring to? Specify where I'm wrong.

About the X800GTO2: I've sold several cards and haven't heard of anyone getting artifacts after unlocking. And according to TechPowerUp there should be a 100% success rate, since people report to them if their mod guides don't work. Many people have reported problems with the X800GTO, but none have for the X800GTO2, so that must make artifacts very rare.

And I'm still waiting for the people here to report the artifacts to TechPowerUp or to send me the screenshots so I can forward them.

I had an argument with GGA in another thread, if that's what you're referring to, but then I admitted I had exaggerated, and GGA didn't admit he was wrong.

Just because I've missed the latest rumors about a graphics card, does that make me stupid? Get real...

How can you judge another person on this little information? Are you that narrow-minded?

I didn't think "the Law of Jante" applied in the US. But I guess I was wrong...
January 17, 2006 1:37:49 AM

Ok, sure, it's what they call you; that makes sense. But look at it this way: a noob signs up with the name "The_Guru" and puts "I've over 10 years of professional experience building custom-made computers, mostly gaming systems" in his sig? Don't you see how you might be over-scrutinized by drawing such attention to yourself? Were you expecting ooohs and aaahs? You'd have to be either foolish, desperately seeking attention, and/or extremely cocky to do so. I'm not jumping on you, just explaining the obvious reasons why you may have set yourself up for failure. No matter how confident you were coming in here, wouldn't it make sense to come in humble and earn the Guru title, rather than marching in and self-proclaiming it? There are lots of knowledgeable people here, most of whom I'd say would never refer to themselves as A Guru, never mind THE Guru. There are definitely others with similar experience and close-circle nicknames who wouldn't dare (or don't feel the need) to advertise it with every post. :roll:
January 17, 2006 3:09:58 AM

Quote:
PowerColor Radeon X800Pro and ASUS N6600GT/TOP/TD. Both are AGP.

I want to upgrade my FX5200 Magic to one of these. Help me choose one, please. I'm a gamer btw.

My current AI:

P4 2.4GHZ
1.5GB RAM
550W P.S.
Geforce FX5200 Magic[OC]


How do you do the POLL? :twisted:
January 17, 2006 3:15:20 AM

Quote:
PowerColor Radeon X800Pro and ASUS N6600GT/TOP/TD. Both are AGP.

I want to upgrade my FX5200 Magic to one of these. Help me choose one, please. I'm a gamer btw.

My current AI:

P4 2.4GHZ
1.5GB RAM
550W P.S.
Geforce FX5200 Magic[OC]


never mind I got it.
January 17, 2006 4:13:41 AM

X800PRO is the better of the two.
January 17, 2006 4:45:46 AM

Quote:
X800PRO is the better of the two.

7800gt :twisted:
January 17, 2006 8:46:49 AM

Quote:
Ok, sure, it's what they call you; that makes sense. But look at it this way: a noob signs up with the name "The_Guru" and puts "I've over 10 years of professional experience building custom-made computers, mostly gaming systems" in his sig? Don't you see how you might be over-scrutinized by drawing such attention to yourself? Were you expecting ooohs and aaahs? You'd have to be either foolish, desperately seeking attention, and/or extremely cocky to do so. I'm not jumping on you, just explaining the obvious reasons why you may have set yourself up for failure. No matter how confident you were coming in here, wouldn't it make sense to come in humble and earn the Guru title, rather than marching in and self-proclaiming it? There are lots of knowledgeable people here, most of whom I'd say would never refer to themselves as A Guru, never mind THE Guru. There are definitely others with similar experience and close-circle nicknames who wouldn't dare (or don't feel the need) to advertise it with every post. :roll:


But I have 10 years of experience building computer systems; it's nothing but the truth. I've got several PMs from people asking for help since I got here, and I've helped them. Why should I have to change my nickname just for joining a new forum? I've become a member of many forums (but not US forums) and I've had similar signatures and the same nickname. But I've never got such a bad response as I've got here. Before I joined this forum I thought the US was the country where "the law of Jante" least applied, but now it seems it is the country where it applies most.

You need to think of the positive side of things. When you see my signature you shouldn't think "this guy thinks he is someone"; you should think "if I've got some questions or trouble, he might be the one to ask".
January 17, 2006 11:39:18 AM

I'm not doubting it is the truth. But you're not getting it.

Anyway, if it makes noobs PM you for help, I'm all for you keeping the name and sig. :)  But remember: mistakes that others would be given slack for might earn you a little ribbing (teasing), again because of your claims.
January 17, 2006 8:07:07 PM

:twisted: FIGHT FIGHT, A USA DUDE AND A FOREIGN DUDE. WHO WILL WIN LADIES AND GENTLEMEN??? PLACE YOUR BETS. HIT ME UP. I TAKE PAYPAL. :twisted:
January 17, 2006 8:33:34 PM

What's with the Guru thingy?
I think this topic is pretty screwed and way off topic lol
January 18, 2006 6:27:26 AM

Quote:
What's with the Guru thingy?
I think this topic is pretty screwed and way off topic lol


Hey vile who knows what's going on. Yea what is wrong with uguru. :twisted:
January 18, 2006 5:05:20 PM

Quote:
But I have 10 years of experience building computer systems; it's nothing but the truth. I've got several PMs from people asking for help since I got here, and I've helped them. Why should I have to change my nickname just for joining a new forum? I've become a member of many forums (but not US forums) and I've had similar signatures and the same nickname. But I've never got such a bad response as I've got here. Before I joined this forum I thought the US was the country where "the law of Jante" least applied, but now it seems it is the country where it applies most.

You need to think of the positive side of things. When you see my signature you shouldn't think "this guy thinks he is someone"; you should think "if I've got some questions or trouble, he might be the one to ask".

This is where you run into the fallacy described by "Nottheking's law of comparative politics." (It's not technically known as a law, particularly given that I've merely completed undergraduate studies in political science, but at least it is recognized by those around me.)

The law holds that the main thing Europeans fail to recognize about the United States of America is its scale. This entails many things, chief among them the fact that you cannot apply ANY form of generalization to it. I'm not grilling you over it; it's just something I make sure to keep in mind when talking to Europeans.

At any rate, how much the "Jante Law" applies depends on the circle you enter. If you opt to enter a specialized area, the rule very certainly applies when the discussion is centered on the topic of specialization. If you come to a hardware forum and the discussion is on hardware, it applies EVEN MORE STRONGLY in the US than anywhere else. If the topic is not closely related to the specialization, it applies somewhat less, but still strongly; one must still demonstrate knowledge before making claims. Note that the term "noob" (or newb, n00b, etc.) originally came from the US; does that not sound like something that would conform to the Jante Law?

This sort of atmosphere is one of the few things that actually is fairly consistent throughout the US. Austerity in behavior is the best way to go if you wish to gain respect.
January 18, 2006 7:40:19 PM

Who gives a flying f*ck? Can we all just get along and go back to the nerd sh*t? Not that I'm a nerd myself. Besides, what you guys are talking about is waaaaaaaaaaay off subject. :twisted: