ATI vs. Nvidia editorial

Anonymous
August 23, 2004 7:30:08 PM

I found this editorial (http://www.theinquirer.net/?article=18029) pretty interesting. The article is about the future of ATI and Nvidia. What stood out to me was that ATI has no plans for SLI.

When in doubt, rock the fu*k out


August 23, 2004 8:34:40 PM

Quote:
The future belongs to Nvidia right now, and the only hope ATI has is in the R500, but that won't be here for a long time.

Why? Because nVidia is faster in a game that people are already trading in at their local game shop? Sure, they will be licensing the engine for some titles, but there are a lot of games coming out that *aren't* using the Doom engine.
Quote:
HL2, for all its gameplay is a last generation engine.

Oh, and Doom 3 is bleeding-edge? Most of the features in Doom have been available for years- they could have released that game 12 months ago, with better gameplay, if they didn't have their heads up their a$$es.
Quote:
It was a showcase for geometry instancing and PS3.0 effect. The new crop of games coming out in a few weeks will tell the story loud and clear.

ATi can do geometry instancing with SM2.0b. In fact, R300 can do it also, something that NV30 cannot.
Quote:
SLI alone will be enough to make Nvidia own every benchmark under the sun by simply abusive margins.

SLI is neat, but a lot of enthusiasts would rather just wait and buy a single powerful card. Imagine upgrading from a 9800 Pro to a 6600GT- some games might run a little bit faster, maybe some slower, but then you have to wait another 6 months before you can afford the second card, when the *real* upgrade actually happens? That's more power draw, more clutter inside the computer, more noise, more heat, and not necessarily as good performance as one flagship card. Oh and BTW, where is my PCI-Express motherboard? In fact, where is my dual PCI-Express board?

This is just a paranoid rant. These cards have only been out for a few months, and there is little software to expose the advantages of either architecture.
August 23, 2004 8:46:11 PM

Agreed - SLI might be a nice idea, but one 6800 Ultra is too expensive for most people, even most enthusiasts! So while you might see 1/3 better performance with it, I don't see many people using it, even if, as you say, there were any dual PCIe boards!



XP2000, 512 ddr 2700ram, GF4 MX440, XP Pro
August 23, 2004 8:50:47 PM

I'm not talking down the 6600GT; we haven't even seen what it's capable of, and SLI is a cool auxiliary benefit. But I don't think SLI will be a big force in most people's buying decisions, especially since it implies spending *more* money in the future.

Edited by sweatlaserxp on 08/23/04 04:52 PM.
August 23, 2004 8:51:13 PM

Yeah, that's a pretty poorly laid-out argument. It sounds almost fanboyish, which doesn't say much for the Inquirer.

AHHH MOTHERLAND!!!!!
August 23, 2004 10:18:59 PM

I concur with all your points.

Slightly off topic (this is for everyone):
If you were the R&D guy at ATI, what would you suggest the company do to counter NV's SLI?

Just curious.

:evil: Futile is resistance, assimilate you we will. :evil:
Hard work has a future payoff. Laziness pays off now.
Edited by priyajeet on 08/23/04 05:19 PM.
August 23, 2004 10:31:34 PM

Wow, you should try being neutral like me... damn, it's so funny to watch the fanboys cry after seeing things that contradict them... hahahaha!

Athlon 2700xp+ (oc: 3200xp+ with 200fsb)
Radeon 9800pro (oc: 410/360)
1024mb pc3200 (5-3-3-2)
Asus A7N8X-X
August 23, 2004 10:42:02 PM

I was hoping for a good article... that was not it. The reality is that the features cards introduce today usually aren't mainstream in games for 1-2 years. And in my experience, the first hardware to introduce a technology is often underpowered for actually using it.

The only reason I am considering a new card (6800GT) is that it can play the new games with all features on, maxed out, at 1600x1200 and 60fps. That means it has a lot of horsepower, so in a year or two it should still be able to play top-end games with most features on at reasonable resolutions and framerates. Heck, the two-year-old Radeon 9700 Pro is playing HL2 great, just not maxed out... but I am sure most enthusiasts who bought one have long since moved on, which kinda shows how the market is.

Besides a lot of presumptions and a really narrow view of the industry, I was pretty shocked that Longhorn and the next-generation consoles were not mentioned more.

Longhorn (with DX10?) and the R500 and NV50 with pixel/vertex shader 4.0 will be a big platform for a long time (which is in a way a deterrent for me against getting something now, especially since PCIe/DDR2 seem to be in their infancy; in 12-18 months I think we can judge these platforms better... if they're not delayed).

Also, with the Xbox2 and GameCube2 both using ATi chips (Xbox2 is using R500 tech; I think the Cube is using Flipper2 tech from the ArtX guys ATi acquired), there is a lot of money and recognition there.

I just think he ignored a lot of variables--not to mention I REALLY doubt OEMs are asking for SLI at this point. I mean, come on! Not only do you need a PCIe board, but you need 2 PCIe slots. There is no demand for SLI at this point, and you have to wonder about this scenario:

• $300 for an extra board to make SLI and increase speed 30-100% (I am sure each program varies) with DX9 PS/VS 3.0; or

• $300-$400 for a new midrange flagship card with equivalent performance with PS/VS 4.0 and DX10 compatibility.

I had the Voodoo2 with SLI support, and by the time the second card was cheap enough, the TNT was out with awesome performance and 32-bit color depth, and it just plain kicked butt.

If I were ATi, I would suggest looking at dual GPUs on one board. The PCIe spec is not being pushed yet, so I think a lot of enthusiasts would love a dual-GPU card. But you have a lot of heat and size to cram in there... so it would probably have to be a midrange part. The advantage is that the GPUs may be able to share memory in an efficient design.
August 23, 2004 10:55:32 PM

Hmmm... dual-core technology in GPUs... will PCIe be able to manage it?

:evil: Futile is resistance, assimilate you we will. :evil:
Hard work has a future payoff. Laziness pays off now.
August 24, 2004 12:54:41 AM

Quote:
Sure, they will be licensing the engine for some titles, but there are a lot of games coming out that aren't using the Doom engine.


Maybe now, but look at the number of blockbuster games that came out using the Quake 3 engine.

Personally, I believe (based on the past with iD and other game companies) the DOOM engine will likely be used much more in future games than HL2's. So don't try to pin a false disadvantage on an engine, especially when it just came out. It would be the same as if you had said that back when Q3 came out. Boy, wouldn't your face look red 3-4 years later?

Quote:
Oh, and Doom 3 is bleeding-edge? Most of the features in Doom have been available for years-

For what it's worth, HL2 isn't the most impressive-looking game I've seen yet. I really find DOOM III more polished. Most of the HL2 textures are flat from what I've seen. It WILL employ many DX9 features, but it's not a major step up.

Lemme ask you a question though: are DOOM III's graphics similar to last generation's?
Quote:
with better gameplay, if they didn't have their heads up their a$$es.

Label your opinions as opinions and not as facts. As far as I'm concerned, there are a lot of people who are enjoying it like hell, so don't present your opinions as facts.

Quote:
SLI is neat, but a lot of enthusiasts would rather just wait and buy a single powerful card. Imagine upgrading from a 9800 Pro to a 6600GT- some games might run a little bit faster, maybe some slower, but then you have to wait another 6 months before you can afford the second card, when the real upgrade actually happens?

Consider THIS scenario though:
You buy yourself a 6800GT. It's currently $400. Unless the next generation launches in 6 months with a performance jump as astronomical as the near-2x jump the current crop of high-end cards had over the NV38 (5950?) and R380 (XT?), then in 6 months you can pay $200 for a second card and get performance that is presumably 60-100% better on average (I don't buy that 33%-minimum claim, mind you), whilst the next new card may actually have 40% better performance AT BEST, AND cost $500. (Hey, the 512MB-card premium is likely to come and stay for a while, just like the still-overpriced XT.)
Now the extra $100 investment within the next 6-11 months will give a new lease of life to a system that was 6-12 months old or more, and you will have a setup that is nearly twice as fast overall.
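Putting rough numbers on that scenario (a quick sketch in Python using only the figures assumed in this post - the $200 second card, the 60-100% SLI gain, the $500 next-gen card with 40% at best - which are guesses rather than benchmarks):

    # Back-of-the-envelope comparison of the two upgrade paths described above.
    # All numbers are the poster's assumptions, not measured results.
    sli_extra_cost = 200            # second 6800GT after the assumed price drop
    sli_gain_low, sli_gain_high = 0.60, 1.00
    new_card_cost = 500             # hypothetical next-gen flagship
    new_card_gain = 0.40            # "40% better performance AT BEST"

    print("SLI path:  $%d for +%d%% to +%d%%" % (sli_extra_cost, sli_gain_low * 100, sli_gain_high * 100))
    print("New card:  $%d for +%d%% at best" % (new_card_cost, new_card_gain * 100))

    # Dollars per percentage point of speedup, best case for each path:
    print(sli_extra_cost / (sli_gain_high * 100))   # 2.0
    print(new_card_cost / (new_card_gain * 100))    # 12.5

Under those assumptions the SLI upgrade costs about $2 per point of extra performance versus roughly $12.5 for the new card, which is the whole argument in one line.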

I fail to see how SLI or the GFX Array (assuming one day it becomes mainstream for everyone, and ATi too can use it to its advantage at little extra cost) are not promising technologies. The days when every new generation of cards meant a major boost, like back in the Voodoo era, are really no longer the norm.

Just trying to play devil's advocate (DH isn't in this thread so far, so bleh! :tongue: ) to balance out your rather pessimistic take on DOOM III's engine and nVidia. (Hey, if you guys want another Kinney, albeit far less of him and more of "Eden's love", you got it. :wink: )

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red><P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 08/23/04 09:12 PM.</EM></FONT></P>
August 24, 2004 1:04:25 AM

Quote:
not to mention I REALLY doubt OEMs are asking for SLI at this point.

That WOULD be ridiculous indeed, unless they're big gaming enthusiast OEMs like Alienware.

I mean, if nVidia slapped SLI capabilities on all the cards they put out, then I can just imagine Dell SLIing their GF4 MX4000 systems and calling them GF8 MX8000 powerhouses! :lol: 

Quote:
• $300 for an extra board to make SLI and increase speed 30-100% (I am sure each program varies) with DX9 PS/VS 3.0; or

• $300-$400 for a new midrange flagship card with equivalent performance with PS/VS 4.0 and DX10 compatibility.

See, that's the thing: SLI is made more for long-term upgradability, AND it takes into account that updated cores don't improve as much as we'd expect them to. Did the 9800 PRO make the 9700 PRO look foolish? Hell no, I thought it was the worst update of its time. Then I was convinced ATi didn't know jack when they tried the same, and worse, with the XT. It's still a great product today, especially for HL2 and DOOM III engine games that will make use of the greater memory available, but at the time the extra performance advantage was not worth the extra $100-200.
Compare that to an extra card which, 6 months after you bought the first, costs half what it did and can potentially give you UP to twice the performance. (AND you get to double your current video memory for much less!)

Like I said, SLI may work well because we're no longer in an age where every new card is a quantum leap over the last. It happened with the R420 and NV40, but I highly doubt we will see it again for a LONG time. Even the 9700 PRO, regarded as a major improvement over the last generation (including the GF4 Ti4600 and the Radeon 8500), was not very impressive in normal-IQ gaming, but rather kicked ass in AA and AF. And it took almost 2 years before we got another leap that was even more dramatic (except for AA and AF, I don't think the new cards beat the older ones by 2.5-3x). Now, if you can almost double your performance over the course of 2 years with a second copy of the same card, which by then costs half what it did when you got the first, you are a winner, big time. Maybe that's a bit exaggerated, but nVidia may be underestimating how clever consumers might be about using SLI to their advantage.

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red><P ID="edit"><FONT SIZE=-1><EM>Edited by Eden on 08/23/04 09:06 PM.</EM></FONT></P>
Anonymous
August 24, 2004 3:07:51 AM

Yeah, at least The Inquirer labels it as an editorial, unlike some sites that would list it as a real, information-packed article.

When in doubt, rock the fu*k out
Anonymous
August 24, 2004 3:19:11 AM

I agree. ATI should at least *consider* SLI technology, unless they have another method of getting a quick 30% performance boost.

When in doubt, rock the fu*k out
August 24, 2004 3:36:32 AM

Well, ATI might be pushing OEMs to use Alienware-style ALX arrays, wherein you can use any 2 cards... not just Nvidia's. Also, I read that in the future it will even support 4 cards.

Or they can go the CPU guys' way... creating dual-core GPUs or dual-GPU boards.

Or buy Nvidia
(or get bought out by Nvidia).

Has SLI been patented or something? If not, there is no shame in ATI going that route. Then it will be healthy competition as to who has the better parallel processing algorithms.

Or they can do a different kind of SLI: one card doing something like rendering, with tons of RAM on it, and the other only dealing with IQ and GPU-intensive (CPU-related) tasks. Basically like a mother + daughter board...
and then they could give us the option to add more RAM by plugging add-on daughter cards into other PCIe slots... even the x1 ones.

:evil: Futile is resistance, assimilate you we will. :evil:
Hard work has a future payoff. Laziness pays off now.
August 24, 2004 3:59:56 AM

This link (http://forums.xbox-scene.com/index.php?showtopic=231928) and this link (http://www.xbitlabs.com/news/mmedia/display/20040627030...) show that

Xbox2 will be a BAD ASS console.
Triple-core IBM PowerPC, each core at 3.5GHz, and each having vector processing capabilities.

<i>"The Xenon GPU is a custom 500+ MHz graphics processor from ATI. The shader core has 48 Arithmetic Logic Units (ALUs) that can execute 64 simultaneous threads on groups of 64 vertices or pixels. ALUs are automatically and dynamically assigned to either pixel or vertex processing depending on load. The ALUs can each perform one vector and one scalar operation per clock cycle, for a total of 96 shader operations per clock cycle. Texture loads can be done in parallel to ALU operations. At peak performance, the GPU can issue 48 billion shader operations per second.

The GPU has a peak pixel fill rate of 4+ gigapixels/sec (16 gigasamples/sec with 4× antialiasing). The peak vertex rate is 500+ million vertices/sec. The peak triangle rate is 500+ million triangles/sec. The interesting point about all of these values is that they’re not just theoretical—they are attainable with nontrivial shaders."

"Xenon has 256+ MB of unified memory, equally accessible to both the GPU and CPU. The main memory controller resides on the GPU (the same as in the Xbox architecture). It has 22.4+ GB/sec aggregate bandwidth to RAM, distributed between reads and writes. Aggregate means that the bandwidth may be used for all reading or all writing or any combination of the two. Translated into game performance, the <b>GPU can consume a 512×512×32-bpp texture in only 47 microseconds</b>."</i>

:evil: Futile is resistance, assimilate you we will. :evil:
Hard work has a future payoff. Laziness pays off now.
August 24, 2004 4:09:51 AM

Hey Eden,

I understand where you are coming from. I do not think SLI is BAD, just that it is not the same as in 98/99 when Voodoo2s ran on PCI slots. We barely have PCIe slots, let alone Dual 8x/16x slots. It will take a while before the tech even gets a foothold.

E.g. I am considering a video card right now, and I am leaning toward a 6800GT. If I do get it, it will have to be AGP, which means I won't be able to go SLI for a long time. The segment that will use SLI *at this point* is very, very small, and that won't change for a bit. I DO like the 6600GT product specs, and THAT could be interesting for SLI.

Anyhow, my point was that the article was stupid. E.g. his statements about ATi's defense... we know NOTHING about the next gen of stuff, except that they will most likely be PS/VS 4.0 and DX10/Next (whatever) and that ATi tech will be in the next Xbox (which means they are working closely with MS).

I remember before the paper releases there were a lot of unknowns; then it was leaked that ATi had a 12-pipe part, and then that nVidia had a 16-pipe part, then that ATi had one also... and we are just FINALLY getting some decent feedback on how they work in games, and it will be a couple more months before there is good product availability and driver updates/tweaks.

One thing on the D3 engine: It does not do outdoors well right now (at least that is what I have read) because it is very intensive, whereas HL2 does outdoors well now. Now, I know D3 will be tweaked, as were Q/Q2/Q3--which have done some nice outdoor games. But HL2 does them pretty well right out of the box it appears.

Another thing is there is a lot more to a game than JUST the graphics. I am sure a lot of developers will look at HL2 and the extensive use of physics and the excellent AI and decide it may be best for their game. Developers mod the engines to meet their needs, and graphics are at a point where style is just as important as technology. I think the D3 graphics technology looks maybe a little better than some of HL2's (though they have some awesome stuff too), BUT HL2's style I think is a lot better. I find it more impressive, personally, to have games depicting the real world than fictitious ones. D3 has the best Mars base I have ever seen... but I have never seen one. Also, D3 does have some issues as id even noted (like the plastic look to people).

Anyhow, I am not pro nVidia or ATi--I am pro consumer. I just think the article was misleading.

D3 is great, so is HL2. nVidia and ATi have been giving us some GREAT products over the last couple of years (some better than others, but all very good overall). SLI looks cool, so does the Alienware tech. The worst thing that could happen now is for one of them to go down... especially based on reactions to hysterical rants.
August 24, 2004 4:16:37 AM

Heya Priyajeet :) 

Looks nice. I do hope they up the RAM though--if the CPU and the GPU are both using that memory, 256MB won't be "enough". Both the Cube2 and Xbox2 will work on monitors and HDTVs, so I am guessing 720p will be popular. Washed-out/bland textures will be very noticeable at that resolution.

I also hope the bandwidth on the RAM is enough for 5 years, because the current high-end cards are at about ~30GB/s, and I am guessing in 12-24 months it will be a bit higher than that.

Oh well, I personally think the next consoles will all look great... it will be more a matter of quality games and compelling content. But it won't hurt ATi to be pushing its tech into a lot of homes.
August 24, 2004 4:31:08 PM

I wasn't criticizing the Doom engine, nor was I attacking nVidia.
Quote:
Label your opinions as opinions and not as facts.

So.... if I state my opinion, and I don't declare that's an opinion, then that makes it a fact?

I still think that Doom is by far the most impressive-looking game to date, and yes, it does look better than HL2 (though the difference in environments makes it difficult to compare the two).

I think the main advantage of SLI will be long-term upgradeability, the option of buying a second card when they've become cheap and getting a nice boost.
August 24, 2004 6:34:49 PM

Quote:
Has SLI been patented or something?

Hmm, while I would say SLI isn't patented, since nVidia is using what 3DFX used, I am then reminded that nVidia BOUGHT 3DFX AND uses their technologies, like in the FX!

So maybe they can't, alas. :eek: 
Not that it's that hard though: the SLI version uses its own slot type, and ATi can do like dual-processor systems and create their own way to make 2 cards work together. I don't think you can patent that.

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
August 24, 2004 9:21:15 PM

ATi has already dabbled with SLI (well, technically NOT SLI, but it's almost the same thing) with the Rage Fury MAXX. That card had 2 Rage 128 GPUs... a la the Voodoo 5. They could do the same thing again, but there are limitations... like whether or not the bus can handle the bandwidth, and the fact that each GPU needs its own RAM... which further increases the cost of the card.

It's doable... but I think ATi is leery after the MAXX didn't work out as well as they would have liked.

If you design software that is fool-proof, only a fool will want to use it.
August 24, 2004 11:18:27 PM

Dual-GPU is quite different from SLI because it isn't *modular*, and therefore it doesn't have the price appeal. It would be a crappy response to SLI, when ATi could just develop their own multi-card standard. The difference between the SLI Voodoos and SLI nVidias is that the Voodoos rendered in alternating lines whereas the nVidias render the top and bottom halves of the screen.
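For anyone curious what that difference actually means in practice, here's a minimal sketch (my own illustration, not either vendor's code) of how the two schemes divide one frame between two GPUs: scan-line interleave gives each GPU every other line, while split-frame rendering gives each a contiguous band. The fixed 50/50 split is an assumption; nVidia's SLI actually moves the split point around based on scene load.

    # Two ways of splitting one frame's scanlines across two GPUs.
    def scanline_interleave(height, gpu):
        """3dfx-style SLI: GPU 0 takes even lines, GPU 1 takes odd lines."""
        return [row for row in range(height) if row % 2 == gpu]

    def split_frame(height, gpu, split=0.5):
        """nVidia-style split-frame: GPU 0 takes the top band, GPU 1 the bottom."""
        cut = int(height * split)
        return list(range(cut)) if gpu == 0 else list(range(cut, height))

    if __name__ == "__main__":
        h = 8  # a tiny 8-line "frame" just for illustration
        print(scanline_interleave(h, 0), scanline_interleave(h, 1))
        print(split_frame(h, 0), split_frame(h, 1))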
August 25, 2004 3:13:10 AM

Quote:
We barely have PCIe slots, let alone Dual 8x/16x slots. It will take a while before the tech even gets a foothold.

I know, which is why I set up an example scenario, not necessarily the one we'll see. Besides, chipset manufacturers cater to gamers a LOT. There are motherboards dedicated solely to LAN partying, and it would not surprise me if dual PCIe is conceived very soon by VIA, SiS and others and released ASAP. nVidia of course will make sure that happens ASAP. And even if one PCIe slot is only x8, it will likely run as fast as x16 in practice, given the amount of bandwidth still unused even in AGP 8X.
I'm just saying it may not take as long as you might think, because companies have been very serious about catering to enthusiasts and gamers everywhere, and it would not surprise me if a dual PCIe solution is nearby.

Quote:
Anyhow, my point was that the article was stupid. E.g. his statements about ATi's defense... we know NOTHING about the next gen of stuff, except that they will most likely be PS/VS 4.0 and DX10/Next (whatever) and that ATi tech will be in the next Xbox (which means they are working closely with MS).

Frankly, I barely read any of it. Too long for my taste. I was just arguing against what some seemed to downplay a lot.
And I'm quite happy to see ATi get so many offers from console companies.

Quote:
One thing on the D3 engine: It does not do outdoors well right now (at least that is what I have read) because it is very intensive, whereas HL2 does outdoors well now.

I don't think that's quite true about the D3 outdoor thing. I do think HL2's engine currently works very well for outdoor scenes, but from the moments in DOOM III when you go through airlock cycles to explore outside, it seemed fairly complex and had good performance overall. But yeah, it will need better hardware to go far in outdoor gaming. But hey, that's why I was saying this game's engine is solid in its programming and advancements.

Quote:
But HL2 does them pretty well right out of the box it appears.

Indeed, but that kind of argument is like saying DOOM III does interiors and their lighting (visually and performance-wise, a combination of both in other words) pretty well right out of the box. Both have their PROs, ya know?

Quote:
Another thing is there is a lot more to a game than JUST the graphics. I am sure a lot of developers will look at HL2 and the extensive use of physics and the excellent AI and decide it may be best for their game. Developers mod the engines to meet their needs, and graphics are at a point where style is just as important as technology.

I don't disagree with that. But something I once said was even repeated by IGN in their D3 review: for the first time, a game's graphics DICTATE the gameplay. In other words, unless you're some whiner, DOOM III's graphics make the gameplay, and it almost won't matter if the physics are weak; it still holds some serious immersion to keep the player on his toes and ready to use what he has against a certain enemy.
Of course, a developer looking to make outdoor-scene games with a lot of physics, kinda like CS: Source now, will easily find solace in HL2's engine. Maybe because of this, we might see a Medal of Honor using that engine. But who knows, maybe Carmack has it planned well for these kinds of games and already has deals with major companies. The guy is a genius no matter what anyone thinks; he's a legend in computer graphics and programming, and if he can't overcome something, it may not be that doable right now for others either. I'm not really kissing his arse, but rather stating what the industry itself acknowledges. I mean, he made the Quake 3 engine and its flexibility, and look how many excellent games came out of it: JK II, MoHAA, RTCW and many, many others.
Newell or Sweeney come in second, I'd say.

BTW I don't think AI has anything to do with an engine. You can't really compare MoHAA's AI to Quake 3's, can you?

Quote:
Also, D3 does have some issues as id even noted (like the plastic look to people).

And HL2's engine won't? I think it's bolder on Carmack's part to actually admit his own creation has its flaws and to explain how he plans to improve it. I look forward to seeing Newell talk about his engine's limitations. After all, HL2's graphics are not a big step forward, and they also possess many limitations. I've noticed D3's limitations during the game and agree with Carmack on them, so I know it's a flawed engine as well. But that's the beauty of it: there's more to come, for each engine.
So I don't think it would be fair to point the finger at iD's engine without acknowledging that Valve's has its own problems, but we'll just have to wait and see what Newell will say about it, IF he will.

Quote:
Anyhow, I am not pro nVidia or ATi--I am pro consumer.

I honestly didn't doubt that. :wink:

Quote:
D3 is great, so is HL2.

Wish I'd hear more level-headed comments like these from others too. :eek: 

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
August 25, 2004 3:23:50 AM

Quote:
I wasn't criticizing the Doom engine, nor was I attacking nVidia.

I tend to get defensive and play devil's advocate when I spot too many one-sided comments. In your case, yes, I found it too much of an attack on iD's hard work, with little open-mindedness towards SLI. At least you cleared it up, though.

Quote:
So.... if I state my opinion, and I don't declare that's an opinion, then that makes it a fact?

It's as easy as "IMO", "To me", "I think", "For me", etc. You get what I mean. Yes, it can be easy to be taken as fact sometimes. I felt like you were trying to speak for everyone, thus my response.

Quote:
(though the difference in environments makes it difficult to compare the two).

I agree on that. So you have to look at the game-specific effects used. HL2 has that neat eyeball technology and excellent physics. DOOM III pushes bump mapping to the maximum and creates a real three-dimensional feeling from otherwise flat textures. It also does lighting like it's a simple 2+2 calculation.
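That "three-dimensional feeling from flat textures" is basically per-pixel lighting against a normal map: the texture stores a perturbed surface normal per texel, and the shading is just a dot product with the light direction. A tiny sketch of the idea (generic Lambertian diffuse, my own illustration rather than id's actual shader code):

    import math

    def normalize(v):
        length = math.sqrt(sum(c * c for c in v))
        return tuple(c / length for c in v)

    def diffuse(normal, light_dir, albedo=1.0):
        """Lambertian diffuse: brightness = albedo * max(0, N dot L)."""
        n, l = normalize(normal), normalize(light_dir)
        return albedo * max(0.0, sum(a * b for a, b in zip(n, l)))

    light = (0.3, 0.5, 1.0)
    print(diffuse((0.0, 0.0, 1.0), light))   # flat wall: every texel shades the same
    print(diffuse((0.4, 0.0, 0.9), light))   # normal tilted by the bump/normal map:
                                             # this texel picks up its own shading,
                                             # which is what sells the illusion of depth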

But that's my point though, you were giving too much credit to one side and making DOOM III look like it's outdated or has nothing on HL2. Bashing the game itself didn't help either.

Quote:
I think the main advantage of SLI will be long-term upgradeability, the option of buying a second card when they've become cheap and getting a nice boost.

And like I said, the only reason I see it as a powerful tool for consumers is that 2x performance upgrades do NOT happen frequently. Between a GeForce 3 Ti500 and a GeForce 4 Ti4600, there wasn't a huge jump like between the 9800XT and the X800XT. But who knows; I've been known to be wrong quite a bit when it comes to predicting. :eek: 

--
<A HREF="http://www.lochel.com/THGC/album.php" target="_new"><font color=red><b>The THGC Photo Album revision Eden, faster updated than ever before!</A></b></font color=red>
Anonymous
August 25, 2004 3:44:24 AM

I forgot about the Rage Fury Maxx! Thanks for putting me on a treasure hunt for information on it (http://www.firingsquad.com/hardware/atimaxxreview/).

When in doubt, rock the fu*k out