
5% speed increase with latest R600 drivers

Last response in Graphics & Displays
May 6, 2007 3:29:55 PM

It's possible.

We'll see when it's tested.
May 6, 2007 3:32:35 PM

Some speculation: if this does improve the R600 in the two stated games, could there be a chance of victory over the GTX in a few other top games? It sure would be good.
May 6, 2007 4:13:33 PM

Even if true, so what? After being 5 months late, they shouldn't need to dig and scrape for a few percentage points.
May 6, 2007 4:35:59 PM

This doesn't surprise me. I figured the drivers they were using for the 2900 tests were test drivers and not final drivers; I'm just disappointed that the percentage isn't higher. Still, they have a few more days to maximize their drivers before the public can get hold of the cards. This has happened before with hardware releases and their drivers.
Anonymous
May 7, 2007 1:10:12 AM

Big deal if it beats the 8800GTX in 2 games. It's 7 months late.
May 7, 2007 1:43:24 AM

Damn - 7 months late on performance for games that aren't even out yet.


What were they fucking thinking?
May 7, 2007 1:59:00 AM

Quote:
Damn - 7 months late on performance for games that aren't even out yet.


What were they ****** thinking?


So you'd rather have a 7900/1950 vs an 8800....

Just because they support DX10 doesn't make it the only reason to buy one.
May 7, 2007 2:02:45 AM

Quote:
So you'd rather have a 7900/1950 vs a 8800....

Just because they support dx10 doesnt make it the only reason to buy one.

Actually yes, I would.

If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitor's refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.
May 7, 2007 2:21:26 AM

Quote:
So you'd rather have a 7900/1950 vs a 8800....

Just because they support dx10 doesnt make it the only reason to buy one.

Actually yes, I would.

If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitors refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.

Agreed.
May 7, 2007 3:37:56 AM

Agreed here too. The refresh rate of my LCD nullifies anything over 60 or 80 Hz (can't remember which). Why buy now, when DX10 still hasn't shown us much yet? I will judge the available cards when we have predominantly DX10 games. It is entirely possible the cards we see now won't be worth a poop when DX10 takes hold and the new cards from ATI and NVIDIA are launched (there will be another launch before we have more than 1 or 2 DX10 games).
May 7, 2007 4:16:57 AM

Quote:
So you'd rather have a 7900/1950 vs a 8800....

Just because they support dx10 doesnt make it the only reason to buy one.

Actually yes, I would.

If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitors refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.

Amen.
May 7, 2007 9:00:09 AM

Seriously though, what is wrong with that dude at Fudzilla? Does he have a major gripe with ATI? Seems like a right whiny toad :) 
May 7, 2007 9:21:45 AM

He just wants hits. With a card that runs at 100C and uses 750W of power, I guess a 5% increase isn't too much to hope for.....*irony*
May 7, 2007 9:41:00 AM

To the above poster - wtf?
Also, 5% isn't that bad a gain; it would presumably be that much across the board, and then you'd also have that 5%+ on top of your overclocking. Also, what do you mean? The X18xx series ran at 90°C, so what's the big problem with another 10°C? Plus, I think that 100°C figure is with a huge vGPU core voltage - given the mentioned overclockability, I would assume so.
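As a rough back-of-the-envelope, the gains multiply rather than add (a hypothetical sketch; the 8% overclock figure is invented for illustration):

```python
# A 5% driver gain stacks multiplicatively with an overclocking gain.
base_fps = 60.0     # assumed baseline frame rate
driver_gain = 1.05  # the rumoured 5% from the new drivers
oc_gain = 1.08      # hypothetical 8% from overclocking

combined = base_fps * driver_gain * oc_gain
print(round(combined, 1))  # → 68.0, a ~13.4% total uplift over 60 fps
```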
May 7, 2007 10:46:16 AM

Quote:
If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitors refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.


I totally agree with that statement.
May 7, 2007 11:23:56 AM

It's a joke. Fudzilla has been posting lots of ridiculous info lately. Apparently the 100C was with no case flow, with good airflow it only gets to 82C, or so he now claims. He also claimed it needed a 750W PSU, but it turned out later that you'd be fine with a 500W. His cred is pretty much shot. I'm gonna sit and twiddle my thumbs until I see some official benches.
May 7, 2007 12:27:38 PM

Quote:
So you'd rather have a 7900/1950 vs a 8800....

Just because they support dx10 doesnt make it the only reason to buy one.

Actually yes, I would.

If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitors refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.

Finally, someone who talks sense. I've been pretty much saying the same thing... only you said it better :lol:  :roll:

Obviously if you have money to spend, then by all means, stay at the top of the chain.

Also, just because a card supports DX10 doesn't mean it does it very well.

Sure, it's a high-end card, but who knows how it will run on DX10.

So for at least a year, I'd still go for the bang-for-the-buck DX9 cards.
May 7, 2007 5:47:21 PM

His credibility has been shot for a long time.
May 7, 2007 6:13:00 PM

I believe it all depends on where you are in your upgrade cycle.

For people with high-end last-gen DX9 cards, it makes little sense to upgrade to unproven DX10 cards, therefore it's not bad that ATI is behind.

For others who are running older cards (cough - x850xt - cough) and have been planning to upgrade, they would be foolish not to go for the 8800s, therefore ATI screwed up by being so late to the game.
May 7, 2007 6:38:15 PM

I'd agree, except at this very point in time there's little point in rushing to buy a GF8800 when the HD2900s are about to launch, which will give people more options and most likely lower prices in the GF8800 range.

The best time to simply buy the GF8800 was Nov-March; more recently, it makes sense to wait another week.
May 8, 2007 4:01:17 AM

Yeah, this crap actually makes me happy that Halo 2 was delayed until the 22nd... gives me time to wait for the X2900XT vs 8 series comparison so I can evaluate the results myself and stop relying on all this rumor BS. Whether it is BS or not remains to be seen, but that is how I'm taking it until I see more legitimate sources.

Hmmmmm, DX10... this should be fun. No bitching about Vista - I can get it for $5 and then run it on an entirely separate drive.
May 8, 2007 5:42:20 AM

Yeah, and really I just want them to get the mobile solutions out, 'cause dang it, I want a new laptop.

Gimme this laptop with a nice new GFGO8600/HD2600 in an HP or Gateway form factor and I'll be happy:

http://www.notebookreview.com/default.asp?newsID=3427

And hey, if they wanna add the LED backlight mentioned below it in HP, and the Samsung SSD, sure, why not. 8)

So I'm kinda happy that I'm not missing out on Crysis, but I'm still ticked waiting for these new laptops, some of which seem to be getting held up by the graphics and chipset sides of the equation. Santa Rosa should be out this week; now all we need is the mobile graphics.
May 8, 2007 6:37:36 AM

Every bit of free performance is a good gain for consumers. Go AMD, don't stop here, gain more performance for us. lol
May 8, 2007 6:40:52 AM

Quote:
So you'd rather have a 7900/1950 vs a 8800....

Just because they support dx10 doesnt make it the only reason to buy one.

Actually yes, I would.

If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitors refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.
You can see the difference at 1920x1200.
May 8, 2007 7:45:55 AM

what a bunch of jerks. 8O
May 8, 2007 8:10:57 AM

Quote:
So you'd rather have a 7900/1950 vs a 8800....

Just because they support dx10 doesnt make it the only reason to buy one.

Actually yes, I would.

If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitors refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.


I can name more than a handful of games that will run below your monitor's refresh rate using an X1900XT :roll:

Seriously, just STFU already, FANBOY. :roll:

Yes, I'm the fanboy. You got me there.

I buy cards when they need replacing for performance, and I buy whatever is the best card within my price range at the time. The X1900XTX I bought last May; it was the best available in the £300 price range.

My card still runs the games *I* want to play at 50+ FPS. The only games that fall short of that are Supreme Commander and Rainbow Six: Vegas.

Both run at 35+ FPS, which is plenty.

The first game that I may have to make sacrifices on is Crysis, and even then that's 6+ months away. I wasn't talking about DX10 performance, I was talking about performance in general. What's the point of having 20 more frames than you can see? None.

As for everyone spouting off the old "Yeah, it's a great card, until you run at 29838284090x19389231321, then you can tell the difference" - that's true, but if it's not applicable to the people with that card, what difference does it make?

Like I said before, wait until you actually NEED the performance before buying the card; that applies to both nVidia AND ATi offerings.
May 8, 2007 8:54:30 AM

lol, and how many % performance did nVidia gain with its "Ultra", just to diss ATI?
May 8, 2007 9:55:39 AM

I just got a new laptop back in early March with a 7600go... it performs... adequately, since I have a main gaming rig. A 7600go is a little weak for driving a 16x1080 screen. It looks great and is quite zippy; all around not too bad for $1500. Mobile DX10 would have been great to have with it, but I needed it for grad school, which started in March... so, damn timing.

*Starts twiddling thumbs until May 14th*

*looks at the clock*

May 8th, 2007

Damn it... where is my DeLorean? I need to go to the future.
May 8, 2007 4:00:05 PM

Quote:
So you'd rather have a 7900/1950 vs a 8800....

Just because they support dx10 doesnt make it the only reason to buy one.

Actually yes, I would.

If we're talking about someone handing out 8800 GTXs on the end of the street, then hell yes, I'd bag one.

But if we're talking about bang for your buck? I'd rather keep my X1900 XTX, at least until games are out that actually run below my monitors refresh rate. Now, in a year or so, I will actually care who has the best card, be it Nvidia or ATi - point being, frames don't matter if you can't see the difference.


I can name more than a handful of games that will run below your monitor's refresh rate using an X1900XT :roll:

Seriously, just STFU already, FANBOY. :roll:

You're talking about the X1900XT and he's talking about the X1900XTX, and yes, there is a difference. But really, if you owned an X1900XTX you would need a lot of money to justify the small jump in power from buying an 8800 GTX.
May 8, 2007 4:35:48 PM

Personally, I don't think buying an 8800 was a waste of money like many have implied. I had a 1600xt and moved up to a 1600x1200 monitor 4 months ago. The 8800gts 320 was way cheaper than a 1950xtx and has some (speculative) future-proofing. There are some of us for whom this was the best option. I would not have bought a 1950xtx for $100 more and been stuck with DX9 at the time, nor would I now. Just my opinion, though.
May 8, 2007 4:45:42 PM

I completely agree. If you can afford a better card and want one, then it's great to go get one. There are ALWAYS games that cannot be run maxed out at moderately high res without occasional slowdowns, even on the best cards, so you can always improve your entertainment value with a new card. Most are just jealous that they don't have those new cards, because, yes, sometimes the small increase doesn't justify the $$ you have to spend. But don't beat down a guy that has the money to do it and really wants it. Who doesn't want a newer vehicle when an older one (with some rust on it) is perfectly good for what you need to do? Everyone wants nice, new stuff. Don't hate!
May 8, 2007 4:52:45 PM

60-62 fps is what they theorize the human eye operates at, so anything above that becomes indistinguishable, until you start applying anti-aliasing effects. The man is right, though: there is no need to buy the top-of-the-line GPU until you NEED that much power. Then some people have superiority complexes that make them run out and buy two 8800s to SLI, for absolutely no reason.

When or how you upgrade your GPU is entirely up to you; I happen to agree with DarkKnight. Everyone has their own opinion as to how or when to upgrade, and as long as you can justify to yourself that it's worth the money, then so be it. But there's no need to come here and call someone a fanboy or a f**ktard because they bought an 8800, or are still using an x1900 because the games they play don't justify a DX10 card or the price to own one.
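The refresh-rate argument above can be sketched as a simple cap: frames rendered beyond the display's refresh are never shown (a crude model with made-up numbers, ignoring tearing and frame pacing):

```python
def visible_fps(rendered_fps: float, refresh_hz: float) -> float:
    """Frames rendered beyond the monitor's refresh rate are never displayed."""
    return min(rendered_fps, refresh_hz)

# Two cards pushing very different frame rates look the same on a 60 Hz LCD
# (both figures are hypothetical):
print(visible_fps(90, 60))   # → 60
print(visible_fps(140, 60))  # → 60
```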
May 8, 2007 5:13:10 PM

Now I'm jealous. I almost wet myself every time I think of all that tasty graphics power. Am I a geek? Why yes, yes I am.
May 8, 2007 5:28:12 PM

To be honest, if my budget allowed I would do exactly the same thing. However, it does not. I am glad that you got what you wanted, and I do not disrespect those that cannot or do not wish to do the same.
May 8, 2007 5:51:01 PM

Quote:

And then some people are just jealous bitches who have nothing better to do than to bash those that can afford such luxuries. :roll:


I really do not owe you an explanation as to why I went SLI, but I pretty much like the fact that games such as Oblivion, R6 and Call of Juarez went from averaging 35-40 fps with all settings maxed out plus 16XAA and 16XAF, to never dropping below 60 fps.


I like my games to be perfectly smooth and do not like to sacrifice any eye candy. If I can afford to make that happen, then it's my own business.


When Crysis comes out, if I cannot get the same type of performance, do not even think for a minute that I will not toss these 2 cards out and go for the next best thing.


I'm not bashing you for having SLI, or even for having an 8800; selective quoting is lame, and if you'd read the bottom part of my original post, I said we all have our own reasons for the purchases we make, so long as we can justify the reasoning to ourselves. You justified having 8800s in SLI... good for you... 90% of the market can't justify or afford it, so don't look down on those who are happy with 30 fps on weaker cards, and I'm not asking you to justify your purchases to me either.

But SLI is unnecessary unless you run 1920x1200 and greater and use all the different anti-aliasing effects, which it seems you obviously do. At 1600x1200, one top-of-the-line card will suffice.
May 8, 2007 6:01:30 PM

Quote:


Oblivion hits the X1900XT even harder when you crank up all the in game settings to 100% max and use AA+AF with HDR.

For me Oblivion alone was worth upgrading to the 8800GTX as it runs that game about 3 times faster than a X1900XT and thats with all in game options on at 100% max plus using 16XAA and 16XAF at 1680X1050.



I play Oblivion at 1680x1050 with everything at more than 100% (custom INI), with upgraded textures, a little AA + HDR and plenty of AF, without any problem. I'd play at 1920x1200 but I had some fps problems in combat. With a single core and an x1900XTX. It's the only game I don't play (I could 90% of the time) at less than 1920x1200 at max.

If I use standard Oblivion, frame rates almost double (no AA + HDR).

Of course, as you are the one who knows everything, I wonder... is there any trick to play AA + HDR on an nVidia card, or are you talking BS, comparing AA+Bloom with AA+HDR?

Anyway, if I can play one of the most stressful games perfectly with my current card, I'd be a moron or a loaded man if I bought a better one... Are you a loaded man?

And yes, in many games, 20-60-1234000 fps is the same. I'd say in most games (except FPS games).

You're either a fanboy troll, a moron or a very rich man who doesn't know that most people don't spend $600+ on a card that offers little more than an e-penis upgrade.

The fact is: if you have a good card (7900/X1900), the best option is TO WAIT until you find a game you can't play (and want to). And when you reach that point, you should buy the best option perf/money-wise. And whether the better one is 1 month old and the worse one is 10 years old, it's the same. You're not a stockholder and you don't give a crap about when a card was launched. You want the best for your money when you buy. And that's it.
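That "best option perf/money-wise" rule is easy to sketch as a frames-per-currency ranking. The card names are real, but the prices and fps figures below are invented purely for illustration:

```python
# Rank cards by average fps per unit of money (all numbers hypothetical).
cards = {
    "X1950 XTX":    (250, 55),  # (price, average fps in your target game)
    "8800 GTS 320": (300, 70),
    "8800 GTX":     (550, 95),
}

best = max(cards, key=lambda name: cards[name][1] / cards[name][0])
print(best)  # → 8800 GTS 320 (70/300 ≈ 0.23 fps per unit, the best ratio here)
```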
May 8, 2007 8:23:55 PM

Quote:
Of course, as you are who knows everything I wonder... is there any trick to play AA +HDR in a nVidia card or you're talking BS, comparing AA+Bloom with AA+HDR?


8 series removed that limitation. Prior to the 8 series, you are right, nVidia couldn't do AA + HDR.

Quote:
The fact is: if you have a good card (7900-X1900), the best option is TO WAIT until you find a game you can't play (and you want). And when you reach that point, you should buy the best option perf/money wise. And if the better one is 1 month old and the worst is 10 years old, it's the same. You're not an stockholder and you give a crap about when a card was launched. You want the best for your money when you buy. And that's it.


Well, that is sort of the prototypical situation, but not how everyone rolls. That is how mainstream buyers generally work, but not how early adopters/enthusiasts work. The high-end cards are targeted at early adopters who are willing to spend that much money and want the latest and greatest cards, in general.

Also, BF2142 is pretty taxing - not in general, but when the damn walker comes after you creating dust/debris and then someone tries to blow you up with an RDX pack... my 7900GTX starts to choke. 16x12 w/o AA. :? FPS games generally need more FPS to maintain playability, so if you are an FPS gamer you will likely have to either 1) upgrade more often or 2) bring down the settings.
May 8, 2007 8:58:45 PM

Quote:
Of course, as you are who knows everything I wonder... is there any trick to play AA +HDR in a nVidia card or you're talking BS, comparing AA+Bloom with AA+HDR?


8 series removed that limitation. Prior to the 8 series, you are right, nVidia couldn't do AA + HDR.

I know the 8 series removed the hardware limitation. My question is: are there drivers that activate that option in Oblivion? Oblivion doesn't allow AA+HDR; you need some driver trick to activate it, and I don't know of any for nVidia cards...
May 8, 2007 11:58:06 PM

Quote:

I know that 8 series removed the hardware limitation. My question is: Are there drivers that activate that option in Oblivion? Oblivion doesn't allow AA+HDR, you need some driver trick to activate it and I don't know of any for nVidia cards...

That I do not know; I have zero experience with Oblivion... not much of an RPG fan, but that's another topic. 8)
May 9, 2007 12:55:31 AM

I never meant to imply that those of you currently using the GeForce 8 series were overkilling it. Hell, I don't know if you went from a Riva TNT2 to an 8800 GTX. That's a big performance gain, and of course, totally worth it.

My point wasn't that the 8800 wasn't better than the X19##/79## cards; it was originally about the people crying "OH MY GOD, IT'S MONTHS LATE". If the HD 2900 XT ends up being 10% faster, it doesn't make it any less of an upgrade simply because the GeForce 8 series came first. There simply isn't enough to warrant moving from an X19##/79## yet. If you have the money and want to do it, that's fine; I never meant to suggest otherwise.

Until the REAL next-generation games that need that kind of performance are out, upgrading from the top tier of old graphics cards just doesn't seem necessary at this point.
May 9, 2007 3:19:56 AM

Quote:
Now I'm jealous. I all most wet myself everytime I think of all that tasty graphics power. Am I a geek? Why yes, yes I am.



Then you will love this 8)

Ok, now I'm jealous. I want that setup!!! :cry:

At least I have the case already. 8)
May 9, 2007 3:30:06 AM

Quote:
Even if true, so what? After 5 months late, they should not need to dig and scrape for a few percentage points.


I vouch for this dude ^^, he's totally right, 100%.
May 9, 2007 3:40:19 AM

Quote:

I know that 8 series removed the hardware limitation. My question is: Are there drivers that activate that option in Oblivion? Oblivion doesn't allow AA+HDR, you need some driver trick to activate it and I don't know of any for nVidia cards...
You just use nVidia control panel as shown below, no hacks needed:
May 9, 2007 3:55:35 AM

SLINROB:
You're right, I don't have much SLI experience. I did have two 6800s in SLI when they came out, didn't see much from it, and haven't tried it since.

P.S. You have a nice rig - what case is that?
May 9, 2007 6:48:16 AM

Quote:

I play Oblivion at 1680x1050 with everything at more than 100% (custom INI), with upgraded textures, a little AA + HDR and plenty of AF, without any problem.


Then you are either lying about your settings or have a high tolerance for FPS, which will often be in the teens with those settings and an X1900XTX :roll:

Maybe I have a high tolerance for FPS in some games, or maybe I've tweaked Oblivion better than you - who knows. What I know is that I play Oblivion at better PQ than the original at max, without spending the price of an Xbox 360, a PS3 and a Wii to do it...

Ah, in Oblivion my tolerance is in the 20s. It's not HL, I'm afraid. But in all settings except the woods, my card is in the 60s (I think the game is capped at 60). You spend about 95% of the game in interiors (capped at 60), and you can play that other 5% of the game fine in the 20s. And my experience with that game is that my bottleneck is the CPU (fps scale linearly with MHz).


Quote:
Even At 1280X1024 running the game 100% maxed out with all sliders and shadow options on full plus AA and AF with HDR it would often get very very sluggish in the forest dropping into the mid to high 20's :roll:

You claiming that you are running these settings with the addition of ini tweaks to make it look even better plus at a higher res is complete and utter BS unless like I said you have a high tolerance for a mess of a slide show. :lol: 


And this is all I wanted to know. You can play Oblivion at max, and the minimum fps you see is 25-30. WOW! Did you know that all theater movies run at 24 fps? IS THAT a mess of a slide show?????? I'm afraid you're a clear case of e-penis complex :-D. I don't say that mid-30s is OK for all games, but for most, yes it is.

You are a rich boy who likes to come to forums to say he has the best. And it's OK. But please, as I earn my money by working, don't expect me to waste it. And yes, I could buy a magnificent 8800GTX if I wanted... but right now, there is no point in doing it.


Quote:


If I use standard Oblivion, frame rates almost double (no AA + HDR).


Horse sh1t :roll: going from 4XAA to no AA nets around 10fps more at the most. It doesnt double. :roll:

Sorry, I meant no HDR+AA. And that was at 1920x1200, so the fps were in the mid-to-high teens. I don't know about HDR + no AA; I didn't test it.

Quote:


Of course, as you are who knows everything I wonder... is there any trick to play AA +HDR in a nVidia card or you're talking BS, comparing AA+Bloom with AA+HDR?


:lol:  :lol:  :lol:  at the noob.

OK, now I'm sure. You're a rich kid... Should I be looking for a trick like that for nVidia when I'm using an ATI card (the first ATI of my life, btw)? If there's one, you should say so. If not... :-D

Bah, forget it. Maybe I could stand fps in the teens, but I can't stand rich kids in their teens...
May 9, 2007 6:55:01 AM

Thanks!!! I didn't know. :-)

Of course, the best thing right now is C2D + GF8800. But if you have an A64 + X1900 (or 7900), and you don't like wasting money, IMHO, you should wait a little more.

It's funny, I used to be sort of an nVidia fanboy; now I'm an atheist. :-D


Quote:
You just use nVidia control panel as shown below, no hacks needed:
May 9, 2007 8:27:22 AM

Come on, I'm sure a 3117 kid like yourself can think of something better...

Anyway, I'd rather have you trying to offend me than spreading elitist BS and making novices spend more money than they need...


Quote:
LMAO at rechicero :lol:  The most interesting internet clown I have yet to meet. :lol:  :lol:  :lol: 
May 9, 2007 3:29:57 PM

Quote:
Ok, Now I'm Jealous. I want that set up!!! :cry: 

At least I have the case already. 8)

It's a great case, isn't it? 8)

Yes it is - strong, and with a lot of mod possibilities.