
7900 GTX or X1900XTX?

Tags:
  • Graphics Cards
  • Gtx
  • Games
  • Graphics
July 11, 2006 6:48:18 PM

It would appear that the X1900XTX has 30,000 Shader Operations/s because it has 48 pipes, while the 7900 GTX has 17,000. But that doesn't matter because it's overkill, meaning no game uses the full 30,000 or even 20,000. The other thing is that the 7900 GTX has a texture fill rate of 17,000M Texels/s while the X1900XTX has only 10,000M Texels/s.
So, from this I gather that having 1.7x the texture fill rate makes the 7900 GTX better in most games by a small margin.
Since they both cost the same, I don't know which one to get. I've listed the only differences between them; they both have 512MB of memory, the same bandwidth, etc.
So, what do you think?
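
For reference, spec-sheet figures like these come from multiplying the core clock by the number of texture or shader units. Below is a minimal sketch of that arithmetic, assuming the commonly quoted launch clocks and unit counts for both cards; treat the inputs as approximations from memory, not official numbers.

Code:
# Rough per-clock throughput model behind spec-sheet fill rate / shader op numbers.
# Clocks and unit counts below are the commonly cited launch specs (approximate).

def mtexels_per_sec(core_mhz, texture_units):
    # Peak texture fill rate in MTexels/s: one texel per TMU per clock.
    return core_mhz * texture_units

def mshader_ops_per_sec(core_mhz, shader_units):
    # Peak shader throughput in M ops/s: one op per shader unit per clock.
    return core_mhz * shader_units

cards = {
    "7900 GTX":  {"core_mhz": 650, "tmus": 24, "shader_units": 24},
    "X1900 XTX": {"core_mhz": 650, "tmus": 16, "shader_units": 48},
}

for name, c in cards.items():
    fill = mtexels_per_sec(c["core_mhz"], c["tmus"])
    ops = mshader_ops_per_sec(c["core_mhz"], c["shader_units"])
    print(f"{name}: ~{fill} MTexels/s fill rate, ~{ops}M shader ops/s")

With those assumed specs this gives roughly 15,600 vs. 10,400 MTexels/s, which is where the ~1.7x fill-rate gap comes from; actual shader-op figures also depend on how many ops each unit does per clock, so the marketing numbers vary by source.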


July 11, 2006 7:05:47 PM

The X1900 series is superior to nVidia in graphics quality within gaming...
Performance is a mixed bag, with some games going to one, others to the other...
Quality, though, is much more clear-cut...
July 11, 2006 7:59:09 PM

Quote:
The X1900 series is superior to nVidia in graphics quality within gaming...
Performance is a mixed bag, with some games going to one, others to the other...
Quality, though, is much more clear-cut...


Could you please explain what you mean by "quality"? I see no difference in what each company's chips can render in games.
July 11, 2006 8:25:19 PM

When HDR and AA are on, the graphics end up looking nicer.

:roll:
July 11, 2006 8:49:34 PM

Instead of comparing specs, how about you tell us the games you plan on playing.

Doom3-engine-based games (Doom3, Quake 4, Prey, Enemy Territory: Quake Wars) usually perform better on the 7900GTX.

F.E.A.R. is usually pretty mixed; both ATI and Nvidia cards perform pretty much the same.

Performance-wise, Oblivion is pretty similar on the 7900GTX vs. the X1900XT, but the X1900 has a serious advantage with OpenEXR HDR and antialiasing support.

Far Cry's performance is also similar on both cards, but ATI still has an advantage with HDR and antialiasing support.

Battlefield 2 is pretty mixed; both cards will perform fine.

Source engine games perform about the same with either card.
July 11, 2006 9:11:41 PM

Quote:

Could you please explain what you mean by "quality"? I see no difference in what each company's chips can render in games.


In addition to the HDR + AA issue, Ati's AF algorithm is less angle-dependent than Nvidia's, and their AA is considered slightly better.

Also, Nvidia and Ati cards will render colors differently in many titles. It's a subjective point, but many people feel that Ati's methods are superior.

At the end of the day though, I think they both do a good and passable job. Nvidia has come a long way from the GeForce FX image-butchering fiascos of the past.
July 11, 2006 9:44:06 PM

It pretty much comes down to whether you prefer Nvidia or ATI, and the X1900 will have better image quality.
July 11, 2006 11:21:20 PM

Quote:
Instead of comparing specs, how about you tell us the games you plan on playing.


Games shmames, I like BIG numbers! :twisted:

I'd agree with most of your assessment, except the following;

Quote:
Performance-wise, Oblivion is pretty similar on the 7900GTX vs. the X1900XT, but the X1900 has a serious advantage with OpenEXR HDR and antialiasing support.


While their averages are similar, the GTX bottoms out way lower in minimum fps, so, as most reviews found, the Ati series is more playable than its matched competitor in Oblivion because of this.
http://www.xbitlabs.com/articles/video/display/asus-eax...

Anywhoo, for anyone who cares to take the time, there are a lot of game benchmarks in that review to give a great overall picture of the situation.

However I say go with the one that has the coolest BOX! :twisted:
July 12, 2006 12:25:39 AM

I am imagining a few of those ATI girls have pretty hot boxes! :wink:
July 12, 2006 3:21:33 AM

Well, I was in the same spot you were in a little while ago; I was upgrading from two 7800 GTs in SLI. So I asked around and looked for reviews online. I decided to go with the X1900 because I heard it had much better image quality and performance was similar. I saw MUCH better quality in my games! Colors, in my opinion, look amazingly well saturated, bright, and just purely wonderful! AA+HDR looks awesome! I did notice that the AA is also smoother and nicer, especially in HL2. You can also enable high-quality AF, which looks very nice but takes a pretty big performance hit, so I would only use it in CF mode. Also, if you look at the scores in games, though the 7900 usually performs better in older game titles, the X1900 begins to take over in newer games because of the demand for pixel shaders. It was a very smart move by ATI to favor more shader units than texture units, though at least 20 pipelines would have been nice, not to mention probably MUCH better benchmarks. I am almost sure that the X1900 will dominate in upcoming titles like UT 2007 and Crysis! Not to mention that the next gen of GPUs probably won't come out until late 2006 or early 2007. Games WILL get more demanding, and I know the X1900 will be king until the next GPUs.
July 12, 2006 5:24:06 AM

Quote:
I am imagining a few of those ATI girls have pretty hot boxes! :wink:


I didn't read the whole thread, but someone should suggest to ATI that they start putting the middle chick on their boxes rather than a computer-generated Ruby. Their sales would skyrocket :lol: :lol:
July 12, 2006 6:52:50 AM

OK, I can see that many people think ATI's picture quality is better than Nvidia's, and I think I might see a small difference when comparing some pics online, but to me it doesn't matter, because I'm upgrading from an FX 5200, so I won't notice any AA or AF or HDR changes; I've never even seen HDR in action.
I plan to play every game possible, mostly FPSs and RTSs, and every game I've missed since, oh... 2004 because of my crappy card.

I know that the X1900XTX has twice as many shaders as the 7900GTX, but that's useless, because all you need is the amount the 7900GTX has, so basically ATI has no advantage in current games, and the GTX has almost twice the texel throughput and texture fill rate, which is a big advantage.
So I just don't know! The GTX performs better in games... but the XTX has some interesting advantages.

According to my calculations, Windows Vista won't be here until February 2007. Now, I don't know when the DX10 cards are going to show up, but it could be in 2007 or in the fall of 2006... it depends how much the delay affects them. I think it will affect them, but I know many people think it won't.
I want to buy a card now so that I can play over the summer, since I'm going into my 12th year of high school and won't have much time for games, and also I've been waiting long enough. The only problem is that this card is going to have to last me at least a couple of years, and with the DX10 cards coming, not waiting seems dumb.
July 12, 2006 7:11:23 AM

Another thing to consider is the heat factor. If you can't really decide between the two, then consider how you cool your system. It is well known that the ATI card generates about as much heat as the sun (approximately 130 watts). The Nvidia card runs considerably cooler (around 90 watts). There is a substantial difference between the two in this aspect.
July 12, 2006 7:51:25 AM

Quote:
Another thing to consider is the heat factor. If you can't really decide between the two, then consider how you cool your system. It is well known that the ATI card generates about as much heat as the sun (approximately 130 watts). The Nvidia card runs considerably cooler (around 90 watts). There is a substantial difference between the two in this aspect.


I have a big fan... but why does cooling your system matter when your card has its own fan, which generates noise and not heat?
July 12, 2006 8:53:45 AM

I know that the stock cooler for the X1900XTX blows the air out, but I am not knowledgeable as to how effective it is at removing the heat generated by the GPU and the video card's RAM. A lot of enthusiasts don't use the stock cooler anyway and opt for something better from Zalman or Arctic Cooling.

130 watts of heat - that's a lot for the stock cooler to dissipate, and if you overclock or game a lot...

That's why I inquired about your system's cooling method. I am not trying to talk you out of either card. Truthfully, you can't go wrong just going by their strengths - both are excellent. I meant to give you another perspective to go by if you couldn't decide.

Actually, if you could hold off - the newer GPUs from ATI and Nvidia are going to be released before long. If you HAD to get a card now, why not opt for something with good quality but less expensive - say the 7900 GT - and wait? Nvidia's newest GPU should come out in September and ATI's before Christmas.
July 12, 2006 9:23:22 AM

I don't live in a place that allows me to buy two cards in the same year...

You're saying that the fans that come WITH the cards aren't good enough and I should opt for something better?

And truthfully, I don't know what's so great about the DX10 cards anyway... I mean, I can have the DX10 software on my X1900XTX, right? So what's the big deal?
July 12, 2006 9:46:59 AM

The stock coolers are OK as far as heatsink/fan combos are concerned, and you'll be fine with them, but a lot of people opt for better, especially after making such an expensive investment in a single component. You will only be able to use DX10 with Vista. The next-gen cards are going to be optimized for DX10, but either card will be fine.

If you aren't a hardball enthusiast - the kind who has to have the newest tech regardless of the investment or whether they need it or not - you will be fine with either card. When the next-gen cards come out later this year, I am quite sure their prices will drop.
July 12, 2006 10:04:30 AM

Quote:
The stock coolers are OK as far as heatsink/fan combos are concerned, and you'll be fine with them, but a lot of people opt for better, especially after making such an expensive investment in a single component. You will only be able to use DX10 with Vista. The next-gen cards are going to be optimized for DX10, but either card will be fine.

If you aren't a hardball enthusiast - the kind who has to have the newest tech regardless of the investment or whether they need it or not - you will be fine with either card. When the next-gen cards come out later this year, I am quite sure their prices will drop.


I really can't wait until October to buy a card; they should have thought about the summer before choosing a date.
But will there be a difference between DX10 cards with DX10 software and DX9 cards with DX10 software? Like, if I buy a similar-spec card from the 8 series or X2K series and use DX10 with it, will it be better than using a card with the same specs from the DX9 series running DX10?
July 12, 2006 10:11:03 AM

It's all about your needs and desires. When you are talking about performance issues with DX10 or DX9 and the upcoming GPUs, only the people who care about things like FPS (frames per second) and things like that in games are really going to notice.

Like I said, you'll do fine with either card - even when Vista and DX10 come out. I am going to be using an XFX 7900 GTX card until I make my next purchase, probably in the first quarter next year. I'm not worried about being able to play games, as I know my card will do just fine - and gaming isn't the only thing I'll be doing anyway.
July 12, 2006 5:02:37 PM

You came here asking a question but it seems pretty obvious you've already made up your mind before you asked it.

So buy the Geforce... and don't waste our time asking for advice when you choose to ignore it. :wink:
July 12, 2006 5:45:43 PM

Quote:
You came here asking a question but it seems pretty obvious you've already made up your mind before you asked it.

So buy the Geforce... and don't waste our time asking for advice when you choose to ignore it. :wink:


No, actually I'm probably going to get the X1900. :roll:

I just want to know if texture fill rate really matters all that much.
July 12, 2006 7:23:14 PM

Obviously you prefer Product X over Product Y, but if you buy Product X you'll always wonder whether Product Y would have been a better purchase.

Sometimes we doubt ourselves about whether we made the right decision, but we can only hope we did. You have given some information about the two products already, and both are competitive.

So when the big decision comes about where to spend that hard-earned cash, I can only give you insight; most guys have pretty much said everything already, and I don't own an NVIDIA card other than my old TNT2 - the rest were Radeon X-series.

It's all up to you and whatever you feel like. Good luck!
July 12, 2006 7:59:28 PM

Quote:

I just want to know if texture fill rate really matters all that much.


It matters for some games, but less and less for newer ones. The texture barrier isn't reached as quickly as the shader barrier, although if you play at 1024x768 then you won't reach either anyway.

I have to agree with Cleeve, though; all your statements have seemed to try to discount what people are telling you.

You say that the X1900 having twice as many shaders is nice but that all you need is what the GF7900 has. Obviously you don't know what you're talking about, as you're coming from the FX5200, so seriously, why make a statement like that if you have NO point of reference? Heck, if the X1900 or GF7900 had 2-4 times as many shader units, you could find a game situation where they could be used. Would having more ROPs have more impact? Sure, mainly because most older games aren't holding the systems back, so more ROPs would have a greater impact in older titles. However, newer titles like F.E.A.R., Oblivion, Prey, etc. are more shader-heavy and take advantage of ATi's and nV's uneven distribution of strength, so that you don't have the ROPs sitting empty in the future.

If you want to play future games, then take these statements to heart, as those are the areas that benefit the most; and even in older games, that's where extra features like HQAF can be useful.

Really, they are both nice cards, and either will do regardless of your choice. However, there is one that performs better on average, and that's usually what people here recommend - the best solution to the problem - and in this case it's the card they are mentioning the most. Move down a price range or two and the situation changes.

I see the resistance that Cleeve spoke of, and if you've overcome it yourself, all the better. If not, it's not like you're buying another damn FX, so it's not really that big a deal, and I doubt anyone is going to really try to convince you either way because it's not worth the hassle.
July 12, 2006 9:51:16 PM

Quote:

I just want to know if texture fill rate really matters all that much.


Sure it does, and so does shader power. Image quality matters too.

Neither card trumps the other so much that you'd regret it later. But there are a lot of IQ fans here, so we'd recommend the Ati in this case. Plus the X1900's shaders will likely come into play more in future titles that aren't so texture-limited.

Quote:
No, actually I'm probably going to get the X1900.


That's odd; every statement you have made discounts the X1900's shaders in favor of the 7900's texel-pushing power. :roll:

But whatever; like I said, either choice is fine. There are just a lot of IQ fans here, and that's what many of us have been basing our recommendations on. It's not like you'd buy either card and in a year be saying "I should have gotten the other one."
July 12, 2006 10:37:47 PM

If you are going to stay with one graphics card, then the X1900XTX has a slight edge, but if you have an SLI mobo and/or want dual graphics cards in the future, SLI owns Xfire right now, so a 7900GTX might be worth it.

At any rate, I would get a $150 7600GT to tide you over until DX10 cards come out.
July 12, 2006 10:53:41 PM

Quote:
SLI owns Xfire right now,


As mentioned many times in many threads before, neither OWNZ, but many would give the nod to SLi. However, there are still many cases where Xfire has a large advantage too, so overall they are pretty close, with a tip going to SLi at maybe a 55/45 split IMO, or less.
July 12, 2006 11:58:55 PM

Hi!
If you don't mind, I'm going to tell you my point of view.
I live in a country where most people who know how to speak English try to take advantage of the ones who don't :x . That leads the PC shops and auction sites to sell, e.g., a 7800 GTX 256MB for as much as 1100 USD 8O ; that's why I began to buy from the USA and Canada.
Why did I tell you this? Here, 1 USD is three pesos, which cost as much to earn as 1 USD does, so since I like new technologies very much, and since there are A LOT of problems here getting anything in stock (and even when they have it in stock, they will sell you a 300 USD card for 800 USD), you have to THINK and look at what you're going to buy. So my recommendations today are two 7900 GTs or two 7800 GTXs; you do SLI, and you will have the best of both worlds. While wiping the floor with all the games, you will have nearly the same performance in Oblivion; the only thing you will miss is HDR and AA together. But believe me, if you are upgrading from an awful FX, even one 6600 GT will do wonders for you!
Sorry if I bored you... let the flames come!!! :lol:
Max.
July 13, 2006 12:30:40 AM

Quote:
It would appear that the X1900XTX has 30,000 Shader Operations/s because it has 48 pipes, while the 7900 GTX has 17,000. But that doesn't matter because it's overkill, meaning no game uses the full 30,000 or even 20,000. The other thing is that the 7900 GTX has a texture fill rate of 17,000M Texels/s while the X1900XTX has only 10,000M Texels/s.
So, from this I gather that having 1.7x the texture fill rate makes the 7900 GTX better in most games by a small margin.
Since they both cost the same, I don't know which one to get. I've listed the only differences between them; they both have 512MB of memory, the same bandwidth, etc.
So, what do you think?


Some people will say that this or that is better all based on "benchmarks".

The truth is that BOTH cards will perform so well that the human eye cannot tell the difference... and following that concept, going SLI is not a gain either.

Both cards will provide all that the eye can really see and do so for years.

I myself use the 7900GT (any brand, as they all tend to OC to about the same speed) in the systems I build because of the lower power draw. This makes for less heat in the case, and all the parts last longer.

Z
July 13, 2006 12:31:01 AM

Or you could just get a better performing 7950GX2 for less.
July 13, 2006 1:41:07 AM

Quote:
Or you could just get a better performing 7950GX2 for less.


For less?
Less what?
Price...no.

Remember that a 7950 is basically two underclocked 7900GTs linked together to fit into one PCI-E slot.

When it comes down to what the human eye can detect, it's like Crossfire or SLI graphics: a lot of money for nothing.

One top-end gfx card (ATI or nVidia) is all a person can see, and all those benchmarks/reviews are just there to keep the writers and the companies paying them bringing in money for themselves.

Z
July 13, 2006 4:55:10 AM

Quote:

One top-end gfx card (ATI or nVidia) is all a person can see, and all those benchmarks/reviews are just there to keep the writers and the companies paying them bringing in money for themselves.

Z


I see your point, but you're missing the longevity angle.

It certainly doesn't matter if Card 'A' runs at 140 fps and Card 'B' runs at 90 fps.

But at high detail settings in advanced games, if card 'A' is running at 28 fps and card 'B' is running at 18 fps, you can be damn sure there's a helluva difference.

It makes sense to buy the best, given a choice... especially when it's a significant investment, like a video card is for a lot of people.
July 13, 2006 5:38:52 AM

Quote:

One top-end gfx card (ATI or nVidia) is all a person can see, and all those benchmarks/reviews are just there to keep the writers and the companies paying them bringing in money for themselves.

Z


I see your point, but you're missing the longevity angle.

It certainly doesn't matter if Card 'A' runs at 140 fps and Card 'B' runs at 90 fps.

But at high detail settings in advanced games, if card 'A' is running at 28 fps and card 'B' is running at 18 fps, you can be damn sure there's a helluva difference.

It makes sense to buy the best, given a choice... especially when it's a significant investment, like a video card is for a lot of people.

longevity angle?

Neither of those cards is going to run any... ANY... game that is out at 18-24 fps.

My stock 7900GT runs FEAR at an average of 127 fps and tops out at 308 on either an Opteron 146 or 148.

So as it stands right now, today... they both provide all that the human eye can really see.

Take the Crossfire/SLI concept using high-end cards.
Now, if you took 1,000 people and told them (well, lied to them!) that BOX "A" has one gfx card and BOX "B" has two gfx cards (of the same type), then a few people would say BOX "B" looked better.

But only because they THOUGHT it was a better system.

It would all be in their minds, though, because the human eye CAN'T see any difference.

Both systems are just that good right now, so it really just comes down to which one you personally would like to buy.

I build with the 7900GTs for lower power draw... no other reason.

Z

PS:
When it comes right down to it, my 5950 E with my 3200+ Barton STILL plays all the new games just fine. So much for that "longevity" everyone talks about.
I have to admit that the new systems I build with OC'd Opterons and OC'd RAM at DDR1-520 are so sweet!
July 13, 2006 6:01:33 AM

Quote:
longevity angle?

Neither of those cards is going to run any... ANY... game that is out at 18-24 fps.


Two things:

1. longevity indicates FUTURE. Today is not the future...

2. You haven't played Oblivion at high detail settings/resolution with HDR & AA enabled, have you? :wink:
July 13, 2006 6:08:33 AM

Quote:
longevity angle?

Neither of those cards is going to run any... ANY... game that is out at 18-24 fps.


Two things:

1. longevity indicates FUTURE. Today is not the future...

2. You haven't played Oblivion at high detail settings/resolution with HDR & AA enabled, have you? :wink:

Quote:
You haven't played Oblivion at high detail settings/resolution with HDR & AA enabled, have you?


Well, to be 100% honest, no, I haven't.
In fact, I passed up on that torrent file altogether. :wink:

Am I really missing anything?

Are you really saying that I and others would not fully enjoy it with the systems I now build, or even the older systems I built three years ago?
Are you trying to say that your A64 @ 2.6 GHz can't do the job?
Can my Opteron systems @ 3 and 3.2 GHz not do the job?

Should I go out and buy a CRAY computer to run that game?!

All kidding aside, buddy, if the systems we have can't handle it, then we both don't need to own computers and I sure as hell don't need to be building them... or more to the point, SOFTWARE DEVELOPERS should not be making products that current hardware can't run, which is not the case here.

Quote:
longevity indicates FUTURE. Today is not the future...


How long do you expect this often-stated "longevity" to last?
I read a lot of people saying today's top-end parts will be "outdated" in six months... don't upgrade your computer, go buy the XBOX!

Every now and then I really wonder how people who say such things can even manage to get on the internet.
July 13, 2006 6:54:41 AM

Quote:

All kidding aside, buddy, if the systems we have can't handle it, then we both don't need to own computers and I sure as hell don't need to be building them... or more to the point, SOFTWARE DEVELOPERS should not be making products that current hardware can't run....

Which is not the case here.


There's a difference between can't run at all, and can't run at high settings.

FartCry couldn't run at high settings when it launched; same with Chronicles of Riddick, F.E.A.R., and Oblivion - all required future hardware to start fully pushing the sliders. If early indications of Quad SLi mean anything, maxing out Oblivion won't happen until DX10 cards arrive, if then. But that's because you can turn things up and add to the realism, while at the same time it's possible to play it on low settings on older hardware. So you get the best of both worlds: a wide enough install base, and a game that still has benefits to offer a new, more powerful setup in the future.

Quote:
I myself use the 7900GT (any brand, as they all tend to OC to about the same speed) in the systems I build because of the lower power draw. This makes for less heat in the case, and all the parts last longer.


Except for the fact that the GF7900GT expels most of its heat back into the case, whereas the GTX expels 50% in, 50% out, and the X1900XT/XTX expels nearly 100% of the waste heat outside of the rig. So if you were looking to increase the longevity of other components due to heat, you'd want the one that helps remove the warm air from inside your case and helps promote increased cool airflow.

The best thing would be to add an Arctic Cooling NV Silencer 5 (or something equivalent) to that GT to make it as effective as the others.
July 13, 2006 7:11:59 AM

Quote:

All kidding aside, buddy, if the systems we have can't handle it, then we both don't need to own computers and I sure as hell don't need to be building them... or more to the point, SOFTWARE DEVELOPERS should not be making products that current hardware can't run....

Which is not the case here.


There's a difference between can't run at all, and can't run at high settings.

FartCry couldn't run at high settings when it launched; same with Chronicles of Riddick, F.E.A.R., and Oblivion - all required future hardware to start fully pushing the sliders. If early indications of Quad SLi mean anything, maxing out Oblivion won't happen until DX10 cards arrive, if then. But that's because you can turn things up and add to the realism, while at the same time it's possible to play it on low settings on older hardware. So you get the best of both worlds: a wide enough install base, and a game that still has benefits to offer a new, more powerful setup in the future.

I don't have Oblivion, but the other programs ran on high for me.

Did they not run for you?
Do you know anyone they would not run for... yourself?

Quote:
maxing out Oblivion won't happen until DX10 cards arrive


Well, DX10 will only come out with Vista, which is not an OS anyone can buy now. Are you trying to say that the software developers built this game to run on an OS and HARDWARE that NOBODY has or even makes, let alone is able to buy?

Myself, I think a lot of people with high post counts don't know what they are talking about.
They just repeat "non-facts" as though that's the way things really are in real life... because some other person incorrectly did.

Your going off topic about heat and cooling units for gfx cards does not help you make ANY point in this topic at all.
Just to point out, for your personal information... the 7900s in my builds never get over 42C in a 27C room with stock coolers, and the Opteron 146s @ 2.95GHz (still at 1.4 volts, by the way) never get over 40C under a full hour of 100% load during benchmarks using a Thermalright SI-120 cooler.
Perhaps you can't build a proper air-cooled computer any better than you can report facts?
July 13, 2006 7:30:08 AM

Quote:

I don't have Oblivion, but the other programs ran on high for me.

Did they not run for you?
Do you know anyone they would not run for... yourself?


When FartCry launched, no card had enough juice to run it maxed, period.
As for F.E.A.R., it wasn't until the X1900 series in Xfire that you could run 16x12 with 4xAA at above 30 fps average, and that's not even MAX, and even then it was slow on the X1900s.

So either you're confused about the settings or about the launch of the titles.

Quote:
Well, DX10 will only come out with Vista, which is not an OS anyone can buy now. Are you trying to say that the software developers built this game to run on an OS and HARDWARE that NOBODY has or even makes, let alone can buy?


Did I mention the OS? No! I said DX10 CARDS. And yes, developers often add features to games that will not be exposed until future hardware can properly support them; in fact, id was famous for doing that with the Doom/Quake series. And Matrox's SurroundView was supported in games long before the hardware shipped.

Quote:
Myself, I think a lot of people with high post counts don't know what they are talking about.


What, like your statements about the GF7900GT-512 not existing?
http://forumz.tomshardware.com/hardware/modules.php?nam...

Quote:
They just repeat "non-facts" as though that's the way things really are in real life... because some other person incorrectly did.


So is that how you operate? It would explain why you can't directly respond to Cleeve's statements and instead talk about gaming with CRAYs.

And considering you haven't used the most demanding game out there, I don't see how you can comment on whether or not current titles can overpower current PCs. Of course, if you only game at 1024x768, then I guess you never would overstress your PCs. :roll:
July 13, 2006 7:38:51 AM

Just face it...
There is NO DX10 or hardware to run it.
There is NO OS that supports DX10 at this point either.

When FEAR came out, I and a lot of others COULD run it on high, and we were all very happy, even if you yourself felt let down.

Just because you have some high post count and ramble on stating often off-topic concepts does not, and will not, ever make you or any other person doing the same correct.

Ever.

I rest my case.

Z
July 13, 2006 7:51:33 AM

Quote:
What, like your statements about the GF7900GT-512 not existing?
http://forumz.tomshardware.com/hardware/modules.php?nam...


You know what...it seems that I was WRONG.

Some company does make a 512MB 7900GT!!!

I have no idea why they would call it a GT when every other 512 MB 7900 card is called a GTX... but it looks like I just got sent to school this time.

Run me over and call me roadkill! :oops: 

PS: I think they should get the CIA/RCMP to look into that company, they must be up to something! :lol: 
July 13, 2006 8:19:33 AM

Quote:
Just face it...
There is NO DX10 or hardware to run it.


Actually both exist, just not in retail.

Wanna see a DX10 card? Here ya go;
http://www.hexus.net/content/item.php?item=6155

Quote:
There is NO OS that supports DX10 at this point either.


Sure there is;
http://msdn.microsoft.com/windowsvista/

And not only do developers get to monkey with it, but the public gets to play with Beta 2, so the OS isn't a concern either, as it's already there for developers.

Not that that has much relevance to building in features that will best be exposed by future cards with better branch handling, more shader power, and a better ability to assign that power. You can use software and CPU/VPU resources to emulate a lot of hardware features that will be supported in hardware by those future cards. ATi has already done that with an X1900 to demonstrate some of the efficiency of a unified design and better batch handling.

Remember, it wasn't about the current existence of those cards, but the fact that future hardware will be better able to handle some games, which has been demonstrated in the past as well.

Quote:
When FEAR came out, I and a lot of others COULD run it on high, and we were all very happy, even if you yourself felt let down.


Yeah, sure, it just makes me question your definition of high settings, or whether you were just high and it seemed like it ran well at max.

Quote:
Just because you have some high post count and ramble on stating often off-topic concepts does not, and will not, ever make you or any other person doing the same correct.


No, us posting the facts and being able to back them up will always make it correct, while your statements about what you say you can do, and what you think exists or doesn't, won't change that, regardless of how many posts you get or how much you want to moan about the number of posts other people have. So far you're hitting 0/4 in the facts department, and your subjective opinion of max settings and what developers can and can't do doesn't improve that standing.

Quote:
I rest my case.

Z


Just rest, it sounds like you need it, 'cause it's getting old, dude. :roll:
July 13, 2006 9:39:02 AM

My base argument for getting the 7900GTX (and no, I'm not sure yet, because I've been looking at Nvidia cards only for the past month and just now noticed that X1900XTX prices are about the same) is this:
7900GTX: 17,000M Texels
17,000 Shader Ops
X1900XTX: 10,000M Texels
30,000 Shader Ops
Meanwhile, the 7900GTX performs better in almost every game. This makes me think that, while the 7900GTX has 1.7x the texels and the X1900XTX has 2x the shaders, most games require about 20K shader ops and 15,000M texels... so the X1900XTX is overkill on the shaders and underkill on the texels, while the 7900GTX is just right for both, except it could use a little more shader power.
But now, if you say that texture fill is becoming less important (however that may be :? ) and shader operations are becoming more important in shader-heavy games, then I should get the X1900XTX, since I'm topped out at 1280x1024 anyway.

Also, I get that the DX10 cards will be better because they are newer, but what other advantages will they have in Vista? Because I can install the DX10 software on my DX9 card, so what's the difference between a DX9 card with DX10 software and a DX10 card with DX10 software?

Does anyone think it a wise idea to buy SLI/Xfire when I have a P4 3.0GHz Prescott 800MHz FSB w/HT CPU? It's top of its line, but it's not dual-core, which I've heard is better with SLI (two cores for two cards). Or would a DX10 card be OK with my CPU?

Thanks
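
A crude way to picture the "overkill/underkill" reasoning above is a bottleneck estimate: a frame is limited by whichever budget (texels or shader ops) runs out first. The sketch below uses the spec numbers quoted in this thread plus made-up per-frame demands; the workload figures are placeholders for illustration, not measurements from any real game.

Code:
# Crude bottleneck sketch: whichever budget (texels or shader ops) is exhausted
# first caps the frame rate. Per-frame demand figures are illustrative
# placeholders, not measurements from any real game.

def estimated_fps(card, mtexels_per_frame, mshader_ops_per_frame):
    # Frame rate limited by the scarcer of texture fill or shader throughput.
    texture_limited = card["mtexels_per_s"] / mtexels_per_frame
    shader_limited = card["mshader_ops_per_s"] / mshader_ops_per_frame
    return min(texture_limited, shader_limited)

cards = {
    "7900 GTX":  {"mtexels_per_s": 17000, "mshader_ops_per_s": 17000},
    "X1900 XTX": {"mtexels_per_s": 10000, "mshader_ops_per_s": 30000},
}

# Hypothetical workloads: an older texture-heavy title vs. a newer shader-heavy one.
workloads = {
    "texture-heavy game": {"texels": 200, "shader_ops": 100},
    "shader-heavy game":  {"texels": 100, "shader_ops": 400},
}

for game, w in workloads.items():
    for name, card in cards.items():
        fps = estimated_fps(card, w["texels"], w["shader_ops"])
        print(f"{game}: {name} ~{fps:.0f} fps (bottleneck estimate)")

With those placeholder demands, the GTX wins the texture-heavy case and the XTX wins the shader-heavy one, which is the trend several replies above describe; real games are nowhere near this simple, since memory bandwidth, ROPs, and the CPU all enter the picture.
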
July 13, 2006 1:44:56 PM

Quote:
Just rest, it sounds like you need it, 'cause it's getting old, dude

LOL, Amen; that was getting painful to read.

You know, people's idea of high settings varies. Oblivion playable on a 7800GS at 16x12 high with HDR ring a bell? But when someone makes a claim about max details, their room for weaseling out of the situation disappears. I remember a while ago someone claiming 12x10 MAX w/ HDR in Oblivion was smooth on a 7800GT, until I pointed out mine would die at those settings, and FS's 13 fps average supports that conclusion.

Anyway, I'll give that link again, as someone here needs an understanding that the mighty 7900GTs have their limits:

Firingsquad Oblivion Foliage 10x7 max with HDR averages 23.9 fps: http://www.firingsquad.com/hardware/oblivion_high-end_p...
Anandtech Oblivion Gate 12x7 High with HDR averages 21 fps: http://www.anandtech.com/video/showdoc.aspx?i=2746&p=4
July 13, 2006 2:22:37 PM

Quote:

All kidding aside, buddy, if the systems we have can't handle it, then we both don't need to own computers and I sure as hell don't need to be building them... or more to the point, SOFTWARE DEVELOPERS should not be making products that current hardware can't run, which is not the case here.


And there's where we disagree.

I'm personally for innovation, even if it hurts the pocketbook a bit.

If all devs made games only for current hardware, the industry would stagnate. We'd all be playing pong.

I'd prefer they push it so in ten to 20 years we have photorealism... sure it costs me a few hundred bucks every year or two, but if that's the goal then more power to 'em.
July 13, 2006 2:25:20 PM

Read the THG article about the Nvidia 7950 GX2, get the card... end of story 8)
July 13, 2006 6:38:52 PM

Quote:

Meanwhile, the 7900GTX performs better in almost every game.


According to what?

It sounds like you're reading some very select reviews. Overall, if anything, there is a slight advantage to the XTX when things get tight, but they're very close, with neither winning 'in almost every game';
http://www.xbitlabs.com/articles/video/display/gigabyte...

And really, that's to be expected, because when the workload is lighter there's no advantage to having more pixel shaders, just like there's no advantage to having more TMUs if neither is being stressed and they have to wait for the ROPs to 'empty'.

Quote:
Also, I get that the DX10 cards will be better because they are newer, but what other advantages will they have in Vista?


Both ATi's and nV's cards will have more flexibility in design (R600 completely unified, G80 a unified geometry/vertex package) and will have more processing units overall (but supposedly the same number of TMUs and ROPs). They will also be slightly faster in both core and memory speeds, and their ability to fully integrate with Vista's better branching and batch handling may be primarily a next-gen feature, though the X1900 might be able to handle at least the branching aspect (good branching skills now).

Quote:
Because I can install the DX10 software on my DX9 card, so what's the difference between a DX9 card with DX10 software and a DX10 card with DX10 software?


Your DX9 card will not take advantage of the full complement of DX10 features because it doesn't meet the hardware requirements. In fact, based on what M$ has stated, your DX9 card will run the intermediate step (DX9L-V/WGF1), which will support some additional features of DX10 but will be used to support legacy parts, and therefore will not receive full integration at the kernel level - no special advantage for the X1900 and GF7900.

http://www.elitebastards.com/cms/index.php?option=com_c...

Quote:
Does anyone think it a wise idea to buy SLI/Xfire when I have a P4 3.0GHz Prescott 800MHz FSB w/HT CPU? It's top of its line, but it's not dual-core, which I've heard is better with SLI (two cores for two cards). Or would a DX10 card be OK with my CPU?


I'd say we can't know for sure, but based on all the speculated specs of the R600/G80, as well as the benefits ATi demonstrated in their presentation, you'll likely get better performance out of an R600 or G80 than either a GTX in SLi or an XTX in Xfire. But once again, it will favour newer games more than older games, because relatively simplistic games (UT2K4, COD2, D3, etc.) require RAW grunt, where more ROPs will help because there's not a huge wait for multiple passes and longer shader ops. In fact, D3 should be interesting because some of its performance may change based on the new handling, especially of Z-cull; supposedly there are new tweaks that may make for interesting performance boosts.

Heck, a lot of this is speculation, but the trend towards more pixel shader ops is a reality. I think ATi may have jumped the gun on the ratio (3:1), but it's definitely the direction games are heading in.
July 13, 2006 7:32:08 PM

Well, according to every benchmark I've ever seen, except those based on very shader-heavy games, the GTX performs way better, sometimes even over 20 fps better.

So, would buying a DX10 card and using DX10 on it be worth it if the card isn't as good as a DX9 card? I don't know what this integration stuff is all about, but I'm not too afraid of it; I mean, ATI and Nvidia wouldn't be selling cards they know won't work well in the future, right? My main concern is that the DX10 cards will have far newer and very cool technology, e.g. PS 4.0, that will make me wish I'd waited a little longer.

I really hate DX10, and Nvidia and ATI for releasing it after the summer; I won't forget this!

I'm going to buy a "crippled" motherboard that has Socket 478 and PCI-E so that I can stick with my current CPU and run whatever PCI-E card I choose to buy. I've heard that, since Intel Core 2 is coming out on the 27th, many other Intel CPUs will be getting cheaper... that doesn't mean I'll have the money to get them, but still, maybe I should wait and get a new 775 motherboard.

Does anyone think I should wait for DX10, even though I don't think I have the money to buy Vista, nor do I think it's possible to have Vista and F.E.A.R. or any new game working at the same time...? It's either miss the summer and get a new system, or play in the summer, which is really important to me, and get an "old" card.
July 13, 2006 8:03:53 PM

Quote:
Well, according to every benchmark I've ever seen, except those based on very shader-heavy games, the GTX performs way better, sometimes even over 20 fps better.


Believe me, and heed what most of us are saying: those benchies you're talking about are the minority. These cards are very close; I'd give only a slim edge to ATi in FPS, and then another wedge for IQ.

Quote:
So, would buying a DX10 card and using DX10 on it be worth it if the card isn't as good as a DX9 card? I don't know what this integration stuff is all about, but I'm not too afraid of it; I mean, ATI and Nvidia wouldn't be selling cards they know won't work well in the future, right? My main concern is that the DX10 cards will have far newer and very cool technology, e.g. PS 4.0, that will make me wish I'd waited a little longer.


Well, it's not just DX10; we say DX10 cards, but really it's the G80 and R600 we mean when we say better than the current cards. The entry and mid-level DX10 cards likely won't perform any better, and will likely be worse. But based on the design of the R600/G80, they should outperform the current cards in both a DX9 and a DX10 environment, and they should get an additional boost when moving from DX9 to DX10.

Quote:
I really hate DX10, and Nvidia and ATI for releasing it after the summer; I won't forget this!


M$ releases DX; ATi and nV just try to anticipate when it's released so they can have products ready.

Quote:
Does anyone think I should wait for DX10, even though I don't think I have the money to buy Vista, nor do I think it's possible to have Vista and F.E.A.R. or any new game working at the same time...? It's either miss the summer and get a new system, or play in the summer, which is really important to me, and get an "old" card.


I don't think most people should wait when buying (edit: unless it's like a single month or a few weeks away). Remember, it's at least a 3-4 month wait for the first indication of hardware if they're on schedule.

What I would suggest is decide on the card you want now, buy it, and then when the new gear comes out sell your current card and upgrade, trying to get as much value for it as possible.

If you're talking about an SLi rig but have issues with prices, you may want to refocus your funds because SLi would be a waste if you need to put the money elsewhere.