
PhysX card or CrossFire?

February 10, 2007 5:30:57 PM

This may seem like a dumb question, but what would be the smarter option:
going the Ageia PhysX card route, or adding a second video card? Here is my setup:

*Intel Core 2 Duo E6600 Conroe 2.4GHz 4M shared L2 Cache LGA 775
*ASUS P5B Deluxe/WiFi-AP LGA 775 Intel P965 Express
*ARCTIC COOLING Freezer 7 Pro (cpu fan)
*Kingston HyperX 2GB (2 x 1GB) 240-Pin DDR2 SDRAM DDR2 800 (PC2 6400) -timing: 4-4-4-12 -voltage: 2.0v -cas latency: 4
*COOLMAX CW-650T EPS12V 650W Aluminum ATX v2.01
*Sapphire ATI x1900xt 512mb video card
*Cooler Master Centurion 5

Thanks in advance!


February 10, 2007 5:47:00 PM

Second video card.

While SLi/Xfire have limited benefit, it's still far more than that of the PPU.

And considering current performance, the X1900 gives about 4-10 times the processing power of the PhysX card.

If you feel you must, then get a second card.

But IMO, it'd be even more advantageous to simply sell your old card and buy a GF8800 or new R600, and then maybe add an X1300 if you want physics later. That option IMO is a better use of your money and likely a better overall gaming experience.
February 10, 2007 6:25:48 PM

I agree with The Great Grape Ape: a PhysX card won't do much for you at this point, and I don't think it works with all games anyway.
Again, as TGGA said, getting a newer video card might be better.
February 10, 2007 8:08:40 PM

Basically it's all about how they are used. If enough games eventually take advantage of it, then it's worth a buy; if UT2007 makes good use of it, I'm definitely getting one.
February 10, 2007 8:19:55 PM

Quote:
Basically it's all about how they are used. If enough games eventually take advantage of it, then it's worth a buy; if UT2007 makes good use of it, I'm definitely getting one.
Yeah, with UT3* supporting the PhysX card, it may have some benefit.
February 10, 2007 8:24:25 PM

The second card only works with games that have Havok FX (not just regular Havok), though, and I believe it's less accurate than PhysX. But not very many games support PhysX yet, and the ones that do don't really utilize it very well (particularly GRAW). If you go the PhysX route, you might actually LOSE performance, depending on how the game uses it: if it just offloads physics that can be done fine in software, you'll gain performance; if it adds all sorts of cloth, fluid, breaking, etc. simulations, you'll lose a few FPS. But performance isn't the main point of it; the point is highly realistic physics (and not just basic ragdoll and rigid-body physics).
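For anyone curious what "supporting the PhysX card" actually looks like to a developer: in Ageia's 2.x-era SDK, hardware vs. software simulation was a per-scene switch. A minimal sketch along those lines (based on the NxPhysicsSDK API of that era; exact names and fields varied between SDK releases, so treat this as illustrative, not gospel):

```cpp
// Illustrative PhysX 2.x-era scene setup; names follow the NxPhysicsSDK
// headers of the time, but exact signatures varied between releases.
#include <NxPhysics.h>

NxPhysicsSDK* gSDK   = 0;
NxScene*      gScene = 0;

bool initPhysics(bool preferHardware)
{
    gSDK = NxCreatePhysicsSDK(NX_PHYSICS_SDK_VERSION);
    if (!gSDK) return false;

    NxSceneDesc sceneDesc;
    sceneDesc.gravity = NxVec3(0.0f, -9.81f, 0.0f);

    // The key switch: run the scene on the PPU if one is present,
    // otherwise fall back to the software (CPU) simulation path.
    bool haveHW = (gSDK->getHWVersion() != NX_HW_VERSION_NONE);
    sceneDesc.simType = (preferHardware && haveHW) ? NX_SIMULATION_HW
                                                   : NX_SIMULATION_SW;

    gScene = gSDK->createScene(sceneDesc);
    return gScene != 0;
}
```

Since the software path always exists, a game can ship the same physics content either way; the PPU only changes how much of the "extra" simulation stays affordable.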
February 10, 2007 8:29:09 PM

Well, now that I have a new mobo that supports Xfire, it is tempting. I had only really been tempted to go the PhysX route because I didn't have Xfire. What second card would you guys recommend to go with my existing Sapphire ATI X1900XT 512MB? I have a budget of, say, $300.
February 10, 2007 8:34:01 PM

Second video card. There are not many games using the PhysX card.
February 10, 2007 9:06:31 PM

Second vid card, especially if your monitor is running higher resolutions.
February 10, 2007 9:16:42 PM

Yup, got a Dell 30" running at 2560x1600... not that I'm rubbing it in :D 
Any suggestions for adding a second video card? Kinda new to how the whole Xfire thing works. Can I just simply add, say, an X1300?
February 10, 2007 9:28:58 PM

Master card? Does that mean a specific model of card, i.e. an X1900 CrossFire? Is that a master card?
February 10, 2007 9:42:25 PM

OK, so does the model type matter: X1300, X1400, etc.? I'd go for an X1900 or better regardless, but just curious.
February 10, 2007 9:47:06 PM

My advice is to get nothing for now. Here's the deal:

The Ageia card is much more capable for raw physics calculations than any VGA. Ageia has very few games supporting it, but Havok FX has NOTHING.

Let me tell you this. As we all know, the UT Engine 3 is one of the most technically advanced graphics engines, but it's also very POPULAR. Several game developers have already purchased licenses to use the UT 3 engine. This means a wave of future game releases with support for Ageia's PPU.
Besides that, other popular game engines support it too. Developers nowadays try to write game code that is compatible across all platforms. Know that both the PS3 and Xbox 360 support Ageia's software API, which bodes well for Ageia's future game support.

There are great things coming because Ageia isn't the only PPU developer. There's also a company named AIseek that will soon release a PPU too.

Wait until UT2007. By that time a lot more games will feature support for physics. Also by then all the performance issues should be solved, and maybe a better revision of the card will be out using PCI-E.

Go with Xfire for Xfire, not for physics support, as there is practically no such thing. Just keep your money and wait on the PPU front.
February 10, 2007 9:53:26 PM

If it's not too much to ask, could you just stick it out and wait until ATi's R600 release? (Early March)
February 10, 2007 10:00:45 PM

Probably the best advice so far. Sell your X1900 for $200-250, keep the $300 you already have, and get an R600 card in April! :wink:
February 10, 2007 10:00:57 PM

I'm not in a huge hurry after buying a new mobo, RAM, PSU, CPU and a nice CPU fan. I can wait and save up for the beast (R600).
February 10, 2007 10:05:43 PM

Quote:
If you are running that monitor then it isn't even worth asking that question; you need another card. Hell, I felt the need with my lowly 24" Dell, so for you it is a must IMO.

Oh, and BTW, you can only add another card which uses the same core, if I'm not mistaken. You'll also need a master card, since you have the same kind of card as mine, I believe. Get the X1900 master card if you can find one.


I built my PC before I got my 24" Dell. I really wish I would've gotten an SLI-compatible mobo :cry: 
February 10, 2007 10:07:28 PM

Quote:
There are great things coming because Ageia isn't the only PPU developer. There's also a company named AIseek that will soon release a PPU too.
I don't see how that's a great thing without a common physics API. It'll just get really damned messy.
February 10, 2007 11:00:23 PM

Agreed on that, but let them hit the market and then maybe game developers will force Microsoft to release a common physics API for all of them. But still, it's great to have more than one player in the PPU market.
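To see why a common API would matter: with one, a game codes against a single interface and each accelerator vendor ships a backend; without one, developers hand-integrate every PPU separately. A rough sketch of the idea (all names here are invented for illustration; no such common physics API actually existed at the time):

```cpp
// Hypothetical vendor-neutral physics layer; invented names, not a real API.
#include <cstdio>
#include <memory>

struct PhysicsDevice {
    virtual ~PhysicsDevice() {}
    virtual void simulate(float dt) = 0;  // advance the world by dt seconds
};

// Each vendor would hide its hardware behind the same interface.
struct AgeiaPpuBackend : PhysicsDevice {
    void simulate(float dt) override { std::printf("PPU step %.4f s\n", dt); }
};
struct ShaderVpuBackend : PhysicsDevice {
    void simulate(float dt) override { std::printf("VPU step %.4f s\n", dt); }
};

// The game picks a backend once and never touches vendor code again.
std::unique_ptr<PhysicsDevice> createDevice(bool havePpu)
{
    if (havePpu) return std::make_unique<AgeiaPpuBackend>();
    return std::make_unique<ShaderVpuBackend>();
}
```

Without that shared layer, each engine has to pick sides up front, which is exactly the mess being described.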
February 11, 2007 7:31:08 AM

Quote:

The Ageia card is much more capable for raw physics calculations than any VGA.


Far from it; both ATi and nV can get more physics calculations out of their cards than the PhysX card. The advantage of the PhysX card is its level of interaction and game-dependent physics. But even this may change with the new VPUs, as nV has hinted that the GF8800 has greater ability to communicate with the host system, and even AMD has mentioned that as a benefit of the R600s.

A lot of the published info can be found in this good [H] article;
http://enthusiast.hardocp.com/article.html?art=MTA5Nywx...



Quote:
Ageia has very few games supporting it, but Havok FX has NOTHING.


As for a published game, UBi will be demoing their application of Havok4 (with HavokFX effects) in San Fran in March. As for Ageia's published titles, none are much of anything; even GRAW uses Havok physics with an Ageia add-on that essentially does little for gameplay, because the game was built on Havok physics (which has more titles under its belt, and more future titles, than Ageia).

As for current Ageia titles, they are tacked on, and pretty poorly so far;
http://www.tomshardware.com/2006/07/19/is_ageias_physx_...

The only acceptable title is City of Villains; everything else is essentially a mediocre tech demo.

Quote:
Let me tell you this. As we all know, the UT Engine 3 is one of the most technically advanced graphics engines, but it's also very POPULAR. Several game developers have already purchased licenses to use the UT 3 engine. This means a wave of future game releases with support for Ageia's PPU.


That's wishful thinking, because most of the people already committed to using the UE3 engine (not UT3 engine) aren't using PhysX. Take Rainbow Six Vegas and Gears of War: neither uses PhysX, and many have said they won't bother. Just because you use UE3 doesn't mean you get PhysX for free, you still have to pay for it, and Epic has not ruled out VPU physics for its other titles.

BTW, Crysis will support Crytek's own VPU physics implementation, and Valve is behind VPU-accelerated physics for their solution as well. So there's no major gaming-engine advantage. It will depend on which titles can make the most use out of it, and based on past games and demos, Crysis looks to be the killer app, not so much UT3 itself. UE3-based games, if they're like Gears of War, may have some legs, but that's still to be seen.

Quote:
Know that both the PS3 and Xbox 360 support Ageia's software API, which bodes well for Ageia's future game support.


First of all, the Xbox can't support the PPU, only VPU physics, and it's an M$ rig, and M$ is going VPU physics for the PC too with Direct Physics. Oh, and BTW, Havok also supports both as well;
http://www.havok.com/content/blogcategory/30/68/

The reason the PS3 and Xbox support Havok and Ageia is because they support the games, and it's up to the developers to add the features. But there's no PPU in the X360, only in the PS3, and both can do VPU-accelerated physics. Don't confuse a game powered by either Havok or Ageia physics with having VPU or PPU physics.

Quote:
There are great things coming because Ageia isn't the only PPU developer. There's also a company named AIseek that will soon release a PPU too.


Which is about as relevant to the big picture as VIA/S3 and SIS making DX9 and DX10 VPUs.

Quote:
Wait until UT2007. By that time a lot more games will feature support for physics.


They already support physics; what they need is PPU- or VPU-accelerated physics, not just CPU, like most of them now.

Quote:
Agreed on that, but let them hit the market and then maybe game developers will force Microsoft to release a common physics API for all of them.


Well, M$ is bringing a common API, Direct Physics, and it's going the VPU-accelerated route according to M$, not the PPU route, because of their involvement with the graphics boys already. Future addition of the PPU is likely dependent on whether Ageia can give compelling reasons, and IMO that depends on the success of their involvement with UT3.

Quote:
Go with Xfire for Xfire, not for physics support, as there is practically no such thing. Just keep your money and wait on the PPU front.


Just keep your money until the killer game or app, regardless of who supplies it, but don't bank on either one, because spending the money on a GF8800GTS or GTX makes more sense now than any of the other wishful options, and maybe even waiting for the R600 or GF8900.
February 11, 2007 8:39:08 AM

I've also got the Asus P5B Deluxe WiFi-AP motherboard, but it doesn't support full CrossFire, since one of the PCIe slots is only x4, not x16!

Am I right in saying this? If I'm wrong, I'd love to have CrossFire!
February 11, 2007 8:44:28 AM

Well, it says it supports CrossFire, and you can run it at lower rates (like x4). A larger number of lanes is better, but 16+4 or 8+4 will do it also, and likely with little impact, although of course not as nice as 16+8 or 16+16.

You should be fine, but like so many things, Xfire and SLi are tricky with a ton of possible barriers/hiccups/etc.
February 11, 2007 9:38:10 AM

With your current card I'm not sure why you even need to CrossFire it. I have yet to play a game that doesn't let me run my native res of 1920x1200 with all settings at the max the game allows (aside from Vanguard, which sits me at about 20 frames). Unless you own that 30" LCD, I don't see either route being needed. Though if you must have more power, ignore physics cards right now, they're a total waste; go CrossFire. BTW, match the card you have with the same card, otherwise it will throttle, or so I have been reading. ATI (AMD) will also allow (not sure if they do now) an older card you have lying around to act as your physics card, like the X1300 mentioned earlier. I plan on using my old X850XT PE as my physics card if I ever play a game that can use it as such.
February 11, 2007 9:41:09 AM

Wait... Havok is out?! When, and where was I? So this means when I do upgrade, if I go with ATI I can use my old X1600 Pro? 'Cos that would be sweet!

For the post above this one: I thought you needed Shader Model 3 for it to work...

bLAKE
February 11, 2007 12:19:46 PM

Well said, TGGA.

Just don't be so sure about all your statements.

I didn't say that the PS3 and 360 have a PPU, I said they support the API.
Also, the Unreal Engine 3 (not "UE3 engine") powered games aren't using its full potential right now. I believe more companies will enable Ageia support depending on how successful the implementation in UT2007 is.
I know that most current games supporting Ageia are crappy, but that's better than nothing. Ubi's application of Havok4 in San Fran is also a tech demo and not a true game supporting Havok FX. That will come, but after some time.
GRAW uses Havok, not Havok FX. Any list of future Havok FX games?
I got this for Ageia: http://www.ageia.com/physx/titles.html
Don't get me wrong, I am against PhysX and prefer the VPU acceleration path.
I thought Direct Physics was just speculation. Any official source for that?

Btw,
Quote:
"They already support physics; what they need is PPU- or VPU-accelerated physics, not just CPU, like most of them now."
Really?! No $hit!

Just a joke :lol: 

Thanks for your reply, and I'm very interested to hear more of your opinions!

Peace. 8)
February 11, 2007 4:44:56 PM

Quote:

Just keep your money until the killer game or app, regardless of who supplies it, but don't bank on either one, because spending the money on a GF8800GTS or GTX makes more sense now than any of the other wishful options, and maybe even waiting for the R600 or GF8900.


So are you saying I'd be smart to go with, say, an X1900XTX now versus buying an R600 in a few months? Of course, I suppose it's all really going to come down to price and buying now vs. buying later (spending only $300-plus now for an X1900XTX, or $600-plus for an R600)?
February 11, 2007 5:28:15 PM

Quote:
That's wishful thinking, because most of the people already committed to using the UE3 engine (not UT3 engine) aren't using PhysX. Take Rainbow Six Vegas and Gears of War: neither uses PhysX, and many have said they won't bother. Just because you use UE3 doesn't mean you get PhysX for free, you still have to pay for it, and Epic has not ruled out VPU physics for its other titles.
Wrong, they both do. And your second statement is also wrong: Ageia made the PhysX SDK free for anyone (except for Xbox 360 games, though there are possible exceptions to that if they're using an engine that has PhysX integrated into it).
February 11, 2007 6:41:56 PM

Quote:
Wait... Havok is out?!


Havok has been out for EONs. Havok powered HL2, and Havok3 powers Oblivion and GRAW. However, what we are talking about here is HavokFX, a physics add-on to the Havok4 set of physics tools. It's optional, just like the PhysX-hardware portion of Ageia's physics engine (acquired from NovodeX and upgraded with the Meqon acquisition). Two separate portions of the physics equation.

It depends on what people decide to add. Look at GRAW as an interesting example: the game-dependent (important) physics are by Havok; the shiny physics are Ageia's accelerated add-on portion. So companies can decide to build a game, and then even later buy the upgrade to add on the hardware-accelerated portion. A game could start out with a quick 'get it to market' edition with system/CPU-based physics, and then with the next expansion pack, along with adding maps, add hardware physics support, because by then it's about the bonuses, not just getting the game finished (like most developers seem to experience near the end, with pushed-back launch dates, etc).
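That gameplay/effects split is easy to picture in code. A bare-bones sketch, with hypothetical names rather than Havok's or Ageia's actual APIs:

```cpp
// Hypothetical names throughout; this mirrors the gameplay/effects split,
// not any real engine's API.
#include <cstdio>

// CPU-side simulation of everything that affects game outcomes.
struct GameplayPhysics {
    void step(float dt) { std::printf("gameplay step %.4f s\n", dt); }
};

// Optional accelerated layer for visual-only cloth/fluid/debris effects.
struct EffectsPhysics {
    void step(float dt) { std::printf("effects step %.4f s\n", dt); }
};

// If no PPU/VPU path is available, fx is null and the game plays
// identically, just with less eye candy.
void frame(GameplayPhysics& game, EffectsPhysics* fx, float dt)
{
    game.step(dt);
    if (fx) fx->step(dt);
}

int main()
{
    GameplayPhysics game;
    frame(game, nullptr, 1.0f / 60.0f);  // plain CPU-physics machine
    EffectsPhysics fx;
    frame(game, &fx, 1.0f / 60.0f);      // machine with the add-on enabled
}
```

That separation is also why the hardware portion can be bolted on later in an expansion pack without touching the game rules.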
February 11, 2007 7:39:55 PM

Quote:

I didn't say that the PS3 and 360 have a PPU, I said they support the API.


But we're talking about the PPU/VPU-accelerated portion. Supporting the overall API is one thing, but supporting the HavokFX and PhysX-hardware portion is another thing. The X360 doesn't have a PPU, but the PS3 does have the ability to mimic the PPU in its Cell; however, both have a VPU, so for development purposes only one solution 'could' support both. But the reality is that neither is going to be a good showcase for either solution.

Quote:
Also, the Unreal Engine 3 (not "UE3 engine") powered games aren't using its full potential right now. I believe more companies will enable Ageia support depending on how successful the implementation in UT2007 is.


I agree with that in principle, but remember that the developers that chose to go with the UnrealEngine3 engine (it's repetitive, but correct: the engine is called UE3; if it were called BOB it would be the BOB engine, regardless of what BOB meant [a little pedantic on my part to be sure, but too many people have been calling it the UT2K7 engine, so same reaction to 'UT3 engine', just knee-jerk on my part]) are unlikely to devote as much time to the PhysX portion unless Epic can make PhysX the star of the show. So saying that games will be based on the UE3 engine means little, because just as games were based on the Source engine, not all used the same level of Havok physics as HL2. And while game titles will be based on Havok4, likely few will use HavokFX.

Quote:
I know that most current games supporting Ageia are crappy, but that's better than nothing. Ubi's application of Havok4 in San Fran is also a tech demo and not a true game supporting Havok FX. That will come, but after some time.


Exactly, and that's why I mention it: we are far from even the true starting point of this battle, let alone declaring anything a winner or loser. But in the meantime, SLi and Xfire have benefits for a wider variety of games, yet still only about 50% of the titles out there (though that's better than the less than 1% of PhysX and 0% of HavokFX). And that's my point: they're both hype, and while SLi and Xfire are hype, they still have more utility right now, although I still think they suck and he'd be better off with a GF8800GTS than 2 X1900s IMO.

Quote:
GRAW uses Havok, not Havok FX.


I know; I specifically did NOT say HavokFX, despite explaining how it's part of Havok4 the sentence before. I assumed that since I mentioned that they will only NOW be demoing it with the Ubi demo, everyone would already know that GRAW was Havok3, which did not include support for HavokFX. So again: Havok3 is the game-engine physics for GRAW, PhysX is the shiny add-on stuff. And that's the big argument with regard to drawbacks, as current support for VPU physics is limited to visuals more than true interactions. But this is supposed to change as well.

Quote:
Any list of future Havok FX games?
I got this for Ageia: http://www.ageia.com/physx/titles.html


That list is obviously too generic and doesn't just focus on the PPU-accelerated physics, since Gears of War does not have PPU-accelerated physics on the Xbox as listed. So the question is: what is a PPU title, and what just uses CPU/host physics, like all those that use the standard Havok engine? City of Villains is the only good game there that DOES for sure use the PPU-accelerated physics. Can we even be sure of RS Vegas, with its PS3 and X360 versions?

Sounds like they are using a creed similar to Havok's that leaves true title lists unknown, and it kind of proves my point about UE3, with Ageia saying the following;
"PC titles built on this technology can more easily take advantage of PhysX hardware support - though specifics on this level of support are up to the developers."

Same as all Havok4 titles CAN use FX, but personally I think only the few major marquee titles WILL. We are at least 2 years away from physics even reaching the 50% mark on new games from the major studios.

Quote:
I thought Direct Physics was just speculation. Any official source for that?


Sure, official and unofficial. How's this for official: want a job?
http://members.microsoft.com/careers/search/details.asp...

"You will be a member of the core engine team who will be primarily responsible for working closely with our Direct3D team, helping to define, develop and map optimized simulation and collision algorithms onto data structures that are optimized for the GPU. Extensive experience with graphics shading languages such as HLSL is expected as well as a good understanding of modern graphics hardware and associated algorithms."

Which path do you think M$ is focusing on first? I'd say VPU, because it's easier to implement with the current hardware and drivers.
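For a flavor of what "mapping simulation algorithms onto the GPU" meant back then: particle positions and velocities lived in textures, and a shader ran the integration step for every particle in parallel. Here's the per-particle math such a shader would execute, written as plain C++ for illustration (generic Euler integration, not anything from Direct Physics, which was never published):

```cpp
// Per-particle Euler integration: the kind of data-parallel kernel that
// era GPGPU physics mapped onto pixel shaders, one particle per texel.
#include <cstddef>
#include <vector>

struct Vec3 { float x, y, z; };

void integrate(std::vector<Vec3>& pos, std::vector<Vec3>& vel, float dt)
{
    const Vec3 g = { 0.0f, -9.81f, 0.0f };  // gravity
    for (std::size_t i = 0; i < pos.size(); ++i) {
        // On a GPU, each iteration is an independent shader invocation.
        vel[i].x += g.x * dt;
        vel[i].y += g.y * dt;
        vel[i].z += g.z * dt;
        pos[i].x += vel[i].x * dt;
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }
}
```

Because every particle updates independently, this maps straight onto shader hardware, which is why M$ hiring HLSL people is such a strong hint about the direction.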

I'll see if I can find the PowerPoint presentation I have/had on the subject.

Xbit also mentioned the same a while back;
http://www.xbitlabs.com/news/multimedia/display/2006062...

And the Direct Physics rumour has it as a DX10.1 add-on, which is expected by the summer.

I'm sure eventually they will have some PPU support if it lasts long enough, but I personally think they've picked their pony, since now it's AMD/ATi, Intel, nVidia and VIA, with an install base of many tens of millions, in the one camp, and tiny Ageia, with a few hundred thousand, in the other. I don't doubt that Ageia 'could' do well, but like even you mentioned, I'd say they are pretty much dependent on the success of UT3 (the game) showcasing the technology, because if the buzz isn't good, and thus the sales of the add-in cards aren't great, then IMO it will die. The thing about VPU physics is that it could pretty much come in at any point and, voila, millions of potential users almost instantly, and often with little or no added cost.

As always, only time will tell.
February 11, 2007 8:00:09 PM

Quote:
Wrong, they both do.


No, you're wrong; understand my statement. You are getting confused between the NovodeX engine called PhysX and the PhysX PPU (like the difference between Havok4 and HavokFX). For that comment of mine, I thought it was clear that what I said is that they use the physics engine but obviously do not use the PPU portion, as neither supports it (just read the interviews with Epic about both titles; also use your head about how they are going to use a 'PPU' on the X360 when there is none). Even think of the PS3's implementation as nothing more than using the multi-cores in a non-uniform way, more than a real PPU.

Quote:
And your second statement is also wrong: Ageia made the PhysX SDK free for anyone (except for Xbox 360 games, though there are possible exceptions to that if they're using an engine that has PhysX integrated into it).


No, you should look into it further; Ageia does charge developers to use it. And using the UE3 engine does not guarantee PPU support, just like I said. Read up on it; just like with your link above (and also in Mike's post), you need to dig deeper to find the truth.

"The AGEIA PhysX SDK is free for non-commercial use. Standard pricing for commercial use is $50,000/title/platform. Licensed developers who implement PhysX accelerator support in their PC title are not required to pay this fee."

So yeah, it's free for all them nice non-commercial games out there; otherwise you pay for it by paying for Epic's PhysX version over their non-PhysX version, just like Havok with Valve and Gamebryo (hey, notice Gamebryo is both Havok and Ageia; does that mean you therefore pay twice, according to your logic?).

The point is simply that not everyone is going to pay the premium for PPU support. To Epic and its clients, UE3 is about the ability to easily use and combine tools for game development, not promoting someone else's software/hardware. They will support a bazillion add-ons/plug-ins but not necessarily include any of them. Epic wants to profit from their game engine; they're not trying to re-sell Ageia's software.
February 11, 2007 8:06:22 PM

Quote:

So are you saying I'd be smart to go with, say, an X1900XTX now versus buying an R600 in a few months? Of course, I suppose it's all really going to come down to price and buying now vs. buying later (spending only $300-plus now for an X1900XTX, or $600-plus for an R600)?


I thought you already had one?

I'm saying either save your money and then sell your current card once the R600 comes out and get it (or the GF8900 if it's better), or even buy a GF8800GTS now, because Xfire's not a great solution compared with those options now. And the X1900XTX is usually overpriced, period.

If you're only buying one card now, don't buy an X1900XTX, as they are usually overpriced; something similar like an X1950XT is better priced, or the plain X1900XT is a better value.

BTW, are you buying from Newegg or where? Pricing and availability may not be as good near you, and your options may be different.
February 11, 2007 8:10:33 PM

Yes, I do have an X1900XT 512MB card. I was just entertaining the thought of going the Xfire route, but then again I've heard of people running SLi or Xfire and actually getting lower frame rates. Maybe I'm good with just my X1900XT?
I just wanna be ready to go when Crysis comes out :D 
February 11, 2007 8:24:24 PM

I'd say stick with your very competent X1900XT for now, and then when Crysis comes out, get the best solution, be it from ATi/AMD, nVidia or even maybe S3/VIA. I understand your situation, and really the best thing for you IMO is to wait for the results. A GF8800 is great for current gaming, but if your focus is Crysis, then wait for at least early GOLD/RTM tests before deciding on anything.

Only if you needed something right now, or just like to upgrade often, would I say to buy something now, and then I'd recommend against Xfire and tell you to get a GF8800 instead, because more often than not the cost of selling your X1900XT and then buying a GF8800GTS would be the same as another X1900XT, and I would say the GTS will outperform more often than not, and also have nice future benefits, so you could see if you like the DX10 advantages in demos before committing to anything further.

However, if you're happy right now, I'd say save your money, especially since so much is going to come out and change the entire landscape (R600, GF8800GTS-320, GF8600/X2600). Unless you NEED it now, wait a few weeks to decide IMO. And if you want to play a particular game like Crysis, maybe wait and see which architecture (G80/R600) does better, and whether any physics benefit is even worth considering (BTW, like I said, Crytek is going with their own custom GPU-physics solution, so likely a lot of 'unknowns' at this point).
February 11, 2007 8:26:39 PM

Quote:
No, you should look into it further; Ageia does charge developers to use it. And using the UE3 engine does not guarantee PPU support, just like I said. Read up on it; just like with your link above (and also in Mike's post), you need to dig deeper to find the truth.

"The AGEIA PhysX SDK is free for non-commercial use. Standard pricing for commercial use is $50,000/title/platform. Licensed developers who implement PhysX accelerator support in their PC title are not required to pay this fee."
Read this press release. (And/or this if you're still not convinced.) They forgot to update that part of the website.
February 11, 2007 8:43:07 PM

Well, it looks like they changed their tune due to their current situation, making it free to increase adoption.
But one thing you forgot to mention is that, like I said, hardware support is still separate from software, and it is specifically mentioned in the EULA, so they are talking about the software portion.

As part of the $50K section of the updated EULA:
"Fee may be waived at our discretion for multi-platform developers providing PC HW support"

It's still not clear whether PhysX-HW is any different from HavokFX. So no, I'm still not convinced.

But it does look like they are trying harder, either out of desperation or because they understand that they need a support base first, and can stop 'waiving' fees later.
February 11, 2007 9:16:40 PM

Quote:
I just cannot believe that statement about being able to play games using one card. Which games do you play? Because I sure as hell can't.


What's so hard to believe? What game even requires two cards to play? Sure as hell isn't Oblivion... hell, even Vanguard doesn't need two cards, though it's the only game that has made my frame rates drop like a rock.
February 11, 2007 9:21:19 PM

K, my computer lies...


I'm just glad it doesn't require you to agree with it, though.
February 11, 2007 9:21:31 PM

Agreed, without having things turned off or down, there's no way.
February 11, 2007 9:26:27 PM

It was set to the max the game allowed. Believe it or not, like it or not, doesn't matter to me :) 
February 11, 2007 9:33:07 PM

Yeah, you are right. Well, I agree that time will tell.
I am currently getting an 8800 GTS, to keep my new PC going for like three months, and then I'll sell it for an 8900 or something. Of course, I'll wait to see Crysis and UT2007 performance before buying my new GPU (the one after the GTS) and any PPU unit.
That's me, but I would recommend the same strategy to the OP.

Btw, TGGA, thanks for the quality response. I believe that healthy conversations like ours add a lot of value to this forum's community.
February 11, 2007 9:35:59 PM

I play with the FSAA up and the self shadows up, just 'cos it makes like one thing look good while everything else looks like crap :-/ I have grass on, though most of it in the game looks like garbage. I didn't know HQ AF was a game setting, or I'm just not getting what it is...


But like I said, let me do it in caps this time and in bold:


ALL SETTINGS THE GAME ALLOWS ME TO SET

I'm not talking about something I'm forced to set in drivers. Then I might have gotten the FSAA to work, and the HDR, assuming the HDR didn't make it look worse...

I was looking for screenshots; either the game saves them where I can't find them, or I wasn't able to save screenshots for some reason. Someone on a thread somewhere said I couldn't save screenshots without HDR running, which I think is BS, but it would explain my lack of screenshots. But all my sliders are turned up to the max they will go. I forget how high the FSAA went in-game, though; I think it was like 8x or something.
February 11, 2007 9:43:41 PM

K, well, since you're not going to believe me (which doesn't change anything, really), this argument is moot... I'm running the game at the max settings it allows; don't see the problem there... I'm sure I'm running it at a higher FSAA than you, and I do have sliders on that you don't, and detail settings... The only thing I see you had different is a single driver setting that I must have named differently in my drivers, since I don't use the crappy ATI drivers :o 
February 11, 2007 9:48:12 PM

''Playable'' varies from person to person. I'm the demanding type: I can't accept below 30 FPS and hate to see unstable frame rates. I played FEAR at 1152x864 with everything maxed and HQ 16X AF + Adaptive 6X AA. My minimum frame rate in the game bench was 38 fps, with an average of 55 fps. If I increased the res to 1600x1200, the minimum dropped to 27 with a 44 avg. For me that wasn't ''playable'', because of my demanding nature. A pal of mine played fine with a min of 19!! 8O Well, to him it was good.
I bet that performance in Oblivion maxed with one X1900 would have made ME die, but he can also think it's good for HIM. :wink:
Btw, the FEAR story was a year ago on my X1800XT.
Now I type on an X850XT. :( 
February 11, 2007 9:53:02 PM

OOOOK, I said it before, I'll say it again: your belief or otherwise isn't changing anything. There were very few places where the outdoors dropped my frames to 30 or below by a few frames...
February 11, 2007 9:59:03 PM

Quote:
Yeah, you are right. Well, I agree that time will tell.
I am currently getting an 8800 GTS, to keep my new PC going for like three months, and then I'll sell it for an 8900 or something. Of course, I'll wait to see Crysis and UT2007 performance before buying my new GPU (the one after the GTS) and any PPU unit.
That's me, but I would recommend the same strategy to the OP.


Yep, I think that strategy makes a lot of sense, and it's the one I generally recommend for people: the GF8800 is a quality DX9 card and offers future 'unlockable' benefits, so it's win/win IMO. My only concern, like I mentioned, is that new hardware is only weeks away, so people should wait if they can until we know more (you don't need to buy the R600, but if you know what it offers, that makes for a more informed decision IMO). Also, that nice cheapo GF8800GTS might be interesting too, and it's only days away.

Quote:
Btw, TGGA, thanks for the quality response. I believe that healthy conversations like ours add a lot of value to this forum's community.


Yeah, agreed. I really just wish we had more info, 'cause right now it's more about the marketing PR than the actual implementations, unfortunately, due to the embargo on info.

I think anyone buying right now should consider them like the DX10 portion of the GF8800 series: potential unlockable 'bonuses', not something worth buying for right now, unless you have a need for City of Villains or GRAW, or a specific use where you can currently benefit from them, be it a PPU or Xfire/SLi.

Heck, if you read the info on the future of Intel and AMD, both solutions are pointless, as CPUs move to massively parallel, massive-FPU designs combining the best of VPUs and CPUs into monster processors.

As always, only time will tell.
February 11, 2007 9:59:28 PM

I tend to have lots of luck playing:

HL2
DoD: Source
Company of Heroes
etc., etc., etc.

I play all of them at my native res of 2560x1600, usually with everything turned up, and it's pretty damn smooth. Depending on the game, it's usually very smooth.
February 11, 2007 10:15:48 PM

You'll def never play on anything less than 30" again... it's like being a kid in an arcade!!
February 11, 2007 10:17:31 PM

Yeah, felt that way about my 24" :D 