
BFG Technology AGEIA PHYSX card

April 1, 2006 9:55:11 PM

Has anyone seen any initial benchmarks or reviews of this new add-in card?

http://www.bfgtech.com/
April 1, 2006 10:07:29 PM

Haven't seen any benchies yet, but I am eagerly awaiting them. Kudos to whoever finds 'em :D 
April 2, 2006 4:29:06 AM

I am wondering if it works with all games, if it needs a per-game profile, or if games have to support the Ageia chip...
April 3, 2006 3:29:52 AM

That's what I am wondering as well. It definitely sounds very interesting. I hope it offers more real-world performance than SLI. I have heard rumors of it costing around $250.00, which, if the performance is right, might make it a worthwhile investment.
April 3, 2006 3:53:24 AM

Games that have full immersion capabilities will be able to use the PPU, but those that don't will be left high and dry because they haven't been designed with real-world physics in mind. If you want to see the difference between a game on a regular graphics card and one with the PhysX processor, go to this site; it's pretty badass.

http://physx.ageia.com/footage.html
April 3, 2006 4:06:56 AM

I would say this new physics card is going to improve the rig's performance. In a PC with no physics card, the (mostly scripted) physics is obviously run by the CPU. With the addition of the physics card, the CPU no longer has to run the game's physics, because the physics card takes over most of it. So performance will definitely increase in a PC sporting a physics card. Bottom line: three chips (CPU, GPU & PPU) working together beat just the CPU and the graphics card.
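As a rough analogy only (plain C++, not how the actual Ageia driver works), the sketch below uses a worker thread to stand in for the PPU: while the physics step runs elsewhere, the main thread is free for AI, game logic and render submission instead of waiting on physics.

#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

struct Body { float x, y, z, vx, vy, vz; };

// Very simple integration step, no collisions -- just enough to show the idea.
void stepPhysics(std::vector<Body>& bodies, float dt)
{
    for (Body& b : bodies) {
        b.vz -= 9.81f * dt;                           // gravity
        b.x += b.vx * dt; b.y += b.vy * dt; b.z += b.vz * dt;
    }
}

int main()
{
    std::vector<Body> bodies(10000, Body{0.f, 0.f, 100.f, 1.f, 0.f, 0.f});
    for (int frame = 0; frame < 3; ++frame) {
        // Hand the physics step to the "PPU" for this frame...
        std::thread physics(stepPhysics, std::ref(bodies), 1.0f / 60.0f);
        // ...while the "CPU" keeps doing everything else.
        std::printf("frame %d: running AI and submitting draw calls\n", frame);
        physics.join();                               // sync before using the results
    }
    return 0;
}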

As Ageia has mentioned, the card supports some games that are already out, but those won't fully implement it. Upcoming games like CellFactor and others like it will fully support this physics card.

I mainly play BF2 and FEAR, and I wonder how this new card can improve their performance. I'm very interested in this physics card, but I'm not so sure I want to buy it right after it comes out this coming May. Since none of my games fully supports it, I'll probably just wait until the end of the year for games that do, and by then the price should be cheaper than $300.
April 3, 2006 5:06:39 AM

If only Half-Life 2 supported the PhysX card...

- zombies fly everywhere
April 3, 2006 5:18:42 AM

Does anyone see the pattern here? Come on, it's going back to the old days of computers:

a card for everything

the more cards, the better the computer performs

I'm not complaining; personally I think it's cool
April 3, 2006 6:24:17 AM

I dunno, will this PhysX card be better than SLI physics?
Nvidia announced that they will release a physics driver for SLI setups...
The Ageia PhysX will cost $299.99... hm, the price of another GeForce 7900 GT...
If Nvidia can dedicate a whole video card to managing only physics...
it might be better, no?
Unless...
the Ageia PhysX card can fit in slots other than PCIe x16...
April 3, 2006 7:00:05 AM

Even though Nvidia is making a physics driver, there are a couple of things that make me cautious. The first is that you have to have dual cards to make use of it. The second is that it still puts the load on the GPUs rather than on a dedicated chip. It seems to me that a card dedicated to physics is just the ticket.
April 3, 2006 7:09:21 AM

The graphics card has little to do with the physics. It's the CPU's job to do all the physics: movement, interaction and effects. The graphics card's job is to render the visuals: the image, shadows, lighting, etc. Adding a physics card to an SLI system would make it perform a lot better and improve the physics in the game. Like I mentioned before, the physics card will take over from the CPU in managing the physics effects in the game.
April 3, 2006 7:23:21 AM

Quote:
The graphics card has little to do with the physics. It's the CPU's job to do all the physics: movement, interaction and effects. The graphics card's job is to render the visuals: the image, shadows, lighting, etc. Adding a physics card to an SLI system would make it perform a lot better and improve the physics in the game. Like I mentioned before, the physics card will take over from the CPU in managing the physics effects in the game.


I understand that the GPU handles the visuals,
but Nvidia is claiming that one GPU will be dedicated to physics calculations.
Although that's only if they get the driver right...
So basically, Nvidia will turn a GPU into a physics card via a driver update...
I wonder if that's even possible...
Maybe it will perform great, maybe not...
We'll just have to wait and see...
BTW, the BFG PhysX card will be released to end users at the end of May, according to BFG.
April 3, 2006 8:56:39 AM

Well, Nvidia is going to come out with a driver that handles physics. I believe you, but Nvidia cards are not made for physics, and the fact that they're rushing to release a physics driver for their cards makes me think Nvidia is getting intimidated by the Ageia PhysX card.

I'm looking forward to that Nvidia physics driver, to see if it's going to increase my performance. If you think about it, that new physics driver is going to cut into the rendering work the card was designed for in the first place. We can only hope that maybe Nvidia will start producing cards with a GPU and a PPU in one. :D 
April 3, 2006 5:18:48 PM

Be that as it may, if I'm looking for longevity from my rig, I'd much rather have a dedicated unit for physics than tax my GPUs in any way. The less they are taxed, the longer I can keep them for future games like CellFactor.
April 3, 2006 7:22:50 PM

It's like saying, when the first GPU cards came out, that the CPU was better at processing graphics than the GPU, LOL.

How many times do I need to post this...
OK, here goes. First, GPU physics is only visual: looks only, no interactive content; you cannot interact with it. Second, GPUs are like the production line in a car factory: NOTHING goes BACKWARDS. The CPU can send results back into the game, and the PPU does that even more.

Moreover, in games running on the Ageia PhysX PPU, ragdolls DO NOT disappear, and neither does other debris. Have you seen the video of CellFactor? Damn awesome. You will need the card to run the game, and the video is recorded in real time. Havok's GPU physics only calculates cloth simulation, some ragdoll effects and particle effects; not too interactive, eh?

Why would anyone do SLI physics? I would rather have SLI and a PPU with a dual/quad-core CPU. GPUs are designed for graphics, the CPU for processing data, the PPU for physics. Face the future, guys and girls: we moved to graphics cards, and it's time to move to a PPU card too. Anyway, it won't cost much, considering it should last at least two years.

Physics data is RANDOM, and GPUs aren't designed for random data processing. CPUs handle random code well but aren't optimized to run physics full-time, right? That's where the PPU comes in.

Conclusion: the CPU is for general data, the graphics card for graphics, the sound card for sound and the PPU for physics. Just like an American football team, you can't field a team where everyone plays the same position; and in chess, if all your pieces were the same, you would lose very easily.

Anyway, the PPU is the only thing that can do a SUPER interactive environment. Imagine you want to frag a guy hiding in a building: you can just blow up the building's pillars, or you can blow OFF the ROOF and shoot the guy!!! LOL.
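To make that "eye candy vs. gameplay" distinction concrete, here is a minimal C++ sketch (illustrative names only, not any real engine or SDK): visual-only effects never report anything back to the game, while gameplay physics has to return a result the game rules can act on, which is exactly the round trip that one-way GPU pipelines struggle with.

#include <cstdio>

struct Explosion { float x, y, z, force; };

// "Eye candy": spawn some debris particles, render them, forget them.
// Nothing downstream ever asks where they ended up.
void spawnDebrisParticles(const Explosion& e)
{
    std::printf("particles at (%.0f, %.0f, %.0f)\n", e.x, e.y, e.z);
}

// Gameplay physics: the simulation result comes back to the game logic,
// so the data has to make the round trip to the CPU side.
bool pillarCollapses(const Explosion& e, float pillarStrength)
{
    return e.force > pillarStrength;
}

int main()
{
    Explosion boom{10.0f, 0.0f, 5.0f, 900.0f};
    spawnDebrisParticles(boom);            // pure visuals
    if (pillarCollapses(boom, 500.0f))     // changes the game state
        std::printf("pillar down, roof gone, target exposed\n");
    return 0;
}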
April 3, 2006 8:05:15 PM

Just wait for SLI Ageia PhysX!!! LOL

It does make you wonder how many new cards we will be cramming into our future gaming rigs. With things like quad-SLI graphics and PPUs, where does it stop?

Add in the giant power supply needed and you have a heat factory. I think water cooling will be mandatory pretty soon.
April 3, 2006 8:27:19 PM

We might just see discrete multi-core GPUs and multi-core PPUs.
April 3, 2006 9:49:31 PM

Quote:
Nah, I don't buy it. They want you to think that so you'll buy the card, but like that article I posted said, graphics cards have a lot of processing power and I don't think all of it is used at once in games.

Can you please post an article backing up your point?


I understand your attachment to the free driver Nvidia is doing, but you get what you pay for. Graphics cards may have a lot of power, but that doesn't mean they can take on all of that extra work. Have you seen some of the benchmarks, some of the frame rates from the latest games with the graphics fully enabled? Even the new 7900s in SLI barely run the intensive games at a decent frame rate. And you suggest taxing them with physics operations as well? Nah, I'll let a dedicated chip handle the physics and let my graphics cards stay cooler while handling what they are meant to handle.
April 3, 2006 11:13:04 PM

Quote:
Not wanting to be mean, but read my sig, dumbass. I use ATI and they might have the processing power to pull it off. While it's true that graphics cards may struggle with games, that is not evidence that all of the GPU is being utilised at once. That may sound odd, but I'm sure ATI, yes ATI!!!, would not be trying to have their GPU used for physics if it couldn't handle it.


You seem a little biased here, don't you? It doesn't matter whether you have an Nvidia or an ATI board, you won't get the same level of physics processing as from a dedicated processor. It is hardware optimized for physics, just as a GPU is for graphics...

And expect to see many, many game bugs with GPU physics processing. If graphics drivers already have a lot of bugs, imagine what we can expect from a driver that runs physics on GPUs...
April 4, 2006 1:23:49 AM

Quote:
Here's a question: explain in detail how the chips are different and how that affects their ability to process physics.


Dude, it would take too long to compile all the material on the PPU, but if you want to know more, you can visit both ATI's and Ageia's homepages and get the white papers on their architectures.
If you want to get into more technical stuff, just PM me and we can discuss it. :) 
April 4, 2006 1:43:50 AM

The Ageia and ClearSpeed co-processors are designed specifically for mathematics. For instance, the ClearSpeed processor computes at 25 GFLOPS versus about 5.7 GFLOPS for an Opteron processor.
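Quick back-of-the-envelope on those numbers (theoretical peak figures only, and only for raw floating-point math):

#include <cstdio>

int main()
{
    const double clearspeedGflops = 25.0;
    const double opteronGflops    = 5.7;
    // ~4.4x advantage on paper; says nothing about general-purpose code
    std::printf("theoretical advantage: %.1fx\n",
                clearspeedGflops / opteronGflops);
    return 0;
}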
April 4, 2006 2:29:25 AM

Quote:
Does anyone see the pattern here? Come on, it's going back to the old days of computers:

a card for everything

the more cards, the better the computer performs

I'm not complaining; personally I think it's cool




Watch, they'll come out with a card for the monitor...

Oh wait...
8O
April 4, 2006 12:00:19 PM

Quote:
But in that article I posted, ATI said their core can do 375 GFLOPS, so it would have the advantage by that logic.


Yeah, it would, but by that logic wouldn't a 3 GHz P4 have an advantage over an A64 with lower clock speed and "lower throughput"? You can't always go by the numbers; there's a lot more involved, especially when it comes to chip architecture.
There's a white paper on Ageia's homepage explaining the difference between the PhysX chip and a GPU. Take a read, then compare it to what ATI's article had to say, and decide for yourself what the best option is for you (it always comes down to this: what's best suited for the consumer).
April 4, 2006 12:09:06 PM

Quote:
I won't be PMing you, as there is nothing to discuss. I went to Ageia's website and read two of their white papers, and tbh, if you think they could be used in any constructive argument, you are an idiot. They were full of their own buzzwords and only told how their way was the only way. I know all manufacturers do this, but they are a joke. They provided no actual information or numbers and told me nothing more than the articles I've read. I will wait for more detailed information and see the real story.


Hey, no need to get aggressive, but if you do I can certainly answer accordingly.
April 4, 2006 12:27:21 PM

:lol:  Yeah - Current Game Benchmarks right here:

In the current version of Half-Life 2 I gained 0 fps.

Also in the current version of Battlefield 2 I gained 0 fps.

But in Quake 4.... whoa.... I gained a MASSIVE 0 fps.

In the heavier FEAR game, current version, I could only gain 0 fps.
April 4, 2006 1:40:28 PM

StrangeStranger, asking people on the internet to back up their opinions with facts is an exercise in futility. They would much rather "witness" to their opinions until people start believing them as facts.

All we can do is wait and see what real world benchmarks bring.

Personally, just logically speaking, I would rather have a dual GPU setup where if a game supports physics processing, the second GPU could be dedicated to that, and if a game doesn't support physics, it could take advantage of the increased video processing.

But, of course, that's just me.

Dark Spider
April 4, 2006 2:08:44 PM

I would rather have a physics card than drive the physics through a GPU (regardless of manufacturer). A GPU is designed to render graphics for games. It is true that the architecture has become a lot more general and flexible, allowing certain programming to be executed on the GPU, but it was not designed specifically to handle something like physics calculation. If I bought and created an SLI system, I would want a faster frame rate through graphics processing. If one card is used for physics calculation, does this mean that the graphical frame rate drops?

My theory concerning this is that indeed the graphics card would be successful at calculating physics; however, what would happen if you wanted to run the game at a high resolution with all the eye candy turned on? I imagine you would still have all the interactivity and proper physics calculation, but the game would not see much frame rate improvement because the graphics card would become a bottleneck.

I think mainly the benefit would be in terms of the game play, not in terms of frame rate. And with GPU rendered physics, you would have to trade game play for frame rate. With a dedicated card, you can have both, with an SLI setup dedicated to rendering frames, and a physics card contributing to the element of game play.
April 4, 2006 2:41:01 PM

You crazy kids. . . so much animosity.

Here is a link explaining some of the differences and unknowns. Pay close attention to the part about how no one knows whether graphics-card-accelerated physics will be able to report rendered objects' locations back to system RAM. If that capability does not exist, then graphics-card-accelerated physics will be limited to eye-candy physics, with gameplay physics getting the shaft. On the other hand, a dedicated physics card has already been shown accelerating "gameplay stuff and the eye-candy stuff," all in real time.

It's not an apples-to-apples comparison. Just as some of the people here are saying, brute force is not always indicative of a more capable solution. A card designed to tackle a specific problem will, 99% of the time, perform more efficiently than a "patched" solution on hardware intended for something else.

Anyhoo, the article's a good read and will hopefully shut down some of these uninformed posts.
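For anyone wondering what "reading back" actually looks like, here is a minimal OpenGL sketch, not Havok FX's real code path; it assumes a GL context and a float texture of simulation results already exist, and the helper name is made up. The read forces the CPU to wait for the GPU and copy the data across the bus, which is exactly the performance hit the article mentions.

#include <GL/gl.h>
#include <cstddef>
#include <vector>

// Hypothetical helper: pull simulated particle positions (stored as RGBA
// float texels) back into system RAM so game code can use them.
std::vector<float> readBackPositions(GLuint resultTexture, int width, int height)
{
    std::vector<float> positions(static_cast<std::size_t>(width) * height * 4);
    glBindTexture(GL_TEXTURE_2D, resultTexture);
    // This call stalls the pipeline: the driver has to finish all pending
    // GPU work on the texture, then copy it over the bus before returning.
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RGBA, GL_FLOAT, positions.data());
    return positions;
}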
April 4, 2006 2:50:45 PM

Looks like I'm right: Havok FX is limited to EYE CANDY ONLY. Buy the Ageia PhysX if you want gameplay physics, like me.

The drawback is the games you play. Since I love and play Unreal Engine titles, I'm in luck.
April 4, 2006 3:27:06 PM

Quote:
Looks like I'm right: Havok FX is limited to EYE CANDY ONLY. Buy the Ageia PhysX if you want gameplay physics, like me.


Actually, if you read a bit closer, no one knows right now.

Quotes from article - "Right now, it's unclear whether the GPU-enabled physics acceleration will be used just for "eye candy" effects, or for "real gameplay physics.""

"We're told there is nothing to prevent the game from reading back the data from the graphics card, but this has an impact on performance."

It sounds like it is theoretically possible, although, if I were a gambling man, I'd place my money on there being no feasible workaround that doesn't impact performance to the point of slowing gameplay down.
April 4, 2006 3:47:24 PM

Quote:
One thing I know everyone will, or should unless you're really stupid, agree on is that competition is essential. It would be better in the long term if there were a battle and the better tech won, rather than everyone mindlessly jumping on the Ageia bandwagon.

It is amazing that, because of all the hype, people are speaking as if it is the best one and, in some cases, the only one. This is foolish; by that reasoning, why not have Nvidia and Intel as the only GPU and CPU manufacturers? That's what everyone wants, right?


If you read the article you'll see that the Ageia card and Nvidia's solution are meant to work together and complement each other. Not so much competing as complementary.
April 4, 2006 6:00:33 PM

Quote:
I would rather have a physics card than drive the physics through a GPU (regardless of manufacturer). A GPU is designed to render graphics for games. It is true that the architecture has become a lot more general and flexible, allowing certain programming to be executed on the GPU, but it was not designed specifically to handle something like physics calculation. If I bought and created an SLI system, I would want a faster frame rate through graphics processing. If one card is used for physics calculation, does this mean that the graphical frame rate drops?

My theory concerning this is that indeed the graphics card would be successful at calculating physics; however, what would happen if you wanted to run the game at a high resolution with all the eye candy turned on? I imagine you would still have all the interactivity and proper physics calculation, but the game would not see much frame rate improvement because the graphics card would become a bottleneck.

I think mainly the benefit would be in terms of the game play, not in terms of frame rate. And with GPU rendered physics, you would have to trade game play for frame rate. With a dedicated card, you can have both, with an SLI setup dedicated to rendering frames, and a physics card contributing to the element of game play.



/begins clapping, slowly standing .... claps faster
April 4, 2006 6:15:16 PM

Quote:
I would rather have a physics card than drive the physics through a GPU (regardless of manufacturer). A GPU is designed to render graphics for games. It is true that the architecture has become a lot more general and flexible, allowing certain programming to be executed on the GPU, but it was not designed specifically to handle something like physics calculation. If I bought and created an SLI system, I would want a faster frame rate through graphics processing. If one card is used for physics calculation, does this mean that the graphical frame rate drops?

My theory concerning this is that indeed the graphics card would be successful at calculating physics; however, what would happen if you wanted to run the game at a high resolution with all the eye candy turned on? I imagine you would still have all the interactivity and proper physics calculation, but the game would not see much frame rate improvement because the graphics card would become a bottleneck.

I think mainly the benefit would be in terms of the game play, not in terms of frame rate. And with GPU rendered physics, you would have to trade game play for frame rate. With a dedicated card, you can have both, with an SLI setup dedicated to rendering frames, and a physics card contributing to the element of game play.



/begins clapping, slowly standing .... claps faster


(Bows respectfully)
April 4, 2006 6:28:28 PM

I lifted this from an article on another web site :) 

"Concerning the recent news on NVIDIA’s Physics use on a GPU using Havoc, we spoke with Epic Games’ Mark Rein at the recent GDC conference and he told us “I can use every bit of GPU you can give me” – why use any precious GPU cycles for anything but graphical rendering? Ever thought about what sort of GPU power it’ll take to drive some of the larger LCD’s coming out? "

http://www.gdhardware.com/hardware/ppu/ageia/app/001.ht... <-- the whole article.
I tend to agree with that part: why use the graphics card for physics if I can buy another card to do it? I know it would be cheaper to do it with a video card driver, but if I'm running SLI I'm not really worried about what's cheaper, LOL.
April 4, 2006 6:31:09 PM

My apologies for focusing on Nvidia; my post applies to any graphics company. As far as I can tell, in terms of utilizing the graphics system for physics calculation, ATI seems to have a better architecture than Nvidia; it's more flexible. For this entire discussion, though, I think the hardware will not matter so much as the software base. I am concerned about proprietary software that will only work with certain hardware setups; there will likely be games with physics tailored specifically for Ageia hardware, for ATI hardware, and for Nvidia hardware. This may be a limiting factor for the end user, because they might not be able to run a certain game with embedded physics calculations unless they use a certain setup.

Suppose Unreal 2K7 only supports Ageia hardware (just an example; I really have no idea whether it will be limited): would someone with an ATI or Nvidia solution not be able to run the game with physics calculations? In a more extreme case, certain titles may require specific hardware entirely, and will not run on a machine without those specific features.

This would mean that a consumer will have to buy hardware based on the game titles available for that hardware. There is a computer that already has done this; it's called a console.

My point is that I hope software companies will have flexible applications, and at least partially support alternative hardware configurations. The PC is a universal entertainment platform. Let's hope it stays that way.
April 4, 2006 6:32:29 PM

Quote:
If you'd followed my posts, which I'm not saying you should have, you would see that I am the only one reminding everyone about ATI. I don't give a damn about Nvidia!!! I think competition is good, and I also think that on paper ATI's idea is the best. Allowing developers to make their own solutions based on the architecture means a level playing field and doesn't force people into Ageia's all-or-nothing approach.


Point taken about ATI, and I did read the article you posted. However, the article I posted compares how the add-in physics card differs in functionality from a graphics card doing physics. It seems logical to let the graphics card handle the physics involved in rendering graphical effects (smoke and fog come to mind) and a dedicated PPU handle structural effects, like a grenade hitting a building and causing an in-game effect.

IMHO, the right tool for the right job.
April 4, 2006 6:39:13 PM

I'm sure there will be flexibility. Ageia's site, even now, has side-by-side comparisons of the difference a dedicated processor makes. Besides, look at games now: there is no PPU, nor are there physics drivers released; the physics is just coded into the game on the CPU. It's just like games that are coded to make use of sound cards: not everyone has one, but you still get sound without one. That's the kind of flexibility you'll most likely see.
April 4, 2006 7:30:21 PM

Quote:
My apologies for focusing on Nvidia; my post applies to any graphics company. As far as I can tell, in terms of utilizing the graphics system for physics calculation, ATI seems to have a better architecture than Nvidia; it's more flexible. For this entire discussion, though, I think the hardware will not matter so much as the software base. I am concerned about proprietary software that will only work with certain hardware setups; there will likely be games with physics tailored specifically for Ageia hardware, for ATI hardware, and for Nvidia hardware. This may be a limiting factor for the end user, because they might not be able to run a certain game with embedded physics calculations unless they use a certain setup.

Suppose Unreal 2K7 only supports Ageia hardware (just an example; I really have no idea whether it will be limited): would someone with an ATI or Nvidia solution not be able to run the game with physics calculations? In a more extreme case, certain titles may require specific hardware entirely, and will not run on a machine without those specific features.

This would mean that a consumer will have to buy hardware based on the game titles available for that hardware. There is a computer that already has done this; it's called a console.

My point is that I hope software companies will have flexible applications, and at least partially support alternative hardware configurations. The PC is a universal entertainment platform. Let's hope it stays that way.


Well, the PhysX SDK allows games to run without the PhysX hardware, although you won't get the same level of physics you'd get with the card. And it's a very nice tool (I've been reading its manuals to start developing with it :)  ).
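To picture that fallback, here is a conceptual C++ sketch; the names are made up and this is not the real PhysX/NovodeX API, it just shows the idea of the same game scaling its simulation down when no PPU is found.

#include <cstdio>

struct PhysicsCaps {
    bool hardwareAccelerated; // a PPU was found
    int  debrisBudget;        // how many extra physical objects the game spawns
};

// Stand-in for whatever probe the real SDK performs at startup.
bool probeForPpu() { return false; } // pretend no card is installed

PhysicsCaps initPhysics()
{
    PhysicsCaps caps;
    caps.hardwareAccelerated = probeForPpu();
    // Same game, same code path -- just a smaller simulation without the card.
    caps.debrisBudget = caps.hardwareAccelerated ? 10000 : 500;
    return caps;
}

int main()
{
    PhysicsCaps caps = initPhysics();
    std::printf("PPU present: %s, debris budget: %d\n",
                caps.hardwareAccelerated ? "yes" : "no", caps.debrisBudget);
    return 0;
}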
April 4, 2006 7:48:58 PM

I know that wasn't the best example. I didn't go into the fact that these co-processors are only good at mathematics; even though the Opteron only has roughly 5.7 GFLOPS, it would blow away these co-processors at other tasks.
April 4, 2006 8:35:11 PM

Quote:
If you'd followed my posts, which I'm not saying you should have, you would see that I am the only one reminding everyone about ATI. I don't give a damn about Nvidia!!! I think competition is good, and I also think that on paper ATI's idea is the best. Allowing developers to make their own solutions based on the architecture means a level playing field and doesn't force people into Ageia's all-or-nothing approach.


Point taken about ATI, and I did read the article you posted. However, the article I posted compares how the add-in physics card differs in functionality from a graphics card doing physics. It seems logical to let the graphics card handle the physics involved in rendering graphical effects (smoke and fog come to mind) and a dedicated PPU handle structural effects, like a grenade hitting a building and causing an in-game effect.

IMHO, the right tool for the right job.


Sounds about right to me. A certain amount of physics done by the GPU, like smoke and particles, makes sense. The physics done by a PPU should be more collision- and gravity-based things.
April 7, 2006 7:09:21 PM

If Nvidia manage to do what they claim and enable their cards to handle physics as well as graphics, then that's a good thing, but let's be honest. When all we had was the CPU and no dedicated graphics chip, look at the graphics we could have (look at 3DMark now and how slowly the CPU-only tests run, even on an FX-57).

Then along came the dedicated graphics chip: good old 3dfx. But do we remember how we felt about them? A waste of money, people said. "We don't need great graphics, the CPU can do what we need" was the first reaction. But then, as games developed, we couldn't live without graphics cards anymore. If you want to play even the worst of games you need an ATI or Nvidia card, or better still two. (The same applies to sound cards.)

Now someone comes along and creates a new card dedicated to another aspect of gaming: the PhysX card. The same will happen. At the end of the day, if you want to play the best games looking their best, sounding their best, and now acting, reacting and moving (clothes, trees, buildings, explosions, wind-blown hair, etc.) their best, then you will need all the dedicated hardware you can get.

In the end, game developers will begin to program all their games to use the new PhysX chip. Games like Unreal 3 and Ghost Recon Advanced Warfighter are already starting. All will follow.

I understand what people are saying about Nvidia and ATI having powerful graphics chips that could theoretically handle physics, but why spend money on SLI if one card will do physics and the other graphics? In the end, games will run slowly again. A dedicated chip is the only true way to go where performance is concerned. I don't want to sacrifice graphics performance for the sake of hair moving realistically in the wind. I want a dedicated card for physics, so that when hair is moving in the wind while being lit by 100 light sources of all different colours (extreme example, but you get the picture) it looks realistic; thanks to a dedicated graphics card it'll look as good as it moves, thanks to a dedicated sound card it'll all sound great, and thanks to the CPU I'll be able to look at it from any angle.

All in all:

Dedicated sound: great
Dedicated graphics: amazing
Dedicated physics: PERFECT
April 7, 2006 7:44:33 PM

I'm willing to pay $250 as long as they get a PCIe x1 card out.
April 7, 2006 8:45:01 PM

I think you're looking at this the wrong way, no offense. The PPU will allow developers to push games FURTHER, and allow you to enable physics beyond what your graphics cards are capable of, seeing as they already have to keep up with the ever-increasing demand for graphical rendering. Can you imagine what Crysis will demand of your GPU(s) at 1920x1200 or 2560x1600? Don't you think that letting something other than the GPU do the physics would free up their power for, say, HDR, AA, etc.? Your assumption doesn't take into account that games will continue to progress, or the fact that games can already push your system to its limits, even with SLI.

I am not questioning or debating the fact that the GPU can be used, or even used well, for physics processing.
April 7, 2006 8:48:40 PM

This PPU hype is nice and, if applied correctly, the PPU can become the next big thing. However, as with all first-generation hardware, there is one possible problem:

bugs

Remember the first GPUs? The ones that were called graphics decelerators? I hope the same doesn't happen to the PhysX cards, because then we will have to wait a while for proper physics goodness.
April 8, 2006 1:19:38 AM

OK, once again, it appears you are determined not to view this any other way. You ignored the fact that better games will be developed because of the addition of the PPU, and that hardware and software capabilities and requirements go back and forth as to which is keeping up with which. Food for thought: why is the Cell processor supposed to be so great?
On the other hand, nice rig. Mine is very similar. Just ordered the 2405FPW, getting it Tuesday. Thinking about the X-Fi, not sure yet. My case, PSU and memory are way better though. jp

CPU: Athlon64 X2 4400+
Motherboard: ASUS A8N-E
Graphics Card: EVGA 7800GT
Memory: 2048 Corsair XMS DDR 500
Hard Drive: WD Raptor 73.6
Cooling: Zalman CNPS9500
Power Supply: PC Power & Cooling 510
Speakers: Logitech Z-2300's
Monitor: soon to be 2405fpw
Case: Lian-Li V2000B
April 8, 2006 7:29:12 AM

Technology is a lot more mature today, and more testing is done before products are released. Think about it: with a lawsuit waiting in every city in the country, companies make more and more sure of their products before they are released.

As for practicality? Games are incorporating more complex physics-driven situations, so more dedicated hardware is required: the PPU.
April 8, 2006 8:56:37 AM

There are no apps for the Ageia physics processor... yet... I heard they're cooking now.
April 8, 2006 10:24:11 AM

We all need to bear some things in mind. The new videos being released for games, including Ghost Recon Advanced Warfighter and the Unreal tech demo / UT2007, look amazing. But if you look closely, the physics also looks better than anything we've seen before. An example is in Ghost Recon, where a building blows up and falls down, dust and debris fly everywhere, and vision becomes impaired. This is all thanks to the PhysX hardware.

The new Crysis video is also a prime example. It is by far the best-looking game I've ever seen. Granted, that's down to sheer GPU processing power, but you'll need the PhysX chip to play the game the way the videos show it being played, i.e. moving plants, destructible buildings, falling leaves, trees and bodies reacting on impact, etc. There is no way, no matter how powerful a GPU becomes, that it will be able to make Crysis look that amazing AND handle all the great physics. The GPU will be too stressed computing all the graphical effects, leaving no power for physics. GPUs already use all their power for rendering, so a dedicated physics processor is the only viable way of making a game play amazing, a dedicated GPU is the only way of making a game look amazing, a dedicated sound card is the only way of making a game sound amazing, and the CPU is the only way of bringing it all together with great AI!