The first nails in the PhysX coffin........

Guest
Archived from groups: comp.sys.ibm.pc.games.action

The obvious alternative to the PhysX add-in-card physics accelerator:

http://www.gamesindustry.biz/content_page.php?aid=10177

The PhysX would no doubt provide superior performance, but (a) who
is going to pay $300, and (b) which game developer is going to bother
to support it when the above offering from Havok is a natural
alternative and can be readily integrated into current
multiplatform development environments? Also, within a year dual-core
PCs will be as standard as DVD-ROMs are in PCs today. Every desktop PC
sold will be dual-core capable by merely changing the CPU, if not
already supplied as a dual-core machine.

Seems as if the PhysX is already dead in the water. Five years too late.

John Lewis
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

Multi-core CPUs will use twice the power to do less than a quarter the
work. They will also cost more. A dual-core CPU right now is going for
almost 600 dollars.

When sound cards first came out, they cost about 200 dollars. Yet today
some kind of dedicated sound processing chip or card is part of just about
every gaming PC.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

But the thing about those $200-$300 sound and video cards is that they
let your computer do something you simply could not do by throwing more
CPU cycles at the problem.
On admittedly casual observation, I don't see the same thing applying
to a physics processor.

Kendt
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

"John Lewis" <john.dsl@verizon.net> wrote in message
news:42dbc5d9.2806200@news.verizon.net...

> The PhysX no doubt would provide superior performance, but (a) who
> is going to pay $300

For a graphics card, they said, way back when ...

But I've been wondering why nobody has said they're going to do this. Take a
single-player game of BF2 - imagine playing with 128 or 256 NPCs with their
AI all running on one core while the other core takes care of everything
else.
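
To make that concrete, here's a rough, untested sketch of the split I mean.
The Npc struct and the updateAI/updateWorld functions are made up for
illustration, not taken from any real engine:

    // One core chews on nothing but NPC AI; the main thread does the rest.
    #include <atomic>
    #include <thread>
    #include <vector>

    struct Npc { float x = 0, y = 0; };

    // Hypothetical AI tick: stands in for path-finding, target picking, etc.
    void updateAI(std::vector<Npc>& npcs) {
        for (auto& n : npcs) { n.x += 0.01f; n.y += 0.01f; }
    }

    // Hypothetical "everything else" tick: rendering, input, networking.
    void updateWorld() { /* ... */ }

    int main() {
        std::vector<Npc> npcs(256);          // the 256 NPCs in question
        std::atomic<bool> running{true};

        // Second core: nothing but AI, as fast as it can go.
        std::thread aiThread([&] { while (running) updateAI(npcs); });

        // First core (main thread): everything else, ~1000 frames' worth.
        for (int frame = 0; frame < 1000; ++frame) updateWorld();

        running = false;
        aiThread.join();
    }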
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

"eventerke@nspm.h0tmail.com" <eventerke@hotmail.com> wrote in message
news:1121731343.607084.145840@g43g2000cwa.googlegroups.com...
> On admittedly casual observation, I don't see the same thing applying
> to a physics processor.

A dual-core CPU couldn't do the physics of 30,000 rigid bodies
interacting, either.

And BTW, most CPUs can do the stuff the soundcard does in software
(software algorithms are part of DirectSound 3D). Not all of it, but most of
it. Of course, it would come at a performance penalty. In fact, many less
expensive sound chips implement much of the sound engine in software.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

In article <1121731343.607084.145840
@g43g2000cwa.googlegroups.com>, eventerke@hotmail.com
says...
> But the thing about those $200-$300 sound and video cards is that they
> let your computer do something you simply could not do by throwing more
> CPU cycles at the problem.
Actually you can: GPUs and SPUs are just like CPUs,
just faster at doing their specialized task than your
average CPU.

> On admittedly casual observation, I don't see the same thing applying
> to a physics processor.
The real issues here:
Will the graphics card companies try to make their own
physics accelerators as add-ons to their GPUs?
Will the cores in CPUs ramp up enough to be able to
perform physics on par with dedicated boards?
Can the PC market support the hardware sales that are
needed to support development? (given the prolonged
slump it is in at the moment)
Will gamers want the type of physics that the card can
provide?

- Factory
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

> The real issues here:
> Will the graphics card companies try to make their own physics
> accelerators as add-ons to their GPUs?

That's the most logical move to me. A two-in-one, single-slot solution.
Would the PCI-e bus be able to handle the bandwidth of both a graphics
processor and a dedicated physics processor?

The initial game offerings would also have to be backwards compatible with
CPU-only setups.

- f_f
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

factory wrote:
> In article <1121731343.607084.145840

> Actually you can: GPUs and SPUs are just like CPUs,
> just faster at doing their specialized task than your
> average CPU.

I should have clarified what I meant - could not do at the time, with
the hardware available for a PC. I wonder how many games these days
are CPU-bound (I realize that's a chicken-and-egg question - developers are
coding for the hardware that's out there).

> The real issues here:
> Will the graphics card companies try to make their own
> physics accelerators as add-ons to their GPUs?
> Will the cores in CPUs ramp up enough to be able to
> perform physics on par with dedicated boards?
> Can the PC market support the hardware sales that are
> needed to support development? (given the prolonged
> slump it is in at the moment)
> Will gamers want the type of physics that the card can
> provide?
>
> - Factory

Good points. It's starting to seem like a dedicated physics processor
is more in line with the simulator market (driving, flight) than with
general gaming. Bundling with a graphics card seems like a great idea.
It would probably help push along the installed base faster than a
stand-alone card. I wonder if there are some interesting efficiencies
that could be realized by accelerating movement and rendering
calculations on the same hardware (or at least out of shared high-speed
memory).

Kendt
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

The physics cards will not use much more bandwidth than an Audigy
soundcard. They will work even with the regular PCI bus. The internal
calculations of the physics will be very bandwidth-intensive, which is why
the cards will have the high-speed memory ordinarily used on graphics cards.
However, the actual information they will send back to the PC will be
relatively simple stuff. The CPU doesn't have to know why the rigid body
(for example, the ubiquitous crate) moves, it just needs to know where it
needs to move it.
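
To put a rough number on that "relatively simple stuff", here's a guess at
what a per-object result might look like and how much readback it adds up
to. The struct layout and the object/update counts are made up for
illustration - this is not Ageia's actual format:

    #include <cstdio>

    // Guessed per-object result the card would return each frame:
    // which body moved, where it is, and how it's oriented.
    struct RigidBodyResult {
        unsigned id;        //  4 bytes
        float    pos[3];    // 12 bytes: world position
        float    quat[4];   // 16 bytes: orientation quaternion
    };                      // 32 bytes total

    int main() {
        const int objects   = 1000;  // active rigid bodies (assumed)
        const int updatesHz = 60;    // result updates per second (assumed)
        double bytesPerSec = double(sizeof(RigidBodyResult)) * objects * updatesHz;
        std::printf("readback: about %.1f MB/s (classic PCI peaks near 133 MB/s)\n",
                    bytesPerSec / (1024.0 * 1024.0));
        return 0;
    }

Even at several times those counts, the readback is a small slice of classic
PCI's roughly 133 MB/s, which is presumably why Ageia thinks plain PCI is
enough.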
 

Andrew
Archived from groups: comp.sys.ibm.pc.games.action

On Tue, 19 Jul 2005 13:26:44 GMT, factory <t@t.com> wrote:

> Will gamers want the type of physics that the card can
>provide?

Personally it is something that would be nice to have, but I am
certainly not going to pay much extra for it. When I play HL2 and BF2,
I just enjoy the games; I have never thought to myself "this is good,
but it needs more physics". Games starting to effectively utilize dual-core
CPUs (which I don't have yet) would be a more interesting
prospect.
--
Andrew, contact via interpleb.blogspot.com
Help make Usenet a better place: English is read downwards,
please don't top post. Trim replies to quote only relevant text.
Check groups.google.com before asking an obvious question.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

Most games nowadays are very much dependent on the graphics card, Doom 3
being the perfect example.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

In article <EScDe.186$xn.73@bignews6.bellsouth.net>,
Magnulus <magnulus@bellsouth.net> wrote:
# The physics cards will not use much more bandwidth than an Audigy
#soundcard. They will work even with the regular PCI bus. The internal
#calculations of the physics will be very bandwidth-intensive, which is why
#the cards will have the high-speed memory ordinarily used on graphics cards.
#However, the actual information they will send back to the PC will be
#relatively simple stuff. The CPU doesn't have to know why the rigid body
#(for example, the ubiquitous crate) moves, it just needs to know where it
#needs to move it.

That's funny, I thought it was the job of the PhysX processor to
determine where it needs to move it. If you hit an object off-center and
it needs to rotate instead of translate, in your scenario, the main
processor has to figure this out?

Then you take 400 of these objects, some connected to each other and
some bouncing off each other, each collision changing the others'
trajectories and orientations, at a reasonable rate for realism like 15-20
times a second, and you propose to pump this through the PCI bus?

Take a look at AltiVec and how it is integrated into the PowerPC CPU
and you'll see how closely integrated this kind of processing really needs
to be to work.
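
To make the off-center case concrete: the rotation comes from a torque,
r cross F, resolved per contact, per step. A toy version follows - the
numbers are arbitrary, just to show the kind of per-collision work at stake:

    #include <cstdio>

    struct Vec3 { double x, y, z; };

    // Torque = r x F: hit a crate off its centre of mass and the cross
    // product of the contact offset (r) with the force (F) is what spins it.
    Vec3 cross(const Vec3& a, const Vec3& b) {
        return { a.y * b.z - a.z * b.y,
                 a.z * b.x - a.x * b.z,
                 a.x * b.y - a.y * b.x };
    }

    int main() {
        Vec3 r = { 0.5, 0.0, 0.0 };   // hit 0.5 m to the right of the centre
        Vec3 F = { 0.0, 0.0, 100.0 }; // 100 N push straight into the crate
        Vec3 t = cross(r, F);
        // Expect (0.0, -50.0, 0.0): the crate translates AND spins.
        std::printf("torque = (%.1f, %.1f, %.1f) N*m\n", t.x, t.y, t.z);
        return 0;
    }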

Ken.
--
-------------------------------------------------------------------------
Mail: kmarsh at charm dot net | Fire Rumsfeld, secure Iraq's borders.
WWW: http://www.charm.net/~kmarsh | Our border with Mexico too.
-------------------------------------------------------------------------
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

All I know is that Ageia believes it will work over the regular PCI bus.
It would just about have to, as most motherboards only have free PCI slots,
though they are also talking about PCI Express versions. I don't know how
they are going to do that, as at most, new motherboards only have PCI-e x1
slots (there are graphics cards that will work in these slots,
though, just not every PCI-e motherboard has them), in addition to the PCI-e
x16 graphics card slot.

Take a look at how the Audigy soundcard works. It does some very
complicated stuff in terms of working with geometry reflections, but
obviously it works, even with the rather old PCI bus.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

In article <EScDe.187$xn.5@bignews6.bellsouth.net>,
Magnulus <magnulus@bellsouth.net> wrote:
#Imagine being able to blow holes in walls with
#rocket launchers anywhere on a map, having real sandstorms, avalanches,
#rockslides, incredibly realistic explosions with realistic flying shrapnel.
#You won't be able to do any of those things simply with dual cores.

And... why not? Simulators are doing all these things with single cores
right now. They COULD do it better with dual CPUs. (Whether they'll
bother depends on the stupidity of the game manufacturers.)

# When people see this stuff in action they will demand it.

Just like they are demanding better AI?

Ken.
--
-------------------------------------------------------------------------
Mail: kmarsh at charm dot net | Fire Rumsfeld, secure Iraq's borders.
WWW: http://www.charm.net/~kmarsh | Our border with Mexico too.
-------------------------------------------------------------------------
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

"Ken Marsh" <kmarsh@fellspt.charm.net> wrote in message
news:wReDe.8$fb1.1406@news.abs.net...
> And... why not? Simulators are doing all these things with single cores
> right now.

Most simulations have much simpler physics than you think.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

"John Lewis" <john.dsl@verizon.net> wrote in message
news:42ddc0af.1412458@news.verizon.net...
> AI is far more important to the single-player experience than physics
> perfection. And obviously you have not played with the bots in Unreal
> Tournament 2004 after a little tweaking on the custom settings.

But hardly anybody talks about the AI in Unreal Tournament, because it's
an online game.

DICE and good AI don't go together. I daresay the AI in Star Wars
Battlefront was better than in any DICE game. And it didn't cause much
slowdown. But again, people don't rave about the AI in what is basically a
multiplayer game. Multiplayer game = no AI needed.

> The dual-core can equally handle Havok's multi-thread physics
> computations and the AI computations,

In my cynical mind, I see the upcoming games having no better AI than in
the past. Really, how much AI do you need in a singleplayer game that is
heavily based around scripting and trigger points, which seems to be what
most FPS games are doing? OTOH, a single core 2 GHz CPU can only handle a
few hundred rigid body physics calculations. A dual core CPU will at most
handle a thousand or so. It will not be able to handle fluids or thousands
of particles.
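
A toy illustration of why the particle counts hurt (the counts and the bare
Euler step are made up - real engines do far more per particle): moving the
particles is the cheap, linear part; it's the interactions that blow up if
done naively.

    #include <cstdio>
    #include <vector>

    struct Particle { float x, y, z, vx, vy, vz; };

    int main() {
        const float dt = 1.0f / 60.0f;
        std::vector<Particle> p(10000, Particle{0, 10, 0, 0, 0, 0});

        // O(N) integration: gravity plus one Euler step per particle. Cheap.
        for (auto& q : p) {
            q.vy -= 9.81f * dt;
            q.x += q.vx * dt; q.y += q.vy * dt; q.z += q.vz * dt;
        }

        // The expensive part: naive pairwise interaction/collision checks.
        long long pairs = (long long)p.size() * (p.size() - 1) / 2;
        std::printf("naive pair checks per step: %lld\n", pairs); // ~50 million
        return 0;
    }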

>
> PhysX is a dead duck in today's computer-gaming market. And the
> consoles certainly won't be embedding a PhysX chip any time
> soon. So the multi-platform developers would have to specially
> accommodate the few purchasers of the PhysX card....
> that won't happen..

I for one would like to see fewer multiplatform games. Consoles can do
what they want; I want to see what PC developers can do.

Your argument is also foolish. Rockstar added EAX support to Grand Theft
Auto, even though they didn't have to, as there was nothing similar on the
PS2 or Xbox. Assuming console developers won't add similar physics support
to at least some PC ports is absurd, especially considering Ageia has
licensed their middleware engine to several console developers, including
Epic Megagames (Unreal).
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

On 19 Jul 2005 13:42:55 -0700, "eventerke@nspm.h0tmail.com"
<eventerke@hotmail.com> wrote:

> [snip]
>
>Good points. It's starting to seem like a dedicated physics processor
>is more in line with the simulator market (driving, flight) than with
>general gaming. Bundling with a graphics card seems like a great idea.
>It would probably help push along the installed base faster than a
>stand-alone card. I wonder if there are some interesting efficiencies
>that could be realized by accelerating movement and rendering
>calculations on the same hardware (or at least out of shared high-speed
>memory).

I wonder how well that would work. Presumably a physics card would not
only need to receive data, but it would also need to push back its
results to the CPU. AGP is great for RECEIVING data (usually textures
directly from memory), but it's not really a two-way street; the
bandwidth in the other direction is pretty minimal in comparison. And
the GPU probably hogs that all to itself already, so an AGP
GPU/physics card combo could very well have bandwidth issues.

Of course, with the industry slowly moving to PCI-E, this may not be
an issue... but I don't know if PCI-E's bandwidth is asymmetrical or
not.


 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

On Wed, 20 Jul 2005 01:27:45 -0400, "Magnulus"
<magnulus@bellsouth.net> wrote:

>
>"John Lewis" <john.dsl@verizon.net> wrote in message
>news:42ddc0af.1412458@news.verizon.net...
>> AI is far more important to the single-player experience than physics
>> perfection. And obviously you have not played with the bots in Unreal
>> Tournament 2004 after a little tweaking on the custom settings.
>
> But hardly anybody talks about the AI in Unreal Tournament, because it's
>an online game.
>
> DICE and good AI don't go together. I daresay the AI in Star Wars
>Battlefront was better than in any DICE game.

Have you played BF2? SP mode? Or are you still stuck on Joint
Operations?

>And it didn't cause much
>slowdown. But again, people don't rave about the AI in what is basically a
>multiplayer game. Multiplayer game = no AI needed.
>
>> The dual-core can equally handle Havok's multi-thread physics
>> computations and the AI computations,
>
> In my cynical mind, I see the upcoming games having no better AI than in
>the past. Really, how much AI do you need in a singleplayer game that is
>heavily based around scripting and trigger points, which seems to be what
>most FPS games are doing?

Far Cry. I'm sure that Crytek is drooling right now over the AI
possibilities with a second full CPU at their disposal. The
autonomous-AI implementation in Far Cry was constrained by
processor load. And Far Cry's physics (and BF2's physics also) are
just fine for me and probably 99% of other action-gamers. Sorry,
I want bots that behave more like real live opponents BEFORE
I kill them (or they kill me...). Exactly how they fly apart after I
blast them I care little about - rag-doll is just fine.

> OTOH, a single core 2 GHz CPU can only handle a
>few hundred rigid body physics calculations. A dual core CPU will at most
>handle a thousand or so. It will not be able to handle fluids or thousands
>of particles.
>

Who cares........really....unless you are a submarine or aircraft
designer.

>>
>> PhysX is a dead duck in today's computer-gaming market. And the
>> consoles certainly won't be embedding a PhysX chip any time
>> soon. So the multi-platform developers would have to specially
>> accommodate the few purchasers of the PhysX card....
>> that won't happen..
>
> I for one would like to see fewer multiplatform games. Consoles can do
>what they want, I want to see what PC developers can do.

The exclusive-PC developer is unfortunately a rapidly dying breed.
Sorry...
>
> Your arguement is also foolish. Rockstar added EAX support to Grand Theft
>Auto, even though they didn't have to, as there was nothing similar on the
>PS2 or XBox.

So what --- they added it to their previous game-ports to the PC. The
tooling is simple, and probably 90% of action-gamers have Creative
audio in their machines.

> Assuming consoles developers won't add similar physics support
>to at least some PC ports is absurd, especially considering Ageia has
>licensed their middleware engine to several console developers, including
>Epic Megagames (Unreal).

Use Ageia software and port over to a multicore PC, sure - as a
potential alternative to Havok.

Support Ageia hardware (i.e. PhysX) -- few or no takers.

So, please let us know when:

(a) you buy your PhysX board

(b) you buy a dual-core CPU

If I were a betting person, I would take odds on (b) happening
long before (a)... Of course, I am making the rash assumption
that you need to budget your PC-related expenditures; that you are
not rolling in so much dough that you can buy anything to which
you take a fancy, with or without game-developer support.

John Lewis
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

You don't think fluids or particles are important? People talk about
pushing graphics and photorealism - well, you aren't going to be able to do
those things unless the physics to make them behave realistically is there.
Paying 250 dollars for a physics card is relatively trivial compared to the
500 dollars for a dual-core CPU, and in either case, support will depend on
developers. Getting a dual-core CPU won't automatically mean anything for
games. In general game applications right now, a dual-core CPU does very
little.
 

Andrew
Archived from groups: comp.sys.ibm.pc.games.action

On Wed, 20 Jul 2005 03:26:24 -0400, "Magnulus"
<magnulus@bellsouth.net> wrote:

> You don't think fluids or particles are important? People talk about
>pushing graphics and photorealism - well, you aren't going to be able to do
>those things unless the physics to make them behave realistically is there.
>Paying 250 dollars for a physics card is relatively trivial compared to the
>500 dollars for a dual-core CPU, and in either case, support will depend on
>developers. Getting a dual-core CPU won't automatically mean anything for
>games. In general game applications right now, a dual-core CPU does very
>little.

But in a year or two's time, dual-core CPUs will be commonplace and
a lot cheaper, and there will be about 5 people who will have bought
a PhysX card.
--
Andrew, contact via interpleb.blogspot.com
Help make Usenet a better place: English is read downwards,
please don't top post. Trim replies to quote only relevant text.
Check groups.google.com before asking an obvious question.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

"Andrew" <spamtrap@localhost.> wrote in message
news:l9vrd1tu15emcnicbqki4tgidim4jk1nf3@4ax.com...
> But in a year or two's time, dual-core CPUs will be commonplace and
> a lot cheaper, and there will be about 5 people who will have bought
> a PhysX card.

In a year - no. The Dells of the world have zero interest in multi-core
computers except maybe for their very high-end gaming rigs. The PhysX chip
will be out later this year. That's one year's head start.

And JL is missing a big fact. The PS3 is not a multi-core processor.
It's just a single core with seven DSPs. Not that different in theory from
the PhysX chip or the Audigy's processor. Which means there will only be
one really multi-core console, and even then most developers will initially
use the other cores for prefetching data, not for running multiple
threads.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

See my clarification above - I didn't mean impossible, just not
technically feasible at the time.

I still think there is a fundamental difference between what 3D
accelerators and sound cards brought to the gaming experience and what
a physics processor could add, on a $/CPU-cycles vs. wow-factor basis.
After all, we interact with our PCs mainly through what we see and
hear. If I break a window or blow a hole in a wall in a game, I'm not
sure how much more fun it would be knowing the interactions of every
shard of glass and piece of brick were calculated to the nth degree of
precision ;).
On something like a flight or driving simulator you could have the
physics processor handling complex aero calculations or a
high-precision tire and suspension model, freeing up the CPU for much
better AI (which is very noticeable in the single-player portions of
these games). That's something I really might consider paying $100+ to
have - several of the best sims are highly CPU-bound. But it's all
about developer support - you know they'll do something to properly
take advantage of dual-core CPUs before they do anything like PhysX
support.

Kendt
 

Andrew
Archived from groups: comp.sys.ibm.pc.games.action

On Wed, 20 Jul 2005 05:09:57 -0400, "Magnulus"
<magnulus@bellsouth.net> wrote:

> In a year- no. The Dell's of the world have zero interest in multi-core
>computers except maybe for their very high end gaming rigs. The PhysX chip
>will be out later this year. That's one year's head start.

The PhysX card is purely aimed at high-end gaming. How many casual
gamers would fork out $250 for a PhysX card? I am at the geeky,
relatively high end of PC gaming and I am not at all interested in
buying one at that price. Dual-core CPUs are already available, so that
is even more of a head start.
--
Andrew, contact via interpleb.blogspot.com
Help make Usenet a better place: English is read downwards,
please don't top post. Trim replies to quote only relevant text.
Check groups.google.com before asking an obvious question.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

Keep in mind the PhysX chip will come down in price over time, just as
sound cards have. Initially the Audigy card cost more than it does today, too.
Also, the expectation is that a physics card/chip will have a longer, more
stable life cycle than a graphics card, because the hardware is not going to
advance as quickly. Thus it will mirror sound cards more than graphics
cards.
 
Guest
Archived from groups: comp.sys.ibm.pc.games.action

eventerke@nspm.h0tmail.com wrote:
> But the thing about those $200-$300 sound and video cards is that they
> let your computer do something you simply could not do by throwing more
> CPU cycles at the problem.
> On admittedly casual observation, I don't see the same thing applying
> to a physics processor.
>
> Kendt
>

Why not?

Nobody thought a piece of silicon would be doing to pixels what it is
now doing.

More CPU cycles, by definition, equals more processing for the same
instruction set, which means more physics calculations.

--
Walter Mitty
-
Useless, waste of money research of the day : http://tinyurl.com/3tdeu
" Format wars could 'confuse users'"
http://www.tinyurl.com