Dedicated Game Physics Card for Gamers???

Archived from groups: comp.sys.ibm.pc.games.action

Check it out!
http://www.theinquirer.net/?article=21648

rms
15 answers
  1.

    Andrew wrote:
    > Moore's law and
    > multicore CPUs will make this obsolete IMO.

    If that was true, we'd still be running on VGA cards with the CPU doing
    all the 3D rendering. In reality, simple tasks like rendering 3D
    triangles and performing physics calculations can be done much more
    efficiently in dedicated hardware than a general-purpose CPU... plus
    offloading them to hardware leaves more cycles for AI and other
    CPU-intensive tasks.

    I've been wondering for a while when Microsoft would add
    'DirectPhysics' to all the other DirectX interfaces and open up a whole
    new hardware market... may not be too long now.

    Mark
  2.

    Andrew wrote:
    > 3D rendering is way more computationally intensive than physics. The
    > fact that you can run HL2 on a 1GHz CPU but it requires a 3D card
    > demonstrates it.

    Physics is way more computationally intensive than 3D graphics if done
    properly. HL2 physics is on the same kind of level as the 3D graphics
    in Quake 1: impressive for its time, but a joke a few years later.
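
    To put a number on it: even naive collision detection between n moving
    objects means testing every pair, so the cost grows as n squared. A toy
    sketch in Python (purely illustrative; no shipping engine does it this
    brutishly, which is exactly why clever algorithms and dedicated hardware
    matter):

```python
from itertools import combinations

def naive_collisions(spheres):
    """Brute-force sphere overlap test: checks all n*(n-1)/2 pairs.

    Each sphere is (x, y, z, radius). Doubling the object count roughly
    quadruples the work -- 1000 objects means 499,500 pair tests per
    simulation step, before any contact resolution is even attempted.
    """
    hits = []
    for (i, a), (j, b) in combinations(enumerate(spheres), 2):
        # Compare squared distance to squared radius sum; avoids a sqrt.
        d2 = sum((a[k] - b[k]) ** 2 for k in range(3))
        if d2 <= (a[3] + b[3]) ** 2:
            hits.append((i, j))
    return hits
```
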

    > Possibly a PPU could be embedded somewhere in a PC, but the idea of
    > gamers having to buy one as a separate card will never happen IMO.

    I have a suspicion it may end up on the graphics card (since it could
    perform the physics calculations and feed vertices directly into the
    graphics chip), but I wouldn't be too surprised if we're buying physics
    cards for PCs as well as graphics cards in the next few years.

    Mark
  3.

    On Tue, 08 Mar 2005 00:56:06 GMT, "rms" <rsquires@flashREMOVE.net>
    wrote:

    >Check it out!
    > http://www.theinquirer.net/?article=21648

    I can't see that catching on myself. When I played through HL2 I
    didn't find myself thinking "this needs more physics". Moore's law and
    multicore CPUs will make this obsolete IMO.
    --
    Andrew, contact via interpleb.blogspot.com
    Help make Usenet a better place: English is read downwards,
    please don't top post. Trim replies to quote only relevant text.
    Check groups.google.com before asking an obvious question.
  4.

    In article <GK6Xd.1394$ZB6.323@newssvr19.news.prodigy.com>,
    rsquires@flashREMOVE.net says...
    > Check it out!
    > http://www.theinquirer.net/?article=21648

    The thing is that graphics card manufacturers are
    already making expansion cards with a decent amount of
    memory and fast, highly vectorised processors, and are
    used to working with game developers. This company is
    going to have to compete with Nvidia and ATI.

    - Factory
  5.

    I could see something like that working, especially if it's
    PCI-compatible.

    It would be useful not only in first-person shooters and action games, but
    also in simulations. Already, games like IL-2 and Microsoft Flight
    Simulator can be bogged down by physics calculations, which is why they
    are so CPU-limited. It might make more sense to upgrade a physics chip
    rather than a CPU.
  6.

    On 8 Mar 2005 04:20:37 -0800, mmaker@my-deja.com wrote:

    >If that was true, we'd still be running on VGA cards with the CPU doing
    >all the 3D rendering. In reality, simple tasks like rendering 3D
    >triangles and performing physics calculations can be done much more
    >efficiently in dedicated hardware than a general-purpose CPU... plus
    >offloading them to hardware leaves more cycles for AI and other
    >CPU-intensive tasks.

    3D rendering is way more computationally intensive than physics. The
    fact that you can run HL2 on a 1GHz CPU but it requires a 3D card
    demonstrates it.

    Possibly a PPU could be embedded somewhere in a PC, but the idea of
    gamers having to buy one as a separate card will never happen IMO.
    --
    Andrew, contact via interpleb.blogspot.com
    Help make Usenet a better place: English is read downwards,
    please don't top post. Trim replies to quote only relevant text.
    Check groups.google.com before asking an obvious question.
  7.

    On Tue, 08 Mar 2005 12:47:04 +0000, Andrew <spamtrap@localhost.>
    wrote:


    >3D rendering is way more computationally intensive than physics.

    Not in the world of simulations.

    >The
    >fact that you can run HL2 on a 1GHz CPU but it requires a 3D card
    >demonstrates it.

    HL2 is not a simulation.
  8.

    "rms" <rsquires@flashREMOVE.net> wrote in message
    news:GK6Xd.1394$ZB6.323@newssvr19.news.prodigy.com...
    > Check it out!
    > http://www.theinquirer.net/?article=21648
    >
    > rms
    >

    They'd best be reasonably priced or they won't catch on...
  9.

    On Tue, 08 Mar 2005 00:56:06 GMT, "rms" <rsquires@flashREMOVE.net>
    wrote:

    >Check it out!
    > http://www.theinquirer.net/?article=21648
    >
    >rms
    >
    >

    Nothing new here. DSP processor cards have been used for years
    to speed up physics and spatial calculations and offload the CPU.
    Take a look at some of the computer-animation systems. A custom chip
    is senseless... the high-powered DSPs from TI or Analog Devices
    are far cheaper. This is highly likely to be a company taking some
    naive-in-the-technology venture capitalists for a ride... or maybe
    deluding themselves on the way...

    Without a common sw-interface standard agreed on by
    all game developers, the general usage of such hardware is moot...
    the hardware developer would have to write custom-interface code
    for each game, if not the physics code itself.

    The Cell processor in the PS3 is likely to be able to easily dole out
    physics operations among its compute units, but remember this is a
    closed system and can permit such flexibility between its software
    and hardware. Not so easy in the mostly-single-CPU-centric PC
    (regardless of multiple cores and HT). Remember, the data is still
    bottlenecked at the pins of a single-chip and sharing that data
    in an efficient non-symmetric way with a custom 3rd-party processing
    board as proposed in the article is a hair-raising system and
    interface software task.
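
    Back-of-the-envelope arithmetic (my own invented numbers, not from the
    article) suggests the raw state traffic is small; the hair-raising part
    is the per-frame round trip and synchronization, not bandwidth:

```python
# Illustrative, assumed numbers: traffic to ship rigid-body state
# between the CPU and an add-in physics board every frame.
objects = 10_000          # simulated rigid bodies
bytes_per_object = 64     # position, orientation, velocities (assumed layout)
fps = 60

bytes_per_second = objects * bytes_per_object * fps
print(bytes_per_second / 1e6, "MB/s each way")   # 38.4 MB/s

# Well under plain 32-bit/33 MHz PCI's ~133 MB/s theoretical peak --
# the hard part is making the hand-off twice per 16 ms frame without
# stalling either processor.
```
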

    Anyway, the guy who wrote the article can't even spell and can't
    get developers' names right.

    John Lewis
  10.

    <mmaker@my-deja.com> wrote in message
    news:1110284437.787116.113710@o13g2000cwo.googlegroups.com...
    > Andrew wrote:
    >> Moore's law and
    >> multicore CPUs will make this obsolete IMO.
    >
    > If that was true, we'd still be running on VGA cards with the CPU doing
    > all the 3D rendering. In reality, simple tasks like rendering 3D
    > triangles and performing physics calculations can be done much more
    > efficiently in dedicated hardware than a general-purpose CPU... plus
    > offloading them to hardware leaves more cycles for AI and other
    > CPU-intensive tasks.
    >
    > I've been wondering for a while when Microsoft would add
    > 'DirectPhysics' to all the other DirectX interfaces and open up a whole
    > new hardware market... may not be too long now.
    >
    > Mark
    >

    I can see it now. A dedicated CPU could be used for graphics, physics,
    sound, AI, you name it. But the real deal is that graphical texturing and
    polygon pushing require some of the most intense computations. I would think
    that dual-CPU setups, or the dual-core setups in the lineup for future
    CPUs, would mitigate the need for something like a dedicated physics card.
    Although most people thought 3D cards were a gimmick at first too. Now even
    the simplest of games use 3D.
  11.

    > Without a common sw-interface standard agreed on by
    > all game developers, the general usage of such hardware is moot...
    > the hardware developer would have to write custom-interface code
    > for each game, if not the physics code itself.
    >

    Why would it be any different than current graphics or sound cards? DirectX
    is a standard API. Just integrate a physics API and voila, you're all set. I
    would think with a physics card you would just have to specify material
    properties like its state (liquid, gas, solid), density,
    hardness/brittleness, even its boiling and melting points. Then the physics
    processor would crunch the numbers and pass it on to the video processor.
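
    As a strawman, the material description above might look something like
    this (a hypothetical sketch; the field names and the crude phase lookup
    are my own invention, not any real physics API):

```python
from dataclasses import dataclass

@dataclass
class Material:
    """Hypothetical material record a physics API might accept."""
    state: str            # "solid", "liquid", or "gas" at rest
    density: float        # kg/m^3
    brittleness: float    # 0.0 (tough) .. 1.0 (shatters easily)
    melting_point: float  # kelvin
    boiling_point: float  # kelvin

def phase_at(m: Material, temperature: float) -> str:
    """Crude phase lookup from the material's transition points."""
    if temperature < m.melting_point:
        return "solid"
    if temperature < m.boiling_point:
        return "liquid"
    return "gas"

water = Material("liquid", 1000.0, 0.0, 273.15, 373.15)
```
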
  12.

    On Wed, 9 Mar 2005 18:34:44 -0500, "HockeyTownUSA"
    <magma@killspam.comcast.net> wrote:

    >
    >> Without a common sw-interface standard agreed on by
    >> all game developers, the general usage of such hardware is moot...
    >> the hardware developer would have to write custom-interface code
    >> for each game, if not the physics code itself.
    >>
    >
    >Why would it be any different than current graphics or sound cards? DirectX
    >is a standard API. Just integrate a physics API and voila, you're all set. I
    >would think with a physics card you would just have to specify material
    >properties like its state (liquid, gas, solid), density,
    >hardness/brittleness, even its boiling and melting points. Then the physics
    >processor would crunch the numbers and pass it on to the video processor.
    >

    And how long has it taken to get to the current version of DirectX?
    Apply for a job at AGEIA. If you get hired you should be assured of
    very long-term employment, including countless standards meetings, at
    least up to the time they run out of money...

    John Lewis

  13.

    On Wed, 9 Mar 2005 18:34:44 -0500, "HockeyTownUSA"
    <magma@killspam.comcast.net> wrote:

    >
    >> Without a common sw-interface standard agreed on by
    >> all game developers, the general usage of such hardware is moot...
    >> the hardware developer would have to write custom-interface code
    >> for each game, if not the physics code itself.
    >>
    >
    >Why would it be any different than current graphics or sound cards? DirectX
    >is a standard API. Just integrate a physics API and voila, you're all set. I
    >would think with a physics card you would just have to specify material
    >properties like its state (liquid, gas, solid), density,
    >hardness/brittleness, even its boiling and melting points. Then the physics
    >processor would crunch the numbers and pass it on to the video processor.
    >
    >

    Has Havok shown any interest in this development?
    Layering their physics models on this hardware seems a natural.
    No indication in the article as to whether AGEIA sought input
    from leaders in game-physics implementations (DICE, Havok,
    etc.) in the design of their chip. If not, it is probably doomed.

    John Lewis
  14.

    "John Lewis" <john.dsl@verizon.net> wrote in message
    news:422ff825.10313319@news.verizon.net...
    > On Wed, 9 Mar 2005 18:34:44 -0500, "HockeyTownUSA"
    > <magma@killspam.comcast.net> wrote:
    >
    >>
    >>> Without a common sw-interface standard agreed on by
    >>> all game developers, the general usage of such hardware is moot...
    >>> the hardware developer would have to write custom-interface code
    >>> for each game, if not the physics code itself.
    >>>
    >>
    >>Why would it be any different than current graphics or sound cards? DirectX
    >>is a standard API. Just integrate a physics API and voila, you're all set. I
    >>would think with a physics card you would just have to specify material
    >>properties like its state (liquid, gas, solid), density,
    >>hardness/brittleness, even its boiling and melting points. Then the physics
    >>processor would crunch the numbers and pass it on to the video processor.
    >>
    >>
    >
    > Has Havok shown any interest in this development?
    > Layering their physics models on this hardware seems a natural.
    > No indication in the article as to whether AGEIA sought input
    > from leaders in game-physics implementations (DICE, Havok,
    > etc.) in the design of their chip. If not, it is probably doomed.
    >
    > John Lewis
    >
    >

    Yeah, aren't the physics in HL2 modeled with just four lines of code?
  15.

    Sounds great. I'm sure it will become a standard part of gaming
    hardware. We can always use more power and realism. And companies are
    always looking for a way to get an edge in the marketplace.