
Nvidia puts their money on PhysX

October 13, 2006 2:14:56 PM

http://www.theinquirer.net/default.aspx?article=35068

Thought this was funny.
Sorry if it's a double post.

Inq + Friday 13 = Any conclusion remains possible.
October 13, 2006 2:35:38 PM

Hmm well I'd be tempted to say it's just the Inquirer, but there is photographic evidence.

This is just wild speculation, but perhaps it wouldn't be crazy to conclude Nvidia is planning on buying out Ageia? It would make sense to include the PhysX chip on a graphics card, the way the 3D accelerator was integrated onto 2D cards.

Of course it could just be another sensationalist Inquirer story!
October 13, 2006 2:48:50 PM

If I were Nvidia, even before the ATI/AMD merger stuff, I would have jumped on Ageia. It gives them an advantage that ATI's architecture would otherwise have had. I can't understand why they've waited so long.
October 13, 2006 2:49:01 PM

I doubt it, since they've been working with Havok to make their own physics processor... if anything, the Dell came with the Ageia card...
October 13, 2006 2:56:45 PM

I personally think the concept of a dedicated physics card is dead. It makes more sense to offload the physics to a separate core in a multi-core CPU. More people are likely to have a multi-core CPU than a dedicated physics card.
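(As a purely illustrative sketch of that idea, not taken from any post here: a game could push its physics step onto a dedicated worker thread so it runs on a spare core. The struct, function names and 60 Hz tick below are all hypothetical.)

```cpp
#include <atomic>
#include <chrono>
#include <functional>
#include <thread>

// Hypothetical world state; a real engine would hold rigid bodies, constraints, etc.
struct World { /* ... */ };

std::atomic<bool> running{true};

// Physics loop the OS can schedule onto a second core: it steps the
// simulation at a fixed ~60 Hz rate, independent of the rendering thread.
void physics_loop(World& world) {
    const auto tick = std::chrono::milliseconds(16);
    while (running.load()) {
        // step_simulation(world, 0.016f);  // advance the world one tick (placeholder)
        std::this_thread::sleep_for(tick);
    }
}

int main() {
    World world;
    std::thread physics(physics_loop, std::ref(world));  // physics on its own core
    // ... main game/render loop stays on the first core ...
    running.store(false);
    physics.join();
}
```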
October 13, 2006 3:02:38 PM

Agreed! Why not just pay $250 for a dual core that can do more than physics calculations? Let's see, I pay $250 for a PhysX card and how many games support this? Price of a PhysX card > price of games that support it.

Conclusion: PhysX is a waste of money until more games support it!
October 13, 2006 3:25:13 PM

Quote:
Agreed! Why not just pay $250 for a dual core that can do more than physics calculations? Let's see, I pay $250 for a PhysX card and how many games support this? Price of a PhysX card > price of games that support it.

Conclusion: PhysX is a waste of money until more games support it!


Well, there are numerous differences in the architecture of CPUs and GPUs/PPUs that make it theoretically implausible to run physics calculations on a CPU. Someone with more knowledge on the subject could help you more than I can, but I believe it's something to do with the floating-point operations. Not entirely sure; I just know it's not really plausible on a CPU.
October 13, 2006 3:42:24 PM

Quote:
You obviously never saw the tornado demo of Alan Wake running on a Kentsfield. Basically it's been designed to utilize multi-core CPUs, dedicating one or more cores (depending on how many you have) to the physics processing. Then again, they could have had a PPU card hiding somewhere inside the machine...

http://www.youtube.com/watch?v=DetnKgOxrSI


The reason a CPU cannot do physics as well as a dedicated PPU is that it cannot do as many floating-point calculations per second, which is basically what you need to calculate physics. If someone more knowledgeable than I comes along they can explain the intricacies better.
October 13, 2006 4:14:44 PM

Quote:
You obviously never saw the tornado demo of Alan Wake running on a Kentsfield. Basically it's been designed to utilize multi-core CPUs, dedicating one or more cores (depending on how many you have) to the physics processing. Then again, they could have had a PPU card hiding somewhere inside the machine...

http://www.youtube.com/watch?v=DetnKgOxrSI


The reason a CPU cannot do physics as well as a dedicated PPU is that it cannot do as many floating-point calculations per second, which is basically what you need to calculate physics. If someone more knowledgeable than I comes along they can explain the intricacies better.

Right, Folding@Home said that folding on GPUs was much faster due to the huge gain in FLOPS over CPUs. However, they also said they cannot get that power out of nVidia cards due to the differences (I think they use the shaders for calculation, which the 19xx series has much more of than the 79xx; I may be wrong on the exact specifics). This may mean that it would make sense for nVidia to ally with Ageia if they do not want to add the huge number of shaders.

As far as running it on CPUs, if the calculations can be spread among cores it may soon be possible to use the CPU. While a GPU can do more FLOPS than a single core (I think it is 2-3 times the number at the moment), a quad core should be able to match that performance by spreading those calculations to the other three cores (1 core for the main game thread, 3 for physics).
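(Again just a hypothetical sketch of that 1+3 split, not anything from the demo being discussed: the main thread keeps the game loop while three workers each integrate a slice of the bodies.)

```cpp
#include <cstddef>
#include <functional>
#include <thread>
#include <vector>

struct Body { float x, y, z, vx, vy, vz; };

// Advance one slice of the bodies by dt; a stand-in for a real integrator.
void integrate_slice(std::vector<Body>& bodies, std::size_t begin, std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i) {
        bodies[i].x += bodies[i].vx * dt;
        bodies[i].y += bodies[i].vy * dt;
        bodies[i].z += bodies[i].vz * dt;
    }
}

// One physics tick fanned out over three worker threads (the "other three
// cores"), leaving the calling thread free for the main game loop.
void physics_tick(std::vector<Body>& bodies, float dt) {
    const std::size_t n = bodies.size();
    std::vector<std::thread> workers;
    for (std::size_t w = 0; w < 3; ++w) {
        std::size_t begin = n * w / 3, end = n * (w + 1) / 3;
        workers.emplace_back(integrate_slice, std::ref(bodies), begin, end, dt);
    }
    for (auto& t : workers) t.join();
}
```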
October 13, 2006 4:16:37 PM

Quote:
You obviously never saw the tornado demo of Alan Wake running on a Kentsfield. Basically it's been designed to utilize multi-core CPUs, dedicating one or more cores (depending on how many you have) to the physics processing. Then again, they could have had a PPU card hiding somewhere inside the machine...

http://www.youtube.com/watch?v=DetnKgOxrSI


The reason a CPU cannot do physics as well as a dedicated PPU is that it cannot do as many floating-point calculations per second, which is basically what you need to calculate physics. If someone more knowledgeable than I comes along they can explain the intricacies better.

Right, Folding@Home said that folding on GPUs was much faster due to the huge gain in FLOPS over CPUs. However, they also said they cannot get that power out of nVidia cards due to the differences (I think they use the shaders for calculation, which the 19xx series has much more of than the 79xx; I may be wrong on the exact specifics). This may mean that it would make sense for nVidia to ally with Ageia if they do not want to add the huge number of shaders.

As far as running it on CPUs, if the calculations can be spread among cores it may soon be possible to use the CPU. While a GPU can do more FLOPS than a single core (I think it is 2-3 times the number at the moment), a quad core should be able to match that performance by spreading those calculations to the other three cores (1 core for the main game thread, 3 for physics).

But isn't that a huge waste of cores? Surely using an £80 graphics card as a PPU is a much more sensible option, or am I the only one thinking this?
October 13, 2006 5:22:16 PM

Quote:


Right, Folding@Home said that folding on GPUs was much faster due to the huge gain in FLOPS over CPUs. However, they also said they cannot get that power out of nVidia cards due to the differences...

When and where did folding@home say that? They said they haven't gotten it to work with nvidia GPUs yet. http://www.tgdaily.com/2006/09/29/folding_at_home_to_us...
There have been a number of posts by fanboys saying that, but I don't see anywhere that F@H has said that. So many "enthusiasts" seem to be completely clueless about how projects are run. If you read all the information you can see that they chose the ATI GPU over the nVidia GPU because it was easier to make it work (and I assume more people have them right now, which likely also influenced their decision). They could probably have coded it to run on a PhysX card with even more ease, and it would likely have kicked the crap out of the ATI card and the nVidia card combined, but almost no one has PhysX cards, so that would be pointless.

There are also a number of scientific co-processing add-on boards that would put an entire "enthusiast" gaming machine to shame as far as floating-point power goes, but in a project like F@H, ease of programming and how many people actually run it are far more important than raw speed. The SETI@home team admitted that the "screensaver" mode of their distributed computing program slowed processing down to as little as 1/4th the speed, but that they continued to support it because it increased the number of people working on the project, thus improving the number of units processed. There are many factors that go into programming projects like this, not just what is fastest for the job, because ATI GPUs are most certainly NOT the fastest thing by a long shot.


As for the OP: There was a PhysX card in it... so what? Was it demoing any programs that use it? Like another poster said, it probably just came with the Dell desktop. INQ makes really bad stories; they got a picture of an nVidia machine that happened to have a PhysX card in it and seem to have made up the rest. Unless they were demoing a game that can even use a PhysX card, this is a completely made-up story, pure rumor, no basis in reality. nVidia has its own physics processing projects already underway; unless those projects are having a serious lack of brain power there is no reason for them to buy another company's technology. Until there is evidence of one of those two things, you may safely disregard this story.
October 13, 2006 5:38:43 PM

Quote:
Agreed! Why not just pay $250 for a dual core that can do more than physics calculations? Let's see, I pay $250 for a PhysX card and how many games support this? Price of a PhysX card > price of games that support it.

Conclusion: PhysX is a waste of money until more games support it!


Well, there are numerous differences in the architecture of CPUs and GPUs/PPUs that make it theoretically implausible to run physics calculations on a CPU. Someone with more knowledge on the subject could help you more than I can, but I believe it's something to do with the floating-point operations. Not entirely sure; I just know it's not really plausible on a CPU.

Well, can you give me an example of these theoretically implausible physics calculations? What differences make it impossible?

There are problems with using a dedicated physics card beyond just number crunching. One is memory. Physics affects the interactivity of a game more than graphics does. Take a falling box: the way the box falls has a greater effect than changing the appearance of the box. This means the CPU and the physics card will need to communicate a lot more, which would mean a lot of data needs to be transferred in both directions. There was a major problem with the old AGP port: upstream to the GPU was very fast, downstream was very slow.

I am no expert, so I do not know what the bus transfer rate is like, but if the PCIe transfer is optimised for one direction, then that is a potential bottleneck.

This will not be the case in a multicore CPU.
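(A crude back-of-the-envelope illustration of that round-trip traffic; the scene size and per-body byte count below are made-up numbers, not from the post.)

```cpp
#include <cstdio>

int main() {
    // Hypothetical scene: 10,000 rigid bodies, ~64 bytes of state each,
    // exchanged with a physics card 60 times per second.
    const double bodies = 10000, bytes_per_body = 64, updates_per_sec = 60;
    const double one_way    = bodies * bytes_per_body * updates_per_sec; // CPU -> card
    const double round_trip = 2.0 * one_way;                             // plus results back
    std::printf("one-way: %.1f MB/s, round-trip: %.1f MB/s\n",
                one_way / 1e6, round_trip / 1e6);
    // The readback half (card -> CPU) is the direction that was so slow on AGP.
}
```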

Another possible reason: as more cores become available (Intel has announced quad-core), it becomes possible to dedicate more than one core to the physics.

Unless Ageia is bought out, they will find it hard to compete with Intel in processor development.
October 13, 2006 6:03:07 PM

Quote:

But isn't that a huge waste of cores? Surely using an £80 graphics card as a PPU is a much more sensible option, or am I the only one thinking this?


That's no argument. Most software available doesn't use a CPU as intended; instead of actually calculating, the processor is juggling the stack like mad. So right now it may seem as if the GPU is better, but if Intel intends to go multi-core like crazy, I bet $/core is going way down and in two years four cores will be cheaper than even an entry-level GPU. Then again, two years from now there might be GPUs integrated into the CPU, or at least available for CPU sockets...
October 13, 2006 6:22:24 PM

All I see is that the Nvidia PC is equipped with an Ageia card. Whatever impresses the masses, I guess. So they're using a physics card, big deal. I don't think they're fooling people about that; maybe they ran out of space and that's why they placed the desktop behind the monitor.
October 13, 2006 6:23:44 PM

Quote:
There are many factors that go into programming projects like this, not just what is fastest for the job, because ATI GPUs are most certainly NOT the fastest thing by a long shot.


However, the X1xxx chips from ATI are programmable, and some time ago ATI released a program that lets you encode video via the GPU, and it does it much faster than the CPU.

Thanks
October 13, 2006 6:46:14 PM

Quote:
All I see is that the Nvidia PC is equipped with an Ageia card. Whatever impresses the masses, I guess. So they're using a physics card, big deal. I don't think they're fooling people about that; maybe they ran out of space and that's why they placed the desktop behind the monitor.

I see a Dell PC with nV and Ageia hardware.
It would be really amusing if Dell gave out an ATI + Ageia rig to show off nV.
So funny, but it's a Dell. So what's the news?
October 13, 2006 6:49:18 PM

Quote:

But isn't that a huge waste of cores? Surely using an £80 graphics card as a PPU is a much more sensible option, or am I the only one thinking this?


That's no argument. Most software available doesn't use a CPU as intended; instead of actually calculating, the processor is juggling the stack like mad. So right now it may seem as if the GPU is better, but if Intel intends to go multi-core like crazy, I bet $/core is going way down and in two years four cores will be cheaper than even an entry-level GPU. Then again, two years from now there might be GPUs integrated into the CPU, or at least available for CPU sockets...

AMD open socket plans: http://www.simmtester.com/page/news/shownews.asp?num=95...

Open socket designs are good for bringing innovation to the market; good for consumers. GPU, PPU, FPGA, dual proc, Intel or VIA CPUs running on an "AMD" socket: these are all possibilities. An FPGA "co-processor" is the one that excites me the most. When an application is launched it could program the FPGA, and you then have ~500MHz of low-overhead specialized processing power to do exactly what you want, no matter how much load you put on the rest of your system. I am currently testing ASIC code on FPGAs running at a mere 53MHz and they can respond to and route 2GHz traffic on multiple channels simultaneously within a few nanoseconds. IMHO specialized logic chips are the only way computers are going to get significantly faster from here on out. FPGAs capable of over 500MHz are currently available. Specialized logic chips are extremely powerful, as you can see by a ~500MHz GPU having significantly more processing power than a dual-core ~3GHz CPU.

I don't think it's really feasible for high-end gaming GFX though, as they depend heavily on having tons of very fast RAM very close to the GPU with dedicated bandwidth, and that might not work well for a "socket" GFX solution if it relies on a single HyperTransport link and shares memory bandwidth with the rest of the system (but we're still talking much faster than current GFX cards that use system RAM). For mid-range GFX and many other applications it could work extremely well.

As far as it being a "waste" of cores: dual-cores are dropping in price and rapidly becoming the norm. Using a core that a system already has instead of buying additional hardware is certainly a valid strategy. CPUs are very inefficient, though, as they are designed to be able to do anything and require a lot of software overhead to accomplish most tasks, so they may simply not be fast enough, no matter how many extra cores you have, to perform some tasks (real-time GFX being an obvious example of something you wouldn't want to do with an "extra" core).
October 13, 2006 7:37:20 PM

Quote:
This may be one reason why people believe nVidia is too slow http://www.anandtech.com/video/showdoc.aspx?i=2849&p=3


"The situation for NVIDIA users however isn't as rosy, as while the research group would like to expand this to use the latest GeForce cards, their current attempts at implementing GPU-accelerated processing on those cards has shown that NVIDIA's cards are too slow compared to ATI's to be used." The article says that... but they don't quote anyone on the folding@home team as saying that, or any folding@home documents that say that.

The article I linked a few posts up references Stanford Associate Professor Vijay Pande as saying that "the group has not been able to get the software to work on Nvidia chips." The two articles were posted within a day of each other. If you're completely out of touch with reality and ****-retentive, I suppose you could say that not working at all is much, much slower, but any sane person would quickly realize that the reason ATI is supported now and nVidia isn't is because it was easier to code for ATI. If it had been the other way around, I'm sure it would be nVidia cards running Folding@home regardless of the number of shader units. If the Folding@home team got the code to work with nVidia cards they would release it; it's the only thing that makes sense, no matter how slow it was. They have Pentium II machines working on the project; they're not picky about where the GFLOPS come from. They haven't released it because they never got it to work. Maybe they had some trouble and "it's slower anyway" was one of the reasons they gave up (if they even did give up; F@H didn't say they had), but if that was the reason, I'm sure "and the G80 is coming out soon and should be faster and easier to code for" was the next reason.

More on subject: A CPU can be used for any number-crunching task, but there is a lot of overhead involved, as you not only have to give the CPU the information, you have to tell the CPU what to do with it. With a dedicated logic chip you give the chip the information and it does the same thing with it that it always does. It has transistors hard-coded to do complex equations without having to be told how. A CPU has to do the first step, write it to cache, read the instructions for the next step, read back the information and do the next step, over and over (a very vague description, but largely accurate). Some algorithms are more inefficient in this scenario than others. A dedicated logic chip can do any algorithm in one pass (but there is a limit to how many algorithms it knows how to do, whereas a CPU can do anything as the software tells it the algorithm each time). That not only significantly increases the amount of work done per cycle, but also greatly decreases the latency, which is why they are often used in real-time applications such as gaming GFX and real-time physics calculations. That's why a 500MHz GPU can kick the crap out of a 3GHz dual-core CPU and a 53MHz FPGA can do things a computer system could never dream of.
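(A loose software analogy for that difference, invented for illustration only: the first function is "told" the algorithm step by step and pays a fetch/decode cost per step, while the second has the whole calculation baked in as one fixed pipeline.)

```cpp
#include <utility>
#include <vector>

enum class Op { Add, Mul };

// "General-purpose CPU" style: the work arrives as a list of instructions
// that must be fetched and decoded for every single step.
float run_program(const std::vector<std::pair<Op, float>>& program, float x) {
    for (const auto& [op, operand] : program) {   // fetch + decode each step
        switch (op) {
            case Op::Add: x += operand; break;    // then execute it
            case Op::Mul: x *= operand; break;
        }
    }
    return x;
}

// "Dedicated logic" style: the same calculation hard-wired as one fixed
// function, with nothing to fetch or decode per step.
float fixed_pipeline(float x) {
    return (x * 2.0f + 3.0f) * 0.5f;
}
```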
October 13, 2006 8:32:48 PM

IMO the reason nV would look at Ageia is to bolster their physics knowledge base and R&D versus ATi, to apply to their GPU-based solution, not to promote, adopt or continue the PPU.

Of course people are making a lot of noise out of a PC that had the PPU in it, yet we don't know if it was actually used in any way; it may have been in the test system for other stuff prior to the expo/demo.
October 13, 2006 8:41:27 PM

Quote:
That's why a 500MHz GPU can kick the crap out of a 3GHz dual-core CPU and a 53MHz FPGA can do things a computer system could never dream of.


Yeah, FPGAs are great, but they wouldn't be as fast for the mundane stuff, and they'd be more expensive than an ASIC once the requirements were finalised; they'd also likely draw more power.

What would be great IMO is adding a good-sized FPGA alongside the ASIC (CPU/VPU) in order to be much more open-ended with options/features.
October 14, 2006 1:26:32 PM

Quote:
IMO the reason nV would look at Ageia is to bolster their physics knowledge base and R&D versus ATi, to apply to their GPU-based solution, not to promote, adopt or continue the PPU.

Of course people are making a lot of noise out of a PC that had the PPU in it, yet we don't know if it was actually used in any way; it may have been in the test system for other stuff prior to the expo/demo.


I see that ATI's physics engine is much more advanced than Nvidia's at this moment in time, so this is probably why.
October 14, 2006 5:09:02 PM

Quote:
IMO the reason nV would look at Ageia is to bolster their physics knowledge base and R&D versus ATi, to apply to their GPU-based solution, not to promote, adopt or continue the PPU.

Of course people are making a lot of noise out of a PC that had the PPU in it, yet we don't know if it was actually used in any way; it may have been in the test system for other stuff prior to the expo/demo.


I see that ATI's physics engine is much more advanced than Nvidia's at this moment in time, so this is probably why.

Uh? Where did you see this?
October 14, 2006 5:47:55 PM

Quote:

But isn't that a huge waste of cores? Surely using an £80 graphics card as a PPU is a much more sensible option, or am I the only one thinking this?


That's no argument. Most software available doesn't use a CPU as intended; instead of actually calculating, the processor is juggling the stack like mad. So right now it may seem as if the GPU is better, but if Intel intends to go multi-core like crazy, I bet $/core is going way down and in two years four cores will be cheaper than even an entry-level GPU. Then again, two years from now there might be GPUs integrated into the CPU, or at least available for CPU sockets...

AMD open socket plans: http://www.simmtester.com/page/news/shownews.asp?num=95...

Open socket designs are good for bringing innovation to the market; good for consumers. GPU, PPU, FPGA, dual proc, Intel or VIA CPUs running on an "AMD" socket: these are all possibilities. An FPGA "co-processor" is the one that excites me the most. When an application is launched it could program the FPGA, and you then have ~500MHz of low-overhead specialized processing power to do exactly what you want, no matter how much load you put on the rest of your system. I am currently testing ASIC code on FPGAs running at a mere 53MHz and they can respond to and route 2GHz traffic on multiple channels simultaneously within a few nanoseconds. IMHO specialized logic chips are the only way computers are going to get significantly faster from here on out. FPGAs capable of over 500MHz are currently available. Specialized logic chips are extremely powerful, as you can see by a ~500MHz GPU having significantly more processing power than a dual-core ~3GHz CPU.

I don't think it's really feasible for high-end gaming GFX though, as they depend heavily on having tons of very fast RAM very close to the GPU with dedicated bandwidth, and that might not work well for a "socket" GFX solution if it relies on a single HyperTransport link and shares memory bandwidth with the rest of the system (but we're still talking much faster than current GFX cards that use system RAM). For mid-range GFX and many other applications it could work extremely well.

As far as it being a "waste" of cores: dual-cores are dropping in price and rapidly becoming the norm. Using a core that a system already has instead of buying additional hardware is certainly a valid strategy. CPUs are very inefficient, though, as they are designed to be able to do anything and require a lot of software overhead to accomplish most tasks, so they may simply not be fast enough, no matter how many extra cores you have, to perform some tasks (real-time GFX being an obvious example of something you wouldn't want to do with an "extra" core).

Actually, Flasher, nVIDIA GPUs ARE too slow for GPGPU calculations.

Mike Houston: Stanford University Folding@Home Project

nVIDIA GPUs have too many limitations. First of all, they're 256-bit internally (vs 512-bit for ATi X18/19K VPUs), which normally would not be a problem, but nVIDIA 7x00 GPUs have a 64K limit on shader length. Thus many passes need to be made in order to get the work required by Folding@Home (or physics) done. The more passes, the slower it is.

ATi X18/19K VPUs support unlimited shader lengths (they can do all the work in a single pass).

Another aspect is the dedicated branching unit found on ATi X18/19K VPUs. These are used extensively by GPGPU applications. nVIDIA's 7x00 does not support such a feature, thus more passes are required, once more, for branching.

And last but not least, pure shader power. The ATi X19K series has more shader power than ANY current nVIDIA GPU (by 2 times or more).

All this makes nVIDIA 7x00 too slow for hardcore GPGPU applications.

Read up on it.

ATi X19K > AGEIA PhysX in GPGPU applications if dedicated to such an application as well, BTW.
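(To make the single-pass vs multi-pass point concrete, here is a purely illustrative CPU-side sketch, not how either vendor's hardware actually works: when a per-element computation has to be split because of an instruction-length limit, every extra pass adds a full write and read of an intermediate buffer.)

```cpp
#include <cstddef>
#include <vector>

// Single pass: the whole per-element computation runs in one go,
// touching memory once per element.
void single_pass(std::vector<float>& data) {
    for (float& x : data)
        x = (x * x + 1.0f) * 0.5f - 2.0f;
}

// Multi-pass: the same computation split into two "shaders" because of a
// (hypothetical) length limit; the intermediate buffer costs an extra full
// write and read of the whole data set for every additional pass.
void multi_pass(std::vector<float>& data) {
    std::vector<float> tmp(data.size());
    for (std::size_t i = 0; i < data.size(); ++i)   // pass 1
        tmp[i] = data[i] * data[i] + 1.0f;
    for (std::size_t i = 0; i < data.size(); ++i)   // pass 2
        data[i] = tmp[i] * 0.5f - 2.0f;
}
```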
October 14, 2006 5:51:21 PM

Quote:
IMO the reason nV would look at Ageia is to bolster their physics knowledge base and R&D versus ATi, to apply to their GPU-based solution, not to promote, adopt or continue the PPU.

Of course people are making a lot of noise out of a PC that had the PPU in it, yet we don't know if it was actually used in any way; it may have been in the test system for other stuff prior to the expo/demo.


I see that ATI's physics engine is much more advanced than Nvidia's at this moment in time, so this is probably why.

Uh? Where did you see this?

It's logical. The G80 will, however, be a powerhouse for FP calculations. It will place nVIDIA in first place for a few short months on every front. The R600 is said to be unimaginable... but then again this is an area of expertise over at ATi.
October 14, 2006 7:05:22 PM

Quote:
There are many factors that go into programming projects like this, not just what is fastest for the job, because ATI GPUs are most certainly NOT the fastest thing by a long shot.
The R580 is the fastest consumer GPU on the market.
Quote:
Actually, Flasher, nVIDIA GPUs ARE too slow for GPGPU calculations.

Mike Houston: Stanford University Folding@Home Project

nVIDIA GPUs have too many limitations. First of all, they're 256-bit internally (vs 512-bit for ATi X18/19K VPUs), which normally would not be a problem, but nVIDIA 7x00 GPUs have a 64K limit on shader length. Thus many passes need to be made in order to get the work required by Folding@Home (or physics) done. The more passes, the slower it is.

ATi X18/19K VPUs support unlimited shader lengths (they can do all the work in a single pass).

Another aspect is the dedicated branching unit found on ATi X18/19K VPUs. These are used extensively by GPGPU applications. nVIDIA's 7x00 does not support such a feature, thus more passes are required, once more, for branching.

And last but not least, pure shader power. The ATi X19K series has more shader power than ANY current nVIDIA GPU (by 2 times or more).

All this makes nVIDIA 7x00 too slow for hardcore GPGPU applications.

Read up on it.

ATi X19K > AGEIA PhysX in GPGPU applications if dedicated to such an application as well, BTW.
While ATI's current GPUs are faster, by the time the X1900's general purpose power really starts being used, the G80 and R600 will be out.
October 14, 2006 8:03:04 PM

Quote:

It's logical. The G80 will, however, be a powerhouse for FP calculations. It will place nVIDIA in first place for a few short months on every front. The R600 is said to be unimaginable... but then again this is an area of expertise over at ATi.


However, a lot depends on how much nV has improved their dynamic branching / branch prediction; right now the ATi chips are about 3-5 times as fast when branch prediction is called for... like in FAH.

So it's not just raw FP calculations but also branching capabilities that will need to be tweaked for nV to be competitive as a dedicated Brook/FAH processor.

Considering ASUS' mention of a dedicated nV PPU, the Ageia card may be for R&D of something unrelated to the demos being shown. I still don't see nV buying Ageia for anything more than R&D and maybe a patent or two, but buying now would mean paying too much for Ageia; they would be far wiser to simply crush Ageia and then buy the bits and pieces left over. Buying now forces a price premium and also would make them look predatory, thus adding fire to other potential deals. Speaking of which, it makes more sense for Intel to buy Ageia than either nV or AMD/ATi.
October 14, 2006 8:13:46 PM

Quote:
IMO the reason nV would look at Ageia is to bolster their physics knowledge base and R&D versus ATi, to apply to their GPU-based solution, not to promote, adopt or continue the PPU.

Of course people are making a lot of noise out of a PC that had the PPU in it, yet we don't know if it was actually used in any way; it may have been in the test system for other stuff prior to the expo/demo.


I see that ATI's physics engine is much more advanced than Nvidia's at this moment in time, so this is probably why.

Uh? Where did you see this?

You heard of the ATI demo where they used an X1600 to render the physics? I'll try and rummage around for the link.
October 14, 2006 9:46:29 PM

They say they are not doing physics and so are relying on an Ageia card. Also, it’s a Dell which has been offering Ageia boards for quite some time now.

Nothing major there...
October 15, 2006 5:51:57 AM

It's just that I would not condemn them for using it to "up things" a little bit (and since it isn't an Nvidia part, their statement would still be true). But you're right. I did not think this through. It would be like shooting themselves in the foot.

It’s really just a Dell with Ageia’s card. No need to say more.

But a question remains… Why keep the card in the case and take a chance like this?
October 15, 2006 7:27:24 PM

Quote:

But isn't that a huge waste of cores? Surely using an £80 graphics card as a PPU is a much more sensible option, or am I the only one thinking this?


That's no argument. Most software available doesn't use a CPU as intended; instead of actually calculating, the processor is juggling the stack like mad. So right now it may seem as if the GPU is better, but if Intel intends to go multi-core like crazy, I bet $/core is going way down and in two years four cores will be cheaper than even an entry-level GPU. Then again, two years from now there might be GPUs integrated into the CPU, or at least available for CPU sockets...

AMD open socket plans: http://www.simmtester.com/page/news/shownews.asp?num=95...

Open socket designs are good for bringing innovation to the market; good for consumers. GPU, PPU, FPGA, dual proc, Intel or VIA CPUs running on an "AMD" socket: these are all possibilities. An FPGA "co-processor" is the one that excites me the most. When an application is launched it could program the FPGA, and you then have ~500MHz of low-overhead specialized processing power to do exactly what you want, no matter how much load you put on the rest of your system. I am currently testing ASIC code on FPGAs running at a mere 53MHz and they can respond to and route 2GHz traffic on multiple channels simultaneously within a few nanoseconds. IMHO specialized logic chips are the only way computers are going to get significantly faster from here on out. FPGAs capable of over 500MHz are currently available. Specialized logic chips are extremely powerful, as you can see by a ~500MHz GPU having significantly more processing power than a dual-core ~3GHz CPU.

I don't think it's really feasible for high-end gaming GFX though, as they depend heavily on having tons of very fast RAM very close to the GPU with dedicated bandwidth, and that might not work well for a "socket" GFX solution if it relies on a single HyperTransport link and shares memory bandwidth with the rest of the system (but we're still talking much faster than current GFX cards that use system RAM). For mid-range GFX and many other applications it could work extremely well.

As far as it being a "waste" of cores: dual-cores are dropping in price and rapidly becoming the norm. Using a core that a system already has instead of buying additional hardware is certainly a valid strategy. CPUs are very inefficient, though, as they are designed to be able to do anything and require a lot of software overhead to accomplish most tasks, so they may simply not be fast enough, no matter how many extra cores you have, to perform some tasks (real-time GFX being an obvious example of something you wouldn't want to do with an "extra" core).

Actually, Flasher, nVIDIA GPUs ARE too slow for GPGPU calculations.

Mike Houston: Stanford University Folding@Home Project

nVIDIA GPUs have too many limitations. First of all, they're 256-bit internally (vs 512-bit for ATi X18/19K VPUs), which normally would not be a problem, but nVIDIA 7x00 GPUs have a 64K limit on shader length. Thus many passes need to be made in order to get the work required by Folding@Home (or physics) done. The more passes, the slower it is.

ATi X18/19K VPUs support unlimited shader lengths (they can do all the work in a single pass).

Another aspect is the dedicated branching unit found on ATi X18/19K VPUs. These are used extensively by GPGPU applications. nVIDIA's 7x00 does not support such a feature, thus more passes are required, once more, for branching.

And last but not least, pure shader power. The ATi X19K series has more shader power than ANY current nVIDIA GPU (by 2 times or more).

All this makes nVIDIA 7x00 too slow for hardcore GPGPU applications.

Read up on it.

ATi X19K > AGEIA PhysX in GPGPU applications if dedicated to such an application as well, BTW.

Elmo: Thank you for finally taking the time to point out that the actual big problem with the nVidia card as a GPGPU is that it chokes on shader lengths bigger than 64K. I don't know why you still insist on rambling on about all that other stuff; the important information could have been summed up in one sentence. All that other stuff seems pretty irrelevant if you're dealing with shader lengths >64K.

And really, a few sentences later he said this:
"While it would be possible to run the code on the current NVIDIA hardware, we would have to make pretty large changes to the code they want to run, and even past that, the performance is not great. We will have to look at their next architecture and re-evaluate."
Isn't that *exactly* what I said was probably their attitude??? :)  And also notice that the phrase "too slow to run F@H", or anything even close to it, does not appear in the article anywhere. SO STOP SAYING IT! Just say "nVidia can't do the shader lengths over 64K that GPGPU apps need, and ATI can". It's a 100% accurate statement, and if your goal is to trash nVidia it makes their product sound like it's completely incapable, which is perhaps a bit misleading, but no more so than "it can't do it because it's too slow", and you won't sound like a whiny fanboy when you say it (although I don't think you can make too much progress on that front as long as your avatar is the ATI logo).

But in all seriousness, good find on the article, thank you. This topic should be very, very dead now.

Hmm, I was thinking that since protein folding is physics, the PhysX card would have capabilities usable for it. But I tried to find some more information on it and it appears to be too highly specialized to gaming, and I'm not finding any information to suggest *any* support for other uses. It was really just supposed to be an example of an accelerator card that gamers would be familiar with. I didn't find any sustained floating-point benchmark scores, such as LINPACK, for ATI cards either, so a direct comparison between the X19K and other co-processors, such as ClearSpeed, does not seem possible at this time. With a theoretical peak of >300 GFLOPS it seems X19K cards could do fairly well if they could be made to run that benchmark, and it would be quite embarrassing for ClearSpeed if their $5K accelerator board got beat by a ~$500 GFX card that will likely plummet in price, but it can't touch their performance/watt (the ClearSpeed card increases total system power consumption by ~1%; would be nice if we could get that kind of efficiency out of a GFX card), so it wouldn't affect their sales to high-end computing sites linky
October 15, 2006 7:34:48 PM

Quote:
It's just that I would not condemn them for using it to "up things" a little bit (and since it isn't an Nvidia part, their statement would still be true). But you're right. I did not think this through. It would be like shooting themselves in the foot.

It’s really just a Dell with Ageia’s card. No need to say more.

But a question remains… Why keep the card in the case and take a chance like this?

Quite possibly they weren't allowed to change the system. It's just a generic Dell XPS, isn't it? They didn't run any physics demos; they didn't take any chances of people thinking they were using Ageia to do their physics for them. INQ just fabricated this entire rumor as far as I can tell. It's like accusing AMD of relying on Intel if they ran a demo on a machine with an Intel chipset network card in it... that wasn't plugged into any network. The card isn't even doing anything. Who made it is so far removed from relevance that only INQ would act like it's a news story.

And if it weren't for all the other related topics being discussed here, this thread would probably be dead already xD
October 16, 2006 8:35:31 AM

Quote:

Quite possibly they weren't allowed to change the system. It's just a generic Dell XPS, isn't it?


Well, looks like it wasn't a Dell PC, and yes, they could've yanked it first if they wanted to or thought to, because they have now:

http://www.theinquirer.net/default.aspx?article=35089

Quote:
INQ just fabricated this entire rumor as far as I can tell... The card isn't even doing anything. Who made it is so far removed from relevance that only INQ would act like it's a news story.


First of all, how do you know what it was doing? As for the InQ, they did get it right, so blaming them for reporting it is pretty much like trying to kill the messenger, and saying they fabricated it when they had clear pics kinda speaks to your credibility, not theirs.
October 16, 2006 5:45:33 PM

Quote:

Quite possibly they weren't allowed to change the system. It's just a generic Dell XPS, isn't it?


Well, looks like it wasn't a Dell PC, and yes, they could've yanked it first if they wanted to or thought to, because they have now:

http://www.theinquirer.net/default.aspx?article=35089

Quote:
INQ just fabricated this entire rumor as far as I can tell... The card isn't even doing anything. Who made it is so far removed from relevance that only INQ would act like it's a news story.


First of all, how do you know what it was doing? As for the InQ, they did get it right, so blaming them for reporting it is pretty much like trying to kill the messenger, and saying they fabricated it when they had clear pics kinda speaks to your credibility, not theirs.

"Dell XPS system" was the only thing they mentioned being run at the nVidia booth in the first article. In the second article they even admitted to omitting the pertenant information of what system had the Physx card in it. The INQ got it wrong through their omission.

How do I know what it was doing??? Well, for starters: "When asked whether they were showing any physics acceleration demos, the Nvidians indicated that they were not doing any physics at this show." And neither the original INQ article nor the second one said anything about spotting possible physics demos being run at the nVidia booth, which any sane reporter would have promptly reported as supporting evidence. Conclusion: nVidia said they weren't running physics demos, and they were telling the truth. Suggestions to the contrary by INQ were baseless rumors.

They had clear pics... of a random machine, not even showing what booth it was in, with an Ageia PhysX card in it. What is that evidence of? That system builders are putting PhysX cards into machines with nVidia GFX cards? INQ got nothing right. They found a machine with a PhysX card in it and completely made up the entire rest of the story. This is proved by nVidia taking the card out of the machine... and continuing to run their demo.

The PhysX card was idle; there was no story. INQ is to blame for making up baseless rumors and making a big fuss about them, much in line with their reputation for low-quality reporting. Their second article only confirms how completely off-base the first one was. Whether it was a Dell XPS or a Biohazard Computers box (the INQ still doesn't tell us what model it was; if they did we could probably look it up and go "oh, those come with PhysX cards") is irrelevant, because the card is irrelevant, because there were no physics demos. Hence this entire story was BS. The only good thing you can say about the INQ at this point is that they did do a follow-up story, but they didn't blatantly admit that Ageia was not the "...'sekret sauce' behind their physics acceleration", and apparently it needs to be blatantly stated, repeatedly, for some people to get the message: nVidia was NOT using Ageia to speed up their physics demos. There was never any evidence to suggest that they were.

People should simply not read INQ. This reporter apparently didn't even know what physics acceleration is or what supports it, as he seems to have thought that simply slapping an Ageia PhysX card in your system speeds up all your graphics... So he made a story about it. Read the last two sentences and the image captions of that first article again. He's either an idiot, or he thinks all of his readers are, to post such garbage.
October 16, 2006 8:10:18 PM

Quote:

"Dell XPS system" was the only thing they mentioned being run at the nVidia booth in the first article. In the second article they even admitted to omitting the pertenant information of what system had the Physx card in it. The INQ got it wrong through their omission.


They weren't wrong by omission; the issue was not whether it was a Dell, a Gateway or a beige custom build, it was the Ageia card, and reading it they NEVER SPECIFIED. The mention of Dell was regarding the GeForce Go unit; everyone just read the Dell in as being the unit they talked about, and even now it's not relevant. They also didn't admit omitting pertinent information, because it wasn't pertinent; only people like yourself seem to think the make of the box matters.

Quote:
How do I know what it was doing??? Well, for starters: "When asked whether they were showing any physics acceleration demos, the Nvidians indicated that they were not doing any physics at this show." And neither the original INQ article nor the second one said anything about spotting possible physics demos being run at the nVidia booth, which any sane reporter would have promptly reported as supporting evidence. Conclusion: nVidia said they weren't running physics demos, and they were telling the truth. Suggestions to the contrary by INQ were baseless rumors.


Without proof, you still don't know either way, and therefore it's just as baseless an assumption. That's my point, and accepting nVidia's (or ATi's or anyone's) line when confronted with issues like this has never been the wise course of action, or did the optimization years pass you by?
This is very similar to racing: if you see a nitrous bottle or switch you don't trust that he doesn't spray, you have them remove the bottle and bleed the lines. Your assumption that they didn't do anything just brings up the questions that were there at the time: why have it in there if you aren't going to use it? As many of us guessed (including myself), this might be a case of bringing a rig that was designed for a bunch of things to the expo and not thinking about the implications of the presence of some parts.

Quote:
They had clear pics... of a random machine, not even showing what booth it was in, with an Ageia PhysX card in it.


Which could easily be confirmed by anyone at the expo at the time of publishing who saw the same booth, and I'm certain there's additional information should nV challenge it.

Quote:
What is that evidence of? That system builders are putting PhysX cards into machines with nVidia GFX cards? INQ got nothing right.


That's hyperbole on your part at best, 'nothing right'? :roll:

They were right about it being there; like I said, that is evidence of myopia, to say the least, from a company that is supposed to be making competing products. Either way it conveys... for physics, nV uses Ageia. Do you think they'd ever dare get caught with pictures of their main booth PC with ATi cards in them, even though we know for sure they have a ton of them back at work to test and mess with? From a PR side alone it's a blunder. At the very least it conveys support for Ageia just by it being there, even if it isn't used.

Quote:
They found a machine with a PhysX card in it and completely made up the entire rest of the story. This is proved by nVidia taking the card out of the machine... and continuing to run their demo.


If they made up the rest of the story, why would it be nVidia taking the card out? That's a lot of defending you're doing based on even less info than the InQ had. And nV taking the card out doesn't prove anything unless you had benchies before and after. The only thing taking it out does is prove that even they felt it was a consideration; leaving it in would've been more of a statement that they didn't care. But coming clean after a slip-up like this is always the best way to go about it; now there's no way to test for sure what the before/after results were. The wise thing would be to invite them back to explicitly show there's no influence. Instead of eliminating doubt, they sustained the doubt and eliminated the source of the doubt (and the source of redemption). It probably is nothing, but just like so many questionable corporate actions, the response is usually bunged up by trying to hide it rather than address it directly.

Quote:
The PhysX card was idle, there was no story.


How do you know it was idle? Once again, you're assuming far more than the InQ did; the InQ at least had pictures of it in the rig, you have nothing at all one way or the other.

Quote:
INQ is to blame for making up baseless rumors and making a big fuss about them, much in line with their reputation for low-quality reporting.


Methinks thou dost protest too much. You can say what you want about them, but they obviously touched a nerve to get this much attention on what appears to be factual information and a picture. The questions that it led to are leading for sure, but the main reportage was factual unless you can prove it never existed.

Quote:
Their second article only confirms how completely off-base the first one was.


No it doesn't; it simply adds to it, saying well, the card was there, but it wasn't by nV design so much as pre-built design, and that it's probably a tempest in a teapot, but they weren't wrong about it being there, just in the way they phrased their sarcasm, which is the only thing off base. nV still showed up to the expo with, essentially, the nitrous kit strapped in there, whether they intended to or had the need/opportunity to use it or not.

Quote:
Whether it was a Dell XPS or a Biohazard Computers box (the INQ still doesn't tell us what model it was; if they did we could probably look it up and go "oh, those come with PhysX cards") is irrelevant, because the card is irrelevant, because there were no physics demos. Hence this entire story was BS.


See, the thing that makes me laugh about what you write is that at NO POINT were all the software/demos detailed, yet you think it's more important to find out which model of pre-built it was. Wow, you really miss the point, either on purpose or simply because of your obvious dislike of the InQ. The question shouldn't be who made the PC, it should be: what were all the demos being displayed, and how do they work? Prove that it means nothing, don't hide behind the 'we don't know what's in our rig' defence; it makes them look dumber than the people who buy Dells.

Quote:
The only good thing you can say about the INQ at this point is that they did do a follow-up story, but they didn't blatantly admit that Ageia was not the "...'sekret sauce' behind their physics acceleration", and apparently it needs to be blatantly stated, repeatedly, for some people to get the message: nVidia was NOT using Ageia to speed up their physics demos.


You actually CAN'T say that, and nV didn't even provide the evidence to dispel that misconception. Bad move on their part.

Quote:
There was never any evidence to suggest that they were.


Actually, a PhysX card in a rig DOES suggest it's there to do physics, but what there isn't is evidence ONE WAY OR THE OTHER of what exactly occurred in the demo. nV reacted in such a way as to now make it pretty much impossible to prove. They could demonstrate the demo now, but how would they prove it wasn't a software workaround, like the use of the PhysX calculations on dual-core CPUs? They should've addressed the issue immediately, openly and decisively, not yanked the card.

Quote:
People should simply not read INQ. This reporter apparently didn't even know what physics acceleration is


Where do you get that from? Now who's making up porky-pies. :roll:

Quote:
or what supports it, as he seems to have thought that simply slapping an Ageia PhysX card in your system speeds up all your graphics...


Yeah, and we all know the PhysX card slows down the FPS, we've all discussed it many times. :twisted:
Reality is, like I said, you don't know WTF it was doing there; you only know what nV says it was doing there. At the very least it makes nV look very bad at picking parts, considering that if it's idle it's still wasting power and generating heat. Silly rabbits!

Quote:
So he made a story about it. Read the last two sentences and the image captions of that first article again. He's either an idiot, or he thinks all of his readers are, to post such garbage.


Read the WHOLE story and not just the pictures and select sentences, and you actually get the whole picture, where the questions are obviously leading and obviously sarcasm, specifically meant to elicit a reaction from nV and point out the humour of nV putting a competing product in the rig in their display. To think this stuff means nothing is fooling yourself; just look at how much Intel's Conroe launch changed once ATi was owned by AMD: suddenly it would look foolish to have the direct competition be the co-star of your show. Just like "Stop outsourcing ~ Buy American" buttons made in China. :mrgreen:

However, like I said before, "Of course people are making a lot of noise out of a PC that had the PPU in it, yet we don't know if it was actually used in any way; it may have been in the test system for other stuff prior to the expo/demo."

But like I also said in this thread, I think you protest far too much, and make as many if not more assumptions about the whole story than the InQ. :roll:
October 17, 2006 1:06:35 AM

OMFG, you did not just do a point-by-point reply without making a single valid counter-argument... Fine, have it your way.

Quote:

"Dell XPS system" was the only thing they mentioned being run at the nVidia booth in the first article. In the second article they even admitted to omitting the pertenant information of what system had the Physx card in it. The INQ got it wrong through their omission.


They weren't wrong by omission; the issue was not whether it was a Dell, a Gateway or a beige custom build, it was the Ageia card, and reading it they NEVER SPECIFIED. The mention of Dell was regarding the GeForce Go unit; everyone just read the Dell in as being the unit they talked about, and even now it's not relevant. They also didn't admit omitting pertinent information, because it wasn't pertinent; only people like yourself seem to think the make of the box matters.

I said, repeatedly, that it is irrelevant who made the box. The assumption was that the box was made by a system builder and not the nVidia demo team. A safe assumption, and I was correct. The Dell XPS part was a guess, stated as such in the form of a question, and based on the only information available to me at the time. You made a big deal about it the first time and I pointed out that it was irrelevant. Go find your own points, stop trying to steal mine. You're the one that seems to think it makes a big difference.


Quote:
How do I know what it was doing??? Well, for starters: "When asked whether they were showing any physics acceleration demos, the Nvidians indicated that they were not doing any physics at this show." And neither the original INQ article nor the second one said anything about spotting possible physics demos being run at the nVidia booth, which any sane reporter would have promptly reported as supporting evidence. Conclusion: nVidia said they weren't running physics demos, and they were telling the truth. Suggestions to the contrary by INQ were baseless rumors.


Without proof, you still don't know either way, and therefore it's just as baseless an assumption. That's my point, and accepting nVidia's (or ATi's or anyone's) line when confronted with issues like this has never been the wise course of action, or did the optimization years pass you by?
This is very similar to racing: if you see a nitrous bottle or switch you don't trust that he doesn't spray, you have them remove the bottle and bleed the lines. Your assumption that they didn't do anything just brings up the questions that were there at the time: why have it in there if you aren't going to use it? As many of us guessed (including myself), this might be a case of bringing a rig that was designed for a bunch of things to the expo and not thinking about the implications of the presence of some parts.

On whose shoulders is the burden of proof? The one making the outrageous claim, or the person pointing out that there is no evidence to support the outrageous claim and a number of obvious explanations for the circumstance, all of which (except for the Dell XPS bit that YOU keep bringing up as though it disproves everything I said) turned out to be true? I have a statement from nVidia and no statements to the contrary, even by the person reporting the story, that there were NO physics being run. Without any evidence to the contrary there is NO STORY.

Quote:
They had clear pics... of a random machine, not even showing what booth it was in, with an Ageia PhysX card in it.


Which could easily be confirmed by anyone at the expo at the time of publishing who saw the same booth, and I'm certain there's additional information should nV challenge it.

Which could have been easily confirmed by a wide-angle photograph had they the integrity (or intelligence) to take one. So some assumptions are safe to make but others aren't? I'll agree with you on that, but your metric is way off. My point wasn't that I didn't think the computer was at the nVidia booth; I too assumed that they were at least that accurate. My point was obviously that they hadn't even bothered to show us that it was at the nVidia booth, but expected us to believe it just because they said it, then expected us to believe their next, and completely ridiculous, statement as well. You tout their "photographic evidence", then admit that it would have to be confirmed by someone else at the event. Why are you even bringing this up again?

Quote:
What is that evidence of? That system builders are putting PhysX cards into machines with nVidia GFX cards? INQ got nothing right.


That's hyperbole on your part at best, 'nothing right'? :roll:

They were right about it being there; like I said, that is evidence of myopia, to say the least, from a company that is supposed to be making competing products. Either way it conveys... for physics, nV uses Ageia. Do you think they'd ever dare get caught with pictures of their main booth PC with ATi cards in them, even though we know for sure they have a ton of them back at work to test and mess with? From a PR side alone it's a blunder. At the very least it conveys support for Ageia just by it being there, even if it isn't used.

They have pictures of an Ageia card in an nVidia booth. Big... freaking... deal. They have no evidence of any physics acceleration being done. Without that they have nothing. Not to mention, nVidia doesn't make physics acceleration products yet, so your analogy to ATI cards is ridiculous. Oh, and "got nothing right" was a play on your statement of "they did get it right" that apparently went right over your head, and your attempt at a personal attack on me over an obviously intentional grammar transgression is pathetic, both in execution and because it's irrelevant to the argument. It doesn't "convey" anything except to someone who puts way too much stock in baseless rumor. Had I seen an Ageia card in their demo I would have attempted to learn more about what demos they were running... that would have required real journalism, though, not something to expect from INQ. They just like to jump at the slightest bit of information that something is happening and post it so they can feel good about themselves when they manage to get something right. They don't wait for real evidence before they post a story.

Quote:
They found a machine with a physx card in it and completely made up the entire rest of the story. This is proved by nVidia taking the card out of the machine... and continuing to run their demo.


If they made up the rest of the story, why would it be nVidia taking the card out? That's a lot of defending you're doing based on even less info than the InQ had. And nV taking the card out doesn't prove anything unless you had benchies before and after. The only thing taking it out proves is that even they felt it was a consideration; leaving it in would have been more of a statement that they didn't care. But coming clean after a slip-up like this is always the best way to go about it, and now there's no way to test for sure what the before/after results were. The wise thing would be to invite them back to explicitly show there's no influence. Instead of eliminating doubt, they sustained the doubt and eliminated the source of the doubt (and the source of redemption). It probably is nothing, but just like so many questionable corporate actions, the response is usually bungled by trying to hide it rather than address it directly.[/quote]
I'm not defending nVidia, I'm attacking a craptastic INQ story. And this time you've chosen to make a ridiculous assumption: don't you also think that if nVidia's demo went to crap after they took out the PhysX card, it would be easily "confirmed by anyone at the expo at the time"? The only thing they "acknowledged" by taking the card out (which they left in all day, since they didn't think it was important enough to stop their demo) is that they recognize that a bunch of morons who read the INQ can't separate evidence from speculation. They took it out and continued to run their demo when there was never any evidence to suggest they needed it in the first place. How much more do you need?

Quote:
The PhysX card was idle, there was no story.


How do you know it was idle? Once again, you're assuming far more than the InQ did; the InQ at least had pictures of it in the rig, you have nothing at all one way or the other.

Quote:
INQ is to blame for making up baseless rumors and making a big fuss about them, much in line with their reputation for low-quality reporting.


Methinks thou dost protest too much. You can say what you want about them, but they obviously touched a nerve to get this much attention on what appears to be factual information and a picture. The questions it led to are leading for sure, but the main reportage was factual unless you can prove it never existed.

Quote:
Their second article only confirms how completely off-base the first one was.


No it doesn't, it simply adds to it, saying the card was there, but it was there by the pre-built's design rather than nV's, and that it's probably a tempest in a teapot. They weren't wrong about it being there; the only thing off base was the way they phrased their sarcasm. nV still showed up to the expo with essentially a nitrous kit strapped in there, whether they intended to use it, or had the need/opportunity to, or not.

Quote:
Whether it was a Dell XPS or a Biohazard Computers (the INQ still doesn't tell us what model it was; if they did we could probably look it up and go "oh, those come with physx cards") is irrelevant, because the card is irrelevant, because there were no physics demos. Hence this entire story was BS.


See, and the thing that makes me laugh about what you write is that at NO POINT were all the software/demos detailed, yet you think it's more important to find out which model of pre-built it was. Wow, you really miss the point, either on purpose or simply because of your obvious dislike of the InQ. The question shouldn't be who made the PC, it should be: what were all the demos being displayed, and how do they work? Prove that it means nothing; don't hide behind the 'we don't know what's in our rig' defence, it makes them look dumber than the people who buy DELLs.

Quote:
The only good thing you can say about the INQ at this point is that they did do a follow-up story, but they didn't blatantly admit that Ageia was not the "...'sekret sauce' behind their physics acceleration," and apparently it needs to be blatantly stated, repeatedly, for some people to get the message: nVidia was NOT using Ageia to speed up their physics demos.


You actually CAN'T say that, and nV didn't even provide the evidence to dispel that misconception. Bad move on their part.

Quote:
There was never any evidence to suggest that they were.


Actually, a PhysX card in a rig DOES suggest it's there to do physics, but what there isn't is evidence ONE WAY OR THE OTHER of what exactly occurred in the demo. nV reacted in such a way as to make it pretty much impossible to prove now. They could demonstrate the demo now, but how would they prove that it wasn't a software workaround, like running the PhysX calculations on the dual-core CPUs? They should've addressed the issue immediately, openly and decisively, not yanked the card.

Quote:
People should simply not read INQ. This reporter apparently didn't even know what physics acceleration is


Where do you get that from? Now who's making up porky-pies. :roll:

Quote:
or what supports it as he seems to have thought that simply slapping an Ageia Physx card in your system speeds up all your graphics...


Yeah, and we all know the PhysX card slows down the FPS, we've all discussed it many times. :twisted:
The reality is, like I said, you don't know WTF it was doing there, you only know what nV says it was doing there. At the very least it makes nV look very bad at picking parts, considering that if it's idle it's still wasting power and generating heat. Silly rabbits!

Quote:
So he made a story about it. Read the last two sentences and the image captions of that first article again. He's either an idiot, or he thinks all of his readers are to post such garbage.


Read the WHOLE story, and not just the pictures and select sentences, and you actually get the whole picture, where the questions are obviously leading and obviously sarcasm, specifically meant to elicit a reaction from nV and to point out the humour of nV putting a competing product in the rig in their display. If you think this stuff means nothing you're fooling yourself; just look at how much Intel's Conroe launch changed once ATi was owned by AMD, when suddenly it would look foolish to have the direct competition as the co-star of your show. Just like "Stop outsourcing ~ Buy American" buttons made in China. :mrgreen:

However, like I said before, "Of course people are making a lot of noise out of a PC that had the PPU in it, yet we don't know if it was actually used in any way; it may have been there in the test system for other stuff prior to the expo/demo."

But like I also said in this thread, I think you protest far too much, and make as many if not more assumptions about the whole story than the InQ. :roll:[/quote]

Holy crap, you go on and on and then start talking about something about outsourcing... Both the original and follow-up articles from the INQ leave any reader NO CHOICE but to make assumptions, as they are drastically lacking in information. I think the assumptions they try to get you to make are the least likely. I made several safe assumptions, which turned out to be true when you posted an article that showed I was right and tried to present it as evidence that I was wrong. You're in some kind of fantasy land and I really don't know where you're coming from. They used a system with a physx card in it to demo GFX, not physx. There is no evidence they were demoing physx. Almost nothing supports that stupid card anyway, and then they took it out and continued their demo. There were no reports of missing effects or reduced performance. If it's safe to assume that there are enough witnesses to confirm the PC was there in the first place, is it not also safe to assume that there are enough witnesses that they would notice a newly-crippled demo?

If there is more evidence of some kind of relationship between nVidia and Ageia I am all ears. Until then there is NO STORY. There is only far-fetched assumption and speculation. Why are you rallying against reasonable assumptions to support ridiculous ones? You make no sense. You posted an article that confirmed what I said and then told me that my inductive reasoning was flawed. While it is certainly possible I used the wrong technique to come to the correct answer, you should wait until I come to the wrong answer to try and make such an argument.

And what in the hell did I do to deserve such a long and rambling point-by-point response that teeters so closely on the edge of meaningless flaming? I was right. Had I been wrong I could certainly understand you insulting me and picking apart my statements (not that I would have liked it, but I would have understood it at least).

The INQ could certainly have challenged the nVidia demo team to take the Ageia card out right then and prove they weren't using it. They didn't; they fired up their rumor mill instead. And then nVidia took the card out and proved them wrong. Until there is more evidence (and better evidence) the rumor ends here. They shouldn't have reported it in the first place; it wasn't news, it was garbage. They could have sat on it, gathered more information, and posted an article later when and if they had a case. Instead they had to post what was basically a correction, as it took everything they said in the first article and invalidated it.

You accuse me of defending nVidia... Why are you defending INQ?
October 17, 2006 2:39:29 AM

Quote:
OMFG you did not just do a point-by-point reply without making a single valid counter argument... Fine, have it your way.


"Dell XPS system" was the only thing they mentioned being run at the nVidia booth in the first article. In the second article they even admitted to omitting the pertenant information of what system had the Physx card in it. The INQ got it wrong through their omission.


They weren't wrong by omission; the issue was not whether it was a DELL, a Gateway or a beige custom build, it was the Ageia card, and reading it they NEVER SPECIFIED. The mention of DELL was regarding the GeForce Go unit; everyone just read the DELL as being the unit they talked about, and even now it's not relevant. They also didn't admit to omitting pertinent information, because it wasn't pertinent; only people like yourself seem to think the make of the box matters.

I said, repeatedly, that it was irrelevant who made the box. The assumption was that the box was made by a system builder and not the nVidia demo team. A safe assumption, and I was correct. The Dell XPS part was a guess, stated as such in the form of a question, and based on the only information available to me at the time. You made a big deal about it the first time and I pointed out that it was irrelevant. Go find your own points, stop trying to steal mine. You're the one who seems to think it makes a big difference.


Quote:
How do I know what it was doing??? Well, for starters: "When asked whether they were showing any physics acceleration demos, the Nvidians indicated that they were not doing any physics at this show." And neither the original INQ article nor the second one said anything about spotting possible physics demos being run at the nVidia booth, which any sane reporter would have promptly reported, as it would be supporting evidence. Conclusion: nVidia said they weren't running physics demos, and they were telling the truth. Suggestions to the contrary by the INQ were baseless rumors.


Without proof, you still don't know either way, and therefore it's just as baseless an assumption. That's my point, and accepting nVidia's (or ATi's or anyone's) line when confronted with issues like this has never been the wise course of action, or did the optimization years pass you by?
This is very similar to racing, where if you see a nitrous bottle or switch you don't just trust that he doesn't spray, you have him remove the bottle and bleed the lines. Your assumption that they didn't do anything just brings up the question that was there at the time: why have it in there if you aren't going to use it? As many of us guessed (including myself), this might be a case of bringing a rig that was designed for a bunch of things to the expo and not thinking about the implications of the presence of some parts.
Damn somebody wrote a book :) 
October 17, 2006 10:16:27 PM

Quote:
OMFG you did not just do a point-by-point reply without making a single valid counter argument... Fine, have it your way.


The single most important counter-argument to all your statements is that nVidia's demo rig had a single-use PhysX card in it, and they claim they didn't use it, yet they never demonstrated that their demo wasn't affected by it, and they felt concerned enough about its exposure to yank the card.

Quote:
On whose shoulders is the burden of proof?


The one who's displaying the product, not the one who notes that the displayed product has the equivalent of a nitrous bottle attached. The burden of proof is on nV to prove that the product they had in their display machine was NOT doing its single-use task. While it wasn't proven that it was used, it also wasn't proven that it wasn't, but the InQ clearly showed that it was there in the rig. nV missed the opportunity to prove nothing was up; instead they yanked the card and have no proof they didn't use it other than their statement that they didn't (like their statements about other activities that later turned out to be the opposite of what they were saying [not that others haven't done the same, but you pretend they are trustworthy about anything they say when caught off-guard :roll: ]). And it's not like they haven't had a few gaffes before;
http://www.theinquirer.net/default.aspx?article=32529



Quote:
Not to mention, nVidia doesn't make physics acceleration products yet, so your analogy to ATI cards is ridiculous.


BS! nV pimped SLi physics before ATi pimped their VPU-based physics;

http://www.tgdaily.com/2006/03/20/nvidia_sli_forphysics...

http://www.rojakpot.com/showarticle.aspx?artno=303&pgno...

Quote:
You're in some kind of fantasy land and I really don't know where you're coming from. They used a system with a physx card in it to demo GFX, not physx.


You're in a fantasy world if you think the two aren't related for gaming. :roll:

Quote:
I said, repeatedly, that it was irrelevant who made the box.


Yet you fixate on its omission. If it's not important then never mention it again; I only mentioned it to say your assumption was wrong. :roll:

Quote:
You made a big deal about it the first time and I pointed out that it was irrelevant.


No, you made it relevant when you said "they weren't allowed to change the system", followed by a statement that it is a generic Dell, not just a Dell, thus trying to negate any responsibility on the part of nV's demo crew. :roll:

Quote:
Go find your own points, stop trying to steal mine. You're the one that seems to think it makes a big difference.


BS, I don't think the BOX makes a difference; I think that you think it matters which box, because to you it means they couldn't pull it, which is exactly what you said. I'm not stealing anything from you, just showing your statements are bogus assumptions at best, as bad as the InQ's.

Quote:
all of which, except for the Dell XPS bit that YOU keep bringing up as though it disproves everything I said, turned out to be true.


You're the one using it as an excuse; I'm the one saying it doesn't matter and that your assumptions are wrong. You had equally no proof, yet you stated what it was and then put the question mark afterwards to cover your A$$, hmm, just like the InQ did. Considering your dislike for them, you use the same tactics.

Quote:
They have pictures of an Ageia card in an nVidia booth. Big... freaking... deal.


Yeah obviously it is. Just like a Toyota in a Ford assembly plant parking lot.

Quote:
They have no evidence of any physics acceleration being done.


Yet they have evidence of a single-use product in their rig, and no evidence that that hardware wasn't being put to its intended use. Like an athlete saying the roids in their house are 'being held for a friend', is that the defence you're trying to use? The 'we didn't know what it was or why it was there' excuse?

Quote:
It doesn't "convey" anything except to someone who puts way too much stock in baseless rumor.


Far from baseless. A baseless rumour would be 'a little birdy told me the covered slot contained a PhysX card'; this rumour is based on the picture of the PhysX card in the nV rig.

Quote:
Had I seen an Ageia card in their demo I would have attempted to learn more about what demos they were running... that would have required real journalistic work though, not something to expect from the INQ.


So you're saying they should have asked for what I said they should've asked for? Wow, now look at that, is that your own point now too? I don't say the InQ is a paragon of journalistic method, but nV is not a paragon of honesty either, yet you find fault with only one part of this equation. Your inability to see the bigger picture, that they both have problems with this incident, shows you to be either biased or ignorant.

Quote:
And what in the hell did I do to deserve such a long and rambling point-by-point response that teeters so closely on the edge of meaningless flaming? I was right. Had I been wrong I could certainly understand you insulting me and picking apart my statements (not that I would have liked it, but I would have understood it at least).


I reply point-by-point to your post to ensure your statements are properly addressed, whereas you combine everything and in so doing distort the information. And no, you weren't right: you said they couldn't alter the machine, and shortly after you said that, the second article came out saying they did just that. Now you accuse me of 'stealing your points'; how else do I prove they aren't yours unless I highlight yours? BTW, at best you stole IcY18's. :roll:

Quote:
You accuse me of defending nVidia... Why are you defending INQ?


Because you're attacking the wrong part of their article. You needed to attack the follow-up, not BS like them not reporting who made the F'in PC. You should've attacked the InQ for not being more direct with their questioning and getting a better answer about why the card was there. Instead you simply attack the InQ overall in an attempt to say the article has no point, when the simple fact that nV is running PCs at their booth with PhysX cards is interesting on its own, considering that nV does have their own competing physics solution. That you miss that shows you're more focused on the InQ than on what's up with the card. I would've agreed with your assertion that it's no big deal if nV had made the effort to dispel their leading question; instead they didn't address it, they simply pulled the offending card. That's just like the Watergate break-in or the Lewinsky issue: the act was a minor 'who cares' incident, but how the reaction to it was handled becomes the interesting issue.

nV could've handled this much better, and granted, the InQ could've written a far better article.
October 18, 2006 12:11:55 AM

Quote:

But isn't that a huge waste of cores? Surely using a £80 graphics card as a PPU is a much more sensible option, or am I the only one thinking this?


That's no argument. Most software available doesn't use a CPU as it is intended; instead of actually calculating, the processor is juggling the stack like mad. So right now it may seem as if the GPU may be better, but if Intel intends to go multicore like crazy, I bet $/core is going way down, and in two years four cores will be cheaper than even an entry-level GPU. Then again, two years from now there might be GPUs integrated into the CPU, or at least available for CPU sockets...

AMD open socket plans: http://www.simmtester.com/page/news/shownews.asp?num=95...

Open socket designs are good for bringing innovation to the market, and good for consumers. GPU, PPU, FPGA, dual proc, Intel or VIA CPUs running on an "AMD" socket: these are all possibilities. An FPGA "co-processor" is the one that excites me the most. When an application is launched it could program the FPGA, and you would then have ~500MHz of low-overhead specialized processing power to do exactly what you want, no matter how much load you put on the rest of your system. I am currently testing ASIC code on FPGAs running at a mere 53MHz, and they can respond to and route 2GHz traffic on multiple channels simultaneously within a few nanoseconds. IMHO, specialized logic chips are the only way computers are going to get significantly faster from here on out. FPGAs capable of over 500MHz are currently available. Specialized logic chips are extremely powerful, as you can see from a ~500MHz GPU having significantly more processing power than a dual-core ~3GHz CPU.
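Just to sketch the "program the FPGA at launch, then offload to it" idea: this is purely illustrative C++, and fpga_load_bitstream / fpga_run are invented stand-ins for whatever driver API such a platform would actually expose, not a real library.

Code:
#include <cstdio>
#include <vector>

// Hypothetical driver calls, stubbed here so the sketch compiles; a real
// platform would provide these and they would talk to the fabric.
static bool fpga_load_bitstream(const char* path) { (void)path; return false; }
static void fpga_run(const float* in, float* out, int n) {
    for (int i = 0; i < n; ++i) out[i] = in[i];   // placeholder passthrough
}

int main() {
    std::vector<float> in(1024, 1.0f), out(1024, 0.0f);

    // Program the fabric once, when the application starts.
    if (!fpga_load_bitstream("app_specific_logic.bit")) {
        std::puts("No FPGA co-processor present, using the CPU path instead.");
    }

    // From here on the specialized logic does the heavy lifting while the
    // CPU cores stay free for everything else.
    fpga_run(in.data(), out.data(), static_cast<int>(in.size()));
    return 0;
}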

I don't think it's really feasible for high-end gaming GFX, though, as that depends heavily on having tons of very fast RAM very close to the GPU with dedicated bandwidth, and that might not work well for a "socket" GFX solution if it relies on a single HyperTransport link and shares memory bandwidth with the rest of the system (but we're still talking much faster than current GFX cards that use system RAM). For mid-range GFX and many other applications it could work extremely well.

As far as it being a "waste" of cores. Dual-cores are dropping in price and rapidly becoming the norm. Using a core that a system already has instead of buying additional hardware is certainly a valid strategy. CPUs are very ineffecient though as they are designed to be able to do anything and require a lot of software overhead to accomplish most tasks, so they may simply not be fast enough, no matter how many extra cores you have, to perform some tasks (real-time GFX being an obvious example of something you wouldn't want to do with an "extra" core).

Actually, Flasher, nVIDIA GPUs ARE too slow for GPGPU calculations.

Mike Houston: Stanford University Folding@Home Project

nVIDIA GPUs have too many limitations. First of all, they're 256-bit internally (vs 512-bit for ATi X18/19K VPUs), which normally would not be a problem, but nVIDIA 7x00 GPUs have a 64K limit on shader lengths. Thus many passes need to be made in order to get the work required by Folding@Home (or physics) done. The more passes, the slower it is.

ATi X18/19K VPUs support unlimited shader lengths (they can do all the work in a single pass).

Another aspect is the dedicated branching unit found on ATi X18/19K VPUs. These are used extensively by GPGPU applications. nVIDIA's 7x00 do not support such a feature... thus more passes are required, once more, for branching.

And last but not least, pure shader power. The ATi X19K series has more shader power than ANY current nVIDIA GPU (by 2 times or more).

All this makes nVIDIA 7x00 too slow for hardcore GPGPU applications.

Read up on it.

ATi x19K > AGEIA Physx in GPGPU applications if dedicated to such an application as well BTW.
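To illustrate why the extra passes matter, here is a CPU-side analogy only (plain C++, no real GPU API, and the limits are arbitrary stand-in numbers): when a per-element program is longer than the hardware allows in one pass, the work has to be chopped up and ping-ponged through memory, and every additional pass adds bandwidth and setup cost.

Code:
#include <cstddef>
#include <cstdio>
#include <vector>

// Stand-in limits: kPassLimit mimics a per-pass shader length cap,
// kTotalSteps is the full "shader" the computation actually needs.
constexpr int kPassLimit  = 64;
constexpr int kTotalSteps = 200;

static void run_pass(const std::vector<float>& src, std::vector<float>& dst, int steps) {
    for (std::size_t i = 0; i < src.size(); ++i) {
        float v = src[i];
        for (int s = 0; s < steps; ++s) v = v * 1.0001f + 0.5f;   // fake ALU work
        dst[i] = v;                                               // write intermediate result
    }
}

int main() {
    std::vector<float> a(1 << 20, 1.0f), b(a.size(), 0.0f);
    int remaining = kTotalSteps, passes = 0;
    while (remaining > 0) {                       // capped hardware needs several passes
        int steps = remaining < kPassLimit ? remaining : kPassLimit;
        run_pass(a, b, steps);
        a.swap(b);                                // ping-pong between buffers
        remaining -= steps;
        ++passes;
    }
    // Hardware without the cap would do all kTotalSteps in a single pass.
    std::printf("needed %d passes instead of 1\n", passes);
    return 0;
}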

Elmo: Thank you for finally taking the time to point out what the actual big problem with the nVidia card as a GPGPU is: it sucks for shader lengths bigger than 64K. I don't know why you still insist on rambling on about all that other stuff; the important information could have been summed up in one sentence. All that other stuff seems pretty irrelevant if you're dealing with shader lengths >64K.

And really, a few sentences later he said this:
"While it would be possible to run the code on the current NVIDIA hardware, we would have to make pretty large changes to the code they want to run, and even past that, the performance is not great. We will have to look at their next architecture and re-evaluate."
Isn't that *exactly* what I said was probably their attitude??? :)  And also notice that the phrase "too slow to run f@h", or anything even close to it, does not appear in the article anywhere. SO STOP SAYING IT! Just say "nVidia can't do the shader lengths over 64K that GPGPUs need, and ATI can." It's a 100% accurate statement, and if your goal is to trash nVidia it makes their product sound like it's completely incapable, which is perhaps a bit misleading, but no more so than "it can't do it because it's too slow", and you won't sound like a whiny fanboy when you say it.

I get the idea that they are saying nVidia cards are too slow to mess with as of now. But yes, I am sure they can run it.

http://folding.stanford.edu/FAQ-ATI.html

Quote:
What about video cards with other (non-ATI) chipsets?
The R580 (in the X1900XT, etc.) performs particularly well for molecular dynamics, due to its 48 pixel shaders. Currently, other cards (such as those from nVidia and other ATI cards) do not perform well enough for our calculations as they have fewer pixel shaders. Also, nVidia cards in general have some technical limitations beyond the number of pixel shaders which makes them perform poorly in our calculations.


It would be nice to know about how much slower they are.
October 18, 2006 1:32:05 AM

Quote:

I said, repeatedly, that it was irrelevant who made the box.


Yet you fixate on its omission. If it's not important then never mention it again; I only mentioned it to say your assumption was wrong. :roll:


I was fixated on the fact that if they had told us the make and model of the box, we could have seen whether the PhysX card was a standard part or whether nVidia had added it. If it was not standard we would have known they put it there on purpose, not just "it came with the system". "That it was there at all" is not very interesting at all, as we have seen from the follow-up article. If they had found that it was in a PC that doesn't normally come with PhysX, THAT would have been more of a story. Instead they omitted the information entirely. It would have been the difference between a random machine that happened to come with PhysX (which is inconsequential if there were no PhysX demos), and a machine that they deliberately added a PhysX card to (which would have been very fishy indeed). *Who* made it is irrelevant. What parts come standard on the machine could prove to be very interesting. Did nVidia purposefully "upgrade" the machine to add a PhysX card? We don't know... because the information was omitted.

Physics coprocessors do not boost graphics performance. They are not intended to boost gaming performance. They are intended to add effects to games that simply could not be created in real time by a CPU (at least not one that is busy doing something else too). Read any review of the difference a PhysX card makes in a game and you'll see that it's about adding effects, not fps. You don't need a benchmark program to tell the difference; the effects are either there or they aren't. If they had been using the PhysX card for something and then yanked it out, their demo would have gone to crap. This is the obvious answer; until there is something to suggest otherwise it's not "fishy" or "suspicious" or "a PR blunder". It's nothing but pure baseless speculation.

eh? Is "SLI Physics" actually available? I knew they had done a demo but I didn't think it had actually been brought to market. Did I just miss that? The articles you linked are from march. In a quick search I was not able to find any "SLI Physics" products for sale. Searching for "physics" at nvidia.com the most recent article is from march 31st. Please correct me if I'm wrong. nVidia is not currently competeing with Ageia, they have some R&D on a competing product, but using (which it turns out they weren't anyway) a product that does something they are researching is very different from using a product that does something that they're currently trying to sell people. Isn't part of the point of a community like this to see through the hype? I don't care about what would happen to their "image" if nVidia used an Ageia Physx card to spiff up their demo, and neither should anyone else. Demos are not real. The product I'm currently working on just worked in a real-world configuration for the first time yesterday but we started demoing it a year ago. I care about real data. The INQ article contained none. It's a rumor about hype. Real data is comparing available products to other available products. This is a rumor about comparing an available product to a product that's not available. It's an insinuation that nVidia admitted that their physics solution is inferior to Ageia's... but nVidia doesn't even have a physics solution, they're working on one. And really, last time I checked, nVidia's public image has zero effect on my system performance anyway.

This *could* have been a keen observation that led to uncovering that nVidia was switching from the Havok engine to the PhysX engine. But they skipped doing any actual journalism and posted a rumor instead. I don't think they even knew that nVidia isn't working on its own engine... just making their cards run someone else's engine, so even if it were true, their switching engines would be bad news for Havok and change nothing for SLI Physics, as it's not released yet. It's a completely uninformed and uninformative article. You and I have now both wasted more time talking about this rumor than they put into making it up in the first place. Our posts are 10 times as long as both articles put together. It's a stupid rumor, and I'll torch my nVidia card if it turns out to be true (heh, just ordered a new ATI one... but it's the principle of the matter, right?). I'm not talking about it anymore...
October 18, 2006 3:13:40 AM

Quote:

Physics coprocessors do not boost graphics performance. They are not intended to boost gaming performance. They are intended to add effects to games that simply could not be created in real time by a CPU (at least not one that is busy doing something else too). Read any review of the difference a PhysX card makes in a game and you'll see that it's about adding effects, not fps.


What do you think effects are? They are graphics features: whether it's the CPU, the PPU or another VPU that creates them, the result still has to be rendered by the graphics card, and the effect is to make more complex visual scenes (like deformable objects or particle effects, etc.). To think these have nothing to do with graphics is ridiculous. Their role in this demo can't be determined unless the demo itself is detailed by nV or whoever made it.
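The coupling is easy to picture as a frame loop (a hedged C++ sketch; the step_physics_* backends and the piece counts are invented purely for illustration, not any real engine or API): whichever backend computes the debris, the graphics card still has to draw every piece of it.

Code:
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };

// Hypothetical backends: same job, different hardware doing the simulation.
static std::vector<Vec3> step_physics_cpu() { return std::vector<Vec3>(200);  }  // coarse software sim
static std::vector<Vec3> step_physics_ppu() { return std::vector<Vec3>(5000); }  // denser hardware sim

// Stand-in for the render call: rasterizing the pieces is always the GPU's job.
static void draw(const std::vector<Vec3>& debris) {
    std::printf("rendering %zu debris pieces this frame\n", debris.size());
}

int main() {
    const bool has_ppu = false;              // pretend no PhysX card is installed
    draw(has_ppu ? step_physics_ppu() : step_physics_cpu());
    return 0;
}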

Quote:
You don't need a benchmark program to tell the difference; the effects are either there or they aren't.


That's not true; that's only Ageia's current implementation with their hardware. A more generalized physics API, like the ones ATi, Havok, nV and M$ have all said they were working on, will scale depending on the system, according to their releases.

As for the effect being 'there or not': the whole PhysX launch demo can be run on a dual-core PC, just at a reduced speed, so the 'PPU only' scenario was busted long ago;

http://forumz.tomshardware.com/hardware/ftopic-185393-0...

Even then it's not THAT reduced a speed, but the PPU likely reduces the rock-bottom drops that would be noticeable in a demo, whereas a missing cloth effect or a reduced resolution or particle count might well go unnoticed, especially since so many similar small issues in the Ageia demo went unnoticed. Seriously, your statement isn't supported by the evidence already out there about the PPU. It's not an all-or-nothing issue; perhaps you should look into it.

And that speaks to exactly the scenario that would cause issue for a demo that incorporated things that have been demo'ed even in the Crysis engine.
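That scaling argument boils down to something like this (a rough C++ sketch; the particle counts and sub-step numbers are arbitrary placeholders, not anything Ageia or nV actually ships): the same effect can stay in the demo on a software path, just dialed down, rather than disappearing outright.

Code:
#include <cstdio>

struct PhysicsConfig {
    int particle_count;       // how much debris/cloth detail to simulate
    int substeps_per_frame;   // how finely to integrate each frame
};

// Pick a fidelity level based on what hardware is actually present.
static PhysicsConfig pick_config(bool has_ppu, int cpu_cores) {
    if (has_ppu)        return {30000, 4};   // dedicated hardware: full-fat effect
    if (cpu_cores >= 2) return { 8000, 2};   // dual-core software fallback, scaled down
    return               { 2000, 1};         // single core: bare-minimum version
}

int main() {
    PhysicsConfig cfg = pick_config(/*has_ppu=*/false, /*cpu_cores=*/2);
    std::printf("particles=%d substeps=%d\n", cfg.particle_count, cfg.substeps_per_frame);
    return 0;
}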

Quote:
If they had been using the physx card for something then yanked it out their demo would have gone to crap. This is the obvious answer, until there is something to suggest otherwise it's not "fishy" or "suspicious" or "a PR blunder". It's nothing but pure baseless speculation.


How do you know how the demo scales? Considering you never saw it, nV hasn't detailed it, and it's doubtful anyone took a magnifying glass to it pre/post PhysX card, you can't say anything about it one way or another. Yet you're awfully quick to say what is and isn't possible and to try to invalidate any concern, and your methods do more to hurt your case, especially when you make statements of fact without any supporting evidence: like I said before, exactly what you're bashing the InQ article for.

Quote:
eh? Is "SLI Physics" actually available? I knew they had done a demo but I didn't think it had actually been brought to market. Did I just miss that? The articles you linked are from march.


LOL! So now you're going to argue by using semantics about whether nV has a competing solution to market or not? Wow that's a lame argument dude.

Quote:
In a quick search I was not able to find any "SLI Physics" products for sale.


Well, based on the implementation, the 'products' are already for sale (the GF7600GT, the SLi / Quad SLi mobos). I think what you're trying to say is that the 'software isn't available yet'.

Quote:
nVidia is not currently competeing with Ageia, they have some R&D on a competing product, but using (which it turns out they weren't anyway) a product that does something they are researching is very different from using a product that does something that they're currently trying to sell people.


Hardly, bud. And nowhere near your attempt to minimise it by saying that just using it is no big deal, and then claiming they weren't, which you still don't know. If nV had an ATi AIW in their rig because they didn't have a market equivalent but were developing something similar, it would be just as embarrassing as a current part; it still points to the same gaping hole. Whether it's a current gaping hole in their line-up or a future gaping hole in their line-up, it's still the same gaffe. Wanting to debate the semantics of whether we can buy their announced physics solution or not doesn't diminish that blunder, which they obviously saw as a blunder too, hence the withdrawal.

Quote:
Isn't part of the point of a community like this to see through the hype?


No, it's to discuss hardware, and this is a hardware issue as it relates to the demo software; that, at least, you should be able to understand. Is it earth-shattering? Hardly, but it is eyebrow-raising.

Quote:
I don't care about what would happen to their "image" if nVidia used an Ageia Physx card to spiff up their demo, and neither should anyone else.


Everyone should, especially if people are thinking the way you are telling them to, that it's only graphics performance being shown. It adds false expectations to the market for the product that is the focus of this forum, graphics cards. There are additional issues that could even come from this: what does it say for nV's physics solutions if they have PhysX cards in their rigs instead of implementing their own? The pictures raise a lot of questions, and nV's actions and statements do very little to minimize those questions; if anything they accomplished the opposite by making it a little more interesting that they pulled the card.

Quote:
It's a rumor about hype.


No, it's a rumour about a picture; the hype is about that rumour about a picture of an otherwise unknown nV PC... in the hole in the bottom of the sea.

Quote:
Real data is comparing available products to other available products.


No it's not; real data isn't limited to just that. Real data also comprises data about as-yet unreleased products. I have a lot of REAL DATA about Vista and the Santa Rosa mobile chipset, and neither is technically 'available' yet. Now you're making yourself look ignorant simply to try to bolster your weak argument.

Quote:
It's an insinuation that nVidia admitted that their physics solution is inferior to Ageia's... but nVidia doesn't even have a physics solution, they're working on one.


It's still a solution, even if it's not being given to the public yet. nVidia may have a solution to a driver issue, and it remains a solution even before it's launched for public beta consumption.

Quote:
This *could* have been a keen observation that lead to uncovering that nVidia was switching from the havok engine to the physx engine.


Now who's pulling more out of the story than could ever be confirmed? :roll:

Quote:
I don't think they even knew that nVidia isn't even working on it's own engine... just making their cards run someone elses engine, so even if it were true they were switching engines would be bad news for Havok, and change nothing for SLI Physics as it's not released yet.


I think now you're just being obtuse, because you have nothing to contribute other than to mirror what you dislike in the InQ article and add more hyperbole. Of course they already know the implementation of SLi Physics; they've covered it many times in great detail, as have many of the other sites;
http://www.theinquirer.net/default.aspx?article=30571

I only posted THG and Rojak because they were easy finds, and because I couldn't use only the InQ; you'd have simply ignored it as more rumour.

Quote:
It's a stupid rumor, I'll torch my nVidia card if it turns out to be true (heh, just ordered a new ATI one... but it's the principle of the matter, right?). I'm not talking about it anymore...


I wouldn't bother torching an existing product unless you are one of those poor souls who bought their card solely for the promise of SLi Physics or Havok FX as promoted by nV. Same as the previous issues, including the Catalyst AI issue: it's an issue until it's addressed and put to rest.

From the start I said its implications are minor, but the thing that does surprise me is the handling of the issue. I easily grant that the article was poorly written and the issue poorly pursued, but considering your professed lack of caring, you certainly go to great lengths to mischaracterize the entire article just to try to diminish the sarcastic and rhetorical statements meant to elicit a response. It's like trying to crush a bug with a sledgehammer, breaking the floor, and missing the bug.

Like I said before, it's a rather trivial thing, but their reaction and lack of full disclosure just smacks of previous gaffes, making it all the more questionable if there's nothing to hide. The biggest question being: have they not learned from their past mistakes? Probably the one thing they have learned from this experience is to update the expo prep handbook;

Things to do before the Expo opens:
1
2
.
.
.
.
.
8: Remove any hardware or stickers, etc. that have the potential to embarrass, paint us in a bad light, or raise questions we don't want to answer.
9: Check breath before talking to press or booth babes.
10: If you missed something in section 8, cash your pay cheque from nV ASAP and check Manpower or Monster.com.
October 18, 2006 3:54:25 AM

Quote:
Hmm well I'd be tempted to say it's just the Inquirer, but there is photographic evidence.

This is just wild speculation, but perhaps it wouldn't be crazy to conclude Nvidia is planning on buying out Aegia? It would make sense to include the PhysX chip on a graphics card, like the 3D accelerator was integrated onto 2D cards.

Of course it could just be another sensationalist Inquirer story!

Yeah, there was photographic evidence that the Inquirer took a photo of a PC with a PhysX card in it. Nothing else. :D 