
DirectX 11 shader in action on Nvidia's upcoming GPU

Tags:
  • Graphics Cards
  • GPUs
  • Graphics
May 1, 2009 9:17:52 PM


May 2, 2009 1:00:24 AM

Yep, it's a bit like every other Nvidia advertisement from the past year.
May 2, 2009 1:08:39 AM

I think we've just about reached the point where nothing can excite this market anymore. Graphics have officially stepped into the realm of photorealism, which means there will never be any more true improvements, other than a little refining here and there. At least not within the next few years.

Ever since buying my GTX 285 I haven't even glanced at any tech. I've completely lost my will, urge, and addiction to upgrade my hardware. Ever since I bought this X3370 and GTX 285 I finally feel secure and pleased with my system; it runs everything perfectly smoothly on max and does what I want it to do.

Pass on GTX 300 and HD 5000.
May 2, 2009 1:26:32 AM

I sorta agree with you, spathotan, except for the bit about photorealism. There is a long way to go in that regard yet.

As for excitement? Yep, even the 5890 and G300 make me think 'meh'.
May 2, 2009 3:20:53 AM

I moved on to consoles. A lot cheaper, and I can't tell the difference. :)
May 2, 2009 3:27:08 AM

spathotan said:
I think we've just about reached the point where nothing can excite this market anymore. Graphics have officially stepped into the realm of photorealism, which means there will never be any more true improvements, other than a little refining here and there. At least not within the next few years.

Ever since buying my GTX 285 I haven't even glanced at any tech. I've completely lost my will, urge, and addiction to upgrade my hardware. Ever since I bought this X3370 and GTX 285 I finally feel secure and pleased with my system; it runs everything perfectly smoothly on max and does what I want it to do.

Pass on GTX 300 and HD 5000.



Just wait until you see the benchmarks...and when Crysis 2 comes out. XD
May 2, 2009 3:52:08 AM

spathotan said:
I think we've just about reached the point where nothing can excite this market anymore. Graphics have officially stepped into the realm of photorealism, which means there will never be any more true improvements, other than a little refining here and there. At least not within the next few years.

Ever since buying my GTX 285 I haven't even glanced at any tech. I've completely lost my will, urge, and addiction to upgrade my hardware. Ever since I bought this X3370 and GTX 285 I finally feel secure and pleased with my system; it runs everything perfectly smoothly on max and does what I want it to do.

Pass on GTX 300 and HD 5000.


Just wait till all the games start coming out again in late fall/early winter.
May 2, 2009 3:59:39 AM

Again, people need to stop assuming that graphics will make a big leap. Look at the 8800 GTX: three years old and still pulling its weight.

For all we know, graphics won't advance enough to put pressure on current cards. Or we might be wrong, and these cards will turn into integrated cards.

:D Let's hope it's at least worth the frame loss, unlike DX10.
May 2, 2009 6:24:22 AM

I'm looking forward to tessellation, and to seeing how much more "real" real can get.
May 2, 2009 6:42:22 AM

spathotan said:
I think we've just about reached the point where nothing can excite this market anymore. Graphics have officially stepped into the realm of photorealism, which means there will never be any more true improvements, other than a little refining here and there. At least not within the next few years.

Ever since buying my GTX 285 I haven't even glanced at any tech. I've completely lost my will, urge, and addiction to upgrade my hardware. Ever since I bought this X3370 and GTX 285 I finally feel secure and pleased with my system; it runs everything perfectly smoothly on max and does what I want it to do.

Pass on GTX 300 and HD 5000.

I disagree. Graphics are nowhere NEAR photorealistic. The closest I have seen is the ATI demo from a while back, but most games (yes, including Crysis) are quite far from photorealism in most cases, especially with reflections.

Here's the demo I'm talking about btw: http://www.pcgameshardware.de/aid,655714/Ruby-20-Screen...

(look at the video on the page)

That is the closest I've seen to true photorealism, and even in this case the hair is completely wrong. It's way better than Crysis, though.
Anonymous
May 2, 2009 7:00:55 AM

L1qu1d said:
Again, people need to stop assuming that graphics will make a big leap. Look at the 8800 GTX: three years old and still pulling its weight.

For all we know, graphics won't advance enough to put pressure on current cards. Or we might be wrong, and these cards will turn into integrated cards.

:D Let's hope it's at least worth the frame loss, unlike DX10.

Frame loss? DX10 cards are great, lol. When DX10 cards came out, they ran DX9 games faster than anything else could, and then some; it was somewhat amazing. So with DX11, let's hope it runs DX10 games at great FPS, lol, and we'll move on from there.
May 2, 2009 12:17:16 PM

Bluescreendeath said:
Just wait until you see the benchmarks...and when Crysis 2 comes out. XD


Didn't give a damn when Crysis came out, won't give a damn when Crysis 2 comes out.
May 2, 2009 12:19:06 PM

cjl said:
I disagree. Graphics are nowhere NEAR photorealistic.


It's close enough. Graphics don't need to get any better. How about we start seeing some actual GOOD games made (remember those?) instead of all these $50 tech demos we've been given over the past 2 years.


May 2, 2009 2:31:13 PM

I think Spathotan hit on a good point. Creating such beautiful models and awesome graphics takes a huge amount of effort and money. Only the big makers can do it, and when they do, they can't afford to make something that will flop. So now we see fewer PC-only games and fewer interesting games (sure, most are fun, but think about 10 or so years ago and how many different and even wacky games there were). There are still plenty of good games (I just went back to Oblivion and M2TW, both heavily modded, of course), but they are fewer and rarer due to the huge amount of resources required.

I also wouldn't say I've completely lost interest in the new cards and DX11. I'll have to wait and see how things go, but who knows, I may even get a 5850 (though I am quite satisfied with what I have now, I do need some more VRAM for Oblivion). And L1qu1d, I can't believe you've gone to the dark side (consoles)! :non: :lol:
May 2, 2009 3:51:30 PM

All this PhysX stuff is actually starting to add up; it may come down to PhysX support when I choose between ATI and NV for my DX11 video card... I think ATI needs to come up with something, fast.
May 2, 2009 6:58:03 PM

Quote:
Nvidia need to move with the times and stop using marketing tricks to win sales

Right, because Spider and Dragon aren't "marketing tricks". Riiiiiiight...


Quote:
Of course I am as biased

Yep.
May 2, 2009 8:03:43 PM

Lately, ATI's focus has been on the betterment of graphics in general, such as having DX10.1 or a built-in tessellator in their cards.
CUDA, and the way nVidia's GPGPU architecture appears to be set up, enhances this capability, so essentially nVidia is moving things ahead as well, though it's their own initiative. And once LRB is here it'll be all the talk; trust me, those CPU boys won't be able to stop talking about GPGPU.
Having said that, like SS is saying, PhysX is just a software solution that started out proprietary and now needs help with, or from, both AMD (via ATI) and Intel, I believe, in order to survive. They spend a lot of effort and money on TWIMTBP, going to the devs, etc., and even on their own PhysX, but apparently they haven't done anything similar with either Intel or AMD.
As far as DX11 goes, it's about time nVidia caught up to the standard; renaming instead of creating has held us all back in some ways.
May 2, 2009 8:19:33 PM

Quote:
I'm sorry, but that is AMD, not the same thing.


Actually, it is. Keep making excuses though. Next.
May 2, 2009 8:26:13 PM

Quote:
I'm sorry, but that is AMD, not the same thing.

Re-naming old tech as new is a marketing trick. PhysX, until something actually comes of it, is a marketing trick.

What has nvidia done or supported recently that has benefited the industry?

Of course they are a company out to make money, but why are they dragging their feet with DX support, not just 10.1?

Why are they trying to push their own physics API and GPGPU tech, dividing people?

I know people will come out and say ATI does the same, but of late, or really for a while now, ATI appears to be trying to do things and not just maintain the status quo.

Yup, I agree with all of the above. Still not gonna buy ATI though.
May 3, 2009 12:22:01 AM

Sry Spath, but I fail to see how AMD offering a full computing platform is the same as nVidia's PhysX drumbeating. If you want to use an AMD GPU with an Intel CPU, you're perfectly free to do so. If you want PhysX, you have no choice but to purchase nVidia.

Also, we're no closer to photorealism than we were when Doom III came out. Yes, the games look amazing, but I've never thought I was watching a movie rather than playing a game. Trying to blame the eye candy for the lack of good games is BS; there's always room for graphical improvement.
May 3, 2009 5:19:59 AM

B-Unit said:
Sry Spath, but I fail to see how AMD offering a full computing platform is the same as nVidia's PhysX drumbeating.


You missed the point, completely. Why bother posting?

Nvidia markets/advertises PhysX/CUDA like it's something amazing, and performance-breaking if you don't have it. AMD (which owns ATI, btw) marketed the Spider platform, and now Dragon, in basically exactly the same way; they pretty much claim that using an ATI GPU with an AMD CPU will offer you benefits over Intel/Nvidia combinations.

Point being, both instances are BS marketing ploys. The only difference is that Spider and Dragon are just catchy little terms for a full AMD/ATI system. At least PhysX/CUDA DOES something.
May 3, 2009 8:20:12 AM

Quote:
I'm gaming less and less these days and doing more professional work instead.
The app I use most often is Photoshop, which is still fairly CPU-bound. 3D effects via OpenGL 2.0 are just eye candy.
Then there's video encoding/editing, and that's the area where nVidia is focusing their GPGPU (CUDA) effort the most. It would seem that while nVidia is supporting the ever-so-slowly maturing OpenCL, they're pushing CUDA development (already at v2.2) as fast as possible and heavily marketing it. Their ultimate goal is GPGPU domination.
The design of GT200 is the first step: they were willing to sacrifice 3D performance in order to get faster/more CUDA usage. Rumors of GT300 so far point to an even higher GPGPU-to-3D-power ratio. It's quite clear nVidia wants to win the GPGPU race with CUDA, BAD!

As for DX11, even though it's more programmable, it's still very much DX10.1 with tessellation and a few other minor things added. Meaning the designs of RV770, and therefore RV870, would most likely still hold the transistors/performance advantage in gaming, while nVidia puts all their effort into creating larger dies with less efficient 3D performance, all for the sake of more raw power for CUDA (and GPGPU in general).

What I'm wondering is: nVidia will most likely have to move G200 to its mid-level cards, and even keep a few G90s for the low end, because if they continue to make huge dies for the middle and low end, it'll keep cutting into margins big time and eventually have an eroding effect. Both nVidia and ATI can expect that with the onset of LRB, as LRB will take a certain portion of the market, and that market isn't any bigger before or after LRB comes.
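
(Since the quote above contrasts CUDA with the slowly maturing OpenCL, here's a minimal sketch, not from this thread, of what the vendor-neutral route looks like using OpenCL's C API: just listing the installed platforms. That portability is the whole point: the same code runs against nVidia, AMD, or Intel drivers.)

#include <CL/cl.h>
#include <cstdio>

int main() {
    // Enumerate whatever OpenCL platforms the installed drivers expose.
    cl_platform_id platforms[8];
    cl_uint count = 0;
    if (clGetPlatformIDs(8, platforms, &count) != CL_SUCCESS || count == 0) {
        std::puts("no OpenCL platforms found");
        return 1;
    }
    if (count > 8) count = 8;  // we only allocated room for 8
    for (cl_uint i = 0; i < count; ++i) {
        char name[256] = {0};
        clGetPlatformInfo(platforms[i], CL_PLATFORM_NAME,
                          sizeof(name), name, nullptr);
        std::printf("platform %u: %s\n", i, name);
    }
    return 0;
}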
May 3, 2009 9:42:29 AM

Well, LRB is quite a way off. It may seem uber-powerful right now, but when it comes out it may only be as powerful as a mid-range card. Maybe if they get the 48-core version out first they might compete, but I can't see LRB competing in its 32-core state.
May 3, 2009 11:57:31 AM

PhysX, CUDA, not supporting DX10.1, rehashing old tech under new names, and the "The Way It's Meant to Be Played" program are all part of NVIDIA's political game. PhysX is marketing hype by NVIDIA; it might not live very long, and it might never become the standard physics engine. Microsoft, ATI/AMD, and Intel haven't even approved PhysX yet, and AMD has also said that GPU physics is effectively dead until Microsoft's DirectX 11 comes out. Many NVIDIA video card users are blindfolding themselves into believing that PhysX is the future standard physics engine when maybe it isn't at all.

(Maybe I am wrong, but that is my impression and my opinion)
May 3, 2009 12:49:36 PM

EXT64 said:
I think Spathotan hit on a good point. Creating such beautiful models and awesome graphics takes a huge amount of effort and money. Only the big makers can do it, and when they do, they can't afford to make something that will flop. So now we see fewer PC-only games and fewer interesting games (sure, most are fun, but think about 10 or so years ago and how many different and even wacky games there were). There are still plenty of good games (I just went back to Oblivion and M2TW, both heavily modded, of course), but they are fewer and rarer due to the huge amount of resources required.

I also wouldn't say I've completely lost interest in the new cards and DX11. I'll have to wait and see how things go, but who knows, I may even get a 5850 (though I am quite satisfied with what I have now, I do need some more VRAM for Oblivion). And L1qu1d, I can't believe you've gone to the dark side (consoles)! :non: :lol:


Haha, I wanted to play RE5, Killzone 2, and some sports games (NHL 09 and FIFA 09) that weren't PS2 ports :( lol

When I saw the graphics and the mechanics of FIFA 09 and NHL 09 for PC, I was like... R U F'ING kidding me! Two 285s, and an '09 game, and this is what I get....

I popped in the PS3 demo and was like, let's see... omg... next gen... and then I cried. By which I mean I had something in my eye, of course :p

The dark side ain't so bad; they have three bean salad and German potato salad! (-Simpsons :p )

May 3, 2009 12:56:54 PM

Can I ask why people need to turn this thread into an Nvidia vs. ATI one?

My god, I can't post anything about Nvidia without starting something up, or people just trying to talk it down.

Many of the points are valid, and I too don't like PhysX, since Havok works fine for me, but that doesn't mean it's cr*p.

It is what it is: a selling point, just like DX10.1 is for ATI atm.

Oh, and you can stop using the word "rebranding" for Nvidia only, seeing as the 4890 is basically a rebranded 4870, 10% faster than the 4870, just like the 285 is 10% faster than the 280.

K. Stop fighting. I was hoping people would actually talk about DX11 more than PhysX.
May 3, 2009 5:10:13 PM

It would also help not to be biased, but to each his own.
I posted it in a rush, because I was stepping out, and thought it could be a fair read either way.

And you're the one that posts "don't feed the trolls" pics, right?

Nice quote.

I'm done in this thread. Good day!
May 3, 2009 5:25:02 PM

Quote:
You do realise the enormous gain in clocking potential when AMD redesigned RV770 to fix its shortcoming (a cap around ~880MHz), turning it into the 1GHz+ clocking monster we know today as RV790, found in the HD4890, don't you?
The same goes for the 65nm GT200 of the GTX 280 versus the 55nm GT200 of the GTX 285: a vast difference in clocking potential.
The GTS 250/9800GTX+ and 9800GTX/8800GTS 512MB, meanwhile, are pure rebrands.

And yes, I agree we'll stop talking about PhysX and focus more on DX11, but not on nVidia (GT300) in particular.


Yes, although that's true, it still isn't a new card from either company. To me they shouldn't be praised so much as criticized, because this is what the 4890 should have been and this is what the GTX 285 should have been.

You are wrong about one card, though: apparently the 55nm GTX 260 refresh didn't OC well at all; I've seen some bad experiences. I still think the best GTX 260 to OC, to date, is probably the 192 SP version.

I would still say that the number one card to get right now for the price is the 4890. The GTX 275 will always be a broken 285 in my books, and it's very unnecessary, especially since we were screwed over with the 260 going from 192 SP to 216, and now it's the 240 edition.

If you think about it, that's exactly what it is: a GTX 260 SP 240.

Anyways.
May 3, 2009 8:00:30 PM

The rebranding you mention and what everyone else refers to are two different things. The G200s aren't normally included in the rebranding fiasco nVidia pulled; that's the G8x-9x-1xx-2xx thing. So only you are really including the "real" G200s; no one else is.
May 3, 2009 10:23:04 PM

To me, tessellation will make a face look more like a face; it'll help fine-grained things become more realistic, and that's a good thing. At this point no one knows the FPS hit it'll take to do good tessellation, but it does look promising.
Having across-the-board FPS improvements with AA, like we see in DX10.1 games on ATI cards, also looks good, as that will allow for greater eye candy at the same current FPS, if you will. It looks like around 15% on average, which I'd assume the devs will use for more eye candy anyhow. To me, having W7, OpenCL, and DX11 will boost gaming a lot, as well as other apps, via the GPU RAM usage, multithreading, etc.
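
(To put a number on that ~15% figure: a back-of-envelope sketch with made-up baseline numbers, nothing measured, showing how much frame-time budget such a gain frees up per frame for extra eye candy.)

#include <cstdio>

int main() {
    const double dx10_fps   = 60.0;  // assumed DX10 baseline, purely illustrative
    const double dx101_gain = 0.15;  // the ~15% average quoted above
    const double dx101_fps  = dx10_fps * (1.0 + dx101_gain);
    // Frame time freed per frame, which devs could spend on more effects:
    const double freed_ms = 1000.0 / dx10_fps - 1000.0 / dx101_fps;
    std::printf("DX10: %.1f fps, DX10.1: %.1f fps, freed: %.2f ms/frame\n",
                dx10_fps, dx101_fps, freed_ms);
    return 0;
}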
May 4, 2009 12:22:08 PM

Please, JD; you don't get new, improved ways to produce more graphics while also gaining FPS. I'm expecting an FPS drop, and a major one at that, when tessellation is used (rough math sketched below). And again, I bring up the split DX API...

I'll also point out that PhysX has one huge advantage over OpenCL: console support. Since developers HATE split APIs, the fact that PhysX is already being used on consoles, while OpenCL has yet to get its feet wet, leads me to believe OpenCL won't be adopted in large numbers. I'll also point out that the only reason we don't get PhysX on ATI is because they choose not to implement it.
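
(Rough math on why the hit could be big, sketched with hypothetical numbers: under integer partitioning, a triangle patch whose edge and inside tessellation factors are all N amplifies into roughly N*N sub-triangles, so the triangle load grows quadratically.)

#include <cstdio>

int main() {
    const long base_tris = 50000;  // hypothetical character mesh
    for (int factor : {1, 2, 4, 8}) {
        const long amplified = base_tris * (long)factor * factor;
        std::printf("tess factor %d -> ~%ld triangles\n", factor, amplified);
    }
    return 0;
}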
May 4, 2009 5:18:07 PM

PhysX has been ported to all the consoles as of a few weeks ago.

Havok is an ENGINE, not an API. That's a very important distinction to make. To extend PhysX, all you need to do is extend the API, and no existing implementation needs to be affected. To extend an engine, you either modify it on a case-by-case basis or rewrite it from scratch. That takes time and effort, and even games that use the same engine will, after modifications, use entirely different sets of features.

Look at it this way: a person runs full speed into a guardrail. How do you get the person to trip over it? In PhysX, everything is done for you; you wouldn't need to code a single event, which reduces code overhead and improves maintainability. With Havok, you need to upgrade the engine to allow this ability, script it for every place where it's a valid action, and apply it only to the objects you want the effect on.

See the difference?
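
(An illustrative sketch of that distinction with hypothetical types: this is NOT the actual PhysX or Havok API, just the two styles side by side.)

#include <cstdio>

// Style 1: physics-API world. Objects carry physical state; a general
// solver decides outcomes, so "tripping over the rail" needs no script.
struct RigidBody {
    float vx, vy, vz;  // velocity
    float mass;
};

void SimulateStep(RigidBody& runner) {
    // A general-purpose solver would integrate forces and resolve
    // contacts here; tripping over the guardrail falls out of the
    // simulation instead of being a special case.
}

// Style 2: animation-event engine. Every interaction is a hand-scripted case.
enum Obstacle { GUARD_RAIL, WALL, CRATE };

void OnCollision(Obstacle type) {
    switch (type) {
        case GUARD_RAIL: std::puts("play 'trip' animation");    break;
        case WALL:       std::puts("play 'bounce' animation");  break;
        case CRATE:      std::puts("play 'stumble' animation"); break;
    }
}

int main() {
    RigidBody runner = {5.0f, 0.0f, 0.0f, 80.0f};
    SimulateStep(runner);     // outcome emerges from the solver
    OnCollision(GUARD_RAIL);  // outcome must be scripted per obstacle type
    return 0;
}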

It should also be noted that, because PhysX has yet to be used as the PRIMARY engine for in-game physics (thanks in part to ATI's refusal to implement support), you only get a mishmash of features instead of full support (the same issue exists with DirectX 10+). Thanks to its console support, though, we're starting to hear about the first batch of games coded around PhysX, which should show exactly how far the API can go, and why animation-event engines are going the way of the dinosaur.
May 4, 2009 10:08:19 PM

What I mean is, using DX10.1 allows for better FPS. Having that, it'll allow for more eye candy, or better use of tessellation. That's a new, improved way, and nVidia won't have it until they catch up and ship DX10.1-capable cards. We see DX10.1 games running on ATI cards show around a 15% increase when DX10.1 is used versus regular DX10 in the same game.
So yes, the newfangled improvements will allow for more, but that headroom will surely be used up.