
General questions: DX10; FPS

Last response: in Graphics & Displays
June 19, 2007 12:01:57 AM

What is the difference between a DX10 card and a non DX10 card? There's something special about DX10, I gather. My current card has supported several versions of DX... but pre-DX10 cards won't support DX10?

You don't have to explain everything to me - I'd gladly accept a link to a page that does.

My second question is about FPS. When playing counterstrike I often hear people boasting "I get x00 FPS". I was under the impression that the human eye loses track of changes at around 60hz, and that's why most light bulbs and screens run at 60hz, so that people can't perceive any difference (flickering.)

So, is improvement in FPS above ~60 really that noticeable? I'm not very worried about it myself, but if so it WOULD mean I'd have to buy a new monitor, as this one only supports up to 60hz.

I could understand if there was a margin, and that with practice and experience one could distinguish differences a few tens of hz away from the 60hz "limit", but... 100FPS? 120? Is that really necessary?
June 19, 2007 12:31:33 AM

What the precise reason is I don't know, but DX10 is not compatible with DX9 cards. I believe it's the way the software handles shader commands, as this is what separates the DX10 cards from their DX9 brethren: unified shader architecture. Find GreatGrapeApe :wink: , he can explain this stuff in great detail.
As for the FPS question, there is already a thread going on the subject, giving a great many opinions on why it is/can/should/would not be useful. Check it out:

http://forumz.tomshardware.com/hardware/point-ftopict24...

Enjoy!
June 19, 2007 12:37:51 AM

Ah, thanks for that. I searched for DX10 but not FPS. I won't comment any more on that issue in this thread, unless somebody wants to talk about something somehow unique to this thread.

My DX10 question is, I think, answered. The other considerations are tangential issues like future-proofing and what games and so on will take advantage of it and how, etc etc. I really just wanted to confirm that there was a big difference.

Um, can a DX10 card run DX9 and back? Does it matter?
June 19, 2007 12:43:40 AM

Yes, a DX10 card can run DX9.
June 19, 2007 12:57:06 AM

Quote:
What is the difference between a DX10 card and a non DX10 card? There's something special about DX10, I gather. My current card has supported several versions of DX... but pre-DX10 cards won't support DX10?


Correct: a pre-DX10 card cannot run under the DX10 renderer; it has to use the DX9 one. Also, DX10 cards are special in that they feature unified shaders. Older cards had distinct pixel and vertex shaders. DX10 adds a new geometry shader and unifies the whole package. They opted for a unified structure because it allows the video card to dynamically allocate the proper number of shaders to each task as you play a game, using the right mix of pixel and vertex (and soon geometry, once DX10 games are available) shaders as necessary for maximum performance.
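To make the idea concrete, here's a toy sketch of why a unified pool beats dedicated pools when the workload is lopsided. The unit counts and workload numbers are made up for illustration; this is nothing like a real GPU scheduler, just the arithmetic behind the "idle shaders" argument.

```python
# Toy illustration of dedicated vs. unified shader pools.
# All numbers are invented for the example.

def dedicated_time(vertex_work, pixel_work, vertex_units=8, pixel_units=24):
    """Dedicated shaders: each pool only does its own kind of work,
    so the more heavily loaded pool limits the frame."""
    return max(vertex_work / vertex_units, pixel_work / pixel_units)

def unified_time(vertex_work, pixel_work, total_units=32):
    """Unified shaders: every unit can take either kind of work,
    so the whole load spreads across the whole pool."""
    return (vertex_work + pixel_work) / total_units

# A vertex-heavy scene: in the dedicated design the pixel units sit mostly idle.
print(dedicated_time(vertex_work=80, pixel_work=24))  # 10.0 time units
print(unified_time(vertex_work=80, pixel_work=24))    # 3.25 time units
```

Same total hardware (32 units) in both cases; the unified pool finishes the imbalanced frame roughly 3x faster because nothing sits idle.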


Quote:
My second question is about FPS. When playing counterstrike I often hear people boasting "I get x00 FPS". I was under the impression that the human eye loses track of changes at around 60hz, and that's why most light bulbs and screens run at 60hz, so that people can't perceive any difference (flickering.)

So, is improvement in FPS above ~60 really that noticeable? I'm not very worried about it myself, but if so it WOULD mean I'd have to buy a new monitor, as this one only supports up to 60hz.

I could understand if there was a margin, and that with practice and experience one could distinguish differences a few tens of hz away from the 60hz "limit", but... 100FPS? 120? Is that really necessary?


24 Hz is about the minimum necessary for "fluid movement"; higher frame rates result in smoother motion, which is especially important in first-person shooters, because below around 50 - 60 FPS, turning and making fast adjustments looks very jerky. Also, the refresh rate of your monitor is effectively the hard-coded maximum for what gets displayed in games, but having a higher frame rate than that sometimes helps.

In short, as long as you stay above 24 - 30 FPS in most games, gameplay will be smooth. Some genres, like first-person shooters, demand higher frame rates for fluid gameplay.
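One way to see why the gains flatten out at high FPS is to convert frame rate into frame time. Going from 24 to 60 FPS shaves off 25 ms per frame; going from 100 to 120 shaves off less than 2 ms. Quick sketch of that arithmetic:

```python
# Frame rate -> time between frames. The gap shrinks quickly at low FPS
# and flattens out at high FPS, which is why 24 -> 60 feels dramatic
# while 100 -> 120 is far subtler.

def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 60, 100, 120):
    print(f"{fps:3d} FPS -> {frame_time_ms(fps):.1f} ms per frame")
```

At 60 FPS each frame lasts about 16.7 ms; at 120 FPS, about 8.3 ms. Whether that last 8 ms is perceptible is exactly what the competitive players argue about.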
June 19, 2007 1:01:57 AM

Well, the first question is already answered, I see. Does it matter? If by that you mean, will there be a great difference in DX9 vs. DX10 games, well... it was supposed to be the next great thing in games, but for now it remains to be seen whether the actual implementation will be as good as the intended one.
Microsoft's goal was to create a software environment far more efficient than the already existing DX9: less driver overhead with more unique objects on screen, and far better effects. Also, through the unified shader architecture, it makes more efficient use of the available graphical processing power by putting as many shaders to work as possible instead of having dedicated shaders sitting idle.
From what I understand, in previous (DX9) cards there are dedicated pixel and vertex shaders. If there is a high load of vertex shading, pixel shaders will sit unused, and vice versa. In unified shader (DX10) cards, ALL shaders can be used for either pixel or vertex shading, thus using the available hardware much more efficiently.
In addition to shader tasks, they can also be programmed, in-game or through driver support, to take on additional tasks like AI and physics, though I am not entirely sure about this part.
Sift through the graphics forum a little bit, there are some great posts concerning DX9 vs DX10 architecture.

GL.
June 19, 2007 1:15:23 AM

You won't need to get a new monitor to play at higher frame rates. The refresh rate of the monitor won't affect the frame rate, except that the two might be out of sync with each other, in which case you might notice the so-called "tearing" (which I have never noticed, but I don't game that much). I think that is why games have the V-sync option in video settings: to limit the maximum frames per second to the refresh rate of your monitor.
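The effect of that V-sync cap can be sketched as a simple frame limiter: if a frame finishes rendering faster than the monitor's refresh interval, the game waits out the rest of the interval before presenting the next one. This is a minimal illustration of the pacing idea only; real V-sync waits on the display's actual refresh signal rather than a timer.

```python
import time

REFRESH_HZ = 60
FRAME_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh

def run_capped(render_frame, num_frames):
    """Render frames, but present at most one per refresh interval."""
    for _ in range(num_frames):
        start = time.perf_counter()
        render_frame()  # placeholder for the game's rendering work
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_INTERVAL:
            # Frame finished early: sleep out the rest of the interval,
            # capping the effective frame rate at REFRESH_HZ.
            time.sleep(FRAME_INTERVAL - elapsed)

run_capped(lambda: None, 3)  # three trivially fast frames, paced at ~60 Hz
```

With the cap on, a GPU that could render 300 FPS still only presents 60, which avoids tearing at the cost of the extra responsiveness the Counter-Strike crowd is chasing.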

The minimum frequency required for the human eye not to notice flickering is about 20 (Hz, FPS, whatever). However, in games, to get fluid movement it is best to have a minimum of 30 FPS.

Some people say they can notice differences in frame rates up to about 100 in games, but I'd say those guys spend a lot of time gaming. As for the Counter-Strike guys talking about frame rates in the hundreds, I think the reason is that in competitive first-person shooters (apparently) every millisecond counts, so they want the movement to be as fluid and responsive as possible.

Don't associate the refresh rate of your monitor with the FPS of your games.