X38 and Dual PCI Express 2.0 x16 graphics

MajestyMephisto

Distinguished
Oct 19, 2007
16
0
18,510
PCIe 2.0 x16 (2 - x16)

Ok folks, I was waiting for this chipset to come out and started to put together a computer online. My question is about the new dual PCIe 2.0 x16 slots in CrossFire. Am I correct in understanding that both cards in a CrossFire config will run at full bandwidth (x16 + x16, as opposed to the old setup running at half, x8 + x8)? And with that in mind... wouldn't it make more sense to purchase two mid-range GPUs that support DirectX 10 and 3.0 shaders and run them in CrossFire, as opposed to purchasing one high-end card?

e.g. 2 x ~$200 instead of one for ~$650

Sorry for the noob question; the last time I had to sort all this out was back in the AGP 8x days with SATA in RAID. Currently running a BFG 7800 GS OC and a P4 2.4 overclocked to 3.0 GHz (by upping the FSB, of course) with some very sweet RAM.

Cheers

MM

PS: Doesn't this new CrossFire, running at full bandwidth on both slots, now make it pretty much equal to SLI? Maybe someone could write a technical report on all the new speeds and configurations? It would be a great help to the n00bs :)
 

MajestyMephisto

Distinguished
Oct 19, 2007
16
0
18,510
Are you sure about that mid-range vs. high-end for better gaming... smoother fps etc., not just higher fps at high resolutions? Anything over 60 FPS is hype and propaganda anyhow... the human eye can only see about 60 fps...

Have you done a technical test on dual PCIe x16 running at full 2.0 speed? ...but thanks for the links...
 
I didn't do a technical test. If you read some reviews of the X38, you will see it runs CrossFire at dual x16.

Also, I am sure about 2 mid-range vs. 1 high-end: for example, a single 8800 GTS 320 trashes two 8600 GTs. You may want to read some benchmarks and see it for yourself.
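
For anyone who wants to put rough numbers on the x8/x8 vs. x16/x16 question, here's a quick back-of-the-envelope sketch. It assumes the nominal per-lane figures of roughly 250 MB/s per direction for PCIe 1.x and 500 MB/s for PCIe 2.0; real-world throughput is lower because of protocol overhead, and whether games actually saturate the link is a separate question.

```python
# Rough per-slot, per-direction PCIe bandwidth (nominal figures, illustrative only).
MB_PER_LANE = {"PCIe 1.x": 250, "PCIe 2.0": 500}  # assumed nominal per-lane rates

configs = [
    ("PCIe 1.x", 8,  "old CrossFire split (x8 per card)"),
    ("PCIe 1.x", 16, "a single full-speed slot"),
    ("PCIe 2.0", 16, "X38 CrossFire (x16 per card)"),
]

for gen, lanes, note in configs:
    bw = MB_PER_LANE[gen] * lanes
    print(f"{gen} x{lanes}: ~{bw} MB/s per direction  ({note})")
```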
 

T8RR8R

Distinguished
Feb 20, 2007
748
0
18,980
Actually, the human eye can only perceive about ~25 FPS. It's very rare for anyone to be able to see 35 FPS. Try some stop-motion animation, or read up on how they made the first Star Wars movies (4, 5, 6).
 

nicolasb

Distinguished
Sep 5, 2006
93
0
18,630
Everybody else in the world worked out why this isn't true for computer games round about 1997. Where were you? Sleeping off a hang-over?

(sigh)

The reason cinema film looks all right at 24fps and a computer game doesn't is that the camera's shutter remains open for a substantial part of the duration of each frame. That means that the image of any object moving across the frame is not perfectly sharp: it's blurred, as the camera has actually recorded its motion while the shutter was open.

By contrast, an image generated by a computer is absolutely sharp; the effect is as if a moving object were perfectly stationary for the duration of a frame and then suddenly, instantaneously, jumped to its position in the next frame. This makes motion intolerably jerky at speeds as low as 25fps. For gaming, 60fps should be regarded as the minimum frame rate for anything resembling smooth motion.

(This is, incidentally, one of the main reasons why modern CG animation looks so much more convincing than old-style stop-motion models did: modern CG uses blurred images, as if the CG model were moving while camera shutter is open; each frame of stop-motion animation is perfectly sharp, which means the motion isn't smooth).

In fact, even 60fps isn't fast enough. Try a quick experiment: wiggle your mouse rapidly from side to side and see what the pointer looks like. Do you see an even blurred image all the way across where the pointer is moving, or do you see a small number of sharp, discrete pointer images? If you see sharp images, it means you can see a difference between smooth motion and discrete motion at the frame speed of your monitor's refresh rate. With this kind of motion you would have to be refreshing the display at several hundred frames per second for it to look perfectly smooth.
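
To put illustrative numbers on the mouse-pointer experiment, here's a minimal sketch. The refresh rate, screen width and sweep time are assumptions picked purely for the arithmetic:

```python
# Hypothetical figures, just to show why the pointer appears as a handful of
# sharp, discrete images instead of an even blur.
refresh_hz = 60          # assumed monitor refresh rate
screen_width_px = 1920   # assumed horizontal resolution
sweep_time_s = 0.25      # assumed time to flick the pointer across the screen

frames_during_sweep = refresh_hz * sweep_time_s      # 15 frames drawn
gap_px = screen_width_px / frames_during_sweep       # ~128 px between images
print(f"{frames_during_sweep:.0f} pointer images, roughly {gap_px:.0f} px apart")
```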

Of course, the benefits of having a game run at a rate that is higher than the refresh rate of your monitor are questionable - if the monitor refreshes at 60Hz, any one pixel on the screen cannot refresh more than 60 times per second, regardless of the frame rate.

There are actually other reasons why a high frame rate is desirable, of which the most obvious is controller latency: the amount of time between you moving the mouse or game controller and the display updating to show the result of that movement. The longer this latency period is, the harder the game is to control, because you aren't getting feedback quickly enough as to whether the movement is what you intended or not.

One effect of this is that SLI and Crossfire sometimes don't produce quite the same feeling of speed that the frame-rate would suggest they should. Crossfire and SLI sometimes work by having each card render alternate frames. This more or less doubles the frame rate, but the controller latency remains the same, because you have to wait two frames rather than one before the effect of a controller move becomes visible.
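
A minimal sketch of that last point, assuming alternate-frame rendering and a purely illustrative 30 fps baseline (everything else in the input pipeline is ignored):

```python
# Illustrative only: alternate-frame rendering (AFR) roughly doubles the frame
# rate, but input latency stays near one single-card frame time, because the
# frame that reflects your input is still two (shorter) frames away.
single_card_fps = 30.0                               # assumed baseline
frame_time_ms = 1000.0 / single_card_fps             # ~33 ms per frame

afr_fps = 2 * single_card_fps                        # ~60 fps with two cards
afr_frame_time_ms = 1000.0 / afr_fps                 # ~17 ms per frame

single_card_latency = 1 * frame_time_ms              # wait one frame: ~33 ms
afr_latency = 2 * afr_frame_time_ms                  # wait two frames: ~33 ms

print(f"Single card: {single_card_fps:.0f} fps, ~{single_card_latency:.0f} ms wait")
print(f"AFR pair:    {afr_fps:.0f} fps, ~{afr_latency:.0f} ms wait")
```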
 

MajestyMephisto

Distinguished
Oct 19, 2007
16
0
18,510
Thanks nicolasb! I actually found that very interesting... as I'm in the "medical" field. What I meant about testing was what you actually get with CrossFire x16 at 2.0 with two cards vs. one... has anyone done that? I UNDERSTAND that the chipset supports it; that was not in question... please read my questions a little more closely. I am NOT a complete noob, and I think you are responding as if I am without COMPLETELY reading what I wrote. Are there even any cards that run at PCIe 2.0 yet? Because I thought I read yesterday that there aren't? Not sure on that one...
 

KyleSTL

Distinguished
Aug 17, 2007
1,678
0
19,790
The VAST majority of LCD monitors can only display 60-85 Hz; CRTs can go higher. So you can pretty much make the blanket statement that your monitor isn't showing anything above 60 Hz if you're using an LCD.

Although I did see that some manufacturer (Sharp or Pioneer, I forget which) is going to be releasing a 1080p LCD TV that displays 120 Hz. 1920x1080@120Hz is a huge amount of bandwidth, though, so I'm not sure if that's true. Someone correct me if you know better.
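
For what it's worth, a rough sketch of that bandwidth figure, assuming 24-bit colour and ignoring blanking intervals and link encoding overhead (so the real link rate would need to be somewhat higher):

```python
# Rough uncompressed pixel-data rate for 1920x1080 @ 120 Hz, 24-bit colour.
# Ignores blanking intervals and link encoding overhead (illustrative only).
width, height = 1920, 1080
refresh_hz = 120
bits_per_pixel = 24

bits_per_second = width * height * refresh_hz * bits_per_pixel
print(f"~{bits_per_second / 1e9:.1f} Gbit/s of raw pixel data")  # ~6.0 Gbit/s
```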