Dual Displays = Twice the GPU Power?

ikjadoon

Distinguished
Feb 25, 2006
Hi, everyone. This is a slightly noobish question, I know, I know.

I have a Mintek 26" HDTV and a Samsung 17" LCD. My lowly GeForce4 Ti 4600 has two outputs, one D-SUB (the blue 15-pin) and one DVI. I plug the D-SUB into the monitor and the DVI into the TV, and I set the card to Clone mode so the exact same image shows on both the TV and the monitor. Am I correct in saying that when I play a game, it takes twice the GPU power because the card is driving two displays instead of one, even though it's the same image? I've never noticed a drop in FPS, but my eyesight isn't great and I game at 800x600. :(

Just wondering! Thanks!

~Ibrahim~
 

Vinny

Distinguished
Jul 3, 2004
I don't think so... it only has to create one image and then scale it to the resolution each display needs. So it'll take a bit more juice than running a single monitor, but not twice as much.
 
Nah. If it's cloned, then it's just rendering the scene once; maybe a clock cycle or two of delay to send things to the appropriate displays, but that'd be it. Not noticeable.

What happens is that the image in memory is sent to the DVI output, and through the RAMDAC to the DB-15 output. Both are reading from the same buffer, so there's not much additional work other than telling the second display, "Oh yeah, use your unused hardware to show what's in memory too."

Only if you were stretching the game across both screens or doing two different views (like a map on one and first-person on the other) would it require more GPU power/resources than using one monitor.
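
Just to picture the difference, here's a toy Python sketch. Every name in it is invented for illustration; real scanout happens in fixed-function display hardware (the RAMDAC/TMDS), not in code:

def render_scene(width, height):
    """Stand-in for the expensive part: the GPU fills a framebuffer."""
    return [[(x ^ y) & 0xFF for x in range(width)] for y in range(height)]

def scan_out(framebuffer, label):
    """Stand-in for the cheap part: display hardware reads finished pixels."""
    print(f"{label}: showing a {len(framebuffer[0])}x{len(framebuffer)} frame")

def clone_mode():
    fb = render_scene(800, 600)          # the scene is rendered ONCE
    scan_out(fb, "monitor (D-SUB)")      # both outputs read the SAME buffer,
    scan_out(fb, "TV (DVI)")             # so there's no extra rendering work

def independent_views():
    # Map on one screen, first-person on the other: TWO render passes.
    # This is the case that actually costs real extra GPU power.
    scan_out(render_scene(800, 600), "monitor (map view)")
    scan_out(render_scene(800, 600), "TV (first-person view)")

clone_mode()
independent_views()

The point is that render_scene is the expensive call, and clone mode only ever makes it once per frame; the second output just reads pixels that already exist.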
 

ikjadoon

Distinguished
Feb 25, 2006
Along those lines, yeah. If it's the same image, then I guess so. A 7900 GT will be able to play nearly every game on the market with amazing FPS at 1280x1024.

~Ibrahim~