After a bit of sorting out how to get the card working, I've discovered a small problem with my monitor and card. When I view the monitor from a moderate or low angle, wavy moiré patterns flash subtly across the screen in diagonal lines. The effect is also noticeable from the sides, and is least noticeable when looking at the monitor head-on, especially from a high angle.
On top of that, in this set of monitor tests, one of the squares flashes (the 4 series). I ran a similar test on my desktop at work, and no such flashing occurred. This leads me to believe that the monitor I'm using doesn't match well with the card, which would make some sense: it's an LCD monitor with a VGA plug, connected through an HDMI adapter.
A lingering fear of mine is that some component in my computer is overheating. I do know that even after reducing the card's clock speed, some screen tearing/jumping still occurs, albeit very infrequently. Here's a list of my computer's components and the temperatures they reach (recorded with CPUID where available) while running a graphics-intensive game (Champions Online) on high settings:
- Motherboard - Dell 0R849J : 63C (some voltages fluctuate slightly; at the moment, VIN1 and -12V seem the most volatile)
- Processor - Intel i7 920 : 53C minimum, 81C maximum (1st core)
- Hard Drive - WDC WD6400AAKS-75A7B2 : 35C (very consistent)
- Graphics Card - Radeon HD 7950 Gigabyte : 60C
- Monitor - Dell ST2010 : running at native resolution, 60 Hz
I guess the question is twofold: Is anything going particularly wrong with my setup at the moment? And for the monitor, what adjustments might minimize the wave artifacts/flickering, and failing that, what new monitor should I consider?
Thanks to anyone who can give advice!