Here's a bit of an odd one for you: I'm using a Samsung UNC7000 series TV as my monitor, with the default screen resolution as 1920x1080 (1080p). GPU is an ATI 4890. I currently have overscan enabled to completely fill the screen. When I start a game in fullscreen (the two I've been playing are Civilization V and Battlefield: Bad Company 2), the TV switches from 1080p to 1080i.
The input I'm using has been designated the PC input, so that doesn't seem to be an issue. Likewise, windowed mode works fine, as do other devices hooked up to the TV. The TV is reporting that it is 1080p capable as well, as it's been added as one of the operating modes for the GPU.
I'm kinda stumped on why my games are starting up in 1080i. I'm ASSUMING it's the TV, but it could be GPU related as well. Any ideas?
Well, sitting a foot away from a 46 inch screen...yeah, it's actually quite obvious.
I'm actually starting to think it's the PC though; the TV should be accepting whatever the PC is putting out, shouldn't it? Maybe the way the PC is detecting the TV is causing it to output "i" instead of "p"? I might try forcing 1080p on the GPU today and see what happens...
I don't know how it looks on a PC, but it's not bad for a TV. However, PC games don't have motion blur, so it might be a bigger deal there.
For those who don't know, the "i" stands for interlaced, which means that each refresh only updates half of the lines (alternating between even and odd lines, one field at a time). It used to look horrendous on CRTs. It wouldn't be as bad on a flat-panel monitor, I'm sure.
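To make the interlacing idea concrete, here's a minimal toy sketch (not real display code, just an illustrative model I'm making up) of how two successive fields each update only every other line, so it takes both fields together to show one complete frame:

```python
# Toy model of interlaced scanning: each "field" refreshes only every
# other line, so two consecutive fields combine into one full frame.
# HEIGHT and the "blank"/"line N" strings are just illustrative stand-ins.

HEIGHT = 8  # tiny frame height for demonstration

def draw_field(displayed, new_frame, field):
    """Update only the even (field=0) or odd (field=1) lines."""
    for y in range(field, HEIGHT, 2):
        displayed[y] = new_frame[y]

# Start with a stale display and a new image to show.
displayed = ["blank"] * HEIGHT
new_image = [f"line {y}" for y in range(HEIGHT)]

# First field: even lines update; odd lines still show the old content.
draw_field(displayed, new_image, 0)
stale_after_one_field = displayed.count("blank")   # half the lines are stale

# Second field: odd lines update; now the whole frame is current.
draw_field(displayed, new_image, 1)
stale_after_two_fields = displayed.count("blank")  # nothing stale

print(stale_after_one_field, stale_after_two_fields)
```

That half-frame staleness between fields is what shows up as combing or flicker on fast motion, which is why it's more obvious in games than in film or TV content.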
I'm sure it's a lot more noticeable when gaming than when watching TV.