Vizio E320VT LED LCD Edge Lit Razor 32"

uvall

Distinguished
Sep 8, 2011
I just bought an E320VT Vizio LED LCD Razor TV. My dilemma is that this TV is supposed to top out at 720p from what I have read, but for some reason, when I hook up my PC through the HDMI 1 port, I get a full 1080p picture. The recommended resolution goes all the way to 1920x1080 at 60Hz. I've tested this in various ways and it seems to be a true 1080p picture. My question is, how is this so? Does Vizio make a 1080p and a 720p set that are otherwise the same, then disable the 1080p mode in the EDID and market it as an E320VT to sell it cheaper, when in fact it's the same as another 1080p model they sell?

I ran a program called Phoenix that scans a TV's EDID and saw that this TV had two entries, one with a 720 setting and one with a 1080 setting.

To test the TV I ran several games with an overlay showing refresh rate and frames per second, with VSync enabled. At 1920x1080 I stayed locked at 60 frames per second after capping my framerate to the TV's refresh rate (60Hz). If I swap the TV's refresh rate to 30Hz (which, if I'm not mistaken, is 1080i), the picture flickers somewhat and doesn't look that great, but at 1080 60Hz it looks great.

So is there anything at all you can tell me that may help me figure out how a so-called 720p TV with this model number (E320VT) from Vizio can be achieving a clean and clear 1080p picture?
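For anyone who wants to reproduce the kind of check Phoenix does, the EDID blob can be read and its detailed timing descriptors decoded by hand. Below is a minimal sketch, assuming a Linux box that exposes the EDID under /sys/class/drm; the connector path is hypothetical and will differ per system, and tools like edid-decode do the same job in far more detail. It only decodes the four 18-byte descriptors in the base block, which is enough to see whether a 1920x1080 timing is advertised.

```python
# Minimal sketch: dump the detailed timing descriptors (DTDs) from an EDID blob.
import struct

EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"  # assumption: adjust for your connector

def parse_dtd(d):
    """Decode one 18-byte detailed timing descriptor."""
    pixel_clock_khz = struct.unpack("<H", d[0:2])[0] * 10
    if pixel_clock_khz == 0:
        return None  # not a timing descriptor (monitor name, serial string, etc.)
    h_active = d[2] | ((d[4] & 0xF0) << 4)   # low 8 bits + upper 4 bits
    v_active = d[5] | ((d[7] & 0xF0) << 4)
    interlaced = bool(d[17] & 0x80)
    return h_active, v_active, interlaced, pixel_clock_khz

with open(EDID_PATH, "rb") as f:
    edid = f.read()

# The base EDID block holds four 18-byte descriptors starting at byte 54.
for i in range(4):
    timing = parse_dtd(edid[54 + i * 18: 54 + (i + 1) * 18])
    if timing:
        h, v, ilace, clk = timing
        print(f"{h}x{v}{'i' if ilace else 'p'}  pixel clock {clk / 1000:.2f} MHz")
```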
 

revolution2718

Honorable
Apr 8, 2012
This is one of those problems that's not really a problem, haha. My first guess would be that the TV is accepting the 1080p signal and automatically down-converting it to the panel's native resolution, but beyond that I'm not sure; it is kind of odd.

On a side note, 1080i vs 1080p has nothing to do with the refresh rate. When you knock it down to 30Hz, you would see flicker on any monitor or video source, because the image is only being refreshed 30 times a second. The difference between i (interlaced) and p (progressive) is how the display draws the individual lines of the video signal: interlaced draws half the lines (the odd ones), then the other half (the even ones) on alternating passes, while progressive draws every line on each pass. This is independent of both the refresh rate and the frame rate. I honestly don't think they could have made it more confusing for people.
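To put rough numbers on that distinction: at the same 60Hz refresh, a progressive signal delivers every line on every pass while an interlaced one delivers only half, so 1080i60 works out to 30 complete frames per second at roughly half the pixel rate of 1080p60. A quick back-of-the-envelope sketch (illustrative only; real signals add blanking intervals on top):

```python
# Rough comparison of line/pixel throughput for 1080p60 vs 1080i60.
lines, pixels_per_line, hz = 1080, 1920, 60

progressive_lines_per_sec = lines * hz          # every line, every refresh
interlaced_lines_per_sec = (lines // 2) * hz    # one 540-line field per refresh

print(f"1080p{hz}: {progressive_lines_per_sec * pixels_per_line / 1e6:.0f} Mpixels/s")
print(f"1080i{hz}: {interlaced_lines_per_sec * pixels_per_line / 1e6:.0f} Mpixels/s")
# 1080i60 delivers 60 fields/s, i.e. 30 complete frames/s, at about half the pixel rate.
```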
 

uvall

Distinguished
Sep 8, 2011



Yeah, it is odd and not a bad thing, but I'm bewildered by it and really want to understand it more. The thing is, in my video settings the 30Hz mode is marked as interlaced and the 60Hz mode is not. Windows labels the interlaced resolutions in parentheses, for instance "1920 by 1080, True Color (32 bit), 30 Hertz (Interlaced)" versus "1920 by 1080, True Color (32 bit), 60 Hertz". I also thought that no 1080i mode could reach a 60Hz refresh rate, which is why I tested it the way I did.

On a side note, I do have a legit 42-inch 1080p Vizio as well, and this 32-inch seems to look just as clean and clear when set to the default/recommended 1080p (1920x1080). Also, when I run my satellite box at 1080i (the highest it will go), the 32-inch in question shows on its on-screen info that it is indeed running in 1080i, instead of the 1080p it shows when I'm running my PC at 1920x1080 (the recommended display resolution).

I have also gone on today and scanned my 42-inch 1080p Vizio's EDID and noticed it had a 1080 entry identical to the one I'm seeing on this 32-inch, which is supposed to be 720p max. So I'm starting to think that maybe Vizio just makes one chip for a certain series of TVs, then releases them under different model numbers at slightly different prices, disabling the higher 1080p resolution on the lower-priced model while keeping it enabled on the higher-priced one. Sort of like what Nvidia has done with some of their GPUs in the past.
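Those "(Interlaced)" labels come from the mode list Windows builds from the EDID, and that list can be dumped programmatically to see exactly which modes the driver believes the TV supports. Below is a minimal sketch using ctypes and the Win32 EnumDisplaySettingsW call; the DEVMODE layout follows the standard display-device variant of the Win32 structure, and DM_INTERLACED is the flag Windows sets on interlaced modes. It is an illustrative sketch for the primary display, not anything Vizio-specific.

```python
# Sketch: list the display modes Windows exposes for the primary display and tag
# the interlaced ones, mirroring the "(Interlaced)" label in the resolution dialog.
import ctypes
from ctypes import wintypes

DM_INTERLACED = 0x00000002  # dmDisplayFlags bit for interlaced modes

class DEVMODEW(ctypes.Structure):
    # Standard Win32 DEVMODEW layout (display-device variant of the union).
    _fields_ = [
        ("dmDeviceName", wintypes.WCHAR * 32),
        ("dmSpecVersion", wintypes.WORD),
        ("dmDriverVersion", wintypes.WORD),
        ("dmSize", wintypes.WORD),
        ("dmDriverExtra", wintypes.WORD),
        ("dmFields", wintypes.DWORD),
        ("dmPositionX", wintypes.LONG),
        ("dmPositionY", wintypes.LONG),
        ("dmDisplayOrientation", wintypes.DWORD),
        ("dmDisplayFixedOutput", wintypes.DWORD),
        ("dmColor", ctypes.c_short),
        ("dmDuplex", ctypes.c_short),
        ("dmYResolution", ctypes.c_short),
        ("dmTTOption", ctypes.c_short),
        ("dmCollate", ctypes.c_short),
        ("dmFormName", wintypes.WCHAR * 32),
        ("dmLogPixels", wintypes.WORD),
        ("dmBitsPerPel", wintypes.DWORD),
        ("dmPelsWidth", wintypes.DWORD),
        ("dmPelsHeight", wintypes.DWORD),
        ("dmDisplayFlags", wintypes.DWORD),
        ("dmDisplayFrequency", wintypes.DWORD),
        ("dmICMMethod", wintypes.DWORD),
        ("dmICMIntent", wintypes.DWORD),
        ("dmMediaType", wintypes.DWORD),
        ("dmDitherType", wintypes.DWORD),
        ("dmReserved1", wintypes.DWORD),
        ("dmReserved2", wintypes.DWORD),
        ("dmPanningWidth", wintypes.DWORD),
        ("dmPanningHeight", wintypes.DWORD),
    ]

mode = DEVMODEW()
mode.dmSize = ctypes.sizeof(DEVMODEW)
i = 0
# NULL device name = the display the calling thread runs on; iterate until 0 is returned.
while ctypes.windll.user32.EnumDisplaySettingsW(None, i, ctypes.byref(mode)):
    tag = "interlaced" if mode.dmDisplayFlags & DM_INTERLACED else "progressive"
    print(f"{mode.dmPelsWidth}x{mode.dmPelsHeight} @ {mode.dmDisplayFrequency}Hz ({tag})")
    i += 1
```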