How do I get Windows (any version) to ignore HDTV info and do what I want it to do instead?

pcGnome

Reputable
Feb 22, 2014
2
0
4,510
I have an Element ELCFT262 - 26" 720p 60Hz LCD HDTV.

The manual says "HD compatible - up to 1080p."

And, in fact, if I trick it (VGA switch & 1080p monitor) it WILL display 1920 x 1080. The manufacturer assures me that it scales its input down to 720p, so the resolution can never really be better than that. But I still get my Win7 desktop to display a 1920 x 1080 image, so my desktop is just how I like it.

I think it's a GREAT feature that it can do that scaling thing.

If it were a monitor, I'd just figure out a way to fiddle the device driver so Windows thinks it's 1080p and all would be fine. But it's an HDTV, and HDTVs don't use device drivers. A monitor without a device driver will default to some standard VGA mode, but an HDTV gets queried by Windows, tells it the maximum resolution is 1366 x 768, and that's all Windows will offer.
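For anyone curious how Windows gets that number without a driver: it reads the display's EDID data block over the VGA cable's DDC lines, and the advertised resolution is packed into a "detailed timing descriptor." A rough sketch of the decoding (the sample bytes below are a generic 1080p60 timing, NOT a dump from this TV):

```python
# Sketch: decode the resolution fields of an EDID detailed timing
# descriptor (the block Windows reads from the display over DDC).
def parse_dtd(d):
    """d: first 8 bytes of an 18-byte detailed timing descriptor."""
    pixel_clock_mhz = (d[0] | d[1] << 8) / 100      # stored in 10 kHz units
    h_active = d[2] | (d[4] >> 4) << 8              # horizontal pixels
    h_blank  = d[3] | (d[4] & 0x0F) << 8
    v_active = d[5] | (d[7] >> 4) << 8              # vertical lines
    v_blank  = d[6] | (d[7] & 0x0F) << 8
    return pixel_clock_mhz, h_active, h_blank, v_active, v_blank

# Sample bytes for a standard 1920x1080@60 timing (148.5 MHz pixel clock,
# 2200 x 1125 total raster) -- illustrative only.
clk, ha, hb, va, vb = parse_dtd(bytes([0x02, 0x3A, 0x80, 0x18, 0x71, 0x38, 0x2D, 0x40]))
refresh = clk * 1e6 / ((ha + hb) * (va + vb))      # -> 60.0 Hz
```

If the TV's descriptor (or its list of supported standard timings) tops out at 1366 x 768, that's all Windows will ever believe it can do, no matter what the panel electronics actually accept.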
So, every time Windows reboots or does a "check for new hardware" or "detect monitor," my 1080p screen immediately shrinks to 1366 x 768 and I have to go trick it again.

The bottom line is that if you send this HDTV a 1920 x 1080 signal, it will display it just fine. So I'd like NOT to have to fight Windows and the continuing aggravation this causes me.

The manufacturer seems not to understand that they built it to receive and CORRECTLY display 1080p; instead they claim I'm trying to do something beyond the design specs, and they've stopped conversing with me on the subject. I maintain that if it WILL DO 1080p, then the design specs are falsely limited to far less than it's capable of.

Any ideas? At least people with this HDTV should know that it's much more capable than they are led to believe. It used to be that everyone had to be careful not to damage a monitor by sending it signals it couldn't handle, but I'm pretty sure every modern HDTV and monitor has got the word and will simply display an "unsupported mode" message when it gets a mode it can't handle. I just know that if it displays 1080p, then 1080p is supported, regardless of what it reports to Windows and regardless of what the manufacturer thinks is beyond its capabilities.

It's not perfect. When I got it to do 1600 x 900 @ 60 Hz, the image was displayed too far to the right (beyond the monitor's ability to correct), but it turns out the fix is to run 1600 x 900 @ 75 Hz and it works just fine.

Just want to make my property do what I want it to do.
pcGnome
 

Wolfshadw

Titan
Moderator
There's a perfectly good reason why your TV manufacturer won't discuss the matter with you anymore: you're pushing your TV beyond the specifications they laid out. It may well be capable of displaying 1920x1080, but it's unwise to do so for any number of reasons. And by discussing this with your manufacturer and pressing the issue, you've all but voided your warranty.

Think of it as overclocking a CPU. Yes, you can get performance beyond the manufacturer's stated specifications, but at the cost of heat and energy. Push it too far for too long and you'll likely damage it. With a CPU, you can install a better cooling solution to alleviate that issue. Not so with a TV.

-Wolf sends
 

AnUnusedUsername

Distinguished
Sep 14, 2010
235
0
18,710
More on topic, I don't think there's a standard way to override the display modes Windows detects. These detected modes aren't always correct (in particular, I have an older 1920x1200 monitor that Windows thinks can do 1920x1440, go figure). It might be possible to modify your graphics driver so it always offers the resolution you want, but I don't know how to do that in Windows. You could probably also "fake" a monitor driver, but that's also way above my head. Most flavors of Linux do support arbitrary resolutions, so that's an option (though not a good one for gaming, obviously).
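If you do try the Linux route, arbitrary modes are usually added with cvt and xrandr, something like the sketch below (the output name VGA-1 varies per system, and the modeline numbers are cvt's standard output for this mode — adjust both for your own hardware):

```shell
cvt 1920 1080 60   # prints a CVT modeline for 1920x1080 @ 60 Hz
xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
xrandr --addmode VGA-1 "1920x1080_60.00"
xrandr --output VGA-1 --mode "1920x1080_60.00"
```

That bypasses the detected mode list entirely, which is exactly what Windows won't let you do out of the box.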

However, if your TV upscales everything by default, the first thing to do is fiddle with its settings to see if you can disable upscaling, then work from there. Upscaling content takes TV processing time, which introduces input lag for no benefit. Most TVs have a game or PC mode somewhere that disables all or most post-processing (though I figure you've already looked for that).
 

pcGnome

Reputable
Feb 22, 2014
2
0
4,510
Quick comments first: it appears Windows avoids needing a graphics driver by querying the device itself, which apparently responds "I can do 1366 x 768." I also reiterate that the manual does (cryptically) state it's "HD compatible up to 1080p," just above where it says "Panel Resolution 1366 x 768."

Not worried about the warranty; I found out too late that 720p just won't do. It's a cheap refurb until I can get something decent. The surprise was that it can do what it can do.
It's not a lag problem at all, as it's downscaling (taking 1080p input and shrinking it to 720p, or so said the manufacturer), not upscaling, and there's no delay or anything; it's entirely transparent and looks for all the world like it's really doing 1080p. I just know I can get the size of desktop I want and not the scrunched one 720p affords.

As far as going beyond what's intended, well, someone at Element created the ability to handle 1080p input by downscaling. I think a more apt comparison would be the ECS X79R-AX, which went to great pains to enable the SAS controller, which Intel apparently didn't want to happen (both Intel and ECS say they don't warrant that the SAS controller will work ... but it does anyway, for SATA drives at any rate).

Possibly my real problem is that an HDTV doesn't use a device driver, and I'd sure like to learn how to make it use one, even if I have to figure out how to write one myself. But I don't know how to forcibly make Windows accept a driver for a device it doesn't think needs one. Does anyone know a way to adapt or hack an existing driver for something else and make Windows think it's compatible? I suspect this is either impossible or so easy I can't think of how to apply it.
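For what it's worth, Microsoft does document a mechanism that's roughly this: a monitor INF can override the EDID the display reports by writing replacement 128-byte EDID blocks into the registry (see Microsoft's "Overriding Monitor EDIDs with an INF" sample, Monsamp.inf). A heavily abbreviated sketch; the bytes here are placeholders that would have to come from a corrected EDID dump of the actual display:

```
; fragment modeled on Microsoft's Monsamp.inf EDID-override sample
[EDID_OVERRIDE.AddReg]
; value "0" holds the first (and usually only) 128-byte EDID block,
; with its timing bytes patched to advertise 1920 x 1080
HKR, EDID_OVERRIDE, "0", 0x01, 0x00, 0xFF, 0xFF, ...
```

Windows then uses the overridden block instead of whatever the TV reports over the cable, which is exactly the "make it use a driver" effect being asked about.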
I suppose I'm back to getting a monitor with real resolution (2560x1600 or 1440) and letting this dinosaur with futuristic abilities go collect the dust it deserves. Or maybe stick with HDTV and get a UHD 39" (and a new graphics card, of course). Sadly, either choice will end my lovely KVM setup, which can't do greater than 2048 x 1536. Has anyone noticed that what they're calling 'ultra wide' 2560 x 1080 is really (compared to 2560 x 1600 / 1440) just ultra short?

The easiest and most obvious solution would be to just get another 32" 1080p (I actually HAD an Element ELDFW322 32" 1080p until I accidentally let a heavy pendulum I was playing with crack the screen).

In case anyone's interested, I'm fooling my system now with an old 22" Acer V223W, which is 16:10. I found a device driver that claims it can do 1080p (it can't really do more than 1680x1050), but Win7 accepts it and lets me set it to 1080p. When the Acer says "out of range," I use the VGA switch (never thought I'd use that old thing again) to route the signal to the Element 720p, and it's just that cool.

So, unless a person a lot more brilliant than I am knows how to fool Windows into installing a device driver on a device that doesn't use one, I suspect the answer to my problem is "you can't get there from here."

And I thank those who answered, and all those who read and contemplated but didn't answer.

pcGnome

p.s. I use a KVM to switch between an old XP Pro computer (I can only get it to do 1600x900 @ 75Hz) and a Win7 Ultimate machine that does the 1080p thing like I like it.

Apologies for selecting "Don't forget the monitor" just below, then not being able to change it back. If I'm not supposed to make this an ongoing discussion, I apologize; let me know and I won't do it again.
 
