I just got a 42 inch TV (pretty sure it's HD because it's a big flatscreen -- I got it from a friend) and I connected it to my laptop. When I went to the Nvidia Control Panel, the maximum resolution I could set was 1024 by 768.
I have an NVIDIA GeForce 7150M in the laptop. I couldn't find its maximum supported output resolution on the Nvidia site, but even lower-end graphics cards have a much higher maximum than 1024 by 768, so I assume my card can go higher.
I have a 7-pin S-Video adapter, but the cable connecting it to the TV is 4-pin S-Video on the laptop side and composite on the TV side. Could this be the reason why it won't go higher?
Please help, thanks
I dunno, that is a really old card. Try updating its drivers, but I wouldn't be surprised if it didn't support any higher resolution. After all, that card is only meant to drive your small little laptop screen, not a 42 inch TV.
Yep, it is not the video card that is the problem, it is the TV you are hooking it up to. It won't support any higher res than 1024x768.
And yes, just because it is a big flat screen TV does not mean you are going to get a fantastic picture by hooking it up to a computer.
For those who believe the "old" video chipset is the culprit, you're wrong. Even GeForce 5 series and Radeon 9xxx series cards are capable of such resolutions. (Note that GeForce 7 series chipsets are used on widescreen laptops all the time. Please refrain from spreading misinformation.)
The likely problem is the standard-definition (read: not widescreen) output limitation of the even "older" S-Video port. Most modern laptops have a built-in VGA output, and that is the port through which you should actually connect the TV. The S-Video port would work just fine for a standard-definition monitor or television, but this is a widescreen, so that's not the case here. Newer TVs actually have VGA input ports on them. If yours does not, does it have a DVI input, or perhaps HDMI? If it does, a simple VGA-to-DVI adapter along with a DVI-to-DVI cable, or possibly (though I've never actually tried this myself) a VGA-to-DVI adapter paired with a DVI-to-HDMI cable, might do the trick. Sadly, if it has neither of these ports, then this model's a bit of a dinosaur as far as HD LCD widescreen TVs are concerned. I'm not sure if there are VGA-to-Component adapters available. Consult the PC manufacturer and see what they say about it, because there should be a means to do it.
What you could also try fiddling with is the nVidia Control Panel. You could disable the EDID detection for the monitor (or in this case, the TV), then manually set the resolution to match that of the TV. As someone mentioned, most widescreen HDTVs that size have 1920 x 1080 resolution; however, there are several still around that run at 1366 x 768 -- not 1024 x 768. (1024 x 768 is a standard-definition resolution, not widescreen.) A little research on the TV manufacturer's website for that specific model would shed some light on the matter. Read the sticker on the back of it to find the model number.
Thanks for all the answers,
The TV is definitely an HD TV and is definitely capable of HD resolutions.
It does have HDMI connectors, and it supports 1080p (1920x1080) and 1080i. So I guess the problem is with the connector, not the TV or the computer. Should I buy a VGA-to-HDMI converter/cable? Would that work?
Use DVI to HDMI -- you can't convert an analog signal to HDMI with a passive adapter. Another thing: is it a plasma TV? If so, 1024x768 may well actually be the TV's native resolution, as some plasmas use non-square pixels like old CRT tubes. Then you'd just feed it a 1080p signal and let it scale the picture up or down.
Yeah, it does. My friend told me about USB-to-HDMI adapters, but I looked it up and they don't support 1920x1080. Is there any other way (besides getting a VGA-to-HDMI converter) that I could get the full resolution on my TV from my laptop?
I dunno if it's a scam or not. It's in Hong Kong... who knows what sorts of gizmos they have there that we don't in the US? However, since neither Newegg nor Radio Shack (between the two, they sell just about anything and everything) carries any such VGA-to-HDMI cable, I tend to believe they don't work.
Yes. But using one of those VGA to HDMI converter boxes seems like quite an expense to display your laptop on an HDTV.
I have a full HD TV which can support a maximum resolution of 1920 x 1080. I want to connect it to my laptop through VGA. All I want to know is the maximum resolution that the "VGA out" can provide, and which parameter of the computer the resolution of the VGA out depends on. You can also mail me at "email@example.com".
Odd to see such an old thread dug up, but I'll respond. In doing so, I'll also finally answer the OP's question about the use of a VGA-to-HDMI cable.
To my knowledge, VGA has no defined maximum resolution. However, it's often said that the practical max is 2048x1536 due to the bandwidth limitations of the VGA adapter's DAC (Digital-to-Analog Converter, or RAMDAC) along with the bandwidth limitations of VGA cables. Whether this commonly accepted maximum coincides with the max resolution of CRT monitor screens, or whether the CRT max came about due to VGA's own limits, is the real question. Unfortunately, it would seem the only way to find out would be for a CRT screen capable of even higher resolutions to be developed. It's a bit of a "which came first -- the chicken or the egg" situation.
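To see why that 2048x1536 figure keeps coming up, you can do the back-of-the-envelope math yourself. A rough sketch: the pixel clock a mode needs is roughly width x height x refresh rate, plus blanking overhead, and it has to fit under the RAMDAC's rated clock. The 400 MHz RAMDAC and ~25% blanking overhead below are assumptions (typical for GPUs of that era), not specs for any particular card.

```python
# Rough pixel-clock estimate: why ~2048x1536 is often quoted as VGA's
# practical ceiling. Assumes a 400 MHz RAMDAC (common for that era)
# and ~25% extra for horizontal/vertical blanking intervals.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.25):
    """Approximate pixel clock in MHz for a given display mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

RAMDAC_LIMIT_MHZ = 400  # assumed; varies by card

for mode in [(1024, 768, 60), (1920, 1080, 60), (2048, 1536, 85)]:
    clk = pixel_clock_mhz(*mode)
    verdict = "fits" if clk <= RAMDAC_LIMIT_MHZ else "exceeds DAC"
    print(f"{mode[0]}x{mode[1]} @ {mode[2]} Hz -> ~{clk:.0f} MHz ({verdict})")
```

Even 2048x1536 at 85 Hz lands around 334 MHz with this estimate, which is why it fits under a 400 MHz DAC while anything much larger starts pushing the limit.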
I'm afraid I don't understand the second part of your question... which parameter of the computer does the resolution of the VGA out depend on? What do you mean by parameter? The only thing I can say without fully understanding is this: output (or display) resolution is typically determined by one thing -- the resolution(s) supported by the monitor, TV, or display device. The video card can also affect your resolution, as some GPUs are capable of more resolutions than others. But that has become more of a generational gap, as modern GPUs are all capable of very high resolutions (up to 2560x1600, usually).
Since the thread was dug up and I never addressed it before, onto VGA-to-HDMI conversion issues.
First and foremost, understand that VGA is an analog signal while HDMI is a digital one. Video cards create a digital signal originally, and as I mentioned earlier, the DAC of a VGA output changes that digital signal to an analog one. An HDMI display requires no such change or conversion -- it's digital from start to finish. So, when attempting to connect an analog VGA output to a digital input (in this case HDMI), that signal must be actively reconverted into a digital signal again. A cable cannot achieve this, as there is no active device within it to actually "convert" the signal. Converter boxes, which contain active converter circuitry, can achieve this, but output resolutions vary.
The best scenario when connecting to a digital input is to begin with a digital source.