Dell 2405fpw 24" 1080p?

phnguyen89

Distinguished
Sep 22, 2007
47
0
18,530
Hello. I searched for previous topics about this but I did not find any. I want to ask whether my current monitor, the 2405fpw, is 1080p capable. It does not have an HDMI port, so it does not support HDCP; however, it does have component inputs. Moreover, the Sony AV connector listed at sony.com says that the connector supports 1080p. With all of that said, do I need to buy a new monitor? Thank you for your time.
 

Kraynor

Distinguished
Aug 10, 2007
829
0
19,010
All 1080p means, essentially, is that the vertical resolution is 1080 lines. Most 24" monitors have a native resolution of 1920x1200 and as such can display 1080p signals. My Iiyama 24" monitor has no problem displaying either my Xbox 360 (via VGA) or my PS3 (via HDMI) when set to 1080p.
 

Iscabis

Distinguished
Jun 8, 2006
82
0
18,630
I have that exact same monitor, and yes, it can do 1080p. Like Kraynor said, 1080p is just describing a 1920x1080 progressive signal. The 2405fpw can do 1080p just fine, since it is capable of 1920x1200 :D
 

phnguyen89

Distinguished
Sep 22, 2007
47
0
18,530
Once again, thank you, because I've surfed around and people keep saying you have to have HDMI and HDCP. Hurray, I don't have to buy a new one.
 

eltoro

Distinguished
Dec 31, 2007
70
0
18,630
Yes, this monitor can do 1920x1200, and that's its native resolution.
We all know how critical it is for LCD monitors to work at their native resolution, because the graphical quality is seriously degraded at any resolution other than the native one.
That said, won't running this monitor at 1920x1080 degrade the visual quality?
Is there any official word regarding this (like a 24" monitor spec that states that 1080p is also considered a native resolution)?

Tom
 

Iscabis

Distinguished
Jun 8, 2006
82
0
18,630
I think what should happen is that there would just be black bars at the top and bottom of the screen. It should be the same as playing an HD movie on it, where the black bars preserve the native resolution. The only difference between 1920x1200 and 1920x1080 would be that fewer pixels are used.
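
If you want to put numbers on it, here's a quick Python sketch of that 1:1 mapping (my own illustration, assuming the monitor simply centers the image rather than scaling it):

# Quick sketch: a 1920x1080 signal shown 1:1 on a 1920x1200 panel.
panel_w, panel_h = 1920, 1200
signal_w, signal_h = 1920, 1080

unused_rows = panel_h - signal_h      # 120 rows left over
bar_height = unused_rows // 2         # 60-pixel black bar top and bottom

print(f"Black bars: {bar_height}px top and bottom")
print(f"Pixels used: {signal_w * signal_h} of {panel_w * panel_h}")

So about 90% of the panel's pixels are used, and the rest just sit in the two bars.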
 

eltoro

Distinguished
Dec 31, 2007
70
0
18,630


Sounds logical. I hope that's really the case, as I'm considering getting a 24" monitor myself (the 2407WFP-HC). Of course, there's the issue of having to invest in a top-class graphics card every time I'd want to bump up to the next generation; that would be critical for decent performance at 1920x1200 in contemporary games.
That would cost much more than the initial investment in the monitor itself.
 

Iscabis

Distinguished
Jun 8, 2006
82
0
18,630
It should definitely work either way, because HD cable signals can run over component at 1920x1080. Component is the highest-quality analog connection available, and it can carry 1080p. I would still use DVI if you can, since it is a digital signal.

Oh, and as for the graphics card power needed to run games at 1920x1200, it is not as bad as it seems. Graphics cards in the $200-250 price range can handle 1920x1200 gaming quite well nowadays. I have an 8800 GT (the 512MB version can be had for around $200 or so right now), and I can play all my games at 1920x1200 maxed out. Crysis is the exception, but that is it. 1920x1200 gaming is easily attainable right now, and that will pretty much remain the case for future video card lines, because HD gaming is very popular these days and the companies know it.
 

raiden99

Distinguished
Mar 16, 2008
6
0
18,510
I have the Dell 2405fpw 24" monitor you speak of. The monitor can handle any resolution up to 1920x1200; 1920x1080 will only fill most of the screen. However, this monitor has a handy feature, like most big screens, that lets you stretch the image to fill the screen. In the image settings, you can choose 1:1, aspect ratio, or "fill".

1:1 will display the image at only the exact size of the original resolution; this includes games. If I play an older game that runs at 640x480, it only shows on part of the screen.

Aspect ratio will scale the image, no matter the size, while maintaining either a 16:9 or 4:3 ratio. If you're watching television over the component input, for example, and you set the output of the cable box to 1080i, the monitor will keep the aspect ratio of that output. 1080i is defined as a 16:9 ratio (1920x1080). 1080p simply means that the image is progressive: the whole frame is drawn in one pass instead of as alternating lines (odd then even, or vice versa). A game or regular television broadcast that uses a 4:3 ratio will fill the screen top to bottom but only show in the middle, with bars on the sides. The 4 is the width and the 3 is the height.

When using "fill", you're essentially stretching any image to fit the monitor's ratio of 16:10. 16 is the width and 10 is the height. 16:9 ratio would then be stretched taller to fit the screen. 4:3 would then be stretched much wider and everyone looks really chubby. =)

16:9 only looks funny on a 16:10 monitor if the film isn't true 16:9 but was instead shot in a much wider format. I'm sure you've seen them; they're the ones that have much thicker black bars on the top and bottom than other films. 16:9 is simply a common middle ground among the various ratios, not the rule. Of course, many films are true 16:9, so it's usually no problem.

Also, games that aren't natively widescreen will look really wide if you use "fill" instead of "aspect". However, many games are natively widescreen these days, so you don't have to worry.

As for HDCP, that's where this monitor falls short: it has no HDCP support. The 2407 introduced that much-needed feature. I bought this monitor before HDCP was in wide use. I knew I'd need it eventually, but instead of spending twice as much (at the time) for an HDCP-capable monitor, I decided to spend less and then either sell this monitor or use it exclusively on a computer and buy another one that has HDCP.

In other words, with any source that requires HDCP to display the image, you're out of luck. HD DVD and Blu-ray both require HDCP, no matter how you view them (computer or a component box). On the other hand, you can bypass HDCP with both methods, though you'll supposedly lose quality. On a computer, you can use a VGA cable instead of DVI and you'll be able to view most titles that way; in some cases, though, you won't. Instead, it's better to just purchase a copy of AnyDVD HD. It detects a disc's copy protection and removes it while you view it. Not everyone has an HDCP-capable monitor, so this is the workaround for the computer. Of course, only use real movies on the original discs, or you'll run into trouble both with the law and with future software.

As for component boxes, you can watch the movies over component cables, but the signal isn't the same quality as over HDMI or DVI. First of all, the digital vs. analog quality difference is obvious; beyond that, as part of the standard's security measures, HDCP-protected HD movies can require that the resolution of an analog output be lowered to discourage piracy. Sure, you can view the movie, but it won't be true 1080p. Of course, I'm sure there are models that allow viewing over a VGA cable or something, but I haven't done any research on that yet. I'm sure there are people on here who have.

My advice? I would stick to watching them on the PC and just output the sound over a good 7.1-channel sound card to your receiver. AnyDVD HD works flawlessly. All you need beyond that is a Blu-ray/HD DVD software player. I assume you already have a drive; if not, there are read-only combo drives for around $230, and I know you can get a Blu-ray/DVD burner for $150. I'd get the $230 drive, because it's going to be a while before they can reissue all the HD DVDs as Blu-rays, and I'm sure there will still be HD DVDs for quite some time. I'm certain HD DVDs will be cheaper than Blu-rays; they always have been from the start. Keep in mind that Blu-ray has pretty much conquered HD DVD, and Toshiba (the co-founder of HD DVD) has bowed out.

Anyhoo, thanks for reading and good luck.
 

raiden99

Distinguished
Mar 16, 2008
6
0
18,510
Oh, I have the MSI 8800 GTX video card, and it'll decode 1080p smoothly on every machine I have (AMD X2 4400, Intel E6600). Of course, the E6600 runs it with less strain. To run it even more smoothly, get the later-generation cards that have better HD acceleration; I know the 8500 and 8600 included better optimization for the HD codecs, so you can decode HD media on a slower machine than you could with an 8800 GTX. Look on the internet for a review on that very subject; they tested how low you could actually go, CPU-wise, without experiencing problems.

You can also go the ATI route. I can decode HD content with my ATI X1800 XL combined with my Intel E6600. The X2 4400 combo will do 720p, but it doesn't always like 1080i or, especially, 1080p; it depends on the encoding. WMV 1080p is easier to play, for example, than MKV files. Of course, the K-Lite codec pack is always being improved, so maybe I should try again with their latest revision.

Anyway, as long as your video card supports HDCP and your processor is fast enough, 1080p is a reality. I'd also recommend using 10,000 RPM hard drives on slower machines, or even RAID 0 if you're really worried. However, that won't save you from a lack of proper video acceleration.

Good luck and good night! :D
 
