Eizo Updates CG276 IPS Monitor to Accept 4K Resolutions
Eizo has announced that all CG276 monitors produced from the end of April onward will be able to handle 4K and 2K signals.
Eizo's ColorEdge CG276 has already been on the market for quite a while, but a new update allows it to support 4K signals. While previously produced models don't, any model produced from now on will support 4K as well as 2K signals.
The monitor itself doesn't have a 4K panel; in fact, its resolution is 'just' 2560 x 1440. What the unit does is scale down the UHD (3840 x 2160) or 4K (4096 x 2160) resolution to the panel's resolution. Granted, at these resolutions the refresh rate will only be 30 Hz.
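To make the scaling concrete, here is a quick back-of-the-envelope sketch in plain Python (illustrative arithmetic only, not anything from Eizo's actual scaler) of how UHD and DCI 4K frames map onto the panel's 2560 x 1440 grid:

```python
# Hypothetical illustration of aspect-preserving downscaling to the
# CG276's native 2560x1440 panel; the real scaler's algorithm is unknown.
panel_w, panel_h = 2560, 1440

for name, (src_w, src_h) in [("UHD", (3840, 2160)), ("DCI 4K", (4096, 2160))]:
    # Uniform factor that fits the whole source frame inside the panel
    scale = min(panel_w / src_w, panel_h / src_h)
    out_w, out_h = int(src_w * scale), int(src_h * scale)
    print(f"{name}: x{scale:.3f} -> {out_w}x{out_h}")
```

Note that UHD maps exactly onto the full panel (a 2/3 scale), while the wider DCI 4K frame would scale to 2560 x 1350, leaving letterbox bars.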
Eizo has decided to add this support because 4K is rapidly becoming a standard in the film industry. While televisions are slowly gaining 4K support, computer monitors are not. The WQHD resolution (2560 x 1440) is still far from becoming a standard, so it will likely be a long wait until we see 4K computer monitors.
Eizo's ColorEdge CG276 has already been on the market since November 2012, and it now has a street price of about $2,300.
I think a closer expectation is to get one quarter the frame rate, since 4K has 4x the pixels of 1080p. And frankly, that is what we're looking for.
1080p is approximately two megapixels. The highest-res monitors are about four megapixels, and a 3x1 Eyefinity setup of 1080p monitors is about six megapixels. A large 4K monitor is pretty close to eight megapixels, the highest yet.
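Those megapixel figures check out with simple arithmetic (the 2560x1600 entry is my assumption for "highest res monitors", e.g. a 30-inch panel):

```python
# Pixel counts behind the rough megapixel figures quoted above
resolutions = {
    "1080p": (1920, 1080),                  # ~2 MP
    "2560x1600 (30-inch)": (2560, 1600),    # ~4 MP
    "3x1 Eyefinity of 1080p": (5760, 1080), # ~6 MP
    "4K UHD": (3840, 2160),                 # ~8.3 MP
}
for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h / 1e6:.1f} MP")
```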
I wonder how a game like Skyrim or the latest Battlefield, if it had 4K support, would run on a 4K monitor at maxed settings with a single, double, or triple GTX Titan. Of course, a game like Crysis 3 on max settings could choke even three Titans, I'd guess. But what about a less demanding game, or playing at less than max settings?
I wonder whether gamers would prefer, if given the choice, 2560x1440 at max settings or 4k without the anti-aliasing. I guess time will tell.
30 inch monitors at 4K should be under 500 dollars.
Eventually it will happen.
Until then I'm good with what I have.
Hell... most people can't see the difference between 720p and 1080p side by side on a typical monitor-sized screen.
This is obviously for video/TV and such. The upped resolution would be cool for gaming graphics, but we're a long way from tech that can run such resolutions at an acceptable level...
Cable companies can barely provide 1080p... mostly 1080i and 720p...
No need to spend money on this 4K display until it settles down
Agreed. I know the theory is that you can fit more stuff on one screen, but if it's too small to see, then who cares.
Personally I am more interested in better display technologies. I wish OLED displays would hurry up and go mainstream. After two phones with them, I'd like a 24" one to replace my current monitor.
4k monitors will eventually be standard, and content will support it. Then we will all complain about how the newer Crysis will not run on a 4k monitor without a $10,000 computer.
I wouldn't get a 4K monitor under 48 inches, no matter how low the price.
Anyone who thinks that you can't has never seriously looked at a monitor for more than 30 seconds. We're getting 1080p on 5" phone screens now; maybe that is not needed at that size, but on 30" desktop monitors at 120-240 Hz you WILL be able to tell. Quite easily, and with the nekkid eyeball.
Trust me, you will be able to tell the difference at a typical viewing distance, which is about 2-3 feet. Of course, 4K displays are probably going to be 30"+, but the benefit is very much worth it.
There is a separate component in monitors like these which takes the input video and up/downscales it to the native resolution of the display. This is worth bragging about because other monitors without a native 4K resolution can't even handle the signal. It is not an ideal situation, but it is a feature, and I'm not sure why people are bashing it. If you had one of these monitors and needed to view such a signal, you could, without having to buy a whole new native 4K monitor.
Those sizes are a bit unrealistic. For 4K you want something along the lines of a monitor at ~40" and a TV of ~120".
Here is my reasoning:
Right now we are seeing 5" phones which can do 1080p. 4K or UHD is essentially four 1080p displays put together, which means at the same density a UHD screen would have a diagonal measurement of 10" (which would make one sweet tablet). The optimal viewing distance for such a device is roughly 6", or 1.66" of screen size per 1" of distance, in order to keep things at roughly a 'retina' resolution. For 1080p there is a similar ratio of 0.8" of screen size per 1" of distance.
For a UHD computer monitor where you sit ~2' or 24" from your screen then you would want a 40" screen. Compare that to a 20" 1080p monitor having similar clarity.
For your average living room setup where you sit some 6-7' from your TV, you would want a 120" TV. Compare that to a 1080p TV at a similar distance, which should be ~60" for the same sharpness.
Let's say that you have a larger home where you sit some 12' from the TV. That would mean a 240" UHD TV, which would quite literally be 16' by 9' (really a bit larger than that), or a 1080p TV at a much more pedestrian 120".
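All of the size/distance examples above fall out of the two rules of thumb stated earlier (1.66" of diagonal per 1" of viewing distance for UHD, 0.8" for 1080p; both are the poster's assumed ratios, not measured values):

```python
# Rule-of-thumb "ideal" screen diagonal per viewing distance,
# using the ratios quoted in the comment above (assumptions, not data)
RATIO = {"UHD": 1.66, "1080p": 0.8}

def ideal_diagonal_in(viewing_distance_in, standard):
    """Diagonal (inches) that keeps the display near 'retina' density."""
    return RATIO[standard] * viewing_distance_in

for dist_ft in (2, 6, 12):
    d = dist_ft * 12  # feet to inches
    print(f'{dist_ft} ft: UHD ~{ideal_diagonal_in(d, "UHD"):.0f}", '
          f'1080p ~{ideal_diagonal_in(d, "1080p"):.0f}"')
```

At 2' this gives the ~40" UHD monitor, at 6' the ~120" UHD TV, and at 12' the ~240" figure, matching the worked examples.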
UHD TVs are not big simply because they are luxury items. They are big because that is how they are meant to be viewed. Any smaller a screen per distance simply defeats the purpose of having one, because your eye physically cannot see that kind of detail. If I had infinite money but had to live in a 'normal' home, I simply would not want a TV much larger than 120", and would not be able to fit one larger than 170" through my front door (making for an optimal viewing distance of 11'). This makes UHD a real and true candidate for being the 'last' resolution standard for consumer monitors and TVs.
The very idea of a 'last standard' for anything is a bit scary and troubling, as well as exciting. It is not that they will not be able to make bigger higher resolution displays down the road (because they will), it is simply that we are reaching the limits of what can be appreciated by our human senses within the constraint of 'normal' spaces.
This is something we have already started to hit in the CPU space for office and home users. By this I mean that for web browsing and video watching you don't need any more CPU power than can be found in a $50 Pentium G2020 (note that this is CPU power, not GPU). Outside of gaming and content creation there is simply no further need for CPU horsepower to improve. You can only open Word, Outlook, and web browsers so quickly before there is no longer an appreciable difference, and if it is not loading quickly enough, it is much more likely to be an HDD bottleneck than anything to do with the CPU. Gaming and content creation dictate that CPUs continue to improve, but hitting visual limits on what can be displayed on a screen implies that there will be a point where CPUs for gaming and content creation hit a wall of practical improvement as well (granted, we are probably still some time off from that).
So I guess my question is this: What does the electronics industry look like in 10 years when this type of technology is just standard? Will things be built cheaply so that they are made to break and be replaced every 5 years? Will prices rise to make up for the lack of new sales as volume slows down to replacing broken units rather than keeping up with demand for upgrades? Will electronics companies shrink in size to keep overhead costs down? Like I said earlier, this is super interesting to think about. There may or may not ever be an end to how far we can push technology, but there is definitely an appreciable end to it, and it looks like we will hit it at some point in the next 10-20 years, which is all sorts of weird to think about.
Nvidia and AMD will both more than likely support 4K on the new 8000 and 700 series. With the jump to a 384-bit bus, Nvidia will be able to make 3-6 GB cards.
Someone is buying up all the RAM too, driving up prices, yet we don't see any new cards from either vendor. Hm... I wonder... Could Nvidia and AMD be at fault? Could the new cards from AMD and Nvidia be causing the memory manufacturers to stop making DDR3 and tie everything up in GDDR5 manufacturing?
Only time will tell..
One thing is for sure, the new cards will support 4k and this shit is about to get real! GIMMI!
This is an Eizo screen, it will cost you an arm and leg even at 1x1 pixels =)
http://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_Titan_SLI/6.html
In BF3, 3x GTX Titan drops almost 50% of its fps when scaling from 1920x1200 to 5760x1080, which is more than a 2.5x increase in the number of pixels; going to 4K would most likely drop it to ~70 fps.
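The extrapolation can be sanity-checked with pixel arithmetic. The 150 fps baseline below is a hypothetical figure, and real GPUs usually scale better than a pure pixels-per-second model, which is why the linked benchmark lost only ~50% of its fps for a 2.7x pixel increase:

```python
# Pixel counts for the resolutions discussed above
px_base     = 1920 * 1200  # "19x12"
px_surround = 5760 * 1080  # "57x10" triple-screen Surround
px_uhd      = 3840 * 2160  # 4K UHD

print(f"surround / base: {px_surround / px_base:.1f}x pixels")
print(f"UHD / base:      {px_uhd / px_base:.1f}x pixels")

# Naive inverse-with-pixels estimate from a hypothetical 150 fps baseline;
# observed scaling tends to be better than this worst case.
fps_base = 150
print(f"naive UHD estimate: {fps_base * px_base / px_uhd:.0f} fps")
```

Since the benchmark shows fps falling slower than pixel count rises, the real 4K number should land somewhere between this naive estimate and the baseline, consistent with the ~70 fps guess.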
I would rather go with 4K resolution and no AA; it should have a bigger impact on image quality than 4x supersampling.