ASUS Launches Monitor with 3840 x 2160 IGZO Display (hexus.net)
211 points by fdm on May 31, 2013 | 176 comments



People talk about Retina displays etc ... but back in 2003 (yes, 10 years ago) ViewSonic had an even higher resolution display called the "ViewSonic VP2290b" [1]

It was 3840x2400 in just a 22" display. It had 204 dpi - and again, that was 10 years ago.

And for the naysayers, yes - this was mass-produced to the point that IBM was selling them as the T220 display [2]

[1] http://reviews.cnet.com/lcd-monitors/viewsonic-vp2290b-lcd-m...

[2] http://en.wikipedia.org/wiki/IBM_T220/T221_LCD_monitors


Indeed, the medical imaging industry (radiology particularly) has been relying on monitors like these for the last 10+ years:

http://www.necdisplay.com/category/medical-diagnostic-displa...

These monitors use IPS panels, run upwards of 5MP, and have better color depth than consumer-grade monitors. Oftentimes they are grayscale (most of these images are not in color).

These monitors are extremely expensive (~$20K), but they are FDA approved for diagnostic use. And yes, radiologists do need them.

Interestingly, as digital imaging proliferates across other medical disciplines (such as pathology), the costs of these high resolution displays and appropriate GPUs have been cited as a barrier to digitization. It seems that now prices are falling fast enough to make that excuse go away.

Radiologists and pathologists seem to love retina iPad displays...


> Radiologists and pathologists seem to love retina iPad displays...

Does the FDA allow them to be used for diagnosis, or do they have to 'technically' make the diagnosis after looking at one of those monitors you linked (or similar from another company)?


The FDA has actually cleared a couple of iPad apps for diagnostic viewing of radiological images; the ones I'm aware of are Nephosity and MobileMIM, and they can be used for x-ray, CT, PET, and MRI.

I believe these are being cleared as Class II devices under the section 510(k) premarket notification process.

Briefly, Class II devices include those that may be useful for diagnosis but aren't strictly necessary to sustain human life. The distinction has subtlety, so excuse the simplification.

The 510(k) process has you explain that your "device" (hardware or software) fits into some existing category of medical device and is equivalent to those existing devices. It's not a particularly stringent process and generally doesn't require clinical trials.

Note that in this case, "device" refers to the iPad and app approved together as a single item.

So for radiology, certain setups have been cleared, yes.

---

What's interesting is that on the pathology side, digital systems have not been cleared for primary diagnostic use, unlike in radiology. That means that a pathologist cannot diagnose a cancer based on a digital image of a biopsy alone.

Indeed, the FDA has ruled that digital pathology systems are Class III medical devices (necessary to sustain human life) and thus require a premarket approval (PMA) process. Unlike 510(k) approval, PMA requires a clinical trial phase that can take years (and millions of dollars).

While this seems arbitrary, there's actually decent logic to it: pathology often provides the last word in the diagnosis of many diseases, from malaria to cancers. The FDA is being very cautious, as compromised pathology work caused by unvalidated technology would directly impact patients.


Technology has come a long way, though - the VP2290b was a thick and heavy behemoth that was extremely difficult to drive using the signaling technology of the time.

Plus, even the final revision of the T221 required being driven as either two monitors using one dual-link DVI and one single-link DVI or as four monitors by using four single DVI links. Even then, the internal refresh rate was a slightly annoying 48Hz.

The T220/T221 really were ahead of their time, but the convenience and standard sizing of the new generation of "retina" displays show how far we've come.


IMHO, the reason the ViewSonic/T220 wasn't more widely adopted isn't the reasons you listed above. It was that very few graphics cards could push that many pixels back in 2003.


That, and lack of operating system support for high PPI displays. A friend of mine has his PC hooked up to a plasma TV, and it's absolutely astounding how poorly Windows 7's UI element scaling "feature" works, in 2013.


A plasma TV has really LOW PPI... Like it's 50+ inches with only 1080 pixels down the screen... That's like a factor of 10 worse than retina displays...


No kidding. The effective PPI (number of dots that make up the display / size of the image being projected onto your retinas) of a large TV viewed at a distance is pretty high, though, so both "retina displays" and large, low-resolution TVs viewed from a typical distance require large fonts etc. to be legible.


If you could afford the monitor, the graphics card would hardly be an issue.


Those cost like $20k though. If this is available for under $1000, it's a game changer.


It cost approx $8,500 [1] when released.

Also keep in mind that it wasn't until very recently (the last few years) that LCD prices dropped like crazy. People back in 2003 were used to paying these kinds of prices.

It's simply amazing how today you can buy a high-performance computer - even a laptop - for less than $800.

[1] http://gizmodo.com/017019/viewsonic-vp2290b-mega-monitor-lus...


That's a later model. The original IBM T220 cost $17,999 (In 2000 dollars, $24,300 in 2013 dollars). It wasn't until 2 years later that the price dropped below $10k.


Part of that cost was due to the market though. IIRC it and others like it were targeted at medical imaging and to a limited degree publishing.

If you're an oncologist looking at MRI images, $8K is Starbucks money.


Yeah, it was around the $12k mark. But also keep in mind that cheap LCDs were in the $2k range. So not too much, in terms of high-end specialized gear.

Today, low-end monitors are $200, and the high-end ones at $1.2k are uninteresting at best.


Agreed...I'm a bit disappointed that my 4.7" smartphone screen is 1080p but "that's all" they can squeeze into a full sized, full powered 31.5" display? zzz


There's a Viewsonic VP2290b on eBay right now: http://www.ebay.com/itm/190846605823


There were some ThinkPads (and some Dells, too, I think) with very high density displays. Don't see too many of those anymore, except MacBooks.


Somewhere in the early 2000s, vertical resolutions for all laptops magically got locked to 768 lines, and most users neither noticed nor cared.


I think it was during the second half of the '00 decade actually. I remember 1050 vertical being popular at least until 2007.


Correct. Many Thinkpads had 1050 and 1200 until 2007 or so. It went downhill from there.


I had a 15.4" Dell Inspiron 8500 laptop from 2003 with 1920x1200 and it was amazing. When I replaced it a few years later it was not possible to buy a laptop from Dell with that resolution :(


I have a D830 with that resolution and I've used it exclusively until just the last few months simply because it was so difficult to find a (cheap) replacement. I now use a 13" Zenbook that has 1600x900.


I have a T42p with a 1600x1200 display. I am actually afraid to use it for fear it might break, so I use a T41 from the same time period with 1024x768 :(


The iPad retina screen is 264 dpi. Your face will probably be a bit farther from the 22" screen, but still.


To add a better bound, the lowest resolution display Apple has called a Retina Display is the 15" MacBook Pro at 220 dpi. So although the ViewSonic in question was larger than any of the Retina Displays and almost as high resolution as the one with the lowest resolution, and was produced ten years ago, it wasn't actually higher resolution. :)


Even though it existed, it only had niche use cases at the time. Mainly viewing high DPI images.


Ford bought hundreds of them for CAD/CAM work. I had one at SGI to add support for high DPI to IRIX in 2003.


I have been waiting for this for years [1]. I am salivating at the thought and I'm eager to know the price. Will I be able to afford to retire my 30" monitors that were first manufactured eight years ago?

ASUS, if you're reading this: I'd say add an option to drop the integrated speakers because I still want a very thin bezel so that I can orient two or three side-by-side.

Edit: Ars has a photo of the PQ321 [2].

[1] http://tiamat.tsotech.com/displays-are-the-key

[2] http://arstechnica.com/gadgets/2013/05/asus-brings-4k-to-you...


They are using Sharp's 4K IGZO panel. The MSRP for Sharp's own monitor is $5500, the street price is closer to $4800. ASUS will be in the same ballpark most likely. If you can just order 3 of them to put side by side in portrait mode, I envy you :)

BTW - one nasty thing about the Sharp display is that it's LED edge-lit, and the ASUS is most likely the same (the press release says "backlit" but they don't mention local dimming so I'm guessing edge-lit as well). Perhaps I am being an unreasonable purist, but I don't like such corner-cutting on premium displays.


That is disappointingly pricey, but not all that surprising.

Maybe just one to start out with. :) I can rationalize such caution against my desire to splurge thus: with any luck, this will spark a little bit of necessary competition in desktop displays; an area stagnant since 2005's introduction of 30" 2560x1600 monitors. Better to have not invested heavily in the initial offering.

Plus, as you point out, there may be some odd corners cut. Edge-lit? Boo.

On the other hand, I've been dealing with four fluorescent backlit 30" monitors for years. LED backlit would be an upgrade even with an edge-lit design.

Is it too much to ask for OLED or similar 150+ dpi displays on my desktop before the close of the 2010s? 50+ inch. Possibly even flexible/concave. Please?



This is great! I probably annoy my friends with how much I tell them I wish there was an (affordable) high-resolution monitor available.

In fact, the retina MBP's screen is the only reason I switched from Windows to a Mac; that display alone was worth it.

Looking forward to a non-pixelated future. I'm surprised ASUS was the one to take the initiative though!


Apple has at least also been using 16:10. This ASUS monitor is 16:9, so you lose 11% of useful vertical resolution, which matters a lot to us techie folk.


YES, I absolutely love 16:10 displays, but they're always inordinately expensive and hard to come by. I have one very nice 1920x1200 Acer P241w but I'd love to have a nicer one.


I miss the old square-ish monitors!


16:9 is practically useless for development, at least for me. It's a trend I am not enjoying.


Apple has been a bit funky on this front. They ditched 16:10 displays in all of their desktops in 2009, while keeping 16:10 on their laptops.


Not sure the 140PPI is comparable to the Retina.


It is for a 31" monitor. How close do you plan on sitting to it?


No, it's not comparable to 227ppi on a Retina MBP, not even if it is slightly further away. Desktop monitors' ideal distance is usually arm's length, barely further away than where a laptop screen sits. This isn't enough to make a 140ppi screen appear equivalent to a "Retina" screen.


A laptop screen is usually quite a bit closer than a desktop screen because the comfort of typing on it limits how far you can push it away.

I got my measuring tape and put some numbers on this.

I sit an average distance of 75cm away from my desktop screens. The 24" screen in front of me is 52cm wide. So the horizontal viewing angle is 2 * inv_tan(52/2 / 75) = 38.2 degrees, assuming I sit positioned at the middle of it.

I sit about 45cm away from my MBA's screen, with the keyboard in a comfortable position. Maintaining the same viewing angle, the screen's visible-resolution-equivalent width would be 2 * 45 * tan(38.2 / 2) ≈ 31cm - which is just the 52cm width scaled by 45/75. So in order to have a higher effective eye resolution than my monitor, the laptop screen would have to have more than 52 / 31, or about 1.67, times the PPI.

Work that out with the numbers in this thread, and you get a 140ppi monitor being roughly equivalent to a 230ppi laptop screen, in terms of the average viewing angle of a pixel on a mid-range sized display.

IMO that's right in line with the 220ppi of a 15" MBP. Definitely comparable. Especially since a 31" monitor won't be as easy to sit right in front of as a 24" - as your screen gets bigger, you end up with more head movement, and it can get tiring.
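
A rough sketch of that equivalence (my own arithmetic, using the distances above; for the same angular pixel size, the required PPI simply scales with viewing distance):

    # For the same angular pixel size, PPI scales with the ratio of the
    # viewing distances.
    def equivalent_ppi(ppi, from_dist_cm, to_dist_cm):
        return ppi * from_dist_cm / to_dist_cm

    # 140 ppi viewed at 75 cm looks about as dense as ~233 ppi at 45 cm.
    print(round(equivalent_ppi(140, 75, 45)))  # 233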


> you end up with more head movement, and it can get tiring.

Then we ditch full-screen and go back to windows separating the contexts you are working with.

Most of the time, I work with two screens side by side - a 22" and my laptop - separating my contexts, and keep a text editor full-screen on the 22" with a vertical split and two buffers, further minimizing head movement.

A 31" screen seems like a very workable solution.


I ditched full screen a long time ago. My browser window is portrait shaped; most of my editor windows are approximately square. I don't maximize anything except movies and games. I cascade windows with the bottom left corner acting as a selection surface - cascade means these corners are never occluded. I arrange my terminals on the other screen in a grid with gaps, so I can click through to my secondary testing browser.


Well the 30" market is 101PPI and has been for almost 9 years by my memory. A 40% jump isn't too bad after all that time.


> I'm surprised ASUS was the one to take the initiative though

I agree, it seems unlikely that they would create a new panel size all by themselves foregoing the benefit of scale. Perhaps there is another client for that same panel. Perhaps a well-known brand that does not reveal its products until they are ready to hit the market and the product can be presented with much fanfare during a keynote speech.


It's disappointing that every new display I see is still stuck with the archaic 16:9 aspect ratio.

Sure, humans may have more periphery side-to-side. This is great for television. But, when you read a book, I doubt you read it horizontally.

So, if we do a lot of text manipulation with our computers, why not use a taller aspect ratio, like 3:2 or 4:3?

It makes no sense that this display still has a lower resolution than some ThinkCentre monitors from the early '00s. It makes no sense that crap like 1366x768 is still being sold now.

We went from 1280x800 to 1366x768 (fewer vertical pixels), from 1440x900 to 1600x900 (same vertical), from 1680x1050 to 1600x900 (far fewer vertical pixels), from 1920x1200 to 1920x1080 (fewer), from 2560x1600 to 2560x1440 (far fewer).

You could keep the vertical resolutions the same, and simply increase the width, but that would be more expensive to produce, and for the same cost, you'd get a more proportional 8:5 display, so why would you bother?

It's an extremely sad, sad thing.


I disagree with a few points here...

First off, consumer offerings, like TVs and perhaps laptops, are all 16:9, sure, but almost any decent computer monitor (from Dell, HP, Apple, etc.) is 16:10. Even if I'm off base, 16:10 is certainly not hard to find.

Second, while I agree that computing activities lend themselves well to vertical orientation, like you suggest, they lend themselves even more to multi displays and multi tasking.

So the question is: do you really want 2 or 4 or 8 vertical displays, with a physical seam between each of them, or do you want 1 or 2 or 3 horizontal displays, all split in two with your window manager, and with fewer physical seams?

I organize my workspaces vertically, and I have 3x 16:10 monitors split into (roughly) 6 vertical workspaces. I only have two physical seams. If monitors were not horizontal, I would have 6 displays with 5 vertical seams.

So if you're working on a laptop, or have just one screen, I feel your pain. But big computer monitors are better off horizontal (and in threes :)


I'm working on a laptop right now, which is part of the reason for my animosity towards 16:9. 16:10 displays exist, yes, but they're both expensive and limited in range.

I can no longer find a decent 22" 16:10 monitor; they're all 16:9. The only good monitor that's left is the 30" Dell Ultrasharp, which IS 16:10, but it's $1299, which is godly expensive.


There are many 24-inch 16:10 displays - the Dell U2412M, for example, and HP has their ZR24w. You can get these for between $300 and $500.


Many desktop monitors can be rotated 90 degrees, effectively giving you a 9:16 ratio.


Horizontal resolution is important as well.

1080 pixels of horizontal resolution is far too little. 1440 is the same as my MacBook Air's horizontal res, which is kind of ridiculous when I'm paying in excess of $1000 for a monitor.

Again, 8:5 ratios give you the same horizontal res (I don't know why this is such a difficult concept to grasp for some people) with more vertical resolution, in landscape mode.


Earlier you claimed book pages were the ideal ratio. 9x16 is a lot closer than either 8x5 or 5x8.


Yes, it would be, but then again -- computers are dynamic. We manipulate text around on screen, not just read it like we do in a book. Sufficient horizontal space is required for that.


I think it's the best of both worlds. If I can have four files open side-by-side and see a reasonable slice of each, I think that's a win.


But you can do that with 8:5, plus get the same vertical screen height as the 16:9 display.

You can still use xmonad/dwm/whathaveyou like you want; you'll just have more vertical height per slice.


I can't wait to see all the broken applications/unreadable small fonts on my Linux.

I've been excited for a high density display since Apple announced the retina display, but I have a gut feeling that it will take a lot of time for the Linux desktop to support it properly, if ever.


Linux has always handled font scaling better than windows in my experience. Windows has a tradition/culture/tendency toward building user interfaces by positioning UI elements in absolute pixel coordinates. But linux UI toolkits put much more of an emphasis on automatic/scalable layout.


Font scaling works just fine on Linux. The problem is icons. Many Linux programs don't have icons for high resolution displays, so while menus and toolbars look fine, icons are unclickably small.


Both dwm and xfce have worked just fine on my chromebook pixel. This is on debian 7. The only two applications that have given me issues are ones that do their own custom rendering - Chromium and Sublime text. Both of them have some UI text that cannot be resized and does not appear to respect system dpi settings.


It seems ironic that Chromium should give you issues on a Chromebook. Does (non-free) Chrome work?


On ChromeOS chrome works just fine. On debian webpages are rendered perfectly because I can change font settings for those, but the tab titles are drawn with text that is too small. There is no setting exposed currently from what I can tell to control the font of the tab titles.


Have you tried the Sublime Text 3 beta? I know that it is supposed to support Retina under Mac OS X.


He's talking about a Linux machine. Sublime Text 2 seems to support retina displays just fine on Mac OS.


I'm typing this in Debian Linux on a Chromebook Pixel, which has a 239 dpi display, significantly higher than this Asus monitor.

Everything works very nicely once I've tweaked a few settings, specifically adding zoom in Chrome and setting appropriate font scaling in Gnome 3.


Here's my Pixel running Sublime on Openbox:

https://github.com/appsforartists/pixel_webdev

It's almost perfect. There's some descender clipping on the tab label and the subheadings in the Command Palette are tiny.

Most apps I've tried work well using the DPI flag in xserverrc. Ones that use custom ui, like HipChat, are unusable though. Linux really needs a better UI toolkit story.


The better UI toolkit is called Qt.


My screen is 166dpi (1920x1080, 13"), I run Ubuntu, and I have minor problems only with IntelliJ. They use their own font scaling, and it interferes with button placement on some windows.


Could you describe how to adjust the dpi? I would like to try changing it on my current laptop, to get a feel for whether my apps will work in the future.

I realise I should be able to read the docs, but I have found lots of different suggestions on how to do it, while you seem to have done it successfully!


There are a few ways to change the "text scaling factor" http://askubuntu.com/questions/60044/how-do-i-change-the-fon... but that doesn't change the real DPI that X uses. If your hardware supports XRANDR, it should automatically configure your device. If it gets a wrong value or you just want to override it: http://wiki.debian.org/XStrikeForce/HowToRandR12#III.3._Chan...


Unity tweak tool -> Fonts -> text scaling factor.

And for chromium I set default zoom to 125% in browser settings.


I'm using the dpi flag in X. Not sure if it would do the right thing on multiple monitors, but it works great on my Pixel:

https://github.com/appsforartists/pixel_webdev/blob/master/r...


What kind of display is that?



Are you happy with it? I've had my eye on Zenbooks for a while. I was about to pull the trigger on one, but have decided to either wait for Haswell, or consider getting a (silly) taichi 31 when they come out.


I have one, I love it. Running Linux Mint 15 with Cinnamon. One thing to watch for is that some kernel versions don't work well with the function keys or bumblebee (for optimus support). I'm running 3.9.4 now and it works quite well.

edit Also, I should add that I did spend the extra money to put a Samsung 840 SSD and an extra 8gb stick of ram in here


I'll vouch for the awesomeness of this exact laptop as well. I run linux in a VM inside it, though.

It's small, the resolution/screen is great, and has everything I need. I also upgraded the RAM and put in an SSD.


Yes. The screen is the best notebook screen I've ever had.

After upgrading to 10GB of RAM and an SSD, it's a really good computer for a developer.


Most Linux DEs work just fine with higher DPI settings. KDE apps look as awesome as OS X apps.


You only have to wait for the next wayland/weston version.

The GNOME team made a proposal for supporting HiDPI displays similarly to how it's handled in OS X: https://docs.google.com/document/d/1rvtiZb_Sm9C9718IoYQgnpzk...

It's already merged into wayland and weston: http://lists.freedesktop.org/archives/wayland-devel/2013-May...

Hopefully wayland support for gnome 3.10 will be ready, so this can also be supported in Gnome.


I'd love if the high-level APIs dealt with physical dimensions rather than pixel counts, scaled or not...

How hard would it be to just ignore the physical properties of the display?


Arbitrary scaling makes images blurry, even vector graphics. It's a difficult problem to solve because computer graphics are highly aliased - they have many details only one pixel wide.

We have solved this problem with fonts, which have hinting and subpixel antialiasing. But we don't have software to do this for graphics. There is no image format with hinting, and no rendering library to do subpixel antialiasing. Hinting is difficult to do well, fonts are hinted by hand. It is a very time-consuming process and most graphic artists will not be good at it.

If your screen is sufficiently high-resolution, say 300+ dpi, you can get away without these tricks. But if you have a 140 dpi screen, and you want to scale everything 150%, it's necessary or everything will look blurry.

If you only allow pixel doubling, none of this is a problem. You can scale any graphic by 2x without ruining fine detail like thin lines.
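
To make the pixel-doubling point concrete, here's a toy sketch (assuming Pillow is installed; not anyone's actual rendering pipeline): a 1-pixel-wide line stays crisp under 2x nearest-neighbour scaling but smears into grey under a 1.5x filtered scale.

    from PIL import Image

    # An 8x8 white image with a 1-pixel-wide black vertical line.
    img = Image.new("L", (8, 8), 255)
    for y in range(8):
        img.putpixel((4, y), 0)

    doubled = img.resize((16, 16), Image.NEAREST)   # pixel doubling: clean 2x
    scaled = img.resize((12, 12), Image.BILINEAR)   # arbitrary 1.5x scaling

    print([doubled.getpixel((x, 0)) for x in range(16)])  # only 0s and 255s
    print([scaled.getpixel((x, 0)) for x in range(12)])   # grey in-between values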


It should be pretty well supported.

140ppi isn't a massive leap over current displays that are common in laptops. 11.6" at 1366x768 is similar... density wise.


Going to add to what others are saying, this has not been a problem for me with my chromebook pixel. I set my default scaling in chromium to 125%, and have urxvt rendering a nice freetype font that I adjust the size to on the fly depending on the task (I have opted to let Xft believe that the display is 96ppi. I tried both ways and prefer it that way.)


Hopefully compositors will be able to auto-scale windows in the future. Either by some automagic divination that the program is stuck in the '80s, or with a simple rightclick->holy fuck this shit makes me squint fix it pls.


You're talking about Linux, not a BSD. There are many desktop users who are capable of developing support, and I'd be surprised if nobody is already working on similar things for retina.


>You're talking about Linux, not a BSD.

No, he is talking about desktop environments and windowing systems that run on both, and have very little to do with either. What was your remark supposed to be implying?


Finally, some movement in the desktop display market. I've been hoping for this every time I dragged a window from my retina display to the desktop monitor and it lost 3/4 of its pixels.


More than 10 years after the IBM T220/T221 came out, but the IBM monitor still has a significantly higher resolution (3840x2400) and PPI (203.98 vs 139.56).
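
For reference, those PPI figures fall straight out of the pixel counts and the diagonals (my arithmetic, assuming the T221's 22.2" and the PQ321's listed 31.5" diagonal):

    import math

    def ppi(width_px, height_px, diagonal_in):
        return math.hypot(width_px, height_px) / diagonal_in

    print(round(ppi(3840, 2400, 22.2), 1))  # ~204 for the IBM T220/T221
    print(round(ppi(3840, 2160, 31.5), 1))  # ~140 for the ASUS PQ321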


Yes but the problem is its low refresh rate. If you're looking at static images, I'm sure it will do great, but if you are trying to watch a movie, play a game, or do anything that has fast motion, you won't have a good time.


Meanwhile the PQ321 has a refresh rate of ???.


DisplayPort 1.2 should allow for 60hz: http://forums.guru3d.com/showthread.php?p=4608199#post460819...

Gotta assume the monitor supports 60, going to 30 would be terrible.
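
Rough bandwidth math (my own figures, not from the linked thread): 4K at 60Hz and 24 bits per pixel is roughly 12 Gbit/s of raw pixel data before blanking, which fits within DisplayPort 1.2's ~17.3 Gbit/s of usable bandwidth.

    # Raw pixel-data rate for 4K60 at 24 bits per pixel (ignores blanking overhead).
    width, height, refresh, bpp = 3840, 2160, 60, 24
    gbps = width * height * refresh * bpp / 1e9
    print(f"{gbps:.1f} Gbit/s")  # ~11.9 Gbit/s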


60 fps, but the current drivers for both AMD and Nvidia (still in development) make you connect 2 cables and essentially treat it like 2 1920x2160 displays from what I read.


8ms, according to the OP.


8ms is the panel's switching time, which is not the same thing as the slow refresh rate of the IBM monitor; that was due to the bottleneck in the interconnect with the GPU (4 DVI channels and 41 Hz refresh, when the panel surely did 60 Hz). It is a massive data rate, especially for a 12-year-old system.


So a 24" Retina Thunderbolt display at WWDC is pretty much a given, then.


Maybe a bit premature, but I'd say you're spot-on in execution. This is almost certainly how Apple will bring Retina to the desktop: A 24" 1080p @2x, which just happens to be the same resolution as the emerging 4K standard. 27" is too big for 1080p-scaled UI elements, but 24" is just about right (and being Retina, you'll no doubt be able to crank it a bit higher if you give up the pure 2x multiplier).

If they launched this summer, I think it'd be a $2000+ display, which is probably more than Apple wants to enter the market at. In another 6-12 months, I can see them being able to hit something more like $1499, which sounds about right.


I don't think Apple has the same cost structure as everyone else; they could afford to loan a supplier enough money to essentially bring cost savings forward. I could see a $999-1499 Retina Desktop display now, and/or Retina iMac.


$1499 maybe, but pulling off 4K in a 24" panel at the market rate 30" displays are going for is a tall order.


It'll be 16:10 (3840x2400), not 16:9 (3840x2160), but otherwise, yeah, that's what I figure.


What makes you think they'd switch back to 16:10? (I'd love it if they did)


I'd like to claim Special Knowledge, but of course it was strictly a brainfart.


Whoever down-voted you is a hater of the first degree. I would love less wide monitors as well.


I expect they'll launch the Retina Thunderbolt with the new Mac Pro (or whatever the new high-end thing is) because it will have Thunderbolt and will be the top end.

Will that come at WWDC, or later in the year...?


>because it will have Thunderbolt

Uh, the whole Mac line has Thunderbolt, doesn't it?


Not the "current" Mac Pro.


I'd much prefer having higher-ppi 24" screens.


Take what you can get. The biggest obstacle right now is getting over the 1080p "hump"; once 4K is common, you'll see a variety of sizes available.


Honestly I think my next monitor will have to be a 120Hz 4k screen.

I'm not sure I could take the step back down to 60Hz after getting used to it, for me it's a much bigger deal than resolution.


Why is it so important? Is it for gaming, maybe? Because I am pretty sure it's impossible to notice that while working, because there is no flicker between frames anyway. And gaming on a 4K screen (at PC viewing distance) is pretty crazy.


Yes. Quake players swear by 120Hz+ screens. We're all geeks and nerds who still use CRT monitors because those new hip 'flat' things have too much input lag and too low refresh rate. In a fast-paced game like quake, these things make the difference between winning a duel and losing one.

Also in the fighting game community (King of Fighters, Street Fighter), input lag is a killer. If your monitor has lag, you won't be able to chain your combos as well as the pros.

For working and document editing and browsing, 50Hz with slow input time is fine, but for 'serious' gaming it's not, I guess.

It's also one of the arguments I hear why PC gamers consider themselves 'more awesome'. "Why would I play on an xbox, on one of those TVs with the horrible refresh rate, and all the post-processing that adds milliseconds of delay"


I see. But I suppose this kind of monitor isn't made for gamers anyway. Ultra-high resolution is good for sharp text/CAD/etc rendering. A game is just too fast-paced, and textures aren't that high-resolution anyway.


If you are talking about Quake, fans have made plenty of high resolution textures, there is a 2.5GB HRTP at moddb: http://www.moddb.com/mods/quake-epsilon-build

Compare that to the original game data being around 30MB.

Interestingly, fan texture projects almost always have super high fidelity that normal games don't. Most of the Epsilon textures are in the 4kx4k neighborhood.


Gamers may swear by them, but that does not mean there is an actual difference. Have there been any double-blind tests between 60hz and 120hz?


If I ever get fu money I'm going to start double-blind testing everything - screen refresh rates; mouse rates; coffee preparation methods; speaker cables[1]; bitrates for media; compression for media; everything I can.


A site focused on conducting controlled double-blind tests would be amazing.


Blind Busters


Nice idea. I'm planning on using my fu money to run TV adverts countering the nonsense claims made by other 'hydro-nano-bollocks'-pushing adverts.


You won't be able to buy the time to run those adverts, unfortunately.

Kalle Lasn[1] of Adbusters fame has been trying to do it for years, and the networks just say "We won't run adverts contrary to our big sponsors". Yes, even PBS and the CBC say this.

[1] http://en.wikipedia.org/wiki/Kalle_Lasn


Can't say for gaming, but 120Hz is great for watching movies/anything at 24fps. With a 60Hz screen you've got to do something to make that non-integer ratio work (either holding frames unevenly, i.e. 3:2 pulldown, or attempting to interpolate/interlace). At 120Hz you get a nice clean integer ratio, so you can watch 24fps 'as it is'.
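
A little cadence sketch (my own illustration of that point): how many refreshes each 24fps frame gets on a 60Hz versus a 120Hz display.

    def cadence(display_hz, film_fps=24, frames=6):
        shown, prev = [], 0
        for i in range(1, frames + 1):
            cur = i * display_hz // film_fps  # refreshes elapsed after frame i
            shown.append(cur - prev)
            prev = cur
        return shown

    print(cadence(60))   # [2, 3, 2, 3, 2, 3] -> uneven 3:2 pulldown judder
    print(cadence(120))  # [5, 5, 5, 5, 5, 5] -> every frame held equally long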


Anecdotally... The way I perceive framerate is related to the resolution. Take a horizontal panning shot at 720p/60hz... a vertical line might move 2 or 3 pixels per frame. At 2560*1600/60hz the same line jumps 4 or 6 pixels per frame and starts to appear discontinuous in vision. High contrast makes it more obvious. Motion blur removes this in exchange for input lag, but high frame rate is a better solution.


I've had numerous discussions with people about 60Hz vs 120Hz, including one where a friend cited some study (sorry no link) which claimed that, perceptually, people couldn't tell the difference between these two refresh rates. I was incredulous. I'd been playing Quake at a high level for a number of years and anything lower than a v-sync'd 120Hz setup was painful, on the flip-side, playing 120fps@120Hz (CRT) felt so fluid, like water (hard to describe, you have to play on the two setups to feel it).

I haven't had the same experience on any flat panel display I've used till now but I'm on the lookout for a good 120Hz LCD gaming monitor in the hope I get the same experience again.


I recently took a resolution downgrade to get a vg248qe for the 120hz with lightboost. See here: http://www.blurbusters.com/zero-motion-blur/lightboost/

If you ever have a chance to play with a monitor at 120hz, especially with lightboost, it's so obvious that you wouldn't feel the need to have a double-blind test. If you're not playing games, it's most noticeable when scrolling down a page in your browser or moving the mouse.


It's very easy to spot the difference for anyone, even the uninitiated.

Just launch your favorite first person shooter, and rotate the view very quickly. Of course the game needs to be running at framerates above 60 FPS, and with v-sync off. (Or at framerates above 120FPS, with vsync.) The difference between 120 Hz and 60 Hz will be immediately visible through the "smoothness" of the rotation.


> And gaming on a 4K screen (at PC viewing distance) is pretty crazy.

Crazy Awesome.


Well yes, but I'd rather play at 1920x1080 and not have to sell a kidney for a graphics card that can drive the 4K display.


The next generation of graphics cards will probably take the dubious honor of being able to run Crysis (2007) at 1920x1080 / 60fps on a single GPU... [1]

[1] http://www.anandtech.com/show/5261/amd-radeon-hd-7970-review...


You still have that option, I can't imagine any 4k displays would fail to support running at a 2x multiplier given that it is designed specifically for that.


I notice the difference whenever my desktop has quietly switched back to 60hz from 120Hz even in non-gaming applications. I'm not going to claim that it is important. But, the wet-glass smoothness of 120Hz makes me happier :)


Are you using a CRT monitor?

I can't even tell the difference between 120Hz and 60Hz on a monitor, simply because the display doesn't flicker.

If you're using your monitor for gaming, well, that's a different story.


Is there a technology aside from VGA that can drive 4k @ 120Hz?


Would do anything for an affordable ultra high resolution 24" or 30" monitor for development. Think of all the tmux and vim panes you can cram into one of these puppies :)


Seiki has a 4k 50 inch screen for $1300. It has poor inputs and is currently sold out, but it signals things to come. http://www.tigerdirect.com/applications/SearchTools/item-det...


Disinclined as I am to use Apple's "Retina" marketing term, it is a conceptually useful distinction. On a given display at the real world distance you use it for work, you can't see the pixels.

That is what we as users want, and it is also the limit of what is useful: packing additional pixels into the same surface area doesn't help anything.

This monitor isn't quite there. But it is still a big leap in terms of progress over the standard 2560x1440 27" panels currently used by Apple, Dell, etc.

We already have similar monitors in Japan for like $4000... it will be interesting to see how this one is priced.

I have a couple 30" and a newer 27" monitors, but I'd trade all three of them for one 24" truly retina monitor.
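
For a rough idea of where that line sits (my own back-of-the-envelope, assuming the common ~1 arcminute-per-pixel threshold for not being able to see pixels):

    import math

    def retina_distance_cm(ppi, arcmin_per_pixel=1.0):
        # Distance at which one pixel subtends the given visual angle.
        pixel_in = 1.0 / ppi
        return pixel_in / math.tan(math.radians(arcmin_per_pixel / 60)) * 2.54

    print(round(retina_distance_cm(140)))  # ~62 cm for this 140 ppi panel
    print(round(retina_distance_cm(220)))  # ~40 cm for a 220 ppi rMBP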


Can anyone explain how 10 bit color would work for a monitor? Would we even notice it since most image formats are 8-bits-per-channel... and what does the OS do?

Is the monitor able to more accurately show the difference between the sun vs a desk-lamp? (Where today both would appear as RGB 255 in a photo, unless you took 3 exposures and combined them into a 32 bit HDR file... but anyway...)


10bit colour eliminates banding on gradients.


I think you can take advantage of the extra bits for color calibration. Your source is still 8 bits, but those 8 bits are converted to 10 in a way that provides better accuracy.


I'm still trying to wrap my head around how this works, I guess. If I am viewing a JPEG that is only 8 bit color (per channel), how does a 10 bit monitor make that any better without having a 10 bits per channel image source?

Or what if I'm in Photoshop in 32 bits per channel mode, doing some HDR work... does this 10 bit monitor mean I'm seeing slightly better colors (more dynamic range) than my old 8 bit monitor?


The data channel from GPU to display is most likely still 8-bit. The path from the display controller to the panel, however, is 10-bit (some monitors even use 12). This allows the display controller to apply gamma, contrast, and brightness curves to the incoming data without sacrificing any color resolution. When you make the same adjustments on your PC, the GPU has to dither the result back to 8-bit for transmission.
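
A toy illustration of why the wider internal path matters (my own sketch, not how any particular controller works): push an 8-bit ramp through a gamma curve and count how many distinct output levels survive when you re-quantize to 8 bits versus 10.

    gamma = 2.2
    out_8 = {round((v / 255) ** gamma * 255) for v in range(256)}
    out_10 = {round((v / 255) ** gamma * 1023) for v in range(256)}
    # Noticeably fewer distinct levels survive the 8-bit path -> visible banding.
    print(len(out_8), len(out_10))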


I'd assume the JPEG can have some embedded colour profile. This profile specifies a transform between the JPEG's 24bit colour space into the monitor's 30bit colour space.

For example, consider two JPEGs with embedded colour profiles. "Red" (255,0,0) in one image will map to some "Reddish" 30 bit value, while (255,0,0) in the other will map to a slightly different 30 bit value.


Anybody have more info on this "IGZO" tech? My main interests are viewing angle, color stability, and color reproduction.


I'm guessing they're using the Sharp panel announced last year. http://www.youtube.com/watch?v=L1wRivEeIU8


The IGZO replaces the a-Si transistors, while the viewing angle, color stability, etc. depend on the LCD, color filters and the backlight.


I just bought a 2560x1080px monitor only to discover that there is no way to use it with my laptop. It seems that most laptops are limited to 1920x1080 output even though the graphics card may support higher resolutions.


I had a similar problem at work getting my external (27" 2560x1440) monitor working with my laptop (a Sony F-series) running Ubuntu. We solved it by lowering the refresh rate used by the HDMI output. It took quite a bit of tweaking to find the right values, and frankly I've forgotten what most of it meant, but this may help you:

In the "Monitor" section, we defined modelines:

    ModeLine       "2560x1440_33" 164.99 2560 2688 2960 3360 1440 1441 1444 1468 -hsync +vsync
    ModeLine       "2560x1440_30" 146.27 2560 2680 2944 3328 1440 1441 1444 1465 -hsync +vsync

Then, in the Screen section, I appear to have modes selected from there:

    Option         "metamodes" "DFP-0: nvidia-auto-select +0+360, DFP-1: 2560x1440_33 +1920+0"  # ... and a bunch more that seem to be auto-generated by nvidia-settings?

The downside of this is that I can't connect my laptop to use any of the projectors in our conference rooms, but it's a small price to pay in order to be able to drive the larger screen.
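
For what it's worth, the numbers in those ModeLines hang together as pixel clock = htotal x vtotal x refresh, which is why dropping the refresh rate is what lets the mode fit through the link (my arithmetic, not the output of any particular tool):

    # pixel clock (MHz), htotal, vtotal taken from the two ModeLines above
    for clock_mhz, htotal, vtotal in [(164.99, 3360, 1468), (146.27, 3328, 1465)]:
        refresh_hz = clock_mhz * 1e6 / (htotal * vtotal)
        print(f"{clock_mhz} MHz -> {refresh_hz:.1f} Hz")  # ~33.5 Hz and ~30.0 Hz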


If your laptop has a DisplayPort output you should be fine; alternatively, try dual-link DVI if your laptop supports that.


I can only slobber at the thought of 24 (3 monitors * 8 columns per monitor) columns of emacs open at once.

That's almost 3 KLOCs.


8ms GTG response time is probably fine for most users but isn't great. Even Monoprice's cheap IPS displays are 6ms.


I swear technology is working against me; I just finished buying my IPS panel monitors.


Random guess - under $1,000? Would that be an okay price for it?


I doubt that a 31.5-inch 3840 x 2160 monitor will be less than $1,000. Dell's U3014, which is 30-inch and 2560x1600, is $1,200.


We've seen some $400 2560x1600 monitors, and also a $1,300 4k TV. That Dell one sounds like it's from the "old era" of high resolution monitors. I could see it go under $1,000, if Asus really wanted to turn it into a mass market product, and doesn't just want to sell 1,000 of them.


And that one is worth every penny (looking at it right now).


You can get the same panels for 1/3rd to 1/2 the price from the usual Korean suppliers.

I bought two.


Fortunately I am no longer in the spot where I have to order things from halfway around the world to save a few bucks but I can just go to dell.com/dell.nl and order what I need for my job.

I'd feel that if I still had to scrape the bottom of the barrel at 48 in order to get the tools I need for my trade, I'd be doing something seriously wrong :)

It's good those cheaper options exist, me, I just buy what I need at the list price if it's tools, I have one of these at home and one of them at work. It may be a bit more expensive but the time spent and possible frustration on trying to save $500 on something ordered from far away just is not worth it to me, especially not if it were to break at some point during what would normally be the warranty period. I guess at your price point you could simply order yet another one as a spare :)

I have the same attitude towards cars (even though I'm perfectly capable of fixing just about anything), food and a few other items. In some ways this is a tremendous luxury and I am well aware of that.


I hear you. There's a lot of things I don't bother to comparison shop for these days. I'm in that place where I can order them from Dell and pay the 300% markup.

I just felt that earning effectively $1600/hr in terms of time spent selecting a weird Korean monitor was an effective ROI.

When I'm billing that much I guess I'll feel differently :)


Quality is about on par with price for those Korean monitors.


They're literally the same panels as the Dells and Apples.



Can't be. The 30'' WQXGA monitors are ~$1200, the ultra cheap Koreans at ~$500. My guess would be $3000+


Awesome progress! Now also do it with a 24" monitor please; 31" is a bit on the big side for my desk setup at home.


Maybe this will trickle down to the Oculus? We just got one and yeah, the only thing missing is pixel density.


> Inputs/outputs DisplayPort, 2x HDMI (optional), RS-232C, 3.5mm audio-in, 3.5mm audio-out

RS-232C?! Cool!


RS-232 is basically required for any serious display. Why? Integration into videoconferencing and other systems with Crestron and other units. This channel is used for switching them on and off, changing the mode and input, etc.


Finally!!! I've read the article multiple times and no price? Did I miss it?


8 ms response time... not ideal for PC gaming I don't think.


Refresh rate is probably 30Hz I'm guessing. If so...ick.


I don't think I would need more for a terminal and a text editor, really.


Specs on page say 8ms response time.


That's not relevant to what he's saying.


so is this IPS or TN?


IGZO


oh, so it isn't even TFT technology?



neat, I didn't know about this, thank you!


Why can't they make a 21 inch 4k monitor...



