Mini-LED, Micro-LED and OLED displays: present status and future perspectives (nature.com)
66 points by zeristor on July 6, 2020 | 62 comments


One elephant in the room that wasn't mentioned:

Samsung is years ahead of everybody with self-emissive quantum dots and new types of TFT matrices.

Samsung can commercialise cheap self-emissive qdots within 2-3 years if they want, and then micro-LEDs will automatically be obsolete, probably with the exception of displays for VR goggles.

Samsung has another ace up their sleeve that they don't talk about publicly: they know a trick for growing GaN LEDs on amorphous substrates. Their papers from 2012 showed them already at quite an advanced stage.

If they can get far enough with the latter, they can disrupt not only the display market but pretty much everything LED.

If LEDs can be grown on a much cheaper, less finicky substrate than sapphire, we can talk about them getting n times cheaper.


Link?

> Samsung can commercialise cheap self-emissive qdots within 2-3 years if they want

Fantastic if true, but I'll believe it when I purchase one for a price competitive with similarly performing technologies. I've seen enough promised display technologies that were "2-3 years away" to be very skeptical.



So if Samsung is so far ahead, why aren't we seeing it in present displays? Is it because it's too expensive? Or are they hoarding it for a rainy day?


If the technology is so much better, it doesn't make sense to hoard it. Put it out there and dominate the market. The longer you sit on it, the longer you give your competitors to develop similar or better alternatives and the more you waste your advantage.

The most likely answer is that they can't manufacture it at scale at reasonable price points currently.


You should probably tell that to Mr. Lee :)


> Or are they hoarding it for a rainy day?

Very much so; that's how "big semi" has operated through the years.

R&D offices produce motherlodes of research for directors to pick from at a moment's notice, depending on the market situation.


So, then, where's Intel's hoarded research? Seems like they're in the best possible "market situation" to want to be seen as innovating, right now.


Take a look at how Samsung handled OLED screens. They had an immeasurable advantage over almost anybody early on, but they did not want to destroy their LCD screen business, which was still making good money, so they only made OLED at scale for small smartphone screens and for their own tablets/laptops, which did not compete with their own OEM sales.


When you say years ahead, are you talking about production "at high scale" or quality of product? Everyone seems to have quantum dot sets now (TCL, Vizio, LG, Samsung, Sony "Triluminos") and I don't see anything earth-shattering between the offerings.


Those displays are not really self-emissive: they use quantum dots to enhance the RGB filters, but they are still LED backlit or edge-lit, and they still suffer from backlight bleed like a regular LCD.


Very excited for the new display technologies on the horizon. Sure, current TVs and monitors are pretty good, but I can't wait to see the picture quality that we're going to get. Linus Tech Tips has a pretty easy-to-understand video (much higher level, though) on this stuff. He goes into the manufacturer roadmap, which I think is pretty good to know in terms of the time horizon for these technologies. https://www.youtube.com/watch?v=RTTiQeXXrhI


Thanks, that was an excellent video.


This report doesn't seem to mention burn-in directly. This is IMHO the Achilles' heel of the current crop of OLED TVs. This is a very hidden fact. Everyone talks about how great they are and says they're the hands-down winner among popular TVs, but after I owned one for 2 years it displayed horrible burn-in. They are fantastic while they last, then they are awful. It would be nice to know how mLEDs, uLEDs, and the other new tech size up for burn-in.


Seriously, it is not. I have an LG C9 and bought it on the recommendation of friends who game on theirs. If you are going to leave it on the same screen all day long, like a news station that has a permanent ticker or channels which insist on a permanent logo, there are better choices.

You have to go out of your way to force burn-in; there has been some good third-party testing [0].

Oh, as for the TV: the colors are astounding, and its thinness is just a bit on the silly side. Seriously, from one third of the way up from the base it is as thick as an iPad.

I expect micro-LED to overtake it one day, but for now, if you are a movie aficionado there is no substitute. This is especially true when dealing with letterboxed movies, as there is no bleed into the black borders.

[0] https://www.rtings.com/tv/learn/real-life-oled-burn-in-test


I have a C9. Best television I’ve ever owned. I’ll be heartbroken if it suffers burn-in in a couple of years.


C9 is still pretty new, isn’t it? My C6 burn in is awful.


I have a C7, and after running it maybe 4-5 hours/day on average I haven't seen burn-in. If anything, not watching cable news is an improvement to our lives, and when I eventually upgrade I can use it for gaming, where I won't mind the burn-in, rather than for anything remotely serious. LG's input and panel latency with OLED is exceptionally low, and that is one reason I paid for such a panel.


Another C7 user here checking in, had it for almost 3 years now - about 5hrs/day of mixed use TV/movies and it's still awesome.

No sign of burn-in, and I check it with color slides every few months. I think the C6 panels were much more prone to burn-in than the C7.

Some people with C7 panels had problems with a bug where the automatic compensation program does not run, which greatly accelerates burn-in, but I believe LG is replacing TVs for people who hit that.

My C7 looks absolutely stunning with good 4K/HDR sources.

I'm happy with my purchase every day and would never think about going back to crappy LCD.


I use my B6 for watching TV (streaming) and for gaming. I have absolutely no burn-in of any kind. Also see: https://www.rtings.com/tv/learn/real-life-oled-burn-in-test


You need to distinguish between burn-in and image retention. Are you seeing burn-in or just image retention?

Burn-in: When you have an image that stays even long after you've switched (e.g., the news channel logo is permanently stuck in the corner).

Image retention: When a static image is briefly retained when you switch the image to a mostly-even color (e.g., a logo stays after you switch to a mostly-gray background, but fades over a period of minutes/hours).
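To make the distinction concrete, here's a toy model (the function name and all numbers are made up, purely illustrative): retention is a temporary offset that decays once the stimulus is gone, while burn-in is a permanent floor that waiting never removes.

    # Toy model: visible "ghost" level over time after switching away
    # from a static image. Retention decays; burn-in is a permanent floor.
    def ghost_visibility(hours, retention=0.05, burn_in=0.02, half_life=0.5):
        fading = retention * 0.5 ** (hours / half_life)  # retention fades
        return fading + burn_in                          # asymptote: burn-in

    for h in (0, 1, 4, 24):
        print(h, round(ghost_visibility(h), 4))
    # 0 -> 0.07, 1 -> 0.0325, 4 -> 0.0202, 24 -> 0.02 (the burn-in floor)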

I've seen image retention on my 55EG9100 (even older than another poster's 'burned in' C6), but I've never seen burn-in. And I've played lots and lots of video games that have static HUDs.

Ostensibly the newer OLED panels have even fewer image retention problems than the older ones. I've never noticed even image retention on my newer C8 OLED.


Wasn't the problem with OLEDs simply breakdown? They lose some percentage of their brightness and color depth every year until they look old and washed out. And the worst part is that each color degrades at a different rate, so the color balance gets wonky as the panel ages.


OLED aging is one of the primary reasons that burn-in is a thing. There are a ton of algorithms that try to normalize this so that as the panel ages it just uniformly dims and you don't get hot spots (or cold spots). I don't think they're complex enough to handle stuff like news logos, though. Generally speaking, it's assumed that the pixels are all exercised somewhat uniformly, I think.
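Nobody outside the panel makers knows the real compensation algorithms, but here's a minimal Python sketch of the "uniformly dim" idea described above (the function name, the stress model, and the decay rate are all made-up assumptions):

    import numpy as np

    # Sketch: track accumulated per-pixel stress (on-time weighted by drive
    # level), derate the whole frame to what the most-worn pixel can still
    # output, then boost each pixel to cancel its own individual decay.
    def compensate(target, stress, decay_per_unit=1e-7):
        efficiency = 1.0 - decay_per_unit * stress  # remaining output per pixel
        headroom = efficiency.min()                 # worst pixel limits everyone
        return np.clip(target * headroom / efficiency, 0.0, 1.0)

This makes the panel age as one uniform dimming rather than a patchwork, but as noted it can't do much once a logo-shaped region is drastically more worn than its surroundings and the headroom is gone.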


Whether burn-in is still an issue depends very heavily on how you use the OLED TV. I've had my OLED TV for almost three years and have no burn-in. I know there are tests that show you can provoke burn-in, but those use rather extreme scenarios. If you don't have static elements on for hours at a time, you generally shouldn't have an issue. One thing I've heard, though it doesn't apply to my usage, is that putting on TV stations with static chyrons in bold colors can lead to burn-in.


I first noticed the yellow button from the YouTube app burnt in (after about three years). Then it was a smear of green in the middle, probably from all the yellow highlighting of objects in The Witcher (Xbox game). Also, some lines from subtitles are starting to show, and lately I've been able to read "Netflix" in the lower right corner.

It seems there are some UI patterns involved that could perhaps, at least for the webOS apps, be avoided through interface standards.


Here is one of those tests:

https://www.rtings.com/tv/learn/real-life-oled-burn-in-test

Is it extreme? Sure, but it's not contrived: five hours on, one hour off. If the TV in the office break room were an OLED, it would absolutely exhibit this level of burn-in.


> This is a very hidden fact

Burn-in risks crop up in every online conversation about OLEDs, at least since LG's 2006 line was released.


How do you use your OLED TV? As a computer monitor, or for some other similarly static image?


Is OLED burn-in worse or better than plasma?


Same as on a B/W CRT.


I'm somewhat disappointed in the lack of progress on OLEDs over the last few years. For TVs, LG is the only company that actually produces panels; every single OLED TV uses theirs. And even though there are new models every year, all OLED TVs of the last few years use the same old panel.


That is not true at all. While LG is the only OLED TV panel manufacturer on the market, they have been making progress every year in their WOLED panels, with many more technical innovations in the pipeline. Older panels drop in price as they improve yield; that is why you are starting to see affordable OLEDs. The panels differ in colour accuracy, refresh rate, QA, max brightness, etc. To say they are the same panel every year gives very little credit to LG's WOLED engineering.


My information is from various reviews, and according to those the image quality has stayed pretty much the same over the last three years. The image processing has changed, but the panels seem to be pretty much the same, within the differences you get per panel anyway.


Max brightness progress has been marginal at best for the past 4-5 years.


I’ve never looked at my C9 and thought “I wish it was brighter”. Does seem to bother some people though so YMMV.


Because that comes at the expense of the panel's longevity, a trade-off most people would not prefer?


That same old panel is still the best display you can buy.


One major question I have: how come we haven't seen any innovations in desktop monitors? Most of the innovations come first to TVs and smartphone displays. It seems like the pace of desktop monitor innovation is slower.


I'm more interested in why TVs and monitors are considered separate markets. The difference between a TV and a monitor is software, and maybe a DisplayPort input for the highest resolutions or refresh rates. This large TV I use as a monitor is great, apart from the lack of suspend and an annoying banner that pops up every time the resolution or audio inputs change. And it was much cheaper, assuming I could find a 43" 4K computer display at all in my region.


Spot on. I switched to a 40" 4K Philips VA panel (BDM4065) a few years back and can't imagine switching back to anything smaller. Quite cheap, too. It might not be suitable for colour-sensitive work, but it's perfectly serviceable for VS Code and a browser.


Total PC, 2019: 255.7

Mobile phone, 2019: 1,743.10

And aren't 4K, higher refresh rates, curved displays, ultra-ultra-wide, very thin designs (comparing my new monitor to my old one), and lower energy usage innovations?


My guess is that most desktop monitor buyers are very price sensitive and largely quality insensitive.


Smaller market?


Very interesting. One thing I don't fully understand is the difference between mLED and uLED. As I understand it, mLEDs have an array of separately controllable backlights, but what exactly is uLED?


miniLED means that they place small LEDs, basically equivalent to the present level of LED technology, behind an LCD array. Instead of the normal LCD backlight design, where you shoot LED light sideways from the edge of the LCD module into a series of films that redirect the light toward the normal viewing axis, the LEDs are placed in a regular grid behind the LCD array itself. You need a large number of these mini-LEDs to cover the entire surface of the display panel, as well as some diffusers to reduce hotspots.

The main advantage is that the peak brightness you can get using this technique can be quite high, and in a sense you can get high contrast ratios by selectively dimming certain zones. Comparing two zones, the contrast can be quite high, but within a zone the contrast ratio is still the same as a normal LCD.

uLED means that you are attempting to place one LED chip for every pixel, equivalent to how they build those large LED signs at sports arenas. In the smartphone case, the LED chips would need to be crazy small, on the order of 10-50 microns, so the name uLED (microLED) makes sense.
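To make the zone behaviour concrete, here's a rough Python sketch of the simplest possible backlight controller (the zone count, the max-pixel policy, and the names are all simplifying assumptions; real controllers also smooth zones over space and time to hide blooming):

    import numpy as np

    # Each zone's backlight is set to the brightest pixel it must serve, so
    # a dark pixel sharing a zone with a bright one still sits in front of
    # a lit backlight, limited only by the LCD's native contrast.
    def zone_backlight(lum, zones=(30, 30)):
        zh, zw = lum.shape[0] // zones[0], lum.shape[1] // zones[1]
        blocks = lum[:zones[0]*zh, :zones[1]*zw]
        blocks = blocks.reshape(zones[0], zh, zones[1], zw)
        return blocks.max(axis=(1, 3))  # per-zone backlight level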


FALD had been a thing long before mini-LED was spun up. As far as I can tell, mini-LED is just a marketing term for a slightly higher number of dimming zones.


Mini-LED is supposed to offer quite a lot more zones, but it looks like the controllers/algorithms aren't ready yet. Regular FALD TVs generally have 16-800 zones.

The TCL 8-Series TV is the only mini-LED one I know of, and it has 25,000 mini-LEDs.

Unfortunately, they are grouped such that there are only about 900 zones, which is very disappointing. Hopefully the next generation of mini-LED will deliver on the hype.


If the number of dimming zones approaches the number of pixels, that could make a pretty big difference.


Approaches, but very slowly. Jumping from 1,000 zones to 10,000 zones might seem impressive, but it is still a far cry from the 8+ million pixels of a typical 4K screen.
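The arithmetic (assuming a standard 3840x2160 panel):

    pixels = 3840 * 2160  # 8,294,400 pixels on a 4K panel
    for zones in (900, 10_000, 25_000):
        print(zones, "zones ->", pixels // zones, "pixels per zone")
    # 900 zones -> 9216; 10,000 -> 829; 25,000 -> 331

So even an aggressive mini-LED backlight is still dimming blocks of hundreds of pixels at once.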


The pixels are self-emitting (similar to OLED).


One problem I see with microLEDs taking off is that the primary parameter the microLED design helps with is power for a given brightness. If the uLED tech costs more than a normal smartphone display (which seems likely to be true for some time), then I'm not sure improved power efficiency would cause a big wave of adoption similar to what has happened with OLED over the past 3-4 years. Compared to LCD, OLED tends to have both better color gamut and contrast ratio, and when using dark-mode UIs the power savings can be pretty high. It's hard to see how uLED could differentiate itself from OLED except for showing HDR images in a power-efficient way. I'm probably missing something else about why uLED is good though?


> I'm probably missing something else about why uLED is good though?

The fancy VR/AR goggles category. It's pretty much the one and only way to make battery-powered goggles usable outdoors.

The one who gets there first with patents wins the game.


Also the "doesn't degrade/burn-in/shift whitepoint after a year of ordinary use" category.


This is the main concern I think. It's what's always being brought up with OLEDs.


Yes, and OLED has a decent roadmap of cost reductions in the next 3-5 years, along with other quality improvements, so I don't think uLED would replace it on smartphones. Not to mention the possibility of a screen without the cover glass: a smartphone that is lighter and thinner, and whose panel does not break when it falls.

But for desktop applications and VR glasses, uLED will have an advantage, with no burn-in issues.

I do sometimes wonder if WOLED could scale down to those needs, but I don't see any intention from LG of going down that path.

Correct me if I am wrong.


Why not sandwich two LCDs together: a lower-resolution B&W one, and then the 4K color one? Align them so one B&W pixel switches behind the three RGB subpixels.
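Back-of-envelope for why this is attractive: each layer attenuates the backlight independently, so to a first approximation the contrast ratios multiply (the numbers below are illustrative assumptions, not measurements):

    color_contrast = 3000   # assumed native contrast of the color VA layer
    mono_contrast = 3000    # assumed contrast of the added B&W layer
    print(f"stacked: {color_contrast * mono_contrast:,}:1")  # 9,000,000:1

The catch is that even a fully "open" LCD layer absorbs well over half the light, so the backlight has to be driven much harder, which is the power problem mentioned below.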


Hisense has one coming to market: a dual-layer LCD.

I've read somewhere (that I can't find currently) that, due to the second layer, it is much less energy efficient than current backlit TVs and may fail energy standards from various organizations and not be allowed to be sold (in the EU, maybe).

OLED is coming down in price though; it'll be interesting to see if the dual-layer sets are price competitive.

https://www.cnet.com/news/look-out-oled-hisense-unveils-dual...

Great idea, but I don't think I'd buy the first gen.


These exist; they're called dual-layer LCDs. IIRC they're mostly reserved for professional applications, with some consumer stuff on the horizon.


People want HDR. It's hard to make a brighter display by removing more light.


They also want the better black levels that OLED provides. It's relatively easy to get LCDs bright, but then the black level suffers.

But yeah, I've read these dual-layer LCDs are horribly power inefficient.


How do you get blacker than 'no light emitted' when the OLED pixel is off? In other words, how does the light bulb in your desk lamp produce less light than zero?


Please finally replace the LED-backlit LCDs of the last 20 years, whose backlight bleed is ridiculously bad.



