
I don't mind CRTs. What does bother me somewhat is the narrative that they're somehow better than the displays we use today; your use case has to be so narrow and specific to benefit from a CRT in any meaningful way that most people may as well ignore that they ever existed. They're less color accurate, have horrible artifacting ("but it makes pixel art look better!"), force you to choose between refresh rate and resolution, are massive and heavy, and consume more power than most people's entire computers.

If you're a retro fetishist or an analog gaming nut, then I could see how you might get something out of it. For everyone else, save your money and buy anything else. The price that good CRTs demand is simply absurd when compared to their cheaper, flatscreen alternatives.




Saying that modern monitors are more color accurate than CRTs really undersells the difference. Having been a professional photographer before becoming a button presser, I feel the need to step in here. This narrative is so, so horribly wrong.

Nearly any given color CRT (of the '90s era) has a far flatter visual spectral response than today's ubiquitous displays. Each color has meaningful contrast, whereas the typical blue-LED-backlit panel is heavily weighted towards blue.

Even if a new display measures better, your brain interprets color from a tube better.

The only established displays today that approach (or exceed) that flatness are genuine three-color OLED displays, which are prohibitively expensive when packaged as a computer monitor.

I have some mild hope for new display technologies that I'm hearing about, but I'm betting we're at least five years away from displays with that color quality being common.


There's nothing quite like a calibrated P22 phosphor display in a dark room.


Adding to com2kid's point (I had a CRT that I ran normally at 1600x1200, 80 or 85 Hz): current displays still have problems with blacks. VA suffers from black crush and gamma shift, IPS glows. Both suck in a way that CRTs did not. Don't even mention TN. OLED and microLED monitors are not really here yet.


CRTs are relevant to GameCube players, in particular for Super Smash Bros., because of latency.

For example, playing on 720p+ displays causes upscaling, which introduces variable latency into the game and renders it unplayable at a competitive level.

These days there are low-lag monitors you can get instead, but demand for CRTs is still high since they're cheap.
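
As a rough back-of-envelope sketch of the latency point above (a minimal Python illustration; the buffered-frame counts are illustrative assumptions, not measurements of any particular scaler or display):

    # Melee runs at 60 fps, so one frame is ~16.7 ms.
    # How many frames an external upscaler buffers is an assumed example value.
    FRAME_RATE_HZ = 60
    frame_ms = 1000.0 / FRAME_RATE_HZ  # ~16.7 ms per frame

    for buffered_frames in (0, 1, 2):  # 0 = CRT-style direct scan-out
        added_ms = buffered_frames * frame_ms
        print(f"{buffered_frames} buffered frame(s) -> +{added_ms:.1f} ms of display lag")

Even one buffered frame is a full in-game frame of extra lag, which is why competitive players care so much.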


Until recently, you could only get LCDs at 60 Hz.

Over 20 years ago we had 1600x1200 CRTs at 85 Hz.


To some degree this was negated by phosphor persistence leading to smearing, unless the monitor was built to work only at those higher frequencies.

I had a big Barco that had this to a ridiculous degree. That thing was also scary to degauss.



Pretty much all beyond-60-Hz LCDs up until ~3 years ago were horrible trash (and there is a fairly decent argument to be made that they're still trash).


I ran my Sony monitor at 100 Hz back then. The mouse movement sure got much smoother when I clocked the PS/2 port to 200 Hz.
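
For a sense of why the higher sample rate helps, here is a minimal Python sketch of mouse reports per displayed frame (60/100/200 Hz are standard PS/2 sample rates; pairing them with a 100 Hz refresh is just the scenario from this comment):

    # Mouse reports available per displayed frame at a 100 Hz refresh rate.
    refresh_hz = 100
    for sample_rate_hz in (60, 100, 200):  # common PS/2 mouse sample rates
        reports_per_frame = sample_rate_hz / refresh_hz
        print(f"{sample_rate_hz} Hz sampling -> {reports_per_frame:.1f} reports per frame")

At 200 Hz every frame gets fresh mouse data, whereas at lower sample rates some frames reuse stale positions, which reads as judder.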


What is analog gaming?

(The output lag in LCD displays is really significant. Switching between LCD and CRT feels remarkably different in games where reaction speed matters.)


I think he means vintage gaming, especially arcade games that used 15 kHz monitors.


Similar things can be said about records and cassettes. But there's a retro nostalgic vintage contingent that will pay a premium for them.


>They're less color accurate,

No.



