That annoying shade of blue (2018) (bsago.me)
267 points by todsacerdoti on Sept 15, 2022 | 126 comments



A lot of these colours were probably designed for white backgrounds; the default xterm background colour is white, and many terminals copied that. My old SparcStation uses a white background colour from the moment it boots, etc. I'm not sure how common it was in older systems, but it was definitely a thing.

In Vim, by the way, this has been a solved problem for ages if you use "set background=dark".


Having a dark background used to be the default in the early CRT days; colored/bright backgrounds look very ugly when you can see the distinct scanlines.


If I remember correctly, back then the black background wasn't perfectly black, but something like a very dark green. I have recently replayed (on an LCD) games that I used to play as a kid on a CRT monitor, and in some cases the colors look extremely ugly on the LCD. I don't know why.


Analog monitors like the CRTs back then suffered from artifact colors[1]; LCDs do not. Games that were designed to account for these artifact colors do indeed look worse on LCDs.

[1] https://en.wikipedia.org/wiki/Composite_artifact_colors


Note that this is for the crappy CRT TVs most people used (some home computers either connected to such TVs or used similarly bad CRTs).

This was not an issue on computer monitors that were meant to display crisp text.


Less so on computer monitors, but the raster on a CRT is not pixels. So graphics made for CRTs look worse on LCDs. Even low-res games on a CRT look less chunky than on LCD.


Unless the CRT has some issue (out of focus, or whatever), the graphics should look 99% the same as on an LCD, especially low-res graphics that get double scanned.

Here is a photo I took some time ago of a CRT and a flat panel I have, with the same software showing the same picture (with the picture at the top) - both monitors are of relatively high quality:

https://i.imgur.com/xlsGRRw.png

In both images you can see the pixels clearly, and that was at 640x480 (which doesn't get double scanned) for the CRT - they also look pretty much the same (the small color fringes are because of my awful phone camera; in person they are almost identical).

Here is another one, same shitty camera (hence the glow) but at the top left and right side (where there isn't much glow) you can see how clear the pixels are - this is double scanned:

https://i.imgur.com/10oa6xe.jpg

The main difference between decent CRT monitors and flat panels is that with the CRT monitors you get to use a bunch of different resolutions without visual degradation from scaling. Also scanlines, but that depends on the CRT (more visible in larger CRTs with lower resolutions) and you get a somewhat similar effect with some flat panel monitors (especially larger ones). Beyond that when there isn't any scaling (i.e. both display the art pixels at 1:1 ratio) there will only be very minor differences. Anything else would be due to a faulty CRT (nowadays many are in bad condition due to their age) and not really inherent in the tech.


CRTs were never crisp at decent resolutions (like 1280x1024 on a 17").

LCD sharpness was basically mind-blowing in comparison at the time.


Not true at all - good CRTs could be overclocked and do HD no problem. LCDs looked like shit in comparison.


The article mentions this: the colours were not designed at all. It was an accident.


I think that is at least misleading, if not inaccurate. They may not have been "designed" in the sense of a graphic designer choosing the most pleasing or user-friendly colours, but there was logic behind the choices. They're the eight combinations you get by setting each of the three RGB channels to either zero or maximum: the three primary colours (red, green, and blue), the three secondary colours (yellow, cyan, and magenta), black, and white.

It's not great for terminal text on a black background, but it's better than the CGA palette, IMO.
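
To make that concrete, a quick Python sketch enumerating the eight combinations (the names are just the conventional ANSI ones):

    from itertools import product

    NAMES = {(0, 0, 0): "black", (255, 0, 0): "red",
             (0, 255, 0): "green", (255, 255, 0): "yellow",
             (0, 0, 255): "blue", (255, 0, 255): "magenta",
             (0, 255, 255): "cyan", (255, 255, 255): "white"}

    # each channel is either off (0) or at maximum (255): 2^3 = 8 colours
    for r, g, b in product((0, 255), repeat=3):
        print(f"#{r:02X}{g:02X}{b:02X}  {NAMES[(r, g, b)]}")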


1 bit each for R, G, and B, plus one for intensity IS the CGA text palette.

Since it’s a digital signal, levels are dictated by IBM hardware. I remember there were numerous different monitors with slightly different palettes (defined by resistors between reference voltages and the analog electron gun power).


They were designed for sure; just by an engineer rather than a graphic designer.

Like: for each character cell on this terminal, we have an eight-bit memory budget for color. For each of foreground and background, we're going to allocate one bit for "bright" and one bit each for "red", "green" and "blue" ... in the output circuit to the DAC that drives the display, we're going to send a full-scale value for each of R/G/B when the "bright" bit is set and a half-scale value when it's not set.
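
A hedged Python sketch of that scheme (the exact half-scale level is an assumption; real hardware varied):

    # one attribute nibble per colour: bit 3 = "bright", bits 2..0 = R, G, B
    def attr_to_rgb(attr: int) -> tuple:
        scale = 255 if attr & 0b1000 else 127   # full vs. half scale to the DAC
        return (scale * ((attr >> 2) & 1),
                scale * ((attr >> 1) & 1),
                scale * (attr & 1))

    print(attr_to_rgb(0b0001))  # plain blue  -> (0, 0, 127)
    print(attr_to_rgb(0b1001))  # bright blue -> (0, 0, 255)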


I thought we were talking about games in this subthread. The artwork was not an accident.


On the other hand, many video terminals (such as the VT100) could be put into a reverse mode, dark text on bright background, and I've seen it in enough old photographs to assume people actually used it pretty often.


The screen of my first computer (b/w screen, late 80s) had a physical button to toggle inverted mode.


Is that inherent to CRTs? Windows and Mac used black text on white background in their GUIs from the beginning.

Even some early text mode computers used a colored/bright background - my personal experience was on the TI-99 in its onboard BASIC environment.

Scanlines were normal at the time - I think we only see them as ugly now after getting used to LCDs.


"in the beginnings with CRT" almost certainly refers to monochrome "green screen" monitors. The monitors you got with Macs and most monitors that came out with Windows 3.1 were pretty different.


Yes, it's certainly not impossible to display white backgrounds well on CRTs in general (consider upmarket late-'90s CRT monitors in particular, still overall decent displays by modern standards) but it wasn't to be taken for granted on early-'80s CRTs. White-background display conventions more or less came in with the Engelbart/PARC GUI and IIRC (I think Levy's Insanely Great or https://folklore.org discusses this?) Apple put a lot of effort into making the original Mac's (and I assume Lisa's) monochrome display pleasant to use in that manner.


"Designed" is a stretch. Yellow and cyan become the nigh-unreadable colors instead with a white background.


Sparc Terms were always the pinnacle of achievement for me. I prefer white backgrounds still today.


And they had a gorgeous font. Solaris on x86 still defaults to it, but as white on black.


I seem to remember that the Xerox Alto defaulted to a white or light grey background as well, and of course the same for early Macs. For me, in the 80s and 90s, lighter colour backgrounds - even on essentially monochrome displays - were generally a signifier of a higher-end, or sometimes experimental, workstation machine or OS, whereas black backgrounds felt much more like corporate droneware.

(Of course, that wasn't always the case: the Sinclair ZX81 and ZX Spectrum, both very cheap microcomputers introduced in the early 80s - and I think also maybe the Acorn Electron - defaulted to white backgrounds.)


These "defaults" are the 4-bit colors linearly mapped onto the RGB space. It made the color generation easier to implement in hardware.

I think the sensible use case for this color is as background color, like the Norton Commander, derivatives of it, and several other ncurses programs do.


Exactly! I just wish the article got to the real issue, which is that sRGB isn't a linear color space (it's roughly quadratic) and your eyes perceive blue luminosity as ~10% of the brightness of green (red is ~30%).

That means that a half blue (00 00 7F) would be 22% as bright as full blue (00 00 FF), with an apparent luminosity (Lab) of 2.2% compared to full green (00 FF 00), or 1.6% of the luminosity of full white (FF FF FF). No wonder it's hard to see!
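
A quick sketch to sanity-check those numbers, using the piecewise sRGB transfer function and the Rec. 709 luminance weights:

    # relative luminance: green dominates (0.7152); blue contributes only 0.0722
    def linear(c8):
        c = c8 / 255
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def luminance(r, g, b):
        return 0.2126 * linear(r) + 0.7152 * linear(g) + 0.0722 * linear(b)

    print(luminance(0, 0, 0x7F) / luminance(0, 0, 0xFF))        # ~0.21: half vs full blue
    print(luminance(0, 0, 0xFF) / luminance(0, 0xFF, 0))        # ~0.10: full blue vs full green
    print(luminance(0, 0, 0x7F) / luminance(0xFF, 0xFF, 0xFF))  # ~0.015: half blue vs white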


This is a problem introduced by Windows. In VGA and earlier text modes, the dim blue was brighter.

https://en.wikipedia.org/wiki/ANSI_escape_code#Colors


You can always increase the gain of the blue gun to compensate for how humans misperceive blue.


Ummm, misperceive blue? Relative to what - green and red?

There is a thing called white balance. Although the eye/mind will adjust to different levels of blue (absent an environmental source), the apparent 10x gain adjustment needed to do this is unlikely to be effective or comfortable for anyone using such a monitor. I suspect the violet halo of such a "white" would be searing and likely not appear balanced in any way.


Yep, the venerable 16-color palette is at least as old as CGA: [1]. And the "annoying" dark blue is not even #00F, it's approximately #00A (and similarly for the other "dark" colors; the "bright" colors are roughly #55F etc.). CGA text mode character cells comprise 16 bits, with 8 bits encoding the character, and 3+1+3+1 bits encoding the foreground color, foreground intensity, background color, and blink state respectively.

[1] https://en.wikipedia.org/wiki/Color_Graphics_Adapter#Color_p...


One issue with this article is that it seems to assume that everybody is using a black background. I know PuTTY supported white backgrounds, because that's what I used. White background makes the blue perfectly readable; suddenly it's bright yellow that might be a problem. So anyone looking to fix terminal colours should not just pay attention to blue, or to how they look on black, but also to yellow and how it looks on white.


You really need two different palettes for white vs. black background if all the (other) colors are to be readable with good contrast on the respective background.


The problem here is that applications can set a background colour, and that as an application you don't know what colours the user defined.

For example Angband just sets the background colour to black (not sure about Nethack), so if you defined colours that work well on a light background you're still going to have a bad time. Quite a few other terminal programs (especially those using ncurses or other interactive TUI interfaces) tend to set a background colour.

There isn't really a good way to deal with this using the default 16 colours other than never setting any background colours, which can be a bit limiting for interactive applications. The best solution is to have application support for this and to use 256 or 24-bit "true" colours (256-colour support is pretty much universal, and 24-bit colour support is very widespread).
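
For reference, a minimal sketch of the SGR escape sequences involved (assuming a terminal that supports them; the colour values are arbitrary):

    ESC = "\x1b["
    print(f"{ESC}38;5;33m256-colour foreground, palette index 33{ESC}0m")
    print(f"{ESC}38;2;100;149;237m24-bit 'true colour' foreground (100, 149, 237){ESC}0m")
    print(f"{ESC}48;2;30;30;30m{ESC}38;2;130;170;255mexplicit background + foreground{ESC}0m")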


The best solution is for the background color (and possibly other colors) to be configurable for the application, as many applications allow. If a user prefers white or black background, he/she usually prefers it in all applications.

256 or more colors has the drawback that the user can’t configure the colors anymore to his/her needs (light vs. dark, high-contrast, color blindness, etc.), unless the application also provides comprehensive theming support. In that case, the 16-color variant is a much simpler solution.


Never being a fan of purple, I do feel Ubuntu got it just right as a background colour - over grey and black as defaults. The screenshots in the article highlight this. While it can of course be changed, kudos to whoever did that. Having given up on Ubuntu over Snaps and gone with Debian for everything, this is one of the first settings I changed - back to purple.

Small things are important.


I agree, Ubuntu's seems far more legible than the others. The "new Windows console" design still seems a lot closer to the pathological example they show than to the Ubuntu text. It is certainly a step toward being more legible, but the contrast still doesn't work for me.


One of the worst "ossifications" of non-standards is the notion that colour is three channels of 0..255, one each for red, green, and blue. Combine that with the incorrect assumption that the scale is linear[1], and you have a perfect setup for a Dunning-Kruger effect: a bunch of programmers[2] who think they've figured out "graphics", but haven't even begun to understand how any of it works. Not any of: light, display technology, colour standards, or even the human eye.

A short list of falsehoods programmers believe about vision:

- sRGB is the only graphics standard.

- Monitors are actually sRGB and follow that standard.

- Colour intensities are on a linear scale.

- White is 0x00FFFFFF or 255,255,255

- 50% Grey is half of the above: 127,127,127.

- Blending colours is just a matter of linear interpolation. (See the sketch below.)

- Red, green, and blue have a single consistent definition.

- 8 bits is enough for each colour channel.

- Okay, 10 bits is enough.

- 12 bits is enough, surely.

- 0 means black.

- Okay, 16 means black on televisions.

- Fine, black can have different levels in different standards, but it's always clear which level is black.

- At least the white level is consistently defined.

[1] Narrator: it's not even remotely linear.

[2] Including myself until recently.
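
On the blending falsehood above: averaging sRGB bytes is not averaging light. A minimal sketch, assuming a crude 2.2-gamma approximation of sRGB:

    # naive byte averaging vs. averaging in (approximately) linear light
    def to_linear(c8):
        return (c8 / 255) ** 2.2            # crude sRGB approximation

    def to_srgb(lin):
        return round(255 * lin ** (1 / 2.2))

    a, b = 255, 0
    print((a + b) // 2)                                # 127: too dark a midpoint
    print(to_srgb((to_linear(a) + to_linear(b)) / 2))  # ~186: midpoint in linear light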


> White is 0x00FFFFFF or 255,255,255

Calling this a falsehood isn't even beginning to describe the problem:

- It does not distinguish between a white surface (reflects all light) and a white emitter (emits blackbody radiation in a specific temperature range).

- There is no maximum intensity for emitted light, but there is a maximum for reflection.

- RGB (255,255,255) does not produce a blackbody curve -- though it can cause the same reaction as a blackbody curve for people with exactly three working cone types. (This point may not belong on this list because it may be thought of as a solved problem.)

- RGB (255,255,255) does not specify the temperature.

And that is all before monitors even start being imperfect.

Oh, and a general one: An emitted color (e.g. monitor) and a reflected color (e.g. printout) cannot be the "same", ever, because only the latter depends on how it is illuminated.


I always understood 0xFFFFFF to be “give me all you’ve got”. Of course a quasar is going to be able to emit more powerful photons than a 90s CRT. But if said quasar had a GUI, 0xFFFFFF would mean “master blaster”, and the same for the old screen. My expectation as a programmer was never that the colors would have any semblance of “realism” or “uniformity”, and I expected variance from machine to machine. Perhaps since I have not involved myself with photography software and have kept my budget for displays below thousands-per-unit, this has held true so far.


What I tried to say by that is that "give me all you've got" is the only kind of maximum a screen can do, and a brighter screen (or a quasar) will be brighter. OTOH, "reflect everything" is the absolute maximum for reflection, and no other object will top that.


> OTOH, "reflect everything" is the absolute maximum for reflection, and no other object will top that.

Maybe some old night scopes with photon amplifiers could be coaxed into making "over 100% reflective" mirrors? A fun idea to research.


Any monitor can simulate HDR ("brighter than 255") images by simply darkening the rest of the picture around them.

Also, in TVs it's 16-235 not 0-255. They're not even 8-bit!


> An emitted color (e.g. monitor) and a reflected color (e.g. printout) cannot be the "same", ever, because only the latter depends on how it is illuminated.

Surely there is a case when the light coming into your eye is the same for both? If the printout is lit by the monitor wouldn't it be the same?


The colors can be the 'same' under a specific illumination condition. Standards that help make sure prints look like your screen specify the illumination condition of viewing, and then demand the spectrum radiating from the screen and paper 'look' the same.

Often this is a daylight lamp shining on the paper with a specified intensity, in an otherwise dark room (I believe) compared to watching the screen in a specific ambient light (again daylight with a given intensity).


There is sometimes the expectation to get a printout that looks the same as the image on screen in any illuminating condition. The whole process of designing printed material (or, say, a passive billboard) on a computer has to constantly avoid making that assumption.


What about RGB (256, 256, 256) though?

Edit: Oops, I read "there is a maximum for reflection - RGB (255,255,255)" as one statement rather than two fragments but I still like my response. :P


I made the same mistake again as always -- writing as bullet points and not looking at it after submitting :(


Light speed is too slow!


> - Blending colours is just a matter of linear interpolation.

I recall there being a particularly contentious bug report for Adobe Photoshop eight or nine years ago. The original thread is here: https://community.adobe.com/t5/photoshop-ecosystem-ideas/p-g... - but all the users have been replaced by "FeedbackCommunityMember", so it's a bit hard to figure out who is talking to who (perhaps someone can find a properly archived url).

In any case, an Adobe product person was unbelievably stubborn about their assumption that there is just a single way colours can blend from one to another, and that all the other ways couldn't possibly be desirable. It was pretty hard to read.


That is a pretty wild thread to read. Also funny to note the update:

> Special thanks to @bennettf96052341 for his gracious feedback

I remember a youtube video from years ago explaining this problem with photoshop's gradient tool (and other naive implementations), but I can't find it now.

Colors are hard.


The first one would be "colors exist as a physical concept". They do not.

There is no relationship between the physical nature of light and what we call "color". A wave has a frequency. There is no concept of "color" associated with a wave.

We statistically put names on some convoluted detector/brain transformations, and everyone does the transformation differently. This is why most people agree that a tomato is "red".

As an ex-physicist, I hate to speak about colors in a physical context (and yes, we have charts that show that higher frequencies are "blue" and lower ones "red"). Color is a biological concept and should stay that way.

And then, as a father, I have to explain to my children the convoluted course on colors in high school in France.

Sorry, I had to let off some steam.


This is navel gazing along the lines of cocktail-fueled debates over a tree falling in a forest. Does light still have a color if no one is around to see it?

Fundamentally these words have lots of reasonable wiggle room to expand or narrow their definitions. Still, lots of technical domains would like to be explicit about what is meant, especially when reusing common words, which is why man invented the glossary.

Unfortunately this hasn't stopped endless argumentation over whether a tomato is a fruit or a vegetable. For whatever reason, it's often presumed that the definition of fruit used in botany has "won", even though botany has nothing to tell us about what a vegetable is or how to distinguish one from a fruit.


I am not sure I understand your point.

On the one hand you have light waves that hit some receptors in the eye. These receptors (of various kinds) are sensitive to frequencies according to a response function. This creates a specific signal that is analyzed by the brain, which assigns a "color" to that signal.

Everyone's functions are, to some extent, different. There is a general trend, so most people agree on "colors". When you are colorblind, the functions differ drastically (or are just flat).

This is why "color" is a biological concept (same as, say, "pain") to which we are desperately trying to attach three or four numbers (or, worse, a wavelength)


450 THz electromagnetic waves are unambiguously red light; you can nitpick about the shade of red, how people perceive it, etc., but it is red. It may look different to colorblind people, or dogs, but it is red. The part of the electromagnetic spectrum with frequencies just below red is called infrared; don't tell me that physicists don't talk about infrared.

If you look up some colors in Wolfram Alpha you will get precise ranges of frequencies. I am not sure they are officially defined ranges, but colors, at least pure colors, correspond to physical quantities.

There is some overlap with biology, and some colors are special (brown, purple, ...) and can't be defined by a single frequency. Then you can introduce color perception: how very real and physical red photons are perceived as red, why a combination of red and green photons is perceived the same way as yellow photons, how spectrometers reveal the difference, and how there are no purple photons.

Composite colors are a bit more complicated: you need to add a bit of math (namely, a color space) in order to formally define them from physical quantities. But saying that color is not a physical concept is going a bit far.

No need to make physics more abstract than it is. Because what's next? Temperature is not a physical concept because we don't all agree on how hot something is? Like color, temperature is a well defined physical concept that is based on our perception.


In that case, what is the frequency of the light needed to create "salmon"?

Or in other words: are "salmon", "fuchsia" and other colors located on the frequency spectrum? If so, why do we have zillions of color spaces instead of just a simple frequency (or wavelength)?

The moment you introduce color spaces, you start accounting for how our eye receptors are sensitive to frequencies, and you leave the realm of physics (mostly because the receptors differ from person to person).

When you take a 450 THz wave, you say it is "unambiguously red" because that happens to be the word for the signal which is triggered by that wave. In most people. In roughly the same way.

What I am trying to say here is that a color is a concept as well defined as "pain". You can have all sorts of measures, but none is strict.

Finally, there are no "red photons", or green or blue (you mention them) - this simply does not exist at all as a concept.


I think with color, it’s important to appreciate the irregularity that exists in this domain. You’re right that colors are defined as ranges of wavelengths or blends of ranges of wavelengths. And so there is a mathematical component to color. Yet, the mathematical view of color is imperfect if you are ultimately concerned with the perceptual end result. People are always trying to create better, more perceptually accurate color spaces, but all of them have their issues and weird corners where adjusting the values in a certain direction doesn’t yield the expected result.


> A wave has a frequency. There is no concept of "color" associated to a wave.

Funny, because when I explained radiant heat to my daughter I used it as an example of light: infrared is a color we can’t see (but we can feel the light the same way we feel the sun on our skin). She asked about other colors and I went into ultraviolet, X-rays (a color to which we are transparent), gamma rays, and so on.


Yes, you described to her how a wave interacts with her body and creates sensations (heat, colors, ...).

This is absolutely true, but the wave does not carry any of these sensations on its own, it is its interaction with sensors/receptors in her body that creates it.

Like I said elsewhere, a color is like pain - a needle does not have any "pain" in it and a couch hit with a needle will not create "pain".


Also:

- Pixels are squares

- Pixels are as wide as they are tall

- Limiting the colour gradient across an area will make it look smooth (Mach banding yay)

- The number of bytes in a line of an image is bits_per_pixel * pixels_per_line (see the sketch below)

- High frame rate means smooth animation
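
On the bytes-per-line one: besides the bits/bytes mix-up, rows are usually padded for alignment. A sketch using BMP's 4-byte row alignment as the example:

    # why "bytes per line = bits_per_pixel * pixels_per_line" fails:
    # rows are rounded up to whole bytes and then padded for alignment
    def bmp_stride(pixels_per_line: int, bits_per_pixel: int) -> int:
        row_bits = pixels_per_line * bits_per_pixel
        row_bytes = (row_bits + 7) // 8      # round up to whole bytes
        return (row_bytes + 3) & ~3          # BMP pads rows to 4-byte boundaries

    print(bmp_stride(101, 24))  # 304 bytes, not 101 * 24 = 2424 "bytes"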


- Transitioning from an RGBI value of one to zero is instantaneous.

- If not instantaneous, it’s linear

- The phosphor’s response to the gun intensity is linear.


Forgot one:

- An isolated lit pixel on a CRT is a circle.


This would make for an excellent blog post


I would, if I thought it would make a difference. It won't.

I just bought a new 2022 model flagship television, and its colour management is very visibly broken![1] My partner, who knows nothing about colour standards mentioned that she thought the colours were "off" within minutes of using it.

This is the good brand and model according to reviews. I've seen the other brands in the store, and they're markedly worse.

Like... stupidly bad. Garish beyond belief. People looking like they're wearing clown makeup. Normal colours looking like neon tubes.

Are most people colour blind? Am I weird? Did I marry someone equally weird?

Or is it a mistake to just say: red = 0xFF0000, commit the code, and call it a day?

It's probably just me. Never mind.

[1] Unless I connect an Apple TV to it, on which nothing is broken. It's amazing that only one company on this planet is able to comprehend colour. Every other organisation, proprietary limited, corporation, and charity is staffed entirely by colourblind people or robots that see only black and white.


Check your TV's settings. Most, if not all, TVs ship in some sort of display demo mode that jacks up the contrast and sharpness and does all kinds of horrible things to make it stand out in the environment of a show floor. On Samsung, it's called "Dynamic" last I knew, but other euphemisms are used by the other manufacturers. I call it "Claw Your Eyes Out Mode", but that's obviously just me.

Generally if you tune your TV to any other "mode" as a base mode, you'll get much better colors. At first it'll look dull next to Claw Your Eyes Out Mode, but stick with it a bit and your eyes will rapidly come to appreciate not being clawed out.

You can also generally find suggested settings for your TV from people who calibrate them with professional tools. You'll see them also say you shouldn't just take them directly because your TV may be different, but from my point of view, the upgrade from CYEOM to "calibrated based on the same model even if it is a different instance" is probably about 99% of the upgrade you can expect and I don't consider the remaining single-percent upgrades that may be theoretically possible to be worth the effort required.


I tend to like “game mode”, where the TV does as little processing as it can, so the image is minimally delayed. It also has the side effect that the TV doesn’t try to “improve” anything.


> This is the good brand and model according to reviews

Any half-decent review should mention both out-of-the-box colors and "calibrated" colors, or colors in different modes.

It's kind of an open secret that the default modes and settings of TVs are absolute bullshit. Most "good" TVs can be adjusted to decent color reproduction (albeit not always without compromises).

It feels like the tide might be turning; stuff like "Filmmaker Mode(tm)" is popping up, and generally people are becoming more aware of the stuff. But that is not without its downsides, just recently there was a story about how big-box stores are price gouging unaware consumers with their "calibration" services that they are very aggressively selling with TVs.


Garishness on new TVs also comes from the "soap opera effect", caused by motion smoothing, which is enabled on every new TV I've seen (and which tends to reset itself every few months on my home TV). Turning that off makes everything so much better.


Moving pictures can look strange (juddery) on LCD/OLED TVs without either a little motion smoothing or black frame insertion. They were made to be shown on a flickering projector after all.

Also, you see people saying "motion smoothing needs to be turned off to respect the artist's intent" but have you seen what those people get up to? It's like a moral obligation to disrespect them.


On my TV it's called "perfect motion" and it's terrible.


God made movies 24 fps for a reason.


I can't even tell the difference between the color settings on my phone or TV. I am not colorblind, I pass those tests. I think maybe I have spent so much time staring at screens in my life that my brain has internalized rgb.

One of the most enlightening things for me long ago was trying to write a converter between wavelength and rgb value. It led to all the right questions.

Ultimately I still don't understand color perception. It is hideously complex. But also so fascinating. The mothers of colorblind men are often tetrachromatic. I really wonder what TVs look like to them, since TVs assume three normal frequency responses.


The problem with the color balance on most TVs is that the red channel has been turned way up. The reason manufacturers do this is it makes the picture stand out when TVs are displayed in a big line. A properly balanced TV looks bland in comparison. And most people don’t realize this and have just gotten used to the way things look.

There are other things to do, but usually just turning down red significantly gets you a lot of the way there for minimal effort.


Look up the correct settings on https://www.rtings.com/tv/reviews and set those, it'll be fine. Or watch everything in Dolby Vision, which should already be right.

If you didn't buy an LG OLED, I also recommend returning it and getting one of those.


> Unless I connect an Apple TV to it, on which nothing is broken.

Did you calibrate the Apple TV using an iPhone? (e.g., https://www.macworld.com/article/344476/iphone-apple-tv-colo...)


Well this is a thing that exists after all:

https://en.wikipedia.org/wiki/Tetrachromacy


And Pink isn't even real :-)


I once met her and I can tell you she exists.


That's the thing about Dunning-Kruger. It's always invisible. You could be surrounded by a herd of 100-ton Dunning-Krugers and never know it.


It’s only invisible from the inside. From the outside it’s painfully obvious.


Unless it's shared, of course.


Then you are still inside.


Actually, the correct plural is "Dunnings-Kruger".


> If your terminal emulator gets told to make text blue, and instead it makes it unreadable, then you can’t say that’s not a problem with the terminal emulator.

Hear, hear. A neat tool to tell you who will be able to read a combination of background and foreground colors: https://www.whocanuse.com/
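
Tools like that are built on the WCAG contrast-ratio formula; a rough sketch:

    # WCAG 2.x contrast ratio between two sRGB colours
    def lin(c8):
        c = c8 / 255
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def rel_lum(rgb):
        r, g, b = rgb
        return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b)

    def contrast(fg, bg):
        hi, lo = sorted((rel_lum(fg), rel_lum(bg)), reverse=True)
        return (hi + 0.05) / (lo + 0.05)

    # the classic dark terminal blue on black: far below WCAG's 4.5:1
    print(contrast((0, 0, 0xAA), (0, 0, 0)))  # ~1.6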


Only somewhat related, but it reminds me how Alacritty actually has its green color set as yellow because the creator thinks it looks nicer: https://github.com/alacritty/alacritty/issues/1561

It's really hard, as a terminal developer, to trust that what it looks like on your system is anything like what it will look like for your users.

Also, I see with the new Windows colors that black has a glow now? Does anyone have any more screenshots of that or is that just something the author added? I ask because it is 100% solid black everywhere for me still.


Complaining about colour terminals for nethack? All we had were green screens. But you know, we were happy in those days, though we were poor.


Heck! Because we were poor!


Those colors never looked great, but I remember them being a lot less jarring back then.

At least some of the colors weren't pure binary RGB values as claimed in the article. For example, "dark yellow" was closer to a brown #804000 than #808000. Maybe that depended on the graphics card? Otherwise, I'm not sure why many terminal emulators would have gone with a pure dark yellow instead.

Also, maybe the overall contrast was lower on CRTs. At least I think the blacks were less dark on early monitors, which might have helped to make the colors look less harsh.



CGA used 2/3 brightness (#AA) for the primary colors, then added 1/3 gray (#555555) if the intensity bit was set. The exception is that with low intensity, yellow is turned into brown by reducing green to 1/3 (#AA5500).
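
That rule in code (a sketch; the palette indices follow the usual RGBI bit layout):

    # 2/3-scale primaries, plus 1/3 on every channel when the intensity
    # bit (bit 3) is set, with low-intensity yellow special-cased to brown
    def cga_rgb(index: int) -> tuple:
        i = 0x55 if index & 0b1000 else 0
        r = 0xAA * ((index >> 2) & 1) + i
        g = 0xAA * ((index >> 1) & 1) + i
        b = 0xAA * (index & 1) + i
        if index == 6:              # dark yellow -> brown (#AA5500)
            g = 0x55
        return (r, g, b)

    for n in range(16):
        r, g, b = cga_rgb(n)
        print(f"{n:2d}  #{r:02X}{g:02X}{b:02X}")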

Unfortunately a lot of terminal emulators get it wrong, the normal colors are too dark and the intensified ones are too saturated!

I also remember blue being easier to see in the past, but that is likely my eyes getting worse.


This is a reassuring article. I had started to feel like I was losing eyesight because some "themes" are so hard to read, especially those dark themes using the annoying shade of blue. This is especially true for Emacs users. There are so many themes available for Emacs that finding the good ones can be difficult.

My advice with respect to Emacs themes: the built-in themes are quite good, especially the modus themes (and also the ef-themes package), both by Protesilaos Stavrou.


Several years back, frustrated by this same problem and its underlying constraints, I spent a weekend writing a machine-aided cost-minimization algorithm (aka what most startups call "advanced ML") that used LAB and LUV to try to pick 2x8 colors (the eight colors in dark/bright configurations), where the cost was defined as:

a) how far apart each of the color pairs was from one another (hue only),

b) how far apart each dark/bright color in a pair were from one another,

c) how high the contrast was against a specific background color (provided as a parameter),

d) how far the perceived brightness of any "bright" color differed from the rest of the "bright" colors,

e) how far the perceived brightness of any "dark" color differed from the rest of the "dark" colors,

f) how far each of the 8 base colors strayed from a range that could passably be called the de facto colors mapped to each of the palette members (e.g. cyan, blue, black, white, etc) so that TUI apps using colors not to distinguish but to draw (expecting "blue" to be a shade of blue) would not be thrown off.

Unfortunately I never found the results pleasing to the eye. I also think there was something wrong in either the math of the individual cost-function factors themselves or the weights assigned to each factor, because I didn't feel the results were properly spaced out or representative of what even a local minimum would look like if the math were all correct. I didn't really want to put more time into it than I already had, but I'm convinced that it's possible to generate a fixed set of 2x8 colors that meets these conditions for any given background color (at least one palette for a black or dark grey background and another for a white or off-white background).
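
For illustration only (this is not the parent's actual code), the overall shape of such a cost function might look like this, with Euclidean RGB distance standing in for the real CIELAB/CIELUV math and invented weights:

    import math

    def dist(c1, c2):
        return math.dist(c1, c2)  # placeholder for a perceptual distance metric

    def palette_cost(pairs, background, w_sep=1.0, w_contrast=2.0):
        # pairs: [(dark_rgb, bright_rgb), ...], one pair per base colour
        cost = 0.0
        for dark, bright in pairs:
            cost -= w_sep * dist(dark, bright)               # (b) dark/bright separation
            cost -= w_contrast * min(dist(dark, background),
                                     dist(bright, background))  # (c) contrast vs background
        return cost  # lower is better; a real version adds terms (a) and (d)-(f)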


Pixel artists have really, really explored that problem space. You might want to check out the palettes on lospec, and the discussions about these things on pixeljoint. I'm not deep into colour theory myself, but I get awed by the diagrams Dawnbringer's palette analyser tool produces, let alone the things pixel artists do to highlight the opportunities and limitations in palette choice.


I always use Tango for terminals. Simple, usable, and that colorscheme is available everywhere.


The cause is simpler than this article makes it out to be. Older displays had pretty narrow gamuts so very saturated colors washed out to something more legible and the crude palette construction was not much of a problem back then.


Yes, and by the time displays were able to reproduce colors with any reasonable fidelity, every sensible application changed those codes to represent some color that reflected what a person would see on a bad monitor.

The author is just using a terminal with non-sensible defaults (in way more things than the color palette), but tries to pin the flaw on everything else (from nethack to our eyes) instead of on PuTTY.


Those simple values worked well on analog CRTs because they didn’t always respond linearly to input voltage. Back then, when producing images where color mattered, you needed to know how your monitor responded to the analog signals (and, often, how the destination screen did). We used to calibrate our monitors.

It was also more common to adjust brightness and contrast to suit your preferences. A blue like 4 wasn’t necessarily as invisible then as it is now with our perfectly linear and 100% responsive screens.


"I can’t find any documentation on where these colours came from — they seem to have been passed down from terminal to terminal since the dawn of time. But this can’t be a coincidence!"

Is it really hard to understand why these colors were chosen? #000 = black. After that, just iterate through each combination of replacing a 0 with an F. Full on, full off. Those are your choices; never mind 1-E of the in-betweens.


This is exactly the same palette the BBC Micro had.

It was frustrating to have such a fast computer, high resolution (640×256) and be stuck with those awful colours.


Indeed. The C64’s fixed palette resulted in so much better colors.

OTOH, the BBC’s palette matched the one from Videotext.


Why not turn off the colorizing? That's the first thing I do (even outside of NetHack), because the colors often render things unreadable.


Nethack, Omega... Angband... Those games are so deep. A true evolutionary sweet spot. I was in love/addiction for a long time.

Naturally I have given much thought to what the next step might be.

No, not prettier pictures, realtime or 3d. All those newfangled mmorpgs took the obvious, and wrong, turn.

Sometimes I think about a different tessellation (something other than a square grid, with more directions). And variable-size icons.

Would that detract?


With BSD games (especially ATC, Tetris, Trek, Adventure and others) you can have fun with very little. Add Slashem and IF games and you can play even with a shitty SVGA card.

On "graphical" games, Shadowrun for the Genesis with the 2058 patch (ips.pl it's your friend) it's the only "modern" game I play.

And yet Shadowrun pales against Nethack/Slashem's interactivity between monsters, objects, statuses and the world.


I think there are some modern rogue-likes that use hex layouts rather than square grids. I'm not sure if there is anything more creative than that, but I would be surprised if there weren't. There is a sub-culture out there who are making more modern versions of the traditional rogue-likes. See the r/roguelikedev subreddit for a lot of interesting info on this.


This reminds me of the way some of those colors give the illusion of depth, i.e. red seems to jump out of the screen and blue seems to recede.

I used to think this was limited to computer monitors until I saw a large piece of stained glass on which the artist juxtaposed red glass and blue glass to the same effect. It was fascinating.


I spent an unusually large amount of time trying to get this color change just right: https://github.com/solvespace/solvespace/pull/446

It's the blue and yellow only, but IMHO it made Solvespace a lot more readable.


I love solvespace. Thanks for contributing to it!


The author is the maintainer of exa, an excellent `ls` replacement that makes effective use of color in the terminal.


I thought this was going to be about that butt-ugly Twitter blue, as seen in every crap Bootstrap site on the net.


Me too!


On every term that supports 256 colors I've mapped this to `dodgerblue` for at least two decades.


There's a lesson here for anyone doing a "dark mode" UI as well. Avoid blue.


I use an LG 4K TV as a monitor...something about the darker half of the standard 16 terminal colors is incredibly hard to read on this panel.

Irritating for screen sharing with people who use default putty settings, small size courier new, default colors. Ugh. Savages.


> Irritating for screen sharing with people who use default putty settings, small size courier new, default colors. Ugh. Savages.

I'm visually impaired, so if I were to screen-share my PuTTY window, you'd be treated to 22-point bold Consolas and mostly white on black.

This points out, though, that for accessibility, frame-buffer sharing is really not ideal. What we really need is some kind of high-level content sharing, where the content is re-rendered on the viewer's machine with their settings.


https://asciinema.org/

I wish Git(Hub|Lab) READMEs could embed asciinema sessions instead of blurry, bulky, un-seekable GIF clips.


I've never figured out how to use it (never had the need) but I've heard GNU 'screen' is capable of sharing the same terminal window to multiple people. Since it runs in the terminal, it would be rendered to each viewer using their own terminal emulator settings.


If it's an older panel check to make sure it's not chroma subsampling the red channel.

I had one where I had to change settings to keep any text with red from being a muddy mess.


Switching to straight HDMI instead of USB-C to HDMI fixed it!


Wow! Subsampling red is unbelievably stupid. You usually get away with blue, but subsampling red or green is immediately visible.


Subsampling red is about as common as blue. Digital video, digital cameras, PenTile displays… the only full resolution you're getting is green, because that's the most effective way to get luminance.

They mostly don't even use good resampling algorithms either.


Sorry. I'm shortsighted and, I guess, it gives me the superpower of being able to focus red. OTOH, blue/violet is just a blur.


Slightly different perspective on [optical] perspective - impossible colors:

https://en.wikipedia.org/wiki/Impossible_color


I actually find both the "actually readable" and the "something decent" colors harder to distinguish than the original "annoying" colors. (Ctrl-F the source for those quotes to find the images.) I _am_ red/green color-blind but I have never had an issue with blue.


That is expected to a degree: the revised colors are less saturated because some white is blended in. The point is that they stand out more against a black background. Is that also your observation?


Speaking of annoying, what's up with that gray (#606060) text again?


Remember CGA?

Ah yes, MAGENTA+CYAN. Everybody's favorite palette.


CGA can do quite a bit more: https://www.youtube.com/watch?v=fWDxdoRTZPc (Area 5150 demo)


That was dang cool. Thx



