Yes I know. My point was that the arrangement of the RGB dots that made up a region differed between CRT TVs and colour CRT monitors (ignoring the old terminals for now because they’re a different era of device entirely). That did have an effect on the sharpness of the image.
What I didn’t mention, but is also worth noting, is that the quality of the input signal would have mattered as well. 8-bit micro monitors, even in the 80s, generally took RGB input, whereas any micro that output to a TV would often do so via the antenna, ie the computer would have to be tuned in like a TV station. This means you have all of your primary colour “wires” as well as audio mixed into one analogue signal, which obviously degrades the overall image and sound quality. Plus TV tuning isn’t nearly as precise as having a proper dedicated cable. The devices from the 80s (and 90s) which supported both ANT and RGB really show just how significant the difference between the two is, which is also why you often see a lot of retro gear being sold as “RGB modded”.
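To make the “everything mixed into one signal” point a bit more concrete, here’s a rough toy sketch in Python/NumPy (my own simplification, not a real PAL/NTSC or RF model): it splits one RGB scanline into luma and chroma, band-limits the chroma the way a composite/RF path effectively does, and compares that against the untouched RGB path.

    # Toy illustration (not a real PAL/NTSC model): colour detail squeezed
    # into a shared composite/RF signal loses bandwidth vs dedicated RGB lines.
    import numpy as np

    def rgb_to_yuv(rgb):
        # Approximate BT.601-style luma/chroma split
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        y = 0.299 * r + 0.587 * g + 0.114 * b
        return y, b - y, r - y

    def yuv_to_rgb(y, u, v):
        r = y + v
        b = y + u
        g = (y - 0.299 * r - 0.114 * b) / 0.587
        return np.stack([r, g, b], axis=-1)

    def lowpass(x, width):
        # Crude moving-average filter standing in for limited chroma bandwidth
        return np.convolve(x, np.ones(width) / width, mode="same")

    # One scanline with a sharp colour transition: red block next to blue block
    line = np.zeros((64, 3))
    line[:32, 0] = 1.0   # left half pure red
    line[32:, 2] = 1.0   # right half pure blue

    y, u, v = rgb_to_yuv(line)
    # "RGB monitor" path: nothing filtered, the edge stays sharp
    rgb_path = yuv_to_rgb(y, u, v)
    # "RF/composite" path: chroma heavily band-limited, luma less so
    tv_path = yuv_to_rgb(lowpass(y, 3), lowpass(u, 9), lowpass(v, 9))

    print("max colour error, RGB path:", np.abs(rgb_path - line).max())
    print("max colour error, TV path: ", np.abs(tv_path - line).max())

Running it, the RGB path round-trips essentially exactly, while the “TV” path smears the colour transition across several dots, which is roughly the soft, fringed edges you’d get on a tuned-in telly. The real RF case is worse still, since audio and tuner noise are in the mix too.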
Basically what I’m saying is comparing a budget micro on an 80s home TV to a raster CRT terminal is rather pointless. Different tech, different target audience, different eras.
My post wasn't meant as a critique, rather as an expansion on yours. We are so used to thinking in pixels that we often forget there weren't any hardware pixels in B&W. It was whatever the circuitry could produce. Your observations on color TV are certainly right. From what I remember, the most obvious distortion came from the sharpening (which is seldom mentioned today), resulting in hard, double vertical edges. This may have been more prominent in PAL than NTSC. – How I did long for a legit RGB monitor, but it was just a color TV for my C64 anyway. :-)