I'm sorry to be the bad guy here and point this out, but ... this article is completely wrong :/
An emulator outputting 100% perfect RGB pixels onto an LCD screen is very different than running an analog RGB output to a CRT television.
I have a PVM-2530 CRT, and a hand-built Multi-AV to PVM-CMPTR adapter board ( http://i.imgur.com/EgCX2EJ.jpg ), and the colors still blend perfectly fine in his example of Kirby's Dream Land 3, as well as the more odious test case of Jurassic Park's translucent menus.
CRTs don't have an "official" horizontal resolution: the number of RGB columns depends on the tube and its size, and the analog RGB output from the SNES never lines up exactly with the columns on a TV. So the colors naturally bleed together and produce the desired translucency. Maybe just a touch less severely than composite, but more than enough to achieve the desired blending effects -- without all the horrific composite video artifacts.
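To make that concrete, here's a rough sketch of what the bleed does to a dithered pattern (plain NumPy; the kernel weights are made up for illustration, not measured from any tube):

```python
import numpy as np

def horizontal_bleed(frame, kernel=(0.25, 0.5, 0.25)):
    """Blend each scanline horizontally, roughly imitating the limited
    horizontal bandwidth of analog RGB on a CRT.
    frame: (height, width, 3) array of floats in [0, 1]."""
    k = np.asarray(kernel)
    out = np.empty_like(frame)
    for c in range(3):  # filter each color channel along the scanlines
        out[..., c] = np.apply_along_axis(
            lambda row: np.convolve(row, k, mode="same"), 1, frame[..., c])
    return out

# A dithered "translucency" pattern: alternating lit/unlit columns.
pattern = np.zeros((1, 8, 3))
pattern[0, ::2] = 1.0                      # every other column fully lit
print(horizontal_bleed(pattern)[0, :, 0])  # interior columns blend toward ~0.5
```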
Further, my SNES emulator simulates this effect as well, via "Video Emulation -> Blurring." It doesn't work quite the same way a CRT does, but it approximates the effect very nicely. It's not yet hooked up in my Genesis emulator, but only because that core is very new and has more pressing issues to address; it'll be there in time as well.
I can't speak for how these effects look on an XRGB-Mini (I haven't tested these types of translucency tricks on mine yet), but I would venture a guess that it won't look anywhere near as 'pristine' (striped) as the emulator's output: it still has to run an ADC on the RGB lines, and that's always going to be lossy to some degree.
> An emulator outputting 100% perfect RGB pixels onto an LCD screen is very different than running an analog RGB output to a CRT television.
I don't see the article suggesting otherwise. It suggests that some of the artefacts inherent to composite video encoding are not present in RGB.
I agree that the effect varies depending on the tube and the device, but in general I've found the blurring to be more pronounced in composite encoding, not to mention having a distinct quality.
It looks like a lot of those effects are possible because, when outputting to composite, the horizontal rows are well-defined while the vertical columns are not, and are unlikely to map precisely onto the source pixels. For instance, there are a lot of clean gradients from top to bottom (clean rows of color) which use dithering in the RGB output. Columnar gradients would have bleeding/blurry edges due to the unpredictable nature of the analog display. Am I understanding this correctly?
Composite would get you a different amount of horizontal blur for different colors. In an analog sort of way, it's like the resolution is reduced for each color component, but the orange-blue (I) color axis is handled differently from the purple-green (Q) color axis. For every 4 "pixels" (again, analog filtering) of brightness value you get about 2 pixels' worth of data for one color axis and just 1 pixel's worth for the other.
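A crude way to sketch that (standard NTSC conversion matrices, with discrete box filters standing in for continuous analog filtering; the exact bandwidth ratios are approximations):

```python
import numpy as np

# Standard NTSC RGB <-> YIQ conversion matrices.
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])
YIQ2RGB = np.linalg.inv(RGB2YIQ)

def composite_chroma_blur(scanline):
    """Approximate composite chroma bandwidth on one scanline:
    per ~4 'pixels' of luma, keep ~2 pixels of I and ~1 pixel of Q.
    scanline: (width, 3) array of RGB floats in [0, 1]."""
    yiq = scanline @ RGB2YIQ.T
    box = lambda x, n: np.convolve(x, np.ones(n) / n, mode="same")
    yiq[:, 1] = box(yiq[:, 1], 2)   # I (orange-blue): half the luma bandwidth
    yiq[:, 2] = box(yiq[:, 2], 4)   # Q (purple-green): a quarter of it
    return np.clip(yiq @ YIQ2RGB.T, 0.0, 1.0)
```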
You also need to account for blooming, phosphor delay (different for each of red/green/blue), a "black" that is grey...
If you want faithful emulation, you need all of these things. If you want performance, you want none of them. For the most attractive display, you need SOME of them, and it varies from game to game. Some of it can probably be done on a GPU. You might want to let the user know whether a real (not just software MesaGL) GPU is available and able to handle something, whether the GPU can only do a half-good job, whether something will just munch CPU time, etc.
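Phosphor delay, at least, is cheap to fake per-frame. A minimal sketch, assuming a simple per-channel decay (the constants are invented for illustration; real phosphor decay curves are non-linear and tube-dependent):

```python
import numpy as np

def phosphor_persistence(prev_frame, new_frame, decay=(0.30, 0.20, 0.45)):
    """Blend each new frame with a fading copy of the previous one,
    using a different decay constant per phosphor channel.
    Both frames: (height, width, 3) arrays of floats in [0, 1]."""
    d = np.asarray(decay)
    return np.maximum(new_frame, prev_frame * d)
```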
> a "black" that is grey...
I hear this a lot. Why are an incorrectly biased monitor's bad black levels so widely considered the way it's supposed to look?
There is nothing incorrect. It's just old technology. This is how things were, and full compatibility -- if you want it -- demands that you make things match. SNES and Genesis artwork was designed for the old hardware.
The monitors looked grey even without being plugged in.
To be clear, this is tube technology, which was widely used right up to the end of the last century. Screens were often thicker front-to-back than they were wide. They were really heavy too, with thick leaded glass (to stop X-rays) at the front and big copper magnet coils at the back. There was a fine-pitch metal grid inside the glass, and phosphors printed on the inside of the glass. That stuff wasn't black. In operation the situation got slightly worse, probably from stray electrons, but fundamentally the displays just couldn't ever be black. It's the same as printing on brown paper: don't blame the printer's settings when you don't get any white.
Even the oldest CRT televisions have a "brightness" control, and it's supposed to be used to set the black level to black, not grey. The face of the tube has a grey color while it's off because that's how the surface of the tube looks - but that's not the result of emission from the tube, and even by the eighties tubes had dark anti-glare tinting for this. Unless you're shining a light on the tube, the black levels should be black.
Are we to say that the black level of an OLED display isn't black because external light can raise it through reflection?
Glad I wasn't the only one shouting in my own head while reading the article. 100% of the time I want perfect RGB and I'll run the source through the appropriate hardware (like the XRGB stuff) if I want to use a modern monitor.
I'm jealous of your PVM-2530...I tried for years to get these in the late 90s/early 2000s and they just weren't obtainable near me and now they're mostly priced out of orbit in good condition. There are some other options, but they're all reasonably expensive for what they are.
The situation for real arcade monitors is looking even worse with the commonly available new stuff mostly being shit. Soon to be unobtainium, it looks like, and to me that's sad.
For now I have to settle for this tiny old Atari ST monitor...but hey, it'll sync down to 15.7kHz!
The first one shipped with massive geometry and brightness problems. Every few minutes the entire screen would flicker -- the image would zoom in by 5%, zoom out by 5%, repeat -- really disorienting. I took it to a local place that actually repaired them. The guy charged me $200 to replace some voltage regulators. That fixed the brightness, but the image still flickered every few minutes. He wouldn't touch it again unless I paid him again.
Then I found a guy two states over selling two of them for local pick-up. Drove eight hours each way, got them loaded up, and hauled them back. One of the two had a failing flyback transformer and a massive gauss problem at the top of the tube -- degaussing only barely helped, and it'd come right back every time. The other one seemed to be in good condition. I don't know how long it will last, and I always feel a bit guilty using it, knowing I'm wearing down its remaining life.
The 2530s set me back around $700 for all three plus the failed repair attempt.
I do also have a working 2030, and two 1390s. The 1390s are too small to be usable, but I can put them up on my plant ledge, so I'm keeping them for posterity.
If you have ample and cheap/free storage space, keep all three. The only thing that can't be "easily" fixed (not easy, but no magic either) is the phosphor on the inside; all other components can be upgraded or replaced by competent people. Don't let them be a drag on your soul, and don't feel guilty. Use the third with joy!
If you don't have space, throw them away, of course; no worldly possession is worth the hassle in itself. But if you keep them, their value as cultural artifacts will continue to increase. And with time, more people will be willing to pay the (high) cost of rebuilding them. I predict there will eventually be at least one canonical company in the US that spends many, many years rebuilding CRTs for collectors, enthusiasts, and museums.
Believe it or not, I actually sold them to a museum.
I listed them on eBay as defective, along with a detailed description of their issues, and they were both purchased by some sort of museum in New York; I forget the name. They sent a truck out to my place to pick both of them up.
I have one because my dad used to use it for our CCTV system at work. It's been gathering dust for a while but I can get it running again. Are they worth a lot?
Maybe not now, but they will be worth a lot. Maybe not enough to offset the cost of storage if you have to pay for it, but if you have "free" storage (meaning your old electronics junk doesn't take a big chunk out of your other storage needs and life in general), it can definitely be worth saving stuff like that, even from a purely financial standpoint.
Analog means 'sine waves', not 'square waves'. So if the left half of the screen is white and the right half is black, there's a point in the middle where the signal, no matter how the PAL or NTSC is encoded on a carrier wave, has to make an abrupt full-swing transition. A band-limited signal can't do that instantaneously, and that's where the 'aliasing' comes from. It isn't just a mismatch of horizontal dots only approximating defined RGB columns; it's all in the 'sine waves'.
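You can see this with a toy band-limiting experiment (the cutoff bin here is arbitrary, just to make the smearing visible):

```python
import numpy as np

# Band-limit a hard white-to-black edge the way an analog channel would:
# keep only the low-frequency Fourier components and see what survives.
n = 64
step = np.where(np.arange(n) < n // 2, 1.0, 0.0)

spectrum = np.fft.rfft(step)
spectrum[8:] = 0.0                # crude low-pass; cutoff chosen arbitrarily
soft = np.fft.irfft(spectrum, n)

# The transition is no longer instantaneous: it ramps (and rings a little)
# across several samples instead of flipping in one.
print(np.round(soft[n // 2 - 4 : n // 2 + 4], 2))
```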
We have fairly decent CRT emulation via pixel shaders already: scanlines, phosphor glow, barrel distortion (curvature), shadow masks/aperture grilles, etc. And yeah, all of that at 4K can strain a top-end video card. On top of that, using a shader like this adds an additional frame of latency.
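For anyone curious, two of those passes boil down to something like this toy CPU sketch (real shaders do this per-fragment on the GPU, and all the constants here are invented, not taken from any actual shader):

```python
import numpy as np

def crt_passes(frame, scan_dark=0.6):
    """Toy versions of two common CRT-shader passes: darkened
    scanlines and an RGB aperture-grille mask.
    frame: (height, width, 3) array of floats in [0, 1]."""
    out = frame.copy()
    out[1::2] *= scan_dark             # darken every other line (scanlines)
    mask = np.zeros_like(out)
    mask[:, 0::3, 0] = 1.0             # red phosphor stripe
    mask[:, 1::3, 1] = 1.0             # green stripe
    mask[:, 2::3, 2] = 1.0             # blue stripe
    return out * (0.25 + 0.75 * mask)  # soften the mask so it dims, not blanks
```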
The truly limiting quality issue is that the contrast of LCD monitors doesn't even come close to that of a CRT.
It's really quite amazing how much we acclimate to the inferior contrast of LCD displays. But if you ever run the output to both a CRT and LCD side-by-side ... it'll truly shock you what a difference there is.
I've yet to see an OLED PC monitor in person. Hopefully they're a lot closer.