Sharp-Bilinear Shaders for Retroarch (github.com/rsn8887)
62 points by tosh on Feb 5, 2022 | 21 comments



There is a great Twitter account that compares raw "sharp pixel" renders of classic games side-by-side with how they actually looked running on CRT displays of the era.

https://twitter.com/CRTpixels

You start to realize that these artists not only crafted sprites specifically for the resolution and color constraints of the console, but also for the unique display characteristics of the CRTs that were used at the time.

Here are a couple of my favorites:

https://twitter.com/CRTpixels/status/1468423054145425410

https://twitter.com/CRTpixels/status/1443969255809200133


A couple more good ones:

https://twitter.com/CRTpixels/status/1461432490636234756

https://twitter.com/CRTpixels/status/1484553881329745925

Sharp pixels are fine for modern pixel art, but some of the old stuff looks much better on a CRT or with a CRT shader (and sometimes even NTSC color effects matter).


> some of the old stuff looks much better on a CRT or with a CRT shader (and sometimes even NTSC color effects matter).

Yes, I built a custom MAME cabinet around a 27-inch Wells-Gardner multi-scanning arcade CRT, with authentic arcade buttons, sticks, trackball and spinner, for this reason. Although some advanced shader-based emulators have recently been getting closer in some games, there's still usually a visible difference from a real CRT. Whether that difference matters to you in the games you play is up to you.

If you get a CRT for emulating classic arcade machines, look into GroovyMAME, a dedicated fork of MAME (kept in sync with upstream) with changes under the hood specifically to support accurate CRT display. Arcade machines often had different, non-standard resolutions, frame rates and scanning frequencies, which makes visually accurate emulation even more challenging than emulating home game consoles like Nintendo's. For accurate appearance, some arcade games require tweaking the config file to ensure the correct resolution and scanning frequency are sent to the CRT, but once you've set these, a multi-scanning CRT makes it possible to achieve essentially perfect emulation of a wide variety of raster CRT originals.
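
As a rough illustration of what such a tweak can look like, here is a hypothetical per-game override file. The resolution option is standard MAME; monitor is GroovyMAME/Switchres-specific, and exact option names vary between versions, so check the GroovyMAME docs before copying this:

    # ini/robotron.ini -- hypothetical per-game override
    # force the game's native resolution and refresh rate
    resolution       292x240@60
    # tell Switchres this is a 15 kHz arcade monitor
    monitor          arcade_15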

I certainly don't tweak the config file for each of the thousands of games emulated by MAME. For most, I just go with the defaults, which are usually close enough. However, for the few dozen games I played the most back in the day (a couple of which I actually owned cabinets of), it makes a noticeable difference to me.

There are dedicated scan converters to adapt HDMI to composite, S-Video or analog component, but they come with various quality and accuracy trade-offs. The best way to output analog to a CRT is instead to use an older graphics card with native analog VGA output (usually on a 15-pin connector). The last widely distributed PC graphics cards with native analog out shipped circa 2014, so many HN readers probably have a few kicking around in a parts bin. Emulation is a great way to put these older cards back into useful service: their 3D capabilities and memory sizes are inadequate for modern gaming but still exceed the requirements for emulating most '90s-and-earlier arcade machines and 16-bit era home consoles. I'm currently using a Radeon HD 3850 in my dedicated cabinet. With a fast enough CPU (circa 2016-17) and recent emulator improvements, most PS1 and even many PS2 and GameCube titles can now be fully emulated, all from parts-bin and thrift-store grade components.

People who see my MAME cab are often impressed by how great the games look when properly emulated and displayed through a quality analog signal chain. The best part is that new generations of players who've never seen these games are discovering just how fantastic they are. While modern AAA games certainly have spectacular visuals and scope, the harsh constraints imposed by 8- and 16-bit era hardware forced the best games to really nail the gameplay and feel in ways modern games rarely do.


CRTs are very different, absolutely. I also observe this sort of thing on the original Game Boy. Spaceship shooters frequently used white backgrounds with black stars, since the obvious inverse would be an indecipherable mess on the DMG's display. They also used flickering content to add more shades, exploiting the low-pass nature of the slow display response. Game developers and artists have always been great at getting the most out of the hardware, and I strongly believe that part of preservation has to be preserving the input & output devices.


Are any emulators set up to properly simulate these persistence effects? Seems to me that it would probably involve passing the last N frames to a shader to allow for time integration. I don't think that's how RetroArch works, but I could be wrong.


I've gotten really cool results in my graphics experiments by not quite clearing the screen on each frame, i.e. very slightly lowering the opacity of the rectangle with which the screen is cleared. It does, however, leave a subtle "stain" in places where the color doesn't fade all the way out; perhaps it would work better in combination with subtracting 1 from all RGB values.
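
A minimal sketch of that combined approach as a plain GLSL fragment pass (sampler names made up for illustration): multiply the previous accumulation buffer by a decay factor, subtract 1/255 so trails actually reach zero, then take the max with the current frame.

    #version 330 core
    in vec2 uv;
    out vec4 fragColor;
    uniform sampler2D prevAccum;     // accumulation buffer from last frame
    uniform sampler2D currentFrame;  // freshly rendered frame

    void main() {
        vec3 prev = texture(prevAccum, uv).rgb;
        vec3 cur  = texture(currentFrame, uv).rgb;
        // Exponential decay plus a small linear term; the subtraction
        // prevents the faint "stain" that pure multiplication leaves.
        vec3 faded = max(prev * 0.90 - 1.0 / 255.0, vec3(0.0));
        fragColor  = vec4(max(cur, faded), 1.0);
    }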


I'm not sure if you have access to previous frames in RA shaders. You get a frame index at least, and maybe there's some option or way to feed in old framebuffers; the docs are a bit lacking.

But yes, RetroArch emulation cores, the Analogue FPGA systems and IIRC even replacement IPS screens for Game Boys address this particular issue. I can't really tell you how good a job they do; I mostly play GB(C) on my unmodded GBC. But I also run GB games on my CRT emulation setup through RA, and I've definitely played some of the problematic games that use the slow display response for additional shades or fake transparency, and I can't say I noticed anything wrong. There are options to control frame blending etc. in the emulator-specific options, but I haven't spent any time tweaking them.
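
For what it's worth, RetroArch's slang shader spec does expose a few previous input frames as OriginalHistory# textures. A fragment-stage sketch of naive response-time blending follows; the vertex stage, #pragma markers, preset plumbing and exact binding layout are omitted or simplified here, so verify against the spec before relying on it:

    #version 450
    layout(location = 0) in vec2 vTexCoord;
    layout(location = 0) out vec4 FragColor;
    layout(binding = 1) uniform sampler2D Source;            // current frame
    layout(binding = 2) uniform sampler2D OriginalHistory1;  // previous frame

    void main()
    {
        vec4 cur  = texture(Source, vTexCoord);
        vec4 prev = texture(OriginalHistory1, vTexCoord);
        // Move only partway toward the new frame, like a slow LCD.
        FragColor = mix(prev, cur, 0.5);
    }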


I really like shaders with halation, as they seem to emulate the CRT effect pretty well; it sorta makes whites glow.

I'm constantly switching back and forth between emulators and real consoles through an OSSC (an upscaler device). No doubt CRTs look better for these games.

However, if you're stuck with an LCD, N64 and PlayStation games look better on an emulator than through a hardware upscaler.


Many of the things emulators do are also available on original hardware, though. There are patches that disable dithering in PS1 games. You can get rid of the infamous N64 blur through a combination of hardware mods and software patches. The N64 will of course always be low-res and low-FPS and have its low-quality bilinear filtering, but I'm not sure changing all of these really makes things better. For instance, content is often authored with a target resolution in mind, and the typical 240p game does not really look good at 4K with perfect texture filtering. Also, the scanline settings on the OSSC never really looked good or interesting to me, but the CRT settings on the RetroTink 5X seem quite awesome: they've got actual shadow mask / aperture grille emulation and different settings for high-TVL PVMs vs. consumer tubes, etc.


Ha! Coincidentally, I'm fiddling with RGB-mod deblur right now; I think I'm blind, as I can't see a difference. A lot of this is personal preference. I tried playing GoldenEye through my OSSC; honestly, to me it's a better game on an emulator: it runs at 60fps, and I can appreciate the draw distance they achieved, since I can make out the baddies instead of a jumble of pixels.

Same with Wipeout 2097 on a PS1 emulator.

However, I've noticed that playing 1080° Snowboarding on real hardware is way, way more responsive, so it's a tough call. Also, emulators have bugs, like Mario's triple jump not working, etc. It's all trade-offs, really.


Through an OSSC you should see a difference! :-)

There's no fundamental reason emulators need to have lots of lag. Often the culprit is a controller or controller converter with inherent input lag. Your video/audio output configuration (triple buffering etc.) can also introduce lag, and of course so can your choice of display. There are also features like run-ahead and frame delay, unique to software emulation, that can even reduce latency below original hardware or FPGA recreations in some circumstances.
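
For the curious, a sketch of the relevant retroarch.cfg entries; these option names are from memory and defaults change between builds, so double-check them against your version:

    # Hide internal core lag by emulating ahead and rolling back
    run_ahead_enabled = "true"
    run_ahead_frames = "1"
    # Delay (in ms) after vsync before the core runs
    video_frame_delay = "8"
    # Keep the GPU driver from queueing extra frames (GL)
    video_hard_sync = "true"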

Many systems have basically perfect emulators. For some, we even have emulators with no known deviations whatsoever (pretty sure bsnes/higan are at that stage). I think N64/PS1 are not quite there yet, but there's certainly nothing game-breaking like a triple jump not working.

I think it's fun to just keep trying different setups. Playing retro games on a giant 2020 TV can be just as fun as using an ancient CRT. Emulation can allow for fun things like higher resolution, endless controller options and rewind/save-state, but original hardware also has its charms.


If 15 kHz / consumer CRTs are more your thing, I've written a shader for RetroArch to get things looking proper on those types of displays:

https://github.com/blitzcode/crt-240p-scale-shader/

For what it's worth, I'm still using this pretty much daily a year after I wrote it. Experiencing your favorite 'content' on different displays, speakers, formats etc. is like a cheat code for getting excited about the same old stuff again.


This is very neat! It looks like a spatial version of a popular technique MPV uses for temporal interpolation. [0]

As other commenters mentioned, "perfect" scaling is all personal preference. In my experience it does a good job with the usual judder of 24fps content on a 60Hz monitor, though it's never as impressive as the smooth motion of 60Hz video on a strobed 60Hz display.

[0] https://github.com/mpv-player/mpv/wiki/Interpolation#smoothm...


I don't know why this strategy wasn't the default option for emulators over the past couple of decades, given that typical display resolutions have been greater than 2x the emulated pixel field in each dimension.
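
The core of the strategy is small, too. Conceptually it's a nearest-neighbor prescale by the largest integer factor that fits, followed by bilinear filtering the rest of the way; a single pass can achieve the same result by warping texel coordinates so blending only happens in a thin band at each pixel border. A simplified GLSL sketch along those lines (uniform names invented for illustration):

    #version 330 core
    in vec2 uv;
    out vec4 fragColor;
    uniform sampler2D source;  // must be sampled with bilinear (GL_LINEAR) filtering
    uniform vec2 sourceSize;   // e.g. 256x224
    uniform vec2 outputSize;   // e.g. 1920x1080

    void main() {
        vec2 texel  = uv * sourceSize;
        vec2 scale  = max(floor(outputSize / sourceSize), vec2(1.0));
        vec2 center = fract(texel) - 0.5;
        vec2 range  = 0.5 - 0.5 / scale;  // flat core of each scaled pixel
        // Remap so only a one-output-pixel-wide border blends neighbors.
        vec2 f = (center - clamp(center, -range, range)) * scale + 0.5;
        fragColor = texture(source, (floor(texel) + f) / sourceSize);
    }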


Having followed the retro gaming community for a long time, I think there's often just a small and sometimes deeply unpopular minority that pushes for doing things the 'correct' / maximum-quality / authentic way, while most people just don't care, don't notice, or want to keep doing things the wrong way for whatever (maybe even perfectly valid) reasons. I think most people just don't care about shimmering, a bit of extra softness, windowing 960p (4x 240p) in a 1080p frame, using the wrong 8:7 aspect ratio for 256x224, or just stretching it all to 16:9 (shudder). Think of the motion smoothing + vivid TV preset crowd. Also, a lot of things just went unnoticed for a long time; until recently, the RGB palette/colors for the PC Engine / TurboGrafx-16 console were rather wrong in both emulators and RGB-modified original hardware.

Thankfully, these days there is a community of emulator authors, console modders, FPGA developers, shader writers, YouTube creators and preservationists who have really dug deep into these kinds of issues, and more and more implementations get it right out of the box.


It's really neat when the big boys end up doing it right in an accessible way. Nintendo's Virtual Console (VC) releases on the Wii support a lovely 240p over composite/S-Video/component. This makes it easy for novices to get a clear image on a consumer 480i TV without much fiddling.

Sadly, not every VC game supports this feature. It's also hidden behind an undocumented button ritual to enable it the first time.


It's unfortunately rare that the big companies get these things right. Their efforts are mostly amateurish compared to the community-provided solutions. Just look at the recently added N64 games on Switch, with high input lag and plenty of graphical issues. Or the notoriously poor Game Boy Player software for the GameCube, which now fortunately has a far superior community-supplied rewrite. The Wii Virtual Console was decent, but remember how dark all the games on it looked? And then of course Nintendo made things worse on the Wii U. I think Nintendo actually knows less about their own hardware and software than the community that develops emulators, FPGA cores, upscalers etc. I'd rather give my money to these people than rent or buy the same poorly emulated 20 games from Nintendo again and again. But Raspberry Pis and MiSTers aren't exactly newbie-friendly plug-and-play solutions, and the stuff from Analogue is too pricey, always sold out, and requires original cartridges out of the box. It's not really clear what a good recommendation for the non-technical crowd is.


I'm still not sure why the NES Virtual Console palette was so dark. My best guess is that it was trying to stay out of the automatic-brightness-limiter range that CRT TVs at the time had, so that the dynamic range between levels was preserved. Oddly, none of the other VC emulators had that issue, so maybe it was just incompetence.

The 3DS VC improved things a bit, with colour palettes tuned for each game, though it brought its own weirdness.

For pretty CRT gaming, I still recommend the Wii. It's straightforward to hack and install RetroArch on, and you can benefit from the component video out with many cheap and nice TVs. RetroArch can still be a bear to configure for novices, and the Wii can be a tad underpowered, but the good IO makes up for it. It also has oodles of plug-and-play controller options.

Sadly, there aren't too many other ways to take advantage of 240p Y/Pb/Pr on consumer TVs with emulation.


I'm not aware of a single CRT TV or monitor that had a brightness-limiter circuit or anything like it. CRT TVs never limit brightness; in fact, most will quite happily burn themselves out if they suffer a vertical collapse or similar defect. There's an overload circuit, but that's for defects, not bright image content. And if such a limiter existed, it would of course also affect all the original consoles.

The OG Xbox is also an option for hacking. I think the Wii is a great platform since it's relatively cheap, but following the hacking guide correctly (https://wii.guide/) and configuring RetroArch is so complicated that you might as well get a more powerful solution. A Raspberry Pi with one of the various analog output options plus CRTPi (or my more minimalist shader solution), or a MiSTer with the analog board / direct video, are really good.


Since everyone is here for this: what are y'all's go-to setups for, e.g., a Raspberry Pi (4) running RetroArch with a regular HDTV?


I personally still like the RetroPie setup the best. It has some quirks, but I think it's mostly a sensibly configured solution and there's plenty of community knowledge / help out there. Maybe it's just familiarity bias, but after many hours of poking around in it and tweaking stuff I can't say I dislike it.

IMHO, the worst part of the out-of-the-box experience with RetroPie is that everything is configured for maximum performance, which kinda means worst latency. The input lag is astronomical if you're used to original hardware / FPGA emulation / CRTs or zero-lag scalers etc. You can get it to very acceptable levels for many systems, but you have to know which settings to change.

Here are the notes I took when configuring and tuning my setup:

https://github.com/blitzcode/retropie-setup-notes/blob/maste...

They're for a Pi 3B that's set up for output on a CRT TV, but like 90% of it should apply to a Pi 4 on an HDTV.

Shaders are always a personal-taste thing. Those posted here certainly get the "as sharp as possible without shimmering, blurring, borders or wrong aspect ratio" look right, but there are other options if you want the CRT look.




