
> If the point is to preserve their pixellation

The "point" of pixel art-- what makes it so convenient for graphicians, even amateur ones-- is not the blocky appearance (what you call "pixelated" - but in fact these icons did not appear "blocky" on the CRT screens that were in common use at the time!), but to set a uniform constraint on fine detail (and sometimes color depth) within the image, and then to maximize quality while staying within that constraint. It is perfectly consistent to want a means of rendering these images that preserves whatever level of detail was in the original while not introducing blocky artifacts.


The "not blocky CRT" argument doesn't really apply to personal computer CRT monitors, as even when they maxed out at 640x480 or 800x600 the display was still pretty crisp.

I can kind of see this argument if you're talking about playing Nintendo on mom's old dog-eared TV with the UHF adapter... but frankly I prefer to see pixel art in its original, unmolested, pixellated form.

edit: on that note, I remember very clearly that 320x240 games had a blocky appearance in the 640x480 era. That was one of the biggest reasons to get a 3D card!


I remember playing Master of Orion 2 on my computer at the time (mid 90s) and thinking:

1. it's amazing this game actually runs at 640x480

2. there's no point in having resolutions any higher than that, as you can't see the individual pixels at that size anyway (I had a 14" CRT, viewable area probably around 13").

At least in the early to mid 90s you definitely still had "CRT fuzz" on computer monitors.


I think the typical CRT fuzz is actually quite close to the optimal smoothing achievable with a simple analog system such as was common in the 1980s and early 1990s, in that it should closely approximate a Gaussian blur. But Lanczos (or hqNx) is crisper than that, of course.
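
As a rough sketch of that comparison (assuming Pillow is installed, with a hypothetical 320x240 "sprite.png" and an arbitrary blur radius standing in for the CRT fuzz):

    # "CRT-ish" smoothing: nearest-neighbor doubling followed by a Gaussian
    # blur, versus a crisper Lanczos (windowed-sinc) resample.
    from PIL import Image, ImageFilter

    src = Image.open("sprite.png")                 # hypothetical 320x240 pixel art
    target = (src.width * 2, src.height * 2)

    crt_like = (src.resize(target, Image.NEAREST)  # hard pixel edges first
                   .filter(ImageFilter.GaussianBlur(radius=0.8)))  # then soften

    lanczos = src.resize(target, Image.LANCZOS)    # noticeably crisper result

    crt_like.save("sprite_crt_like.png")
    lanczos.save("sprite_lanczos.png")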

(And yes, 320x240 did use 2x nearest neighbor interpolation on later video cards/monitors that could only display higher resolutions natively. But I assume that back in the early 1980s, you would actually get a "native" 320x240 screen, just like on a home computer or console.)
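For concreteness, here's a toy version of that 2x nearest-neighbor doubling on a made-up 4x4 framebuffer -- every pixel repeated twice horizontally and every row repeated twice vertically, which is all a line-doubled mode does:

    def double_2x(framebuffer):
        doubled = []
        for row in framebuffer:
            wide = [px for px in row for _ in range(2)]  # repeat each pixel
            doubled.append(wide)
            doubled.append(list(wide))                   # repeat each row
        return doubled

    frame = [[0, 1, 0, 1],
             [1, 0, 1, 0],
             [0, 1, 0, 1],
             [1, 0, 1, 0]]
    print(double_2x(frame))  # 8x8 result: same pattern, twice the size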


On any CRT screen, you'd just get 320x240 as the native resolution, the "interpolation" basically done by the phosphors of the screen. This was the norm well into the '90s, and not everybody was on an LCD monitor in the '00s, either.

I remember that many gamers (myself included) resisted LCDs for a long time even beyond that, precisely because they could only do one resolution well. If you played old games, this wasn't satisfactory because those were often hardcoded to the resolutions they supported -- typically 320x200 or 640x480. And if you played new games, you'd often have to dial the resolution down to get them running reasonably fast.


I think the point was that it was the video card that didn't support 320x240 natively, so it nearest-neighbor scaled it to 640x480.


Any VGA card (which you needed to get 640x480) would also support 320x240.


The same art style was used on portable consoles with LCD screens, and in line-doubled VGA modes with blocky pixels. Even pixel art designed for SD televisions and low-quality interconnects might not have been intended to be blurred: Chrono Trigger, for example, includes a pixel art typewriter in the starting room, where the keys are represented by a single-pixel checkerboard pattern that is easily made unrecognizable by blur. Some designers even showed blocky pixels in printed materials, e.g. the cover art and the instruction manual of Super Mario Bros. It's possible that some artists intended their pixel art to be blurred, but it's not universally true.
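
To make the typewriter point concrete, here's a small numeric sketch (using a plain 3x3 box blur as a stand-in for whatever smoothing is applied, on a made-up 6x6 checkerboard of 0/255 values): the single-pixel pattern collapses to near-uniform grey.

    def box_blur_3x3(img):
        h, w = len(img), len(img[0])
        out = [[0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                window = [img[(y + dy) % h][(x + dx) % w]
                          for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
                out[y][x] = sum(window) // len(window)
        return out

    checker = [[255 if (x + y) % 2 else 0 for x in range(6)] for y in range(6)]
    print(box_blur_3x3(checker))  # every value is 113 or 141: the checkerboard is gone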


If that's your argument, then you should use a CRT filter, not hq2x or Lanczos :)



