What the death of the CRT display technology means for classic arcade machines (venturebeat.com)
179 points by zdw on March 6, 2017 | 160 comments



I've been working on an existing Apple II emulator, OpenEmulator, which takes an interesting (and as far as I know, unique in the Apple II emulator world) approach: it renders a fairly raw NTSC-like signal, and then uses GPU shaders to convert to colors, while adding fairly realistic CRT effects, like persistence, curvature, dot masks, etc.
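
The core of the color decoding is plain quadrature demodulation of the NTSC chroma subcarrier. A minimal numpy sketch of the idea (the constants are nominal NTSC values; the function name and the composite input are made up for illustration, and the real emulator does this per-pixel in GPU shaders):

  import numpy as np

  FSC = 315e6 / 88         # NTSC color subcarrier, ~3.579545 MHz
  SAMPLE_RATE = 4 * FSC    # a common choice: 4 samples per subcarrier cycle

  def decode_scanline(composite, phase0=0.0):
      """Recover luma (Y) and chroma (I, Q) from one scanline of
      composite samples via quadrature demodulation + low-pass filtering."""
      t = np.arange(len(composite)) / SAMPLE_RATE
      carrier = 2 * np.pi * FSC * t + phase0  # reference locked to the colorburst
      # Demodulate: multiply by the in-phase and quadrature carriers...
      i_raw = composite * np.cos(carrier)
      q_raw = composite * np.sin(carrier)
      # ...then low-pass away the 2*FSC images. An 8-tap boxcar at 4x FSC
      # spans exactly two subcarrier cycles, so it also nulls the carrier.
      lp = np.ones(8) / 8
      y = np.convolve(composite, lp, mode="same")   # luma
      i = 2 * np.convolve(i_raw, lp, mode="same")   # chroma axes
      q = 2 * np.convolve(q_raw, lp, mode="same")
      return y, i, q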

The result is an emulation that is incredibly close to what I see when I walk downstairs and run my test code on the real Apple IIe in my basement. I guess that's what happens when a DSP person writes an emulator :-) (The original author has, unfortunately, disappeared from the internet.)


Relatedly, if anyone hasn't read it, Kyle Pittman has a lovely rundown of his attempts at emulating various CRT effects digitally during development of the indie game Super Win the Game: http://www.gamasutra.com/blogs/KylePittman/20150420/241442/C...

Of course, most people (Pittman included) felt that making the visual emulation too accurate was annoying and/or unbelievable. :P


I made an attempt at this around 2001, as part of the XScreensaver collection. See video at https://youtu.be/p3QZqhp67l8?t=30s. It generates an NTSC signal, adds noise, and recovers the H and V sync and color in a DSP implementation following the schematic of an old TV. It reproduces the color fringing of text and many other cherished artifacts of Apple ][ computing.


Awesome! I've actually seen people reference that as the most accurate emulation they'd ever seen. Perhaps we should add noise to the OpenEmulator one too! :-)

[Edit] The bending and the shadowing to the right of the characters are amazing :-)


It reminds me a lot of when I plugged my Spectrum +3 into my small TV via composite video as a child.


Oh, one more note: I do not believe it would work with a light gun.


Depending on the latencies and frame drawing speeds, a Zapper-type gun ("show a light on the screen, poll whether the gun sees it") might work, but probably not. A Super Scope-type (going off hblanks) definitely would not.


There's an Electron/VIC20/2600/Oric emulator that does this too: https://github.com/TomHarte/CLK - looks authentically horrid and it's got that somewhat elusive 50Hz feel.

(Not many 50Hz Electron games, though! Firetrack is one.)


It's pretty common in the world of emulators; it's just that a lot of people don't care and choose not to use the plugins for it (I'm one of those people).


What this article doesn't touch on, but which I have found to be an issue with classic consoles on non-CRT screens, is:

1. Light guns

Even on consoles as late as the Sega Saturn, light guns were designed around the timings of CRT refresh rates, and thus many (albeit not all) LCD screens are not compatible. I would imagine this could be a real issue for collectors/restorers of cabinets, as shooting games (e.g. Virtua Cop) were quite popular in arcades.

2. Early 3D systems, such as the SegaScope 3-D for the Sega Master System and the Famicom 3D System for the Famicom (Japanese NES)

These devices are less common and, as far as I'm aware, were only available on home consoles. For those of you who don't know what I'm talking about: active shutter glasses (i.e. they black out one eye at a time), synchronized to the display's refresh. At 50/60Hz (depending on your region) each eye effectively sees only 25/30 updates per second, so the effect can be a little unpleasant. But it's also pretty amazing to play when you consider the devices are approximately 30 years old.


There was an interesting post on hackaday a while ago about getting this to work using a Wiimote https://hackaday.com/2016/08/30/tricking-duck-hunt-to-see-a-...


Wonderful!


It would seem that if you're modifying the cabinet to use a different type of monitor, you could also modify the gun. Is there some kind of light gun-like technology that does work on LCDs?


The issue is that you'd also have to modify the software. A traditional lightgun can't tell the game where it recorded a hit, only when. It's up to the software to translate the timing information it got from the lightgun into a position on the screen. It accomplishes this by being synchronized with the electron beam of the CRT and thus having an accurate model for the position of the beam over time.
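
A rough sketch of that translation, using nominal NTSC timing (a real board derives these values from its own video circuit, so treat all the numbers as illustrative):

  # Nominal NTSC-ish timing for the 240p-style signals most games used
  LINE_PERIOD_US = 63.556   # one scanline at ~15.734 kHz
  HBLANK_US = 10.9          # sync + blanking before active video
  ACTIVE_US = LINE_PERIOD_US - HBLANK_US

  def beam_position(us_since_vsync, hres=256):
      """Map 'time since top of frame' to an (x, y) in game pixels."""
      line, t_in_line = divmod(us_since_vsync, LINE_PERIOD_US)
      if t_in_line < HBLANK_US:
          return None       # the beam is inside horizontal blanking
      x = int((t_in_line - HBLANK_US) / ACTIVE_US * hres)
      return (x, int(line))

So when the gun's photodiode fires, the game grabs the elapsed time, runs something like this, and compares the result against its target positions.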


VLSI CRTCs like the 6845 had a light pen input and a register pair that captured the character position at the moment it fired. You could read that and translate it; it was a small matter of integration to make the input work. The chip was meant for text display, but the PC and other systems used it for pixel-based displays too.
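
In code the whole thing is a few lines. A sketch (R16/R17 really are the 6845's light pen latches; the read_crtc bus helper and the geometry are hypothetical):

  def light_pen_to_xy(read_crtc, screen_cols=80):
      """Translate the 6845's latched light pen address to column/row.
      read_crtc(n) selects CRTC register n and reads it back; R16/R17
      hold the refresh address latched when the light pen input fired.
      (A real driver would first subtract the display start address
      held in R12/R13.)"""
      addr = (read_crtc(16) << 8) | read_crtc(17)   # LPEN high/low
      return addr % screen_cols, addr // screen_cols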


WRT the light guns, I never quite understood why you couldn't somehow simulate this with LCDs. Isn't the refresh rate on LCDs higher? So couldn't you work around that somehow? Obviously the answer is probably not, since no one (that I know of) has done it, but I was genuinely curious what precisely the technical limitation is.


The way light guns work, they rely on the fact that a white frame is not displayed all at once but "drawn" by the CRT beam: if you flash a single white frame, one location will flash white slightly later than another, and you can infer the location from the timing.

LCDs, well, change the whole frame at once. The issue isn't with the frame rate but with what happens while the device is drawing a single frame.


But following the same logic as GP, if you have a higher refresh rate, can't you just split the frame into several frames where you simulate that refresh scan as an animation?


Not in practice. To completely simulate the CRT refresh, you'd need to flash each pixel separately along the electron gun path. If your LCD has 1,000,000 pixels, you'd need 1,000,000 times faster refresh rate to do that.
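
The arithmetic, spelled out (assuming a 1-megapixel panel refreshed 60 times a second):

  pixels, fps = 1_000_000, 60
  # flashing one pixel per whole-panel update means one update per pixel per frame
  updates_per_second = pixels * fps
  print(updates_per_second)   # 60,000,000 panel updates/s, vs. 60 today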


>LCDs, well, change the whole frame at once.

Is this true? It doesn't seem like it is, if it were then we'd not experience tearing in games, etc.


Tearing is a property of the output from the GPU. The data takes time to send, and the GPU can be directed to send different data halfway through the process (or the memory it's in the process of sending can be modified, etc.). The display just shows what it's sent... it shows it as it receives it (CRT), or it buffers it up and draws it later (LCD), but if there's a tear, you'll see it just the same.


There may be intermediate buffering, but the panel is still updated in a scanning fashion.


Screen tearing is, IIRC, the GPU buffer being only partially overwritten with the next frame when it's sent to the screen.


Tearing is, IIRC, a product of only part of the next frame being rendered when the refresh happens.


Obviously not, there is no memory inside the glass, and bandwidth is limited. LCD screens still scan in.

The difference is that an LCD keeps the picture until told otherwise, so the whole picture is emitting light. A CRT has limited phosphor persistence; effectively only one point/pixel is brightly lit at any given time.

https://www.youtube.com/watch?v=nCHgmCxGEzY


The latency on LCDs is much, much higher, often as much as several frames. This could be removed, but only at the cost of disabling all the manufacturer's image post-processing, including upscaling.

I have a "100Hz" CRT television that won't support light guns because the manufacturers decided they'd put this kind of image-"improving" technology in.

There are actually two techniques for light gun operation: one is "racing the beam" (unreliable in the presence of high-persistence phosphors), and the other is to draw a single frame of white squares for targets. https://en.wikipedia.org/wiki/NES_Zapper#Technical_details
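
The second technique is simple enough to sketch in a few lines (per the Wikipedia page above; the drawing and photodiode helpers here are illustrative):

  def zapper_hit_test(targets, draw_frame, photodiode_sees_light):
      """One-frame-per-target hit detection, NES Zapper style."""
      draw_frame(white_boxes=[])          # frame 0: all black
      if photodiode_sees_light():
          return None                     # pointed at a lamp/window: reject
      for target in targets:              # then one white box per frame
          draw_frame(white_boxes=[target])
          if photodiode_sees_light():
              return target               # first lit frame names the target
      return None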


I remember buying a PlayStation 2 light gun when the console was already in its "sunset period", and when I got home I found out it wouldn't work on my TV. My TV was a late CRT model from before flat-screen LCDs took over, and this particular CRT had a 100Hz refresh rate.


LCDs cannot simulate this because their latency is too high. By the time the pattern is on the screen the lightgun has already recorded a miss.


Later than that; PS2 and Dreamcast.


Have you guys seen Musee Mecanique at Fisherman's Wharf, San Francisco [1]? It's a collection of older-than-old slot machines and games; I want to say I've seen some 100-year-old ones. But the point is: there's Heavyweight Champ-style boxing, there are all sorts of sports games, and even a mechanical road runner [2].

Gameplay obviously changes with technology, but for some games the core idea has survived some heavy decades of transitions. The originals would probably be less engaging for today's kids, so it's only fair to ship them a 3D/VR/mobile version, and to keep our nostalgia from trying to sell the mechanical road runner to every arcade. CRTs will phase out in favour of everything else, and that should be OK.

[1] http://museemecaniquesf.com

[2] https://www.facebook.com/museemecaniquesf/videos/vb.51627194...


This is my thinking as well. Hamfisted attempts to make LCDs look like CRTs by artificially curving the screen and messing with colors are just nostalgia. It doesn't add anything to the game, and 1980s designers were most likely pissed they had to deal with limitations like color distortion, curvature, ghosting, and burn-in. Pac-Man isn't better on a shittier screen. Pac-Man succeeds because it had novel game mechanics for 1982.

Outside of lightguns not working, this seems like a non-issue to me. I'm sure the emulation community can find a workaround for the few people actually using lightguns under emulation.

>Have you guys seen Musee Mecanique

I live in Chicago and have access to all sorts of barcades, as well as one of the largest arcades (if not the largest) in a nearby suburb, so I'm pretty well versed in the arcade experience today as well as from my childhood. I went to Musee a few years ago expecting an experience like that, but instead the game floor was mostly near-turn-of-the-century mechanical games. I was fairly disappointed; from a gameplay or mechanical perspective they were kinda terrible and largely uninteresting. They have historic value, but that's about it, and they're probably better off lovingly maintained in a proper museum than beaten up for quarters. I think in 20-30 years this is exactly how CRTs will be seen. Kids will marvel at how terrible the low resolution, framerate, color accuracy, and ghosting were.


> This is my thinking as well. Hamfisted attempts to make LCDs look like CRTs by artificially curving the screen and messing with colors are just nostalgia. It doesn't add anything to the game, and 1980s designers were most likely pissed they had to deal with limitations like color distortion, curvature, ghosting, and burn-in. Pac-Man isn't better on a shittier screen. Pac-Man succeeds because it had novel game mechanics for 1982.

I mostly agree, though I think it gets a little muddier when you consider that the artists of the day — while doubtlessly wishing they had access to "more perfect" displays — were often designing their art to take advantage of those distortions to get effects they wanted. So, if part of why we're doing emulation is for history's sake, we have to keep that in mind. The painters of ancient Roman frescoes probably wanted more vibrant pigments than they had and it might even be interesting and attractive to "enhance" one with modern pigments, but it's ahistorical.

That being said, I've never personally found added distortion in emulators to improve the experience. I'm constantly aware of its artificiality, and that far outweighs any potential gains in "authenticity".


Or the (fortunately no longer common) trend in the 1990s of "colorizing" black and white movies under the misconception that this would make them better, ignoring that people filming in black and white took advantage of the medium for things like shadows that actually worked better there than in color.


I'd like to see CRTs maintained as long as possible but when I see the rebuilt machines in the barcades (was just at Headquarters in Chicago last night) using LCD screens I'm usually very satisfied with them - even preferring them. Occasionally I play one with unacceptable lag, but I suppose that has something to do with some suboptimal scaler or converter being used. Generally it's a great looking and playing experience.


> Pac-Man succeeds because it had novel game mechanics for 1982.

Pac-Man was released in 1980, not 1982.


Yes! I spent a happy morning there last time I was in the city. That old submarine is right outside the front door. There were some very cool old arcade games in there, but some of the mechanical ones were incredible, like the baseball game.


My wife had never really been in a real arcade till she and I visited that very same one, and we were both blown away!


They don't really say how large the market might be. Or if there are patents that would block a startup.

I'd be willing to bet that you could find or train people in Detroit right now to build these CRTs. All you'd need are a couple of people to train them for a few months.

They said the same thing about vinyl records: soon the industry would die. Instead, Detroit has become a hub for pressing records, and as old machines become scarce, they're now supporting a company making new machines.

http://www.rollingstone.com/music/features/inside-jack-white...


I would guess that up until now the biggest thing preventing anyone from investing in small-scale CRT manufacturing is the simple fact that there was an inventory glut that covered the world demand for CRTs for nearly a decade after the last one was made. Now that we're apparently getting close to the point where there's actual demand for new CRTs, we can find out whether it's something you can build a business around.


> there was an inventory glut that covered the world demand for CRTs for nearly a decade after the last one was made.

I collect and restore analogue oscilloscopes, and even today you can still get original NOS scope tubes for many models that ceased production 30, 40, 50, even 60 years ago.


But is there really even a small cohort of CRT nostalgics out there? I mean, I get the records: the sound is unique and music is absolutely ubiquitous. Arcade gaming is a much smaller market, and I find it hard to imagine that there are enough CRT purists out there to warrant a business.

Especially considering a nice new OLED could be fashioned to look like a CRT.


I use a CRT as my main screen. Some reasons for it:

1. I can choose any resolution without blurring... a problem that's now worse with 4K screens (lots of angry posts on the Stellaris forums about a UI that is hard to read at 4K and becomes blurry if you lower the resolution...)

2. Ridiculous response speed... playing Crypt of the NecroDancer or Super Hexagon is much, much more fun on a CRT; every mistake feels like pure lack of skill instead of like something unfair happened. Ditto for FPSes... the CS community is in fact hogging all the CRTs; the ones I knew of for sale all ended up in the homes of CS players willing to fudge settings to get crazy refresh rates and response speeds (there is even a video of a guy pulling 200+ fps by using interlaced mode!)

3. Better blacks and whites... in fact, this was the main reason. When CRTs died out I abandoned them too, but every screen I had afterwards had colour or contrast problems and I had to fiddle with settings constantly. One day my screen broke and a CRT was available nearby; when I plugged it in, I was blown away by image quality that required only geometry settings, no endless messing with gamma, colour profiles and whatnot.

4. Not valid anymore, but until recently CRTs were much cheaper than flat panels of equivalent quality, at least in my country... now, with the "new" stock empty and used prices rising, I even own a CRT TV that costs more on eBay today than it did when it was new... (this particular model is also the one used as a reference for some SNES emulators' CRT "filters")


For what it's worth, color CRTs do have a kind of native resolution: the RGB phosphor pattern used to show colors.

See: https://www.reddit.com/r/askscience/comments/3a2tx6/do_old_c...


You can get all sorts of Sony Trinitrons off sites like Craigslist for free... just turn up and haul them away.


And a CRT is just a giant light bulb, bombarding you with not-so-good radiation. Add to this the fact that the corners are never going to be perfectly square. I get the attraction of some of the superior qualities of the CRT, but once a technology has been overtaken by a superior one, there is little sense in looking back.


> And a CRT is just a giant light bulb, bombarding you with not-so-good radiation.

In other words, you do not know how a CRT works.

Educate yerself, it's free: https://en.wikipedia.org/wiki/Cathode_ray_tube


The article is a poor source for your argument, because it does argue at length about the health risks of X-rays emitted by CRTs.

It's more helpful to put it into perspective with other radiation sources:

https://upload.wikimedia.org/wikipedia/commons/2/20/Radiatio...

IOW, the radiation dose from using a CRT for a year is comparable to that from eating ten bananas.


I don't see the "argue at length about the health risks of X-rays emitted" in the article. It discusses them and concludes that they are widely considered safe.

> CRTs can emit a small amount of X-ray radiation as a result of the electron beam's bombardment of the shadow mask/aperture grille and phosphors. The amount of radiation escaping the front of the monitor is widely considered not to be harmful. The Food and Drug Administration regulations in 21 C.F.R. 1020.10 are used to strictly limit, for instance, television receivers to 0.5 milliroentgens per hour (mR/h) (0.13 µC/(kg·h) or 36 pA/kg) at a distance of 5 cm (2 in) from any external surface; since 2007, most CRTs have emissions that fall well below this limit.


> Especially considering a nice new OLED could be fashioned to look like a CRT.

Except for all the old cabinets that just won't work with it I guess.


Ah, but it would be less "vibrant" and "nostalgic", according to the article.


vinyl : digital audio :: CRT : LCD


That's an interesting comparison, particularly now. Until the last few years CRTs certainly had better image quality than the alternatives.


And vinyl forces better mastering, whereas digital audio allows loud, compressed sound...


Vinyl, the Nokia 3310, vintage clothing, furniture. The whole Etsy phenomenon... Nostalgia and older stuff is a huge market. Just look at vintage gaming in general: you still cannot buy an NES Classic Edition for less than double the recommended retail price. Also, CRTs have 'something' that no panel will ever have: the curve, the thick glass edges, the static buildup, the scan error, the whole heavy tactility, the legendary white noise. And as Gen X ages, they'll get nostalgic, and willing to pay. I'd invest.


If you are seriously considering going into this business, consider also the possibility that somebody will figure out a way to make an LCD screen that is acceptable for this use case once the CRTs are no longer available. Light-guns are likely gone no matter what, or at least going to be inconsistent and hard to use, but an LCD screen explicitly designed to emulate a CRT may be able to fill in. LCD screens not explicitly designed for this use case are getting down to ~25ms latencies, which is getting to within spitting distance of one frame.

It seems to me there's no fundamental reason that an LCD can't be designed nowadays to paint the screen almost identically to the way a CRT did it, complete with an emulation of their "flaws" (with at least a line or two of buffering to pull that off). The fact that older LCDs use an internal framebuffer and are locked to a certain refresh rate seems like an artifact of the older technology rather than a fundamental limitation. Just as G-Sync and friends can start playing with the refresh rate it seems like nothing would prevent you from designing an LCD controller that doesn't wait for an entire frame but just starts painting the NTSC or similar signal.
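
Concretely, the controller architecture I'm describing is roughly this (a sketch; every piece of hardware here is hypothetical):

  def scanout(decode_next_scanline, drive_panel_row):
      """'Race the signal' LCD controller: buffer only the current line
      and push it to the panel's row drivers as soon as it's decoded,
      instead of accumulating a whole frame first."""
      row = 0
      while True:
          line = decode_next_scanline()   # blocks ~63.5us on an NTSC input
          if line is None:                # vertical sync: wrap to the top
              row = 0
              continue
          drive_panel_row(row, line)      # latency ~ one scanline, not one frame
          row += 1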

If I'm wrong, I eagerly await informative corrections. But bear in mind I'm aware that I'm calling for fairly significantly different supporting electronics behind the panel, that work on a fundamentally different architecture than current LCD panels. I do realize this is a non-trivial modification.

You'd have to run numbers but it's possible this is more cost-effective than trying to source all the CRT parts and/or run your own manufacturing line; this approach would involve creating custom electronics (feasible even for a startup) and then sticking them on commodity screens, which is almost certainly much cheaper than trying to make CRTs.

Edit: Apparently I have gored some sort of sacred ox or something? I'm getting downmodded for this? Really? I'm talking about deeply redesigning how LCD screens work, so your current issues with LCD screens don't apply any longer. There are some things about how LCDs work now that are historical accident, not fundamental to the technology.


25ms is still several orders of magnitude too slow for a lightgun. It's important to remember that ~16ms is the approximate time for a CRT to paint an entire frame at 60Hz; a lightgun is way, way faster than that. It needs to determine the x and y position of the user's aim, which means resolving timing finer than a single scanline: in NTSC a full line takes about 63.5µs, and the horizontal sync pulse itself is only ~4.7µs. I highly doubt you can get an LCD to refresh that quickly, since the process involves the physical twisting of liquid crystals in response to a change in the electric field.


"Light-guns are likely gone no matter what, or at least going to be inconsistent and hard to use,"

If I am correct about how the LCDs could be redesigned to work, I can imagine a world where you're paying about 1ms + the actual liquid crystal latency for the display which can apparently also be in low-single-digit milliseconds, which is probably good enough even for frame-perfect fighting games, but you're still going to have a screen that is fundamentally persistent rather than a scanning flash of light. (Possibly 3 or 4 ms if you're going to do some basic CRT emulation. Such emulation wouldn't be able to do full screen distortions induced by too much brightness, but even in the CRT era that was considered a failure, not a feature. It should be able to do slight persistence and local blurring.)


Modern 144Hz LCDs are getting close to sub 1ms latency and we have 200Hz LCDs coming out.


> sub 1ms latency

Which is still several orders of magnitude slower than what is needed to simulate a scanning electron beam in order to make lightguns work.


This is assuming that no one will just make a new lightgun that works with LCDs.


The discussion is in the context of playing classic arcade games. A new lightgun would not work with these games.


Couldn't light guns be emulated? Use something Wiimote-ish, and then when the trigger is pulled, wait the requisite time after the start of the next frame to register the hit.
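
Something like this sketch, assuming a pointer that reports screen coordinates directly (all the names are illustrative, and the emulator would need a hook for injecting the light pulse):

  def emulated_lightgun(game, pointer, wait_for_vsync):
      """On trigger pull, read (x, y) from the pointer, then assert the
      gun's 'light seen' input at the moment the emulated beam would be
      crossing that pixel in the next frame."""
      while True:
          if pointer.trigger_pulled():
              x, y = pointer.position()
              wait_for_vsync()
              game.schedule_light_pulse(at_scanline=y, at_dot=x)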


Rather than actually fabricating new tubes, I think a smarter approach would be to make a conversion circuit, with adjustments and connectors, that could revive any/most CRTs. You could find any non-punctured CRT, gut all the electronics, replace them with the add-on board, and fire it up again. The new circuit would have HDMI or breakout boards for any interface you want: IR, WiFi, audio, etc.


CRT production is a much more complex bit of modern manufacturing than making vinyl LPs.

This YouTube video documents what a rather dedicated individual does to resurrect small-scale Nixie tube manufacturing: https://www.youtube.com/watch?v=wxL4ElboiuA And technologically, color CRTs are way beyond that: https://en.wikipedia.org/wiki/Cathode_ray_tube#Color_CRTs


Rebuilding CRTs is possible at the small shop level.[1] There's still a commercial service that does it. "We are your obsolescence solution."[2] It's not cheap, and most customers are rebuilding tubes for military systems that still use CRTs. This is likely to be too expensive for video game restoration.

[1] http://www.earlytelevision.org/crt_rebuild.html

[2] https://www.thomaselectronics.com/repair-overhaul/


There are also a handful of "rejuvenation" processes. For example:

- Filament-cathode shorts can be burned out (by discharging a capacitor across the short)

- Cathode poisoning can be reversed (e.g. http://www.ke5fx.com/crt.html)


> Or if there are patents that would block a startup.

Given that patents usually expire after twenty years, it seems very unlikely there would be anything still valid for CRT manufacturing, at least for these arcade machines from the 80s. (There may still be a few flat-CRT / WEGA patents and the like from the mid 90s, but they won't be valid for much longer either.)


In the Detroit area you can get most arcade games repaired by the guys at BigToys near 14 mile and Mound Rd. I am not aware of anyone repairing the actual tube in the monitor though. Everything else is fine.

From TFA: "In the very far future, the value of an authentic 1980s arcade game may become so high, that the monetary value will be significant..."

I guess I'll be hanging on to my I, Robot and pair of vector games for some time then...

It's a shame that emulation devalued real classic arcade games, but then it's also great that everyone can experience them. I too am guilty of writing an emulator ;-)


Barcades and enthusiasts' YouTube videos can help convince skeptics (i.e. yours truly) that emulation is no substitute for the actual cabinet, including the original hardware and CRT monitor, which in turn helps keep some of the value in those classic games. Have you seen price graphs for classic arcade games over the past decade? A properly restored and fully functional I, Robot can go for over $2k in some markets, and that price is only rising (well, for now).


As someone who missed most of the classic arcade games on CRTs, I was blown away trying out Asteroids on a machine with a CRT vector monitor. Because the electron beam can dwell longer on one spot, it could pull off ridiculously bright bullets. I don't think you could get the same effect on any modern display.


Have you tried Star Wars, Tempest, Star Trek, or some of the other vector games? Yeah, no comparison to raster emulation.

MAME can be interfaced to a laser setup for true vector graphics.


The low input latency of original Tempest is also amazing - I think it's the most fluid game I've ever played.


Tempest is my favorite "golden age" game, and I've been trying to find a working copy (to play!) with a vector monitor. Sadly the monitors that were used in Tempest are (apparently) particularly failure-prone.

I came close once, at the Strong Museum of Play in Rochester, where they had real hardware but, sadly, an LCD screen.


With proper amounts of determination any CRT is vector...

http://spritesmods.com/?art=bwidow_fpga&page=2


It's interesting how we have traded off some aspects of display quality in favour of overall unit size and practicality.


Same story for digital photos vs. film. Film is still superior in several areas, but not as convenient.


In which areas is film superior to modern sensors?


- higher resolution (with fine grain sized stock)

- higher dynamic range

- graceful saturation behavior

and IMHO the absolute killer feature:

- a "fresh" sensor for each and every frame captured (no worries about scratches in the optical lowpass coating, no hassle with dust on the sensor).

Also absolutely no dependency on charged batteries if one uses a purely mechanical camera.


35mm film is considered to be roughly equivalent to 20 megapixels. A modern digital camera has 42, more than double that.

35mm film is considered to have a dynamic range around 13 stops. A modern digital camera will do about 13.9

Film has a few advantages and pleasant quirks over digital, but sensitivity and resolution aren't amongst them.


> but sensitivity and resolution aren't amongst them.

Digital cameras have been more sensitive than chemical film for some time now, hence I didn't mention sensitivity (film is clearly inferior).

Regarding dynamic range and resolution, it really depends on what kinds of stock you compare to digital. If you take a low sensitivity ISO 50 fine grain stock, that one will easily outperform current generation electronic sensors – except for low-ish resolution high dynamic range sensors with large charge collecting capabilities¹. But you get this at the expense of having to use slow shutter speeds or a fast aperture.

You're right that in the "usual" ballpark of operational parameters (ISO 400 to ISO 2000, ~30MP resolution) modern electronic image sensors are getting close or are on-par with standard application chemical film stock.

----

1: Ironically chemical film stock resolution and dynamic range increase "in the same direction": Smaller grain → higher DR, higher resolution, but lower sensitivity. It's exactly opposite for electronic sensors since pixel size determines charge collection capacity and therefore saturation levels.


If you enlarge film the traditional way for printing, it goes way beyond 20 megapixels, probably closer to 50 megapixels at least. Try enlarging a current 20-megapixel digital picture to A2 format and you will see a huge difference; film clearly wins.


I based my numbers on http://photo.stackexchange.com/questions/30745/what-is-the-e...

Their breakdown is based around top of the line equipment. If you think 50mp can be pulled from film, I'd like to see how.


When you enlarge film you don't get pixels. Enlarge a digital picture and you only get a mosaic of squares.


The Sony A6300 offers 13.7 stops of dynamic range (according to DxOMark, so not entirely uncontroversial). The slightly more expensive Nikon D810 goes up to 14.8 stops. It's hard to find definitive numbers for film, though.


I work in post-production for TV/movies. Current digital sensors are much better than film in clarity, noise, dynamic range, etc. Film is still, arguably, superior in look.


What makes it "superior in look" though?


If I were to hazard a guess: minute irregularities of film blending the details instead of forcing them into a discrete grid.


The noise inherent to film. Film grain is more irregular than digital sensor noise and can be more pleasing to people. People have argued that part of this is familiarity: people who grew up watching film have a large base of memories and associations with their favorite movies and the 'film look'. In VFX work we use a large library of film-stock grain patterns to add noise to CG elements.


Large and medium format completely blow away the best digital sensors out there: dynamic range (unless you have a $3,000 DSLR), contrast in black and white... and the rendering of colors. With a DSLR you are stuck with one sensor; with film, every new roll is a different experience.

And as long as you have the negs, you can keep your pictures almost forever. Try reading that fancy proprietary RAW format from your 2010s camera in 2050.


Technically that's a limitation of the price point, though. There is no physical reason medium- and large-format DSLRs couldn't be made with higher resolution and sensitivity than film.


Price is a big deal though. You can buy a 4x5 camera and some sheet film for a few hundred bucks online and get started making some amazing photographs right now. To buy a medium format digital back with comparable resolution you're looking at prices rivalling an entry level luxury car.


Not even close: MF digital cameras cost $9,000 currently. Large sensors are prohibitively expensive. If you do landscape photography, film is still king.


You can take a 15-year-old photo, scan it, and enlarge it to a meter wide while keeping sharp edges and so on. Try doing that with a fancy iPhone picture.


Not really an apples-to-apples comparison.

An early-2000s consumer film camera, addressing roughly the same market as a modern iPhone camera, is not going to have nearly the lens quality, nor will the mass-market film stock have the grain resolution, nor will the 1-hour photo print have the transfer sharpness to survive a scan enlargement to a 1-meter-wide print. The result will look grainy and blurry compared to an enlargement from a recent iPhone.

An early-2000s 35mm SLR with professional film and a high-end darkroom print will look great enlarged, but then an iPhone is no longer an appropriate point of comparison.


An early-2000s SLR costs close to nothing these days, though.



> We’re looking at a situation where playing Donkey Kong in the way that its creator intended is reserved only for the most dedicated collector. It will be prohibitively expensive to recreate that experience.

While this is a bit sad on some level, the art itself will survive the transition. After all, nobody these days experiences Romeo and Juliet in the way that its creator intended, yet it endures. What I'm more worried about are those games that are going to be completely lost once the hardware expires; we can preserve the game data itself all we like, but without proper emulators you're SOL (and unless CPU speeds start doubling again, we're never going to be able to even semi-accurately emulate anything past the Xbox 360/PS3 generation). To say nothing of online games that will vanish into the mist once the servers get taken offline...


> unless CPU speeds start doubling again, we're never going to be able to even semi-accurately emulate anything past the Xbox 360/PS3 generation

Luckily all the new consoles use x86_64 processors for their main processors, so no low level emulation will be required.


Oh yeah, having an x86 CPU makes it super easy to emulate - just look at how good the original Xbox emulators are. That was sarcasm. There are no good original Xbox emulators, because the GPU is a custom, undocumented chip by Nvidia and nobody has sat down to reverse engineer it. I predict that similar custom hardware parts will make it very hard to accurately emulate PS4 and XBone games. Then again, I feel like there are a lot fewer console exclusives nowadays, so we might not lose out on too much after the systems' deaths. And in the end, it's just games - people will always keep making more. It's sort of a philosophical point, but I feel that letting some art disappear completely, so people can re-discover and re-make it as new, isn't really a bad thing.


I'd argue that the real reason there are no XBox emulators is because nobody with the requisite skills cares enough to bother to reverse-engineer it and write an emulator.

Since there are very few exclusives, there's little demand for it to be emulated in the first place. And from what I understand, the XBox isn't fondly-remembered by geeks. The XBox was the console of choice for people who liked to make fun of geeks and nerds, and as such most people who have the skills and the desire to write emulators (i.e. geeks and nerds) have a pretty strong aversion to the XBox. That the most popular games for the XBox weren't the kind of games that were enjoyed by nerds doesn't help, either. There's very little overlap between emulator programmers and people who care about the original Halo or old Madden games.


You'll just need to somehow emulate the closed down servers that run the now defunct game worlds...


Is that true? The prospect of emulating a console on similar-gen PC hardware is super appetizing, but this is the first I've heard of the possibility.


While the CPU is indeed x86-64, the architecture of the SoC and the device is rather different so emulation still won't be all that easy.

This 33c3 talk gives a lot of fascinating information on how PS4 differs from your run-of-the-mill PC and what kind of issues they hit when trying to run Linux on it: https://www.youtube.com/watch?v=QMiubC6LdTA


It's really hard to predict that stuff. People thought original Xbox emulation would be easy, since it was "essentially a PC", but it's the only console of that generation that still doesn't have a good emulator. Meanwhile, Wii U emulation is already advanced enough that it can almost play the just-released Zelda: BotW.

Technology aside, it's also a matter of interest. Many XboxOne and PS4 games already have PC ports, so there is less need and interest to create emulators. Nintendo consoles on the other side have many exclusives, so interest in emulating them seems to be higher.

Only time will tell.


> It's really hard to predict that stuff. People thought original Xbox emulation would be easy, since it was "essentially a PC", but it's the only console of that generation that still doesn't have a good emulator.

Probably a matter of demand, too, since it's got very few decent games that aren't 1) available on another console, or 2) available as an (often superior) PC port


Yep. Both the PS4 and XBox One are using semi-custom AMD chips with 8-core CPUs and 1152-shader GPUs. So manageably beefy.


The XB1 has only 768 GCN cores (12 compute units of 64 cores, versus 18 for the PS4), but both CPU and GPU cores are clocked slightly higher than the PS4 (1.75GHz v 1.6 for the CPU, 853MHz v 800 for the GPU).

The PS4P increases everything except the CPU cores count: CPU clock to 2.1GHz, compute units to 36 (hence 2304 cores) and GCN frequency to 911MHz.


The CPU isn't actually very powerful, they use low-power Jaguar cores at 1.6 GHz or so.


> While this is a bit sad on some level, the art itself will survive the transition.

The art was created for a medium. If the medium is unavailable, the art is altered. That is not unlike photographs of paintings, at a macro level the art "survives", except some masters used layers of semi-translucent paint and three-dimensional structures which the photograph can not reproduce.

And the art thus survives in the same way thylacines or dodos survive as taxidermic mounts: a pale shadow (or washed-out over-exposure) of their former glory.


A small but poignant correction here: no dodos survived even as taxidermy. The most complete known specimen of a dodo consists of 'a dried head and foot'.

https://en.wikipedia.org/wiki/Dodo#Physical_remains


That's an interesting article. This bit from the top was particularly informative... I had no idea the dodo was only known to the world for such a short time before it went extinct:

> The first recorded mention of the dodo was by Dutch sailors in 1598. In the following years, the bird was hunted by sailors and invasive species, while its habitat was being destroyed. The last widely accepted sighting of a dodo was in 1662. Its extinction was not immediately noticed, and some considered it to be a mythical creature.


I'm being a bit pedantic, but you can go to The Globe in London and get pretty close to experiencing Romeo & Juliet as intended, albeit with the audience wearing modern clothing. If you ever get the chance you should go; it's an experience worth having.


There are orchestras out there spending a ton of time, money, and energy on reproducing classical music on antique instruments exactly as it would have sounded when composed, and I find it hard to believe nobody is trying to do a historically accurate Shakespeare.


Except we know that we're only getting closer to the original sound. We're making a lot of guesses based on surviving instruments (in some cases known only from paintings or woodcuts in books) and surviving texts about how to play music, or diary notes, or notations on scores used by people teaching or learning. We don't actually know how any of it really sounded.

Which is, really, the same problem we've got with Shakespeare. Without really good footage of several performances, we can't really know how Shakespeare plays were really performed, we can just get as close as we can from the surviving evidence.

If I ever got a lift with the Doctor I would ask to go back to the 16th century and record as much music as possible.


So it's even easier to see why people are trying to preserve CRTs now and not when nobody really remembers what they were like.


This. Nobody ever said "my creation is intended to be seen only on this one type of display screen, and not another type that hasn't been invented yet".

I think "excessive verisimilitude" is the term for this annoying attitude.


Yes, that is true; instead, they never considered some other type of display that didn't exist yet at all, and so effects that they assumed would be there are not. The Sonic the Hedgehog waterfall is the classic example.


I see plenty of comments here about how LCDs will have to be the replacement, but there's a big issue about LCDs as replacements for CRTs (bar specialty ones being made):

The fact that most of the 80s arcade monitors were 4:3 ratio.

Sure - you can get 4:3 ratio LCD monitors today - but they are becoming harder to find. Furthermore, good luck finding them in a larger size than about 17 inches (19 inches, maybe). They never made them larger, from what I remember. The larger LCD screens were all widescreen (16:9?) - and still are. Even most of the smaller LCDs are wide format.

And you can't just stick a wide-format LCD into an arcade cabinet and add "borders" - because there likely isn't room. So at best you'll get an LCD stuck in with a matte board around it, and the display smaller than the original. It'll look like crap, to be honest - and that's before even considering how the image looks on an LCD vs a CRT...


More likely you'll have a 16:9 projector with black borders projecting onto flat black side walls, resulting in a full-size back-projected image.

The CRT people like grainy low res distortion so projecting onto the back of a purposefully bent and distorted piece of plastic should appear extremely similar to a CRT.

Of course this turns the problem of a CRT with finite filament life into a problem of a projector with a finite bulb life but presumably it would be easy to swap out the entire projector.

With considerable plywood hacking you MIGHT pull off attractive forward projection onto a screen for some games (not many, of course). Or, if you allow really ugly plywood hacking, you can just place a projector screen in place of the CRT, ceiling-mount the projector, and call it good.


>The fact that most of the 80s arcade monitors were 4:3 ratio.

A friend of mine rebuilt an arcade cabinet recently and his solution was to just rotate the display 90 degrees and offset it a bit to get it to fit. It might not be a valid solution for mini-cabinets since they were pretty tightly packed but according to him it was a pretty easy fit in this one.

The bigger problems were LCD processing latency and scaling. Most TVs that would fit were cheaper models (I guess it's getting harder to find sub 50" mid-range TVs) so they have a decent bit of processing latency even in "gaming" mode. But I don't think there's really a solution to the scaling problem without much higher resolution LCD panels because non-integer scaling is always going to have some shimmer when scrolling.


> Furthermore, good luck finding them in a larger size than about 17 inches (19 inches, maybe). They never made them larger, from what I remember.

I've got a 20.1" 1600x1200 4:3 LCD monitor; I'm pretty sure that there were bigger ones before widescreen started eating the world.


> Additionally, Ware explained that the refresh rate on an LCD may not play well with an old game’s code that is expecting a much more responsive CRT monitor. It could cause unsightly screen tearing that looks like one half of the screen is occasionally redrawing before the other.

Does this really happen with LCD arcade monitors? It happens with PC-based emulators because they usually can't make the OS/drivers/GPU/monitor switch to a refresh rate that matches the emulated hardware (which, despite the quoted "more responsive CRT" bit, is just as likely to be below 60 Hz as above it, and usually not by a large amount either way) but I don't see how CRT vs. LCD is a major factor if we're talking about monitors that are actually made for arcade machines.


Many games knew exactly where the beam was and played tricks on your eye. These don't translate well to LCD.


Bogost & Montfort's Racing the Beam (https://en.m.wikipedia.org/wiki/Racing_the_Beam) is about that very practice, though concerning the Atari 2600 in particular.


Loved that book! Loved all the behind-the-scenes stories and the deep dive into some of the tech issues (it was past my ability to comprehend it all, yet "not enough detail" for a friend of mine).


Are there many that actually depend on syncing up to the physical refresh and not just fiddling with registers at the right point in signal/pixel space? The only thing I can think of is some old-school light gun games, which involve a physical sensor pointed at the screen and thus depend on sufficiently high brightness and sufficiently low input lag (and so almost always fail on LCDs).


In addition to the other comments, some digital displays can only run at a fixed rate, and any input faster or slower than that rate gets juddery. I had an old DLP projector once (sadly, stolen) that had to be run at exactly 61.67Hz to match the color wheel speed.


Yes, this seemed unlikely to me as well. If the pixel clock is fixed, there's no other signal to display other than the one you'd see on a CRT... maybe people use some hardware that has some kind of framerate conversion step that introduces artefacts?


> “It’s just not gonna feel as nostalgic,” Ware tells GamesBeat when we asked him about the problems with the modern display technology. “The pixels will be sharper on an LCD, but they may not be 100 percent accurate. Colors won’t be quite as vibrant.”

The absolute loss of dead online games bothers me. I can't get too fussed about arcade games that are still 100% playable but with less nostalgic pixels.


I've never cared for most online games, although it bugs me on an intellectual level that they'll stop working. CRTs are connected to my childhood and early adulthood, and I get warm nostalgic feelings seeing one still in use. If I had space at home, I'd have a 30" CRT sitting next to my main LCD TV, to connect my older game consoles to. They work on the LCD. Then again, so does emulation on a PC. Not quite the same.


I have all my pre-HD systems (NES - Dreamcast) hooked to a round 32" CRT in my spare room the way I remember growing up.


I own a Japanese arcade machine, and while they are still somewhat common in Japan and the US, it was not easy to find one in Canada.

The nostalgic feeling and experience of playing not only on a CRT, but on an actual machine from the 1990s, can't be replicated on an LCD.

The static on the screen, the hum of the monitor, the flickering of the colors, the look of the glare on the surrounding arcade machines, the sound of coins dropping down the chute: these are the little things I bought my arcade cabinet for.

I dread the day that my monitor fails. As it'll mean the death of this experience for me without traveling to another country.


I own a few pro broadcast monitors for consoles. I would really like some contingency for when they go belly up.


You can see what one custom arcade builder does with their LCDs here, correcting aspect ratio and adding shaders: http://www.paradoxarcades.com/monitor/

They have light guns too (it uses a Wii-like infrared tracker): http://www.paradoxarcades.com/lightguns/

Nice emulations, but nothing is going to match the latency of a board hooked up directly to electron guns.


So playing a classic arcade game will look different than it did back then.

But if you think about it: if the screens we have today had been available back then, what do you think the game designers would have preferred?

Also, screens will continue to get better. Gaming on [an arcade machine with] an HDR OLED screen could be amazing.


>> But if you think about it: if the screens we have today had been available back then, what do you think the game designers would have preferred?

The best example of design-to-the-CRT, by far, is the original Star Wars. For starters it's a vector game: the electron beam is deflected in lines to directly draw the objects, so vector monitors have a very distinct look to begin with. Then they went further. For the enemy bullets (snowflakes, or whatever you call them) they overdrive the electron gun, which causes the tube voltage to drop, which in turn defocuses the beam and creates a wide, soft vector for those objects. And finally, when you lose a shield (get hit), they draw a hugely oversized square so far off screen that the electrons scatter off the back of the tube and flood the entire screen with a haze or fog as you take the hit. Oh, and the Death Star explosion also has a unique look, as it involves more vectors than the hardware can quite keep up with.

That game was designed for the medium and is the pinnacle of the genre. There is no substitute.


I agree that there are games that were made for CRTs. Asteroids is another great example.

However, most games do not take advantage of being displayed on a CRT.

Do you think it's better to play Donkey Kong or Galaga on a CRT vs an LCD? If so, why? Just nostalgia?


Of course the game designers would have preferred the highest-resolution display available.

Unfortunately, the hardware of the time could barely keep up with the graphics. If the hardware wasn't generating graphics on the fly (like the early Namco, Nintendo, and Midway systems), it needed a large chunk of DRAM for the framebuffer, which was even more expensive.

Some systems, like the early Williams units, economized by using a 4bpp buffer combined with palette hardware, but you were still talking nearly 48K of RAM, which was a sea of chips back then.

A VGA-sized buffer on this system would need 150K of RAM. That's triple the cost! HD at 1280x720? 10x the cost, not to mention a pixel clock speed that the TTL chips of the time just couldn't do.
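
The numbers, worked out at 4 bits per pixel (panel sizes chosen to match the claims above):

  def fb_bytes(w, h, bpp=4):
      # bytes for a w x h framebuffer at bpp bits per pixel
      return w * h * bpp // 8

  print(fb_bytes(640, 480))    # 153,600 bytes: the ~150K "VGA-sized" buffer
  print(fb_bytes(1280, 720))   # 460,800 bytes: roughly 10x a ~48K buffer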


The game designers would have obviously preferred CRTs; the games were designed for a specific use case that today's screens don't handle well, and probably never will unless a completely new tech comes along.

Also, if you have played games a lot on a CRT, you will find HDR OLED not amazing at all. There's a mismatch between where things are in memory vs. on the screen, so you think 'that thing didn't get me' and then it does, even though it was pixels away.


Mistakes? You mean it introduces delays? Or something else?


You're right; eventually modern display technology will catch up to the point where it can indistinguishably simulate CRT screens :)


It's only the best quality arcade CRTs that are going to become unavailable. The US still has warehouses full of worthless CRT televisions that are too expensive to recycle because of the lead content in the glass. http://www.nytimes.com/2013/03/19/us/disposal-of-older-monit...


I was under the impression that there are still a few small companies in China making CRTs --- a search of Alibaba shows a few sellers still advertising CRT TVs, but I'm not sure if those are NOS.


Arcade CRTs are different from TV CRTs: the voltages are a lot higher (I heard 10,000 volts somewhere), and therefore they're quite dangerous even to handle. This is also the reason the colors were much brighter in the arcade than on your home system connected to your TV.


The voltage is related to display size: a 17" monitor might run at 27kV while a 32" might run at 34kV. But at higher voltages more X-rays are emitted, so most TVs have safety mechanisms to limit the voltage.

As for the safety I'd expect anybody servicing arcades to know what they're doing.


All this is saying to me is: hoard discarded CRT monitors in a storage locker for 3 years, then sell them off for good money on eBay. Am I mistaken?


Have fun with the shipping costs on those...


I don't think it's reasonable either. The eBay market is strange: 1GB 50-pin SCSI drives go for hundreds of dollars, 486 machines can go for $150, and ISA sound cards can be auctioned for hundreds of dollars (http://www.ebay.com/itm/Roland-SCC-1-ISA-Sound-Card-/1623420...).

Don't get me wrong, this is a total waste of time - but the market is probably there for someone who wants it.


Pass the cost of shipping to the buyer. I resold games in high school. I wasn't savvy enough to write scripts to search for specific games and their prices, so I manually combed sites (mostly eBay) to find people/stores that were selling games a few days/weeks before the actual release date. I'd order a few copies, add a small markup, charge flat-rate shipping (which would cover most distances), and be on my way.

My point is that there's a market for these. The product I was selling wasn't scarce. My guess why people bought these games was that they were teens who couldn't get their parents to drive them an hour to the nearest store that carried games.


Do not give up on CRT arcade monitors. There is still one manufacturer left in this world making new CRT TVs today, in India. Do not give up hope. If we get together, we will find a way to rebuild or manufacture new CRT arcade monitors in the United States, or import them to the United States.

CRT technology is not dead. Learn how the CRT works and do simple cathode ray tube experiments using a glassblower and other tools. If people do this, eventually we can make custom CRTs one by one for video game and arcade collectors. Yes, this might cost a lot, but there will be collectors willing to pay that price. Another way is to make CRT rebuilding equipment to rebuild CRT arcade monitors. The Early Television Foundation and Museum will be rebuilding a 1950s B&W CRT TV tube in May. Please don't give up on CRT displays. Have hope that new CRT displays will come.


I'm not asking this to troll or be annoying, but couldn't we conceivably have LCDs in the future that have really high refresh rates? I'm not saying this year or anything, but is there a technical limitation on the max refresh rate of an LCD?


Do audiophiles watch movies in "tube" (CRT) screens? :)


Videophiles, and I gather they're more likely to use projectors with their vintage LaserDiscs.


Or for that matter, projectors with actual film.


If anyone can fix and maintain classic arcade machines, it would be these people:

Museum of Play - International Center for the History of Electronic Games http://www.museumofplay.org/about/icheg

If you are in the area, you should stop by. They have every cool toy possible from many many years back. Brought back many memories.

Plus you can play many old arcade or pinball games!


You can already see the rapid appreciation starting in the console CRT market. All the most desirable monitors have doubled or tripled in price over the past year. A 20" monitor easily sells for several hundred dollars, and the larger ones run into the thousands.

More and more people are starting to realize that emulation is no substitute for the real thing, and the supply of functional, high quality CRTs will only continue to decrease.


As a person who is more sensitive than average to sub-85Hz flicker of CRT and plasma screens, I'm happy to see these technologies dying.


I'm quite sensitive to sub-75Hz monitors - I could not stand 60Hz - but another side effect of a higher refresh rate is smoother desktop updates when scrolling or moving windows.

I recently upgraded my second monitor on my home PC to a 144Hz gaming version. The smoothness and responsiveness of UI on that monitor side by side with the standard 60Hz is really remarkable, I'd forgotten how big a difference it makes. The PC physically feels much more pleasant to use. OTOH, my partner doesn't notice the difference at all.


I'm more sensitive to the really high pitched eeeeeeee that comes from CRTs and their flyback transformers.


That sensitivity goes away "automatically" with age.



