Computers Built to Last (datagubbe.se)
123 points by ecliptik on Jan 3, 2022 | 54 comments



The Raspberry Pi 4 is probably the most likely candidate right now.

Because you need a user base to make the software carry itself.

Also, from a power point of view, 7W is pretty much what we have to play with if everyone is going to have one powered on permanently.
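
Back-of-envelope, in case anyone wants to check that figure (the global generation number below is an assumed round figure, not a sourced statistic):

    # One always-on 7W board per person on Earth.
    people = 8e9
    watts_each = 7
    hours_per_year = 8766

    total_gw = people * watts_each / 1e9           # continuous draw in GW
    twh_per_year = total_gw * hours_per_year / 1e3
    print(total_gw, twh_per_year)
    # ~56 GW continuous, ~490 TWh/year; roughly a couple of percent of
    # global electricity generation (assumed ~27,000 TWh/year).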

Get a good case for it so that you can passively cool it, and you're good to go for at least 100 years if the thermal paste doesn't dry up:

http://move.rupy.se/file/pi_4.jpg

I recently found this case, but it's still in the mail:

https://www.amazon.com/dp/B07Z6FYHCH


I'm afraid that 100 years is not achievable in a guaranteed way because of the tin whiskers phenomenon [1].

Maybe covering the solder surfaces with thick, hard resin could help. (There goes repairability, though.)

[1]: https://www.microcontrollertips.com/when-tin-whiskers-sponta...


While I agree with the sentiment, especially for ultra-low-cost solutions, the availability of Pb-free HASL, organic coatings, and NiAu plating makes Sn whiskers much less of an issue than they were 10+ years ago. Through-hole components, again, might use the lowest-cost tin coating, but that's also easily fixed. Incremental 10% cost increases can dramatically improve long-term and high-temperature reliability. Most SMT parts are pretty safe, because the suppliers are afraid of contaminating their supply chain.

Even for commodity parts (cell phones and laptops), PCBs and components are built to prevent problems with Sn. Certainly, in automotive it's a big deal!


So then just use a brush and rub the PCB every 5 years? I mean, they can only grow into the air, right? My 35-year-old breadbins show no sign of these; maybe they have more lead in them... time will tell! How ironic if the old breadbins outlive modern hardware!


The "Unbrickable" nature of Pi family of computers make them very enticing. Pi 4 w/ 8GB of RAM is surprisingly well rounded too (I don't need that amount of RAM all day long, but when you need it, it's there and that's nice).

A good thermal paste (Arctic Cooling MX series, for example) is good for at least 10 years, and it has almost infinite shelf life, so renewing paste every decade is not bad.

Another viable case is Cooler Master's Pi Case 40 [0].

The nice thing is, it's protected from dust, and the lower tray's design files are open, so you can modify and reprint it if you want to adapt it to a previous or next-generation Pi-compatible board.

It also has a hardware button, which you can program.

[0]: https://www.coolermaster.com/catalog/cases/raspberry-pi/pi-c...


Is gallium foil sufficient?


Using gallium as a heat transfer agent is a very poor choice, considering it has a melting point of ~30 degrees Celsius, and a bog-standard Raspberry Pi runs in the 40-60 degrees Celsius range, depending on the load.

I've been using Arctic Cooling's MX series thermal pastes for over a decade, and my eight-year-old MX-2 installation works as well as on day one. It's taking heat away from an Intel i7 3770K, which is not a cool chip, esp. under load, and mine has seen some hotter days under constant load (with a scientific code which is designed to get as much performance as possible from a processor, no less).

I have a couple of these Cooler Master cases too, and their out of the box pastes are doing a good job for now. So, I'd rather not fix something if it's not broken. If the temps start to climb, I already have more than enough spare of the paste I use in my systems.


Definitely, thermal paste is the answer for any practical application.

Ah you’re right. I was thinking of experiments with indium foil I had seen[1]. Plus gallium is used in the pastes apparently.

So if the goal is a very long-term thermal interface material, I don't think pastes are the best. A homogeneous solid is likely to last longer.

Indium foil fits the bill nicely. Also thermal pads or graphite pads, but I’m not sure how long they last.

[1] https://www.reddit.com/r/buildapc/comments/99fq4y/experiment...


> So if the goal is a very long-term thermal interface material, I don't think pastes are the best. A homogeneous solid is likely to last longer.

Arctic Cooling states that MX-5 is guaranteed to keep its performance for 8 years [0]. Considering this stuff is made to interface with 200W+ CPUs, cooking it dry with a Raspberry Pi is very unlikely.

As I said before, I've been cooking MX-2 in two different systems (Q6600 & i7 3770K) for 8 years, and they haven't lost a bit of their transfer performance.

In my experience, thermal pads and solid interfaces do not transfer with the same efficiency and cannot absorb heat spikes as well as a good thermal paste. I run my OrangePi Zero with a thermal pad interface to a small heatsink, and that thing runs uncomfortably hot. I can't transfer data faster than 40 Mbps for a long time or run CPU-intensive stuff, because it gets hot, fast. Its form factor and case don't allow for a more robust solution either.

Of course the choice is yours, but in my experience, when compared to older paste, or to the ordinary white stuff, current generation stuff is leaps and bounds ahead. Especially Arctic Cooling.

[0]: https://www.arctic.de/en/MX-5/ACTCP00047A


Thermal pads are way thicker and typically have higher resistance - even good, thin ones are worse than standard paste and basically useless over a dozen watts (depending on surface area * temperature difference).
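
Rough numbers, if it helps (the thicknesses and conductivities below are assumed typical values, not measurements of any specific product):

    # Temperature rise across the interface material alone: dT = P * t / (k * A)
    def delta_t(power_w, thickness_m, conductivity_w_mk, area_m2):
        resistance = thickness_m / (conductivity_w_mk * area_m2)  # K/W
        return power_w * resistance

    area = 1e-4  # ~1 cm^2 contact patch
    print(delta_t(12, 1.0e-3, 3.0, area))   # ~40 K across a 1 mm pad at 12 W
    print(delta_t(12, 0.05e-3, 5.0, area))  # ~1.2 K across a 50 um paste layer

The material itself isn't much worse than paste; it's the extra thickness that dominates.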

OTOH, used properly, liquid (liquefying) metal should have a better transfer efficiency than paste.

If you have issues, you can also try lapping the surfaces.


The sinks I use have 3M integrated pads. They might not be the highest performing ones, but they're pretty decent for their size.

As I've said, it's attached to an OrangePi Zero, which has a very small, epoxy-packaged CPU. So tension-mounting, lapping, or using higher-end compounds are out of the question.

The whole point of the OrangePi Zero is being small all over, and it's happy temp-wise with its normal load, so I'm not planning to work on it further for now.

Raspberry Pi 4 is different and has a whole-body heatsink, and its paste is good for now. If it starts to lose cooling capacity, that'll be a different matter altogether.


Strongly agreed! I can highly recommend the flirc case. (Amazon B07WG4DW52)

The entire metal case becomes the heatsink for the processor, and no fan is needed. It's strong and sturdy.


I just had a UPS pop on me yesterday during a storm. The least reliable part of any computer is the power supply. I doubt any wall wart will last 100 years. Certainly no SD card will. If you're accepting maintenance in your scenario then other options exist.


It's a bet: I think my 4GB Panasonic SLC and 1TB SanDisk MLC SD cards will last 100 years, with my MMO server on the small one and the database on the large one. Our children will see how that works out! ;)

For the PSU I use an over-dimensioned Mean Well that should last at least 50 years.

You always need a window for failure; redundancy is your only friend in this case!


Now if only the storage were reliable... I've had too many issues on that side.


Only use consumer SanDisk or industrial Panasonic SLC (no longer manufactured) SD cards; all other brands are scams.

SD cards have peaked at 1TB, but they are getting cheaper (so far).

Only use Intel SATA SSDs; I bought X25-E SLC drives from 2011 to get that 65nm, 100,000-writes-per-bit endurance.

But yes, media will have to be replaced; as a whole, with open-source software and distributed databases, everything will be saved.


Now this leads to an interesting question about media. Will the SD interface be superseded by something else? Or might everything be in the cloud, or something?

If the hardware works but you can't get software onto it anymore, how usable will it be?


No, SD cards use flash memory, which is based on quantum flips. There won't be anything substantially better per GB/W/durability/$ than 65nm SLC, just like 28nm is pretty much the peak of transistor size advantages.

The further you go below 65nm, the more brittle it becomes, but also smaller, and that's how MLC flash has reached 1TB on the same surface (microSD) that only allowed 8GB of SLC.

My X25-E drives are only 64GB!!!

Photon/Quantum processors will never match electrical in Gflops/W/durability/$.

Magnetic drives are not an option because of energy consumption.

Also, with peak hydrocarbons we don't have the energy to keep growing the development and building of new things.

So yes, your concern is valid: what happens if SD/SSD media explodes in price and eventually becomes unavailable?

My answer is: buy the media you can right now and plan for its use, charge a lot for writes, don't log anything, and try to get SLC while it's still available to buy.

I have calculated that for the 1 million customers I plan to have (I will cap my user base, but since the software is open-source, others will hopefully fill the gaps if the product is successful) I have enough drive durability to store and update their records once per day for 500 years.
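
For what it's worth, a back-of-envelope like this lands in that kind of territory; the record size, drive size and endurance rating here are assumptions for illustration, not the actual deployment numbers:

    customers = 1_000_000
    record_bytes = 1024                        # assume ~1 KB per customer record
    bytes_per_day = customers * record_bytes   # ~1 GB written per day

    drive_bytes = 64 * 1024**3                 # e.g. one 64 GB X25-E
    pe_cycles = 100_000                        # SLC endurance rating
    endurance_bytes = drive_bytes * pe_cycles  # total writable with wear levelling

    years = endurance_bytes / bytes_per_day / 365
    print(round(years))  # tens of thousands of years at this rate; larger records
                         # or write amplification shrink the figure accordingly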

Set a limit to what you want to do.

Apply the same to monitors, keyboards, mice, power supplies, cases (passively cooled), batteries (lead-acid), routers, switches, cables, etc.

If you play it right the glass in the fiber strand will need replacing before your hardware stack does.

And it probably will get replaced for high latency / low bandwidth messages by P2P radio: http://radiomesh.org


I once soldered wires directly to the pads on a microSD-to-SD card adapter and, through a level shifter, managed to use my Arduino Uno to save text (GPS readouts) to microSD cards. I think even if more advanced storage comes along, SD cards will still be desirable for their simplicity and backwards compatibility.


FeRAM may be viable for long-term ROM, RAM, and persistent storage. The usage contract should ensure it is recharged periodically. A plug-board could allow the user to manually repopulate the ROMs from paper if it lost its charge.


I wish there were a no-compromise passive case for the Raspberry Pi 4 that provided silent cooling capability as well as add-ons such as a battery-backed RTC and PoE.


A little over a week later and it seems something has been released that nearly meets these two requirements: https://www.cnx-software.com/2022/01/13/raspberry-pi-cm4-nan...


I was pushing PoE until I found that a simple 5.5mm plug splitter to USB (micro/C) adapter will do the job with less complexity!

http://move.rupy.se/file/final_pi_2_4_hybrid.png


That's assuming your capacitors will last 100 years. Chances are you'll have to replace them at least every 50.


The Pi has no electrolytic capacitors; besides, my C64 breadbins from '83-'85 still have their original ones and work fine.


Oh neat! That certainly improves the odds.


We don't think enough about sustainability, about history. The Amiga uniquely exists as history by itself and lives on as a platform that eschews multi-gigabyte, multi-gigahertz requirements, yet can do amazingly modern things.

The hardware is wonderfully documented and easy to repair. My Amiga has been running for a quarter of a century now.

http://lilith.zia.io/


A lot of hardware had exceptional documentation in the past. I have extremely high-end equipment from the 80s, and almost all of it came with a booklet containing a complete hand-drawn schematic, including part numbers, test points, and procedures for repair and disassembly...

And then it all stopped, seemingly because building equipment became cheap enough that manufacturers started worrying about duplicates. Now you end up with errors you'd never get if people could still learn how nice stuff works: by making it risky to publish designs publicly, no one has detailed implementation references anymore. Sure, things like the RPi and Arduino exist, so you can find open hardware, but it's nothing like when almost everything was open by default.

I wish things had gone down differently, but I end up throwing out things made post-2000 far more often than stuff from the 1980s. Some of that is selection bias, but a lot of it is having access to detailed repair manuals.


> The Amiga uniquely exists as history by itself and lives on as a platform that eschews multi-gigabyte, multi-gigahertz requirements, yet can do amazingly modern things.

Hm? Don't 386s do this too? And BeagleBones? MSP430s?


I've got a 1999 PowerBook G3 open that I'm making some music on right now. Some of the processes take a long time, so I keep my X200 (13 years old!) next to me as well to have something to do in between.

1999 isn't all of a quarter of a century, but there's a Quadra in the other room which will be 30 this year and also still sees some use.

It's not that the new stuff isn't nice, but I still haven't managed to exhaust the potential of stuff from 20 years ago for my music at least. So I can just stay off the upgrade cycle, which feels quite good.

For non-music, save for TLS, Hacker News could of course also work well on old hardware. For most websites you could also get the functionality, if not the exact experience. But alas, that's not the way the world went, and although a ten-year-old PC in 2022 is a lot more "current" than a ten-year-old PC was in 2012, it still seems like we will keep being dragged forwards by developers wanting more shiny things and hardware manufacturers who want to sell hardware.


So, it runs a pretty recent version of NetBSD? That's quite impressive. Isn't that sacrilege, though, not to be running AmigaOS?

I did see a YouTube vid last year of an Amiga booting up what I presume to be an old version of Debian. It got there - EVENTUALLY - at least proving that AmigaOS was a vastly more efficient OS (than Debian, at least).


Given that the AmigaOS team was actually inspired by UNIX, and that was their original goal, maybe it isn't as much of a sacrilege, although I think it was certainly good that in the end they did their own thing instead.

https://en.wikipedia.org/wiki/Amiga_Unix


The computers I own are from the 60s onward. The ones from the 70s and (somewhat) those from after 2005 are the most unreliable. All my 60s, 80s, and 90s systems still work, provided they are not PCs. The PCs from the 80s still work too, but the 90s and upward are quite hit or miss; HP servers, which I guess can be seen as PCs, still work, but the desktops and laptops are broken. My favorites are the MSX-2(+), MSX Turbo R, Atari ST, Amiga, and the Sun E450 and UltraSPARC stations. They all seem unbreakable. Or maybe I am just lucky; I have never yet had the leaky capacitor issues that plague many 80s machines. The one from the 60s is, obviously, over 50 years old now, but although it's called a computer, it cannot win consistently at tic-tac-toe as it is, since its memory is hard-wired and only a few bytes; at least the 70s and 80s ones are somewhat usable (at least for games).


Funny you should mention an unreliable tic-tac-toe machine: I've been reading the book "Racing the Beam" and it mentions the 1956 relay-logic computer "Relay Moe", which, despite being constructed of little more than 90 relays and a spinning drum, touted a "variable intelligence" that allowed the operator to choose how often the machine makes a mistake, and rotated the board between games so its repetitive strategy was not immediately obvious. Quoting the book (p. 39):

> Playing against a perfect tic-tac-toe competitor, and playing well, can only ever result in a draw. This may be computationally impressive the first time, but it isn't very fun. Opponents, computer or human, become interesting when they make mistakes--or more accurately, when it becomes clear that they might make mistakes under certain circumstances. Such mistakes highlight weaknesses, which players can exploit as part of a strategy.

The book goes on to explain how Pong on the Atari 2600 achieves a similar level of unreliability, since it would be trivial for the computer to follow the y-position of the ball and never lose. Every 8 frames, the computer neglects to track the ball, drifting behind its position, but recovering when the ball ricochets off the top or bottom of the screen. This way the "AI" of Pong occasionally makes mistakes, so as not to be too harsh on the human's ego.
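
The trick is simple enough to sketch; this is just a toy illustration of the behaviour described above, not the actual 2600 code:

    def update_paddle(paddle_y, ball_y, frame, speed=2, skip_every=8):
        """Chase the ball, except on every 8th frame ("forget" to track it)."""
        if frame % skip_every == 0:
            return paddle_y
        if ball_y > paddle_y:
            return paddle_y + speed
        if ball_y < paddle_y:
            return paddle_y - speed
        return paddle_y

    # The skipped frames let the paddle drift behind a fast ball, so a human
    # can occasionally sneak one past an otherwise perfect tracker.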

Thorough breakdown of Relay-Moe's logic with photos is here: (3.5MB PDF) https://www.vintagecomputer.net/cisc367/Radio%20Electronics%...


Racing the Beam is a lovely book, by the way. Well worth the read.


>My platform of choice is instead the Amiga 1200

I was honestly expecting him to mention vintage (i.e. IBM/pre-Lenovo) ThinkPads. Those have a cult following behind them, supposedly for their serviceability, durability, and features (nice keyboards/TrackPoint). They're certainly more useful and easier to repair than the Amiga. Speaking of which, can something really be considered easy to repair if replacement parts have to be custom manufactured by "hobbyists and small scale businesses", and parts need to be reimplemented via FPGAs? At that point you're probably creating more e-waste than using a Raspberry Pi or something.


Although I had a friend with an Amiga growing up, I don't remember ever personally using one. Every screenshot or video I've seen since, though, makes it look like the aspect ratio was very weird and everything was "too wide" - was that actually the case, or is it an artefact of displaying on modern displays, etc.?

For example: [1] - I'd expect the clock to be circular instead of oval?

[1]: https://upload.wikimedia.org/wikipedia/commons/f/f4/Amiga_Wo...


The classic AmigaOS NTSC resolution was 640x200, and PAL was 640x256, both displayed as 4:3, so they look very cramped and squished on modern square-pixel displays. It's not difficult to double the height to get a 640x400 or 640x512 image, which is a lot closer to the intended appearance on modern screens, but still not quite right.
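
Put as numbers (a small sketch, assuming each mode is stretched to fill a 4:3 display):

    def pixel_aspect(width, height, display_aspect=4/3):
        """Width:height of one pixel when the mode fills the display."""
        return display_aspect / (width / height)

    for w, h in [(640, 200), (640, 400), (640, 256), (640, 512), (640, 480)]:
        print(f"{w}x{h}: pixel aspect ~{pixel_aspect(w, h):.2f}")
    # 640x200 -> 0.42 (tall, thin pixels), 640x400 -> 0.83 (closer, still not
    # square), 640x480 -> 1.00 (square pixels)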

Later versions of AmigaOS supported higher resolutions, including square-pixel resolutions, but a lot of the UI artwork was still designed with lower resolutions in mind, so it still looks weird.

640x200, doubled to 640x400, but the clock still looks weird: http://toastytech.com/guis/amiga1apps.png

640x512, with a reasonable-looking circle at the top, but the icons in the middle look ridiculous: http://toastytech.com/guis/amiga2bounce.png

640x480 (square pixels!) with a nice clock, a nice colour wheel, and nice icons, but the title-bar buttons and the keyboard shortcuts in the menu look weird: http://toastytech.com/guis/amiga35utils.png


Aha, that makes sense - thanks for the detailed answer. I guess the same thing must afflict screenshots of machines I used, but none had the GUI of the Amiga so the effect is not so pronounced.


The HP 35 calculator was introduced in 1972. The early models aren't quite fifty years old yet, but will be later this year. Many (most?) of these still work just fine from the mains adaptor or once you replace the batteries.


I "repaired" one of those a couple months ago, found a set of batteries (intended as spare for an old model cordless phone) and used a tiny DC-DC converter to use a USB charger, the tricky part was to re-tin in a few places the PCB and re-solder one of the power cables that came off due to oxidation.

It came out like new; the only issue is that one of the display segments on the fifth digit (from the left) doesn't light up. Luckily it is the bottom-right one, so it is still "readable".

My HP-28C (circa 1987) is still going strong, and it is 34 years old by now.


A relative of mine bought a solar-powered calculator back in the 70s. I plan on leaving it to my grandchildren.


The LCD might degrade, and you should probably replace the power capacitor.


Same article, posted a few hours earlier, discussed here: https://news.ycombinator.com/item?id=29773407


I'm building a series of computers designed to last for decades. They use FPGAs and run Linux. The first model will be ready later this year.

https://machdyne.com


Even considering just 2 or 3 decades, the problem with a computer built to last is that after long enough, its capabilities are outdated. Things have improved a little in the last decade: using a 10-year-old GOOD computer is still comfortable if you don't need more than one VM or are patient enough to wait for things to compile, but it's still not a great experience.

Another thing a computer built to last needs is blob-free mainline drivers. It is sad to have hardware you just can't use with a better kernel because it is not supported.


> In order to become relevant for the next 50 years, the computer needs to be made of easily replaceable parts. Inspirations are the Fairphone and the MNT Reform laptop.

Interesting. I knew about the Framework laptop, but the Reform seems to go quite a bit further in being usable and repairable for a long time. Bulky and less advanced, but I like how one can just buy spare battery cells for cheap on Amazon or AliExpress.

edit: https://technomancy.us/195


PowerPC Macs still running. I might have to install a more modern OS before 2038.


I still love my PPC Macs, mostly the G4s. I have a hard time doing anything useful with my G3s any more and the G5s are too hot and power hungry for me to be comfortable running them for long periods.

I keep a G4 iBook, iMac, and a couple of Minis around as backup computers because they still just work. I've been thinking about switching some of them to Linux/BSD now that TenFourFox is dead, but they're still useful for a lot of tasks on Leopard. My iBook definitely plays DVDs much more conveniently than most Macs of the last decade.


Got an Amiga 1200 in the garage right now.


My 16GB 2019 MacBook Pro is slow as F on the latest OS and feels awfully slower than a cheap 3-year-old Dell Windows laptop.


If you're really going for "built to last," think beyond binary computing: https://en.wikipedia.org/wiki/Ternary_computer

Also, use whatever tech is used in deep space probes.


Why beyond binary? What could ternary offer to make it last longer?


I like balanced ternary a lot, but there is nothing longer-lasting about it.



