Microprocessors Running on Air? (erik-engheim.medium.com)
134 points by nnx on Jan 9, 2021 | 55 comments



Reminds me of a scene from Cryptonomicon that may or may not have been inspired by history. It describes the RAM from a fictionalized 1940s computer:

>The pipes are laid out horizontally, like a rank of organ pipes that has been knocked flat. Stuck into one end of each pipe is a little paper speaker ripped from an old radio.

>“The speaker plays a signal—a note—that resonates in the pipe, and creates a standing wave,” Waterhouse says. “That means that in some parts of the pipe, the air pressure is low, and in other parts it is high.” He is backing down the length of one of the pipes, making chopping motions with his hand. “These U-tubes are full of mercury.” He points to one of several U-shaped glass tubes that are plumbed into the bottom of the long pipe.

>[...]

>“If the air pressure in the organ pipe is high, it pushes the mercury down a little bit. If it’s low, it sucks the mercury up. I put an electrical contact into each U-tube—just a couple of wires separated by an air gap. If those wires are high and dry (like because high air pressure in the organ pipe is shoving the mercury down away from them), no current flows. But if they are immersed in the mercury (because low air pressure in the organ pipe is sucking the mercury up to cover them), then current flows between them, because mercury conducts electricity! So the U-tubes produce a set of binary digits that is like a picture of the standing wave—a graph of the harmonics that make up the musical note that is being played on the speaker. We feed that vector back to the oscillator circuit that is driving the speaker, so that the vector of bits keeps refreshing itself forever, unless the machine decides to write a new pattern of bits into it.”

Mechanical computing in general is fascinating. People also used to use precisely machined drums and cams to perform complex real-time calculations.



What’s described above isn’t quite the same as delay line memory.

Delay line memory relies on the speed of sound through a medium. You put a signal in at one end, and it propagates “slowly” to the other, where you receive the signal, and immediately replay it at the start. Creating a kind of never ending echo.
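A minimal sketch of that recirculation in Python (my own illustration, not from the thread): the "tube" is just a fixed-length FIFO whose output is fed straight back into its input, so the bit pattern keeps circulating, and you can only read or rewrite a bit when it comes back around to the transducer.

  from collections import deque

  class DelayLineMemory:
      """Toy acoustic delay line: bits circulate in a fixed-length FIFO."""
      def __init__(self, length_bits):
          self.line = deque([0] * length_bits, maxlen=length_bits)
          self.position = 0  # address of the bit currently at the read/write head

      def tick(self, write=None):
          """Advance one bit time: take the bit arriving at the output and
          recirculate it (or overwrite it if `write` is given)."""
          bit = self.line.popleft()
          self.line.append(bit if write is None else write)
          self.position = (self.position + 1) % self.line.maxlen
          return bit

      def write(self, address, bit):
          while self.position != address:   # wait for the address to come around
              self.tick()
          self.tick(write=bit)

      def read(self, address):
          while self.position != address:   # same waiting game for reads
              self.tick()
          return self.tick()

  mem = DelayLineMemory(1024)   # 1024 bits of "mercury"
  mem.write(37, 1)
  print(mem.read(37))           # -> 1, after waiting a full recirculation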

What’s described above relies on a standing wave in a tube, with sensors along the length of the tube to detect nodes and anti-nodes. Then encoding data into that by changing the signal input to change the locations of the nodes and anti-nodes.

I don’t think such a system would actually work, because you’re stuck using only the harmonics of your tube as possible states, and there’s going to be a pretty limited number that you could realistically produce.


You're correct: mercury delay lines read/write in a set order, and you have to wait for your address to come back around in order to access it.

Standing wave memory would mostly defeat the point of having memory, since you'd need to address every single antinode individually. You might as well just have a bunch of latches. The real point of delay lines, shift registers, or core memory is to reduce the address space: you store bits in a way that is slower but simpler to access, which means you can store more things. Mercury standing waves would not make it any easier to store bits, so there's no advantage.

Still, you could make a device like that. Only two-thirds of the tube can store memory; the rest is a quarter-wave transformer, which basically makes the tube act as if it were open at both ends. You can construct arbitrary patterns with Fourier decompositions: https://en.wikipedia.org/wiki/Periodic_summation
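A rough numerical sketch of that standing-wave encoding (hypothetical Python/NumPy, just to illustrate the idea): each bit switches one harmonic of the tube on or off, the pressure profile along the tube is their superposition, and the bits are recovered by projecting the profile back onto each harmonic. It also shows why the word size is limited to the number of harmonics you can cleanly drive and sense.

  import numpy as np

  L = 1.0                         # tube length (arbitrary units)
  x = np.linspace(0.0, L, 2000)   # sample points along the tube
  bits = [1, 0, 1, 1, 0, 1]       # the word to store, one bit per harmonic

  # "Write": superpose the harmonics whose bit is 1 to get the pressure profile.
  pressure = sum(b * np.sin((n + 1) * np.pi * x / L)
                 for n, b in enumerate(bits))

  # "Read": project the profile onto each harmonic (a crude Fourier analysis).
  read_back = []
  for n in range(len(bits)):
      mode = np.sin((n + 1) * np.pi * x / L)
      coeff = np.trapz(pressure * mode, x) / np.trapz(mode * mode, x)
      read_back.append(int(round(coeff)))

  print(read_back)  # -> [1, 0, 1, 1, 0, 1]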


> Standing wave memory would mostly defeat the point of having memory

This is a common theme in sci-fi. Authors tend to know enough to make something sound plausible, but it's pretty rare for their inventions to pass the sniff test. I'd say "they're not writing patents, after all" but Salvatore Pais took that away from me:

https://www.thedrive.com/the-war-zone/31798/the-secretive-in...


From what I've heard, this was really sensitive to any changes in environment: temperature, vibration, or possibly even air pressure.

All of which were likely to change just when you were about to demonstrate your new mainframe to a delegation of important people! :-) IIRC it was even named "the general syndrome" in some places.


A visual representation of standing waves of air pressure along a tube. You could set up a similar mechanical RAM with a temperature or light sensor at each fire hole along the tube:

https://youtu.be/pWekXMZJ2zM


I saw an air pressure analog computer in Leeds, in the early 1980s. Programs were stored on giant plug boards the size of a wall map, about six inches thick, and it looked amazing. They used it for modelling air-conditioning and other things in an architecture school, fluid dynamics stuff.

I've also seen the liquid model of the economy.

I have a copy of Svoboda on three-bar linkage computing in the MIT Radiation Laboratory series, published in the post-war period. All the examples look to be flight- and ballistics-related; I guess these kinds of things were used for bomb and gun aiming, and radar.


Not quite the same, but there’s a fascinating podcast episode about analogue computers on Omega Tau with Bernd Ulmann, who runs the Analog Computer Museum near Wiesbaden, Germany.

http://omegataupodcast.net/159-analog-computers/


I'm thoroughly irked by blog posts like this. Fluidics and microfluidics are still an area of active research. The post ponders uselessly that fluidic-based computers could be viable in practical applications because of parallelism, and infers that because Amish people use air for tools, as do many others, this is somehow special. Articles like this smell to me of useless 'what if'. It is one thing to dream, but this steps past that and presents no real useful input.

I've enjoyed making microfluidics in the past, and hope to make more in the future, but 'hey, do it in parallel' isn't useful; it's obvious.


I think you have misinterpreted the intended audience. It seems to me it's clearly intended as "hey, did you know this was a thing?" for a passing yet curious layman.

It rather briefly covers the main basic questions one might have after first being exposed to this idea.

What even is it? Computation. With fluid. Is it possible? Water jet example. Okay but seriously, is it practically possible? Parallelism, so sure. Is it useful? Venus.

I don't see how the main point of this blog post could possibly be parallelism. It's mentioned, but in about as much detail as all of the other topics I listed above.

The whole thing is also so simplistic that I don't see how it could possibly be targeted at people with actual knowledge - never mind experience - in the field.


I understand writing for an audience, yet at the same time the article is painfully erratic in its contents. It reminds me of writing stories with my kid. We're going to the pool, by the way there's a dragon, and that's why I should stay up late tonight.


It didn't come across that way to me. I suspect familiarity with the topic is to blame here. As I mentioned, it basically went through and gave a short answer to all the first common questions you see in any discussion about a new technology.

What is it? Does it work? Does it actually work in practice? What do you do with it?

For somebody first exposed to the topic, that's kind of the basic starting point. The post kind of directly went from one answer to the next. As I was reading it the post felt pretty natural. But if you already know all that and are instead reading for more detailed knowledge, yeah I can see why that might seem particularly erratic. It was probably kind of like reading a Q&A without any of the Qs.


I agree with your sentiment. As someone who was previously unaware of fluidic-based computing I found the article interesting, informative and well-written. Indeed, it has inspired me to look further into the subject.


I know there are lots of research papers out there talking about it, but do you know whether today's commercial microfluidic "lab on a chip" devices use fluidic logic gates, or is this still confined to academia?


I'm not sure of the current state of industry in that area. I know minifab make a few things in that area, but the specifics escape me now, it's been a while since I saw them. I'm not aware of any lab on a chip devices that work outside of a regular lab.


There is a sense of romanticization around "analog" computing, with air/fluids and all kinds of techniques. This is nothing new, as Scott Aaronson points out. These ideas tend to be novel (cool), but they are almost like the perpetual-motion machine ideas we constantly keep hearing about.

Quoting Scott:

> It’s important to realize that the idea of solving NP-complete problems in polynomial time using an analog device is far from new: computer scientists discussed such ideas extensively in the 1960s and 1970s. Indeed, the whole point of my NP-complete Problems and Physical Reality paper was to survey the history of such attempts, and (hopefully!) to serve as a prophylactic against people making more such attempts without understanding the history. For computer scientists ultimately came to realize that all proposals along these lines simply “smuggle the exponentiality” somewhere that isn’t being explicitly considered, exactly like all proposals for perpetual-motion machines smuggle the entropy increase somewhere that isn’t being explicitly considered.

With the exception of Quantum Computers, but in limited cases, quoting Scott:

> (Incidentally, quantum computing is interesting precisely because, out of all “post-Extended-Church-Turing” computing proposals, it’s the only one for which we can’t articulate a clear physical reason why it won’t scale, analogous to the reasons given above for memcomputing. With quantum computing the tables are turned, with the skeptics forced to handwave about present-day practicalities, while the proponents wield the sharp steel of accepted physical law. But as readers of this blog well know, quantum computing doesn’t seem to promise the polynomial-time solution of NP-complete problems, only of more specialized problems.)

https://www.scottaaronson.com/blog/?p=2212

From listening to Scott for many years, I don't have the expertise in this area to say it definitively, but most likely, your cool new analog computing idea isn't going to break RSA anytime soon ;)

Further reading: https://www.scottaaronson.com/democritus/lec14.html


The suggestion is not an analogue computer at all (did you read the article?) but a digital computer produced from a different medium. Microfluidics is already being used in a really serious way in the engineering automation of most biomolecular tech, especially the latest generations of DNA sequencing machines. There the advantage of fluidics is that you can mix and match the macroscopic logic domain with biomolecular function in a liquid phase.


My comment was general, not to do with the article except for the title :)

I couldn't open it; it asks me to log in with a Medium account.


There's also photonic circuits, and molecular electronics.


First, cracking RSA is not believed to be NP-complete. Second, even if it were, an analog computer could in theory provide a large constant-factor speedup that would break current key sizes. But still, it seems very unlikely that an analog computer breaks RSA.


Ha, this reminded me of Ted Chiang's short story "Exhalation" — great read!

https://en.wikipedia.org/wiki/Exhalation_(short_story)


> The human brain operates at a measly 30 Hz. Still the human brain outperforms almost every computer. It has been calculated that the human brain has a processing power of 6 peta flops. That is six million billion calculations per second. Which compares favorably to the worlds fastest super computer

This caught my eye. It seems entirely too simplistic to claim the brain has a "Hz" value or to compare it directly to traditional computing. I'm doing some research, but is anyone here more knowledgeable on this?


It is too simplistic. Heck, even for CPUs, execution is often parallel, out of order, and so on, depending on the type of task that needs to be performed. The same is the case with the human brain.

I’d say that they probably took the simplest task possible, saw how fast the human brain could do it, and ended up with 30 Hz. Seems like a very silly thing to me.


No, it is based on this: https://en.wikipedia.org/wiki/Neuroscience_of_rhythm

The human brain obviously works in parallel but that doesn't mean that it doesn't have something akin to a clock frequency. There is a limit to how quickly signals can propagate through the brain.


Well, you could assign an approximate Hz for the brain by looking at how fast a single neuron can fire: around 500 Hz [0]. But this is wrong in two ways. While one neuron is "reloading", another could fire, as they operate out of sync. But also, one pulse from one neuron doesn't really convey that much information; it takes more than that to send a message. You could also look at brainwaves or, say, flicker fusion thresholds, which give a value in the double digits.

[0] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5067378/
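For what it's worth, the back-of-the-envelope arithmetic behind figures like "6 petaflops" usually just multiplies neuron count, synapses per neuron, and an average firing rate. A rough sketch in Python, with round numbers that are assumptions rather than measurements:

  neurons = 8.6e10           # ~86 billion neurons (commonly cited)
  synapses_per_neuron = 7e3  # rough average, assumed
  firing_rate_hz = 10        # assumed average rate; real averages are debated

  synaptic_events_per_s = neurons * synapses_per_neuron * firing_rate_hz
  print(f"{synaptic_events_per_s:.1e} synaptic events/s")  # ~6.0e+15, i.e. ~6 "peta-ops"

Whether one synaptic event counts as one "flop" is, of course, exactly the kind of apples-to-oranges comparison being objected to above.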


Also it seems that neurotransmission is not discrete/binary but more complex (normally distributed stateful thingy?)


An action potential is a discrete thing that either happens or doesn't happen. This is the dominant mode of neurotransmission. But yeah I do believe there are some other possibilities too, but they are less important.


But isn't the firing function a lot of per-neuron state variables?


It links to an article and a Wikipedia article which explain it:

https://en.wikipedia.org/wiki/Neuroscience_of_rhythm

https://patrickjuli.us/2016/04/06/what-is-the-processing-pow...

It is based on the fastest neurologic rhythms in the human brain.



Stanford has demonstrated a small water-droplet-based circuit: https://youtu.be/m5WodTppevo


Here's a water-based 3-bit adder: https://www.youtube.com/watch?v=6qP9HfUOCN4
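To make concrete what a circuit like that computes, here is a tiny gate-level ripple-carry adder in Python (my own sketch, unrelated to the video's actual implementation); each boolean operator corresponds to one gate that the water version builds out of channels:

  def full_adder(a, b, carry_in):
      """One full adder built from AND/OR/XOR gates."""
      s = a ^ b ^ carry_in
      carry_out = (a & b) | (carry_in & (a ^ b))
      return s, carry_out

  def add3(a_bits, b_bits):
      """Add two 3-bit numbers given as (lsb, mid, msb) tuples of 0/1."""
      carry, out = 0, []
      for a, b in zip(a_bits, b_bits):
          s, carry = full_adder(a, b, carry)
          out.append(s)
      return tuple(out), carry  # three sum bits plus the final carry

  # 5 + 6 = 11: (1, 0, 1) + (0, 1, 1) -> sum bits (1, 1, 0) with carry 1, i.e. 0b1011
  print(add3((1, 0, 1), (0, 1, 1)))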


I suppose efficiency could then be measured in CFM instead of TDP, with a fan rated to deliver a certain amount of CFM to power things.

Then in large server farms you'd basically need specialized HVAC engineers to design them, and admins with earplugs and goggles trained on the safety protocols of walking through 40mph wind tunnels through rows of server racks. "Server crash" could refer to an accident navigating the wind currents.


Tantalum wires, a lesser-known alternative to transistors: https://hackaday.com/2014/04/15/retrotechtacular-the-cryotro...


As I was reading this, I immediately thought back to the steampunk rovers article posted on HN not too long ago. Using mechanical or fluidic technology could allow us to explore places we haven't had access to before.

My next thought was, what about random mechanical malfunctions? We face this with electronic devices too: electromagnetic interference, cosmic rays, etc. It's all about understanding the operating conditions where the device will be used and designing accordingly. It seems like most of the techniques we have learned in the electronics world have an analogous approach in mechanical computing.

It's fun to think about the possibilities, and will be interesting to watch how old tech like this could be used in the future.


Great article, but poorly written, with quite a few grammatical, punctuation, and phrasing issues. Anyway, the notion of using compressed air reminds me of an innovative pneumatic engine system that could be used to power all kinds of automobiles - but you can imagine the one huge drawback of exploding air tanks, which is probably why it went nowhere. Still, pneumatic tools are often the preferred method for many construction professionals.


Compressed air is by far the most expensive form of power. Compressed-air tools are used because they are fairly cheap and small yet high-performance. (And in mining because they’re inherently Ex-safe.)


The author isn't a native English speaker.


How to make simple microfluidics using Shrinky Dinks. They work pretty well as a simple DIY method.

https://hackaday.com/2019/04/23/making-microfluidics-simpler...


I've read somewhere on the Russian internet, from a guy who operated microfluidic devices for a long time, that the main problem with them is that regardless of how good the air filters are, the logic elements get dirty all the time and cleaning them is hard. He also mentioned intense noise in the operating room.


One example given is a rover on Venus, because fluidics would keep working at the high temperatures. I really wonder how one would implement cameras, radar, or lidar for navigation, and the radio uplink for remote control, without using electronics.


Two other applications for fluidics from the old days were nuclear reactors, which are full of hyper-purified water, and implantables in the body: maybe your internal insulin pump in the year 2050 would have an insulin-dosing program written in the fluid mechanics of blood and powered by blood pressure.

If we could just find a problem for the fluidics solution... kind of like lasers in 1960: it works, so now what do we do with it?

The other interesting fluidics note is that it's been popular in hard sci-fi for decades (well, for certain very small values of popular). The problem with fluidics is that you need a pretty big digital computer to optimize the fluid mechanics, and fancy digital computers to run the CNC machines that carve out the 3D shapes. But if you could be VERY patient, do the calculations by hand, and find a way to make the processors using 2D photolithography, maybe space aliens would have a fluidic technology.

Another interesting note is that this scales with size and the speed of sound in the liquid, so it's always going to be slower than photonics or electronics. But the computational power needed to do "stuff" seems to scale on a power law, so the invisible hand of cheapness means your smart thermostat for a hydronic hot-water heating system would inevitably be fluidic in nature given an infinite number of years of market pressure. Yeah, a "big, easy to mess with by hand" system is as slow as a pocket calculator, but we sold the world a heck of a lot of pocket calculators over the decades, and if you shrink the dimensions by a million and increase pressure modestly you could run a fluidic cell phone, perhaps.
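On that speed point, a quick order-of-magnitude comparison (my numbers, and only rough): signals in a fluidic device move at the speed of sound in the working fluid, versus a sizable fraction of the speed of light on a copper trace, so per millimetre of "wiring" the fluidic version is roughly five orders of magnitude slower.

  c_sound_water = 1.48e3    # m/s, speed of sound in water (approx.)
  c_signal_copper = 2.0e8   # m/s, roughly 2/3 of c on a typical trace (approx.)

  d = 1e-3                  # one millimetre of interconnect
  t_fluidic = d / c_sound_water
  t_electronic = d / c_signal_copper

  print(f"fluidic:    {t_fluidic * 1e9:8.1f} ns per mm")     # ~675.7 ns
  print(f"electronic: {t_electronic * 1e9:8.3f} ns per mm")  # ~0.005 ns
  print(f"ratio:      ~{t_fluidic / t_electronic:.0f}x slower")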


Sonar could probably be implemented using fluidics for navigation on Venus. Now, a big problem is that it has traditionally been difficult to make fluidics operate at ultrasonic frequencies. There have been fluidic ultrasonic beam-break sensors that worked on the principle of ultrasound disrupting a laminar jet into turbulence, but these can't "directly" sense ultrasound. The response rate is also somewhat slow and not that sensitive. There are also means of heterodyning ultrasonic frequencies, but the method relies on a transitional jet, so it's not reliable.

However, if you do the math, it appears that fluidic amplifiers with dimensions that do not require semiconductor processing to make, using helium as the working fluid, should have flat gain to about 26 kHz, which is in the ultrasonic range. What really makes this work is the fact that helium becomes less viscous as it gets much hotter.

In fact, power requirements for fluidic amplifiers may be about the same as for their electronic counterparts (~0.2 mW per amplifier). In addition, laminar fluidic amplifiers have very low internal noise, meaning we should be able to amplify the fairly weak sonar return signals. A simple sonar sensor which reports back range should be possible, but a more interesting possibility is using acoustic metamaterials to do "image recognition" to steer the rover away from obstacles.

A couple of problems are how to couple sound from the fluidic circuits to Venus's atmosphere and how to keep the helium contained. Encasing the entire circuit in metal and using metal throughout would ensure that the leakage rate is insignificant. The issue is that making a reliable metal-bellows pump could end up being a boondoggle. NASA tried to make an RTG that used a Stirling engine rather than a Seebeck generator, but it stopped working reliably within a month or so, which may have been due to fatigue in the metal bellows. Another problem is that it's difficult to determine sonar system performance due to the difficulty of calculating return strength.
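As a rough illustration of the timing such a sonar would be working with, here is a back-of-envelope sketch in Python. The ~410 m/s sound speed for the hot, dense CO2 near the Venusian surface and the 20 m detection range are assumptions for this example only:

  c_sound = 410.0          # m/s, rough sound speed near the Venusian surface (assumed)
  obstacle_range = 20.0    # m, distance to a rock we would like to detect (assumed)
  pulse_freq = 26e3        # Hz, the ~26 kHz flat-gain limit mentioned above

  round_trip = 2 * obstacle_range / c_sound
  wavelength = c_sound / pulse_freq

  print(f"round trip: {round_trip * 1000:.0f} ms")  # ~98 ms
  print(f"wavelength: {wavelength * 100:.1f} cm")   # ~1.6 cm, sets the feature size you can resolve

A round trip on the order of 100 ms is slow enough that millisecond-scale fluidic logic could plausibly keep up with the returns.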


You would use magnetic logic. There are materials that can function as core memory even at the temperatures of the surface, and what is little known is that cores can also serve as logic units (NOR, NAND, etc.) if they are given a suitable clock drive. This clock signal could be generated with an alternator driven by the wind on the surface.


All that is explained in the linked article if you read it: https://erik-engheim.medium.com/making-a-non-electric-rover-...

Fluidics can run amplifiers for acoustic signals, which you can use for transmission.

Seeing can be done the way bats do it. The article gets into the details.


In one of my comp sci classes decades ago, I remember the discussion steering toward the idea that logic gates don't have to be electronic; they could be built with water pipes, for example.

Edit: I wish I could edit the grammar and syntax mistakes.


Microprocessor implies micrometer scale. I highly doubt that valves like that would work if scaled down to that size.



Fluid computing within a simulated environment on a modern computer will help me to build neural fluids.


I wonder if similar rules apply with microfluidics. The idea of solid-state devices has fascinated me.


Medium should be banned here. One of the biggest ironies in recent times is to see many self-appointed tech leaders posting their thoughts on a platform which offers nothing except nuisances. You would expect a technologically minded person to at least have a basic WordPress site or a simple HTML site.


If you don’t want to give away all your writing, then that is not a good choice. It is not unreasonable for people spending time researching and writing an article to get compensated for that effort. Ads are one way of getting compensation; another is to use something like Medium. Medium also gives much better exposure than something like a personal blog.

As someone who has written a personal blog for 20 years, I can tell you Medium has made a huge difference. I have switched to doing the bulk of my writing there. It gives better reach and allows you to get compensated for the effort.


I love the idea of writing a Navier-Stokes simulation on a fluidics-based processor.


Wow, the first picture makes me wonder how much of that was in the minds of EEs at the time triodes, tubes, and transistors were invented.



