Vladimir Lukyanov's hydraulic computer (pruned.blogspot.com)
317 points by bogidon on Nov 25, 2018 | 72 comments




In the 1980s, people in chemistry labs who needed to find the area beneath a curve used to print or plot the curve, cut out the area, and weigh it. This worked quite well, as they had accurate scales.
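The arithmetic behind the trick is simple; here's a rough sketch (the grammage and axis scales are hypothetical, just to show the conversion):

    # Hypothetical numbers: cutout weighs 1.30 g, plotted on 80 g/m^2 paper
    # at 2 cm per x-unit and 5 cm per y-unit.
    mass_g = 1.30
    grammage_g_per_cm2 = 80.0 / 10_000   # convert 80 g/m^2 to g/cm^2

    paper_area_cm2 = mass_g / grammage_g_per_cm2      # physical area of the cutout
    area_data_units = paper_area_cm2 / (2.0 * 5.0)    # divide by cm-per-x-unit * cm-per-y-unit
    print(paper_area_cm2, area_data_units)            # 162.5 cm^2 -> 16.25 data units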


This reminds me of how the geographical center of the US was once found by balancing a cardboard cutout [1]. An accompanying document explains the limitations and caveats of this method and of other methods that produce different results, while musing about the discrepancy between the ambiguity of the question and the public's expectation of a clear, reproducible answer.

[1] (PDF) https://www.ngs.noaa.gov/PUBS_LIB/GeoCenter_USA1.pdf


There was one experiment in my undergraduate physics lab course just ten years ago in which we also had to integrate like this. Not because we lacked better methods, but so that we would experience firsthand that it can be done this way.


Let's face it, they had that lab set up from before and didn't want to update it :D


I think it would actually have been easier to use current equipment than to maintain the older gear, but it was really about doing the work manually. Think of it like having to learn mental arithmetic before being allowed to use a calculator. And so we got to write lists of data points or plot them with an X-Y plotter instead of collecting measurements automatically. We hated it at times (noting a single value every 5 minutes is way more annoying than it might sound! The interval is too long to keep you busy and too short to do anything meaningful in between), but in retrospect it was the right way to learn to do experiments, in my opinion.


More generally, there was a dedicated instrument to calculate areas of a map or drawing, the planimeter, widely used before the advent of (affordable) computing:

https://en.wikipedia.org/wiki/Planimeter

https://news.ycombinator.com/item?id=16032930


If they had a plot of the function in a digital format that could be printed, what stopped them from doing numerical integration?


It was not invented yet; the "Mathematical Model for the Determination of Total Area Under Glucose Tolerance and Other Metabolic Curves" paper came out in 1994... https://fliptomato.wordpress.com/2007/03/19/medical-research...


Original paper, which appeared in Diabetes Care:

https://math.berkeley.edu/~ehallman/math1B/TaisMethod.pdf

Reading it, it's quite hard to believe it wasn't a joke, but it doesn't seem to be. It thanks R. Kuc of Yale (a professor of electrical engineering) for his expert review.

Google Scholar says it has 336 citations, most of which appear to be non-ironic[0].

The original paper was immediately followed (i.e. in the same issue) by two corrective articles[1], one titled "Tai's Formula Is the Trapezoidal Rule". So why did they publish it at all? This page discusses that question:

https://academia.stackexchange.com/questions/9602/rediscover...

...and links to this, which talks about how wheels are reinvented all the time.

https://academia.stackexchange.com/questions/7360/how-common...

[0] http://scholar.google.com/scholar?cites=18129095207210817294...

[1] http://libgen.io/scimag/ads.php?doi=10.2337%2Fdiacare.17.10....


Before following your link, I was getting all ready to point you to Wikipedia's page on Simpson's rule (invented sometime in the 18th century).

Funny anecdote; it serves as a reminder not to take all academic literature as gospel.


I hope that's a joke. The most basic numeric integration method is so simple I don't think it needed to be invented.


> The most basic numeric integration method is so simple I don't think it needed to be invented.

Exactly. The individual who came up with that paper shows a complete lack of imagination, not merely ignorance; how someone (scientist or not) can fail to see the likelihood of such a simple method having been discovered before is quite puzzling to me. This happens to be closely related to what I was doing recently: implementing a very simple little tool to integrate raw data. I don't know calculus well at all, but here is my thought process (and probably most people's):

1. I just want basic integration of raw data; something simple, no interpolation at this stage.

2. Hmm, just draw a polyline through the points. How do I find the area it outlines? (Figured that bit out quickly; it's just basic trig and then some simplification, all very intuitive.)

3. OK, that's how I find the area. I wonder what this is called; it's too simple not to have been discovered hundreds of years ago... (goes and does some searching to find the matching definition)

... What kind of person gets to step 3, marvels at their re-invention of the trapezoidal rule, and goes straight to publishing a paper? How is it that they are "doing science"?
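For what it's worth, the whole thing fits in a few lines (a minimal sketch; the function name and sample data are mine, just for illustration):

    def trapezoid_area(points):
        """Area under the polyline through (x, y) points, i.e. the trapezoidal rule.
        Each segment contributes one trapezoid: average height times width."""
        return sum((y0 + y1) / 2 * (x1 - x0)
                   for (x0, y0), (x1, y1) in zip(points, points[1:]))

    # Samples of x^2 on [0, 1]; the exact area is 1/3.
    pts = [(i / 100, (i / 100) ** 2) for i in range(101)]
    print(trapezoid_area(pts))  # ~0.33335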


And how is it that the paper got reviewed and accepted for publication without somebody pointing out the obvious?

As a 1980s high school student, my first thought of how I'd find the area under a curve with a computer was basically to pick random points and see if they were above or below the curve. At the time I had about one year of exposure to Apple II BASIC. My math teacher said "yeah, that's the Monte Carlo technique."
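That idea is only a few lines of code today (a minimal sketch, assuming the curve is a function f bounded between 0 and y_max on [a, b]):

    import random

    def monte_carlo_area(f, a, b, y_max, n=100_000):
        """Estimate the area under f on [a, b] by throwing random darts at the
        bounding box and counting the fraction that land below the curve."""
        hits = sum(1 for _ in range(n)
                   if random.uniform(0, y_max) <= f(random.uniform(a, b)))
        return (b - a) * y_max * hits / n

    print(monte_carlo_area(lambda x: x * x, 0.0, 1.0, 1.0))  # ~1/3, noisy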


It's not. It's serious and sad. I remember a news article from a few months ago that mentioned (and made fun of) exactly this paper.


It's a good example of why multidisciplinary research is essential, though.


When I was learning to use a chromatograph (barely a decade ago), the textbook still suggested the "plate stacking method" to figure out the AUC (area under the curve): basically, see how many rectangles fit under the curve.
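Plate stacking is essentially a Riemann sum; a minimal sketch, assuming the curve is available as a function f:

    def riemann_area(f, a, b, n=1000):
        """Left Riemann sum: stack n equal-width rectangles under f on [a, b]."""
        width = (b - a) / n
        return sum(f(a + i * width) for i in range(n)) * width

    print(riemann_area(lambda x: x * x, 0.0, 1.0))  # ~0.3328; exact is 1/3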


The scissors-and-scale user interface is much faster than the human/computer interface for numerical integration after fitting a smoothed curve to the measured data.

The human scissor operator, in theory, can exercise far better judgment than an algorithm at throwing out invalid data points when fitting curves. ("When I was measuring data point 15 I know I was distracted; no surprise it's an outlier; I can safely disregard it completely," etc.)


I think the concomitant risk is that a human scissor operator can also apply bias, sometimes unconsciously.


Ha! Very interesting. And the density of paper was uniform enough? Or maybe they used a different material?


I'm not sure if they ever did a scientific study on this, but you can choose your accuracy by playing with the scale of the plot.


You could always use a stack of paper to average out the error.


I don't have actual numbers but it's pretty uniform.


In a related vein, Ars Technica writer Sean Gallagher did a fantastic story on the US Navy's analog fire control computers a few years ago:

https://arstechnica.com/information-technology/2014/03/gears...

The instruction videos alone are worth the visit. Geekery of the first order.


reactive analog programming ;)

There's a remarkable feature of these old videos: they're so easy to follow. No matter the topic (analog gear computers, car mechanics, wave principles and radio), it's always fun yet quite precise. We've lost something here.


Incidentally, these were the computers in the early Heinlein books.

I was mulling over the comment here a few days ago pointing out Heinlein's prescient depiction of networked computers, and thinking that the hard part would be having mechanical computers pick up the phone and send electromagnetic pulses down the line. Perhaps pecking at telegraph keys ...


Ash the android in Alien used a water-based computer. Or maybe that was milk. It looks like he might have had problems solving non-homogeneous differential equations, because the cream would rise to the top.

https://www.youtube.com/watch?v=VA8jv1M6Y2g


I always thought it was some kind of hydraulic fluid but I prefer your explanation. ;)


See also https://youtu.be/NAsM30MAHLg

An analog device for performing Fourier analysis.


These are absolutely mindblowing. Thanks for sharing!


I know of two excellent examples of this. One is an entertaining modern attempt to recreate economic models with a fluid computer: https://vimeo.com/131690448

The other is the fucking fantastic Bay Model, a 1:1000 physical model of the San Francisco Bay: https://en.wikipedia.org/wiki/U.S._Army_Corps_of_Engineers_B...

It was built in the 1950s to study the effects of various plans, including one proposal to divert all incoming rivers to "productive" use. It was eventually made obsolete by computer simulation, but it's still there. I bring a lot of my nerdy out-of-town visitors there; it's amazing to walk around on a 2-acre simulation.


Yes, first link is based on the MONIAC: https://en.wikipedia.org/wiki/MONIAC


Trickle-down economics


There's a little more info in this (somewhat wonderful) article: https://pruned.blogspot.com/2012/01/gardens-as-crypto-water-...


OK, we've changed to that from https://en.wikipedia.org/wiki/Water_integrator. Thanks!


> To better explain how it works, here is a description by Steven Strogatz of what I'm assuming is a comparable device. Built in 1949, nearly a decade and a half after Lukyanov's, it's called the Phillips machine, after its inventor, Bill Phillips.

In the front right corner, in a structure that resembles a large cupboard with a transparent front, stands a Rube Goldberg collection of tubes, tanks, valves, pumps and sluices. You could think of it as a hydraulic computer. Water flows through a series of clear pipes, mimicking the way that money flows through the economy. It lets you see (literally) what would happen if you lower tax rates or increase the money supply or whatever; just open a valve here or pull a lever there and the machine sloshes away, showing in real time how the water levels rise and fall in various tanks representing the growth in personal savings, tax revenue, and so on.

“It’s a network of dynamic feedback loops,” Strogatz further writes. “In this sense the Phillips machine foreshadowed one of the most central challenges in science today: the quest to decipher and control the complex, interconnected systems that pervade our lives.”


Video about the MONIAC here: https://www.youtube.com/watch?v=rVOhYROKeu4


> The water level in various chambers (with precision to fractions of a millimeter) represented stored numbers

I wonder how they dealt with thermal expansion. Maybe instead of using water at room temperature they heated it to something higher and let a thermostat take care of it? But it would still be very hard to distribute heat evenly.
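Back-of-the-envelope (assuming a 10 cm column and water's volumetric expansion coefficient of roughly 2.1e-4 per kelvin near room temperature), a modest temperature swing already eats into the claimed precision:

    # Hypothetical numbers; water's expansion coefficient varies strongly
    # with temperature (it is ~zero at 4 C).
    height_mm = 100.0        # water column height
    beta_per_K = 2.1e-4      # volumetric expansion coefficient near 20 C
    delta_T = 1.0            # a 1 K room-temperature swing

    # For a uniform tube, the level tracks the volume, so the shift is about:
    delta_h_mm = height_mm * beta_per_K * delta_T
    print(f"{delta_h_mm:.3f} mm")  # ~0.021 mm per kelvin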


Thermodynamics lab tech was pretty advanced by that time.

The part that reads bogus to me is the fractions of a mm: the water meniscus is sensitive to contamination and diameter, and it has "always" been easier to measure liquid masses to more significant figures than liquid volumes. I suspect the journalist filter turned mg into mm, or a precision scale produced repeatable results equivalent to fractions of a mm in the container. The only solution I can come up with that would work in that era would be something weirdly optical involving mirrors and multiple floats.

I vaguely remember, a quarter century ago in a chemistry lab, placing a beaker of distilled room-temperature water on a very nice precision scale as a demonstration, while the instructor had us calculate how long it would take the inch or so of water to evaporate based on the slowly decreasing mass; the result, during a dry winter, was somewhere around a month. The room must have been sealed and at 100% humidity, because in a very hand-wavy way a small fraction of a mm of height would represent around a minute of evaporation in normal winter lab air. I also wonder how the volume of water changes as CO2 is absorbed or emitted by the water; even the smallest gas bubble could mess up fraction-of-a-mm measurements.


If you squinted, you might consider the U.S. Army Corps of Engineers Bay Model a giant low tech special purpose water computer.

https://en.wikipedia.org/wiki/U.S._Army_Corps_of_Engineers_B...

>The U.S. Army Corps of Engineers Bay Model is a working hydraulic scale model of the San Francisco Bay and Sacramento-San Joaquin River Delta System. While the Bay Model is still operational, it is no longer used for scientific research but is instead open to the public alongside educational exhibits about Bay hydrology. The model is located in the Bay Model Visitor Center at 2100 Bridgeway Blvd. in Sausalito, California.


I believe one could now use microfluidics to leverage similar principles but miniaturize it almost to the size of a chip.

Here is an interesting demo:

https://youtu.be/7z8I7awRYY4?t=114


Hydraulic integrator would be a better translation.


Fix it!


The T-64B's ballistic computer used to have one.


Reminds me of the stone/water brains of the Quatzoli people in Ken Liu’s short story, "The Bookmaking Habits of Select Species".

http://www.lightspeedmagazine.com/fiction/the-bookmaking-hab...


I'm not sure how different this is from the ball integrator [0] or Vannevar Bush's differential analyser [1]. The water computer article says partial differentiator, then integrator; it's unclear.

[0] https://en.wikipedia.org/wiki/Ball-and-disk_integrator

[1] https://en.wikipedia.org/wiki/Differential_analyser



This reminds me of "rod logic" computers: https://hackaday.com/2015/10/19/rod-logic-and-graphene-elusi...


This rather reminds me of the economic model in Terry Pratchett's "Making Money".


Fun fact: the device you're referring to was actually based on a different real-life water-based computer, the MONIAC [1]. It used basins of water, pipes, and adjustable flow regulators as a physical metaphor for the movement of money in macroeconomics. I'm personally a really big fan, and I've been meaning to work up a simulation for quite some time now.

[1]: https://en.wikipedia.org/wiki/MONIAC


Awesome. I imagine a Water Computer Construction Set that allows you to "plumb" a computer program.


Similarly, there was a story about a computer of sorts based on water surface tension, encoding complex equations and letting the physics approximate a solution in parallel. Can't find the name though.


Found an online link to one of the articles Dewdney wrote for Scientific American about "analog computers" (this is the second article, there was one a year earlier):

http://www.softouch.on.ca/kb/data/Scan-130202-0003.pdf


If I recall correctly, the World3 model which the authors of 'The Limits to Growth' used also utilized fluids to simulate the increasing or diminishing influence of different factors on our ecosystem.


That sounds like something out of the Dune universe.


I think that would be forbidden* - a machine doesn't have to use semiconductors or even electrical current to be a machine.

* "Thou shalt not make a machine in the likeness of a human mind."


Wonder if they had cross-core security vulnerabilities?


Even better: is remote hacking of an analog computer even possible?


It's not connected to anything, so probably not. However back when we used purpose-built electromechanical switching systems for telephones, people exploited them[0].

[0] https://en.m.wikipedia.org/wiki/Phreaking


If the phone system used water instead of electricity for in-band signaling, would phone phreaks have to inject blue fluid into the pipes to exploit them?


If water pipes were used for signalling, it would probably be as acoustic conduits (analogous to wires, which are electromagnetic wave conduits). You would exploit them in just about the same way as the electromechanical systems were.


Gives a whole new meaning to the term "wiretap".


You'd have to worry about information leakage.




Talk about floating point precision!


"Can someone mop up the memory leak?"


Reminds me of a discussion I heard:

Customer: The phone complains memory is full. Rep: OK, there is probably a memory leak. Customer: No, it cannot be leaking, since it is full!


No point, only float.


Wonder if it has a heat sink.


And if it's air- or water-cooled.


Or if the moon's gravitational pull flips the bits.



