Computers should expose their internal workings as a 6th sense (interconnected.org)
158 points by tobr on Aug 29, 2021 | 103 comments



The funniest demonstration that I watched was at the computer museum at the University of Stuttgart (it's just a single room, but it contains a lot of history!). The guide took an old, butchered radio that was reduced to a coil attached to a speaker and put it on top of the front panel of a PDP-8. Then he started a Fortran compiler, which would take several seconds to complete. During that time, the radio made kind of hideous digital beeping noises from the CPU's electromagnetic emissions that got picked up by the coil inside. You could easily learn to distinguish different compiler phases and tell whether the program made progress. The guide explained that this was a common way for operators back in the day to keep track of the jobs they were running while taking care of other tasks: were they still running? Did they get stuck? Did the job complete and is it time for the next one? Some inventive guys figured out that when you wrote certain instruction sequences, the noise would become tonal and the pitch could even be tuned to some extent. That got them to write programs which would compute nonsense, but when you picked up the emissions, you would hear music! The museum guide ran a few of these programs to our great amusement :).

I've yet to see this mentioned - or demonstrated - anywhere else.


During particle physics experiments, the number of reactions of interest that get detected is an important thing to monitor. If you stop detecting the reaction of interest, it might mean that a magnet drifted out of tune, or that your data acquisition crashed; either way, the key thing is that something requires human intervention to fix. Frequently, the trigger signal, which emits a short pulse any time data are to be collected, would be sent into a speaker. You get a buzzing, not unlike a Geiger counter at high rates, which blends into the background noise, and that tells you how healthy the experiment is.

The funniest thing was seeing how everybody was paying attention to that buzzing at all times. You could have a dozen people talking about different aspects of the experiment, but if that buzzing drops out for a few seconds, every single conversation immediately stops. Usually it would come back after a few seconds and the conversations would resume, but it was fascinating to have visible proof that everybody was ready to drop their current work in order to get the experiment running again if anything happened.

Edit: I also heard tales of somebody who had trained themselves to wake up if the buzzing ever stopped. That way, they could take short naps during the night shift, while still being present and ready to resolve any issues that came up.


AM Radio Computer Music:

Conway's Game of LIFE in a DEC PDP-7 w/ Type 340 Display

https://www.youtube.com/watch?v=hB78NXH77s4&ab_channel=Livin...

Early computer graphics -LIFE - 4 Gosper Glider Guns on a DEC PDP-7 Type 340 display

https://www.youtube.com/watch?v=JhvOw7vW4iA&ab_channel=Livin...

DEC PDP-7 w/ Type 340 display running Munching Squares and Spirograph

https://www.youtube.com/watch?v=V4oRHv-Svwc&ab_channel=Livin...

Also PDP-7 related (but with more melodic music), here's a video remix I made of an early CAD system called PIXIE (with the first known implementation of pie menus, using a light pen) running on a PDP-7 with a type 340 display, networked with a Titan mainframe, at the University of Cambridge (one of the first network distributed graphics systems), set to music:

https://www.youtube.com/watch?v=jDrqR9XssJI&ab_channel=DonHo...


The GHC manual has this to say about the `-B` flag:

> -B
>
> Sound the bell at the start of each (major) garbage collection.
>
> Oddly enough, people really do use this option! Our pal in Durham (England), Paul Callaghan, writes: “Some people here use it for a variety of purposes—honestly!—e.g., confirmation that the code/machine is doing something, infinite loop detection, gauging cost of recently added code. Certain people can even tell what stage [the program] is in by the beep pattern. But the major use is for annoying others in the same office…”

https://downloads.haskell.org/~ghc/latest/docs/html/users_gu...


In case anybody is interested, the museum is http://www.computermuseum-stuttgart.de/


This reminds me of something really cool I saw a while ago but can't find for the life of me: it was a bit of JavaScript that ran in the browser and could be used to send a signal to a nearby AM radio! I can't remember quite how it worked, but I tried it and it definitely did work, after turning off my monitor, which apparently pumps out shit-tons of interference into the bottom end of the mediumwave band. I do remember that it was a demonstration of a security vulnerability for supposedly airgapped systems.


The first time I ever saw something like this was a program called Tempest for Eliza, which would generate patterns on a CRT screen that could be captured by an AM radio.

The website appears to still be available at:

http://www.erikyyy.de/tempest/



Not nearly as cool as what you described, but I took advantage of the horrible coil whine on my Dell XPS 15 9560. The Intel CPU, Nvidia GPU, and Toshiba SSD all had different pitches of coil whine. Based on the pitch and volume, it was very easy to tell which component was being stressed :)


I can relate. Around the time I got this tour of the museum, I was working on a rendering algorithm that was slow and could occupy the GPU for seconds at a time. For some reason - whether it was a poorly stabilized power supply or EM radiation, I do not know - I could hear pretty loud chirping noises when I had my headphones plugged into the onboard analog jack. It went so far that I could easily tell which part of the algorithm was currently running on the GPU, and I could sometimes even count single iterations. This was very helpful because the screen was of course frozen while the GPU was busy with my program.

This computer finally made me buy an external audio interface out of frustration. I went on to do some acoustics projects and I really needed cleaner audio for them.


I used exactly this phenomenon on my XPS13 to know when my Gentoo was in the "merge" phase of an ebuild (distinctive noise generated by copying a large amount of files).


This is probably one of the most relatable things I've ever read about that era of computing; it reminds me of setting up fun little git hooks and devops events to play sounds at various stages. Amazing.


I used to tune my shortwave radio to various frequencies emitted by my ZX Spectrum +2A. It had terrible shielding!

You could hear the noise/tone change with various different types of computation, and for some frequencies listen to the framebuffer scanout (I think) where the sound appeared to match the display changes. Definitely not in the UHF range of the actual signal though.


What we can do with EM radiation has always seemed so cool to me. It's just crazy how all this info is just flying through the air without us even noticing (without tools of course).


> without us even noticing

If only we had two receptors on the front of our face that were capable of detecting EM radiation with wavelengths between 380nm-700nm.


I don’t have a problem imagining that, but it would be amazing to have receptors that can detect EM radiation below and above 380-700 nm.


I think you know what I'm getting at though


I heard stories about this type of troubleshooting used in production from a colleague and friend about 40 years senior to me.


In the late 90's, the ops mgr at our studio hooked a network cable up to a little motor and attached a 5-6 foot long string to the motor shaft. He hung the motor-string thing in the corner of his office with the string dangling.

The motor made just a little bit of noise and the string would wiggle around indicating network activity. Soon he was able to know what was typical string movement and what was atypical frenetic motion that indicated a need to investigate.

He called it an "ambient interface" and said he had read about it somewhere.



Natalie Jeremijenko: LiveWire, Dangling String; Mark Weiser: Calm Technology, Ubiquitous Computing

https://en.wikipedia.org/wiki/Calm_technology

>Calm Technology

>History

>The phrase "calm technology" was first published in the article "Designing Calm Technology", written by Mark Weiser and John Seely Brown in 1995.[1] The concept had developed amongst researchers at the Xerox Palo Alto Research Center in addition to the concept of ubiquitous computing.[3]

>Weiser introduced the concept of calm technology by using the example of LiveWire or "Dangling String". It is an eight-foot (2.4 m) string connected to the mounted small electric motor in the ceiling. The motor is connected to a nearby Ethernet cable. When a bit of information flows through that Ethernet cable, it causes a twitch of the motor. The more the information flows, the motor runs faster, thus creating the string to dangle or whirl depending on how much network traffic is. It has aesthetic appeal; it provides a visualization of network traffic but without being obtrusive.[4]

[1] https://web.archive.org/web/20190508225438/https://www.karls...

[3] https://web.archive.org/web/20131214054651/http://ieeexplore...

PDF: http://www.cs.cmu.edu/~./jasonh/courses/ubicomp-sp2007/paper...

[4] https://web.archive.org/web/20110706212255/https://uwspace.u...

PDF: https://web.archive.org/web/20170810073340/https://uwspace.u...

>According to Weiser, LiveWire is primarily an aesthetic object, a work of art, which secondarily allows the user to know network traffic, while expending minimal effort. It assists the user by augmenting an office with information about network traffic. Essentially, it moves traffic information from a computer screen to the ‘real world’, where the user can acquire information from it without looking directly at it.

https://en.wikipedia.org/wiki/Natalie_Jeremijenko#Live_Wire_...

>Natalie Jeremijenko

>Live Wire (Dangling String), 1995

>In 1995,[9] as an artist-in-residence at Xerox PARC in Palo Alto, California under the guidance of Mark Weiser, she created an art installation made up of LED cables that lit up relative to the amount of internet traffic. The work is now seen as one of the first examples of ambient or "calm" technology.[10][11]

[9] https://web.archive.org/web/20110526023949/http://mediaartis...

[10] https://web.archive.org/web/20100701035651/http://iu.berkele...

>Weiser comments on Dangling String: "Created by artist Natalie Jeremijenko, the "Dangling String" is an 8 foot piece of plastic spaghetti that hangs from a small electric motor mounted in the ceiling. The motor is electrically connected to a nearby Ethernet cable, so that each bit of information that goes past causes a tiny twitch of the motor. A very busy network causes a madly whirling string with a characteristic noise; a quiet network causes only a small twitch every few seconds. Placed in an unused corner of a hallway, the long string is visible and audible from many offices without being obtrusive."

[11] https://web.archive.org/web/20120313074738/http://ipv6.com/a...

>Mark Weiser suggested the idea of enormous number of ubiquitous computers embedding into everything in our everyday life so that we use them anytime, anywhere without the knowledge of them. Today, ubiquitous computing is still at an early phase as it requires revolutionary software and hardware technologies.


And that has to be where he read it. Nice find.


That is fantastic.


Yes. And he later went back to school and became a user experience research scientist.


In the mid 90s, I worked with magneto-optical disk systems. The noises they made helped me (and others) diagnose their problems.

This type of "sixth sense" is also not limited to computers. When I worked in the aerospace industry, I heard a story about McDonnell Douglas replacing the F-15 cockpit fairing with a sleeker, fewer-piece version that reduced drag. Pilots found that without the noise from airflow over the metal joints, they didn't have as good a feel for speed and maneuvers.


Cars too. I learned to drive stick on a Honda Civic without a tachometer, but I learned to do without and tell what state the engine was in purely by its noise and vibrations.


I have a 90s pickup, manual, no tach. I use it to plow snow on my steep, long driveway. It has a "shift up" light, but it's always wrong. Without the windows open I have no idea when to shift.


This was how I was taught to drive stick, though all the cars I learned in definitely had tachometers. Nobody even brought them up.


One fun example is the "malloc Geiger counter", which clicks whenever memory is allocated: https://www.youtube.com/watch?v=7vn6aGgLKfQ

(Previous HN discussion: https://news.ycombinator.com/item?id=24303832)


Cool! It doesn't really differentiate in terms of allocation volume (only frequency). It would be interesting if the tick rate were proportional to something like log(allocation rate), so you could tell when something really heavy is being allocated.
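The log-scaled tick idea could look something like this (a purely illustrative formula, not taken from the actual malloc Geiger counter tool):

```python
import math

def click_interval(allocs_per_sec, base=0.5):
    # Inter-click delay that shrinks with the log of the allocation rate,
    # so a 1000x heavier allocator sounds only a few times busier rather
    # than collapsing into a continuous buzz.
    return base / (1 + math.log10(max(allocs_per_sec, 1)))
```

A linear mapping would saturate almost immediately for any real allocation-heavy workload; the log keeps the whole dynamic range audible.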


I miss being able to know if my program is frozen because of CPU (fan noises), disk access (HDD noises), or network (silence).

I've thought of "reimplementing" that for troubleshooting my own projects. Imagine `clang --sounds`, so that it makes a sound when

- Accessing disk.

- Sending something on the network.

- Allocating large buffers.

- Waiting for locks.

Or, alternatively, there's a constant background sound that is modulated on every function call depending on

- Call stack depth.

- If it's my code, library, or syscall.

- How long the last call took.

- Or just a unique modulation for each major function.

I think that's a nice, easy, and useful step before the more advanced applications mentioned in TFA.
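For what it's worth, CPython's audit hooks (3.8+) can already approximate a crude version of this for Python programs, without touching the compiler. The event-to-sound mapping below is made up; a real tool would play distinct samples instead of bell characters:

```python
import sys

heard = []  # event log, just so the effect is observable in code

def sonify(event, args):
    # Sketch of a 'clang --sounds'-style monitor: beep on disk access,
    # double-beep on network activity. "\a" is a stand-in for real audio.
    if event == "open":
        heard.append("disk")
        sys.stderr.write("\a")
    elif event.startswith("socket."):
        heard.append("network")
        sys.stderr.write("\a\a")

sys.addaudithook(sonify)
```

After installing the hook, every `open()` and every socket operation in the process makes itself audible, which covers two of the four bullets above with a dozen lines.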


This is why I'll never try to build a silent system. Coil whine + fans communicate quite a lot, but also fade into the background. Something about the noise from the fan just modulating slightly makes it less intrusive for me than a sound coming out of silence.


It's like living in a noisy apartment or on a busy street. Yes noise does fade into the background but subconsciously it still affects concentration/sleep/mood/creativity/etc. You won't know what you're (not) missing until you try it without the noise.


Brendan Gregg did something similar for wifi signal strength with bpf.

https://www.brendangregg.com/blog/2019-12-22/bpf-theremin.ht...


This alludes to a concept I've been wrestling with: "hard" versus "soft" understanding.

A professional in _any_ industry will pick up a notable difference between extremely similar states. This article is expressing the fact that older computers were easier to read.

I'm convinced it's a product of sophistication. Distributed systems used to be an enterprise thing, but now everything is technically a distributed system. Memory is orders of magnitude larger than it used to be. Drives now have no moving parts.

There are still ways to get an intuitive understanding, but they're... different, and certainly not audible. I've noticed that I can feel out I/O speeds when I'm power-using. I'm fairly convinced that many knowledge workers prefer a specific OS because this intuition pulls up false positives in a new environment.


This is an important but seemingly understudied subsection of my field (human-robot interaction). From some well-cited work:

"Our goal is to enable robots to express their incapability, and to do so in a way that communicates both what they are trying to accomplish and why they are unable to accomplish it... Our user study supports that our approach automatically generates motions expressing incapability that communicate both what and why to end-users, and improve their overall perception of the robot and willingness to collaborate with it in the future."

I'm not as plugged into human-computer interaction work, but as a user, it seems like this is sorely missing and getting worse. I wish I could get a happy medium somewhere between a full stack trace and silent failure, e.g. when my iCloud documents won't sync.

[1] https://dl.acm.org/doi/abs/10.1145/3171221.3171276


I’m reminded of a programmer who would add clicks on the PC speaker inside function calls so he could listen to the timing of them being called.
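That trick is easy to recreate today. A sketch in Python (the decorator name is made up, and the terminal bell stands in for the original PC-speaker click):

```python
import functools
import sys

def clicky(fn):
    """Emit an audible 'click' on every call, so call timing is heard."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        wrapper.clicks += 1
        sys.stderr.write("\a")  # stand-in for a PC-speaker tick
        return fn(*args, **kwargs)
    wrapper.clicks = 0
    return wrapper
```

Decorate a hot function and you can hear whether it's being called steadily, in bursts, or not at all.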


I wrote a terminal emulator in Forth on my Apple ][ that had different sounding key clicks for different classes of keys. Upper and lower case letters had different tones, and the sequence of digits had rising tones, and certain control characters like return and backspace and escape and punctuation and space all had their own unique sounds, so you could hear what you were typing and know that you typed the right keys when you were typing ahead quickly on a slow 300 baud ARPA TIP connection.

Also each time it beeped the bell it would start at a higher and higher tone rising to a fixed pitch, each starting higher and lasting less time than the last, so a lot of bells in a row would ramp up in tone and shorten out to a high buzz, so they weren't so annoying. Then it would decay back down after you didn't receive any bells for a few seconds. It was inspired by the way of an excited guinea pig squeals for lettuce.

https://www.youtube.com/watch?v=5jfoxSeJzWo&ab_channel=It%27...

Also, the underline cursor floated up and down and up and down in the character cell, so it was very easy to see where it was, and it drew a wavy line in the phosphor as it moved across the screen!


At one point I wrote a program that would tail a log file and play very short samples of different engine noises for each line that matched the corresponding pattern. The idea was that if something changed about the running of the system, I'd hear the noise change.

It didn't work spectacularly well though, and I gave up on the idea.
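The core of that idea fits in a few lines. A sketch (the patterns and "sounds" here are placeholders; the original used short engine-noise samples rather than bell characters):

```python
import re
import sys

# Hypothetical pattern -> sound mapping; first match wins per line.
SOUNDS = {
    r"ERROR": "\a\a",
    r"timeout": "\a",
}

def sonify_log(lines, sounds=SOUNDS):
    """Play a short 'sound' for each log line matching a pattern."""
    matched = []
    for line in lines:
        for pattern, sound in sounds.items():
            if re.search(pattern, line):
                sys.stderr.write(sound)  # swap in an audio-player call here
                matched.append(pattern)
                break
    return matched
```

In practice you'd feed it a live stream, e.g. `tail -f app.log | python sonify.py` with `sonify_log(sys.stdin)`.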


That's a nice idea! That or a light would be more convenient than logging if it's triggering a lot.


I suspect that it's much easier to hear the audible pattern than to see the pattern of a blinking light.


The frequency range of pulsing audio is much higher than pulsing light, that's for sure. Sound was used a lot to calibrate analytical instruments in labs: throw in a frequency divider, a tiny amp, and a tiny speaker, and you could easily tell if your frequency was stable, drifting...


Sure, I wasn't suggesting it was an improvement. I also assumed the actual pattern isn't important in a strict sense, just as an indication of timing and frequency?

But of course, in general either could be the case. And perhaps you don't want to wear earphones (or disturb colleagues), etc. I only meant it as an additional similar idea.


A waterfall display might work though, or coloured sequences: give each function its own colour, then draw the last stack as a pattern of colours. Humans are really good at spotting anomalies in patterns, probably because tree, tree, tree, wolf, tree was useful to our ancestors.
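The one-colour-per-function part could be as simple as hashing names to colours, so the same function always draws the same (an illustrative sketch):

```python
import hashlib

def func_color(name):
    # Derive a stable RGB triple from a function name, so each function
    # gets its own consistent colour in the waterfall without any
    # manual assignment.
    digest = hashlib.md5(name.encode()).digest()
    return (digest[0], digest[1], digest[2])
```

The mapping is arbitrary but deterministic, which is all the anomaly-spotting eye needs: a changed call pattern shows up as a changed colour sequence.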


This was common in home computers that could run a routine at every VBL (Vertical Blanking) interrupt, often chosen as the main timing 'tick' interrupt. Change the overscan border color for each part of the subroutine, and the timing gets reflected quite directly in the colored fringes that are displayed as the program executes.


A colleague 3D printed a stop light that showed red when Xcode was compiling and green when it was done.


Was it me??


Yes :) I almost forwarded the comment to you lolz!


you can have both and put a sound and/or light trigger to the logs with a grep and a tiny script.


As computing environments mature, a lot of the instrumentation that used to be presented to users has been going away. I do think that's too bad. One of the first things I install on any new machine is a visible network/CPU/memory monitor.

Humans constantly monitor their environment, and tend to only become conscious of something when it is behaving differently than it had been. Think of long drives - unexpected motions, odd noises, etc. trigger conscious attention.

Computers, especially of the 'cloud' variety, lack the incidental physical environmental interactions that give us those, so intentionally building them in is required. (And because they're intentional and artificial, they're at risk of manipulation, something else to worry about.)


A previous PC I built had faulty grounding with the headphone jack and would constantly play static mixed in with whatever intentional analog signal was being played. When I played single player games certain patterns signaled a big boss fight was about to happen.


This is a common thing no?


>> certain patterns signaled

patterns in noise (static), not in the signal (game audio)


Computers should be noisier is an interesting take.

I can't tell you how much of a relief it was to finally have a computer with solid-state everything, an insulated case, 120mm fans. Turned it on: silence. It's good for recording sound, it's good for anyone else who happens to be in the room.

Why sound? why not just a.. oh idk. disk access light? Maybe add other little leds for other stuff?


>Why sound? why not just a.. oh idk. disk access light? Maybe add other little leds for other stuff?

Meanwhile, I'm over here with every LED in my apartment that's not tied to an IR receiver taped over with black electrical tape. I don't mind fan noise at all, but the sea of twinkling LEDs that comes from modern electronics drives me insane.



Yes I'm glad to hear I'm not the only one!


One of my ideas is for application windows to have a button to flip to the internals of the application - see threads, connections, progress bars, concurrent loops, memory allocations, even stacks. This is what I call an encyclopedic desktop.


The JVM more or less allows you to do that. Its debugging and profiling abilities are amazing. You can also do most of these things (and more) with the various tracing systems of the Linux kernel. (edit: typo)


One approach that I find interesting is to use Wasm because it was designed as a portable execution format for lots of language types. It has an amazing amount of flexibility for byte working and execution.

It is fairly trivial to see all of main memory and single step execution of a wasm program. If one runs wasm3 in wasm3, you can then trace the inner interpreter as well. Check out the section on trace visualization.

https://github.com/vshymanskyy/awesome-wasm-tools


I'm of the understanding it's actually possible to get 99% of exactly what you're describing if you're prepared to (learn how to) poke around with and squint at debugger-style tooling. Progress bars might be a bit tricky, but threads and connections are fair game, and tracing different kinds of loops is even viable too.

When I get back into the game with Windows again, I'll be seriously looking into ETW, Event Tracing for Windows.

It seems the best startpoint to learn about ETW is https://randomascii.wordpress.com/2015/09/01/xperf-basics-re... and https://randomascii.wordpress.com/2015/09/24/etw-central/.

The 2nd link above has a bunch of links to other pages, but is a few years old, so while the old info is still relevant, a quick poke around this blog's tags finds the following additional, newer posts that also demonstrate real-world insights of ETW saving the day in a bunch of practical situations:

https://randomascii.wordpress.com/2017/07/09/24-core-cpu-and...

https://randomascii.wordpress.com/2019/10/20/63-cores-blocke...

https://randomascii.wordpress.com/2019/12/08/on2-again-now-i...

https://randomascii.wordpress.com/2021/02/16/arranging-invis...

https://randomascii.wordpress.com/2021/07/25/finding-windows...


In sixth-sense terms, I once heard of experiments with a belt which would vibrate to indicate magnetic north. Apparently people really missed it when they had to give it up.


You're thinking of North Paw, perhaps, which is an anklet (https://sensebridge.net/projects/northpaw/)


Commercialized:

> The original idea for North Paw comes from research done at the University of Osnabrück in Germany. In this study, rather than an anklet, the researchers used a belt. They wore the belt non-stop for six weeks, and reported successive stages of integration.


Where I’ve lived my entire life, there's a long river immediately due south of me that runs east to west; years ago I internalised north as away from the river.

It’s really discomfiting when I’m away and lose track of that completely.



Feels like the culture series "aura" for droids could fit here. Subtle colours indicating mood in the books by Iain M Banks, but could indicate system health.


I never tried it, but ~20 years ago there was a project called Peep for auditory monitoring of networks.

From the Usenix (https://www.usenix.org/legacy/publications/library/proceedin...) abstract:

> We created a network monitoring system, Peep, that replaces visual monitoring with a sonic `ecology' of natural sounds, where each kind of sound represents a specific kind of network event. This system combines network state information from multiple data sources, by mixing audio signals into a single audio stream in real time. Using Peep, one can easily detect common network problems such as high load, excessive traffic, and email spam, by comparing sounds being played with those of a normally functioning network.

The SourceForge page is still up: https://sourceforge.net/projects/peep/


I really like the concept of this and keep a full bar of monitors going in my top panel on the desktop. I've also done things like tailing the Apache logs over SSH to watch real-time traffic. I've also set up a streaming audio server that played sounds on various events, like someone visiting a website or an auction, triggering the stairway camera, making a sale, and tracking updates.

I also thought about, but never implemented, some sort of lighted display, possibly a dollar sign that slowly turned from red to green as an indicator of the day's profit from sales on my e-commerce channels.


I just miss the small LED that used to indicate disk activity, until someone decided laptops looked sleeker and less cluttered without it.


This makes a lot of sense to me.

MS Windows could really use a "WHAT THE HELL ARE YOU THINKING ABOUT?" feature. From the first version of Windows with networking, a Windows install had a lifespan of a few years, after which simple things like clicking on the Start menu would take several seconds to respond. You expect this when connecting to an external disk share, but it was woven into the OS so that it happened at weird times.

I didn't know enough about Windows internals at the time to figure out why this was happening. After a few days/weeks/months of irritation, I usually ended up doing a fresh install and re-installing/configuring all of my apps.

Just in the last month or so, my Windows 10 development system will sometimes take several minutes to pop up a File Explorer window. Default File Open/Save dialogs are affected as well. I'm not using any shared drives and I disabled the stupid One Drive thing. At this point, a reboot resolves it, but I sense another reinstall in my future...


In my old office I used to be able to tell if any of the machines in the lab were having issues just by walking in; the machines were old enough that if there was any issue (too much load, hard drive failing, etc.) the difference in background noise was noticeable. My new office is unfortunately too loud by default, and everything is too new for me to do it anymore.

Similar issue with the newer laptops that include power buttons as part of the keyboard. The power button no longer has a noticeable change when pressed, and if the laptop is fanless or just very quiet, it can be impossible to tell if the damn thing is turned on or not.

Sometimes I wonder if the manufacturers are doing this on purpose, to make the customer feel more distant from their devices and willing to part with them easier.


I have a Dell XPS 13 with an OLED screen (so no backlight). It has one of those power buttons in the keyboard. Luckily, however, Dell has included a "sign-of-life" option in the BIOS that will turn on the keyboard backlight, turn on the large white LED on the front edge of the case, and spin up the fans to max immediately after pressing the power button, if the computer is off and not in sleep mode. The instant feedback is so good.

Meanwhile, I used to have a Surface Pro 3 that did none of those. Pressing the power button did nothing for several seconds, and sometimes I would have to push the power button again after some seconds, for some reason. Quite irritating.


Maybe a smartwatch with more detailed tactile feedback could be a conduit for this communication. There are some truly tiny 5v mechanical relays that could be turned into little fingers for touching/vibrating specific spots. Bluetooth would be convenient too.


Dell got you covered: get a pleasant humming noise when your GPU is busy: https://m.youtube.com/watch?v=ATxR9FyBrVw


Modern computers and peripherals tend to come with a lot of RGB lighting that's mostly there to boost performance, but could ideally be used to expose this kind of background information.


I remember having this thought while watching TNG - if the computer was having trouble processing a "recursive algorithm" you'd know about it from a series of hesitant chirrups.

I've thought about making an app that lets you know what's happening in the background with a black bar that has jets of colour move across it when something happens, with different speed, colour, and shapes based on various attributes, like which app the activity came from, and what type of activity it was.


I use Task Manager to monitor the progress of some long-running tasks that have no progress feedback in their application. The memory and CPU use graphs follow familiar patterns, and I get a sense of how far it's got and some confidence that it hasn't crashed. It would be nicer in audio though.


I like this idea at first pass! I think augmenting reality with a tap into the digital could be a nice new dimension to how we look at the world.

I do however think that giving tech companies + ad companies the ability to tap into this is risky. As a result, rather than have a call-to-arms to invest more in such tech, maybe it's better to just start the conversation on how we can interact with the digital world safely.

...Or maybe which digital worlds are safe to connect to? :thinking_face:


As a field support engineer for a large company, I can relate to what the author is describing. I work with LED screen installations, and most of the time I can detect a bad installation issue way ahead of any failure, because these screens have some high-voltage capacitors that may produce sound if there is something wrong with the power quality. Once, I detected harmonics way ahead of the initial test; because of that we saved 2 or 3 days of installation.


Linux has a very flexible LED framework, and there's a variety of different "triggers" one can use to drive the LEDs. CPU usage, disk usage, network device usage, backlight level, usb port usage, audio are all different triggers one can control an LED via, that are baked into the kernel[1]. There's userland devices one can use to do whatever else one wants.

This isn't as aesthetic as say, the LiveWire[2] mention in the comments. But it's readily available on almost all systems, and is a very flexible ambient indicator.

There's a lot of really really fun good stuff in the comments here. Ambient is good, but to me, I want computing that exposes the causal relationships of what is happening as it's processing, as it's running. "This button was clicked so I'm trying to change the screen brightness now." All of the entities of computing, the data, these user events, should be reified, should be made into a logged sequence of what is happening. From that basis, we can all be free to explore computing, and to- EventSourcing style- extend the graph of computing as we might see fit.

[1] https://github.com/torvalds/linux/tree/master/drivers/leds/t...

[2] https://news.ycombinator.com/item?id=28348148


I second this; we have improved many aspects of the UI, but others, like visibility, not so much. Take a look at my experiments exploring ways to sense the virtual environment:

https://mymakerspace.substack.com/p/another-look-at-infrastr...


My next motherboard will have 8 RGB LEDs on the bottom (vs. 6 on the old gen). I'd like it if these could be colored per CPU core's load: say, a dim blue for idle ramping up to bright red for full load.

Which reminds me I need to update the case for 8 lights now that the 5700G is available and worth doing an upgrade to the Mellori_itx.
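As a rough sketch of that mapping (hypothetical: per-core load could come from deltas of Linux's /proc/stat counters, while actually driving the LEDs would depend on the motherboard vendor's RGB protocol):

```python
def load_to_rgb(load: float) -> tuple[int, int, int]:
    """Map a 0.0-1.0 per-core load to a color: dim blue at idle,
    ramping up to bright red at full load."""
    load = max(0.0, min(1.0, load))
    red = int(255 * load)
    blue = int(64 * (1.0 - load))  # dim blue baseline when idle
    return (red, 0, blue)

def core_loads(prev: dict, curr: dict) -> list[float]:
    """Given two samples {core: (busy_ticks, total_ticks)}, e.g. built
    from the cpuN lines of /proc/stat, return each core's load."""
    loads = []
    for core in sorted(curr):
        d_busy = curr[core][0] - prev[core][0]
        d_total = curr[core][1] - prev[core][1]
        loads.append(d_busy / d_total if d_total else 0.0)
    return loads
```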


I recall a classic Mac OS hard drive indicator icon to the left of the Apple menu that would roughly correspond to the crunching of the physical disk.

I had no idea what this article was going to be about, but determining behavior based on physical characteristics of the hardware is something I miss with modern, quiet machines.


I think if we combined modern monitoring and metrics systems for cloud computing with sound synthesis, we could get valuable, intimate insights that dashboards or anomaly detection and alerting systems can't reach. If we listened to our company's systems' noises, sounds and clicks throughout the days and weeks, we would naturally learn to read them and get that connection we had to our 90s workstations. The problem is that it's easy to make something like a novelty showroom in, e.g., a Facebook foyer, but getting the perfect balance, not being annoying or drawing too much attention from work while still representing enough real-world data, requires the best sound designers we have. Ideally the result would be something that the kind of devs who like to listen to white noise or rain or ocean recordings at work would enjoy.
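As a toy illustration of that sound-design balance, one hypothetical trick: quantize the metric onto a pentatonic scale, so however the values jump, successive readings always land on consonant pitches rather than dissonant noise:

```python
# C-style pentatonic intervals: semitones above the root of each octave.
PENTATONIC = [0, 2, 4, 7, 9]

def metric_to_freq(value: float, lo: float, hi: float,
                   base: float = 220.0) -> float:
    """Map a metric reading in [lo, hi] onto two octaves of a
    pentatonic scale above `base` (Hz), returning a frequency.
    Quantizing to scale degrees keeps the stream musical."""
    if hi <= lo:
        return base
    frac = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    step = round(frac * (2 * len(PENTATONIC) - 1))   # scale degree 0..9
    octave, degree = divmod(step, len(PENTATONIC))
    semitones = 12 * octave + PENTATONIC[degree]
    return base * 2 ** (semitones / 12)
```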


As much as I enjoy quiet computing (finally!), I have to admit that all these noises (hard disk access, fans spinning up, etc.) provided important feedback. I could imagine a thin, rather dim light bar with various sections indicating processor load, bus and IO and network activity, etc.


Back in the days of 8 bit microcomputers running in the 1-2 MHz range, this was possible through EM interference that you could hear by putting a radio nearby.

I could tell whether the machine had crashed, and a lot about what part of a program was running, by the texture and tone of the noise.


In the past, the clicking of hard disk drives has told me a lot of things: An unexpected I/O intensive application running, lots of random seeks, or even failures like bad sectors. Now with SSDs, that aural indication is gone and the only noticeable effect is system lag.


When I'm compiling a big project, I'll often go read some emails or whatever and listen for the fans on my PC to quiet down again to know when it's done. It's actually become a pretty important part of my workflow lol.


I'm not sure this would really work. I mean, remember the modem sounds from the 80s? Could you really tell what was going on based on the sound alone other than the difference between connecting/transmitting/disconnecting?


Differences in handshake sounds would give you some idea of the negotiated speed, as others mentioned, but also a good idea of whether the handshake was going to succeed. Sometimes you'd get a bad modem or a bad line, and the handshake would sound wrong and retry several times and either not connect or connect at a very low bitrate; cutting that off early was useful.

Of course, in a single-line household, sometimes you'd catch the line in use, and the speaker would confirm that vs. a no-dialtone error. Occasionally, you might also get glare, picking up an incoming call before it rings, and listening in might help recover from that as well.

Speaker on while connected could be useful for monitoring for connection disturbances (and maybe forcing a lower speed on a reconnect) or call waiting beeps, but was usually too low signal to bother. Also, I had a phone that would click/chirp on call waiting even when on hook which was a lot more actionable.


I could tell a successful handshake from a bad one. After the handshake, it would only signal data transmission, so I always enabled the sounds only for the handshake.

Earlier, with my tape-based computer, I could tell what program was loading by the sound, and whether it had an error.


I could tell from the sounds of the connection if it had negotiated the max 56k or something lower. Of course I could always see it in the connection properties later on, but I could tell just from the initial sounds.


yes 1000%. I remember my first impression of automatic updates was to feel profoundly out of control; software could now change itself without anyone even knowing. The transition to broadband was cool as I could pirate enough music to fill up my hard drive, but also profoundly unsafe: other things could come down that pipe.

Making the invisible processes of change and exfiltration legible to humans, even only as background noise, would humanize these awful OSes in a big way.



haha, back in the day my dad used a transistor radio to understand what his mid-70s microcomputer was doing


There is still the occasional time delay, when you say something isn't right.


Well, there are always dtrace, strace and ptrace.


And <confused screaming> BPF, if you can figure it out :D


This is how you know you need a vacation from computers - when you write an article literally fetishizing sounds they once made. The whole piece is symptomatic of a fried brain. To the author: go outside for a long walk and shut your devices off.


I know this feeling. I remember playing Minecraft in a Windows 10 virtual machine, and the fan was roaring. Windows 10 is bad enough, but throw in a resource hogging game & a VM and you're asking for trouble. The game was so laggy as to be unplayable.


No!

Computers should have normal auditable log files, board schematics and spare parts. If it works right, it should be invisible. If it stops working, I call a plumber who will fix it very cheaply.

This "computers are magic" is just BS. I refuse to "interact" with my thermostat. Soon this 6th sense will feed ads into my subconsciousness.


>I refuse to "interact" with my thermostat.

You... refuse to feel temperature? I'd wager that more people operate their thermostats based on their ambient reading of how hot/cold the room is rather than a data-forward approach.



