
Bit flips from cosmic rays. Earth's magnetic field helps to shield us, but it's disrupted right now. I work with a CCD camera in a sub-basement lab. On a normal day we see a stray cosmic ray every couple of minutes (as a hot pixel). On a day like today, it might be every couple of seconds.



>> I've always been hearing that solar flares and the resulting geomagnetic storms pose danger to hard drives and electronics. Is there any truth to it?

> Bit flips from cosmic rays.

I don't think that's it. IIRC, if a geomagnetic storm/coronal mass ejection is big enough, it can cause certain effects that are similar to an EMP. What the OP is talking about sounds like a distorted version of that danger: a nuclear EMP could directly fry microelectronics, a solar storm is a kind of EMP (solar EMP), but is missing most of the effects that could directly damage microelectronics, and many people conflate the two.

https://en.wikipedia.org/wiki/Carrington_Event

https://en.wikipedia.org/wiki/Electromagnetic_pulse#Types


Yes, but during the Carrington Event, electrical wires became overloaded, blowing out equipment attached to "the grid."

I've had a computer blow out (sparks, smoke and everything) due to a power surge when the main power went out and then back on, rapidly.

So while the "direct" causes might be different between EMP and CME, the end results are basically the same. I could only imagine what our current (very power sensitive) electronics would do in that scenario.

From the wikipedia Carrington Event you posted:

> Because of the geomagnetically induced current from the electromagnetic field, telegraph systems all over Europe and North America failed, in some cases giving their operators electric shocks.[22] Telegraph pylons threw sparks.


> I've had a computer blow out (sparks, smoke and everything) due to a power surge when the main power went out and then back on, rapidly.

> So while the "direct" causes might be different between EMP and CME, the end results are basically the same.

Not necessarily. My understanding is a CME solar EMP would take out grid-connected devices (via power surges over the grid), but leave most unconnected devices unharmed (e.g. laptops running on battery), whereas a nuclear EMP would take out both.


But it would also likely take out the grid itself. If all of the massive power cables strewn across the landscape, connected to very large and expensive power-station transformers, become overloaded at the same time by a CME, I don't predict good things for those transformers and a lot of other equipment. The laptop couldn't be recharged without solar or something similar. It takes a long time (1-2 years, as I understand it) to get a single power-station transformer, even under ideal conditions. They are basically made to order, and backlogged.


True, but the hot pixels you see in the CCD aren't from bit flips. Rather, the CCD is properly doing its job of collecting electrons. It's just that spurious electrons are being produced by the cosmic rays' ionization trail (as opposed to photoelectrons in normal operation).


What's the difference? Aren't bit flips also just spurious electrons being produced by the cosmic rays' ionization trail?


It’s an excitation of the sensor. It may be interpreted somewhat like a bit flip when converted to an image, but it’s normal for image sensors to experience a lot of noise, going back to the days of film. It may just be eliminated by the denoising algorithm like most spurious electrons/photons.
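A toy sketch (my own, not any camera's actual pipeline) of how a simple denoiser could swallow such a spike: a 3x3 median filter replaces an isolated hot pixel with the local median, exactly as described above.

```python
# Toy sketch: a 3x3 median filter suppresses an isolated hot pixel
# the way a denoising step might. Plain Python lists, no libraries.
def median_filter_3x3(img):
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]          # copy; borders left unfiltered
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neighborhood = sorted(
                img[ny][nx]
                for ny in (y - 1, y, y + 1)
                for nx in (x - 1, x, x + 1)
            )
            out[y][x] = neighborhood[4]    # median of the 9 values
    return out

frame = [[10] * 5 for _ in range(5)]       # flat dark frame
frame[2][2] = 255                          # simulated cosmic-ray spike
clean = median_filter_3x3(frame)
print(clean[2][2])  # 10: the spike is replaced by the local median
```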


Bitflip is when a high energy particle changes the state of a memory address or signal in a wire.

Hot pixels are when a pixel on the CCD (camera sensor) reaches its max level and reads bright: the pixel is overloaded (which should also result in spillover into its neighbors).

The difference is really just what part of the system receives the energy. For example, you can overload a pixel with a laser, but you should also get spillover. Bit flips tend to be more localized, affecting only one part. In reality, the difference doesn't matter too much, as long as we're keeping our discussion to cosmic events. But also note that images experience a lot of (non-Gaussian) noise, and plenty of this is from cosmic excitation as well as from other natural and man-made sources. Just the levels are a lot lower and unlikely to result in a bit flip (which is a pretty high-energy event).
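The contrast can be made concrete with a toy model (illustrative only; the 5-bit full-well value is an assumption): a bit flip corrupts a stored value directly, while a cosmic-ray hit on a sensor just adds real charge that is then read out as a legitimate, if unwanted, value, clipping at full well.

```python
# Toy contrast between a memory bit flip and a hot pixel.

# Bit flip: a high-energy event toggles one bit of a value in memory,
# so the stored number itself is corrupted.
stored = 0
flipped = stored ^ (1 << 4)  # bit 4 toggled: 0 -> 16

# Hot pixel: spurious electrons from the ionization trail accumulate in
# the pixel well; the readout itself works correctly but clips (and in a
# real sensor would spill over) once the well is full.
FULL_WELL = 31               # assumed toy 5-bit readout
photoelectrons = 12          # legitimate signal
cosmic_electrons = 40        # charge from a passing cosmic ray
pixel_value = min(photoelectrons + cosmic_electrons, FULL_WELL)

print(flipped)      # 16
print(pixel_value)  # 31: saturated; excess charge would spill over
```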


Pretty much this. But the point is that a bitflip would be like, your CCD was supposed to read out value 0b00000 = 0 but due to cosmic ray interference, it read out 0b10000 = 16 instead.

Instead, what's happening in GGP's experiment is that actual electrons, created in the bulk of the active silicon region by a passing cosmic ray, are getting detected by the CCD, and it will (correctly) read out some value between 00000 and 11111, depending on how many electrons it saw. In principle you can actually make a histogram of these readout values and observe the Landau-shaped distribution of energy loss (whose mean is given by the Bethe-Bloch formula) for ionizing particles through the (very thin) bulk of silicon! On the other hand, random bit flips (which are much rarer!) would just give you one of {00001, 00010, 00100, 01000, 10000}.
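A quick simulation of that distinction (my own sketch: an exponential stands in for the real long-tailed energy-loss spectrum, which isn't in the standard library, and the 5-bit readout is hypothetical):

```python
# Simulated readout histograms: real charge deposits fill the whole
# ADC range, while single bit flips in a zeroed 5-bit register can only
# ever produce powers of two.
import random
from collections import Counter

random.seed(0)
ADC_MAX = 31  # hypothetical 5-bit readout

def cosmic_readout():
    # Stand-in for a long-tailed energy-loss spectrum (not physical):
    # many small deposits, occasional large ones, clipped at full scale.
    return min(int(random.expovariate(0.2)), ADC_MAX)

def bitflip_readout():
    # A single random bit flip in an otherwise-zero 5-bit register.
    return 1 << random.randrange(5)

cosmic = Counter(cosmic_readout() for _ in range(10_000))
flips = Counter(bitflip_readout() for _ in range(10_000))

print(sorted(cosmic)[:8])   # deposits populate the low end continuously
print(sorted(flips))        # [1, 2, 4, 8, 16]: powers of two only
```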


I like to put "Cosmic ray flipped a bit" into bug reports.


I used to work as a contractor for John Deere, and one of the engineers there would capriciously insist on adding runtime consistency checks in code review because "you never know if lightning might strike or a cosmic ray flips a bit in RAM". This was not life-or-death software; it was the infotainment stuff, which was already insanely buggy, beginning with the Bosch radio we had sourced, carrying into the ambiguous protocol John Deere designed for speaking with the radio, and ending in the application code the lowest-bidder contractors wrote before hiring me to fix it (lightning strikes and cosmic rays were the least of their worries).

So the radio would fail to connect to many bluetooth devices at the time (because Bosch), the application had no way of telling what state the radio (that it was meant to speak to) was in (because of the faulty protocol), and the application was riddled with other bugs (because lowest-bidder contractors), but by god it was safe from lightning and cosmic rays (except not really because they could just as easily alter the program as the state the program was operating on).


I don't get the example; it sounds flat-out better than doing nothing.


Well, the alternative was to invest the time into the many glaring concrete bugs rather than hypothetical 'cosmic ray' bit flips. I don't have a fundamental problem with runtime consistency checks if there's some compelling concern and a clear up-front policy for when/where to add them (as opposed to dealing with the whims of a capricious code reviewer).
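For what it's worth, the kind of check being argued about can be cheap. A minimal sketch (not the actual John Deere code; the `GuardedInt` name and the radio-state usage are made up) stores a critical value alongside its bitwise complement and verifies the pair on every read:

```python
# Illustrative runtime consistency check: keep a value and its bitwise
# complement; a single bit flip in either copy breaks the invariant.
class GuardedInt:
    def __init__(self, value):
        self._value = value
        self._check = ~value  # redundant complement copy

    def get(self):
        if self._value != ~self._check:
            raise RuntimeError("consistency check failed: possible bit flip")
        return self._value

radio_state = GuardedInt(3)   # hypothetical radio state code
print(radio_state.get())      # 3

# Simulate a cosmic-ray bit flip in the stored value:
radio_state._value ^= 1 << 7
try:
    radio_state.get()
except RuntimeError as e:
    print(e)  # consistency check failed: possible bit flip
```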


We all have those anal coworkers


I remember hearing about this (not sure if there's a better article; this is literally the first one I found): https://www.thegamer.com/how-ionizing-particle-outer-space-h....

The story goes that a Mario 64 speedrunner accidentally triggered a glitch that was thought to require a particular value to be "true" (Mario touching a ceiling), which it was not in this case. The glitch hunt that followed had many people concluding that the most likely explanation was a bit flip from cosmic radiation, which caused the exact distance change that could otherwise have been produced by a different but similar glitch.


There was a famous story about how Google couldn't produce a working index for months because they used non-ECC memory in their servers. This was around 20 years ago.


That’s cool, is that some sort of specialized equipment, or could that observation be replicated by an amateur with a RAW-capable camera?


Put the lens cap on. Continuously take pictures. Look for deltas in the pixel values.
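Those three steps could look something like this (a sketch with made-up numbers: the threshold, frame values, and `find_hits` helper are all invented for the demo; a real attempt would average many dark frames for the reference):

```python
# Compare each lid-on dark frame against a reference dark frame and
# flag pixels that jump well above the noise floor.
THRESHOLD = 50  # counts above reference treated as a candidate hit

def find_hits(reference, frame):
    hits = []
    for y, (ref_row, row) in enumerate(zip(reference, frame)):
        for x, (ref_px, px) in enumerate(zip(ref_row, row)):
            if px - ref_px > THRESHOLD:
                hits.append((x, y, px))
    return hits

reference = [[5] * 4 for _ in range(4)]  # averaged lid-on dark frame
frame = [row[:] for row in reference]    # fresh capture
frame[1][2] = 200                        # simulated cosmic-ray strike

print(find_hits(reference, frame))  # [(2, 1, 200)]
```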


Would make an interesting increase in RNG entropy ;)
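Tongue-in-cheek sketch of the idea (my own; real entropy sources need proper conditioning and health tests, and the frame values here are invented): hash the raw dark-frame readout into a seed.

```python
# Fold dark-frame sensor noise into an RNG seed by hashing the raw
# pixel values. Illustration only, not a vetted entropy source.
import hashlib
import random

dark_frame = [3, 5, 4, 200, 6, 5]  # pretend readout with one cosmic spike
seed = hashlib.sha256(bytes(dark_frame)).digest()
rng = random.Random(seed)          # random.seed accepts bytes
print(rng.randrange(100))
```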


annoying, but kind of awesome :)



