Between the two. 30 µSv per hour on the surface of Mars on average, compared to 60 µSv/hr on the Moon and 5 µSv/hr on a jet. Not friendly, but not too awful.
Or 0.5 µSv per hour on Earth at sea level. That's only about 2 orders of magnitude of difference. Given that we don't expect radiation damage to be a major source of failures in Earthbound consumer or commercial electronics, it's a bit surprising it's such a big deal in space.
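A quick back-of-the-envelope check of those numbers (rates taken from this thread; this is just the arithmetic, not a claim about the real dose environments):

```python
# Hourly dose rates quoted upthread, in µSv/hr
usv_per_hour = {
    "Mars surface": 30,
    "Moon surface": 60,
    "jet cruise": 5,
    "Earth sea level": 0.5,
}

hours_per_year = 24 * 365

# Annualized dose in mSv/yr for each environment
annual_msv = {k: v * hours_per_year / 1000 for k, v in usv_per_hour.items()}

# Mars works out to ~263 mSv/yr vs ~4.4 mSv/yr at sea level on Earth:
# a factor of 60, i.e. roughly two orders of magnitude
ratio = annual_msv["Mars surface"] / annual_msv["Earth sea level"]
```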
The composition of that radiation is important - Earth's atmosphere preferentially filters out a lot of the higher-energy particles that are likely to permanently damage electronics. The sievert as a unit is weighted based on damage to biological systems, not electronics.
Well, 2 orders of magnitude can turn a problem that occurs once a decade into one that occurs once a month, so it's not too surprising that it's a bigger deal in space.
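The scaling works out like this (a sketch; the once-a-decade Earth baseline is assumed purely for illustration):

```python
# Assumed baseline: one radiation-induced failure per decade on Earth
failures_per_decade_earth = 1

# Two orders of magnitude more exposure in space
scale = 100

# 100 failures per decade = 10 per year
failures_per_year_space = failures_per_decade_earth * scale / 10

# One failure every ~36.5 days: roughly once a month
days_between_failures = 365 / failures_per_year_space
```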
There are actually a lot of bit flips in Earthbound electronics too, due to the lack of parity error checking. But those bit flips don't necessarily cause failures, and when they do, it's usually impossible to determine the root cause.
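For what parity checking buys you: a single parity bit catches any single-bit flip but can't say which bit flipped, and an even number of flips slips through undetected. A minimal sketch:

```python
def parity_bit(word: int) -> int:
    """Even-parity bit: 1 if the word has an odd number of set bits."""
    return bin(word).count("1") % 2

def flip_bit(word: int, i: int) -> int:
    """Simulate a single-event upset flipping bit i."""
    return word ^ (1 << i)

data = 0b10110010
stored_parity = parity_bit(data)

# A single-bit flip changes the parity, so the corruption is detectable
corrupted = flip_bit(data, 3)
assert parity_bit(corrupted) != stored_parity

# But a second flip restores the parity: a double-bit error goes unnoticed
double = flip_bit(corrupted, 5)
assert parity_bit(double) == stored_parity
```

That limitation is why memory that actually has to survive upsets uses ECC (e.g. SECDED codes) rather than bare parity: correct one flipped bit, detect two.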
Yes, but these chips have to function and survive on the way to the planet too, right? And they do all sorts of software updates etc. during that time. Maybe we could shield them and that would be fine? Or would that add too much extra weight?