I think the core of the explanation given is correct but some of the adjacent details need adjustment. The important part is:
"Clothes dryers are very effective at making statically charged surfaces. (Dryer sheets help.) So when radon and its temporary decay products are blown through the dryer, electrically-polarized molecules tend to be attracted to the charged surfaces"
What that commenter misses is that nearly all hobbyist-grade detectors (Geiger tubes) are not sensitive to alpha, but they are highly sensitive to beta and slightly sensitive to gamma. However, any thin solid will block beta, so the Geiger tube would need to be very near the radiation-emitting material to pick up the beta. In other words, if they're just waving the detector around, they're probably just catching the gamma.
The radon in the air decays into various progeny, and by the time it reaches the dryer the chain will be to some extent in equilibrium, so several isotopes, including gamma emitters, will be present in the mix. Therefore I'm not surprised the detector reads a tiny bit of that.
Why it dissipates is probably not a decay thing but rather the accumulated material gradually diffusing away from the filter or whatever after the dryer is turned off and no longer actively accumulating radon.
This could be tested by putting a detector right next to the filter to see how much beta it picks up. I've basically done that with a home air filter:
https://twitter.com/BetterGeiger/status/1605639346865901570?...
That's with the detector I make and sell, which is primarily sensitive to gamma, which is why I could register a reading through the plastic container even a couple days after preparing the test. When I used a pancake-style detector sensitive to alpha and beta, directly against the exposed filter, the detector reacted much more strongly... But while the Better Geiger S-1 gives an accurate dose reading, a Geiger tube or pancake probe will dramatically overestimate dose in that scenario, which can cause undue concern... In reality it's pretty harmless levels of radiation. :)
> hobbyist-grade detectors (Geiger tubes) are not sensitive to alpha, but they are highly sensitive to beta and slightly sensitive to gamma.
This was part of the reason why it took so long to discover the cause of Alexander Litvinenko's death by Polonium-210 poisoning. Doctors (and later detectives) had suspected some form of radiation poisoning, but the early tests used Geiger counters and came back negative. But Po-210 decays almost exclusively by emitting an alpha particle which is not detectable by Geiger counters. (And it also means it's not very dangerous outside the body but becomes extremely toxic if ingested.)
Yes, and even if you have a high-end alpha detector, ingested radioactive material cannot be detected externally if it's only emitting alpha. One would need to take a blood sample or something like that and do more complicated tests.
Yes, this is why the idea to use Po-210 was a stroke of (evil) genius. Because of its unique decay profile it is extremely hard to detect. The FSB agents who did this could obtain a small but lethal dose of Po-210 in Russia and then carry it across borders dissolved in a vial of water. In that state it is not dangerous, so the agents had little risk to themselves by carrying it. And it is also undetectable by border security, even if they were monitoring for radiation.
> The UK Health Protection Agency, which is advising authorities on technical aspects of the case, characterises the contamination in all 12 cases as “not significant enough to result in any illness in the short term,” while “any increased risk in the long term is likely to be very small.”
It would probably just appear as faint noise, and then only if it happened to be at wavelengths that the sensors were tuned to detect, and which the machine's signal processors weren't trying to ignore, and if the image processing was designed to draw attention to it.
His symptoms were consistent with radiation poisoning and because he was a high profile dissident his case attracted enough attention that the Atomic Weapons Establishment tested his blood and urine.
They performed gamma ray spectroscopy but did not discover any strong signals, except for a very small spike at 803 keV. Some of the scientists were talking about the case and they were overheard by an older scientist who had worked on the UK's atomic weapons program back in the 50s. The early bombs relied on Po-210 and he recognized that 803 keV line as being characteristic of Po-210. Although the decay is almost exclusively via alpha particles, a very small fraction of decays happen via emission of a gamma ray at 803 keV.
Once they had the connection to Po-210 it was straightforward to test for its presence in his body.
> except for a very small spike at 803 keV. Some of the scientists were talking about the case and they were overheard by an older scientist who had worked on the UK's atomic weapons program back in the 50s. The early bombs relied on Po-210 and he recognized that 803 keV line as being characteristic of Po-210
I don't understand. So professionals from the AWE tried to identify the measured radiation lines from memory(?!), instead of consulting a publicly available database of spectral & decay lines? It's been a while since I've done any spectral analysis but I used to use ie.lbl.gov a lot (seems offline unfortunately) and they had a search function[0] that allowed you to filter their entire database of isotopes and decay chains by whatever line you were looking for.
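For what it's worth, the kind of lookup that site offered is trivial to sketch. Here's a toy version in Python with a hand-typed table of a few well-known lines (the Po-210 and Cs-137 energies mentioned in this thread, plus a couple of textbook staples); a real tool would query a full isotope database:

```python
# Toy isotope-line lookup: given a measured peak energy, list candidate
# isotopes with a tabulated gamma line within some tolerance.
# The table below is a tiny illustrative sample, not a real database.
GAMMA_LINES_KEV = {
    "Cs-137": [661.7],
    "Co-60": [1173.2, 1332.5],
    "K-40": [1460.8],
    "Po-210": [803.1],  # the faint line that gave Litvinenko's poisoning away
}

def candidates(peak_kev, tolerance_kev=2.0):
    """Return isotopes with a tabulated line within tolerance of the peak."""
    return [
        iso
        for iso, lines in GAMMA_LINES_KEV.items()
        if any(abs(line - peak_kev) <= tolerance_kev for line in lines)
    ]

print(candidates(803.0))  # -> ['Po-210']
```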
Man, so glad I decided to finally click on the comments after seeing the article at the top of HN most of the day - we don't have radon where I live (I don't think?), so I figured the dryer thing wasn't applicable to me.
But for the vast majority of my life, I’ve always wanted a Geiger counter, and once they were cheap and readily available on Amazon, I was just about to pull the trigger. And then Fukushima happened a week later and sent prices into the stratosphere.
A few times recently I idly looked at the ones on Amazon but never got around to it because I couldn’t tell which were junky crap and which were somewhat more expensive junky crap.
But very cool to see a knowledgeable manufacturer explaining his product in a highly relevant comment thread. Finally, the radiation detector for me!
There's a former 12-year-old kid out there who still remembers the Geiger counter exhibit from a pre-9/11 NPP tour, and who's going to be very excited for something coming in the mail in the next week or two.
>What that commenter misses is that nearly all hobbyist-grade detectors (Geiger tubes) are not sensitive to alpha
from the original post:
"If your Geiger counter is actually detecting radiation, it's almost certainly the half-hour lead and bismuth. "
If you look at the table he provided, lead and bismuth are beta decays. It is likely he is basing that statement on the fact that most home Geiger counters only detect beta and gamma, not alpha.
Could also be a happy accident though. Half an hour is short enough that the amount of radiation emitted is pretty high, but long enough that it hasn't pretty much all decayed by the time you try to measure it.
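To put rough numbers on that, here's a quick sketch using the standard half-lives of the short-lived Rn-222 progeny. (It ignores ingrowth from parents further up the chain, so it slightly understates the lead and bismuth.)

```python
import math

# Half-lives of the short-lived Rn-222 progeny (standard textbook values).
HALF_LIFE_MIN = {
    "Po-218 (alpha)": 3.10,
    "Pb-214 (beta)": 26.8,
    "Bi-214 (beta)": 19.9,
}

def fraction_remaining(half_life_min, elapsed_min):
    """Fraction of an initial population left after the elapsed time."""
    return math.exp(-math.log(2) * elapsed_min / half_life_min)

# Suppose it takes 10 minutes to fetch the detector after the dryer stops:
for iso, t_half in HALF_LIFE_MIN.items():
    print(f"{iso}: {fraction_remaining(t_half, 10):.0%} left after 10 min")
# Po-218 is ~90% gone, but roughly 70-77% of the half-hour lead and
# bismuth is still around -- which is why they dominate what you measure.
```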
Even though bananas are famously radioactive, it's very low activity and I don't know that a regular detector could pick up the emissions from a single banana. You'd probably need a truck full of them.
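The back-of-envelope math bears that out. Assuming the usual textbook numbers (~0.45 g of potassium per banana, K-40 abundance 0.0117%, half-life 1.25 billion years):

```python
import math

AVOGADRO = 6.022e23
K_GRAMS = 0.45                      # potassium in one banana (assumed)
K_MOLAR_MASS = 39.1                 # g/mol
K40_ABUNDANCE = 1.17e-4             # isotopic fraction of K-40
K40_HALF_LIFE_S = 1.25e9 * 3.156e7  # years -> seconds

n_k40 = K_GRAMS / K_MOLAR_MASS * AVOGADRO * K40_ABUNDANCE
activity_bq = n_k40 * math.log(2) / K40_HALF_LIFE_S
print(f"~{activity_bq:.0f} Bq per banana")  # ~14 decays per second
```

~14 Bq total, and only a small fraction of those decays would ever register in the tube. So yes: truckload territory.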
It's interior. No ventilation. I guess it's ionizing the air and bringing radioactive dust particles to the whole room. The radiation level is still very low so I'm not worried.
Thank you! Most people don't need one (and hopefully it stays that way) but I still think it's a fun and educational device. If nothing else it's a conversation piece. :)
Hey, that's a pretty slick package (and price point). I looked through bettergeiger.com and read the discussion about the difference between GM tubes vs scintillators, but never saw an indication of what material you're using. Would be useful information for those of us thinking of upgrading from an SBM-20.
I'm sorry to say that some of the sensor design details, including material and geometry, are not shared. That was a critical optimization process during the design phase (tricky balancing act between cost and performance) and I want to slow down any copy-cat devices. However in terms of performance I am pretty open and detailed about how the device actually behaves:
Another metric some people like is CPM/[uSv/hr] when exposed to Cs-137 (662 keV). That number is about 415, whereas cheapo Geigers can be as low as 10, and decent consumer-grade Geiger counters are usually around 120. This number gives a decent idea about relative X-ray/gamma sensitivity of those devices.
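To make the comparison concrete, here's what those sensitivity figures imply if you naively turn a count rate into a dose rate (valid only for a Cs-137-like field, which is exactly the catch):

```python
# CPM per uSv/h when exposed to Cs-137 (662 keV), figures quoted above.
SENSITIVITY = {
    "cheap Geiger": 10,
    "decent consumer Geiger": 120,
    "Better Geiger S-1": 415,
}

def dose_rate_usv_h(cpm, device):
    """Naive dose rate from a count rate, using the Cs-137 calibration."""
    return cpm / SENSITIVITY[device]

# The same 60 CPM reading implies wildly different dose rates per device:
for device in SENSITIVITY:
    print(f"{device}: 60 CPM ~ {dose_rate_usv_h(60, device):.2f} uSv/h")
```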
I’m not familiar with Geiger counter design/specs but would the considerable static electricity send some electrons into the detector and cause a false positive beta hit?
Beta particles are basically energetic electrons flying around, yes, but for a Geiger counter to detect them they have to be quite energetic in order to penetrate the Geiger tube itself (detection occurs essentially inside the tube, in the gas)... So I doubt that static field is giving electrons anywhere near the energy needed to be detected by a Geiger tube.
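A quick upper-bound sanity check. (The 20 kV figure is my assumption for a worst-case laundry-static potential, and this assumes the electron falls through the whole potential unimpeded, which it wouldn't in air.)

```python
# An electron accelerated across a potential V gains kinetic energy E = qV,
# i.e. V volts -> V electron-volts. Even a generous 20 kV of static charge
# yields 20 keV electrons at most, versus the hundreds of keV to MeV betas
# (e.g. Bi-214 betas run up to ~3.3 MeV) that can punch through a tube wall.
STATIC_POTENTIAL_V = 20_000
energy_kev = STATIC_POTENTIAL_V / 1000.0
print(f"~{energy_kev:.0f} keV at most")  # nowhere near hot-beta territory
```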
Alpha has the shortest range and can be blocked by a piece of paper. Alpha radiation consists of helium nuclei (two protons and two neutrons).
Gamma goes through most things and has the longest range. Gamma consists of electromagnetic radiation.
Beta is somewhere in the middle. It consists of electrons.
All are of concern, but alpha is mainly of concern if ingested or inhaled. This is because all of its radiation will be absorbed by your body, while most gamma would just escape. Gamma is of more concern outside the body, due to its reach.
I may remember some details wrong. I last learned about this in school a decade ago.
I haven't read much on physics before. My favourite thing to do on these stack overflow sites (or on specific tags) is to go to Questions -> sort by score, and just read through the top questions and answers. It's a bit hidden away because they want to steer people to the active questions, not to established ones.
The tidal bulge one is one of my all-time favorites; you should absolutely read the answer. tl;dr: the simplistic view of two tidal bulges (which started with Newton) is wrong! This is one of those physics simplifications that still gets pushed around. For another example, see speed of light in glass is slower than vacuum.
> For another example, see speed of light in glass is slower than vacuum.
I'd love for you to explain what you mean by this. Either you're mistaken or I'm about to learn something really interesting. Here's a link a quick search brought me, for reference. Are you talking about something other than refractive index, ratios of c, and prisms? Like a latency metric of some kind? I'm very curious.
It's probably that photons always propagate at c and that the apparent change of speed of light in glass is caused by photon absorption and delayed reemission by atoms (at least it's how perturbation theory of quantum electrodynamics describes the process).
Do you really need photons and quantum stuff though? "Just" learn about how dielectric materials work + how EM waves work from Griffiths and then everything will make sense.
I'm looking at Griffiths right now, and 9.3 Electromagnetic Waves in Matter states (emphasis in the original): "Since [the dielectric constant] is almost always greater than 1, light travels more slowly through matter - a fact that is well known from optics."
I'm still confused. How is it an inaccurate simplification to state that the speed of light in glass is lower than in a vacuum?
I wouldn't call it inaccurate at all. I think it's a perfectly good model. You can explain the phenomena otherwise using a more detailed model, but you don't need to.
What insight is being gained by going with the more complex model? You can work out a lot using classical EM and predict a bunch of cool optical/diffraction phenomena without a lot of math, and the math you do is crystal clear. QFT or even just QM+time-dependent perturbations are a pain in the ass in comparison.
If you have engineering mindset (how to do something), then, sure, you are better off with the most useful approximation. If you have scientific mindset (how things are), then practical usefulness of the model of reality is secondary.
Oh boy, it seems you're one of today's lucky 10000 (https://xkcd.com/1053/ :-). The concept of EM waves slowing down in a medium is a curious one: it sounds obvious (you can hike a shorter distance over harder terrain even though exerting the same effort) but it generates interesting questions, e.g. what is the mechanism that "slows them down", and how do photons speed up after they exit the medium (https://physics.stackexchange.com/questions/330994/how-does-...). I think the problem here is similar to the concept of "relativistic mass": a simple way to think about things and work problems, but one that at the same time creates another set of problems and confusions.
red75prime's short answer in this thread is correct; for a longer one see this Physics SE question and its answers (https://physics.stackexchange.com/questions/11820/what-reall...). In ELI5 terms: photons always move at speed c, but in a medium, now and then (depending on properties of the medium, density, etc.) they interact with its atoms. In this case the atom absorbs the photon and after a short time emits another photon. So the speed of each photon is always c, but the "overall light wave" travels at less than c.
I love QED! That was recommended to me by the professor who caused me to change majors from EE to physics. I just looked for it on my bookshelves but I must have misplaced it somewhere in the intervening two decades.
I understand what you're saying, and don't disagree that the quantum electrodynamics treatment is a theoretical model that can describe optical phenomena, but I guess I'm a little nonplussed when you compare tidal theory to optical theory.
Newton's theory of tides was incorrect; it does not accurately predict tidal phenomena[0]. Using classical EM field theory with refractive indices and ratios of c is not incorrect; it gives accurate predictions of optical phenomena at the macroscopic scale. It's certainly incomplete, but not inaccurate.
I think the disconnect might be that when studying physics, it gets drilled into you over and over that no model is right or wrong, just applicable or inapplicable to a given situation. For example, here's a passage in Griffiths' Introduction to Electrodynamics:
"In fact, when you stop to think about it, the electric field inside matter must be fantastically complicated, on the microscopic level. If you happen to be very near an electron, the field is gigantic, whereas a short distance away it may be small or point in a totally different direction. Moreover, an instant later, as the atoms move about, the field will have altered entirely. This true microscopic field would be utterly impossible to calculate, nor would it be of much interest if you could. Just as, for macroscopic purposes, we regard water as a continuous field, ignoring its molecular structure, so also we can ignore the microscopic bumps and wrinkles in the electric field inside matter, and concentrate on the macroscopic field. This is defined as the average field over regions large enough to contain many thousands of atoms (so that the uninteresting microscopic fluctuations are smoothed over), and yet small enough to ensure that we do not wash out any significant large-scale variations in the field. Ordinarily, the macroscopic field is what people mean when they speak of "the" field inside matter. (In case the introduction of the macroscopic field sounds suspicious to you, let me point out that you do exactly the same averaging whenever you speak of "the" field inside matter.)"
So I guess I'm saying that holding physical intuition and mental models for classical EM field theory that are different from those used for quantum theories is just part of what it means to be a physicist.
Anyway, I'm glad you're so excited about physics! If you liked QED, you might be interested in Schroedinger's "What is Life?", where he takes his experience from quantum physics to speculate on how the information-theoretical requirements of genetic inheritance imply the existence of some sort of "aperiodic crystal" that encodes genetic information, years before Watson & Crick worked out the structure of DNA. It's written for the layman, though it's not as readable as Feynman (who is in a class of his own when it comes to scientific explanations).
Also, thanks for your original comment, because it gave me an opportunity to pull out some of my old textbooks and leaf through them for a while.
[0] My mental model of tidal phenomena corresponded to Newton's theory, so I found it fascinating how incomplete and inaccurate it is, prompting a long diversion down a Wikipedia rabbit-hole, to the point that I'm now considering giving a short lecture on tides at my birthday party in a couple weeks.
As a note: the tidal bulge image gives an 'equipotential'. I.e. the water level you would get in equilibrium. That part of tidal bulges is right.
The part that is wrong is that the actual water level is nowhere near equilibrium, because the moon moves. In fact, the water level doesn't even approximate this equilibrium but can (and does!) behave very differently.
> For another example, see speed of light in glass is slower than vacuum.
For all practical purposes it's true. You'll get higher delays in fiber optics than in, say, radio-relay links. The details of matter-photon interaction are interesting, of course.
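The practical numbers, taking a typical group index of ~1.47 for silica fibre (my assumption; the exact value varies by fibre type):

```python
# One-way propagation delay per kilometre: light in fibre travels at c/n,
# a line-of-sight radio link at essentially c.
C = 299_792_458   # m/s
N_FIBER = 1.47    # assumed group index of silica fibre

delay_fiber_us = 1000 / (C / N_FIBER) * 1e6  # microseconds per km
delay_radio_us = 1000 / C * 1e6

print(f"fibre: {delay_fiber_us:.2f} us/km, radio: {delay_radio_us:.2f} us/km")
# ~4.90 vs ~3.34 us/km, hence the lower delays over radio-relay links.
```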
No, but it does mean that GPT can't be smarter than the people writing the published content on the web (generally, the very smartest humans, if you're considering only high-quality published works).
If you could do a perfect job of training a GPT on the entirety of published academic literature, the total of what that GPT could spit out would be limited by the knowledge contained in academic literature. At the same time, you'd have created a tool that is cheap and does a good job of synthesizing knowledge/answering questions across all disciplines. The model will never replace the scientists who are working at the very bounds of their fields, but it doesn't have to in order to be extremely useful, even useful enough to replace a majority of knowledge workers.
Just because GPTs can't be smarter than the smartest humans doesn't mean they can't be smarter than most humans.
Whilst that is probably true, it is not necessarily true. Nothing fundamentally prevents GPT from synthesizing new and novel insights. If a novel insight is trivial to notice when combining knowledge from 4 disciplines, it could be nigh impossible for humans to find, yet obvious to GPT.
Or perhaps the insight isn't trivial but follows analogous reasoning from some other obscure result. Or perhaps a million other things.
I don't mean to claim GPT will do this. I just mean to point out that it can't be fully excluded that GPT is able to.
Yeah, these kinds of things are what the internet should be all about rather than whatever the hell it morphed into. Sure, I'm applying my morals and expectations to it, but I don't think I've ventured onto a thin limb here.
I always find this a more entertaining way to learn about these topics, because they're questions which we might have had before, but never thought much about.
> I was once prevented from leaving a neutron-science facility at Los Alamos after the seat of my pants set off a radiation alarm on exit. This was odd because the neutron beam had been off for weeks. It was a Saturday, so the radiation safety technician on call didn't arrive for half an hour — at which point I was clean, so the detective questions began. I had spent the day sitting on a plastic step stool. The tech looked at it, said that radon's decay products are concentrated by static electricity, and told me that I needed to get a real chair.
Nope. Government salaries are public information, and are reported all over the usual sites (levels.fyi, Glassdoor, Indeed, etc.). More importantly, though, the government doesn't give out stock options with a vesting schedule. Meanwhile, techs who were at Tesla and got a small grant when it was at $20 are likely pretty happy with the stock being at $170. (Techs who got grants at $400, less so.)
I presume that the technician had at least one PhD. I had a colleague who worked at Los Alamos with a top PhD and he was basically driving around placing and collecting data from some sensors all day long.
They also make you take off synthetic jackets for the same reason. Apparently a lot of people have had to do extra paperwork over the years because of their Patagonia grabbing radon.
Reminds me of the story about how radon was discovered (well, the fact that it can accumulate in homes, anyway). The story goes that a worker at a nuclear power plant kept setting off the detectors when coming in to work, and eventually they investigated his house.
The nuclear power plant had built the homes to house workers in the early 1980s and to an early form of Energy Star compliance, with an emphasis on being air-tight. Radon was trapped in these homes, workers set off the alarms, and the radon mitigation industry was born.
We had this issue in the Midwest (central Illinois). Can’t remember exactly why, but radon would accumulate in the basements. Had to have sensors and ventilation for it.
Because the rock formations in some areas are more prone to production of Radon gas which slowly diffuses through the soil. Basements in those areas tend to have Radon infiltration. It is obviously very slow but the basement is a low point and often not well ventilated which can allow the Radon to accumulate.
Poured slabs are less permeable, not a low point, and tend to have better ventilation either intentionally or unintentionally.
It is a myth that radon collects in "low points"; concentrations are low and it diffuses as the molecules bounce around. However, as you said, radon is entering into those areas primarily, which means it will naturally be higher there. Secondly, basements often have poor ventilation, which is not to say that the radon cannot otherwise escape, but ventilation can disperse it much more rapidly than diffusion.
Technically it's the daughters of radon, the decay products, which accumulate. Radon itself has a half life of less than four days, but it decays into all kinds of nasty stuff like Polonium and ultimately Lead.
This claim (which ultimately came from the EPA) is suspect at best. It just comes from assuming a Linear No-Threshold model, which flatly contradicts the available body of medical evidence.
Thanks for posting this, I will check it out. As a Midwesterner with a basement I had once looked into those claims about radon as the second-highest cause of lung cancer with concern and found the evidence much weaker than I expected. Even by EPA estimates[0] almost 90% of the lung cancer deaths attributed to radon are smokers (i.e. who have already weakened lungs), and there didn't seem to be good evidence of damage from basement radon for non-smokers.
Let me preemptively disclaim that I do have nitpicks about the talk I linked, but IMO his main argument is sound.
In particular I think his scientific explanations aren't always fully accurate, and he worries about measurement minutiae very strenuously. Also, at one point he says you could breathe 80% radon / 20% oxygen, which, while chemically true, would radiologically be suicidal.
Those nitpicks aside, his point about the EPA LNT model having no clothes is spot on. After "the scales fell from my eyes," now I can't not notice how all those same Party Line Claims get parroted in every piece of radon-related content.
Another glimpse down the rabbit hole is this interview with the late Dr Bernie Cohen, who studied the link between residential radon and cancer:
The alternative to LNT is to conservatively assume radiation has the maximum effect not ruled out by evidence (after all, radiation is not a criminal defendant that's innocent until proven guilty). This would make low level radiation more dangerous than predicted by LNT, not less.
>The alternative to LNT is to conservatively assume radiation has the maximum effect not ruled out by evidence
No, the alternative is to use real data, rather than fabricate fraudulent data to fit the LNT model.
Namely, you need to look at the actual shape of the dose-response curve in the real world, instead of externally imposing a certain predetermined shape by fiat and fudging data to fit.
Here's one of the relevant timestamps from the talk, which runs on into Part 3.[0] However if I'm being honest, this talk is impossible to capture in a short clip suitable for modern attention spans. My apologies.
To keep from getting lost, one really should watch the whole thing before formulating the next counterpoint. Otherwise this will devolve into a very long and boring comment thread, where I laboriously digest the points from the video and feed them to you through an HN comment-sized straw. ;)
The LNT is consistent with the real world data, cherrypicked bullshit from pseudoscientists notwithstanding.
This is why the NRC recently rejected a call to stop using LNT in setting regulations. Their responses to the details of the petitions are pretty damning.
Now, at the very low doses typical of large scale exposure from nuclear accidents, it's true that the evidence does not directly require LNT. That's because the effect predicted by LNT is so small at those doses that it's swamped by noise and unmodelled biases. No practically obtainable data could do the trick.
But note what this means: the data at those low doses is so poor that it's consistent with an effect greater than from LNT. It cannot rule this out.
I'll add that a favorite pseudoscientific claim from the radiation apologists, radiation hormesis, would imply a larger effect from radiation at low doses than predicted by LNT. That's because hormesis putatively acts by inducing repair mechanisms above some dose, bending the curve downward. Below that putative threshold, the curve has a higher slope (and thus more damage per increment of dose) than the linear curve of the LNT.
>The NRC, however, does not use the LNT model to assess the actual risk of low dose radiation. Instead, the NRC uses the LNT model as the basis for a regulatory framework that meets the “adequate protection” standard of the Atomic Energy Act of 1954, as amended (AEA). Furthermore, the LNT model is applied so that the framework can be effectively implemented by an agency that regulates diverse categories of licensees
This matches exactly what the presenter says: LNT makes great policy. It's just bad science, which the NRC demonstrates here by immediately throwing the LNT under the bus when questioned.
The NRC is literally calling LNT an "assumption" here. How much more of a hint do you need?
>The 1991 final rule explained that the NRC based its radiation protection regulations upon three assumptions. The first assumption concerned the use of the LNT model, which was described as follows:
>The first assumption, the linear nonthreshold dose-effect relationship, implies that the potential health risk is proportional to the dose received and that there is an incremental health risk associated with even very small doses
Go back and read about the malfeasance of the petitioners there. They do things like cite a study, neglecting later data from the same study where the effect they want no longer shows up in the better data.
Policy has to go beyond what is rigorously demonstrated. It's not like criminal law. Technologies are not innocent until proven guilty. Just because LNT cannot be demonstrated at the doses relevant to nuclear accidents doesn't mean regulators should assume no effect.
It feels like this is something that should be easily amenable to statistical analysis.
Large numbers of people get diagnosed with lung cancer; smoking rates vary widely and smoking is typically recorded in patient histories; typical radon levels vary widely with geography and are recorded / known. This looks like a near perfect natural experiment, no?
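It is, at least in principle. A sketch of how one might set it up, assuming a hypothetical patient-level dataset (none of these columns or files exist in the thread; they're purely illustrative):

```python
# Hypothetical natural-experiment analysis: does residential radon predict
# lung cancer after adjusting for smoking? Requires pandas + statsmodels.
import pandas as pd
import statsmodels.formula.api as smf

# Assumed columns: lung_cancer (0/1), smoker (0/1), radon_bq_m3 (long-term
# average for the patient's home area), age, sex.
df = pd.read_csv("patients.csv")

# Logistic regression with a radon-by-smoking interaction, since the EPA
# figures suggest the radon effect is concentrated among smokers.
model = smf.logit(
    "lung_cancer ~ radon_bq_m3 * smoker + age + C(sex)", data=df
).fit()
print(model.summary())
```

The hard part in practice is confounding: radon-heavy geographies differ from radon-light ones in plenty of other ways too.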
I'm in the middle of moving from Texas to New England. In Texas, radon is such a "not a problem" that I've never considered it before our move. In New England, our basement tested above the "safe" limit (iirc, 4 pCi/L), so we are having a ventilator installed for ~$1.4k.
In the south, Radon "can" be an issue, but it simply isn't an issue. In New England, it is often an issue.
> In the south, Radon "can" be an issue, but it simply isn't an issue.
There's more to the south than Texas, y'know. I can't recall if radon tests are required during a home inspection here in NC, or just highly recommended, but when I was buying a house last year I toured several homes that had radon mitigation systems installed.
My understanding of why this is: New England has a lot more granite in the ground. Granite is particularly annoying in that respect because the trace amounts of (I think) uranium and radium that the rock concentrated when it was formed give off radon gas when they decay.
and that can mean high outdoor radon levels... a home inspector back in early 2000 told me about a seller sneaking back in and opening basement windows to attempt to pass a radon test (sensors left in the basement over the weekend.) The inspector caught it but let the test run to completion - at which point it failed anyway because the adjacent "forest with exposed granite rocks" was a fine natural radon source...
Heavy metals in the nearby earth are the obvious parameter here--you need something to decay into radon for there to be radon.
But I think another factor has to do with the water table. In some places the water table rises high enough to flood basements, so you don't usually have basements at all in those places, and subsequently you don't have large reservoirs of radon adjacent to living spaces. I know of some people here in CO (where we have decaying granite in the soil) that don't have a basement and still need a radon ventilator, but it's not very common.
See my other post in this thread; active ventilation is the last resort if passive methods don't get results. Perhaps in the middle of a real-estate transaction it's easier to just pull the trigger on the sure-fire method and get it over and done with...
...but if you ever get annoyed by the whir of the fan, unplug it for a month and do an alpha-track monitor. (Or get a continuous monitor, they're cheap now!) There's every chance that simply having the cracks caulked and the piping in place, might do the trick all on its own, with no need for active depressurization.
We had such a remediation done recently, in Alberta close to the rocky mountains.
Costs were around $2,000 CAD to install a sub-slab depressurization system (i.e. a fan that pulls air from below the house and vents it away from the house).
Radon values dropped from around 500 Bq/m3 to less than 20 Bq/m3.
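For anyone mentally converting between this thread's two unit systems: 1 pCi/L is exactly 37 Bq/m3 (since 1 Ci = 3.7e10 Bq by definition), so:

```python
BQ_M3_PER_PCI_L = 37.0  # exact: 1 Ci = 3.7e10 Bq, 1 m3 = 1000 L

def pci_l_to_bq_m3(x):
    return x * BQ_M3_PER_PCI_L

def bq_m3_to_pci_l(x):
    return x / BQ_M3_PER_PCI_L

print(pci_l_to_bq_m3(4))    # 148.0 Bq/m3 -- the EPA 4 pCi/L action level
print(bq_m3_to_pci_l(500))  # ~13.5 pCi/L before the mitigation above
print(bq_m3_to_pci_l(20))   # ~0.54 pCi/L after
```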
Those systems must be new. The classic solution is to install drain pipes before pouring the slab, so one can imagine how difficult it would be using 1980's construction techniques to retroactively add piping below a finished house.
We also had to dig trenches to lay natural gas lines, but we have a way to do those with horizontal boring techniques (of course then people who didn't know what they were doing put them straight through sewer lines, causing backups, visits from the Roto Rooter man, and subsequent explosions due to dumping natural gas straight into the sewer main).
Is it safe to assume they're using something like that with perforated pipes to exhaust radon?
I had a side-job installing active, subslab depressurization systems in the 90s, and they were quite well established at the time. The EPA has quite clear and cogent guidance on radon, paraphrased as follows:
Before going with active depressurization, start by installing a sealed sump cover, caulking the basement wall-to-floor joint, and caulking all the cracks in the basement walls, in that order. If that doesn't get the number down where you want it, drill the subslab access hole and install the ventilation piping to the outdoors, but don't install the fan in the middle. Only if those fail to get adequate results, install the fan.
It's really simple and quite cheap. I don't think we ever did a job that was over $1000. Costs of running the fans were pennies a month, and Fantech still sells the classic FR-100 fan all these years later, though there are even quieter options now.
So make the interior of the house the path of most resistance, and see if it will passively vent itself, and if not bore a horizontal tunnel halfway under the house and install a fan to do the job?
No! There is no horizontal boring, at least not on any system I ever touched or heard of. I don't know where that idea came from.
There's typically enough gravel under the slab and around the foundation that there's plenty of soil-gas transport without doing anything more. So the install is just a simple vertical hole through the corner of the slab somewhere. Maybe you scoop out a few handfuls of soil before sticking the pipe in the hole, but there's no horizontal boring. Here's a very typical one:
Not shown is the crawlspace portion of that system, which would use a horizontal perforated pipe _lying on the surface of the crawlspace floor_ and covered with a plastic membrane that's taped to the walls:
>Those systems must be new. The classic solution is to install drain pipes before pouring the slab, so one can imagine how difficult it would be using 1980's construction techniques to retroactively add piping below a finished house.
For instance, the first subway systems in London were installed via cut and cover. Dig a trench, build a roof. Usually those trenches were in the street, or a former canal, not under a building.
Using modern tunnel boring equipment, Seattle's attempt to move the downtown section of 99 into a tunnel ended up stalled for a year because the ground shifted and pinched the boring machine in place. Oopsie.
And of course if you're worried about gases, puncturing a solid piece of concrete is just going to make that problem worse.
Most of the cost is the installation of equipment in the existing home. In areas of the country that have a lot of radon, it's not uncommon to install the equipment during construction of the house foundation, which then only adds a few hundred dollars to the total cost.
> Radon-resistant new construction (RRNC) typically costs a builder between $250 and $750. RRNC could cost less than $250 if the builder already uses some of the same techniques for moisture control.
Some places in Czechia have a huge problem with radon emanating from the ancient bedrock below. These places, even though rural and not polluted, tend to have more deaths from lung cancer than the rust belt in northern Moravia, where air quality in general is worse, but the radon problem is almost non-existent.
Newly built houses are required to be radon-proofed, AFAIK.
I remember that "radon killer" story being vastly overblown / debunked. Apparently anything up to 150Bq/m3 is fine or even beneficial for your health. [0]
Has the debunking itself been debunked? It's hard to keep track.
Higher amounts of granite in the soil make it worse. Depending on the area, it can be a severe health hazard to the point no one puts in basements - others, kinda meh.
I really like my https://www.bettergeiger.com/ (no affiliation with them, I just own one and am happy with it) It uses a solid-state scintillator instead of the older and less sensitive Geiger-Müller tube.
I have a GMC-320S; it's OK. I had to cut away some of the plastic so it could read a uranium source I bought. The plastic housing would shield the tube from most of the radiation of the source. There are little slits that would let some in; I just widened them with a Dremel.
I wouldn't recommend it again, but if you received one as a gift, with a bit of tweaking, you'll be measuring random stuff in your house.
GQ makes several dosimeters, I own the dual-tube GMC-500+.
The cheaper single tube dosimeters will max out and saturate at radiation levels far below those that are immediately harmful to human health. (This was the source of the famous "3.6 roentgen per hour, not great, not terrible" meme-- he was looking at a meter that was reading off-scale high at 0.001 R/s)
If you're buying a dosimeter with the threat in nuclear war in mind, check the specs for maximum readings.
Aside: all of us, even the most competent engineers, can deny what is happening or make bad calls during an emergency, especially where everything is unknown during the initial stages. It takes very good training to teach us to make better emergency decisions - e.g. firefighter's training, and some military. Beware of selection bias in repeated deadly situations.
You are right that max range is important for some applications, and most cheap Geiger counters saturate very easily (that two-tube device you mentioned being a notable exception).

Even still, it is not great to refer to them as dosimeters, because measuring dose with a low-cost Geiger counter is risky at best. That's because they are calibrated to Cs-137. Some so-called energy-compensated Geiger tubes exist and give more accurate dose information, but they are not used in consumer-grade devices (or at least none that I've seen, and I own a ton and constantly study the market). Also, Geiger tubes can pick up a lot of beta, and in that case they will dramatically overestimate dose, because beta should be blocked in order to get an accurate dose reading, and none of the usual suspects do that. Even if beta is blocked, dose is often strongly overestimated due to the lack of energy compensation I mentioned.

That's one of the reasons I developed the Better Geiger S-1: to have an accurate and fool-proof high-range dosimeter at a consumer-friendly price point ($149).
At the high end, the standard workhorse for really serious hobbyists and rock hunters is the Ludlum Model 3 with a 44-9 pancake probe [1,2]. Throw in a scintillation probe [3] if you really want to find needles in haystacks. There are perennial bidding wars on ebay for these things surplus, but the manufacturer has a very hobbyist-friendly reputation. They've walked me through a few DIY repair jobs on ancient equipment they were under no obligation to support.
I have a GQ 600-RPS and I love it. It detects alpha, beta and gamma. Detection range is only a few inches, so you have to know what to look for. We take it on family outings and search for uranium rocks.
Florida is about to build some test roads using phosphogypsum, a radioactive waste product of the fertilizer industry that has been piling up and they want to unload.
It contains high quantities of radium which decays into radon, this should end well.
> "Phosphogypsum contains appreciable quantities of uranium and its decay products, such as radium-226," according to the EPA. And because the fertilizer production process concentrates waste material, "phosphogypsum is more radioactive than the original phosphate rock," the agency notes.
> "The radium is of particular concern because it decays to form radon, a cancer-causing, radioactive gas," the EPA adds.
It’s outside though, so the radon wouldn’t normally accumulate unless something weird happens. Seems like this is the sort of thing you’d want experts to look at.
Are you suggestin' that gases accumulate in runoff?
That's the thing about the uranium decay series. Just after radium, it has this gaseous stage called radon, which doesn't matter one whit when it's sealed into a rock; an atom of radon in the middle of a piece of granite just sits there for a few days and then decays right on down the chain toward lead.
But if it's near the surface of a grain of sand or something, that radium turns into radon and the radon escapes into the surrounding air. And it goes wherever the air goes for the next day or two, before turning into lead.
So if that radon is in the runoff, it just fizzes out like any other dissolved gas, and blows around in the air along with all the other gases that make up the air, exactly like any other radon that had an escape route when it was created. There's radon in the air everywhere, right now, a brief stop on the decay chain that is otherwise all solids. And those atoms of radon are undergoing alpha decay all the time, turning (after a couple of quick intermediate steps) into atoms of lead, minuscule bits of dust popping into existence, everywhere.
This is similar to a fun trick I used to show people back when CRTs were the predominant television screen. The face of the TV would build up a static charge and collect dust over time. I'd wipe the screen off with a damp paper towel and place it under an alpha/beta/gamma-sensitive Geiger probe. It would easily hit 500 counts per minute (background was about 250).
I knew people who took this to an extreme by putting a high-voltage static charge on a needle, then placing the charged needle in a closed bell jar with a pile of thorium-doped lantern mantles. The tip of the needle would pick up enough radon daughters that it made a great rechargeable point source for cloud chamber demonstrations.
I doubt it has anything to do with molecular dipoles. More likely I'd guess ionized radionuclides adsorb onto macroscopic dust particles (or some similar mechanism), which then maintain a net charge imbalance.
Yeah, the wording in the explanation was particularly suspect:
> Polarized or polarizable objects are attracted to strong electric fields
Polarizable objects align in an E field; they aren't attracted to the E field itself. I don't doubt you could contrive a field to move a polarized object, but the wording seems to be confusing an ionized object with a polarized object.
Interesting that the second answer mentions the enhanced radioactivity of laundry detergent. One of our smoke detectors briefly went off the other day as I walked under it with a basket of wet clothes straight out of the washing machine. Any chance that's what it was?
"Clothes dryers are very effective at making statically charged surfaces. (Dryer sheets help.) So when radon and its temporary decay products are blown through the dryer, electrically-polarized molecules tend to be attracted to the charged surfaces"
What that commenter misses is that nearly all hobbyist grade detectors (Geiger tubes) are not sensitive to alpha but they are highly sensitive to beta and a little sensitivity to gamma. However, any thin solid will block beta, so they would need the Geiger tube to be very near the radiation emitting material to pick up the beta. In other words, if they're just waving the detector around they're probably just catching the gamma.
The radon in the air decays into various progeny, and by the time it reaches the dryer that will be to some extent in equilibrium, so several isotopes, including gamma emitters, will be present in the mix. Therefore I'm not surprised the detector reads a tiny bit of that.
Why it dissipates is probably not a decay thing but rather the accumulated material gradually diffusing away from the filter or whatever after the dryer is turned off and no longer actively accumulating radon.
This could be tested by putting a detector right next to the filter to see how much beta it picks up. I've basically done that with a home air filter:
https://twitter.com/BetterGeiger/status/1605639346865901570?...
That's with the detector I make and sell which is primarily sensitive to gamma, which is why I could register a reading through the plastic container, even a couple days after preparing the test. When I used a pancake style detector sensitive to alpha and beta, directly against the exposed filter, the detector reacted much more strongly... But the Better Geiger S-1 gives an accurate dose reading, the Geiger tube or pancake probe will dramatically overestimate dose in that scenario, which can cause undue concern... In reality it's pretty harmless levels of radiation. :)