According to Wikipedia, granite contains 1-20 ppm of uranium [1]. One gram of natural uranium is about 25,280 Bq. Hence, one ton of granite is somewhere between 25,280 - 505,600 Bq.
In other words, if a beach is made of eroded granite, its radioactivity will be between 25,000-500,000 Bq/t: assuming the sand is more than a foot deep, I think it's safe to assume > 25,000 Bq/m^2.
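A quick sanity check of the arithmetic (a sketch; the sand bulk density of ~1.6 t/m^3 and the column depths are my assumptions, not from the comment):

```python
# Back-of-the-envelope check of the granite/beach-sand figures above.
# Assumptions (mine): sand bulk density ~1.6 t/m^3; depth chosen per call.

BQ_PER_GRAM_U = 25_280  # specific activity of natural uranium, Bq/g

def activity_per_tonne(ppm_u):
    """Bq per tonne of rock containing `ppm_u` ppm uranium.
    1 ppm in 1 t (1e6 g) of rock is 1 g of uranium."""
    return ppm_u * BQ_PER_GRAM_U

def activity_per_m2(ppm_u, depth_m, density_t_per_m3=1.6):
    """Naive Bq per m^2 for a sand column of the given depth,
    ignoring self-shielding (the point disputed later in the thread)."""
    mass_t = depth_m * density_t_per_m3  # tonnes of sand under 1 m^2
    return activity_per_tonne(ppm_u) * mass_t

print(activity_per_tonne(1))            # 25280 Bq/t (low end, 1 ppm)
print(activity_per_tonne(20))           # 505600 Bq/t (high end, 20 ppm)
print(activity_per_m2(1, depth_m=0.3))  # ~12,000 Bq/m^2 for a foot of sand
```

Note that at 1 ppm a one-foot column under each square metre holds only about half a tonne of sand, so the "> 25,000 Bq/m^2" figure implicitly assumes roughly a full tonne (about 0.6 m of sand) per square metre.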
> Some granites contain around 10 to 20 parts per million (ppm) of uranium. By contrast, more mafic rocks, such as tonalite, gabbro and diorite, have 1 to 5 ppm uranium, and limestones and sedimentary rocks usually have equally low amounts.
Radioactivity of rocks is caused by only a few elements, so it is useful to consider the minerals in which those elements are hosted; i.e., the radioactivity is not evenly distributed among the minerals that comprise granite. The elements that contribute most to the radioactivity of granite are U, Th and K. The K is mostly contained in alkali feldspar and mica. The U and Th are contained in accessory minerals, such as monazite and zircon. Monazite and zircon are dense, robust minerals, which means they persist after weathering and go on to accumulate in detrital sediments, and we call the sediments in which they are concentrated heavy mineral sands. Alkali feldspar is much less robust and rarely survives in mature sediments (i.e. quartz-rich sands); instead, all the K ends up in clays.
When heavy mineral sands are processed to extract the Ti and Zr (from rutile, ilmenite and zircon), the residual concentrate is rich in monazite. This material comprises the bulk of our easily accessible Th reserves. However, you can't just leave this monazite-rich material lying around in heaps, as it creates a windborne radioactive dust hazard, so it gets mixed back in with the other light material. This is a bit of a shame. All that energy and effort is expended to extract the heavy minerals, but as there is no immediate market for the monazite, and it is a liability to keep it in the extracted state, all that hard work is undone to mitigate the dust hazard and it gets mixed back with the quartz etc.
So it is perhaps somewhat ironic that we mine beaches to get the minerals that are the source of the bulk of the radioactivity of granites.
> In other words, if a beach is made of eroded granite, its radioactivity will be between 25,000-500,000 Bq/t: assuming the sand is more than a foot deep, I think it's safe to assume > 25,000 Bq/m^2.
I don't think this computation is valid, because you have to consider what and how much radiation makes it out of the solid. So converting from activity per mass to activity per surface area cannot be done, I think, by simply assuming a given depth.
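To put a rough number on that self-shielding point (a toy sketch; the linear attenuation coefficient of mu ≈ 0.1 /cm for ~1 MeV gammas in sand is my assumption, not a figure from the thread):

```python
# Toy illustration of self-shielding: what fraction of straight-up
# gammas from a sand column of a given depth reach the surface.
# Assumption (mine): linear attenuation coefficient mu ~ 0.1 /cm
# for ~1 MeV gammas in sand (mass attenuation ~0.06 cm^2/g x 1.6 g/cm^3).
import math

MU = 0.1  # 1/cm, assumed linear attenuation coefficient

def escaping_fraction_column(depth_cm, steps=10_000):
    """Midpoint-rule integral of exp(-MU * z) over the column,
    i.e. the average escape probability for a decay in the column."""
    dz = depth_cm / steps
    total = sum(math.exp(-MU * (i + 0.5) * dz) * dz for i in range(steps))
    return total / depth_cm

print(escaping_fraction_column(10))   # ~0.63: top 10 cm mostly escapes
print(escaping_fraction_column(100))  # ~0.10: a 1 m column barely helps
```

Under these assumptions the radiation that escapes saturates once the depth exceeds ~1/mu (about 10 cm), so making the sand deeper does not proportionally raise what reaches the surface, which is why activity per mass cannot simply be multiplied by depth to get activity per area.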
Yeah this is just a version of the 'chest x-ray' fallacy/lie/propaganda.
You simply cannot compare radiation hazards based on gross disintegrations per second. Anyone doing that is either lying because they know better, or ignorant.
> You simply cannot compare radiation hazards based on gross disintegrations per second. Anyone doing that is either lying because they know better, or ignorant.
You know, the really nice thing about this part of your statement is that there's no way to tell which side you're standing on...
This whole thread started with Bq/m^2 (i.e., "gross disintegrations per second" per square meter), after all. Of course you can accept some numbers and reject others as invalid/irrelevant, but when your whole argument looks like "I reject this number because nuclear is dangerous", it starts to look awfully like circular reasoning, or propaganda, if you prefer.
> You know, the really nice thing about this part of your statement is that there's no way to tell which side you're standing...
Two points inform my position. There is no civilian nuclear industry: military and civilian are two sides of the same coin. Due to its black-budget military origins, that industry lies about risks out of habit. None of the public debate deals with the complex risks, dubious economics, and long-tail waste issues.
I actually know almost nothing, and even I can see the public debate is just a bunch of deeply ignorant people spouting off. So my position is: given the industry's lies, the risks, and the existence of alternatives, the sooner we stop mucking around with radioisotopes and sequester the stuff we've made, the better.
I live near Aberdeen, in Scotland, a city built largely of granite, and somewhat famous for having (relatively) high levels of background radiation as a result. Supposedly, background radiation levels are higher than they are in Fukushima!
But sand is not normally made of eroded granite, or at least not only of granite. A lot of sand is made of shells or coral. Most beaches should have much lower levels of radioactivity.
What I mean is that we can't just take the most radioactive beaches in the world and assume that this is the normal level for beaches, so "it's not so bad". In real life, beaches aren't made (only) of granite dust; most of the material is silica and calcium carbonate.
And we should remember that, unlike chest X-rays, thousands of people lie practically naked on the same beach and favourite spots for entire weeks, ten hours a day, while also being exposed to the sun's UV rays, another important source of cancer. In these cases, even relatively low doses could be of concern.
People at the beach also eat a lot of local seafood.
[1] https://en.wikipedia.org/wiki/Granite
[2] http://www.wise-uranium.org/rup.html