When I went to listen to a dark matter talk at the observatory in Cape Town, my first response was: Why on earth do they keep insisting that the r^2 rule should hold?
Maybe a physicist can explain why, but as a mathematician my first quest would be to try to form a mathematical system that follows what we are observing, without introducing physical objects that we aren't actually seeing.
1- 1/r^2 is the most natural in 3D space. Think of it as the surface of an expanding sphere. A lot of things follow the inverse square law: electric fields, light intensity, sound intensity,... It is not a strong argument: there are other things that don't follow 1/r^2, and we are not even sure our universe really is 3D (though there is strong evidence for it).
2- Theories that don't involve 1/r^2 (MOND) exist, but they are not sufficient to match observations without also involving dark matter. And by Occam's razor, since MOND introduces another parameter (the behavior of gravity at large scales) but doesn't simplify anything away (dark matter is still needed), it is not preferred.
3- While we don't know the details about dark matter, we "see" it in a way. We see it affect the movement of galaxies, and we see the gravitational lensing effect it has. It is just that we can't observe it with our preferred methods, which involve electromagnetism. I mean, we don't need to see a glass pane to know that it's there. Sure, we'd like to know more, and the evidence for dark matter is weaker than our glass pane, but we simply don't have anything better right now.
A question on 1-: does the introduction of intermediary objects stack well? What I mean, for instance: if the Moon is between us and the Sun, the light gets blocked out but gravity does not. Light intensity then does not stack well (for visible light frequencies) when there are intermediate objects. Or does the analogy here require a more formal transposition to GR and the distortion of spacetime?
I'm not sure that I understand your question. Certainly all of our understanding of astronomical effects includes (and often depends on) the effects of intervening clouds of gas and dust: that hasn't been overlooked or omitted from any professional-level analysis. (My memory/guess is that in the presence of an absorptive medium, influences like light intensity will fall off as e^(-a*r)/r^2 rather than just 1/r^2. But the point is that we can model that absorptive term reasonably well, and the underlying 1/r^2 still results naturally from 3D space.)
(Meanwhile, to a first approximation, intervening dense objects like planets or stars are rare enough to be negligible: even in a region like our solar system where such objects are most likely to be found, the available volume is overwhelmingly empty space.)
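(If it helps to see those two falloffs side by side, here is a minimal numerical sketch; the absorption coefficient a is a made-up illustrative value, not anything measured.)

    # Minimal sketch: inverse-square falloff vs. the same falloff with an
    # absorptive medium in the way. The coefficient a is purely illustrative.
    import math

    def vacuum_intensity(r, power=1.0):
        # plain 1/r^2 dilution over the surface of a sphere
        return power / (4 * math.pi * r**2)

    def absorbed_intensity(r, power=1.0, a=0.1):
        # same dilution, times exponential attenuation exp(-a*r) from the medium
        return math.exp(-a * r) * vacuum_intensity(r, power)

    for r in (1.0, 2.0, 10.0):
        print(r, vacuum_intensity(r), absorbed_intensity(r))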
My question is whether gravity works like light, or whether you can just ignore the planets in between. When, for example, you calculate the effect of the Sun on the Earth but Venus or Mercury happens to be in the line of sight, does it influence the Sun-Earth calculation component? (Of course, the other planets have their own gravitational influence, in line with 1/r^2.)
To first order, gravity is indeed linear in the way you suggest, so you can just add up the effects of different planets separately. To see why this is the case, note that the Sun itself could be considered a collection of independent masses. If there were some non-linear gravitational effect, then the gravity of the separate components would be different from the gravity of them all together. However, gravity has been measured across a wide range of masses and scales, and it adds up just as Newtonian gravity predicts.
That being said, general relativity predicts that there is in fact a non-linear scaling when gravitational fields are extremely strong. The reason is that there is some energy associated with the gravitational field, and any energy produces its own gravitational field. So the gravitational field itself exerts its own gravitational field, leading to the non-linearity.
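Here's a rough numerical illustration of that superposition, using round-number masses and a toy geometry (Venus placed at its closest approach of about 0.28 AU along the Sun-Earth line; this is not a real ephemeris):

    # Toy Newtonian superposition: the Sun's pull on Earth is one term, Venus's is
    # another, and the total is just their sum. Putting Venus "in between" does not
    # change the Sun's term; it only adds Venus's own (tiny) term.
    G = 6.674e-11                         # m^3 kg^-1 s^-2
    M_SUN, M_VENUS = 1.989e30, 4.867e24   # kg
    AU = 1.496e11                         # m

    def accel(mass, distance):
        # magnitude of the Newtonian acceleration from a point mass
        return G * mass / distance**2

    a_sun = accel(M_SUN, AU)              # ~5.9e-3 m/s^2
    a_venus = accel(M_VENUS, 0.28 * AU)   # ~1.9e-7 m/s^2
    print(a_sun, a_venus, a_venus / a_sun)  # Venus's term is ~3e-5 of the Sun's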
Are you indirectly asking whether planets can be dark matter candidates? If so, those are called MACHOs (massive compact halo objects). They have been studied via the gravitational microlensing effect. The presence of MACHOs can't account for most dark matter.
When you see figures like "1E-7 to 1E-2 M⊙ mass range" (from the above paper's introduction), note that a Jupiter mass is about 1E-3 M⊙, one thousandth or 0.1% the mass of the Sun. Earth is 3E-6 M⊙, the Moon is 3.7E-8 M⊙, and Pluto is 6.6E-9 M⊙; or 0.0003%, 0.0000037%, and 0.00000066% the mass of the Sun respectively. Compared to stars, planets comprise a tiny fraction of the mass of a galaxy.
TL;DR - No, that's not how gravity works, but you can still ignore the planets in between because they're tiny.
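(For anyone who wants to redo the arithmetic above, the solar-mass fractions convert to percentages like this:)

    # Convert the quoted masses (in solar masses) to percentages of the Sun's mass.
    bodies = {"Jupiter": 1e-3, "Earth": 3e-6, "Moon": 3.7e-8, "Pluto": 6.6e-9}
    for name, m in bodies.items():
        print(f"{name}: {m:g} M_sun = {100 * m:g}% of the Sun")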
I think the question is: given a 3-body system with objects A, B, and C, is C's gravitational effect on A altered in any way by the position of B, or are B's and C's effects on A independent?
The reason for the question, I think, is because of an intuition built up around EM. In the system
A B C
B can block and otherwise interfere with C's EM radiation so that its effect on A is different depending on B's location.
No, if you want to think of gravity in analogy to light, you can see every mass as analogous to a fully transparent light source. There is nothing in this universe that can absorb gravity.
LIGO was able to detect gravitational waves, which I'm pretty sure must require non-zero transfer of energy in one form or another. (I'm hedging a little only because I know LIGO was measuring relative changes in metric distance, which might possibly not require absorbing energy... but just intuitively it's hard for me to fathom any transfer of information without a transfer of energy: it simply shouldn't be possible.) But 1) it's a remarkably small loss of energy, since the gravitational coupling to matter is so weak, and 2) LIGO is very specifically reacting to gravitational waves rather than to static (or quasi-static) gravitational fields, and the absorptive properties there will certainly be different (just as they are for electromagnetism).
General relativity will probably add some effects like that, but then quantum electrodynamics adds light scattering off of light, so the analogy is still ok as far as analogies go.
Thanks. So, in the analogy between gravity and light, you have to change all "massive bodies" to "fully transparent light sources" to make the analogy complete.
A nontransparent light source would not translate as a (straightforward) massive body.
I suppose that gravitational waves may be ever so slightly attenuated as they pass through matter, but the coupling is so weak that we probably wouldn't notice. (Think of how hard LIGO had to search before they found the first black hole merger signal.) So even for waves, the gravitational effects of the sun would be all but unaffected by Mercury or Venus being in the way.
For a static 1/r^2 field, intervening matter has very little effect at all. (There are non-linearities in principle, but for all but the most extreme configurations they're negligible in practice.) Certainly when the Newtonian approximation holds you can just add up the separate 1/r^2 effects of all the separate sources to find the overall effect. (This, by the way, is very similar to the behavior of electromagnetic fields. The waves are attenuated or blocked by intervening matter fairly easily, but it takes a pretty special situation to actually block the 1/r^2 effects of a static electric or magnetic field: you'd need to surround your system in a conducting "cage" that would polarize in response to an external electric field to cancel out the effects of that field inside, for example. That isn't possible for gravity, since there doesn't exist negative mass to do the screening.)
Ultimately, the 1/r^2 rule really is intimately connected to the three dimensions of space in our universe. In theories with additional spatial dimensions, that rule changes. Many theories with extra dimensions follow the Kaluza-Klein model, where the extra dimensions are "curled up" very small (in the sense that if you were to travel a distance L in that extra dimension, you'd come back to where you started: like a little circle). In the case of one extra dimension like that, if you were to measure the behavior of gravity over distances much shorter than L, you'd find that the gravitational force fell off like 1/r^3, but if you measured its behavior over distances much longer than L you'd find the familiar 1/r^2. (And for that reason, we know that the L for any system like this must be very small: my memory is that direct gravity measurements can set a limit like L<1mm or maybe even L<1 micron, and indirect evidence from things like particle physics observations pushes the limit down to the nuclear scale or below: L<10^(-15) m or even much less. I probably ought to know the actual values of both of those limits off the top of my head, but it's been a while since I looked at it.)
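The standard back-of-envelope statement of that crossover, for n compact extra dimensions of size L (n = 1 gives the 1/r^3 case above), is:

    F(r) \propto
    \begin{cases}
      \dfrac{1}{r^{\,2+n}}, & r \ll L \quad \text{(flux spreads into all } 3+n \text{ spatial dimensions)} \\[6pt]
      \dfrac{1}{r^{2}},     & r \gg L \quad \text{(extra dimensions saturated; ordinary 3D falloff)}
    \end{cases}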
Nobody really likes the idea of particles or other stuff that is impossible to measure.
But so far, General Relativity has passed every experimental test, and most alternative approaches to dark matter other than "some heavy stuff that doesn't interact electromagnetically" are basically ruled out by one observation or another.
There are solutions that fit the large scales pretty well (Cosmic Microwave Background, Galaxy Formation) but fail at the smaller scales (Orbital speeds) and vice versa.
I can try to think of why, but I'd be guessing. I understand that many things in GR follow from the requirement that the speed of light stay constant when we translate between reference frames. My exposure to physics is via category theory, and specifically self-dual theories that apply to the quantum-sized world.
Well, there's the answer that it's what seems to fit the facts. GR may be one of the primo-grade "unreasonable effectiveness of mathematics" examples as it captures rather complex (to human minds) behavior in a handful of carefully-defined symbols, but technically it's still an empirical theory first and mathematical theory second.
Secondly, as others mention, there is the obvious answer that r^2 is how surface area changes as you increase the radius of a hypersphere in 3-D space; here I deliberately say "hypersphere" even though we're in 3-space, just to emphasize that it's particular to the 3-D case. There is a theory that the reason gravity is so much weaker than the other forces is that it is "escaping" out into other dimensions we can't see, though that still leaves the part we can see falling off as inverse-square within the dimensions we can see.
(One of my favorite crazy dark-matter theories is that it lies in those other dimensions, and the reason we can't capture any is that it is literally not "in our dimension". That said, it still can't be conventional baryonic matter, because dark matter doesn't clump anything like it does. If there was a "parallel universe" or 10 just like ours, except they gravitationally interact, we'd see things like stars in orbit around "nothing", because the stars in the parallel universes are interacting. That's not what the universe looks like.)
They did try. Those alternative theories fall under an umbrella called MOND (MOdified Newtonian Dynamics). They were actually some of the first real proposals to solve the dark matter problem. Unfortunately, none of these theories panned out.
The easiest-to-understand evidence against MOND is the Bullet Cluster, a pair of galaxy clusters in the process of colliding. They also happen to be gravitationally lensing galaxies behind them. While the X-ray and visible-light observations show that the gas clouds that make up the bulk of the normal matter in the clusters stopped when they hit each other (like two cars in a head-on collision), the gravitational lensing shows that the overwhelming majority of the mass simply passed straight through, the way dark matter as particles would.
So in order to accommodate the Bullet Cluster, MOND would need to show that empty space has some sort of momentum and memory, which is going to be much less elegant than a second hierarchy of particles that do not interact via the electromagnetic or strong nuclear forces.
There are a very large number of other observations that also disprove MOND or its variations; the Bullet Cluster is just the easiest to describe in a paragraph.
It's actually very, very common for laypeople to come up with MOND as a solution for dark matter. I'm pretty sure I did when I first heard about dark matter. I don't think I've ever seen a dark matter post on HN or reddit that didn't include at least one comment positing MOND, even if (especially if) they don't know that the theory they came up with has a name, an acronym, and had already been thoroughly disproven by the scientific community.
> even if (especially if) they don't know that the theory they came up with has a name, an acronym, and had already been thoroughly disproven by the scientific community.
Without commenting on the merits, I think the theory "traditional kinematics need to be adjusted somehow at large scales" is a little too vague to have been disproven.
It seems pretty straightforward to me to say that has been disproven, for reasonable values of disproven, because you'd have to adjust it one way here and another way there.
The r^2 rule should hold because there is a concept called "flux". Imagine a field of force emanating from the center of a sphere.
Now twiddle the radius of the field. Presume the intensity of the field is distributed evenly but diluted over the net surface area of the sphere; that surface area grows in proportion to the square of the radius.
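A tiny numerical version of that dilution argument (the total flux value is arbitrary): spread the same flux over spheres of growing radius and the intensity drops as 1/r^2.

    import math

    P = 1.0  # total emitted flux, arbitrary units

    def intensity(r):
        # the same total flux P spread evenly over the sphere's surface 4*pi*r^2
        return P / (4 * math.pi * r**2)

    print(intensity(1.0) / intensity(2.0))   # -> 4.0: doubling r quarters the intensity
    print(intensity(1.0) / intensity(10.0))  # -> 100.0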
That said, there are some fascinating alternative theories of gravity, such as MOND or ones based on the Unruh effect, which don't invoke an inverse-square rule.
But still note that even if some solution appears "pretty" or "obvious" mathematically for some specific scenario (like drawing the areas for the inverse-square rule, or looking at a galactic rotation curve for a "fit" with MOND), it doesn't automatically follow that all of the measurements in all the experiments will match it.
The "better" theory must cover more, not less of the measurements. The one that at the moment certainly covers the most is dark matter, and the alternatives simply cover much less (or as I've posted even somewhere result in the wrong shapes). Once some alternative manages to cover the observations and bring even more predictive power, that one will eventually be accepted (although sometimes "one funeral at a time" was needed), even if it's less "pretty" and less "obvious."
The problem with the dark matter model is that it has more free variables, so epistemologically it's just "able to fit more". At some level it feels like that time in high school physics (a gravity lab with ball bearings) when one of my classmates proudly showed off a sixth-order Excel polynomial fit of five data points with an R^2 of 1. (That doesn't mean dark matter is wrong.)
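As an aside on that anecdote, here is a minimal sketch of why such a fit is meaningless (the data are made up; any polynomial of degree four or more passes exactly through five points, so R^2 = 1 by construction):

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([0.1, 0.9, 4.2, 8.8, 16.1])   # made-up "ball bearing" measurements

    coeffs = np.polyfit(x, y, deg=4)            # 5 coefficients for 5 points: exact interpolation
    y_hat = np.polyval(coeffs, x)

    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    print(1 - ss_res / ss_tot)                  # ~1.0: a "perfect" fit with zero predictive value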
More than what exactly? As far as I know, the https://en.wikipedia.org/wiki/Lambda-CDM_model is the best we have, i.e. with the minimal number of parameters that still covers most of the observations. If there were a better one, it would already be accepted.
But unless you try to inform yourself, you wouldn't believe how good it is compared to the alternatives. Really, really good, as in: it actually predicted the future measurement results for the oldest signals reachable to us, and the later measurements matched perfectly.
> epistemologically
Ah... I'd guess then that you don't write about physics but some non-scientific belief system.
The Lambda-CDM model assumes the existence of a distribution of dark matter which is unique to each galaxy and empirically derived. That's potentially a (countably) infinite-dimensional free-variable vector for each galaxy, although models typically assume dark matter is symmetrically distributed (which can still be a countably infinite-dimensional free-variable vector, as a function of rho and phi at least). In any case, it has enough free variables to account for two drastically different cases: "normal" galaxies, and diffuse galaxies, which apparently have no dark matter. How do you parameterize that without at least one free variable per galaxy?
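For concreteness (and without taking a side on whether this counts as too many free parameters): the symmetric halo profiles typically fitted in practice are few-parameter families rather than free functions of radius. A common example is the NFW profile, with two parameters per halo; the numbers below are hypothetical, just to show the shape.

    def nfw_density(r, rho_0, r_s):
        # Navarro-Frenk-White halo profile: two free parameters (rho_0, r_s) per halo.
        x = r / r_s
        return rho_0 / (x * (1.0 + x) ** 2)

    # Hypothetical values (roughly Msun/kpc^3 and kpc), purely illustrative.
    for r in (1.0, 10.0, 100.0):
        print(r, nfw_density(r, rho_0=1e7, r_s=20.0))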
> If there were a better one, it would already be accepted.
Because we're looking at the surface of a sphere, and that scales like r².
Why do we look at the surface? Because every point at the surface has the same distance from the center, and so we expect the field to be the same on each point.
You might be interested in quantized inertia http://physicsfromtheedge.blogspot.com/ which is a model that does away with the need for dark matter and has been gaining some interest. For instance, the author claims it replicates the successes of MOND but also predicts where MOND and GR both fail (see the wide-binaries posts on the blog).
I 100% agree that when the model doesn’t fit the data, making up some hitherto undetected kind of matter, distributed in just the right way to explain the discrepancies, is kinda odd... the QI author is constantly complaining about this.
"making up some hitherto undetected kind of matter, distributed in just the right way to explain the discrepancies, is kinda odd"
That seems totally backwards to me. Matter, by definition, is what is distributed spatially in just the right way to explain our observations. Physical laws, by definition, are not; they are universal. So why keep trying to explain what appears to be matter with adjustments to physical laws?
It's like you kept seeing elephant footprints appear out of nowhere, but insisted that rather than believe in invisible elephants, the most parsimonious explanation is that they are a natural consequence of a corrected law of gravity. That is, to me, turning Occam's Razor inside out.
I think your response makes sense (neutrinos are another example where the theory predicted an as yet undetected particle and it was eventually found).
But the other reply to your comment is good too - at what point, after failing to detect any OTHER signs of these invisible elephants, do you wonder if there are other explanations? What would it take to falsify dark matter, given that you can invent it in whatever distribution you like to fit the data?
There’s nothing that logically compels us to assume the existing model of gravity is correct, and its discrepancy with observation is explained by assuming the inputs aren’t what we can observe. It is also logical to consider that the existing model of gravity isn’t quite right. Of course any tweaks to gravity have to be compatible with a lot of other observations but it would be wrong to rule out the possibility of an alternate model.
"What it would take to falsify dark matter, given that you can invent it in whatever distribution you like to fit the data?"
What does it take to falsify any observation of any kind of matter? Isn't the chair you are sitting on just as unfalsifiable? In an effort to be consistent, you might concede that the belief your chair (or anything!) exists is not scientific, so then I ask if that's not a problem, why is dark matter a problem?
Ok, so epistemologically we can’t know what reality “really” is, only what we observe, and we can come up with models that fit our observations. Your (I’d say vacuous) point is that my model of reality that includes a chair is epistemologically on the same level as a model of the cosmos that includes dark matter... though it’s also on the same level as a model that gravity is caused by invisible purple dinosaurs. :) So this perspective doesn’t really get us anywhere in judging models of reality.
I’d say models are useful when they are simple and predictive. Dark matter isn’t really predictive, it’s what is hypothesized to compensate for the prediction error of our current model of gravity. (Doesn’t mean it’s wrong but am just pointing this out)
So I don’t think dark matter is inherently problematic, just that it’s not predictive and it’s worth considering whether the model of gravity we have needs tweaking instead (much like GR refined Newtonian gravity).
I think, philosophically, dark matter has been "seen" as much as anything. Like, radio waves are invisible, but you accept that we've "seen" them, right? So-called visible light isn't what activates your visual cortex; it's a cascade of Rube Goldberg machinery triggered by photons; the inside of one's head is dark. Nothing is seen directly, but all the things we think we've seen have various intermediate steps before we can "see" them. We've detected gravitational waves - is that "seeing", and do people who doubt dark matter exists doubt that the gravitational-wave detectors work?
But that's what the physicists have tried (and not been successful at so far).
The issue is that modifying how we think gravity and matter works is not just a change in a math equation.
Changing those equations would also mean a whole chunk of modern physics, including general relativity, is wrong - yet all those theories have been observed and confirmed in many very different ways to be consistent and valid.
So if you change those equations to adjust for dark matter, they no longer give the result for many other phenomena that we observe.
"Modified Newtonian dynamics (MOND) is a theory that proposes a modification of Newton's laws to account for observed properties of galaxies. It is an alternative to the theory of dark matter in terms of explaining why galaxies do not appear to obey the currently understood laws of physics." [1]
Seems like this is (one approach to) exactly the question I asked! Even if proven not to lead to any satisfactory cosmological model, these kinds of studies will at least tell you why the alternatives (to r^2) don't hold.
Modern science has modern technology available to measure so much, and with such precision, that any theory that attempts to be scientifically accepted has to match the existing measurements.
The only approach that is not "dark matter" and that doesn't completely disqualify itself on all scales (by not matching anything) is MOND, which is still more "let's try to find some formulas that can match at least some observations" than something that can even now be consistent with all observations and measurements. Still, where MOND doesn't match the observations, it fails not by orders of magnitude; it even results in the wrong shape of the curve:
"The biggest challenge facing MOND today is the shape of the matter power spectrum. The shape depicted in Fig. 1 is related to the acoustic oscillations observed in the CMB. If the Universe is dominated by dark matter, these matter oscillations, dubbed Baryon Acoustic Oscillations (BAO), are highly suppressed as the baryons fall into the potential wells created by dark matter, leaving only percent level traces of the primordial oscillations. In a no-dark matter model, on the other hand, the oscillations should be just as apparent in matter as they are in the radiation. Indeed, Fig. 1 illustrates that – even if a generalization such as TeVeS fixes the amplitude problem – the shape of the predicted spectrum is in violent disagreement with the observed shape."
"On the scales of groups of galaxies, individual galaxy clusters, colliding galaxy clusters, the cosmic web, and the leftover radiation from the Big Bang, MOND’s predictions fail to match reality, whereas dark matter succeeds spectacularly. It’s possible, and perhaps even likely, that someday we will understand enough about dark matter to understand why and how the MOND phenomenon on the scales of individual galaxies arises. But when you look at the full suite of evidence, dark matter is practically a scientific certainty. It’s only if you ignore all of modern cosmology that the modified gravity alternative looks viable. Selectively ignoring the robust evidence that contradicts you may win you a debate in the eyes of the general public. But in the scientific realm, the evidence has already decided the matter, and 5/6ths of it is dark."
As a mathematician, you might want to start from Gauss's law for gravity (the flux theorem for gravity). You need only Gauss's law and the zero-curl assumption to derive Newton's law of gravity.
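Sketching that derivation for the spherically symmetric case (g is the gravitational field, M the enclosed mass):

    \nabla\cdot\mathbf{g} = -4\pi G\,\rho, \qquad \nabla\times\mathbf{g} = 0
    \;\Rightarrow\;
    \oint_{S}\mathbf{g}\cdot d\mathbf{A} = -4\pi G\,M_{\text{enc}}
    \;\Rightarrow\;
    g(r)\,4\pi r^{2} = -4\pi G\,M
    \;\Rightarrow\;
    \mathbf{g}(r) = -\frac{G M}{r^{2}}\,\hat{\mathbf{r}}.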
It is interesting to see some of Gauss's side quests during his life and the scope of his interests.
I remember reading that Gauss's (and his contemporaries') foray into statistics was greatly motivated by the central limit theorem. Every now and then one sees a question about least squares, asking why we don't usually use least cubes. [1]
Abel said of Gauss: "He is like the fox who erases his tracks in the sand with his tail."
Galaxies don't all have a proportionally equal amount of dark matter. So if you invented maths that would account for some dark matter, it would fail in other cases. You'd still have anomalous events requiring the kind of ad hoc explanation that dark matter solves, plus an extra superfluous theory on top of that.
There are theories. I don't claim to understand this, but there are persistent physicists who have been deriving alternate formulas for many, many years. Example:
http://riofriospacetime.blogspot.com/
It's more complex: dark matter theory beautifully predicted the results of the CMB measurements, and no alternative theory can match that, even given hindsight.
Suppose many smart people do something that makes no sense to you. Specifically, something that seems obvious to you isn't being done by them.
Then it's reasonable to assume you are missing something. One easy way to figure out what you are missing is to say "hey, I'm not from this field, but I don't see anyone try X. Can someone from the field explain to me why?".
That is not calling everyone in the field stupid. It is trying to learn more about the field. Generally you can learn a lot from "why don't you just..." questions because they reveal widely presumed knowledge you are missing.
Hubris only comes in if you start chastising others for not doing the obvious thing.
They were kinda rude about it though; "why on earth" is not the humblest expression to prefix a question with, especially when asking outside one's domain knowledge.
And the answer is simply that they are ill-informed about r^2 being the best fit to all the data; popular talks just use rotation curves as a simple-to-explain justification and don't bother with structure-formation arguments and the like.
OP here, "why on earth" was in context to the talk. At said talk the person assumed the audience generally want r^2 to hold, but did not mention that r^2 should preferably hold (for the reasons posted in other comments).
In fact, the scope of the talk was quite different. It was mainly about explaining how we are looking for dark matter. Of course, this slightly different topic is still very interesting. It happens to be confusing, though, for a person who may prefer theory extension over empirical extension (in their own research).
They're as much an armchair physicist by questioning the inverse-square rule as you're an armchair psychologist by invoking "passive aggressive condescension". Or do you actually have the necessary qualifications to make that diagnosis?
Seriously though, it's a good question. The inverse-square rule is most likely well-founded, but reexamining such foundations is how you make theoretical progress - at least in mathematics, if I'm to believe Polya. I am not a mathematician myself, but it does seem like a pattern that generalizes well to solving theoretical problems in all domains ;).
Sometimes the best way to understand a problem is to ask lots of questions. And the best way to get a meaningful response is to frame those questions with context.
I don't read any ego in the GP's tone - just an individual who was curious about a physics problem and wanted it explained in a language he understood (hence the mathematics context).
It seems like the number of telescopes is a limiting factor in astronomy, and while expensive, new telescopes seem to be a straightforward way to get interesting new results. It is unfortunate that researchers have to wait for telescope time like computer scientists had to wait for computer time back in the day.
The request for additional funding for extra telescopes is always met with the question "but what use is that knowledge?" Politicians don't want Moon missions, they want Teflon-coated non-stick pans. They don't want Higgs bosons, they want the WorldWideWeb. Problem is, you never know what commercially useful things you will accidentally find while you hunt pure knowledge.
Given that they're staring at an entire galaxy for such a long time at quite high resolution, I wonder whether they noticed anything else unusual?
Given the sheer amount of data (870 megapixels every 90 seconds for a night) you will only find or not find things you're very specifically looking for. The volume of data is simply too large for random discovery. As the article covers, looking for a very specific type of interaction, the researchers were still faced with 15,500 possible matches that they then needed to filter.
7 hours of observation, one image every 90 seconds: that's 280 images. My phone takes 8 megapixel photos and those are about 2 MB, based on that the 870 megapixel image would be about 200 MB. 200 MB * 280 = 56 GB. Doesn't sound that much.
Did they publish the images? I'm sure many astronomers would be happy to examine them.
This isn't a smartphone camera. The raw images are 16-bit monochrome, which is about 2 GB each. Additionally, you'll take dark exposures etc. to compensate for errors, which overall results in roughly 13 GB per shot [1].
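Putting rough numbers on both estimates (cadence and pixel count from the article; 2 bytes per pixel for 16-bit monochrome; this ignores the calibration frames that push the per-shot total toward the ~13 GB figure above):

    n_images = 7 * 3600 // 90                     # ~280 exposures over a 7-hour night
    pixels = 870e6                                # 870 megapixels per exposure

    jpeg_like = n_images * (pixels / 8e6) * 2e6   # scale from an 8 Mpx, ~2 MB phone JPEG
    raw_16bit = n_images * pixels * 2             # 2 bytes per pixel, uncompressed

    print(jpeg_like / 1e9, "GB (phone-JPEG-style guess)")   # ~61 GB
    print(raw_16bit / 1e9, "GB (raw 16-bit frames)")        # ~487 GB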
So 280 images at 870 megapixels each is 243,600,000,000 pixels to scan through? That is a lot unless you can filter for something specific. That's why they didn't simply stumble upon anything else significant.
Yes future analysis of the images could potentially turn up something else but that wasn't what was asked.
When I write a test, I intentionally make the condition fail to validate that the test works like I think it does.
The experiment described here uses new equipment that presumably has never been used for this purpose before. But I don't see any mention of efforts to verify that it can actually be used to detect black holes. Hopefully the remaining 10 days of their experiment can confirm that it works...
The “Hyper Suprime-cam” instrument came online in 2012, so it’s not new, and was presumably calibrated against well-understood targets; so its light-detection capabilities have been well tested. The physics of gravitational lensing is also well understood and verified, so they know what light they’re looking for.
If you can test each part of a system independently, and the parts connect in a reasonably trivial fashion, it’s about as good as testing the whole.
Depends. In functional programming, if all components of a pipeline are tested independently, you can all but guarantee that the pipeline as a whole will work as intended.
That said, software testing isn't really an apt analogy now that I give it some thought. Testing real world equipment has to contend with a lot more edge cases and environmental factors.
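(Toy illustration of the functional-pipeline point, with hypothetical stage names: if each pure stage is tested on its own, composing them adds no hidden state that could break.)

    # Each stage is a pure function that can be unit-tested in isolation;
    # the pipeline is just their composition, so it inherits their correctness.
    def parse(line: str) -> float:
        return float(line.strip())

    def calibrate(value: float, gain: float = 2.0) -> float:
        return gain * value

    def exceeds_threshold(value: float, cut: float = 1.0) -> bool:
        return value > cut

    def pipeline(line: str) -> bool:
        return exceeds_threshold(calibrate(parse(line)))

    assert calibrate(parse(" 0.25 ")) == 0.5
    assert pipeline(" 0.25 ") is False
    assert pipeline(" 0.75 ") is True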
To be fair, this comes after years of lensing surveys that had already narrowed the loophole for the MACHO theory down to the pinhead closed by this last survey. Still a good read.
Sixty Symbols, which is a great channel for these kinds of topics, did a great video on primordial black holes a few years ago: https://youtu.be/gs3mtZPySeM
We have been burnt before. We invent something to fit our idea, e.g. that all waves need a medium. Hence light was said to run in the ether... in fact, we even call Ethernet that in remembrance of this ether.* But it does not exist.
We have done it before. We believed there is one rule from the smallest to the largest, that God does not play dice with us, and definitely not in the dark. We were wrong.
I don't know whether this holds for dark matter (and the totally different case of dark energy). Just to say that eliminating one more theory, like the black-hole one, may not be the end of it, or rather may be the birth of dark matter. If it is just weak... but dark. I still need convincing, most positively.
* If you search for why Ethernet is called Ethernet, you should find this:
“In 1973, Metcalfe changed the name to “Ethernet.” He did this to make it clear that the system he had created would support any computer, not just Alto’s.
He chose the name based on the word “ether” as a way of describing an essential feature of the system: the physical medium carrying bits to stations.
He thought this was much like the old luminiferous ether was once thought to propagate electromagnetic waves through space.”
One of the best pithy comments on general relativity and dark matter was made by someone on HN about a week ago. I think it summarizes the situation quite well. But, inexplicably, people don't seem too concerned about it:
>"[dark matter] seems to make up about 95% of the galactic mass...crushingly narrow passages between observation and tests of GR"
GR only matches observation because you added in 20x more stuff that is undetectable other than as a deviation from the predictions of GR.
People aren't worried about it because it doesn't seem like a problem. We know that weakly-interacting matter can exist (as demonstrated by neutrinos), and there's nothing odd about the behaviour of dark matter: it seems to behave exactly as we'd expect weakly-interacting matter to behave. That's certainly not case closed*, but it's plenty to make the explanation perfectly reasonable as a working theory.
On a more philosophical level, though, if experts aren't concerned about something, and you as an amateur are, you're almost certainly the one who is wrong. Assume you are, and look for why, rather than questioning their conclusions. It's like the old adage about compiler bugs: they exist, but if you think you've found one, you haven't.
* And indeed scientists don't treat it as though it is - there are several other theories, though it's the strongest contender.
I'd take what you say about amateurs worrying about what professionals don't to be true for the most part, but it's also not always true. Even in physics.
IIRC, about a hundred years ago, when we were trying to measure the electron diameter, several iterations of scientists based their results on the assumption that the last guy was accurate. Dark matter has a lot of similar mysticism, and I would not be at all surprised to see a metric shit ton of scientists just being sheep and going along with incorrect models. I'd also not be surprised if they were all spot on, too.
For reference in grad school I worked in a neutrino lab right by a dark matter lab underground. So I'm not an expert but do have decent exposure.
I didn't say it was always true, I said it was almost certainly true. I stand by that.
Untrained amateurs are right from time to time, but given that they have almost no basis for their assumptions except gut feelings, it basically comes down to chance.
I'm not arguing that scientists can't be wrong, certainly not. I'm arguing that believing they are, or even believing you have a valid criticism, without understanding on what they base their arguments, because "it sounds reasonable" or "scientists are sheep" is deeply intellectually dishonest.
You are referring to the electron charge, not its diameter, which cannot be measured.
Certainly there is bias in physics, but I don't recall any occasion where amateurs were correct and experts were not (in fundamental physics, that is). This may occur in fringe fields but is highly unlikely in fundamental stuff.
Fair enough, but at the same time dark matter has been around a while and we're no closer to understanding it, other than a few ideas about what it isn't. I do think it's a candidate for a very interesting paradigm shift, of which there have been some before in physics, and maybe that's more what I'm getting at. I'll be real, I've had a long weekend.
> It's like the old adage about compiler bugs: they exist, but if you think you've found one, you haven't.
I'm not sure if that's a good simile. I've found several compiler bugs in my time, and I proved that with assembly output from the compilers and I've received updated compilers from the vendor as a result of my reports.
I suspect that finding "bugs" in physical theories must be orders of magnitude more unlikely.
I've been reading Ars for many years now and I just wanted to say that the clarity this specific article displays in explaining astronomical concepts to a layperson is really impressive and appreciated. Sometimes I get lost in these types of write-ups, but you gave readers the state of the field and why this study - which simply provides evidence that likely nixes one potential explanation - truly was important in that context. It was all just so digestible and satisfying.
Totally agree.
I loved the implication that one simple 7-hour viewing yielded 15,500 events to be filtered, and that they could observe a binary star system's change in that time frame. To me, that's awesome!
The thing is that there's no conceivable mechanism we know of for a non-primordial black hole to form and be as small and common as would be required to explain away dark matter.
Yes, I understand that. So primordial black holes don't explain dark matter, but dark matter as a hypothesis came before them, no? To explain something else, no? The article wasn't very clear about all the links.
Also, there's zero indication in the article that this invalidates the existence of black holes.
And about dark matter, the last sentence says: "But the study does slim down the already thin chances that they are the source of dark matter's effects."
Insinuating that primordial black holes were a hypothesis for the effects of dark matter, not that dark matter theory is proven inconsistent by the lack of evidence for primordial black holes.
I'm looking for someone who knows better I guess, to clarify. I'm only interpreting this using my reading skills from the article itself.
I'm not positive, but I feel there might be a misunderstanding here. This article isn't trying to say dark matter and/or black holes don't exist, just that they aren't the same thing.
But the title of the article insinuates so: "One night of telescope time rules out black hole/dark matter".
So I wasn't sure if I had misinterpreted the article, or if the title was just confusing.
Edit: Maybe I wasn't clear. From the title, I got the impression that the observations ruled out both black holes and dark matter. But from the article, I only got the impression it ruled out the hypothesis that black holes were the force behind dark matter.
What's more, the article mentions that dark matter must already have been present primordially to explain the signature of the CMB. Therefore, if black holes make up the dark matter, they must have formed primordially.