The "dark star" hypothesis given a finite speed and momentum of light (particle nature) combining with the escape velocity of stars isn't impressive by itself.
But his scientific reasoning and methods very much are.
He carefully conjectured what could be observed, e.g. binary stars, while amassing data points. Naturally, in a logical progression, he looked for stars that behaved as if they were in a binary system but whose partner, i.e. a "dark star", could not be seen because its mass would be so great that light, with its finite speed, couldn't "escape". Brilliant.
He also built the "torsion balance" apparatus for measuring the mass of the Earth, which Cavendish later inherited and used in the famous "Cavendish experiment", yielding an accurate measurement of the gravitational constant: the experimental basis for further advances into the nature of gravity.
Quite impressive.
And he did not yell about it or seem to care about what nowadays would be called the "impact factor". His impact was quite of another nature.
The thing is, academia has always been rather dumb at its main job: analyzing the validity of research. Meaning that, even if the research proves its validity on its own, academia fails to pick it up, preferring to wait for secondary proofs.
It takes a certain kind of person to go against the flow. Einstein certainly did, and it sure wasn't a smooth ride. Even in our days, Geoffrey Hinton wasn't allowed to present on deep learning at AI conferences in the early 2000s, even though the exact same theoretical bases existed then as they do now.
Not related to the discussion, but I love these verbal dualities.
give/take
inherit/bequeath
and so forth!
-------
What fantastic reasoning! I am constantly amazed by early and current astronomers, able to work out fairly, and increasingly, accurate cosmological views from pinpoints of light.
Give/take duality is significant in the study of the Proto-Indo-European language. IIRC, some root words took on opposite meanings in their descendant languages. Some have gone so far as to speculate that this suggests reciprocity was an important cultural expectation around gift-giving in PIE society.
I'm in New Zealand and I've never heard that; I feel like I might have heard it somewhere when I lived in England though. There are definitely oddities in NZ though; the NZ press uses the word "trespass" in a very strange way that seems to be somehow backwards: https://www.stuff.co.nz/timaru-herald/news/126753711/timaru-...
One I've noticed is US vs UK/NZ/AU/...: "bring him with you" vs "take him with you", which has a directional component in British usage that I can't quite explain (away from the speaker maybe?) that Americans don't have.
From a nearby reply there is a clue that it's a new Americanism. I have to resist the urge to send a complaint... I'm not ready to become an old geezer who complains about things changing. But I admit that I grit my teeth when people talk about their 'gas bill' and I genuinely don't know whether they're talking about petrol or heating gas, or say "gotten" instead of "got", etc etc; there's no way to say "but we're supposed to use British words here" without sounding like an arse...
French has the directional component specifically for people (amener vs emmener) which could have influenced British English like many other words and phrasings, or it could have just been lost/simplified in American English.
"An Inquiry into the Probable Parallax, and Magnitude of the Fixed Stars, from the Quantity of Light Which They Afford us, and the Particular Circumstances of Their Situation."
I studied original historical scientific documents extensively for several years, and I tend to find myself writing very much in that style when I do anything that requires significant exposition.
Even if it's something boring like remediation of a building structural issue.
I don't think paper titles are usually pretentious, they're just dense with jargon to be terse. They're impenetrable if you don't know the jargon, but I don't agree that makes it pretentious. I don't think it's possible for a journal paper in an established field to be simultaneously precise, accessible, and unprecedented enough to be worth publishing.
This was a standard convention in 17th/18th century novels -- it's not just Defoe. A lot of authors played with this convention by describing chapters in a technically accurate but misleading fashion.
A massive proportion of all titles were like this. Take a look at novels or political pamphlets some time. They're just three-sentence-long summaries, and they're all delightful.
I think the exact opposite: it's unnecessarily embellished and ornate. It makes the whole thing sound so whimsical as to not be worth reading. I don't need a fairytale title to be interested in the content.
>> There is no surviving portrait of Michell; he is said to have been "a little short Man, of a black Complexion, and fat".
Only if you assume that describing someone as having a "black Complexion" sometime between 1731 and 1764 has the same meaning as an American using the phrase in 2021 (e.g. having Sub-Saharan African ancestry). I think there's a fairly high likelihood that assumption is wrong.
> 43. Account of the Woodwardian professors of fossils, at Cambridge, viz. Dr. Conyers Middleton, Dr. Charles Mason, Mr. John Michel, and Dr. Ogden, 1731-1764. pp. 156, 157.
Colors have many shades; Beethoven was often said to be 'swarthy' (dict: dark-hued or dark-complexioned) by contemporaries, sometimes called 'Spagnolo' or 'Moorish'. Gossip aside, it tells us nothing useful in the context of his achievements.
Don't know that we can really read contemporary understanding of the word black into this quote. I think given the context of England at the time, as well as the description of his family background, it is very unlikely that he was black.
I was wondering about that. Is there a way to verify? It would be really interesting if he were, and it would be equally interesting if there was some way to show conclusively that he wasn’t.
It’s a tricky subject, but it must be possible to approach it with scientific curiosity and clinical detachment...
1. How would an English clergyman in the first half of the 1700s typically describe someone who was of African descent? This appears to be the guy who wrote the description: https://en.wikipedia.org/wiki/William_Cole_(antiquary). My intuition is that he wouldn't just use "black," but would probably use something more specific, like "African" or "negro."
2. At that time, would a "black" person (in the modern sense) have been accepted into the kinds of positions John Michell had (like being a member of the Royal Society), with so little comment on his race?
Black people were very rare in Britain at the time, and this term was used very broadly for people like Italians, those of Iberian descent, Jews, etc.
I doubt there is any way to show "conclusively" that he wasn't, but I would say the evidence is pretty strongly against it. Someone I quoted in another reply provides more reasoning on what that evidence is.
It wasn't until 1786 that the British government first employed one of the (relatively few) black men in a minor role, yet I'm supposed to believe, based on a contemporary reading of this text (before "black" was anywhere near as widely used to mean what it does today), that this (at least 3rd generation) British professor at Cambridge was black?
Highly doubtful. I'm not sure what statistics you are basing that on.
Here's a more reputable source than me:
> An account apparently purports him to be "a little short man, of black complexion, and fat", though we have been unable to locate any specific contemporary source. However, such words even if used do not necessarily indicate he is of black-African descent, as the term can refer also to Spaniards, Italians, Greeks, Arabs, Ethiopians, or Jews. Jonathan Swift uses the expression "a tall, thin, very black man, like a Spaniard or Jew." In the given context it is almost certainly intended as a slight against Michell by painting him as something he was likely not. The Royal Society famously refused election to Jamaican scientist Francis Williams (1702-1770), on account of his complexion, and it almost certainly would not have elected a black man as early as 1760. Moses Da Costa became the first Jew elected to the Society in 1736, and a second was elected in 1747; the first female was not elected until 1945. The earliest black individual we could determine that attended Queens College, Cambridge was an American, Alexander Crummell, who graduated 1853.
The main text of that entry appears to be a slightly-reworked version of the public-domain 1911 Encyclopaedia Britannica, but the Google Books image doesn't have the footnote you quoted. I'm guessing it's original to that NNDB site, which probably started with the encyclopedia like Wikipedia did.
Not about the article's content, but about the interestingengineering.com web site: it sucks. It consumes 100% CPU, tries to play multiple videos, and takes hundreds of megabytes of memory.
This must be the same John Michell who gave Henry Cavendish his pendulum to measure the density of the Earth :
"Many years ago, the late Rev. John Michell, of this Society, contrived a method of determining the density of the earth, by rendering sensible the attraction of small quantities of matter; but, as he was engaged in other pursuits he did not complete the apparatus till a short time before his death, and did not live to make any experiments with it. After his death, the apparatus came to the Rev. Francis John Hyde Wollaston, Jacksonian Professor at Cambridge, who, not having conveniences for making experiments with it, in the manner he could wish, was so good as to give it to me."[1]
The speed of light can be determined using only classical mechanics and experiments. The same goes for gravity and the gravitational constant. Predicting the relativistic effects of high speeds and huge masses, however, is another thing.
Isn't it a bit like getting to the correct answer by incorrect reasoning? I was thinking that knowledge of how space curves in the presence of high energies/masses, and the concepts of time/length dilation, is necessary to come to the conclusion that light will be slowed down below its normal speed in certain circumstances?
Maybe it's a bit similar to, e.g., Giordano Bruno's claim that stars are similar to our sun and that there are planets around them, made not on the basis of observations and data analysis but because of philosophical reasoning relatively unrelated to astronomy.
The corpuscular theory of light was one of those theories that was wrong but useful. The successor wave theory was also wrong. It may turn out that general relativity is founded on incorrect reasoning ("God does not play dice" for one), but that matters much less than it being fantastically useful.
The basic idea- if light interacts with gravity, if you pack enough mass into a region you will get a dark star- is correct. It just turns out that the interaction can't be described by treating light as little particles and calculating by Newton's equations.
There have been lots of wrong-but-useful theories, like phlogiston or arguably the luminiferous aether.
It's not founded on incorrect reasoning, which would imply a logical inconsistency or error in the deductions, but it could be founded on false postulates. (A postulate is like an axiom except that it is held to be true in the real world, as opposed to being abstractly true.)
We should bear in mind that, if the history of science is any guide, in 250 years' time relativity will probably seem pretty badly flawed as well, leading some future person to criticize our science for "getting to a correct answer by incorrect reasoning." (If we're lucky. It might just seem wrong.) So we'd do well to be charitable in crediting our predecessors' achievements.
Your comment seems to imply that, for example, the current consensus is that Newtonian gravity is 'pretty badly flawed' now that we understand it to be superseded by general relativity. This is not true; Newtonian gravity is considered to be a very good approximate theory for low velocities and field strengths.
If we use relativity to explain certain phenomena and the theory fits, but it turns out in the future that there's a much more complete explanation, then people will probably look back and say that relativity was the wrong steps to the right answer, even though those "wrong" steps are serving us quite well right now.
Relativity will probably suffer the same fate as classical mechanics: true for small values of v. In relativity's case, the small value will probably be about 0.99999999c.
Given what we know today, it seems that it's much more likely that special relativity will remain more or less as we know it today (a good approximation for relatively flat regions of space-time and non-accelerating bodies), and that GR will be a good approximation for something like large regions of space-time and huge masses (where "large regions" may mean anything larger than a few Planck lengths, and "huge masses" may mean anything greater than a few micrograms).
The Unruh effect can happen in flat spacetime, for example, and is a useful check on Hawking radiation (in a black hole curved spacetime).
It's also interesting in understanding the weak equivalence principle in detail; very loosely, a small-mass object can rest quietly on the surface of a non-compact, non-spinning, spherical mass eternally, while it is at least very difficult to keep the same small-mass object accelerating uniformly in flat spacetime for even fairly short (compared to say the age of the universe, which is hardly eternal) finite times.
Completely agree. It's a textbook example of hindsight bias when people cannot "unsee" the currently accepted theory deemed to be "true" and fail to imagine that people in the past thought just like them before the new theory came along. I think it takes a good amount of marketing (e.g. Einstein with relativity) to change people's minds if a new, improved theory is to be accepted and for the majority to conform to it.
So let's indeed be very charitable in crediting our predecessors' achievements.
What do you mean by marketing? Einstein didn't need much convincing, as he proposed experiments that would falsify or confirm general relativity. The theory is also not only mathematically sound but very elegant. Actually, Hilbert arrived at the same equations using a completely different, more mathematical approach via the least action principle, which is another good sign it was correct. But the smoking gun was of course Edison's observation of gravitational lensing during an eclipse.
But of course the topic in this subthread is special relativity. And I think the speed of light being constant and a maximum speed is one of the most fundamental and most assured facts about the universe that we know, because it's more than just some speed of some particle. It's a geometric property of spacetime, a Lorentzian manifold. I doubt it will ever be looked upon as some sort of horrible blunder, just like no one ever thinks less of Newton just because he only found an approximately correct theory. I think it's foolish to doubt it just to be contrarian.
Could it be wrong? Sure, we look for violations of Lorentz invariance all the time. But the bounds we have on it are extremely good. And if someone claims to have observed superluminal speeds, I'll doubt the experimenters more than I'll doubt special relativity, because of the strong priors we have on it. This happened, by the way, a few years ago, when an Italian lab claimed to have seen superluminal neutrinos. Most professional physicists didn't believe it. It turned out to be a faulty cable.
> the smoking gun was of course Edison's observation of gravitational lensing during an eclipse
Actually, the fact that Eddington's (not Edison's) eclipse observations were treated as a "smoking gun" was an example of marketing, since we now know that Eddington's actual data was simply not good enough to resolve the effect in question (bending of light by the Sun). The reason the claimed result was accepted without question was Eddington's prestige, i.e., marketing. If it had been someone with less prestige the results might have been examined more critically.
and while maybe you could say it was about gravitational lensing, the observatories (and Earth) were so far from any possible caustic that it's better to leave it at just deflection of light.
I would expect a lot of incorrect past models of science and its precursors to still be based on (mostly) correct data, so bad guesses were still informed bad guesses at least.
I suppose it's impossible to ever really know what the true nature of reality is. We only have models which have an accuracy of x%. It's quite possible, dare I say likely, that e.g. quantum mechanics as we understand it today is just an approximate model of how things really are. I suspect that the fundamental nature of reality is probably at a scale too small to be testable anyway.
Not really. You can predict gravitational light bending with Newtonian physics and a corpuscular approach to light. It gives you half of the relativistic value, because the assumption is that light as a wave is unaffected by gravity due to there being no gravity-EM coupling.
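(Not from the parent comment, just a rough back-of-the-envelope sketch with standard constant values to make the factor of two concrete: the Newtonian/corpuscular deflection for a ray grazing a mass M at impact parameter b works out to 2GM/(c^2 b), while GR gives 4GM/(c^2 b).)

    # Rough sketch (my own numbers, standard constants): Newtonian
    # (corpuscular) vs GR deflection for light grazing the Sun's limb.
    import math

    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8        # speed of light, m/s
    M_sun = 1.989e30   # kg
    R_sun = 6.957e8    # m, impact parameter ~ solar radius

    newtonian = 2 * G * M_sun / (c**2 * R_sun)   # radians
    gr = 4 * G * M_sun / (c**2 * R_sun)          # radians

    to_arcsec = 180.0 / math.pi * 3600.0
    print(f"Newtonian: {newtonian * to_arcsec:.2f} arcsec")   # ~0.87
    print(f"GR:        {gr * to_arcsec:.2f} arcsec")          # ~1.75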
If you haven't already, check out the Demarcation Problem and the work of Feyerabend. Physics is such a philosophically tenuous science that its theories are perpetually incorrect reasoning.
Quantum Mechanics most likely is not the true explanation of what is going on either, but it sure as hell accounts for a lot of it and is why we have modern computers.
I can't say for sure on this particular case, but the usual reasoning is this:
Using Newtonian mechanics (I'll skip the derivation) you can calculate escape velocity as follows: v = sqrt(2GM / r)
r is the distance to the center of mass, G is the usual gravitational constant and M is the mass of the body.
So, now we can reason about an object whose escape velocity is c.
c^2 = 2GM / r and rearranging a bit:
r* = 2GM / c^2.
Thus, IF we packed a mass M into the radius r*, we would have a Newtonian black hole.
It turns out, quite incredibly, that this r* is in perfect agreement with the Schwarzschild radius of a non-rotating black hole calculated using general relativity. Exactly why these two are in such agreement has never been explained to me other than as a coincidence. But it's quite incredible, since usually r^3 terms pop up prior to the event horizon (at the same radius) which make GR give different predictions about orbits (such as the precession of Mercury).
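(As a quick numerical sketch of my own, with standard constant values, plugging a couple of familiar masses into that r* = 2GM/c^2:)

    # Sketch: r* = 2GM/c^2, the radius at which Newtonian escape velocity
    # reaches c -- numerically the same as the Schwarzschild radius.
    G = 6.674e-11    # m^3 kg^-1 s^-2
    c = 2.998e8      # m/s

    def dark_star_radius(mass_kg):
        return 2.0 * G * mass_kg / c**2

    print(dark_star_radius(1.989e30))   # Sun:   ~2.95e3 m  (about 3 km)
    print(dark_star_radius(5.972e24))   # Earth: ~8.9e-3 m  (about 9 mm)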
I can see an interesting and controversial entertainment series based on an alternative-history retelling of Einstein, in his role as patent clerk, reading John Michell's work, and that being the seed which inspires relativity. Tie the work of Michell and his world to Einstein's. The juxtapositions of time and culture, 1770s Britain versus early-20th-century Germany, would be quite entertaining on their own.
From a newtonian perspective you can imagine that gravity emits a force between all pairs of massive objects. That's a good way to look at it for small values of mass. It works pretty well on the scale of our whole solar system. This is what you're imagining when you say "not by gravity but by spacetime curvature".
But gravity is spacetime curvature. It is manifested by changes to spacetime itself made by that mass. For very large values of mass (or density in this case) that newtonian approximation doesn't hold according to experiment. We need a different model and the model that we've found that works is that mass changes the shape of spacetime. Its presence changes the nature of straight lines and the paths that you can follow (geodesics).
A black hole causes all possible paths through spacetime, paths that light must follow because it travels through spacetime, to point into the black hole (https://en.wikipedia.org/wiki/Black_hole#Event_horizon). Light must follow straight lines, but because the space containing those lines is curved, there don't exist any of those lines that point outside of the black hole. Once you've crossed the event horizon, space itself doesn't have any straight lines that point outward; there are no paths that you can follow.
It's even weirder than this because we're mostly familiar with motion through space so that's probably what you (and I) are imagining, but this is motion through spacetime. So really the correct thing to say is that there exist no 4-d paths that point outside of the black hole that are also in the future. And that's even weirder than it sounds because you asked about light specifically and light doesn't experience time the same way you and I do. So our intuitions and our ability to map reality to the nice charts that we're used to looking at are pretty limited here.
Seems like he wasn't a "lowly country rector" since he had a masters degree from Cambridge, was a member of the Royal Society and had many other scientific accomplishments. The rectorship was apparently just a perk that came with being a successful scientist.
I wish bloggers wouldn't try to create a false impression of things like that. It's disinformation - intended to mislead the reader.