Pioneer Anomaly Solved By 1970s Computer Graphics Technique (technologyreview.com)
218 points by _b8r0 on March 31, 2011 | 80 comments



This is a classic example of how the physics community has been failing for the last 30 years.

First of all, there's an extreme focus on papers that have come out in the last 1.5 years, so that a lot of very interesting older work is invisible.

Secondly, physicists don't look outside the discipline, despite the fact that we often use inferior techniques. Back in the 1990's, Mark Newman and I were both working at Cornell and both of us were aware that the techniques physicists were using to evaluate power law distributions were bogus. Well, I was a timid grad student and, despite being one of the best physicists of his generation who already had written half of an excellent textbook and had a stellar research record, Mark was a postdoc who spent most of his two years in absolute anguish about how he was going to find his next job.

Mark wrote a paper about this ten years later, after physicists had published thousands of bogus papers using bogus statistics. It's a tragedy that neither Mark, myself, nor some other young turk wrote it earlier -- and it wouldn't have been hard to do at all, because it would mainly have been a review of what was already in the statistics literature.


> Secondly, physicists don't look outside the discipline, (...)

Pardon? Phong shading is a simplified model of the results of a physical process -- light reflecting off certain kinds of surfaces (metallic ones, IIRC). It's imprecise, it has its limitations, etc. Contrast that with the very tiny, precisely measured acceleration of the Pioneer spacecraft. It is not very common to get good results from applying a coarse tool to a fine problem.
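
(For reference, a minimal sketch of the textbook Phong reflection model -- just the classic ambient + diffuse + specular formula, not whatever the paper's actual thermal model looks like; the coefficients here are made-up illustrative values.)

    import numpy as np

    def phong_intensity(normal, to_light, to_viewer,
                        ka=0.1, kd=0.6, ks=0.3, shininess=10.0):
        """Classic Phong model: ambient + diffuse + specular terms."""
        n = normal / np.linalg.norm(normal)
        l = to_light / np.linalg.norm(to_light)
        v = to_viewer / np.linalg.norm(to_viewer)
        diffuse = max(np.dot(n, l), 0.0)
        r = 2.0 * np.dot(n, l) * n - l   # mirror reflection of the light direction
        specular = max(np.dot(r, v), 0.0) ** shininess if diffuse > 0.0 else 0.0
        return ka + kd * diffuse + ks * specular

    # Example: light and viewer both 45 degrees off the surface normal
    print(phong_intensity(np.array([0.0, 0.0, 1.0]),
                          np.array([1.0, 0.0, 1.0]),
                          np.array([-1.0, 0.0, 1.0])))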


A big part of physics is the art of approximation. It's almost never possible to have a perfectly 'exact' description of a situation.

For instance, in introductory physics we have students work a number of problems involving objects falling under the influence of gravity. Air resistance is rarely considered, and if it were to be considered, approximations of some sort would be required, since there's no complete theory of turbulence.

Everything that involves a complex computer simulation, say molecular dynamics or fluid flow, involves approximations.

Even when you look at the simplest calculations involving elementary particles, all of these are intellectually justified by the renormalization concept that assumes that, at some very small length scale, the laws of physics that we know will break down -- but we know that QED, QCD and such are very good approximations.

The first virtue of an approximation is that it captures the qualitative character of a problem, after that, it's a matter of adding an increasing number of decimal places of quantitative accuracy.

More sophisticated models of radiation transport exist (these are critical to the development of H-bombs, etc.) and an obvious follow-up to this paper would be to use a better radiation transport code to validate the result.


Agreed, mostly.

> The first virtue of an approximation is that it captures the qualitative character of a problem, after that, it's a matter of adding an increasing number of decimal places of quantitative accuracy.

Unless the approximate model diverges from the actual process in extreme cases, such as at very high or very low values.

One thing that comes to mind, if only slightly related: `reciprocity failure' [1], an effect in photography where the usual model relating shutter speed (exposure time), film sensitivity, and scene brightness diverges from reality at extreme values.

In such cases, increasing precision of the model isn't just a numerical task (variable precision, iteration count, etc.).

----

[1] http://en.wikipedia.org/wiki/Reciprocity_(photography)#Recip...


> It is not very common to get good results from applying a coarse tool to a fine problem.

From the blog post, that isn't really what happened here -- the problem was accounting for reflections, and a technique was needed to estimate such effects. It doesn't matter that the anomalous acceleration was small compared to the total acceleration. It was of the same order as these reflection effects, and so only a "coarse" accounting of them was needed.


To nitpick, the discipline of physics covers everything - or it damn well ought to :) While it seems obvious after reading the article that solving the rendering equation is indeed a promising way to solve the Pioneer anomaly, I'm sure that I wouldn't have come up with the idea.


> To nitpick, the discipline of physics covers everything - or it damn well ought to

Everything in the universe. But there's still math.


Math consists of rules discovered by experiencing physics. There's no particular reason to think that math transcends the universe.


Math is just a meaningless game with abstract made-up rules. No relation to the universe necessary.


> Mark wrote a paper about this

I'm interested. Is it:

Power laws, Pareto distributions and Zipf's law, M. E. J. Newman, Contemporary Physics 46, 323–351 (2005). [link: http://arxiv.org/abs/cond-mat/0412004]


Actually, the paper you should be reading is this:

"Power-law distributions in empirical data", Aaron Clauset, Cosma Rohilla Shalizi, M. E. J. Newman http://arxiv.org/abs/0706.1062


Agreed that physics would hugely benefit from exposure to modern techniques in computer science, "software carpentry," and statistics. How can that be encouraged, though? It's not in the curricula, and those are ossified and hard to shift. Individual researchers forging out on their own can't move the field much themselves. Summer schools, maybe, which could create ad-hoc communities carrying skills back to their home institutions?

As to the temporal bias in literature awareness: it's sort of unavoidable in any research area of decent size. The scope of the literature is simply too enormous. Certain key papers get codified as canon and others are lost in the fog. The situation isn't helped by the various journal publishers, whose paywalls prevent an effective bibliometric universe which could be used as a discovery engine for old but relevant papers.


"Mark wrote a paper about this ten years later, after physicists had published thousands of bogus papers using bogus statistics."

Agreed, but to be fair a lot of non-physicists were also doing this. For example in their excellent review article http://arxiv.org/abs/0706.1062 Mark Newman, Aaron Clauset, and Cosma Shalizi show that a claimed power law in bytes received per http request was wrong. I'm not going to defend the sloppiness of physicists, but I would note that other fields don't fare as well in this either.

And I wish you or Mark had written that paper earlier too!


There's also the example of the medical researcher who apparently rediscovered the trapezoidal rule, and managed to get a lot of citations to his discovery.


Do you have a cite on this? Would love to read more.


Here you go: http://www.stat.columbia.edu/~cook/movabletype/archives/2010...

One of the comments on the original paper (the pubmed link that teraflop cites) did point out that it was "just" the trapezoidal rule, FWIW, so the only people one may want to find fault with are those who stopped at his paper and decided to cite it, rather than dig further.


Wow, that's just sad.


I remember seeing this when it showed up on Reddit a couple months ago.

http://www.ncbi.nlm.nih.gov/pubmed/8137688?dopt=Abstract


Your reply is premised on this being the right solution to the Pioneer anomaly. That remains to be seen. For instance the comment in http://www.technologyreview.com/blog/arxiv/26589/#comment-23... makes a good point about the method used.

I think your view of 'the physics community' is skewed by your own experiences and does not describe the actual situation in 'the physics community' very well. My experiences, perhaps because I was at a 'university of technology', are entirely different.


> the techniques physicists were using to evaluate power law distributions were bogus.

This sounds interesting, care to be more precise?


What do Turks have to do with it?



Agreed - they're still teaching undergrads Fortran '77 in 2011, as there's this weird conception in the physics community that no language since F77 could possibly even be arithmetically correct.


Somebody better update the people using it as an anti-science argument.

Edit: I'm serious. Conservapedia used to have an article about it and how it discredits the scientific model and whatnot. There's a huge knowledge gap behind these views, and we should do our best to eliminate it.



Interesting - Conservapedia throws me a 403 Forbidden when I try to access that link, or any of their other pages.

Anyone else having trouble? (Win 7, FF4, UK based ISP)


Hah, hilarious. I'm outside of the US and it 403's me too. When I proxy via the US, it's fine. Guess they don't want evildoers from those atheist countries getting this information.


We just don't want all you freeloading communists stealing all of our hard-earned capitalist bandwidth...


They don't even make an exception for the Brits.


As the man said…


I'm getting a TCP error now. It doesn't really matter. It would be much like rushing off to correct the entries in Uncyclopedia and Encyclopedia Dramatica, anyway.


But Uncyclopedia and Encyclopedia Dramatica don't claim to hold the universal truth or try to convince people to follow a specific lifestyle based on that.

On a side note, my GGP post earned me my 666th karma point. Very suitable.


You're probably right about that. Not everybody thinks "reluctant to incorporate new information" and has a little giggle when they see the name "Conservapedia".


Works for me (Chrome, Ubuntu, in USA)


The following appears to have been recently added to the page: "Most recently, a new model for reflections of heat produced on the spacecraft[5] has been proposed. While this greatly reduces the amount of unexplained acceleration, it does not appear to completely eliminate it.[6]"

[5] http://www.technologyreview.com/blog/arxiv/26589/

[6] http://www.planetary.org/programs/projects/pioneer_anomaly/u...


The key conclusion, with error estimates:

We performed 10^4 Monte Carlo iterations, which easily ensures the convergence of the result. The thermal acceleration estimate yielded by the simulation for an instant 26 years after launch, with a 95% probability, is

a(t=26) = (5.8 ± 1.3) × 10^−10 m s^−2.

... These results account for between 44% and 96% of the reported value

a = (8.74 ± 1.33) × 10^−10 m s^−2

(which, we recall, was obtained under the hypothesis of a constant acceleration) — thus giving a strong indication of the preponderant contribution of thermal effects to the Pioneer anomaly.

http://arxiv.org/abs/1103.5222
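
A quick sanity check of the quoted 44%-96% range -- just ratios of the interval endpoints, not the paper's actual statistics:

    # Thermal estimate vs. reported anomaly, both in units of 10^-10 m/s^2
    thermal_lo, thermal_hi = 5.8 - 1.3, 5.8 + 1.3      # 4.5 .. 7.1
    anomaly_lo, anomaly_hi = 8.74 - 1.33, 8.74 + 1.33  # 7.41 .. 10.07

    print(f"{thermal_lo / anomaly_hi:.0%} .. {thermal_hi / anomaly_lo:.0%}")
    # -> roughly 45% .. 96%, consistent with the quoted "between 44% and 96%"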


It's a bit disappointing, actually. I was hoping for another revolution in Newton's laws.


"The most exciting phrase to hear in science, the one that heralds new discoveries, is not Eureka! (I found it!) but rather, 'hmm... that's funny...'"

- Isaac Asimov


Agreed. Against all reason, I was kinda hoping that the Pioneer Anomaly would somehow lead us to FTL travel... Something like: "Hey, I think I figured out the Pioneer Anom... [pause...] Ooooohhhh!"


Well, the results have yet to be verified. Maybe it still will lead to FTL travel, and someone will solve this problem last year.


I was hoping that this would lead to us sending out plenty of extra-solar-system probes, so we could perform more wacky measurements like the Pioneer ones.

http://www.wired.com/wiredscience/2011/03/vanishing-dimensio...

I think the only qualitative way to guarantee research progress is through the expansion of the things we can sense/perceive. For example, if we want to understand space better, I don't think another near-Earth telescope is the answer, but rather several that float at the edge of our solar system.

I am also excited about IPN - the Interplanetary Internet by Tim Berners-Lee. So that there is a standard way for these craft to communicate and relay messages as they get really far out.

http://www.slideshare.net/vinodquilon/interplanetary-interne...


Occam's Razor strikes again!


So was I. But there is still hope - these results have not been verified yet. :)


Doesn't it seem like a slowdown caused by infrared light emitted from one part of Pioneer and reflecting off another part of Pioneer is kind of like powering a sailboat with a giant fan attached to the back of the sailboat?

I'm not a physicist by any means, but doesn't conservation of momentum apply to photon emission and absorption/reflection as well?


The original direction of the (infrared) light is irrelevant, only the end result matters -- the `net force', with reflections and losses factored in. And factoring in the reflections is exactly what the article is about.

So it's not like a sailboat with a fan; it's more like a jet airplane with thrust reversers [1] engaged. The exhaust stream ordinarily points rearwards; with the reversers engaged it's redirected frontwards, and the resulting `net force' on the aircraft (the original thrust, minus losses) is a decelerating one.

----

[1] http://en.wikipedia.org/wiki/Thrust_reverser


Ah, that totally makes sense. Apologies if that was a dumb question, I am a high school dropout after all. :)


The reason it's like a jet engine is that the engine "creates" wind - by burning fuel, the hot exhaust is the new wind. If a jet engine only moved wind (like a turbo prop), then a thrust reverser would not work.


EDIT:

wait, what? I'm convinced that a thrust reverser would (in principle at least, construction details be damned) work with a propeller engine.

In my understanding, even if the air stream is accelerated rearwards at first and only subsequently redirected frontwards, it would still produce a net decelerating force on the craft, without any need for increased temperature or extra mass from burned fuel. But I don't have a solid physics background; somebody please correct me.

It may be easier to visualize with a ducted fan, as used on some RC aircraft models patterned after jet-engined craft. I believe a thrust reverser would work in such a setup, minus friction and turbulence losses.


With a propeller engine, the "wind" is a side effect of the primary effect: the props getting "lift" in the forward direction, similar to how the wings themselves work. The prop itself is a wing [1], twisted so as to create even thrust along the blade despite the speed differences between the root and the tips.

Turbofans (and ducted fans) are different, as they really do produce thrust due to the particles they are throwing backwards (direct thrust).[2]

Turboprops can get reverse thrust by reversing the pitch of the propellers, but adding a thrust reverser behind them would not be enough to negate the forward lift (though it could disturb the airflow over the props enough to negate the forward lift - but that wouldn't be due to the redirected airflow itself).

[1] https://secure.wikimedia.org/wikipedia/en/wiki/Propeller_%28... [2] https://secure.wikimedia.org/wikipedia/en/wiki/Turbofan


Imagine a bent tube in a 'u' shape with a fan in the middle. Both openings of the tube point toward the front of the plane.

That's basically what a propeller engine with a thrust reverser is. And I think you can visualize how it will do nothing if you turn on the fan. The force from the air exiting and entering the tube will cancel out.


That doesn't sound correct.

Consider it as a mass flow problem. You're ingesting a certain mass of air m at a velocity v and releasing it in the opposite direction. Assuming no losses in the U tube, the only thing that changes is the direction of v. What matters is the total momentum, mv.

If the tube were straight, mv would point out the back and the airplane would go forward. With a U-shaped tube, you have instead -mv, which is the same magnitude of momentum but the opposite direction, and the airplane moves backwards.

Now that I think of it, it might be easier to visualize as a conservation of energy problem.
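
(A minimal bookkeeping sketch of the steady-flow momentum balance being argued about here; +x is the direction of flight, and the numbers are arbitrary illustrative values. Whether the intake stream carries much directed momentum is exactly the contested point, so all three cases are shown.)

    def net_thrust(mdot, v_in_x, v_out_x):
        """Force on the engine = minus the rate of momentum given to the air."""
        return -mdot * (v_out_x - v_in_x)

    mdot, v = 10.0, 50.0   # kg/s and m/s, illustrative only

    # Straight tube, still air drawn in (~no directed momentum), exhaust rearward:
    print(net_thrust(mdot, 0.0, -v))   # +500 N, forward thrust

    # U-tube (both openings forward), diffuse intake, exhaust forward:
    print(net_thrust(mdot, 0.0, +v))   # -500 N, reverse thrust

    # U-tube, if the intake really were a fully directed forward-moving stream:
    print(net_thrust(mdot, +v, +v))    # 0 N, the "forces cancel out" case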


Do you think so? The outward stream would probably be more directed than the inward stream, which comes in more diffusely. (Though I don't know if that matters.)

An actual experiment would be nice.


Maybe.

After writing the previous post, I realized it sounds a lot like a Feynman sprinkler. So now I'm less sure.


A difference in kinetic energy -- E(outflow)-E(inflow) -- of air would result in net force pointing forward (a decelerating force, if we consider a plane). Basically, we'd need the air to get accelerated forward more than rearward, producing net force.

Kinetic energy is E = (mV^2)/2

Obviously the mass of the inflow is equal to the mass of the outflow. We can achieve a difference in speeds if the effective cross-section of the (forward-pointing) outlet were smaller than that of the inlet.

That's very off-topic, anyway :^)


Don't you think that a momentum-based approach would be more suitable than an energy-based one?


You can power a sailboat with a giant fan. What's the problem? Don't point it at the sail though - just point it away from the boat. :)

But anyway the difference here is that the "wind" is not being "pulled" from external sources (like real wind), it's created by Pioneer. And that makes a huge difference.

With wind, the fan pulls the wind (which creates force), and also pushes the wind (which also creates force).

Next you impact the wind into the sail, which subtracts from our force (since the wind is pushing in the wrong direction).

And finally the wind "bounces" off the sail, also subtracting, and leaving you with no net force.

i.e. +2 then -2.

With pioneer one step is missing - the force from pulling the wind (since the photons are created on the fly). So you have +1 then -2, leaving you with a net force of -1.

PS. If you don't understand why pulling and pushing the wind counts as two forces, think of a compressor which sucks in air and stores it. It still causes motion by pulling the air, but it doesn't push it.


Yes. But the equipment on the back is radiating heat (roughly speaking) in every direction, as I understand it, so absent a reflector, the equipment would have a net zero effect on thrust. But because some of the heat is being reflected, it's contributing a non-zero amount of thrust.
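
(A rough order-of-magnitude check, with ballpark figures that are not from the paper: directed radiation of power P produces a recoil force P/c, so only a small fraction of the RTGs' few kilowatts of waste heat needs to end up reflected anisotropically to explain an acceleration of order 10^-10 m/s^2.)

    c = 3.0e8         # speed of light, m/s
    mass = 250.0      # rough Pioneer spacecraft mass, kg (ballpark)
    a_anom = 8.7e-10  # reported anomalous acceleration, m/s^2

    force_needed = mass * a_anom             # ~2.2e-7 N
    equiv_directed_power = force_needed * c  # power giving this force if radiated
                                             # entirely in one direction
    print(f"force ~ {force_needed:.1e} N, "
          f"equivalent directed power ~ {equiv_directed_power:.0f} W")
    # ~65 W, i.e. a few percent of the RTGs' kilowatts of waste heat, so a
    # modest front/back asymmetry in emitted plus reflected heat is enough.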


What I appreciate about this is the straightforward manner in which the story presents the case for always challenging your/the assumptions.

Speculation about a whole new aspect of theoretical models of the universe, and then someone's grounded enough to go over the work and realize, 'Hey, you're doing the math wrong!'

I wish science education did a better job of teaching this.


> Of course, other groups will want to confirm these results and a team at the Jet Propulsion Laboratory in Pasadena, which has gathered the data on the probes, is currently studying its own computer model of the thermal budgets.

Here's to hoping that the numbers match. Phong shading to numerically calculate the effects of what is almost a solar sail decelerating Pioneer - now that's cool science!


It also goes to show no scientific research (if performed rigorously and honestly) should ever be derided as being of little use. One wouldn't expect a graphics (!) algorithm to solve a (seemingly) physical problem.

Same as one wouldn't expect some obscure branch of algebra to give birth to modern asymmetric cryptography ;-)


Yes, it's quite awesome. But to be fair, it's a graphics technique designed to calculate a physics problem.


I am not persuaded. You are essentially saying that any attempt to steer (rigorous, honest) research or research funding in the direction of maximum usefulness is futile. Pointing out instances in which predicting the effects of research is tricky does not get you all the way to convincing me of that.


> (...) is futile.

A negation of `should not be derided / may turn out to be important' doesn't read as `is always unimportant'. I agree with you wholeheartedly that pointed research is important.

Anyway!

Maximum usefulness on what timescale? [1]

To judge the impact of future use, you'd pretty much have to both 1) invent all the important uses on the spot, and 2) judge the market impact of each -- including any ripple effects and any combinations with other technologies that may at first seem totally unrelated [2]. Good luck with doing that in a reliable way.

The honesty of the research itself will be at risk if there is any incentive tied to it [3].

----

[1] E. W. Dijkstra makes some pretty sound comments on timescale of research: http://www.cs.utexas.edu/users/EWD/transcriptions/EWD11xx/EW...

[2] a semiconductor, by itself, is a high-impact thing. Computer Science (the theory, without the machines) by itself has little impact on anything. The two combined give the -- unbelievably transformative -- computer :-)

[3] http://en.wikipedia.org/wiki/Oil-drop_experiment#Millikan.27... -- subsequent scientists, instead of reporting objective results of their experiments, tended to skew the data (more or less unintentionally) towards the original reference value (which was eventually found to be off by a good bit). The only incentive here was consistency -- a pretty subtle one!


>To judge the impact of future use, you'd pretty much have to both 1) invent all the important uses, 2) judge the market impact of each

We are in disagreement here about how prediction works, even after your use of the qualifiers "pretty much" and "important" is taken into account. (I should add that I am not saying that there is nothing to your assertion in the grandparent, just that it is not the whole story and is too pessimistic about the human ability to predict.)

Parenthetically, I once got into a similar disagreement on Less Wrong -- with someone who claimed that the only way to predict any aspect of the outcome of a computer program was to run the program. So let me get your opinion on that, and let me use Dijkstra to choose an unambiguous question to ask you:

Dijkstra claimed that a person could arrive at a high confidence that a program has particular useful properties (e.g., that a program that keeps track of balances in bank accounts obeys the "law of the conservation of money") without ever running the program, but rather by developing a proof of the "correctness" of the program (using techniques whose development forms a large part of Dijkstra's reputation) at the same time one develops the program. Do you disagree with Dijkstra? I'm interested in replies from others too.


I agree with him in a limited scope :D

A certain class of programs is written with the explicit (or at least implicit) goal of having provable properties -- like (hopefully) the program a bank uses to track account balances.

However, a seemingly much easier problem (the Halting Problem [1]) is undecidable in the general case. One can find out at least some properties of some programs some of the time with certainty (modulo mundane mistakes). But one cannot find all the properties of all the programs all of the time.
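
(The standard diagonalization sketch, for anyone who hasn't seen it; `halts' stands for the hypothetical decider that the argument shows cannot exist.)

    def build_troublemaker(halts):
        """'halts(prog)' is the supposed total, correct halting decider."""
        def troublemaker():
            if halts(troublemaker):  # if the decider says "it halts"...
                while True:          # ...loop forever instead,
                    pass
            return                   # otherwise halt immediately.
        return troublemaker

    # Whatever halts(troublemaker) answers, troublemaker does the opposite,
    # so no such decider can exist for all programs.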

Now perhaps programming isn't the best model of all human activities, but! If we were to narrow the discussion down to research on Computer Science alone -- it is proven (via the Halting Problem) that you can't know all the outcomes of every possible program ahead of time. Which means research on some (possibly valuable) programs can't be graded with 100% certainty ahead of time, because there is no way of knowing all the properties of a program in advance.

Back to the original topic, my point was -- if somebody invested effort into (honest, rigorous) research, it should not be derided, ever, as it may find unexpected, valuable uses. I don't claim anything about research pointed towards commercial (or other) goals, except the general ``it's important, too -- but it's not the only path we should follow''.

Back to the matter of predicting: I am convinced that, in general, you can't predict some of the outcomes of research -- nor some applications of the results. Moreover, I believe that many of the innovations and discoveries lie among what can't be predicted.

Now economic predictions (this will sell / this won't sell) are wrong some of the time. To prevent a wrong prediction from blocking research, there is usually a pool of money for `blue-sky research', generally realized as countries sponsoring academia and individuals financing research out of their own pockets.

It may be very hard to estimate the effects of mis-predictions on research, due to this continuous financing -- financing that's independent of whether there is a clear, short-term goal for the research.

EDIT: Rice's theorem, posted by sid0 [2], is a much better example.

EDIT2: as a funny corollary to the Halting Problem, in some cases even running the program won't give you a definite answer -- the program may go on endlessly. Running the program is not a fool-proof solution, thus not a general solution.

It follows that your discussant was proven wrong (or, to be exact, not completely right) preemptively -- by Turing's proof of the undecidability of the Halting Problem ;-)

----

[1] http://en.wikipedia.org/wiki/Halting_problem

[2] http://news.ycombinator.com/item?id=2392186


Thanks for the reply.


The person was talking about Rice's theorem [1], I believe. (It follows rather quickly from the unsolvability of the halting problem.) Proofs of program correctness might be exact for some programs, but Rice's theorem implies that you will never be able to come up with exact proofs for every program, and must rely on approximations. The entire field of program analysis and verification is dedicated to finding better approximations.

[1] https://secure.wikimedia.org/wikipedia/en/wiki/Rice%27s_theo...


>It also goes to show no scientific research (if performed rigorously and honestly) should ever be derided as being of little use.

You changed this sentence after my reply (in a sibling to this comment).


With no ill intentions. I'm struggling with expressing myself in English, and often correct posts after submitting -- hopefully in minor ways; perhaps here I've overstepped the bounds.


I did not and do not perceive you to have done anything terribly wrong or dishonorable.


Group Theory is no more an "obscure branch of algebra" than "algorithms" is an obscure branch of Computer Science.


I think he was referring to number theory, which would be a lot more obscure if not for public-key cryptography.


Number Theory is not obscure, and was not obscure before cryptography. There's a reason why they're called "Gaussian Integers", and "Fermat Primes". It's been around for a long time, and studied by most mathematicians. It's also part of the California State Standards.

I can only assume that the definition of 'obscure' here refers to the vast majority of a mathematics undergraduate curriculum. Which is absurd.

As an aside, one of the key ingredients of public-key cryptography, Fermat's little theorem, a^p ≡ a (mod p), is a group-theoretic concept.
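
(A quick empirical check of the statement, for small cases; Python's built-in pow(a, p, p) computes a^p mod p.)

    # Fermat's little theorem: a^p ≡ a (mod p) for every integer a and prime p
    primes = [2, 3, 5, 7, 11, 13, 101]
    assert all(pow(a, p, p) == a % p for p in primes for a in range(1, 50))
    print("a^p ≡ a (mod p) holds for all tested cases")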


> I can only assume that the definition of 'obscure' here refers to the vast majority of a mathematics undergraduate curriculum. Which is absurd.

It's not absurd at all. "Obscure" is relative to a given population. The relevant population here is not "mathematicians". The majority is indeed obscure to those outside the field. If you use "inside the field" you get absurdities such as homology being counted as not obscure.

Calculus and geometry are not obscure. Many non-mathematicians have heard of these, and some even use them. Both group theory and number theory are obscure. I say this as someone who uses the representation theory of groups regularly.


Number theory was obscure in the sense that nobody had found any use for its more advanced theorems.


Wouldn't ambient occlusion have been a better technique?


Ambient occlusion (as the term is typically used) is a way to approximate local reflection of uniform inbound (aka ambient) light. It does this by measuring how much of the exterior environment is visible at each point, as a way to modulate an ambient lighting term.

It doesn't really simulate how light bounces locally around an object, nor light emitted by an object. Rather than a light transport simulation it's more of a clever CG approximation hack that looks great and is pretty cheap.
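
(A minimal Monte Carlo sketch of what ambient occlusion computes at a single point -- the unoccluded fraction of the hemisphere above it. The is_occluded ray query is a stand-in for whatever the renderer actually provides.)

    import math, random

    def ambient_occlusion(point, normal, is_occluded, samples=256):
        """Estimate how much of the hemisphere above 'point' is open to the sky."""
        open_rays = 0
        for _ in range(samples):
            # Random direction on the unit sphere, flipped into the hemisphere
            d = [random.gauss(0.0, 1.0) for _ in range(3)]
            norm = math.sqrt(sum(x * x for x in d)) or 1.0
            d = [x / norm for x in d]
            if sum(a * b for a, b in zip(d, normal)) < 0.0:
                d = [-x for x in d]
            if not is_occluded(point, d):
                open_rays += 1
        return open_rays / samples   # 1.0 = fully open, 0.0 = fully enclosed

    # Toy usage: nothing blocks any ray, so the result is ~1.0
    print(ambient_occlusion((0, 0, 0), (0, 0, 1), lambda origin, direction: False))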

But to the real point I think you were trying to make: yes, there have been a lot of strides in simulating light since the 70's, and more advanced techniques exist that should give more accurate results.


Yeah, I'm not sure that Phong Shading is really the takeaway here. It seems that it's more like they went from a local "illumination" technique to a full global "illumination" technique to take into account secondary bounces.


This was my impression, too.


There is a detailed, concise article on how the model was prepared: http://www.planetary.org/programs/projects/pioneer_anomaly/u...



