Black Hole Tech? (stephenwolfram.com)
140 points by superfx on Feb 23, 2016 | 36 comments



Normally I skip articles by Wolfram because all he does is spout about how good Mathematica is. But filter out the free advertising for Mathematica and you have a pretty decent article that starts out discussing gravitational waves and continues on to cover some neat results about gravity. I'm not sure we'll be creating a stable lattice of bodies all orbiting each other any time soon, but it was fun to read about. If you are moderately talented at math/physics you can probably understand most of this article. No PhD required, and the animations are fairly informative.


My father was a scientist by profession. I remember how much he talked about SMP when his branch of NRL got it. Later he learned a bit of Mathematica. Anecdote is not data, but when Wolfram claims that scientists use Wolfram Research's products to investigate relativity, I tend to believe him. Not just because it conforms to my experience but because he is in a position to know.

I have found that pretty much all of what Wolfram writes is closer to this article than the popular internet opinion. I suspect that that's because his tools scratch his intellectual curiosity...they exist so that he can start with the sort of math that starts this article and go from there...or write a book like A New Kind of Science.


Some universities pay for a Mathematica site license. I have used the software since version 6. It's really good for prototyping (my whole MSc was done in Mathematica), but the biggest problem for me is that code re-use is a total mess.

It's easier to teach someone to write some Python and use a Git GUI than it is to teach a new student (who is not a programmer) how to properly use Mathematica with version control. Notebooks are, by design, made to be interactive and promote an incremental, playful discovery process. That is intuitive and students pick it up very quickly.

This means that code gets passed around as (in my experience) bloated, poorly written, poorly documented, and inefficient (no FP, all imperative) notebooks containing multiple "orphan" sections that were just quick hacks to see if something worked. Of course you can learn to write packages and good documentation that integrates flawlessly into the Documentation Center, but that takes time and effort most scientists would not be willing to give.


I agree. I actually love Mathematica and the Wolfram Language but its main problem is the lack of any convenient ways to package it, track it in git, edit it in vim, etc. I spent some time investigating how to do that but it all just seemed really opaque and complicated. That's one of the risks of using proprietary software, I suppose.


You can do that with packages but not (at least not easily and conveniently) with notebooks. So basically your functions can reside in a package, but your computation has to reside inside notebooks.

It is possible to write notebooks directly in a text editor and then have them run. That just gave me an idea for a weekend project...


I'm sure it's possible, but is it straightforward how you're supposed to do this properly? And is there any way to access the Wolfram kernel from the command line (like a python executable)?

Update: I guess I owe the Wolfram Language another look.
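
For anyone else wondering: the kernel can apparently be driven from the shell like any other interpreter. Here's a rough sketch of what I'm planning to try from Python. I'm assuming the `wolframscript` launcher here (older installs seem to expose `math -script` or `MathKernel` instead), so treat the exact command name as a guess rather than gospel:

    import subprocess

    def run_wolfram(code):
        # Evaluate a Wolfram Language expression in a fresh kernel, return stdout.
        # "wolframscript" is an assumption; substitute whatever launcher your install has.
        out = subprocess.check_output(["wolframscript", "-code", code])
        return out.decode().strip()

    # e.g. symbolic integration from the command line, no notebook involved
    print(run_wolfram("Integrate[Sin[x]^2, x]"))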



Ahh, well alright then :). Thx for the info.


Have you observed similar usage patterns with Jupyter/IPython notebooks?

I like the playful-discovery process concept, but the difficulty in versioning and sharing seems like a fascinating UX challenge that comes along with it... so I'd be deeply interested in any ideas you might have on what influences that balance (positively or negatively) :)


I have only used an IPython notebook once and have no opinion on it. I still use Mathematica. I'm thinking about a small weekend project... I'll post back if it works out.


I'd love to hear about it! It's a subject near and dear to my heart at the moment. My email's discoverable through my profile.


Indeed. But the book was nothing to write home about.


To a first approximation, nobody writes home about science books. These days everything is a juried paper.


Oh, the problem is that Wolfram portrays his book as the ultimate breakthrough, even though at best it's an entertaining pop-sci tome.

See http://arxiv.org/abs/quant-ph/0206089

Also funny, from the same author:

> “The impact of NKS on all the areas of computer science and physics I’m familiar with has been basically zero,” he says. “As far as I can tell, the main impact is that people now sometimes use the adjective ‘Wolframian’ to describe breathtaking claims for the trivial or well-known.” [Martin] Davis offers a sunnier take: “The book has a lot of beautiful pictures.”


When the book was finally published, demand was low, with an initial print run of 400 failing to sell out.

https://en.wikipedia.org/wiki/De_revolutionibus_orbium_coele...


Sorry, I don't understand. What's your point?


He has another interesting article in which he speculates about how spacetime might emerge from connected nodes (graphs).

http://blog.stephenwolfram.com/2015/12/what-is-spacetime-rea...


Same feeling. Apart from the occasional Wolframism ("my Rule 30"), this is a really informative read.


He also harps on how cellular automata will replace (or complement) a lot of differential-equation-based physics, but I haven't seen many significant examples. He wrote a whole book on that and refers to it a couple of times in this blog.


I think that a cellular automaton can replace any differential equation. That doesn't necessarily make it a more correct (or more useful!) way to look at things.
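
To make that concrete (a throwaway sketch of my own, nothing from the article): the standard explicit discretization of the 1D heat equation is already just a local update rule on a line of cells, i.e. a continuous-state cellular automaton.

    import numpy as np

    def heat_step(u, alpha=0.2):
        # One lattice update: each cell looks only at its two neighbours.
        # alpha = D*dt/dx^2 must stay <= 0.5 for the explicit scheme to be stable.
        left, right = np.roll(u, 1), np.roll(u, -1)   # periodic boundary
        return u + alpha * (left - 2 * u + right)

    u = np.zeros(100)
    u[50] = 1.0                  # initial heat spike in the middle
    for _ in range(500):
        u = heat_step(u)
    print(round(u.sum(), 6))     # total "heat" is conserved: 1.0

Whether that buys you anything over just solving the PDE is exactly the question.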


The real challenge is that you can't easily predict what phenomena will be caused by what rules, and you can't easily derive what rules will cause whatever phenomena you are interested in.


Mathematica is an insidious pox on the research community. Research needs to be open and verifiable, not obfuscated and closed. If the system you are using to perform calculations is just a giant black box, it's often extremely difficult to understand what is actually going on under the hood, which makes it very difficult to check for mistakes.

Here's a good reddit thread about some really terrible arithmetic errors in mathematica: https://m.reddit.com/r/math/comments/2kjyrc/known_error_in_m...

As a computer scientist doing physics research, I don't understand how my colleagues put up with the terrible trifecta of mathematica, matlab, and labview. These are some of the lowest-quality and most frustrating pieces of software I have ever had to put up with, yet they are ubiquitous in many research communities. There are vastly superior solutions that are free, open-source, and, as far as I can tell, much easier to use.

Every day I see students and researchers struggling to circumvent the idiosyncratic and senseless designs of these programs. I think it might just be a vicious cycle: professors only know shitty software, so the students only use shitty software, don't learn good software, become professors, and the cycle repeats.


> These are some of the lowest-quality and most frustrating pieces of software I have ever had to put up with

I'm a strong advocate for migrating research code to FOSS wherever possible, but even I don't agree with this statement. Mathematica and Matlab are pretty good at what they do (labview is a different story).

What "vastly superior" FOSS solutions do you have in mind? sympy? sage? Do you think these aren't going to have errors in them?


>What "vastly superior" FOSS solutions do you have in mind?

Depends on the problem domain. A good example is 1D numerical perturbation theory; my colleagues struggle to beat mathematica over the head until it gives them roughly what they want. On the other hand, it's much easier using jupyter, numpy, and matplotlib.
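
To give a flavour (a toy problem of my own, not anyone's actual research code): an infinite square well plus a weak linear perturbation, with first-order perturbation theory and exact diagonalization side by side in plain numpy.

    import numpy as np

    N, L = 500, 1.0
    x = np.linspace(0, L, N + 2)[1:-1]          # interior grid points
    dx = x[1] - x[0]

    # Finite-difference kinetic term (hbar = m = 1)
    T = (-np.eye(N, k=1) + 2 * np.eye(N) - np.eye(N, k=-1)) / (2 * dx**2)
    V = 0.05 * x                                # weak perturbation lambda*x
    H0, H = T, T + np.diag(V)

    E0, psi0 = np.linalg.eigh(H0)               # unperturbed spectrum
    E_exact = np.linalg.eigh(H)[0]

    # First-order correction: E_n^(1) = <n|V|n>
    E_pt1 = E0 + np.einsum('xn,x,xn->n', psi0, V, psi0)

    print(E_exact[0], E_pt1[0])                 # ground state: these should agree closely

You could do the same in Mathematica, of course, but in my experience this is the kind of thing that stays readable and diffable in a plain .py file.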

>Do you think these aren't going to have errors in them?

Of course they do, but they won't go unfixed for years on end. You can also figure out what's wrong without having to reverse engineer a big binary blob.


> I don't understand how my colleagues put up with the terrible trifecta of mathematica, matlab, and labview

In my experience, it seems to happen because those are the tools taught to science and engineering students, who then take them to the workforce (post-doc, industry, etc.). Those entering academia then use them for instruction, rinse & repeat, just like you describe.

For what it's worth, I've tried very hard to move folks I work with from MATLAB to Python.


Anyone who finds this interesting would also probably find this interesting: http://accelerating.org/articles/transcensionhypothesis.html

The author thinks "black hole technology" ends up being very important in the development of advanced civilizations and offers a solution to the Fermi paradox (essentially, that black holes and other dense objects make better Dyson spheres for post-brain-upload civilizations).


I'm not a big fan of Stephen's cellular automata approaches with respect to fundamental research, as I consider them overly reductionist and largely impossible to validate.

Regardless, a very interesting article. Stephen is a pretty crazy guy (especially in person) but he is undoubtedly smart and thinking deeply about big problems.


> But as of a little more than a week ago I’m finally convinced that black holes exist, just as General Relativity suggests.

Didn't we know black holes 100% exist 20 yrs ago because of the orbits of the stars around the centre of the galaxy and before that because of quasars?


Well, we had good reason to believe that there were things with approximately their mass in approximately their area that emitted approximately 0 light to us. Which, yeah, isn't quite the same thing, cause you could imagine that if GR was slightly wrong, you could have something slightly different at the center than a True Singularity.


Interesting summary of some aspects of Black Holes.

I recently read a centennial volume on the history of General Relativity, of which black holes are extreme cases. Except for some initial GR solutions and astronomical confirmations in its first decade, GR became a backwater branch of physics for the next 50 years, until Thorne and Hawking came along. I was in college in 1973 when the term "black hole" came into public use, though the book mentions some claims of earlier usage. Just giving an exotic phenomenon a snappy name can focus attention. I recently re-watched the 1967 Star Trek episode where the ship travels back in time and is reported as a UFO. They used the black hole concept, except they called it a dark star then. Gravitational singularity was a competing name, but not as snappy. (To be precise, it is possible to have an event horizon without a singularity, so they are not exactly the same thing.)


It's a long article with a lot of information, and parts of it got boring. But continue (or skip :)) to the last part, which is really interesting!


It's really odd for him to keep harping on Mathematica when the LIGO computations were clearly done in a Python ecosystem as shown in:

http://journals.aps.org/prl/pdf/10.1103/PhysRevLett.116.0611...

https://dcc.ligo.org/public/0122/P1500217/014/LIGO-P1500217_...

https://software.intel.com/en-us/blogs/2016/02/14/python-bri...


I found the "gravitational crystal" concept and the following part interesting:

"But what about the three-body problem? The pictures above suggest a very different story. And indeed my guess is that the evolution of a three-body system can correspond to an arbitrarily sophisticated computation—and that with suitable initial conditions it should in fact be able, for example, to emulate any Turing machine, and thus act as a universal computer."

Building a complicated physical computer to simulate the real world, similar to the one in The Hitchhiker's Guide to the Galaxy, is an interesting concept. But wouldn't designing that computer require even greater computational power?
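
If anyone wants to poke at that three-body evolution numerically, here's a bare-bones sketch (mine, using scipy, nothing to do with the article's own Mathematica code). Finding initial conditions that "compute" something is the hard part Wolfram is gesturing at; this just evolves whatever you feed it:

    import numpy as np
    from scipy.integrate import solve_ivp

    def three_body(t, s):
        # s = [x1,y1, x2,y2, x3,y3, vx1,vy1, ...]; G = masses = 1
        pos, vel = s[:6].reshape(3, 2), s[6:].reshape(3, 2)
        acc = np.zeros_like(pos)
        for i in range(3):
            for j in range(3):
                if i != j:
                    r = pos[j] - pos[i]
                    acc[i] += r / np.linalg.norm(r) ** 3
        return np.concatenate([vel.ravel(), acc.ravel()])

    # Roughly the figure-eight initial conditions (values as usually quoted
    # for the Chenciner-Montgomery orbit; copied from memory, so double-check)
    p = np.array([0.97000436, -0.24308753])
    v = np.array([0.46620368, 0.43236573])
    s0 = np.concatenate([p, -p, [0.0, 0.0], v, v, -2 * v])

    sol = solve_ivp(three_body, (0, 10), s0, rtol=1e-9, atol=1e-9)
    print(sol.y.shape)   # (12, timesteps): trajectories of all three bodies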


Does anyone have any more insight/interesting reads about the electron being a black hole thing?

Somehow the idea that electricity may come from when something is being "sucked out" of our universe is appealing.

I'm just a layman thinking of a sci-fiesque scenario, of course :)


Not exactly about the electron, but if you want good fiction about singularities and the like, find books by Greg Egan. /Incandescence/ is about a civilisation living near a black hole. When you read it, keep paper and pencil handy. Egan's website is at http://gregegan.customer.netspace.net.au/.


"When I was 15 or so, I remember asking a distinguished physicist whether electrons could actually be black holes."

Honestly, what's the point of that comment? Just as much information could have been conveyed sans ego by saying "I remember one time when I asked a distinguished physicist whether electrons could actually be black holes."



