GNU Scientific Library 2.0 (lists.gnu.org)
105 points by lelf on Nov 1, 2015 | 18 comments



    It has been reported to compile on the following other platforms,

    SunOS 4.1.3 & Solaris 2.x (Sparc)
    Alpha GNU/Linux, gcc
    HP-UX 9/10/11, PA-RISC, gcc/cc
    IRIX 6.5, gcc
    m68k NeXTSTEP, gcc
    Compaq Alpha Tru64 Unix, gcc
    FreeBSD, OpenBSD & NetBSD, gcc
    Cygwin
    Apple Darwin 5.4
    Hitachi SR8000 Super Technical Server, cc
I love that they are still doing this. Nothing shakes out numerical bugs like testing on different hardware designs. Would have been good to run the test suite with ICC and Clang too tho', to be sure.

PS I also love that Hitachi have a product called the "Super Technical Server" :-)


Ordinarily, I'd insert a comment here on how much I miss my old NeXTCube, SparcStation, Apollo, AlphaStation, or Indy...

But the fact is I really, really don't. We are so spoiled now. Spinning up an AWS GPU cluster in 60 seconds from the beach beats spending a day and a half just to upgrade a video card ;)


Living in a monoculture means settling on a local maximum, and I don't think that trade-off has been well thought through. So much innovation came from all those competing hardware designs and OSs.


This optimization problem you're conceptualizing is a very strange one indeed. It's like the only outcome we care about is achieving some perfect architecture.

In a dynamic optimization problem, where we have to take into account the current state of the world and our ignorance of the future, sometimes the most efficient decision is to go with what we have, since it's good enough.

I'll put this another way: I don't think there's some terrible shortage of fundamental computer science research out there, nor is there a shortage of competing architectures.

One more spin on this. The fact that we have fewer competing architectures than we used to is itself a sign that existing architectures are pretty good. First, they are the survivors, and have out-competed the rest. Second, more importantly, if less research is being done to develop competing platforms, that means researchers are less optimistic about their ability to improve on what we currently have.


Yeah, on the other hand, supporting archaic systems that nobody really uses makes the code harder to maintain, resulting in more security vulnerabilities (e.g. OpenSSL).


It wasn't operating systems or hardware that was the problem with OpenSSL, it was supporting bits of the protocol that no one actually used.

Keeping the code clean by compiling on different platforms can only make it better and more future-proof.


It's nice to see that someone took the torch from Brian Gough. The GSL is one of the best general-purpose free scientific computing packages out there.


Would any HPC people care to enlighten us ignoramuses on how the different frameworks for scientific computing stack up: MATLAB, Octave, Mathematica, GSL, NAG, etc.? Or is everyone just defaulting to Python and R now ;)


GSL is closer to something like BLAS or LAPACK[1] -- it's a library that provides tested/reliable/optimized implementations of commonly used functions in scientific computing. The other software you listed is more user-facing, and commonly provides wrappers around libraries like GSL. For instance, MATLAB relies heavily on BLAS & LAPACK, as does numpy; I'm less familiar with whether GSL is called from any well known packages other than Octave, but there are definitely bindings for Python (and maybe R? I don't write any R though).

[1] Respectively, "Basic Linear Algebra Subprograms" and "Linear Algebra PACKage"
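
To make the "library, not environment" point concrete, here's a minimal sketch of calling GSL directly from C (much like the introductory example in the GSL manual; the Bessel function is just an arbitrary pick):

    #include <stdio.h>
    #include <gsl/gsl_sf_bessel.h>   /* GSL special functions */

    int main(void)
    {
        double x = 5.0;
        /* a plain C call into a tested, documented numerical routine */
        double y = gsl_sf_bessel_J0(x);
        printf("J0(%g) = %.18e\n", x, y);
        return 0;
    }

Built with something like gcc example.c -lgsl -lgslcblas -lm. The wrappers in Octave, Python, etc. ultimately bottom out in C calls of this sort.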


> GSL is called from any well known packages other than Octave

I see GSL used a lot actually. Mostly as a component of big scientific projects, but it's not as high-profile because it tends to be in a similar position to stuff like BLAS - really common but not something many people deal with directly.


Mathematica is an outlier in that its API revolves around expressions rather than n-grids of numbers. In theory this makes almost no difference at all, but in practice it makes for a completely different API "feel" and experience. In Mathematica you invoke "grownup" numerics in the exact same way that you invoke "kiddie" numerics, which makes you much more likely to go with the "good" method rather than the "good enough" method. Contrast with MATLAB (and everything else), where I see people using constant-size grids and Euler's method all the time simply because the language/framework has arbitrarily decided to make them easier to implement, rather than because they're actually appropriate for the underlying domain or boundary conditions. I'd really like to see an open source competitor, but SAGE just isn't there yet (or wasn't there last time I looked, which reminds me that it's been a few years and that I should look again).
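
For what it's worth, the "grownup" route isn't much extra work in plain GSL either. A rough sketch of its adaptive ODE driver (RKF45 with step sizes chosen by error tolerances; the Van der Pol oscillator and the tolerances here are just illustrative picks):

    #include <stdio.h>
    #include <gsl/gsl_errno.h>
    #include <gsl/gsl_odeiv2.h>

    /* Van der Pol oscillator: y0' = y1, y1' = -y0 + mu*y1*(1 - y0^2) */
    static int func(double t, const double y[], double f[], void *params)
    {
        double mu = *(double *)params;
        (void)t;
        f[0] = y[1];
        f[1] = -y[0] + mu * y[1] * (1.0 - y[0] * y[0]);
        return GSL_SUCCESS;
    }

    int main(void)
    {
        double mu = 10.0;
        gsl_odeiv2_system sys = { func, NULL, 2, &mu };

        /* the driver adapts the step size to meet the requested
           absolute/relative error, rather than marching on a fixed grid */
        gsl_odeiv2_driver *d = gsl_odeiv2_driver_alloc_y_new(
            &sys, gsl_odeiv2_step_rkf45, 1e-6, 1e-6, 0.0);

        double t = 0.0, t1 = 100.0;
        double y[2] = { 1.0, 0.0 };

        for (int i = 1; i <= 100; i++) {
            double ti = i * t1 / 100.0;   /* next output point */
            if (gsl_odeiv2_driver_apply(d, &t, ti, y) != GSL_SUCCESS)
                break;
            printf("%.5e %.5e %.5e\n", t, y[0], y[1]);
        }

        gsl_odeiv2_driver_free(d);
        return 0;
    }

The point isn't that this matches Mathematica's ergonomics, only that the adaptive method sits right next to the naive one, so reaching for it costs almost nothing.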


One thing that has ALWAYS impressed me about GNU libraries is their extensive documentation.

GSL is no exception; its manual can be found here:

https://www.gnu.org/software/gsl/manual/html_node/

An aside: is GSL why GNU Calc is so darned accurate?


The GNU General Public License forbids redistribution unless what you distribute is also under the GPL. It means I can't use this in anything I share on behalf of scientific groups that want to use open source. I've run into this a few times, and it's rough when GSL is so useful.


You don't have to distribute your own stuff under the terms of the GPL. Any other GPL-compatible license will work. Most open source licenses are GPL-compatible, so you should have no problem.

Of course, like you say, this only matters if you want to redistribute. For in-house use, everything's allowed.


Furthermore, unless you're distributing binaries with the GSL compiled in, you can use any license you want for your own code. The things you wrote, you wrote. You can distribute those things however you like.

Six months ago, I, like OP, was very confused and concerned about this same issue. One of several discussions that helped me sort it out: https://news.ycombinator.com/item?id=9477840


Just dynamically link to GSL, don't distribute GSL with your app, and you'll be fine. Most of your colleagues would have it installed on their computers anyway...


I'm not a lawyer, but my understanding is that dynamic linking doesn't get you around having to use a GPL-compatible license if you link to the library [1][2] and distribute your code to others. That is why GNU releases the standard C libraries as LGPL rather than GPL, so you can link to them without having to be GPL-compatible yourself.

I personally would appreciate it if the GSL went to LGPL. I have certainly been in situations where an employer wasn't ready to release their source code under the GPL but would have sponsored me to add new functionality to GSL if we could link to it under the LGPL. That said, I can understand the reasons the authors went straight GPL, and I really appreciate all the amazing work they've done; it's a really nice library with a lot of functionality and pretty solid documentation the times I've used it.

[1] http://www.gnu.org/licenses/why-not-lgpl.en.html
[2] http://stackoverflow.com/questions/1114045/gpl-and-lgpl-open...


Didn't even know this existed. Thanks for highlighting it!



