Matrices in Julia (alexhwoods.com)
51 points by alexhwoods on Aug 23, 2015 | 29 comments


Julia has only been around since 2012, and its greatest disadvantage is a lack of community support.

That is true, but if you post a question to their forum, you will get swift answers, often from the core language team:

https://groups.google.com/forum/#!forum/julia-users

Unfortunately these posts don't always show up at the top of a Google search, so it's sometimes better to search the user group directly.


often from the core language team

Very friendly core language team, by the way :)


My favourite trick in Julia using matrices is the following one liner:

fib(n) = (BigInt[1 1; 1 0]^n)[2,1]

Which defines a fast, exact, Fibonacci function. It's so far the easiest and only way I've managed to calculate the 10^9th Fibonacci number.
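A quick sanity check of the one-liner (with the usual indexing F(1) = F(2) = 1); the expected values below are well-known Fibonacci numbers:

```julia
# The [2,1] entry of [1 1; 1 0]^n is F(n).
# BigInt elements keep every digit exact, no matter how large n gets.
fib(n) = (BigInt[1 1; 1 0]^n)[2, 1]

fib(10)   # 55
fib(100)  # 354224848179261915075 -- exact, no floating-point rounding
```

Matrix powers with an integer exponent are computed by repeated squaring, which is why this stays fast even for n = 10^9.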


The following defines the Fibonacci recurrence relation (in matrix form):

[F(n+1); F(n)] = [1 1; 1 0] * [F(n); F(n-1)]

or

[F(n+1); F(n)] = [1 1; 1 0]^n * [1; 0] if we define F(0) = 0 and F(1) = 1.
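A quick numeric check of that identity in Julia (a spot check for small n, not a proof):

```julia
M = [1 1; 1 0]
F = [0, 1, 1, 2, 3, 5, 8, 13]   # F(0) through F(7); F[k+1] holds F(k)

for n in 1:6
    # M^n * [F(1); F(0)] should equal [F(n+1); F(n)]
    @assert M^n * [1, 0] == [F[n + 2], F[n + 1]]
end
```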

However, we can diagonalize [1 1; 1 0] using its eigenvalues and eigenvectors (see here: http://www.wolframalpha.com/input/?i=eigenvectors+of+%7B%7B1...).

For any diagonalizable matrix A = PDP^-1, where P is the matrix of eigenvectors and D is the diagonal matrix of eigenvalues, A^n = PD^nP^-1. Try expanding A^n if you don't believe it. Using this trick, you have reduced the power of a matrix to powers of two floats, namely the eigenvalues.
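A sketch of that identity in current Julia, using the standard library's `eigen` (in 2015-era Julia this was `eig`). Note that the eigenvalues come back as floats, so the result is approximate rather than exact:

```julia
using LinearAlgebra

A = [1.0 1.0; 1.0 0.0]
n = 10

vals, P = eigen(A)                      # eigenvalues (1 ± √5)/2, eigenvectors in columns
An = P * Diagonal(vals .^ n) * inv(P)   # A^n = P D^n P^-1

An ≈ A^n   # true up to floating-point error; A^10 = [89 55; 55 34]
```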


Floats have rather limited precision for this type of computation for large n (GP's post had n = 1e9). If you were to use big-floats instead, the resulting arithmetic would be more expensive than the straightforward big-integer arithmetic.


From this you'll end up with the closed form in terms of the golden ratio. Might as well use that.
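That closed form is Binet's formula, F(n) = round(φ^n / √5). A sketch in Julia; consistent with the precision point above, with Float64 it stops being exact somewhere around n ≈ 70:

```julia
const φ = (1 + sqrt(5)) / 2   # golden ratio

binet(n) = round(Int, φ^n / sqrt(5))

binet(10)  # 55
binet(30)  # 832040
# For large n, Float64 rounding makes this wrong,
# which is why the BigInt matrix version is the exact one.
```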


Yup, you will. It's nice to show where that comes from though. :)


I was playing with the equation on a sheet of paper. From what I can tell, the [2, 1] index at the end is unnecessary. The Fibonacci sequence is encoded directly in [1 1; 1 0]^n itself. The only thing the [2, 1] does is fast-forward to the 2nd iteration.

Fibonacci sequence = 0, 1, 1, 2, 3, 5, 8...

if M = [1 1; 1 0], then

  M^1 = [1 1;
         1 0]
  M^2 = [2 1;
         1 1]
  M^3 = [3 2;
         2 1]
  M^4 = [5 3;
         3 2]
  M^5 = [8 5;
         5 3]


The [2,1] is just there to pick the correct entry. You can pick another one but you'll have to change the power of the matrix.


What does this offer me that Numpy doesn't already offer, with the advantage of not having to learn a new language?


1. Speed in iterative algorithms.

2. (Almost) painless concurrency.

3. Symbolic programming right in the language (think of SymPy built in the language).

4. Closeness to Matlab (which makes it easy to translate existing code).

In general, people tend to port to Julia algorithms that were originally written in Matlab or C++ (getting the conciseness of the first and the speed of the second). For example, I made several attempts to translate Matlab/C++ code for active appearance models to NumPy, but succeeded only with Julia [1]. Another example, Mocha.jl [2], is a port of the Caffe deep learning framework (written in C++ with a wrapper in Python) into pure Julia, which makes it easy both to use and to modify.

So if you just use existing libraries, there's probably no big difference which you use. But if you actively write new libraries, at least give Julia a try.

[1]: https://github.com/dfdx/ActiveAppearanceModels.jl

[2]: https://github.com/pluskid/Mocha.jl


Any noteworthy probabilistic programming efforts in Julia? Perhaps Mamba?

I don't like Python that much, but some libraries like Theano or PyMC are really really well done.


What don't you like about python? Just curious.

There is this for HMC, NUTS, Gibbs, etc.: https://github.com/JuliaStats/Lora.jl

But it's not ready to use yet, I think.


There's also a Stan wrapper https://github.com/goedman/Stan.jl


Another one: https://github.com/zenna/Sigma.jl

There was a talk about Sigma at JuliaCon this year, but I don't think the video is posted yet.


I used to (almost) exclusively use Python and didn't like that I had to go back and forth with Cython to make my models fast. Earlier this year I ported some of my code to Julia and feel more productive because of it. I feel like it's easier for me to implement new non-trivial improvements to my models than it was before.

For what it's worth, I still stick with Python for computer vision applications. I mostly use Julia for time-series models (e.g. semi-Markov CRFs) where it's harder to vectorize everything.

If I were to start again, I'm not sure I would suggest going the Julia route. I know there are alternatives like Numba that enable you to write efficient code with minimal effort. Overall I like Julia as a language, but there are a lot of little issues due to it being a new language.


The big reason: performance. On a good day I've seen 5-10x performance improvements over python/numpy for real-life code. Of course, on bad days I've seen 2x slowdowns, but I'm reasonably confident that within another couple of releases they'll consistently beat python/numpy. Also the Julia syntax is a bit nicer than numpy's in my opinion, but that is secondary.

Of course there is both pypy and numba trying to improve the performance on the python side of things.


Also, numpy code looks terribly ugly for numerical computing when compared with Julia (which looks like Matlab, but the language doesn't suck like Matlab).


Yeah, that was my impression of Julia too. It's awesome to use Matlab-style syntax for matrices and arrays and stuff.


Example? I keep hearing this, but mostly I got such arguments from people who do not know NumPy (and e.g. use np.array where np.matrix would make the code readable).

(I mean, not everything is perfect, but are there side-by-side vector computations, which are nice in Julia, but not ugly in Python?)


It's a bit of a matter of taste, but many people coming from languages like Matlab prefer the more terse coding style of for example

   A\X'
over the more verbose

   np.linalg.solve(A,np.transpose(X))
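A minimal, self-contained illustration of the Julia spelling (A and X here are arbitrary placeholder matrices):

```julia
A = [2.0 1.0; 1.0 3.0]
X = [1.0 2.0; 3.0 4.0]

Y = A \ X'   # solves A * Y == X', i.e. np.linalg.solve(A, np.transpose(X))

A * Y ≈ X'   # true (up to floating-point error)
```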


Julia seems to be chasing the holy grail: a language that's pleasant to do math in (Python is distinctly not this), nice to prototype with, and robust and fast enough for production work. I would be delighted if they succeed.


For instance, a lot of new research in statistics is done in R, mainly because it is the standard in industry and academia. If you want to use some new algorithm you just read about in a journal, it will almost surely be implemented in R.

Julia has been trending for a couple of years, and now a lot of researchers publish both R and Julia packages; it seems that in the future much research will be done in Julia.

TL;DR: If you just need basic statistics, interval estimation, and the like, you can use whatever you want; but if you want to use state-of-the-art algorithms, R/Julia are the tools.


Where do you see that researchers are posting new code in Julia?


Yeah, I think I've seen maybe one researcher post their code in Julia, and I only found that because of reddit.

The statistical community is still 100% on R and likely will be for some time.


Right now I'm using the Armadillo C++ library for linear algebra. It's quite fast (it can use OpenBLAS, and it does some optimizations with template metaprogramming) and quite convenient to use. Julia seems really interesting though.


Is it possible to make one of the items in the matrix a variable? (And then e.g. put the matrix in an equation, expand it, and solve for the variable).


You can do this using the SymPy.jl package. As a good chunk of matrix functionality is parametric on element type, you can do matrix operations on SymPy's symbolic variables.

See example 6 in: https://github.com/andreasnoack/andreasnoack.github.io/blob/...
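Even without SymPy.jl installed, the "parametric on element type" point is easy to see with exact rationals from Base, since matrix operations work over any element type supporting + and *:

```julia
# Rational{Int} arithmetic is exact -- no rounding anywhere in the matrix power.
M = [1//2 1//3; 1//4 1//5]

M^2  # == [1//3 7//30; 7//40 37//300], computed exactly
```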


Not out of the box. There are however a few projects out there trying to add symbolic manipulation to Julia.



