Hacker News

Nah, I'm with you. Trying to do the same in Haxe. There's no first-class language for typed tensor ops. Type information could include dimensionality, and remove a whole bunch of stupid, time-consuming errors.
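To make the idea concrete, here's a minimal Python sketch of what shape-aware tensor types buy you (the `ShapedTensor` class is hypothetical, not a real library): the dimensionality travels with the value, so a mismatched operation fails loudly instead of silently producing garbage.

```python
# Hypothetical shape-aware tensor sketch: dimensions are checked at the
# boundary of every operation, turning silent shape bugs into errors.

class ShapedTensor:
    def __init__(self, data):
        # data: a list of equal-length rows
        self.rows = len(data)
        self.cols = len(data[0])
        assert all(len(r) == self.cols for r in data), "ragged rows"
        self.data = data

    def matmul(self, other):
        # (m x n) @ (n x p) -> (m x p); anything else is rejected
        if self.cols != other.rows:
            raise TypeError(
                f"shape mismatch: ({self.rows}x{self.cols}) @ "
                f"({other.rows}x{other.cols})")
        return ShapedTensor([
            [sum(self.data[i][k] * other.data[k][j] for k in range(self.cols))
             for j in range(other.cols)]
            for i in range(self.rows)])

a = ShapedTensor([[1, 2, 3], [4, 5, 6]])    # 2x3
b = ShapedTensor([[1, 0], [0, 1], [1, 1]])  # 3x2
c = a.matmul(b)                             # ok: 2x2 result
```

A dependently typed language could move that runtime check to compile time; this sketch only shows the class of error being caught.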



What about Julia? It’s what we’ve been teaching our Stanford ML class with and it’s fast! Also typed and JIT with multiple dispatch. Arrays here don’t have the Numpy/Scipy weirdness of matrices vs. ndarrays and linalg is truly first-class.

This comes from someone who used to despise the language, but it’s truly come a long way.


I'm still amazed that Julia hasn't taken off in six years. It's this great language that solves most of the problems that other (scientific computing) languages have, and hardly anyone uses it. I use it all the time for personal projects, but I use Python at work. Looking forward to the day I can use Julia for everything.


My primary issue with Julia is that it has a relatively high latency in a REPL environment. I and many people I know primarily use REPL environments (e.g. Jupyter Lab) for scientific computing, so this is a pretty relevant use-case. For example, if I start Julia and type

   [1 2 3; 4 5 6; 7 8 9] ^ 2
I have to wait about 5 seconds for a response (on a first generation Core i7, SSD). On the other hand, running the following in a fresh Python interpreter is almost instantaneous:

   import numpy as np
   np.linalg.matrix_power([[1, 2, 3], [4, 5, 6], [7, 8, 9]], 2)
Unfortunately, in most scenarios the actual execution speed (where Julia is far superior) is secondary. People just tend to run larger experiments overnight; and as long as you can express your code in terms of NumPy matrix operations, Python is fast enough.
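The "fast enough if it stays in NumPy" point above is easy to demonstrate: the same computation is dramatically faster when the hot loop runs inside a vectorized NumPy operation instead of the interpreter. (Assumes NumPy is installed; the printed timings are indicative only.)

```python
# Sum of squares two ways: an interpreted Python loop vs. one
# vectorized NumPy expression over the same data.
import time
import numpy as np

n = 1_000_000
xs = list(range(n))
arr = np.arange(n)

t0 = time.perf_counter()
slow = sum(x * x for x in xs)   # interpreted Python loop
t1 = time.perf_counter()
fast = int((arr * arr).sum())   # single vectorized NumPy operation
t2 = time.perf_counter()

assert slow == fast
print(f"python loop: {t1 - t0:.3f}s, numpy: {t2 - t1:.3f}s")
```

On typical machines the NumPy version wins by an order of magnitude or more, which is exactly why the loop-heavy code that Julia accelerates is often already rewritten as NumPy expressions in Python.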


Just tried this on my (much slower) Intel m3-6Y30 (Microsoft Surface) processor and it ran in just over a second. What Julia version are you running? Speed has been improving steadily with new releases.

> Unfortunately, in most scenarios the actual execution speed (where Julia is far superior) is secondary. People just tend to run larger experiments overnight;

I don't think there are only two scenarios, one requiring instant feedback and dependent on fast startup time, and the other where programs can be run overnight. There are infinitely many cases in between. And crucially, what about programs that take several days, if not weeks (such as the biomechanical data analysis I do in my work)? Execution speed for me (and many others) is essential, and doing this work in Python is a pain (I was using it for the same kind of problems before I switched to Julia).


> For example, if I start Julia and type [1 2 3; 4 5 6; 7 8 9] ^ 2

Put that in a function and run it twice. The second time will be blazing fast since it's jitted. That's the workaround for REPL/Notebook usage. In my experience with Notebooks, I end up having to rerun the code all the time, so it will only be slow the first time around. And I've had my share of Python code that took 5 seconds or more to complete, every single time.


That taking 5 seconds is very strange. I have an early Core M (mobile laptop chip, much slower than Core i7, which is a desktop chip) and that expression takes 0.7 seconds at a fresh prompt. That's still much worse JIT compilation delay than we'd like it to be, but 5 seconds is either a very bad configuration or perhaps a bit of hyperbole? There are other situations like time-to-first-plot where compile times do cause a really serious delay that is a very real problem—and a top priority to fix.


Tried again this morning after rebooting the computer -- turns out I was low on RAM yesterday evening. After starting Julia a few times to make sure it is cached I get the following results:

  time julia -e '[1 2 3; 4 5 6; 7 8 9] ^ 2'
  real	0m1.629s
And for Python/NumPy:

  time python -c 'import numpy as np; print(np.linalg.matrix_power([[1, 2, 3], [4, 5, 6], [7, 8, 9]], 2))'
  real	0m0.103s
Edit: Julia Version is 0.6.3, installed directly from the Fedora 28 repositories.


And people think Perl 6 is slow

  time perl6 -e 'say [1, 2, 3;  4, 5, 6;  7, 8, 9] >>**>> 2'
  [(1 4 9) (16 25 36) (49 64 81)]

  real	0m0.170s
Note that the majority of that time is just loading Perl 6.

  time perl6 -e 'Nil'
  real	0m0.156s
Perhaps someone could create a slang module for Julia in Perl 6, as that would be a fairly easy way to improve its speed. (Assuming Julia is easy to parse and doesn't have many features that aren't already in Perl 6)


It’s not a great general-purpose language the way Python is. Neither is Matlab, so that’s okay if your competition is Matlab, but not if the competition is Python, C++, etc.


Meaning the Python standard library and common libraries are geared toward general-purpose use more than Julia's are.

I'm not sure that the Julia language itself is lacking in any general sense. It's just more geared toward scientific computing, but there's nothing about the language making it that hard to write general code. It's not R.


Right, if you stick strictly to use cases covered by NumPy, Pandas, Matplotlib etc, there are better options than Python. But many real programs need other things too, and that’s why they start in and stick to Python regardless.


Glad you're enjoying it! "Hardly anyone uses it" isn't really accurate: Julia's in the top 50 languages on the Tiobe index (between Rust and VBScript this month) [1] and in the top 30 of the IEEE Spectrum language rankings (IIRC, paywall) [2]. I'd say that's quite the opposite of "not taking off". Anecdotally, there are a lot of people on StackOverflow giving excellent, accurate answers these days and I have no idea who they are, which feels like a significant milestone for a language to reach. Getting all the way to the top will take a bit of time :)

[1] https://www.tiobe.com/tiobe-index/

[2] https://spectrum.ieee.org/computing/software/the-2017-top-pr...


Honestly, just wanted to give a huge thanks for such an awesome language! As a huge Python person, I’ve truly been converted to Julia, and it’s been slowly taking over my research workflow since it’s just so fast and actually fun!


> "Hardly anyone uses it" isn't really accurate

My apologies for the wording. I should have just said I wish it were used more in industry nowadays. And thanks for creating Julia, btw! It's been a very enjoyable and productive language to program in.


Maybe if they had released 1.0 after all this time it might have had a better chance of being adopted.


> Arrays here don’t have the Numpy/Scipy weirdness of matrices vs. ndarrays

Oh my god. This has bitten me in the ass every time I did something with matrices in Python. It seems like such a weird split.
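For anyone who hasn't been bitten yet, the split is that `*` means different things on the two types. (Assumes NumPy; `np.matrix` still exists but its use is discouraged in modern NumPy precisely because of this confusion.)

```python
# The ndarray-vs-matrix split: the same expression, two meanings.
import numpy as np

a = np.array([[1, 2], [3, 4]])    # ndarray
m = np.matrix([[1, 2], [3, 4]])   # np.matrix

elementwise = a * a               # ndarray: elementwise -> [[1, 4], [9, 16]]
matrixprod = m * m                # matrix: true product -> [[7, 10], [15, 22]]

# The modern fix: stick to ndarrays and use @ for matrix multiplication.
assert (a @ a == np.asarray(matrixprod)).all()
```

SciPy routines returning one type while your code expects the other is how this usually sneaks into real programs.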


After the mess that was writing a numerical library that interfaced with scipy but used numpy arrays, I’m actually slowly porting most of my daily workflow into Julia. It’s just gotten faster and the code is much easier to read.


Also, as no one has mentioned it yet, there's a TensorFlow wrapper for Julia:

https://github.com/malmaud/TensorFlow.jl


As Haskell begins to get more and more dependently typed features this could definitely be an exciting possibility (I think there are folks already working on this).



C++ does a good job


I have been looking to get into C++ with CNTK actually... I think this might be the next big thing. People don’t realise how close to a high-level language C++ is nowadays with C++14 compliance in every major compiler

In fact I am sure that advocates of Go, Rust et al are really comparing them to C++98 and would be very pleasantly surprised by C++14


I love C++ but it is hard to deny the vast workflow improvements that come with languages that utilize a REPL, like Python or Julia or Matlab/Octave. Being able to poke at your code or data and experiment without a compile/run/debug cycle is a huge advantage to productivity.


> it is hard to deny the vast workflow improvements that come with languages that utilize a REPL

I’m using cling, part of ROOT, for that https://root.cern.ch/cling


That's cool, but it's hardly the idiomatic workflow for C++. Maybe that will change; it should!


A few advocates of safer systems programming languages, regardless of which ones, happen to use C++, are up to date with C++17 and follow up on ISO C++'s work.

Thing is, no matter how much better C++ gets, preventing some groups from using it as "C compiled with a C++ compiler" is tilting at windmills.



