Julia is a language created by academics. Compared to Python, it causes fewer headaches: Python has weird behaviours scattered all over the place. Still, everyone uses Python for deep learning, researchers included.
Julia is fast, with good syntax and plenty of libraries. In fact it has a library called Flux.jl for ML/DL. Yet almost nobody uses it. Why? No major industry partner has picked it up to build bigger deep learning frameworks on top of it.
Why?
The ecosystem also feels off. For example, for CLI parsing in Python I use argparse or click, and in Rust I use clap. Both work well (Rust especially so). Julia has ArgParse.jl, which I found unpleasant. My guess is that this sentiment spills over to other areas of the ecosystem as well. I am sure Julia has some very well designed libraries, but most of the rest are not as honed for practical use and, most importantly, probably lack features that their Python equivalents have.
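For reference, the Python side of that comparison is very little code. A minimal argparse sketch (argument names here are illustrative):

```python
import argparse

# One positional argument and one boolean flag.
parser = argparse.ArgumentParser(description="Greet someone.")
parser.add_argument("name", help="who to greet")
parser.add_argument("--shout", action="store_true", help="uppercase the greeting")

# Normally parse_args() reads sys.argv; an explicit list is passed here for demonstration.
args = parser.parse_args(["world", "--shout"])
greeting = f"hello, {args.name}"
print(greeting.upper() if args.shout else greeting)
```

That's the bar a CLI-parsing library has to clear to feel pleasant: declare the arguments, get a parsed namespace back, and get `--help` generated for free.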
Also, personally (I am far from a DL researcher, but I do research), I find Julia's design headache-inducing quite often as well. I like Python's yield statement for writing iterators, but the equivalent in Julia is always more complicated. Making Julia code performant requires an extra layer of intuition about how the JIT works. Julia has type annotations, but "type annotations as lightweight formal methods" often cannot be realized due to the language's design; for example, one cannot annotate the equivalent of `map :: (a -> b) -> [a] -> [b]`. Of course, I am happy to be corrected, as I am only a casual Julia user.
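To make both complaints concrete, here is the Python side of each: a lazy iterator is just a function containing yield, and a generic signature like map's can be written directly with type variables. This is a sketch; the function names are made up for illustration:

```python
from typing import Callable, Iterator, TypeVar

A = TypeVar("A")
B = TypeVar("B")

# The Haskell-style signature map :: (a -> b) -> [a] -> [b],
# expressed with Python type variables.
def my_map(f: Callable[[A], B], xs: list[A]) -> list[B]:
    return [f(x) for x in xs]

# A lazy iterator via yield; in Julia the usual equivalent involves
# a Channel or an iterate() protocol implementation.
def countdown(n: int) -> Iterator[int]:
    while n > 0:
        yield n
        n -= 1

print(my_map(str, [1, 2, 3]))  # ['1', '2', '3']
print(list(countdown(3)))      # [3, 2, 1]
```

The point is not that Python's type system is sound (it isn't, and the annotations are unchecked at runtime), but that the polymorphic shape of the signature is at least expressible and tool-checkable.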
To be clear, I have only stated my criticisms of the language here. There are many, many good things about Julia that I omit purely for lack of time.