I think it was pretty clear immediately that running Python code was a far-off goal. There was a lot more talk about lifetimes and ownership semantics than details about Python interop. Mojo is more like: can we take the lessons of Swift and Rust and solve the usability and compile-time issues, while building on MLIR to target arbitrary architectures efficiently (and call it a Python superset to raise VC money)?
That said, the upside is huge. If they can get to a point where Python programmers who need to add speed learn Mojo rather than C/C++, because it feels more familiar and interops more easily, that would be a big win. And it's a much lower bar than being a superset of Python.
It marketed itself explicitly as a "Python superset", which could allow Python programmers to avoid learning a second language and write performant code.
I'd argue that I'm not sure what kind of Python programmer is capable of learning things like comptime, borrow checking, and generics, but would struggle with different-looking syntax. So to me this seemed like a deliberate misrepresentation of the actual challenges to generate hype and marketing.
Which, fair enough, I suppose is how things work. But it should be _fair_ to point out the obvious too.
Absolutely. The public sales pitch did not match the reality. This is what I meant by the "claim to be Python to get VC money" point.
To first order, every programmer today starts out as a Python programmer. Python is _the_ teaching language now. The jump from Python to C/C++ is pretty drastic; I don't think it's absurd that learning Mojo concepts step by step, coming from Python, is simpler than learning C. Not syntactically, but conceptually.
Maybe young generations have some issue learning polyglot programming, I guess.
While I agree using Mojo is much preferable to writing C or C++ native extensions, back in my day people learned to program in K&R C or C++ ARM in high school, as kids around 12 years old, hardly anything that drastic.
Many famous Speccy and C64 titles were written in Assembly by bedroom coders between the ages of 14 and 16, earning some pocket money writing them on the UK scene.
Get hold of Retro Gamer magazine for some of their stories.
I've tried learning C a couple times and given up because the curve is too steep to be worth the climb. It's not even the language itself, it's the inherited weight of half a century's worth of cruft. I can't spend weeks fighting with compiler nonsense, header files and #include. Screw it, I'll just use Go instead.
I'm learning Rust and Zig in the hope that I'll never have to write a line of C in my career.
Geez, what a comment. C is much, much simpler than Rust. You're not supposed to be spending weeks fighting includes or compiler errors; that means you have some very basic misconceptions about the language.
Just read K&R “The C programming language” book.
It’s fairly small and it’s a very good introduction to C.
C is syntactically straightforward, but conceptually it may be harder than Rust. You're exposed to the bare computer (memory management, etc.) far more than with a GC language, or arguably even Rust, at least for simple programs.
Getting towards deployment is even harder. You can very easily end up writing exploitable, unsafe code in C.
If I were a Python programmer with little knowledge about how a computer works, I’d much prefer Go or Rust (in that order) to C.
This is true, but when you get something wrong related to the memory model in C, it just says "segfault". Whereas in Rust it will give you a whole explanation for what went wrong and helpful suggestions on how to fix it. Or at the very least it will tell you where the problem is. This is the difference between "simple" and "easy".
That applies only if you take "memory model" to mean modeling the effects of concurrent accesses in multithreaded programs.
But the term could also be used more generally to include stuff like pointer provenance, Rust's "stacked borrows" etc.
In that case, Rust is more complicated than C-as-specified. But C-in-reality is much more complicated, e.g. see https://www.open-std.org/jtc1/sc22/wg14/www/docs/n2263.htm
The model you're referring to, a memory ordering model, is literally the same model as Rust's. The "exception" is an ordering nobody knows how to implement, which Rust just doesn't pretend to offer - a distinction which makes no difference.
I do sympathize with the parent: The language itself might not be that difficult but you also have to factor in the entire ecosystem. What's the modern way to a build a GUI application in C? What's the recommended way to build a CLI, short of writing your own arg parser? How do you handle Unicode? How do you manage dependencies, short of vendoring them? Etc.
Errors too. When, inevitably, you make mistakes, your C might just compile despite being nonsense, or you might get incomprehensible diagnostics. Rust went out of its way to deliver great results here.
I am not arguing about how good or easy it is to use C in production; I'm merely stating that the parent's complaints about weeks of unsolvable errors and issues with includes scream that he needs to read a good resource like a book, because he is definitely misunderstanding something important.
The thing is, if one is an expert it is incredibly difficult to understand the beginner's perspective. Here is one attempt:
C is simpler than Rust, but C is also _much_ simpler than Python. If I solve a problem in Python I have a good standard library of data types, and I constantly use concepts like classes, iterators, generators, closures, etc. So if I move to Rust, I have access to similar high-level tools; I just have to learn a few additional concepts for resource management.
In comparison, C looks a lot more alien from that perspective, even starting with how you include library code from elsewhere.
I think one of the "Python superset" promises was that any particular dev wouldn't need to learn all of that at once. There could exist a ramp between Python and "fast python" that is more gradual than the old ways of dropping into C, and more seamless than importing and learning the various numpy/numba/polars libraries.
FWIW generics are already a thing in pure Python as soon as you add type annotations, which is fast becoming the default (perhaps not least because LLMs also seem to prefer it).
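For illustration, here's a minimal sketch of what that looks like with `typing.TypeVar` and `Generic` (the `Stack` class is just a made-up example):

    from typing import Generic, TypeVar

    T = TypeVar("T")

    class Stack(Generic[T]):
        # The element type T is tracked by a type checker (mypy/pyright),
        # not enforced at runtime.
        def __init__(self) -> None:
            self._items: list[T] = []

        def push(self, item: T) -> None:
            self._items.append(item)

        def pop(self) -> T:
            return self._items.pop()

    s: Stack[int] = Stack()
    s.push(1)
    s.push("two")  # flagged by the type checker, but still runs at runtime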
I suppose if you accept the innocent-looking "#"+"#"=="##" then your example kind of algebraically follows. Next it's time to define what exp("#") is :)
`*` does different things depending on the types of the operands, which is Python's strong typing at work, not Perlesque weak typing. Repeating a string is a useful thing to be able to do, and this is a natural choice of syntax for it. The same thing works for lists: `[1] * 3 == [1, 1, 1]`.
It does unfortunately mean that sometimes `*` will work (and produce an incorrect result) rather than immediately failing loudly with a clear error message in the context in which it's actually intended to be numerical.
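A quick illustration of that silent-success case (assuming a value that was meant to be numeric ends up as a string):

    price = "3"      # e.g. read from a CSV and never converted to int
    total = price * 2
    print(total)     # "33" -- wrong result, no exception raised
    # By contrast, + fails loudly here:
    # price + 2  ->  TypeError: can only concatenate str (not "int") to str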
More broadly this is the same argument as whether overloading `+` for strings is a bad idea or not, and the associated points, e.g. the fact that this makes it non-commutative - the same all applies to `*` as well, and to lists as much as strings. At least Python is consistent here.
Although there is one particular aspect that is IMO just bad design: the way `x += y` and `x *= y` work. As a reminder, for lists these are not equivalent to `x = x + y` and `x = x * y` - instead of creating a new list, they mutate the existing one in place, so all the references observe the change. This is very surprising and inconsistent with the same operators for numbers, or indeed for strings and tuples.
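A small example of the difference (just illustrating the list case described above):

    a = [1, 2]
    b = a
    b += [3]         # mutates the existing list in place
    print(a)         # [1, 2, 3] -- a observes the change

    c = [1, 2]
    d = c
    d = d + [3]      # creates a new list and rebinds d
    print(c)         # [1, 2] -- c is untouched

    s = "ab"
    t = s
    t += "c"         # strings are immutable, so this rebinds t
    print(s)         # "ab" -- s is unchanged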