
As a Python programmer with limited experience with compiled languages, I found Rust code more intimidating to read than C++, Java, or Go. After only an hour, I am overwhelmed by the sheer beauty and mature design of this language - it almost reads like Python, or as well as any compiled language can. I cannot believe that I am smitten by Rust within an hour. Its features seem obvious. My experience with Go was frustrating since I felt like I had to give up elegance for practicality. I did not expect to learn 1-2% of Rust when I woke up today. My thanks to the author.

I'm curious what other developers who primarily use Python think of this article, and of Rust in general.



I've shipped a lot of code in Python and a little bit in Rust.

For any project where performance or correctness are significant concerns I'd much rather use Rust than Python. It provides a lot more tools to help you write correct code, and to express invariants in a machine-checkable way. This means that they are maintained when the code changes, even when multiple contributors are involved. I'm much more confident in my ability to ship correct code in Rust and maintain that correctness over time.

Rust also doesn't have the performance ceiling (floor?) of Python, either for single-threaded programs, or — especially — for those that benefit from concurrency or parallelism. Of course for some programs there's a clear hot loop that you could implement as a native extension in Python, but other workloads have relatively flat profiles where the performance bottleneck is something diffuse like allocation.

As well as the actual language features, the commitment to backward compatibility from the Rust authors means that you don't get regular breakage simply from updating to a newer version of the language. I think Python 2 being 2.7 for such a long time gave people a false impression of how stable a foundation Python provides. The fact that point releases of 3.x often cause problems creates a lot of unnecessary makework for Python users and is rather off-putting.

That doesn't mean I'd always choose Rust, of course. Rust does require more upfront design to get something that works at all. Some projects also benefit greatly from not requiring a compile step and from the ubiquity of a Python interpreter. And I think I'd find it difficult to justify using Rust for a typical web backend that's mostly composing together various well-tested libraries to provide an API on top of a database.


My only bone to pick with this commentary is that it is 2020 and people are still complaining about Python as being unstable because of 2 -> 3.

It was a rough transition, to be sure. Python2 was released 20 years ago, and Python3 12 years ago. It is a pretty stable language.


I don't think the parent is talking about Python 2 -> 3. I think they are pointing out that for many developers, moving to Python 3 happened very late. That means they ran 2.7 since 2010 (roughly 10 years now.) Now that they are on 3.x, they get issues with each new 3.x release.

The argument is that being on 2.7 for so long gave a false impression of the language being more stable than it is - developers just weren't using new versions.


Well, to be fair, the 2020 end-of-life date was announced almost a decade ago. That said, I know of Fortune 500 companies that used 2.7 to develop new code in 2017 -- which was a mistake -- mostly because they were tied to an old version of RHEL and were too lazy to try to get Python 3 working on it.

So 2020 rolled around and the pypi libraries they were using were no longer supported even though RHEL would support python 2.7. So yeah, it was a really bad experience for everyone involved.


Some people are still using 2.7 in production.


Maybe there will be a company like Red Hat that announces it will support Python 2.7 indefinitely, although I doubt it.


My housemate was just rolling back from python 3.8 to 3.7 yesterday due to a backwards incompatible change breaking a library... it's not just 2 -> 3 that makes python relatively unstable compared to rust.


Yup, similar with 3.5 -> 3.7. The community also doesn't seem to value backwards-compatibility, judging from the libraries.


And I recently had to roll back from Python 3.9 to 3.7 to work around a couple different problems with the cffi and zstandard libraries on macOS 11.


How does library breakage reflect on Rust? Python's deprecation policy is 2 years, only a year less than that of Rust's Epoch system.


Epochs don't break libraries - that's the whole point.

They manage this by allowing a library to use a different epoch than the application that depends on it, and defaulting to the original epoch if you don't ask to use a new one.

The vast majority of Rust 1.0 libraries should still work with modern Rust programs. I vaguely recall that there was one minor backwards-incompatible change as a result of a memory safety bug (an exception to the guarantee), but I can't remember what it was and I can't find it with Google, so it might actually be that all Rust 1.0 libraries still work.
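For what it's worth, the opt-in lives in each crate's own manifest, which is how a library on the original epoch keeps compiling inside a newer application (the crate name below is made up):

```toml
# Cargo.toml of a hypothetical old library crate.
# A dependency compiled with the 2015 edition links fine into a
# binary that itself uses a newer edition.
[package]
name = "old-lib"
version = "0.1.0"
edition = "2015"   # omitting this key also defaults to 2015
```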


What was the issue?


It's specifically complaining about point releases of 3.x, it only mentions 2 in an unrelated way.

The fix for https://bugs.python.org/issue40870 caused me some grief when upgrading from 3.8.3 to 3.8.4 (though to be fair, the ast module has different stability guarantees).


Some industries are very very slow when it comes to upgrades. E.g. the VFX reference platform[0] only adopted python 3 for the 2020 baseline, which means many recently released products are still embedding python 2.x [1].

[0] https://vfxplatform.com/ [1] https://vfxpy.com/


> For any project where performance or correctness are significant concerns I'd much rather use Rust than Python. It provides a lot more tools to help you write correct code, and express invariants in a machine-checkable way. This means that they are maintained when the code changes, even when multiple contributors are involved. I'm must more confident in my ability to ship correct code in Rust and maintain that correctness over time.

Any reason for why Rust over Ada/SPARK?


Ada/SPARK is more about mathematically proving that your program works. Rust is more about guaranteeing that your program won't trample memory and won't crash.

That said, Rust seems to have reasonably good performance compared to C/C++. Also the concept of lifetimes is interesting too.

Just because your program compiles in C++/Ada (no SPARK)/C/Java/Rust doesn't mean it's correct. But if it compiles in Rust, the compiler is telling you it's memory safe and thread safe.


If you have a lot of string handling the strictness and ease of safe zero copy in Rust would make it unbeatable for performance and safety.

Due to lifetimes it's easy to keep string slices as references into the original memory and if you do need to modify some strings, you can use CoW.

The way Rust deals with encoding also makes it much safer to use, since you need to be very explicit about which encoding you convert from and to, and about whether you want to allow lossy conversions.

I don't think there is much of an equivalent in Ada for this. Ada has other benefits, like the delta (fixed-point) types which help a lot in embedded projects.
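The borrow-or-own pattern mentioned above can be sketched with std's `Cow` (the normalization rule here is made up): the function returns a borrowed slice of the input unless it actually has to modify it, so the common case stays zero-copy.

```rust
use std::borrow::Cow;

// Returns the input unchanged (borrowed, zero-copy) unless it needs
// normalizing, in which case it allocates a new String (owned).
fn normalize(s: &str) -> Cow<'_, str> {
    if s.contains("  ") {
        Cow::Owned(s.replace("  ", " "))
    } else {
        Cow::Borrowed(s) // a slice into the original memory, no allocation
    }
}

fn main() {
    let untouched = normalize("already clean");
    let changed = normalize("two  spaces");
    assert!(matches!(untouched, Cow::Borrowed(_))); // no copy happened
    assert_eq!(changed, "two spaces");              // copy only when needed
}
```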


I was mostly a Python user, and now I primarily write Rust when I can. It's an absolutely gorgeous language, especially for being so practical.

There's a bit of a progression with Rust, where at first you see the examples and you go "this language is so beautiful!" Then you try to write something nontrivial in Rust and you go "wow the compiler will NOT stop yelling at me, how does anyone write anything in this language?" Rust has a fairly steep learning curve in general, and it takes some getting used to.

That passes, though, and then you really do get to experience the elegance and power which sold you on Rust in the first place. There's a period where writing Rust feels substantially slower than writing in anything else, but that passes too.

It really is as good as it sounds, but it might take a while before you're accustomed enough to Rust's paradigm that it feels that way.


> I am overwhelmed by the sheer beauty and mature design of this language - it almost reads like Python or as well as any compiled language can.

All those lifetime annotations are not sheer beauty or pretty; sure, they are necessary, but they're not nice to look at if you are comparing them to a higher-level language.


I think that being able to write down the lifetimes in the language is beautiful.

You don't have to do it, but if you want to do it, being able to do so in a way that's verified and enforced by the toolchain beats doing so in a documentation comment.

Now every time I read a doc string saying that I need to "deepcopy" something in Python for some API usage pattern to work properly I cringe.


That's one of the things that gets me about Rust discourse: "Rust is pain in exchange for performance" seems to be a common misconception. Rust is discipline in exchange for performance and correctness. A GC lets you be relatively worry-free as far as memory leaks go (and even then...) but it doesn't prevent a lot of the correctness problems the borrow checker would.

With a checker you're forced to think: do I really want to pass a copy/clone of this? Or do I want to let that function borrow it? Or borrow it mutably?


While I get the point about data-races, a “GC” doesn’t make/help you leak memory - you have simply postponed the free to a later point in time.

Assume you had some code that takes a file name and calls open on it. One day you decide you want to print that filename before you open it. Naive code will cause the name to "move" into print, leaving it unusable for the open on the next line. This happens even though it is perfectly understood by all parties that there is no threading involved and print would finish before the next use of that string. Yes, I can create a borrow or clone, but having to think about it on every single line of code, even when there is only one thread of execution, is really painful.

Edit: I get print is a macro, but imagine a detailed logger for this case.
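A minimal sketch of that situation (the function names are made up): passing the `String` by value moves it, while borrowing with `&` leaves it usable on the next line.

```rust
// An innocent-looking logger that takes ownership of its argument.
fn log_name(name: String) {
    println!("opening {}", name);
}

// The fix: borrow instead of taking ownership.
fn log_name_borrowed(name: &str) {
    println!("opening {}", name);
}

fn main() {
    let filename = String::from("data.txt");
    log_name(filename.clone());   // clone to hand over ownership explicitly
    // log_name(filename);        // this would move `filename`, making
    //                            // the lines below fail to compile
    log_name_borrowed(&filename); // borrow: `filename` stays usable
    assert_eq!(filename, "data.txt"); // still accessible afterwards
}
```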


I'd argue the exact opposite. Languages that don't have a concept of ownership and a borrow checker, and don't explicitly say whether they want ownership, a reference, or a mutable reference, force you to keep all of these details in your head.

Here, if I have a `&T` and I try to call a function that takes a `&mut T`, the compiler will tell me that's not gonna work - and then I can pick whether I want my function to take a `&mut T`, or if I want to make a clone and modify that, etc.

There's a learning curve, it's a set of habits to adopt, but once you embrace it it's really hard to go back to languages that don't have it! (See the rest of the comments for testimonials)
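A tiny sketch of that point: the signature advertises whether a function merely reads or also mutates, and the compiler holds the caller to it.

```rust
// Shared borrow: read-only access, the caller keeps full use of `v`.
fn peek(v: &[i32]) -> usize {
    v.len()
}

// Mutable borrow: the caller must opt in with `&mut`.
fn push_one(v: &mut Vec<i32>) {
    v.push(1);
}

fn main() {
    let mut v = vec![0];
    let n = peek(&v);  // fine: shared borrow
    // push_one(&v);   // error: expected `&mut Vec<i32>`, found `&Vec<i32>`
    push_one(&mut v);  // fine: explicit mutable borrow
    assert_eq!(n, 1);
    assert_eq!(v, vec![0, 1]);
}
```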


I think both points are right. There's times when it's useful and desirable to be specific about lifetimes, and there's also times where it's annoying noise.

Bignum arithmetic is an example of the latter. You want to just work with numbers, and in Python you can, but in Rust you must clutter your code with lifetimes and borrows and clones.

Swift's plan to allow gradual, opt-in lifetime annotations seems really interesting, if it works.


Bignum arithmetic actually is pretty simple in Rust if you use the right library. Rug[0] makes using bignums look almost just like using native numbers through operator overloading.

[0] https://crates.io/crates/rug


The operator overloading is nice but you still get smacked in the face right away by the borrow checker. Simple stuff like this won't compile ("use of moved value"):

    let a = Integer::from(10);
    let b = a + a;


This would work if `Integer` implemented the `Copy` trait. But I guess that type is heavy enough for a copy to be too expensive, therefore forcing you to explicitly call `clone()`.


That `Integer` type can't implement `Copy` because it manages a resource, meaning you can't just do a simple memcpy of the type.
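A sketch of why, with a made-up bignum-like type: a field that owns heap memory rules out `Copy` (which is a plain bitwise copy), so an explicit `Clone` is the only option.

```rust
// A hypothetical bignum-like type: the digits live on the heap, so a
// bitwise copy would leave two owners of the same allocation.
#[derive(Clone)] // adding Copy here fails to compile: Vec<u64> is not Copy
struct BigInt {
    digits: Vec<u64>,
}

fn main() {
    let a = BigInt { digits: vec![10] };
    let b = a.clone(); // an explicit deep copy instead of an implicit one
    let sum = a.digits[0] + b.digits[0]; // both `a` and `b` remain usable
    assert_eq!(sum, 20);
}
```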


  let b = &a + &a;
will work, but I agree it's unfortunate that this is necessary.


I don't see how a GC would do anything with memory leaks. When you have a data structure containing some elements, and you keep a reference to the structure, because you need some data from it, but you let it grow, then you have a leak, GC or no GC. If anything, GC encourages memory leaks by making freeing memory implicit, something a programmer normally doesn't think about. And yet a resize/push operation on a vector (sic!) is just another malloc. One that will get ignored by a GC.

GC protects you against double-free and use-after-free, but memory leaks? Nope.
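The grow-forever structure described above, sketched in Rust (the names are made up); the same shape leaks just as badly in a GC'd language, because every entry stays reachable through the cache:

```rust
use std::collections::HashMap;

// A cache that is inserted into but never evicted from: every entry
// stays reachable through `cache`, so no collector could reclaim it.
fn handle_request(cache: &mut HashMap<u64, Vec<u8>>, id: u64) -> usize {
    cache.entry(id).or_insert_with(|| vec![0u8; 1024]);
    cache.len() // grows monotonically: a leak, GC or no GC
}

fn main() {
    let mut cache = HashMap::new();
    for id in 0..100 {
        handle_request(&mut cache, id);
    }
    assert_eq!(cache.len(), 100); // nothing is ever freed
}
```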


GC also protects you against forget-to-free, which in most non-GC languages is a common form of memory leak.


That's just exchanging forget-to-free for forget-to-drop-reference.


But it's much easier to forget-to-free:

  Foo* foo = new Foo();
  ...code that uses foo...
This is a memory leak in a non-GC language, but not in a GC language.

In a practical sense... is it your personal experience that memory leaks are equally prevalent in GC and non-GC languages? I've spent decades working in each (primarily C++ and Java, but also Pascal, C, C#, Smalltalk...) and my experience is that memory leaks were a _much_ bigger issue, in practice, in the non-GC languages.


This is a memory leak in a GC language as well, depending on what happens to foo in ...code that uses foo... Maybe it will become a field of some class, maybe it will get pushed to some queue, maybe it will get captured in a lambda, etc. You won't even be able to tell just from looking at this code in this one place alone, if you passed this pointer as argument to a function.

It is my experience that when people work with GC languages, they treat the GC as a blackbox (which it is) and simply won't bother investigating: do they have memory leaks? Of course not, they are using a GC after all, all memory-related problems solved, right? Right... With code that relies on free(), I can use a debugger and check which free() calls are hit for which pointers. Even better, I may use an arena allocator where appropriate and don't bother with free() at all. With a GC I'm just looking at some vague heap graph. Am I leaking memory? Who knows... "Do those numbers look right to you?"

Memory management issues are usually symptoms of architectural issues. A GC won't fix your architecture, but it will make your memory management issues less visible.

It is my experience that most memory problems in C come from out-of-bounds writes (which includes a lot more than just array access), not from anything related to free(). A GC doesn't help here.


Per Wikipedia:

> In computer science, a memory leak is a type of resource leak that occurs when a computer program incorrectly manages memory allocations in a way that memory which is no longer needed is not released.

In most of your scenarios, e.g. "pushed to some queue", the object in question is still needed. Hence, this is not a leak. Presumably, the entry will eventually be removed from the queue, and GC will then reclaim the object.

At any rate, I think we're moving past the point of productive discussion. My experience in practice is that memory leaks are more common / harder to avoid in non-GC languages. Of course a true memory leak (per the definition above) is possible in a GC language, but I just don't see it much in practice. Perhaps your experience is different.


Are there any languages that provides memory leak safety, so that the compiled binary is guaranteed to be memory leak free?


What does memory leak free mean? It has a semantic meaning.

If I have a queue with elements and I won’t be accessing some of them in the next part of the program, is that a leak? Also, remember that certain GC'd languages intern strings; is it a memory leak if the program never uses an interned string again?


Well, you would need a way to "tell" the compiler more about your intentions and maybe even explicitly declare rules for what you consider a memory leak in certain scenarios.

To provide an example for a different case: programs written in Gallina have the weak normalization property, implying that they always terminate.


I have at times in the past made the (somewhat joking) observation that "memory leak" and "unintentional liveness" look very similar from afar, but are really quite different beasts.

With a "proper" GC engine, anything that is no longer able to be referenced can be safely collected. Barring bugs in the GC, none of those can leak. But, you can unintentionally keep references to things for a lot longer (possibly unlimited longer) than you need to. Which looks like a memory leak, but is actually unintentional liveness.

And to prove the difference between "cannot be referenced" (a property that can in principle be checked by consulting a snapshot of RAM at an instant in time) and "will not be referenced" (a larger set: we will never reference things that cannot be referenced, but we may not reference things that are reachable, depending on code) feels like it requires solving the halting problem.

And for free()-related problems, I've definitely seen code crash with use-after-free (and double-free).


> I have at times in the past made the (somewhat joking) observation that "memory leak" and "unintentional liveness" look very similar from afar, but are really quite different beasts.

Both situations prevent you from reusing memory previously used by other objects, which isn't being utilized for anything useful at that point. The distinction is valid formally, but from a practical point of view sounds rather academic.

> And to prove the difference between "cannot be referenced" (a property that can in principle be checked by consulting a snapshot of RAM at an instant in time) and "will not be referenced" (a larger set, we will never reference things that cannot be referenced, but we may not reference things that are reachable, depending on code) feels like it is requiring solving the halting problem.

As usual, looking for a general solution to such a problem is probably a fool's errand. It's much easier to write code simple enough that it's obvious where things are referenced. Rust's lifetime semantics help with that (if your code isn't all that simple, it will be apparent in the overload of punctuation). If not Rust, then at least it would be good if you could check liveness of an object in a debugger. In C you can check whether a particular allocation was undone by a free() or equivalent. I'm not aware of any debugger for e.g. Java which would let me point at a variable and ask it to notify me when it's garbage collected, but it sounds like something that shouldn't be too hard to do, if the debugger is integrated with the compiler.


For me it's also faster to read lifetimes, which have a standard concise syntax, versus ad-hoc documentation.


You can write Rust code with almost no lifetime annotations. That's heavily dependent on the domain, of course. But for relatively simple function signatures they are usually inferred correctly, and you can often just clone instead of requiring references.
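A small sketch of what elision looks like in practice: for simple signatures like this, the annotated and unannotated forms are equivalent, and you'd normally write the first.

```rust
// Elided: the compiler infers that the returned slice borrows from `s`.
fn first_word(s: &str) -> &str {
    s.split_whitespace().next().unwrap_or("")
}

// The fully spelled-out equivalent; you almost never need this form.
fn first_word_explicit<'a>(s: &'a str) -> &'a str {
    s.split_whitespace().next().unwrap_or("")
}

fn main() {
    assert_eq!(first_word("hello world"), "hello");
    assert_eq!(first_word_explicit("hello world"), "hello");
}
```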


Pretty sure if you’re deserializing a structure with borrowed references you need lifetime annotations. Copying things around to pacify the compiler hardly seems elegant.


Sure, but that's a performance vs ease of use decision.

Often you don't need to care about the extra allocations and can just deserialize to owned types.

The code for owned deserialization certainly ends up looking more elegant.


It’s not just performance: you now have to go back and update every instance of that struct to reflect the change in ownership of that member, if you can even change the member in the first place (e.g., you can’t change the type of a member if the struct itself is defined in a third-party package). Moreover, some instances of your struct might be in a tight loop and others might not, but now you’re committing to poorer performance in all cases. Maybe you can box it (although IIRC I’ve run into other issues when I tried this), but in any case this isn’t the “elegance” I was promised. Which is fine, because Rust is a living, rapidly-improving language, but let’s not pretend that there is an elegant solution today.


Libraries can be (and some actually are) designed with that in mind, with CoW (copy-on-write) types that can be either borrowed or owned. Speaking of performance, kstring (of liquid) goes one step further with inline variants for short strings.

I want to say that seems like a valid concern but in practice I've seen it come up only rarely.


Yes, you can clone all day long and just reuse variable names as if you were writing JavaScript, but what does that do to your memory allocation?


Most languages just clone all day long, and it's not that bad. Clones in Rust (as in most languages) only go as deep as the first reference-counted pointer, after all.


Well, yes and no.

E.g. cloning a string leads to an extra allocation and a memcpy.

If you want to get a similar performance profile to GC languages, you have to stick your types behind an `Rc<T> / Arc<T>`, or an `Rc<RefCell<T>> / Arc<Mutex<T>>` if you need mutability.

But modern allocators hold up pretty well to a GC, which amortizes the allocations. The extra memcopying can be less detrimental than one might think.
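A small illustration of that point using std's `Rc`: the payload is allocated once, and "cloning" the handle is just a reference-count bump, much like an implicit reference copy in a GC'd language.

```rust
use std::rc::Rc;

// Allocates the payload once and creates a second handle to it;
// returns the resulting strong reference count.
fn alias_count(payload: &str) -> usize {
    let shared = Rc::new(String::from(payload));
    let alias = Rc::clone(&shared); // cheap: no copy of the payload
    debug_assert!(Rc::ptr_eq(&shared, &alias)); // same allocation
    Rc::strong_count(&shared)
}

fn main() {
    assert_eq!(alias_count("a large-ish payload"), 2);
}
```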


“Yes and no” is too generous; most languages clone only very infrequently (basically only for primitives where there is no distinction between a deep and shallow copy). For complex types (objects and maps and lists) they pass references (sometimes “fat” references, but nevertheless, not a clone).


To be clear, JS doesn’t require cloning because it has a gc. Not sure what you’re getting at with reusing variables names...


What I mean is, if one does:

  const obj1 = { a: 32, b: 42 };
  function foo(ref) { ref.a = 0; }

  foo(obj1);
  console.log(obj1);
in JavaScript, one is just using the variable name as a "holder" of some value. One doesn't have to designate that that variable is being passed by reference. If one wanted to actually copy that object, they'd have to devise a mechanism to do so. In Rust, if someone doesn't specify, using the & symbol, that something is a reference, it'll end up moving the value.

Basically all I was saying is one can not approach writing Rust with a Java/JavaScript mindset. (That a variable is just a bucket holding a value). Care needs to be taken when referencing a variable as it may need to be moved, copied/cloned or referenced. In the case of copying, another memory allocation is done. So if someone approaches Rust from the standpoint of "this word represents a value, and I'm going to use it all over the place", they can find themselves blindly allocating memory.


Oh, by “reuse variable names”, you simply mean that values aren’t moved, i.e., that JS lacks move semantics or an affine type system. That’s a bit different to “reusing variable names”, which you can do in Rust as well provided the variable holds a (certain kind of?) reference.


I agree it's not pretty. It took me several passes to just understand what was going on there. But ownership is something I've never come across before and it doesn't exist in Python (not to an average user anyway), so it seems a little unfair to compare Rust and Python based on that feature.

However, when you take a high-level feature like overriding operators, which can be done elegantly in Python, Rust's way is quite concise, readable, and, for a compiled language, to my eyes quite pretty.

Edit: Typos


You get used to it, and you don't really need to use it most of the time. Only for more complicated things, but by that point, it's second nature.


Apart from that though, it really looks like OCaml, which really makes me want to look into Rust a little bit more :)


Yup, strong similarities there, since the original Rust compiler was implemented in OCaml :)


For those who are curious, here's the source code of the original compiler written in OCaml (rustboot) https://github.com/rust-lang/rust/tree/ef75860a0a72f79f97216...


All those? Only a very small portion of an average Rust code base will have lifetimes.


This podcast might be relevant: the creator of Flask who is now using Rust: https://realpython.com/podcasts/rpp/18/

Edit - actually I was probably thinking of this podcast with the same interviewee: https://rustacean-station.org/episode/004-rust-in-production...


Just started listening to it. It's exactly what I was looking for, thank you. Armin Ronacher's opinion on such things holds a lot of weight in my head.


Rust makes hard things easy and easy things hard.

Also, reading Python code is easier on the eyes (less syntax noise). The article above is very beginner Rust, and I wouldn't rely on it as a picture of actual Rust code in the wild.

If you want to see what actual Rust code in the wild looks like, take for example a web server like Actix, and try to figure out what the documentation says.


I find that advanced Python is quite hard to read. Well, I find that advanced dynamically typed languages tend to be hard to read in general, because you can never be sure what type your inputs and outputs are and what values they can take.

At least idiomatic Python shuns wildcard imports that obfuscate where symbols are coming from (that drove me insane in Ruby).

On the other hand, Rust code tends to be very strict about what your types are (even more so than a language like C++, where metaprogramming is duck-typed by default). It can lead to complicated code, but barring some weird operator overloading decision, you should know exactly what calls what from whom when you look at any method or function.

Rust code can be tricky to write at times, but I generally find it a pleasure to read. Sure, ultra-complicated generic code can be overwhelming, but this complexity would be here in one form or the other regardless of the language, Rust just forces you to be explicit about it.


Python now supports type annotations. With mypy those types can be checked. Any Python shop worth its salt should be switching or has switched.


> Rust makes hard things easy and easy things hard.

I don't agree with this, and, if anything, it reads as very biased.

So far, Rust has made my life a lot easier, and I have not run into any major issues aside from the borrow checker. And that was early on. Two years now playing with the language, and I barely run into it anymore.


Async has ecosystem fracture issues. Creating graph data structures requires you to understand more about the borrow checker.

The majority of people will probably want to use Rust for the web, which means async needs to be ergonomic enough if Rust wants wide adoption.


So does Python. Asyncio is mediocre at best and difficult to express certain patterns in. Other libraries are far better, but not in the standard library.


Agreed it could be a little more ergonomic, but it wasn't a blocker. I was able to figure it all out in a weekend, and write plenty of services with it.


"Rust makes hard things easy and easy things hard."

Disagree here. As the article shows, Rust 2018 is very nice and ergonomic. Iterators, lifetime elision etc. are nice to work with.

Regarding noise: many people consider things like return types noise, but I find them very useful, because I can be sure of the return type. Regarding actual code, Actix doesn't look that bad from the examples either.

I use warp and it's pleasant to work with. The only thing I hate is compilation time; other than that I don't think I have any major criticism of Rust. But compilation time is close to C++ etc., so...


I'm a recent Rust fan (via AoC), but for "easy things hard", I would offer input/string processing as an example— regexes, string operations, grammar, etc. I know that Rust is forcing me to be correct and handle (or explicitly acknowledge that I'm not handling) my error cases, but from the point of view of just wanting to get the happy path working, it's a lot more noise to deal with compared with what it would look like in Python, eg:

        let re = Regex::new("(?P<min>[0-9]+)-(?P<max>[0-9]+) (?P<letter>[a-z]): (?P<password>[a-z]*)").unwrap();
        re.captures_iter(&contents).map(|caps|
            Input {
                password: caps.name("password").unwrap().as_str().to_string(),
                letter: caps.name("letter").unwrap().as_str().chars().next().unwrap(),
                min: caps.name("min").unwrap().as_str().parse().unwrap(),
                max: caps.name("max").unwrap().as_str().parse().unwrap()
            }
        ).collect()


If you get rid of unwrap and use `if let` and `match`, this code will clean up nicely.

If the code is supposed to be maintainable, you could also make something like a FromRegex trait for Input. It would be a good idea to try exercism's mentoring thing to get better at writing it the easier way the first time. The mentoring thing really is what helped me to think in ways that made this mess easier to avoid.
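As a sketch of that cleanup, using only `std`'s `split_once` instead of the regex crate to keep it dependency-free (the `Input` shape is taken from the snippet above; the `parse` constructor plays the role of the suggested FromRegex-style trait): the `?` operator threads all the `Option`s through one fallible constructor, so failure is handled in one place instead of via scattered `unwrap()`s.

```rust
#[derive(Debug, PartialEq)]
struct Input {
    min: usize,
    max: usize,
    letter: char,
    password: String,
}

impl Input {
    // Parses lines shaped like "1-3 a: abcde"; every failure
    // short-circuits to None via `?` instead of panicking.
    fn parse(line: &str) -> Option<Input> {
        let (range, rest) = line.split_once(' ')?;
        let (min, max) = range.split_once('-')?;
        let (letter, password) = rest.split_once(": ")?;
        Some(Input {
            min: min.parse().ok()?,
            max: max.parse().ok()?,
            letter: letter.chars().next()?,
            password: password.to_string(),
        })
    }
}

fn main() {
    let input = Input::parse("1-3 a: abcde").unwrap();
    assert_eq!(input.min, 1);
    assert_eq!(input.max, 3);
    assert_eq!(input.letter, 'a');
    assert_eq!(input.password, "abcde");
    // Malformed input becomes a None to handle, not a panic.
    assert_eq!(Input::parse("garbage"), None);
}
```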


Fair fair. To be honest, I think in a lot of these cases there are good helper crates/macros; for this I would now probably use recap: https://docs.rs/recap/0.1.1/recap/

The really horrendous scenario was when I was trying to navigate pest iterators for parsing according to a grammar.


Even without if let or match many of those unwraps are unnecessary. See https://github.com/zookini/aoc-2020/blob/master/src/bin/2.rs


Interesting! Can you help me understand why this doesn't work for me?

    error[E0599]: no method named `parse` found for enum `Option<regex::Match<'_>>` in the current scope
      --> src\main.rs:24:39
       |
    24 |                 min: caps.name("min").parse().unwrap(),
       |                                       ^^^^^ method not found in `Option<regex::Match<'_>>`


I think you need .name("min")?.as_str() to access the underlying text of the match object (after making sure it is a valid match with the ?), which can then be parsed. The regex Match object itself does not have a parse method that I can see.

I don't know what the structure Input looks like but I played around with your code and it seems to work with as_str()

https://play.rust-lang.org/?version=stable&mode=debug&editio...


caps.name("min") returns an Option which you need to handle e.g. by unwrapping. You can use caps["min"] instead if you don't want an option.


See this is what I'm talking about. This is ridiculous.


Actual code examples from Actix aren’t that bad, but take a look at the docs.


D is also worth a look as a compiled C- family language that can read like python. https://bitbashing.io/2015/01/26/d-is-like-native-python.htm...


> I cannot believe that I am smitten by Rust within an hour

This has been my, and a lot of my colleagues', experience with Rust. Never have I seen people fall in love with a programming language so strongly. And the love lasts for a long time.


Try using it in the real world and see if your love still holds. An hour is not enough to validate Rust.


Typical condescending HN response. We are using Rust, we've used it to build and ship a successful native desktop app.


I’ve seen a lot of Golang supporters really hate on rust and make this discussion into a holy war


In which sentence is this condescending?


The assumption that real-world use had not been tried.


An hour isn’t enough to use it on real world use, am I not right?


One way to read your comment, which I think the way others are reading it, is “you only like rust because you haven’t used it much, try it more and you won’t like it anymore.” The implication being that only people who haven’t used Rust can like it.

That may not be what you meant but it seems like that’s what people are understanding.


An honor to have Steve Klabnik himself respond to me. Thank you for your amazing work for the Rust community, and for the books as well. I'm a Rust hobbyist; I'm just trying to be objective, especially since the majority of my day job doesn't involve Rust, and my teammates refuse to let me introduce small bits of Rust code here and there, because it's unknown territory.


Correct, but the comment you responded to ended with "And the love lasts for a long time."

This suggests experience far more than a single hour, does it not?


Sounds like a contradiction to me that he said he only had it for 1 hr. Anyway, I'm not here to argue semantics. Not gonna reply anymore.


> he said he only had it for 1 hr

They didn't, though? They indicated that they had a similar experience of falling in love with it quickly. That says nothing about when that experience happened.


I have been using it for a few years now, and the love is still there


Do you know any Object Pascal programmers?


I know it’s pedantic but I despise the semicolons in Rust. They make the language ugly and constantly burden the programmer unnecessarily. It’s a bizarre choice for such a modern language.


I find my own JS code easier to read since I dropped them, and my tired eyeballs no longer need to lex the ";" pattern. Yet some JS masters (Mike Bostock) include them religiously.

Semicolons do make multiple statements on a single line possible -- but surely there's a stronger rationale than that.
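In Rust specifically, there arguably is a stronger rationale: the semicolon is load-bearing, because it separates expressions from statements. A quick sketch of the distinction:

```rust
fn main() {
    // Without a trailing semicolon, the final expression is the block's value.
    let x = { 2 + 2 };
    assert_eq!(x, 4);

    // With a semicolon, the expression becomes a statement and the
    // block evaluates to the unit type ().
    let y: () = { 2 + 2; };
    assert_eq!(y, ());

    // The same rule gives implicit returns: no `return`, no semicolon.
    fn double(n: i32) -> i32 { n * 2 }
    assert_eq!(double(21), 42);
}
```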


Python allows semicolons for separating multiple statements on a single line, even though it doesn't require them otherwise. (Those statements can't have blocks, because of the indentation-based syntax, but that's a separate issue.)


> I am overwhelmed by the sheer beauty and mature design of this language - it almost reads like Python

Sarcasm?


"... or as well as any compiled language can."

Some syntax is close to Python, so some mental load is minimized, e.g. the use of snake_case, self, None, type annotation syntax, and dict unpacking.

Some lines read like sentences, e.g. for loops, single-line if statements, impl Foo for Bar, etc. Some design choices also read better, I think; for example, let reads better than var.

Of course such preferences are personal. I've read other comments that mention that the tutorial covers only basic Rust features and it can get much more complicated when using more advanced features.

Edit: Added sentence about advanced features.
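A rough sketch of the parallels mentioned above (my own toy example, not from the tutorial):

```rust
// `impl Greet for Bot` reads almost like the English sentence it is.
trait Greet {
    fn greet(&self) -> String;
}

struct Bot;

impl Greet for Bot {
    fn greet(&self) -> String {
        String::from("hello")
    }
}

fn main() {
    // snake_case names and `let` bindings, much as in Python.
    let scores = vec![70, 85, 92];

    // `for score in ...` reads like its Python equivalent,
    // and if conditions need no parentheses.
    for score in &scores {
        if *score >= 90 {
            println!("{} is an A", score);
        }
    }

    // `None` exists here too, as a variant of Option.
    let nickname: Option<&str> = None;
    if nickname.is_none() {
        println!("no nickname set");
    }

    assert_eq!(Bot.greet(), "hello");
}
```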


"As a Python programmer with limited experience with compiled languages"

In other words, someone who doesn't have a CS background, never took theory of programming languages, etc. "Python programmer" is the 2020s equivalent of an "HTML developer" from the 2000s.


Look into C++17; it makes C++ look like Python, but with worse standard libraries (not that Python's are great either).


Unfortunately one can’t live solely in C++17. The sheer amount of syntax and knowledge required to write C++ productively is insane.


Rust is a complete pain for the types of problems Python is used for. Same for Go/Java.


I used to use Ruby for small (100-500 loc) scripts to parse various CSV/JSON documents. You craft a couple perfect stanzas of immaculate code to do what you need and you feel great. Then you run it and it turns out you forgot an `end` somewhere. Fuck. You run it again, and the name of the hash map variable is unknown, because you changed it at the last moment to make it more descriptive. Fuck. There, fixed everything. Run it again. Running for a couple seconds now with no errors -- pop the champagne! Wait, what's that? Row #998 of 1000 had an unexpected type of value? Fuck. Fix it, ship it. Wheel, snipe, and celly.

At some point I just started writing these things in Rust instead. Describe what I want deserialized as a struct. List the fields I want as enum variants. What's that, rustc, the "name" field is optional? Ok, that makes sense, I'll handle this right away. Done. Hey, rustc, how come people talk about fighting you all the time when you're actually the world's greatest pair programmer?
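In practice that workflow leans on serde's derive macros, but the core idea fits in a std-only sketch (hypothetical field names; real code would use the serde + csv crates):

```rust
// The compiler forces every consumer of Record to handle a missing name.
#[derive(Debug, PartialEq)]
struct Record {
    id: u32,
    name: Option<String>,
}

// Parse one comma-separated row into the typed struct.
fn parse_line(line: &str) -> Result<Record, String> {
    let mut cols = line.split(',');
    let id = cols
        .next()
        .ok_or("missing id column")?
        .trim()
        .parse::<u32>()
        .map_err(|e| e.to_string())?;
    // An empty cell becomes None instead of silently flowing through as "".
    let name = match cols.next().map(str::trim) {
        Some("") | None => None,
        Some(s) => Some(s.to_string()),
    };
    Ok(Record { id, name })
}

fn main() {
    let rec = parse_line("42, alice").unwrap();
    assert_eq!(rec.name.as_deref(), Some("alice"));

    // A row with a missing value: the type system makes us decide what happens.
    let rec = parse_line("7,").unwrap();
    assert!(rec.name.is_none());

    // A malformed row is an Err at that row, not a surprise at row #998.
    assert!(parse_line("not-a-number,bob").is_err());
    println!("all rows handled");
}
```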


As another Python programmer I also agree. Rust's features are very mature especially compared to an old dinosaur like C. I only wish its syntax was more like Swift's which is more pleasing to the eyes and was also made by the same guy.


Minor note: Graydon created Rust and worked on Swift. He didn't create Swift, however.

That said, both languages read like close siblings to me, to the point that I find it annoying switching between the two, because I keep trying to use each one's semantics in the other subconsciously.


I feel like I'm being bullied and nitpicked but you're speaking politely so I'm confused.



