This seems like a neat project but I can’t say I ever wanted Python to be an even bigger language... Ideally we would have a Python-like that eschews unnecessary features like inheritance and much of the magic that makes Python hard to optimize (and often hard to understand), and with a type system (even an optional type system) that is better than Mypy.
I’m not 100% sure what that would look like, but I would ditch classes (and thus inheritance) for named tuples, make most things immutable by default, include enums and pattern matching (that the type system would check; I hardly see the point in dynamic pattern matching). If possible, typed code would compile to C extensions or similar. Oh, and exceptions would go away in favor of a Rust-like Result type.
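To make that wish list concrete, here's a rough sketch of that style in today's Python: named tuples instead of classes, and an explicit Result instead of exceptions. The Ok/Err/Result names here are invented for illustration, not from any particular library:

```python
from typing import NamedTuple, Union


class Point(NamedTuple):
    x: float
    y: float


class Ok(NamedTuple):
    value: float


class Err(NamedTuple):
    message: str


# A poor man's Rust-style Result: the caller must inspect the variant.
Result = Union[Ok, Err]


def safe_div(a: float, b: float) -> Result:
    # No exception raised; failure is an ordinary return value.
    if b == 0:
        return Err("division by zero")
    return Ok(a / b)


p = Point(3.0, 4.0)
print(safe_div(p.x, p.y))   # Ok(value=0.75)
print(safe_div(p.x, 0.0))   # Err(message='division by zero')
```

Of course, without pattern matching checked by the type system, nothing forces the caller to handle the Err case — which is exactly the gap being described.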
I also recommend looking at Nim. It has Python-like syntax, but it feels very different from Python to me in use. Stellar language, though, and it has been getting a lot of attention!
Whoa - Nim seems really cool! Thanks for mentioning it. I like that it also compiles to JS, and seems to be pretty active (latest release was just on Dec 31).
It is a little large, yes. The creator's goal was always to create a language with a small core but a rich set of features for extending the language (metaprogramming). Nim has bloated a little and we're now trying to remove as many features as we can without annoying too many people :)
Glad to hear it. I've been skimming the documentation and definitely got the impression that it is a feature-rich language to put it politely :). In the words of Bjarne Stroustrup, it seems like there is a small core language that wants to get out. I imagine you could create a linter that catches uses of extraneous features and foster a culture of "Good nim code passes this linter"--you could create badges for repositories that show the percent of the code that adheres to the linter or something. Over time, you could deprecate and then remove these extraneous features.
This is part of what turned me away from Nim, so it's good to hear that. Can you give some examples of some features you've removed recently? I notice you're not at 1.0, so are you comfortable ignoring the 'annoying too many people' part if you think there's enough of a reason to remove something?
I'm aware of Cython and RPython. RPython is super cool but it isn't really a general purpose programming language. Cython is superficially similar to what I want, but it's always put me off; not sure why off the top of my head. I do love Go, however. :)
Cython is basically C in Python syntax with some convenience features.
Unless you know C it's not always worth using. Without doing any "C stuff" (i.e. just writing Python and compiling with Cython) you get about a 50% speedup in tight loops; sometimes that's good enough, other times it's not.
To go any faster than that, you basically need to start writing C code, using the CPython API, using Cython's badly under-documented features for doing so.
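To make the tradeoff concrete, here's the kind of tight loop in question. The pure-Python version below is the "no C stuff" case — Cython can compile it unchanged; the typed variant in the comment is an illustrative sketch of the direction Cython's `cdef` annotations take you, not tested code:

```python
# Pure Python: Cython can compile this as-is for a modest speedup.
def dot(xs, ys):
    total = 0.0
    for x, y in zip(xs, ys):
        total += x * y
    return total

print(dot([1, 2, 3], [4, 5, 6]))  # 32.0

# The "C stuff" version would add static types in a .pyx file,
# roughly along these lines (illustrative, untested):
#   def dot(double[:] xs, double[:] ys):
#       cdef double total = 0.0
#       cdef Py_ssize_t i
#       for i in range(xs.shape[0]):
#           total += xs[i] * ys[i]
#       return total
```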
Learning C to the point where you can get something out of Cython is pretty easy. C is much easier to pick up than the HUGE C++. Here is a video you can watch in an afternoon that gives you the general gist and covers the basics quickly. https://www.youtube.com/watch?v=KJgsSFOSQv0
I hear you on Go. I'm a Python dev at work and love Python; I've dug into PEP8 and understood the tradeoffs made for the sake of simplicity. Now, after years of not touching Go, I've gone back to it and it feels soooo familiar. It truly feels like a complement for any serious Python developer to find Go. Now I have two go-to languages for prototyping.
Superficially, yes. Under the hood, though, Julia has a type system that interacts with the compiler in a very specific way, making it possible to write highly optimized dynamic code without the overhead of a VM (is that the right term?) like Python's.
Lots of small issues, some of which may have changed since I last checked. I haven't looked at Pyre/Pytype yet. These are just a handful of issues with mypy that spring to mind:
* Poor syntax support in the language. E.g., you have to define TypeVars as variables in a scope outside the unit you want to use them in, potentially conflicting with other generic functions. There's no clear guidance in the documentation about how the type system interprets these, either.
* No recursive types; e.g., can't define a JSON type
* Some callables can't be typed. I'm thinking they're those with kwargs and/or args, but I forget the particulars.
* Unions aren't true ADTs; the only way to define single-value variants (e.g., "null" in a JSON type) is to do something verbose and hokey, like creating a type that can only have one value.
* Lots of common libraries can't be spec'ed and therefore can't be typechecked. Things like SQLAlchemy which generate their types/classes at runtime, for example.
* Generally, the implementation is buggy even for mundane cases
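The first two bullets can be made concrete. The snippet below runs fine as plain Python, but illustrates what the commenter is describing: TypeVars live at module scope rather than inside the functions that use them, and a natural recursive JSON alias was (as of the time of the comment) something mypy couldn't check:

```python
from typing import Dict, List, TypeVar, Union

# TypeVars must be declared as module-level variables, outside the
# functions that use them, so they can end up shared across functions.
T = TypeVar("T")


def first(items: List[T]) -> T:
    return items[0]


# A natural recursive alias for JSON. Valid at runtime, but the
# mypy of that era rejected recursive type aliases like this one.
JSON = Union[None, bool, int, float, str, List["JSON"], Dict[str, "JSON"]]

print(first([1, 2, 3]))  # 1
```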
Rust is great for lots of reasons, but it's not remotely what I think of as Pythonic. Rust has a steep learning curve and requires a much deeper understanding of memory management than Python, and it's also not much smaller.
Haskell is functionally pure with a syntax that I just can't bring myself to call readable. It's also very complex--which string type do I use? Which prelude? Which compiler extensions? Which build tools? Which testing library? I think if someone went and made a Haskell-lite and really focused on polishing the experience (including a familiar syntax, simple/straightforward tooling, relaxing the functional purity requirement, etc) it would be a great advancement.
I think this issue is way overblown. The reasons for choosing a particular string representation in Haskell are analogous to the reasons to choose among e.g. byte arrays, streams, string builders, etc. in mainstream programming languages.
> Which prelude?
There is a standard prelude. Unless you actively choose another one, that's the one you'll get.
> Which compiler extensions?
If you want to use certain advanced language features, enable the appropriate extensions.
> Which testing library?
Many mainstream programming platforms (e.g. .NET) have multiple popular testing libraries to choose from. Is this a bad thing?
> If you want to use certain advanced language features, enable the appropriate extensions.
Your answers typify a level of comprehension of Haskell that appears to occupy the brain space where empathy would otherwise be. The whole POINT is that there are complex language features which have to be enabled if you want them: how does anyone know a priori, in a new situation, what to enable and what to disable? What side effects? What consequences?
I loved learning Haskell, but to pretend its syntactic simplicity translates to simplicity in all things is to misunderstand it.
GCC has a million -W compiler options; do you think every C programmer knows them all? Do you think every cat user knows why we joke about cat -v?
You have mastered Haskell and forgotten what lack of complete understanding means to anyone else. Mastering FORTRAN or pascal or lisp was trivial by comparison.
Mastering the underlying concepts of recursion, and tail recursion, and typing systems, and then integrating optional language features is not trivial.
There's a lot here I would like to respond to, but I'll limit myself to a couple points:
> The whole POINT is that there are complex language features which have to be enabled if you want them
Yes, "if you want them". If you value having a simpler language, don't use them. And Haskell is hardly unique in this regard. You've mentioned GCC, but I think the Babel transpiler is another good example.
> You have mastered Haskell and forgotten what lack of complete understanding means to anyone else.
I've managed to teach some Haskell to my daughter, who had no prior exposure to programming languages whatsoever, so I doubt your assertion is entirely true.
The hardest programming language I ever learned was the first one. Each one after that was easier than the one before...until I got to Haskell. It was so different that I was forced to go back to a more fundamental understanding of the nature of code and computation and start from there. It felt pretty challenging, especially at first, but I think that had more to do with my perspective and experience coming into it than the language itself (and the fact that I had a family and career at that point didn't help).
These were examples of the choices one has to make when using Haskell; it is far from exhaustive. The point is that IMO Haskell undervalues consistency and standardization.
I can only directly address the examples you've given. If you disagree with how I've characterized those examples, I'd be interested to know why. If you have other examples, I'd be interested to hear them.
Haskell has its weaknesses, e.g. the lack of quality tooling available as compared to mainstream programming languages.
But I disagree that Haskell is complex, at least as a criticism. When expressing concepts of a similar complexity, I find Haskell to be particularly concise and expressive as compared to most other programming languages I am familiar with.
And if Haskell undervalues consistency and standardization, I would like to know as compared to what? The only programming languages that I know of where there are not many reasonable choices for e.g. a testing framework are those that either (a) haven't been around that long, or (b) haven't seen wide adoption.
I have never seen non-trivial Haskell that didn't enable at least 1 compiler extension. Laziness is difficult to reason about. Purity makes easy things hard in exchange for making hard things easy. You can't even make it through the standard library documentation without bumping into category theory.
Haskell is many things. Simple is not one of them.
> I have never seen non-trivial Haskell that didn't enable at least 1 compiler extension.
Language extensions are idiomatic Haskell. There is only one mainstream compiler used everywhere. Widely used extensions are a natural result when the compiler, as a testbed, can outpace the language standard.
Enabling extensions is as trivial as importing a standard library package. (In fact, the extensions are often better documented than the standard library, as you hint at.)
Honestly, I’m content to agree to disagree. I’ve had this conversation too often to repeat it here. The criticisms aren’t novel; I’m sure they’re easily found elsewhere on the Internet.
OCaml has no ability to do ad-hoc polymorphism at all (though it does have parametric polymorphism), which affects more things than operators; that's why you have not only the weird math operators but also things like string_of_int.
This will change at some point, though. Modular implicits are coming to OCaml. These are similar to, but much simpler than, Scala's implicits, and allow the same kinds of patterns. If OCaml gets that and eventually gains a bigger community with better library coverage, it's something I would consider using heavily.
That would indeed be good yeah - it sounds like a small thing but it really would help code readability quite a bit, especially when dealing with stuff like bignums.
I knew the thread would head in this direction. :) Unfortunately, those languages' syntaxes are only marginally better and they have more complexity than I care for (especially objects and inheritance).
Given your "especially" comment -- you might be mis-remembering something about OCaml? It's true that it has these features, but in practice they are almost never used. The object system is useful if your program's design really needs open recursion / inheritance. But it's far more common to build an OCaml program using other constructs (modules and simple data types).
That's not to say that there isn't any complexity in OCaml. The type system and module/functor system have a daunting number of features. But most applications stick to a much simpler subset of the language, which is quite straightforward and easy to reason about.
Please don't relax the functional purity requirement. Instead, just use an algebraic effects library in place of monads as the "sin bin", guaranteeing that anything fitting the requirements to be an effect commutes with other effects, but still requiring them to be marked in the type.
That's not meaningful to me (I'm a Haskell novice), but the thing I care about is that the compiler doesn't reject correct code (and no, "correctness" is not defined by the compiler, it's defined by the desired behavior). Whether that property comes from relaxing functional purity or some library, I don't much care.
There's a language community investigating those ideas here: https://www.eff-lang.org/ . As they brag:
>Effects are thus first-class citizens of Eff and can be seamlessly combined. There is no need for the do notation, no need for monad transformers, and no need to reshuffle your whole program just to read a global flag.
I guess it is just a matter of taste. Clojure is the reason I have been writing software and it helped me to achieve things that would be very difficult without. It just has the right set of tools for a guy like me with systems engineering background. I find it much more understandable to use map + reduce than for loops, async instead of fork + threading code. The out of the box performance of JVM is also one reason I keep using Clojure.
Could you elaborate on this? Personally, I like the idea of Haskell about as much as I like the idea of Clojure, but Haskell has never quite clicked for me.
I don't think it's necessarily readability holding me back from Haskell, but I do find Clojure to be highly readable which really helped me get up to speed with it quickly. I'm curious what you find more readable about Haskell.
Haskell has a higher learning curve. It takes effort to pick up. There is no "learn Haskell in 5 minutes." But its syntax is incredibly consistent and beautiful once learned.
Clojure has an incredibly simple syntax, but that isn't so much because it removed the complexity as that it moved the complexity away from the syntax, imho.
If you spend enough time in Clojure, it becomes quite readable. With threading macros and the like, there's no real reason you need many more parens than a language like Python.
This is pretty cool. I'm not a huge Python fan, but I wind up using it quite a bit for ML tasks. Functional programming in Python is essentially impossible, between nearly everything being mutable and the terrible one-line lambdas. I know Python is all about "there should be only one way to do it," but that typically doesn't map (ahaha) at all to how I think.
I totally agree. I also have to use Python for machine learning, which has been my primary work activity for years, but Python does not click with me. I have started using MyPy at home but we don’t use it at work. I am so much happier writing code in Common Lisp, Scheme, Haskell, and Ruby.
Looking at the example code, it seems that this is perl-ifying Python. And I'm not saying that as a good thing. The code is more terse and less readable & understandable.
Some of those are much-needed constructions (like algebraic data types with pattern matching), but this adds an entire sub-language to a language built on the goal of being simple. I'm not sold on it either.
This is the reason why all of these "mashup" languages are so pointless.
Everyone feels like X language is so great but if only we could add a feature from Y language to it. They then attempt to trivialize what is needed for that feature not realizing that the benefits of Y are due, in part, to the non-trivial amount of functionality. They almost always turn into a façade of the feature they truly want to support and they never gain traction because their usefulness just really isn't there.
I could be missing something but saying this language has ADTs is stretching the definition of ADT a bit much for me. ADTs are defined by combining simpler types (usually with sum and product types) but that does not seem to be possible in Coconut.
Yes, it's stretching, but there is a section for it.
The sum types are created by multiple data declarations joined by directives. It's ugly, messy, error-prone, and, as your sibling comment says, misses the point of enabling static verification.
I'm a big fan of ADTs and pattern matching, but I hardly see the point in a dynamic language. I also like Haskell because of its type safety and _in spite of_ its syntax, so this sort of feels like the worst of all worlds to me.
It's actually quite possible to write 'pythonic' Haskell, if by pythonic you mean 'kind to human readers'. Many of the obscure `<*>`-style infixes in Haskell have prefix counterparts (for `<*>`: liftA2 id). Unfortunately, infix is preferred by most Haskellers, and the prefix English-language forms derive their terminology from advanced branches of mathematics, which makes them difficult to understand and use intuitively without sufficient background.
That said, all of these things are symptoms of the preferences of the community around the language, not restrictions imposed by the language design itself. Core Haskell actually consists of some pretty easily understandable functions. You can craft synonyms for more advanced abstractions such as Applicative fairly easily once you understand the type system. Haskell more or less gives you everything you need to write functional programs in a way that reads like plain English--this is very uncommon, however, given the language's origin and close connection with mathematics.
On the other hand, once I did get used to some of the infix operators, I started to really like some of them, e.g. how:
f data data
becomes
f <$> fancyData <*> fancyData
where what makes the data fancy might be validation, being in a container like a list, or even being some sort of reactive UI component, as the Flare library for PureScript does: http://try.purescript.org/?backend=flare
Coconut seems anti all of those. In particular all of the weird infix operators ( ` $.> `, ` :&$ `, etc) [1]. Gives me flashbacks to dealing with Haskell.
[1]: I just made those up, but I wouldn’t be surprised if they were real.
When talking about a language it usually means the design principles summarized in "The Zen of Python"[1]. When talking about Python code it can also mean "idiomatic".
I've done my fair share of Elixir, and this looks like you hit the nail on the head. Elixir is a great language, but I think Python is Python and Elixir is Elixir. We shouldn't be complicating Python any more than it already is (not that Python is complicated; it's just that adding another language's functionality is another hurdle to learn).
I haven't used Coconut beyond some dabbling last time it made HN's frontpage, so someone with more experience may be able to provide a better answer. However, the syntax is based on Python 3, and they use their own builtins for both Python 2 and 3 to maintain compatibility. Some functionality can't be back-ported to Python 2 (tuple unpacking with *, f strings, etc.), and there is more info about that at the link below.
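For reference, these are the two Python-3-only features mentioned above; neither has a Python 2 equivalent, which is why they can't be back-ported:

```python
# Extended tuple unpacking with * (Python 3 only).
first, *rest = [1, 2, 3, 4]

# f-strings (Python 3.6+ only).
name = "Coconut"
greeting = f"hello, {name}"

print(first, rest, greeting)  # 1 [2, 3, 4] hello, Coconut
```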
That's clever and horrifying. Clever because writing a regex-based preprocessor for a complex, dynamic language that adds TCO and actually works in most cases is impressive. Horrifying because... it's a regex-based preprocessor that rearranges arbitrary function code in a complex, dynamic language.
"Pythonic functional" seems like a contradiction of terms, considering python's infamous hostility (e.g. http://fold.sigusr2.net/2010/03/guido-on-functional.html) to functional programming seems like a key part of what it means to be pythonic.
That said, it looks a lot more pleasant than Python. The addition of pattern matching alone makes this seem like a very worthwhile project, a big improvement on Python.
That quote is hilarious -- Guido van Rossum claims that it's impossible to implement `reduce` in a few lines in a functional language. Just out of curiosity, I checked the `reduce` implementation in Elixir:
    def reduce(enumerable, fun) do
      result =
        Enumerable.reduce(enumerable, {:cont, :first}, fn
          x, :first -> {:cont, {:acc, x}}
          x, {:acc, acc} -> {:cont, {:acc, fun.(x, acc)}}
        end)
        |> elem(1)

      case result do
        :first -> raise Enum.EmptyError
        {:acc, acc} -> acc
      end
    end
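For comparison, a bare-bones reduce is also only a few lines of plain Python. This is a sketch, not the stdlib `functools.reduce` (note it uses the `fun(acc, x)` argument order, the reverse of the Elixir version above):

```python
def reduce(fun, iterable):
    # Like Elixir's Enum.reduce/2: the first element seeds the
    # accumulator, and an empty iterable is an error.
    it = iter(iterable)
    try:
        acc = next(it)
    except StopIteration:
        raise ValueError("reduce of empty iterable") from None
    for x in it:
        acc = fun(acc, x)
    return acc

print(reduce(lambda a, b: a + b, [1, 2, 3, 4]))  # 10
```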
It's clear from that quote that Guido (at least at that time) doesn't have real experience with functional programming and is letting his personal biases cloud his judgement.
Okay, maybe "Scheme isn't a functional language because it allows side effects, and only Haskell and family are allowed to call themselves functional," or whatever. He seems to perhaps hint at that mentality. But the fact remains that a language like Scheme makes this much easier than Python does.
I think you are conflating Guido's attitude to FP with Python too much here. All the FP primitives in Python were championed by other people. In your linked blog post, Guido is talking about purely functional languages and how Python, like Lisp, is not like that. Which is true but doesn't make Python bad for (or "hostile" to) writing functional code.
In fact Python taught a generation of programmers to love functional code, because of the big speedup available from stringing together C-native builtins with higher order functions.
There is "not purely functional", and then there is Python. Python does more to discourage FP than other "not purely functional" languages. Guido may no longer be Python's BDFL, but it's clear that during his tenure he had little interest in making Python better for FP. He accepted contributions (like the addition of lambda) from other people but never did anything to champion FP in Python, and it really shows.
Do you feel Python discourages FP more than eg C++, or JavaScript? I'd call Python more functional than either of those, and a little less functional than Common Lisp.
It's capable of functional programming in the sense that functions are first class values that you can pass around, but it is infamously hostile to functional programming. For example, python still doesn't have multi-line lambdas, and the justifications for why always boil down to it being "unpythonic".
Allowing to elide the add42 from "def add42(x): x+42" makes the difference between encouraging FP or not?
I've actually started wishing for the opposite change in my use of Clojure. There, it is currently optional to supply a name for the fn form. Always supplying the name, like in "(fn add-42 [x] (+ x 42))", would make stack traces more readable and the named function object would be easy to identify when it appeared in the REPL / pretty-printed data structures.
> "Allowing to elide the add42 from "def add42(x): x+42" makes the difference between encouraging FP or not?"
Not being able to define a two-line function in the place where it's used because you have to give it a name discourages FP in a way many other modern languages do not.
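A concrete instance of the complaint: as soon as the body needs a statement, you're forced into a detour through a named def, even for a function used exactly once (the names below are illustrative):

```python
import logging

# What you might want to write inline, but can't, because a Python
# lambda body must be a single expression:
#   sorted(users, key=lambda u: (log the record, then return u["name"]))

def _key(u):
    logging.debug("sorting %s", u)
    return u["name"]

users = [{"name": "b"}, {"name": "a"}]
result = sorted(users, key=_key)
print([u["name"] for u in result])  # ['a', 'b']
```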
In many cases this may make code even more readable.
I'm not saying that Python's lambdas are perfect as they are or that multi-line lambdas would be a bad thing. I'm just saying it's not as bad as you paint it.
Python really took the direction of preferring list/generator comprehensions for this kind of thing, encouraging their use over both imperative loops and basic HOFs like map/filter. List comprehensions are declarative and hence functional constructs.
Generators also get you another FP checkbox because you now have laziness in a very accessible form. You get building blocks like infinite sequences and so on. And you have access to all the normal primitives like reduce, take, partial, groupby etc in the itertools/functools std modules.
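The building blocks mentioned above are all in the standard library; for instance:

```python
from itertools import count, islice
from functools import reduce

# An infinite lazy sequence of squares, consumed on demand.
squares = (n * n for n in count(1))
first_five = list(islice(squares, 5))
print(first_five)                         # [1, 4, 9, 16, 25]

# The usual FP primitives live in functools/itertools.
total = reduce(lambda a, b: a + b, first_five)
print(total)                              # 55
```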
'More readable' is subjective. It's clear that Guido and much of the Python community believe that FP is less readable and less pythonic, and that viewpoint is exactly the sort of hostility towards FP that I'm talking about. In no small part this is a self-fulfilling attitude, e.g., "Python lambdas are ugly because Pythonistas believe FP is ugly." They're way uglier in Python than in other languages. The ugliness of Python's lambda is a trait of Python, not a trait of FP, yet its ugliness is cited to justify itself.
> when a for loop would be faster
Hey, that's fine. But for the sort of things I'm working on, that's usually deep into "premature optimization" territory. I'd rather have a slight linear slowdown with more readable code than a slight linear speedup with less readable code. I'd prefer that Python not "subtly discourage" me, nudging me towards less readable code just because a list comprehension is marginally more performant.
I like the idea but am not fond of the execution, particularly around the aesthetics.
One of the reasons I love Python is its syntactic brevity and conciseness. Once you get to intermediate Python there are some weird forms of syntactic sugar (magic functions and unpacking, for instance), but visually the language has less clutter than most popular languages. Coconut seems to remove exactly that aspect.
Is it just me who finds the adjective "Pythonic" unnecessary and somewhat elitist? Let's be honest: there are no counterparts for other languages (Java as "Javish"?! or JS as "JSish"?!), and "Pythonic" seems to be used by only a subset of the Python community. Basically, it means idiomatic.
I think that "pythonic" does mean something, and that when they say "pythonic" they are actually trying to communicate something.
JS and Java... well they're not something I think a lot of programmers want to be. JS has a pretty complicated ecosystem that a lot of developers don't like. Java is good at what it's good at, but it's not very fun.
What I see a lot of is "lisp" or "lisp-like" and "pythonic". Those languages are notable for being considered fun by a lot of programmers.
But yes, it's idiomatic. Python is an idiomatic programming language that has sacrificed a lot for that ideology. Now that Guido is gone it's less ideological, and we're getting things like type hinting.
When people try to apply that same ideology to other fields, sometimes they call it pythonic. When they try to apply the lisp ideology to other fields, they just call it a lisp dialect instead of lisp-ish.
> JS and Java... well they're not something I think a lot of programmers want to be.
I actually much prefer modern JS to modern python. JS is flexible enough to allow a functional-ish style, which greatly improves code clarity in my opinion. And of course you can still write python-style code if you want to. Python is a lot more opinionated!
That's right - Guido has been very much in favour of adding type hints to Python.
Are type hints Pythonic? I personally don't think so - to me, they don't match the philosophy of the rest of the language. On the other hand, one could argue that the definition of Pythonic is simply "whatever Guido likes".
> Let's be honest, there are no other language counterparts (Java as Javish?! or JS as JSish?!) for such adjective
I've seen it for other languages, including those two, and denying the term won't deny the concept.
I've seen Perl, Python, and JavaScript written like C, Java written like Perl, etc. It's always a bad idea - each language has strengths in its ability to express concepts to other devs, partly in syntax and partly in conventions, and swimming upstream against those never helps anyone. (Evolution is great; chaos is not.)
None of which says that these terms CAN'T be used in an elitist way, but I don't see that as their primary purpose or use (at least in the circles I'm exposed to).
Meaning, as an example: Java and JS aren't known for their constant references and inside jokes relating to Monty Python. That is part of the culture and community around Python.
Saying something is Pythonic instead of saying it is idiomatic, is itself, Pythonic. It is part of the culture grown up around the language. And that's okay.
To be fair, back in the day, both Java and C# were intended to be C++-ish. Javascript itself was famously remodelled to be Java-ish. When Coffeescript came out, it intentionally modelled itself after the brevity of Python. There are lots of Lisp derivatives and Forth-likes (Forth-like is even a pretty established term). ML derivatives are popular. I don't think it's just Python.
Edit: Of course I should have read TFA before replying. Of course this is not a Python like language, but a language that compiles to Python. I agree the cute name is confusing, but if the point is to compile to a specific language, saying what that language is is pretty important ;-)
I find 'Pythonic' more of a higher level philosophy about what "good python style" looks like more than elitist, but I can see how it can be warped by an elitist mindset.
Whenever I write Python I try to keep it 'pythonic' and that mainly means keeping things simple, very explicit, and concise. These are obviously traits that all developers want their code to have, but sometimes the syntactic sugar of a language can make it difficult, I think the Pythonic philosophy provides guardrails against this.
But Coconut seems to be a superset of Python (all valid Python is valid Coconut). Seems like a very good and succinct adjective to use to attract Python users who otherwise wouldn’t be interested in a new language.
I think it stems from Python's "There's only one right way to do things" philosophy. You can write Python that looks almost identical to Java, but that will likely be seen as overly verbose and complex in Python circles. I don't know if this is elitism, or just a desire for consistency and brevity among tools and libraries
I've seen C-like, Rusty, and ___Script (meaning related to JS). Lisp is a family. I'm not sure why you think "Pythonic" is some new vein of elitism, because to me it just seems like it's telling people that Coconut is easy to learn if you know/like Python already.
That said, it immediately told me that I wouldn't be interested in Coconut, so I also found the label valuable as a non-Python person.
Alternatively, the toolz package ( https://toolz.readthedocs.io/en/latest/ ) is a nice way of getting some additional functional programming capabilities while using the standard CPython interpreter.
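For a taste of the style toolz enables: its `pipe` threads a value through a sequence of functions. Below is a stdlib-only stand-in with the same shape (so the snippet runs without installing toolz; the real `toolz.pipe` behaves this way):

```python
from functools import reduce

def pipe(value, *funcs):
    # Thread value through each function in turn, like toolz.pipe.
    return reduce(lambda acc, f: f(acc), funcs, value)

result = pipe(
    range(10),
    lambda xs: (x * x for x in xs),
    lambda xs: filter(lambda x: x % 2 == 0, xs),
    sum,
)
print(result)  # 120
```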
I hate to be negative and I appreciate this may be due to the requirement to make Coconut a superset of Python, but the pattern matching and the partial application look anything but elegant (or Pythonic) to me.
Just came back to python after several years in javascript. I've found that the functions you need are often already provided in the standard library (for example the `operator` module[1]).
The Python equivalent of your JavaScript example is simply:
    def double(x): return x * 2
Python's `lambda` construct creates anonymous functions - its limitations mean it should only be used when anonymity is required. If a function is to be given a name, there is no reason to use a lambda over standard function syntax.
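An example of the point above: the `operator` module covers most of the lambdas people reach for:

```python
from functools import reduce
import operator

# operator.mul replaces lambda a, b: a * b
product = reduce(operator.mul, [1, 2, 3, 4])
print(product)  # 24

# operator.itemgetter(1) replaces lambda pair: pair[1]
pairs = [("a", 3), ("b", 1), ("c", 2)]
print(sorted(pairs, key=operator.itemgetter(1)))  # [('b', 1), ('c', 2), ('a', 3)]
```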
I don't think so; Guido has a long history of being against FP, and the core of the language is statement-oriented rather than expression-oriented. Besides that, the "->" operator in Python is already used as the return type annotation for functions. For the record, Julia uses exactly this syntax for lambdas: `x -> x + 1`.
I've heard the argument that you ought to consider writing a named function and using it where the lambda would have gone. Self documents and cuts down on lines of symbol soup.
Python wants to be broadly good enough for everyone, not perfect for any one person's specific domain and expertise level. I think this might be the right attitude.
This is basically what R offers: both -> and <- assignment operators. The addition of a piping operator in Coconut is great, too, for people familiar with dplyr.
Which, if you can stand the parens, is a good argument for lisps or a good macro system in general. Clojure has arrow macros which are trivial to implement using procedural macros.
The first question that comes to mind is, what is wrong with using the JVM or the CLR as a compilation target for such a language. The only thing that I can think of is that they want to exist on the Python platform, which makes some sense. Python is taking over the computing industry and programming education and it does have a lot of libraries. Perhaps Python should have a standard Python virtual machine, but that almost certainly is never going to happen, for multiple reasons. So here you are stuck with a transpiler. A transpiler whose purpose is to make a functional language compile to a language which was explicitly designed not to be functional at all.
Yes. It is a bit of a head scratcher why people would be going to enormous effort to transpile to an execution environment that is as gimped as Python's. I use Groovy pretty much as "python for the JVM", and a big reason for that is simply that Python's runtime limitations are unacceptable to me from a basic engineering standpoint (primarily, GIL). I guess Python's ecosystem is big enough now that the selling point of "you can use all your favorite libraries" is enough of a drawcard?
There are two things you can do to ensure that the transpiled Python is as fast as possible. First, disable TCO (while leaving TRE enabled) with the `--no-tco` argument to the compiler. Second, make sure you set the target version of Python correctly. This will allow the compiler to make a number of optimizations that are specific to different Python versions.
Overall, for the majority of use cases I have found Coconut code to be approximately as performant as Python code written by hand. The compiler speed can leave a little to be desired though.
How are the "algebraic data types" algebraic if they don't express any relation to each other?
This seems like a cool syntactic layer for people who are already writing Python and wishing for a more functional style. However, to call this a functional language would betray the definition.