Coconut: Pythonic functional programming (coconut-lang.org)
428 points by andybak on Jan 3, 2019 | 169 comments



This seems like a neat project but I can’t say I ever wanted Python to be an even bigger language... Ideally we would have a Python-like that eschews unnecessary features like inheritance and much of the magic that makes Python hard to optimize (and often hard to understand), and with a type system (even an optional type system) that is better than Mypy.

I’m not 100% sure what that would look like, but I would ditch classes (and thus inheritance) for named tuples, make most things immutable by default, include enums and pattern matching (that the type system would check; I hardly see the point in dynamic pattern matching). If possible, typed code would compile to C extensions or similar. Oh, and exceptions would go away in favor of a Rust-like Result type.


You should check out Nim; it's a Python-like language that compiles to C.


I also recommend looking at Nim. It has Python-like syntax, but it feels very different from Python to me when using it. Stellar language, though, that has been getting a lot of attention!


Thanks for the recommendations, folks. I'll take a look!


Whoa - Nim seems really cool! Thanks for mentioning it. I like that it also compiles to JS, and seems to be pretty active (latest release was just on Dec 31).


Isn't Nim, like Python, a large, complex language? Looking at the documentation, it seems to include a huge number of features.


It is a little large, yes. The creator's goal was always to create a language with a small core but a rich set of features for extending the language (metaprogramming). Nim has bloated a little and we're now trying to remove as many features as we can without annoying too many people :)


Glad to hear it. I've been skimming the documentation and definitely got the impression that it is a feature-rich language to put it politely :). In the words of Bjarne Stroustrup, it seems like there is a small core language that wants to get out. I imagine you could create a linter that catches uses of extraneous features and foster a culture of "Good nim code passes this linter"--you could create badges for repositories that show the percent of the code that adheres to the linter or something. Over time, you could deprecate and then remove these extraneous features.


This is part of what turned me away from Nim, so it's good to hear that. Can you give some examples of some features you've removed recently? I notice you're not at 1.0, so are you comfortable ignoring the 'annoying too many people' part if you think there's enough of a reason to remove something?


One thing I can think of is the likely little known {.this.} feature, that's been removed. We definitely need to do more, and yeah, it's hard :)


Ever tried out Cython?

https://en.wikipedia.org/wiki/Cython

There's also RPython, which is a restricted subset of Python and what PyPy is written in.

https://en.wikipedia.org/wiki/PyPy#RPython

Edit: on an unrelated note, sounds like you'd love Go.


I'm aware of Cython and RPython. RPython is super cool but it isn't really a general purpose programming language. Cython is superficially similar to what I want, but it's always put me off; not sure why off the top of my head. I do love Go, however. :)


Cython is basically C in Python syntax with some convenience features.

Unless you know C it's not always worth using. Without doing any "C stuff" (i.e. just writing Python and compiling with Cython) you get about a 50% speedup in tight loops; sometimes that's good enough, other times it's not.

To go any faster than that, you basically need to start writing C code, using the CPython API, using Cython's badly under-documented features for doing so.
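
For illustration, here's roughly what the typed path can look like using Cython's pure-Python mode (a sketch, not benchmarked; the dot example is made up, and the cython.* annotations assume a reasonably recent Cython -- the same file still runs as ordinary Python):

    # Pure-Python-mode Cython sketch: this file is valid Python as-is, but when
    # compiled with cythonize, the cython.* annotations let the compiler emit a
    # typed C loop instead of generic object code.
    import cython

    def dot(a, b):
        total: cython.double = 0.0   # C double instead of a Python float object
        i: cython.Py_ssize_t        # C integer loop index
        for i in range(len(a)):
            total += a[i] * b[i]
        return total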


Learning C to the point where you get Cython is pretty easy. C is much easier to pick up than the HUGE C++. Here is a video you can watch in an afternoon that gives you the general gist and covers the basics quickly. https://www.youtube.com/watch?v=KJgsSFOSQv0


For me Cython truly shines when interfacing to C++.


I hear you on Go. I am a Python dev at work and love Python, have dug into PEP 8 and understood the trade-offs made for the sake of simplicity. Now, after years of not touching Go, I go back to it and it feels soooo familiar. It truly feels like a complement to any serious Python developer to find Go. Now I've got two go-to languages for prototyping.


Julia is very much like this.


Superficially. Under the hood, though, Julia has a type system that interacts with the compiler in a very specific way that makes it possible to write highly optimized dynamic code without the overhead of a VM (is that the right term?) like Python's.


What's wrong with Mypy? And have you looked at Pyre/Pytype?


Lots of small issues, some of which may have changed since I last checked. I haven't looked at Pyre/Pytype yet. These are just a handful of issues with mypy that spring to mind:

* Poor syntax support in the language. E.g., you have to define TypeVars as variables in a scope outside of the unit you want to use them in, potentially conflicting with other generic functions. No clear guidance from the documentation about how the type system interprets these, either. (See the sketch after this list.)

* No recursive types; e.g., can't define a JSON type

* Some callables can't be typed. I'm thinking they're those with *args and/or **kwargs, but I forget the particulars.

* Unions aren't true ADTs; the only way to define single-value variants (e.g., "null" in a JSON type) is to do something verbose and hokey, like creating a type that can only have one value.

* Lots of common libraries can't be spec'ed and therefore can't be typechecked. Things like SQLAlchemy which generate their types/classes at runtime, for example.

* Generally, the implementation is buggy even for mundane cases
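
To make the first couple of those concrete, here's a sketch using only the stdlib typing module (the names are illustrative; the recursive alias is the part mypy rejected at the time of this thread):

    from typing import Dict, List, TypeVar, Union

    # TypeVars have to be declared as module-level variables, outside the
    # function that actually uses them.
    T = TypeVar("T")

    def first(items: List[T]) -> T:
        return items[0]

    # The natural way to type JSON is a recursive alias like this; it runs
    # fine, but mypy at the time did not support recursive type aliases.
    JSON = Union[None, bool, int, float, str, List["JSON"], Dict[str, "JSON"]]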


This sounds like Rust to me...


Rust is great for lots of reasons, but it's not remotely what I think of as Pythonic. Rust has a steep learning curve and requires a much deeper understanding of memory management than Python, and it's also not much smaller.


I feel like you mostly just described Haskell, no?


Haskell is functionally pure with a syntax that I just can't bring myself to call readable. It's also very complex--which string type do I use? Which prelude? Which compiler extensions? Which build tools? Which testing library? I think if someone went and made a Haskell-lite and really focused on polishing the experience (including a familiar syntax, simple/straightforward tooling, relaxing the functional purity requirement, etc) it would be a great advancement.


> which string type do I use?

I think this issue is way overblown. The reasons for choosing a particular string representation in Haskell are analogous to the reasons to choose among e.g. byte arrays, streams, string builders, etc. in mainstream programming languages.

> Which prelude?

There is a standard prelude. Unless you actively choose another one, that's the one you'll get.

> Which compiler extensions?

If you want to use certain advanced language features, enable the appropriate extensions.

> Which testing library?

Many mainstream programming platforms (e.g. .NET) have multiple popular testing libraries to choose from. Is this a bad thing?


> Which compiler extensions?

> If you want to use certain advanced language features, enable the appropriate extensions.

Your answers typify a level of comprehension of Haskell that appears to occupy the same brain space empathy would. The whole POINT is that there are complex language features which have to be enabled if you want them: how does anyone know a priori in a new situation what to enable and what to disable? What side effects? What consequences?

I loved learning Haskell, but to pretend its syntactic simplicity translates to simple in all things is to misunderstand.

GCC has a million -W compiler options; do you think every C programmer knows them all? Do you think that every cat user knows why we joke about cat -v?

You have mastered Haskell and forgotten what lack of complete understanding means to anyone else. Mastering Fortran or Pascal or Lisp was trivial by comparison.

Mastering the underlying concepts of recursion, and tail recursion, and typing systems, and then integrating optional language features is not trivial.


There's a lot here I would like to respond to, but I'll limit myself to a couple points:

> The whole POINT is that there are complex language features which have to be enabled if you want them

Yes, "if you want them". If you value having a simpler language, don't use them. And Haskell is hardly unique in this regard. You've mentioned GCC, but I think the Babel transpiler is another good example.

> You have mastered Haskell and forgotten what lack of complete understanding means to anyone else.

I've managed to teach some Haskell to my daughter, who had no prior exposure to programming languages whatsoever, so I doubt your assertion is entirely true.

The hardest programming language I ever learned was the first one. Each one after that was easier than the one before...until I got to Haskell. It was so different that I was forced to go back to a more fundamental understanding of the nature of code and computation and start from there. It felt pretty challenging, especially at first, but I think that had more to do with my perspective and experience coming into it than the language itself (and the fact that I had a family and career at that point didn't help).


These were examples of the choices one has to make when using Haskell; the list is far from exhaustive. The point is that, IMO, Haskell undervalues consistency and standardization.


I can only directly address the examples you've given. If you disagree with how I've characterized those examples, I'd be interested to know why. If you have other examples, I'd be interested to hear them.

Haskell has its weaknesses, e.g. the lack of quality tooling available as compared to mainstream programming languages.

But I disagree that Haskell is complex, at least as a criticism. When expressing concepts of a similar complexity, I find Haskell to be particularly concise and expressive as compared to most other programming languages I am familiar with.

And if Haskell undervalues consistency and standardization, I would like to know as compared to what? The only programming languages that I know of where there are not many reasonable choices for e.g. a testing framework are those that either (a) haven't been around that long, or (b) haven't seen wide adoption.


I have never seen non-trivial Haskell that didn't enable at least 1 compiler extension. Laziness is difficult to reason about. Purity makes easy things hard in exchange for making hard things easy. You can't even make it through the standard library documentation without bumping into category theory.

Haskell is many things. Simple is not one of them.


> I have never seen non-trivial Haskell that didn't enable at least 1 compiler extension.

Language extensions are idiomatic Haskell. There is only one mainstream compiler used everywhere. Widely used extensions are a natural result when the compiler, as a testbed, can outpace the language standard.

Enabling extensions is as trivial as including a standard library package. (In fact, the extensions are often better documented than the standard library, as you hint at.)


Honestly, I’m content to agree to disagree. I’ve had this conversation too often to repeat it here. The criticisms aren’t novel; I’m sure they’re easily found elsewhere on the Internet.


I agree, and the closest thing I've found to a Haskell-lite is F#.


OCaml is, imo, a pretty and not utterly impractical language as well.


Does it still do the thing where it doesn't have operator overloading?

Made math code oh so ugly.


OCaml has no ability to do ad-hoc polymorphism at all (though it does have parametric polymorphism), which affects more things than operators; that's why you not only get the weird math operators but also things like string_of_int.

This will change at some point, though. Modular implicits are coming to OCaml. These are similar to, but much simpler than, Scala's implicits, and allow the same kinds of patterns. If OCaml gets that and eventually gains a bigger community with better library coverage, it's something I would consider using heavily.


That would indeed be good yeah - it sounds like a small thing but it really would help code readability quite a bit, especially when dealing with stuff like bignums.


I knew the thread would head in this direction. :) Unfortunately, those languages' syntaxes are only marginally better and they have more complexity than I care for (especially objects and inheritance).


Given your "especially" comment -- you might be mis-remembering something about OCaml? It's true that it has these features, but in practice they are almost never used. The object system is useful if your program's design really needs open recursion / inheritance. But it's far more common to build an OCaml program using other constructs (modules and simple data types).

That's not to say that there isn't any complexity in OCaml. The type system and module/functor system have a daunting number of features. But most applications stick to a much simpler subset of the language, which is quite straightforward and easy to reason about.


Granted, objects are rare, and this isn't one of the things I consider to be a significant grievance about OCaml. :)


Fair enough. :)


Also has a syntax reskin in the form of ReasonML


Please don't relax the functional purity requirement. Instead, just use an algebraic effects library in place of monads as the "sin bin", guaranteeing that anything fitting the requirements to be an effect commutes with other effects, but still requiring them to be marked in the type.


That's not meaningful to me (I'm a Haskell novice), but the thing I care about is that the compiler doesn't reject correct code (and no, "correctness" is not defined by the compiler, it's defined by the desired behavior). Whether that property comes from relaxing functional purity or some library, I don't much care.


There's a language community investigating those ideas here: https://www.eff-lang.org/ . As they brag:

>Effects are thus first-class citizens of Eff and can be seamlessly combined. There is no need for the do notation, no need for monad transformers, and no need to reshuffle your whole program just to read a global flag.


On that topic Haskell is soooooooooo much more readable than clojure to me


I guess it is just a matter of taste. Clojure is the reason I have been writing software and it helped me to achieve things that would be very difficult without. It just has the right set of tools for a guy like me with systems engineering background. I find it much more understandable to use map + reduce than for loops, async instead of fork + threading code. The out of the box performance of JVM is also one reason I keep using Clojure.


Could you elaborate on this? Personally, I like the idea of Haskell about as much as I like the idea of Clojure, but Haskell has never quite clicked for me.

I don't think it's necessarily readability holding me back from Haskell, but I do find Clojure to be highly readable which really helped me get up to speed with it quickly. I'm curious what you find more readable about Haskell.


Haskell has a higher learning curve. It takes effort to pick it up. There is no learn Haskell in 5 minutes. But its syntax is incredibly consistent and beautiful once learned.

Clojure has an incredibly simple syntax, but that isn't so much because it removed the complexity but rather that it just moved it away from the syntax imho.


> There is no learn Haskell in 5 minutes.

True, but there is a learn Haskell in 10 minutes...

https://wiki.haskell.org/Learn_Haskell_in_10_minutes


If you spend enough time in Clojure, it becomes quite readable. With threading macros and the like, there's no real reason you need many more parens than a language like Python.


It is kind of funny when I make my programmer friends count parentheses in their code vs my Clojure.


I guess none of your friends uses F#.


Not really, but I like OCaml.


I can confirm; I have been writing Clojure for seven years. It's pretty nice to look at now.


mypy is great


This is pretty cool. I'm not a huge Python fan, but I wind up using it quite a bit for ML tasks. Functional programming in Python is essentially impossible, between nearly everything being mutable and the terrible one-line lambdas. I know Python is all about "there should be only one way to do it," but that typically doesn't map (ahaha) at all to how I think.


I totally agree. I also have to use Python for machine learning, which has been my primary work activity for years, but Python does not click with me. I have started using MyPy at home but we don’t use it at work. I am so much happier writing code in Common Lisp, Scheme, Haskell, and Ruby.


You say "ML" tasks. Do you mean StandardML or Ocaml?


My reading was "machine learning".


Looking at the example code, it seems that this is perl-ifying Python. And I'm not saying that as a good thing. The code is more terse and less readable & understandable.


It's haskellfying it.

Some of those are very needed constructions (like algebraic data types with pattern matching), but it's adding an entire sub-language into a language that is built on the goal of being simple. I'm not sold into it either.


This is the reason why all of these "mashup" languages are so pointless.

Everyone feels like X language is so great, if only we could add a feature from Y language to it. They then attempt to trivialize what is needed for that feature, not realizing that the benefits of Y are due, in part, to the non-trivial amount of functionality. They almost always turn into a façade of the feature they truly want to support, and they never gain traction because their usefulness just really isn't there.


I agree. Likely-unpopular corollary: the same is true of adding threads to JavaScript (server or client).


I could be missing something but saying this language has ADTs is stretching the definition of ADT a bit much for me. ADTs are defined by combining simpler types (usually with sum and product types) but that does not seem to be possible in Coconut.


Very disappointing, like why bother doing ADTs if you aren't going to do them right and have exhaustive pattern matching.


Yes, it's stretching, but there is a section for it.

The sum types are created by multiple data declarations joined by directives. It's ugly, messy, error-prone, and, as your sibling comment says, misses the point of enabling static verification.


I'm a big fan of ADTs and pattern matching, but I hardly see the point in a dynamic language. I also like Haskell because of its type safety and _in spite of_ its syntax, so this sort of feels like the worst of all worlds to me.


I'd like to see an actually pythonic Haskell. Needs the right designer.


It's actually quite possible to write 'pythonic' Haskell, if by pythonic you mean 'kind to human readers'. Many of the obscure <*>-style infixes in Haskell have prefix counterparts (for <*>: liftA2 id). Unfortunately, infix is preferred by most Haskellers, and the prefix English-language forms derive their terminology from advanced branches of mathematics, which makes them difficult to understand and use intuitively without sufficient background.

That said, all of these things are symptoms of the preferences of the community around the language, and not restrictions imposed by the language design itself. Core Haskell is actually composed of some pretty easily understandable functions. You can craft synonyms for more advanced abstractions such as Applicative fairly easily once you understand the type system. Haskell more or less gives you everything you need to write functional programs in a way that reads like plain English--this is very uncommon, however, given the language's origin and close connection with mathematics.


I remember reading an old post by Gabriel Gonzalez on exactly this topic: http://www.haskellforall.com/2015/09/how-to-make-your-haskel...

On the other hand, once I did get used to some of the infix operators, I started to really like some of them, i.e. how:

f data data

becomes

f <$> fancyData <*> fancyData

where what makes the data fancy might be validation, being in container like list, or even some sort of reactive-ui-component, like Flare library for Purescript does: http://try.purescript.org/?backend=flare


This statement makes me wonder: what does pythonic exactly mean?


The important part to me is good ergonomics from the point of view of the programmer and reader.

Beyond that, some particulars:

* Being terse is a lower priority than usual

* Cool features are a lower priority than usual

* Be careful how you use and then re-use symbols (Haskell can be confusing this way).

* Use English (or whatever spoken language) in your symbols


Coconut seems anti all of those. In particular all of the weird infix operators ( ` $.> `, ` :&$ `, etc) [1]. Gives me flashbacks to dealing with Haskell.

[1]: I just made those up, but I wouldn’t be surprised if they were real.


I've started to hate the phrase pythonic at work. It's ambiguous and leads to bikeshedding.


When talking about a language it usually means the design principles summarized in "The Zen of Python"[1]. When talking about Python code it can also mean "idiomatic".

[1] https://www.python.org/dev/peps/pep-0020/


Elixir-ifying Python, maybe? The only Perlism I recognize are the dollar signs, but that could be PHP.


I've done my fair share of Elixir, and this looks like you hit the nail right on the head. Elixir is a great language, but I think Python is Python and Elixir is Elixir. We shouldn't be complicating it any more than it already is (not that Python is complicated; adding another language's functionality is just another hurdle to learn).


It's got the pipe operator, but that's about it.


I always refer back to PEP 20 when thinking about these issues (and many things in life actually):

https://www.python.org/dev/peps/pep-0020/

"Explicit is better than implicit."


I recently gave a talk on Coconut in PyCon Bangkok: https://youtu.be/24DWw6Ozkvo


I really enjoyed your talk. Thank you for giving the talk and sharing the link!


Thank you for the talk and for sharing the link to this!


> And Coconut code runs the same on any Python version, making the Python 2/3 split a thing of the past.

My understanding was that the big issue was the fact that [third-party] libraries assume that strings were made of 8-bit characters.

How does coconut solve that issue?

Seems a little misleading....

if you import enough things from `six` or `__future__` then your code will run the same on both Python 2/3.

some things like fancy tuple expansion aren't available if you work with that subset, but yeah, most things are OK.
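
Roughly the kind of thing I mean (a sketch of that approach; assumes the third-party `six` package, no Coconut involved):

    # Plain Python that behaves the same on 2 and 3.
    from __future__ import absolute_import, division, print_function, unicode_literals

    from six import iteritems, text_type

    def describe(mapping):
        # print_function and text_type keep output identical across 2 and 3.
        for key, value in iteritems(mapping):
            print(text_type(key), "->", value)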


I haven't used Coconut beyond some dabbling last time it made HN's frontpage, so someone with more experience may be able to provide a better answer. However, the syntax is based on Python 3, and they use their own builtins for both Python 2 and 3 to maintain compatibility. Some functionality can't be back-ported to Python 2 (tuple unpacking with *, f strings, etc.), and there is more info about that at the link below.

https://coconut.readthedocs.io/en/master/DOCS.html#compatibl...


Thanks for the link.

I wonder how they are doing tail call optimization in python


Here is where TCO is implemented in the Coconut compiler code:

https://github.com/evhub/coconut/blob/07e311fb8f69861d30e58f...


That's clever and horrifying. Clever because writing a regex-based preprocessor for a complex, dynamic language that adds TCO and actually works in most cases is impressive. Horrifying because . . . it's a regex based preprocessor that rearranges arbitrary function code in a complex, dynamic language.
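
For comparison, the usual runtime trick for this in plain Python is a trampoline -- not what Coconut does (it rewrites tail calls at compile time), just a minimal sketch of the same idea:

    def trampoline(fn):
        def run(*args):
            result = fn(*args)
            # Keep unwrapping "call me next" thunks instead of growing the stack.
            # (Assumes the real return value is never itself callable.)
            while callable(result):
                result = result()
            return result
        return run

    def _countdown(n):
        if n == 0:
            return "done"
        return lambda: _countdown(n - 1)   # tail call expressed as a thunk

    countdown = trampoline(_countdown)
    print(countdown(1_000_000))            # completes without RecursionError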


"Pythonic functional" seems like a contradiction of terms, considering python's infamous hostility (e.g. http://fold.sigusr2.net/2010/03/guido-on-functional.html) to functional programming seems like a key part of what it means to be pythonic.

That said, it looks a lot more pleasant than Python. The addition of pattern matching alone makes this seem like a very worthwhile project, a big improvement on Python.


That quote is hilarious -- Guido van Rossum claims that it's impossible to implement `reduce` in a few lines in a functional language. Just out of curiosity, I checked out the `reduce` implementation in Elixir:

  def reduce(enumerable, fun) do
    result =
      Enumerable.reduce(enumerable, {:cont, :first}, fn
        x, :first -> {:cont, {:acc, x}}
        x, {:acc, acc} -> {:cont, {:acc, fun.(x, acc)}}
      end)
      |> elem(1)

    case result do
      :first -> raise Enum.EmptyError
      {:acc, acc} -> acc
    end
  end

Seems pretty straightforward to me.

https://github.com/elixir-lang/elixir/blob/v1.7.4/lib/elixir...


Exactly. In scheme reduce-left/fold-left can be as simple as:

    (define (reduce-left fn init lst)
     (if (null? lst)
      init
      (reduce-left fn (fn init (car lst)) (cdr lst))))
It's clear from that quote that Guido (at least at that time) doesn't have real experience with functional programming and is letting his personal biases cloud his judgement.

Okay, maybe "Scheme isn't a functional language because it allows side effects and only Haskell and family are allowed to call themselves functional," or whatever. He seems to perhaps hint at that mentality. But the fact remains that a language like Scheme makes this much easier than Python.


What do you mean by the last paragraph? foldl in Haskell is just

    foldl _ acc []     = acc
    foldl f acc (x:xs) = foldl f (f acc x) xs
And in Python something like

    def foldl(f, acc, l):
        for x in l:
            acc = f(acc, x)
        return acc
Which doesn't seem much harder to read or write than any other version, to me (it's all just syntax around the same algorithm).


I think you are conflating Guido's attitude to FP with Python too much here. All the FP primitives in Python were championed by other people. In your linked blog post, Guido is talking about purely functional languages and how Python, like Lisp, is not like that. Which is true but doesn't make Python bad for (or "hostile" to) writing functional code.

In fact Python taught a generation of programmers to love functional code, because of the big speedup available from stringing together C-native builtins with higher order functions.


There is "not purely functional", and then there is python. Python does more to discourage FP than other "not purely functional" languages. Guido may no longer be python's BDFL, but it's clear that during his tenure he had little interest in making python better FP. He accepted contributions (like the addition of lambda) from other people, but never did anything to champion FP in python and it really shows.


Do you feel Python discourages FP more than eg C++, or JavaScript? I'd call Python more functional than either of those, and a little less functional than Common Lisp.


More than javascript, yes. C++ is not a language I'd say nice things about unless under duress.


Yes


I’m not sure what you mean, what about python is notoriously unfunctional?


The built-in data structures often force in-place mutation on code which wants to be in any way efficient.
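
A tiny illustration of that trade-off:

    items = list(range(10))

    items.append(10)       # the idiomatic way: O(1), but it mutates in place
    items = items + [11]   # the "functional" way: copies the whole list each time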


It's capable of functional programming in the sense that functions are first class values that you can pass around, but it is infamously hostile to functional programming. For example, python still doesn't have multi-line lambdas, and the justifications for why always boil down to it being "unpythonic".

Furthermore over the years Guido has made it pretty clear that he doesn't particularly like or care about functional programming: https://python-history.blogspot.com/2009/04/origins-of-pytho...


Requiring complex functions to have names isn't hostile to functional programming.


It really is though. It doesn't prevent it, but it sure as shit doesn't encourage it.


Allowing you to elide the add42 from "def add42(x): return x + 42" makes the difference between encouraging FP or not?

I've actually started wishing for the opposite change in my use of Clojure. There, it is currently optional to supply a name for the fn form. Always supplying the name, like in "(fn add-42 [x] (+ x 42))", would make stack traces more readable and the named function object would be easy to identify when it appeared in the REPL / pretty-printed data structures.


> "Allowing to elide the add42 from "def add42(x): x+42" makes the difference between encouraging FP or not?"

Not being able to define a two-line function in the place where it's used because you have to give it a name discourages FP in a way many other modern languages do not.


Not really, since Python can nest functions:

  def something(x):
    def myfunc(y, z):
      ...
      return result
    reduce(myfunc, x)
So the difference is like having to write

  x = some_very_complex_expression_that_can_get_quite_long
  some_function(x)
instead of

  some_function(some_very_complex_expression_that_can_get_quite_long)
In many cases this may make code even more readable.

I'm not saying that Python's lambdas are perfect as they are or that multi-line lambdas would be a bad thing. I'm just saying it's not as bad as you paint it.


It subtly discourages functional programming when a for loop would be faster and more readable.


Python really took the direction of preferring list/generator comprehensions for this kind of thing, encouraging their use over both imperative loops and basic HOFs like map/filter. List comprehensions are declarative and hence functional constructs.

Generators also get you another FP checkbox because you now have laziness in a very accessible form. You get building blocks like infinite sequences and so on. And you have access to all the normal primitives like reduce, partial, islice, groupby, etc. in the functools/itertools std modules.
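
A small illustration of that style, using nothing beyond the stdlib (the names are just for the example):

    from functools import reduce
    from itertools import count, islice

    # An infinite, lazy sequence of squares of even numbers...
    squares_of_evens = (n * n for n in count() if n % 2 == 0)

    # ...consumed lazily: only the first five values are ever computed.
    first_five = list(islice(squares_of_evens, 5))      # [0, 4, 16, 36, 64]

    total = reduce(lambda acc, x: acc + x, first_five)  # 120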


> more readable

'More readable' is subjective. It's clear that Guido and much of the Python community believe that FP is less readable, less pythonic, and that viewpoint is exactly the sort of hostility towards FP that I'm talking about. And in no small part this is a self-fulfilling attitude, e.g. "Python lambdas are ugly because Pythonistas believe FP is ugly." They're way uglier in Python than in other languages. The ugliness of Python's lambda is a trait of Python, not a trait of FP, yet its ugliness is cited to justify itself.

> when a for loop would be faster

Hey, that's fine. But I think much of the time for the sort of things I'm working on, that's deep into "premature optimization" territory. I'd rather a slight linear slowdown with more readable code than a slight linear speedup with less readable code. I'd prefer that python not "subtly discourage" me to nudge me towards less readable code because a list comprehension is marginally more performant.


I like the idea but am not fond of the execution, particularly around the aesthetics.

One of the reasons I love Python is its syntactic brevity and conciseness. Once you get to intermediate Python there are some weird forms of syntactic sugar (magic functions and unpacking, for instance), but visually the language has less clutter than most popular languages. Coconut really just seems to remove that aspect.


Is it just me who finds usage of the adjective "Pythonic" unnecessary and somewhat elitist? Let's be honest, there are no other language counterparts (Java as Javish?! or JS as JSish?!) for such an adjective, and "Pythonic" seems to be used by a subset of the Python community. Basically, it just means idiomatic.


I think that "pythonic" does mean something, and that when they say "pythonic" they are actually trying to communicate something.

JS and Java... well they're not something I think a lot of programmers want to be. JS has a pretty complicated ecosystem that a lot of developers don't like. Java is good at what it's good at, but it's not very fun.

What I see a lot of is "lisp" or "lisp-like" and "pythonic". Those languages are notable for being considered fun by a lot of programmers.

But yes, it's idiomatic. Python is an idiomatic programming language that has sacrificed a lot for that ideology. Now that Guido is gone it's less ideological, and we're getting things like type hinting.

When people try to apply that same ideology to other fields, sometimes they call it pythonic. When they try to apply the lisp ideology to other fields, they just call it a lisp dialect instead of lisp-ish.


> JS and Java... well they're not something I think a lot of programmers want to be.

I actually much prefer modern JS to modern python. JS is flexible enough to allow a functional-ish style, which greatly improves code clarity in my opinion. And of course you can still write python-style code if you want to. Python is a lot more opinionated!


Guido was working at Dropbox with Python. One of the areas was type hinting (of the large Dropbox Python codebase), so it is partially coming from him.

Learnt this at PyCon 2014, if I recall correctly.


That's right - Guido has been very much in favour of adding type hints to Python.

Are type hints Pythonic? I personally don't think so - to me, they don't match the philosophy of the rest of the language. On the other hand, one could argue that the definition of Pythonic is simply "whatever Guido likes".


> Let's be honest, there are no other language counterparts (Java as Javish?! or JS as JSish?!) for such adjective

I've seen it for other languages, including those two, and denying the term won't deny the concept.

I've seen Perl, Python, and JavaScript written like C, Java written like Perl, etc. It's always a bad idea - each language has strengths in the ability to express concepts to other devs, part in syntax and part in conventions, and swimming upstream against those never helps anyone. (Evolution is great, chaos is not.)

None of which says that these terms CAN'T be used in an elitist way, but I don't see that as the primary purpose or use (at least in the circles I'm exposed to).


I don't find it elitist, I find it Pythonic.

Meaning, as an example, Java and JS aren't known for their constant references and inside jokes relating to Monty Python. That is part of the culture and community around Python.

Saying something is Pythonic instead of saying it is idiomatic, is itself, Pythonic. It is part of the culture grown up around the language. And that's okay.


To be fair, back in the day, both Java and C# were intended to be C++-ish. Javascript itself was famously remodelled to be Java-ish. When Coffeescript came out, it intentionally modelled itself after the brevity of Python. There are lots of Lisp derivatives and Forth-likes (Forth-like is even a pretty established term). ML derivatives are popular. I don't think it's just Python.

Edit: Of course I should have read TFA before replying. Of course this is not a Python like language, but a language that compiles to Python. I agree the cute name is confusing, but if the point is to compile to a specific language, saying what that language is is pretty important ;-)


I find 'Pythonic' more of a higher level philosophy about what "good python style" looks like more than elitist, but I can see how it can be warped by an elitist mindset.

Whenever I write Python I try to keep it 'pythonic' and that mainly means keeping things simple, very explicit, and concise. These are obviously traits that all developers want their code to have, but sometimes the syntactic sugar of a language can make it difficult, I think the Pythonic philosophy provides guardrails against this.


There are plenty of counterparts. I've heard source code praised for being "Swifty", "Lispy", or "Forthy".


But Coconut seems to be a superset of Python (all valid Python is valid Coconut). Seems like a very good and succinct adjective to use to attract Python users who otherwise wouldn’t be interested in a new language.


I think it stems from Python's "There's only one right way to do things" philosophy. You can write Python that looks almost identical to Java, but that will likely be seen as overly verbose and complex in Python circles. I don't know if this is elitism, or just a desire for consistency and brevity among tools and libraries


As far as the python community goes, the term 'pythonic' barely registers in terms of elitism. The phrase "for humans" comes to mind...


I've seen C-like, Rusty, and ___Script (meaning related to JS). Lisp is a family. I'm not sure why you think "Pythonic" is some new vein of elitism, because to me it just seems like it's telling people that Coconut is easy to learn if you know/like Python already.

That said, it immediately told me that I wouldn't be interested in Coconut, so I also found the label valuable as a non-Python person.


It really does just mean idiomatic python. The word itself is almost a mash up of those two.

And I don't see how that's a problem? Every language has a subculture around it, language specific jargon, even inside jokes.

Pythonic stems from the idea of simple ways to do "stuff", and that's pretty much it.


Coconut is a superset of Python.

"Pythonic" doesn't just mean "idiomatic". Python has design principles[1] that influence new idioms.

[1] https://www.python.org/dev/peps/pep-0020/


There are counterparts in other languages. Idiomatic usage of Swift code is said to be "Swifty".


It's a cult, man.


Alternatively, the toolz package ( https://toolz.readthedocs.io/en/latest/ ) is a nice way of getting some additional functional programming capabilities while using the standard CPython interpreter.
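
For example (a sketch assuming toolz's pipe and its curried namespace; spiritually similar to Coconut's |> pipes):

    from toolz import pipe
    from toolz.curried import filter, map  # curried versions of the builtins

    # Thread a value through a series of transformations, left to right.
    result = pipe(
        range(10),
        filter(lambda x: x % 2 == 0),
        map(lambda x: x * x),
        sum,
    )  # 0 + 4 + 16 + 36 + 64 = 120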


Nice effort and I really like the premise.

I hate to be negative and I appreciate this may be due to the requirement to make Coconut a superset of Python, but the pattern matching and the partial application look anything but elegant (or Pythonic) to me.


I'm a huge fan of the lambda syntax:

x -> x * 2

It looks much cleaner than the traditional:

lambda x : x * 2

Would a PEP adding this syntax have any chance of passing?


Just came back to python after several years in javascript. I've found that the functions you need are often already provided in the standard library (for example the `operator` module[1]).

So where in javascript you would write:

    const double = x => x * 2
In python you can simply write:

    from operator import mul
    double = mul(2)
[1]: https://docs.python.org/2/library/operator.html


The Python equivalent of your JavaScript example is simply:

    def double(x): return x * 2
Python's `lambda` construct creates anonymous functions - its limitations mean it should only be used when anonymity is required. If a function is to be given a name, there is no reason to use a lambda over standard function syntax.


That won't work, you have to use something like partial instead if you want to use the operator module:

    from functools import partial
    import operator
    double = partial(operator.mul, 2)
And at that point I think it would way simpler to just define a normal function.


That won't work since you need to use `partial` on `mul`.

I'm a big fan of the way Haskell manages operators and partial application.

    double :: Num a => a -> a
    double = (2 *)


I don't think so; Guido has a long history of being against FP, and the core of the language is statement-oriented rather than expression-oriented. Besides that, the "->" operator in Python is already used for return type annotations on functions. For the record, Julia uses exactly this syntax for lambdas. Like:

foo = a -> b -> a+2*b

It's a pretty lispy language.
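
For reference, the existing use of "->" in Python:

    # "->" already introduces the return type annotation in a def:
    def double(x: int) -> int:
        return x * 2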


I've heard the argument that you ought to consider writing a named function and using it where the lambda would have gone. Self documents and cuts down on lines of symbol soup.

Python wants to be broadly good enough for everyone, not perfect for any one person's specific domain and expertise level. I think this might be the right attitude.


I think Guido wanted to get rid of lambdas altogether one time. If anything I would guess they want to keep it ugly to discourage its use.


Then you should take a look at Julia, if you haven't already.


This is basically what R offers: both -> and <- assignment operators. The addition of a piping operator in Coconut is great, too, for people familiar with dplyr.


Would it be possible for Python's parser to handle the infix `->` syntax, without making it more complex than the current LL(1)?



I instinctively distrust every project that self-describes as "elegant". Doubly so if it also says it is "simple".

Your docs should show it, not tell it.


I get the nausea shivers when something is described as "curated"


The pipe operator alone makes this worth using.


Which, if you can stand the parens, is a good argument for lisps or a good macro system in general. Clojure has arrow macros which are trivial to implement using procedural macros.


Are you familiar with F#?


Or Elixir.


I think I could replace bash with xonsh[0] as my default shell if coconut's pipe syntax were available!

[0] https://xon.sh/ [1] https://github.com/xonsh/xonsh/issues/1336


The first question that comes to mind is, what is wrong with using the JVM or the CLR as a compilation target for such a language. The only thing that I can think of is that they want to exist on the Python platform, which makes some sense. Python is taking over the computing industry and programming education and it does have a lot of libraries. Perhaps Python should have a standard Python virtual machine, but that almost certainly is never going to happen, for multiple reasons. So here you are stuck with a transpiler. A transpiler whose purpose is to make a functional language compile to a language which was explicitly designed not to be functional at all.


Yes. It is a bit of a head-scratcher why people would go to enormous effort to transpile to an execution environment that is as gimped as Python's. I use Groovy pretty much as "Python for the JVM", and a big reason for that is simply that Python's runtime limitations are unacceptable to me from a basic engineering standpoint (primarily, the GIL). I guess Python's ecosystem is big enough now that the selling point of "you can use all your favorite libraries" is enough of a drawcard?


> Perhaps Python should have a standard Python virtual machine, but that almost certainly is never going to happen, for multiple reasons

Really, really curious: what are those reasons? Not being sarcastic, just want to know.


I still don't understand why Common Lisp isn't more popular


I'm seeing a lot of the things I like about Elixir here. Very slick.

Are there performance implications?


There are two things you can do to ensure that the transpiled Python is as fast as possible. First, disable TCO (while leaving TRE enabled) with the `--no-tco` argument to the compiler. Second, make sure you set the target version of Python correctly. This will allow the compiler to make a number of optimizations that are specific to different Python versions.

Overall, for the majority of use cases I have found Coconut code to be approximately as performant as Python code written by hand. The compiler speed can leave a little to be desired though.


Are there supervision trees, share-nothing concurrency, out-of-band multithreaded garbage collection, and IO.inspect?


So it's like CoffeeScript (https://coffeescript.org/) to JavaScript?


Except not all of JavaScript is valid CoffeeScript.


Yes; similarly to how CoffeeScript is transpiled into JS, Coconut is transpiled into Python.


How are the "algebraic data types" algebraic if they don't express any relation to each other?

This seems like a cool syntactic layer for people who are already writing Python and wishing for a more functional style. However, to call this a functional language would betray the definition.


I really like the look of this. Most programming in my spare time is done within a more functional paradigm.

But I can't help but wonder if companies ever go for these kinds of extensions / frameworks / languages.

I will play around with this one though!


Heh... "compiles to Python code" - so still no efficient multithreading :(


Related: hy-lang (a Clojure-y Python transpiler).


> The difference between |*> and |> is exactly analogous to the difference between f(args) and f(*args).

Should the order of the second list be reversed?

i.e. |*> corresponds to f(*args) and |> corresponds to f(args)


I wish people would stop using |> as an operator. It's exceedingly awkward to type, and I shouldn't have to rely on editor tricks to input code.


There's also Hy:

http://docs.hylang.org/en/stable/

"hy - A dialect of Lisp that's embedded in Python"


How is this implemented?


The code is available on Github: https://github.com/evhub/coconut


Coconut, aka take a beautiful language and fuck it up with ugly syntax :D Well done!


I really enjoy coconuts and food products derived from coconuts, so the name is absolutely terrific as far as I'm concerned.



