Functional Programming Is on the Rise (medium.com/elizarov)
48 points by pplonski86 on May 4, 2019 | hide | past | favorite | 32 comments


> Mainstream languages are Java, JS, C++, Python, etc — languages one would hardly call “functional”.

Even so, "functionalizing" mainstream languages is a thing. Gary Bernhardt has long advocated functional Ruby. I deployed a functional Python module (to keep my sanity). React is becoming a progressively more functional dialect of JS, and it's relatively easy to deploy an "all functions" JS project.

> Most developers who live today were raised on this x.one().two().three() syntactic tradition in which the code is fluent —it reads and works left-to-right

A trend in functional programming is the pipe operator; at least F#, Julia, and Elixir have it, with varying levels of flexibility. It feeds the result of one expression into the next call (in Elixir, as the first argument):

    start_with_value
    |> do_this()
    |> and_this(too)
    |> then_this()
being equivalent to `then_this(and_this(do_this(start_with_value), too))`

at least in elixir the power of the construct is really that you pipe into other things, like:

    value
    |> transform1()
    |> action_with_error_tuple()
    |> case do
      <matches on error_tuples>
    end
For developer ease, Elixir has a built-in IO.inspect(value, label: "label") which pretty-prints as a side effect and returns the original value, which is invaluable for debugging. Especially with modern IDEs like VS Code, you can add IO.inspect pipes (with line-number annotation) to multiple lines in a pipeline, watch the dataflow, and then delete them with ease when you're done.
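A rough Python analogue of that print-and-return helper, for illustration (the `tap` name is invented here, not a standard function):

```python
def tap(value, label=None):
    """Print a value as a side effect and return it unchanged,
    so it can be dropped into the middle of a dataflow."""
    prefix = f"{label}: " if label else ""
    print(f"{prefix}{value!r}")
    return value

# Wrap any intermediate expression without disturbing the pipeline:
result = sum(tap([x * x for x in range(4)], label="squares"))
# prints: squares: [0, 1, 4, 9]
```

Delete the `tap(...)` wrapper when you're done and the dataflow is untouched.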


To this point, JS has been very careful not to get in the way of functional programming, as it doesn't really try to stick to a specific paradigm. There is even a proposal to add pipelines: https://github.com/tc39/proposal-pipeline-operator.

Personally, I think forcing yourself into a specific paradigm is a waste of energy. I tend to just write code that best expresses my intention. Sometimes the code turns out more oop, sometimes it turns out more fp, and often it's more of a mix.


> Sometimes the code turns out more oop

I think it's misleading to think of FP as necessarily opposed to OOP. While FP is very much in opposition to the Stroustrupian style of OOP in syntax and execution (which is what people mean these days), if you go back to Smalltalk-style message-passing objects, arguably some FP languages have the hands-down best implementations of OOP.


For sure. If there are two flavors of OOP, the imperative OOP we're familiar with (Ruby, Java, Python, etc.) and the message-oriented variety that Alan Kay described, then Elixir/Erlang are more the message-oriented flavor.

While you can write either flavor in most languages (even Elixir if you abuse ETS), concurrency's requirements for immutability and run-time support start becoming distinguishing factors.


There are three main flavors I see:

synchronous method calling as limited message passing, synchronous method calling as general message passing, and asynchronous message passing.

Java and C++ and most static class-oriented OO languages are in the first group.

Ruby and Smalltalk (and JS, which isn't, barring more recent developments, class-oriented OO, but is still OO) and some other dynamic OO languages are in the second group.

Elixir/Erlang are in the third group.

Message passing is inherently an imperative idiom (whether synchronous or not), even though it may show up in languages that are largely declarative rather than imperative outside of message handling.
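The gap between "limited" and "general" message passing can be sketched in Python, which sits in the second group: `__getattr__` lets an object handle messages no method table ever declared, closer to Ruby's `method_missing` than to a C++ vtable. The `Receiver` class is invented for this sketch:

```python
class Receiver:
    """Accepts arbitrary 'messages' at run time instead of
    dispatching through a fixed, compile-time method table."""

    def __init__(self):
        self.log = []

    def __getattr__(self, message):
        # Invoked only when no ordinary attribute matches,
        # i.e. for any message we never predeclared.
        def handler(*args):
            self.log.append((message, args))
            return f"handled {message}"
        return handler

r = Receiver()
r.greet("world")   # no greet() method exists anywhere
# r.log is now [('greet', ('world',))]
```

A Java-style call to an undeclared method wouldn't even compile; here the receiver decides at run time what to do with the message.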


Do C++ and Java actually pass messages? Java I have no clue about, but I thought calling a member function in C++ is either a direct call or, if it's virtual, a function-pointer lookup through a vtable.


> Do C++ and java actually pass messages?

Not really. "A limited echo of message passing" would probably have been more accurate than limited message passing: C++ and Java favor a form of OO inspired by Smalltalk-style message passing, but implement it in a way that provides something much more limited.


I think Ruby actually is Alan Kay style, but with an imperative surface implementation, ironically.


Kinda. I think what Ruby is missing is how minimalist Smalltalk is, and how many of Smalltalk's primitives are implemented in itself. To give you an example of what I'm talking about, in Smalltalk an if statement is implemented by sending a message to a boolean, like:

    someBool ifTrue: [ "some block to execute" ].
In that context, either the "True" object receives the message and executes the block, or the "False" object receives the message and doesn't execute it. (Or even some user defined object that just happens to have an ifTrue: message handler). (Sorry I might be a little off on the syntax and exact details, my Smalltalk is rusty)

Loops and such are also similar. Granted most of these things probably end up being implemented in native code on a practical level for performance, but conceptually it's a very small language. Most of the standard library and IDE is written in itself. Ruby definitely has a lot of flexibility, but it also has way more built-ins and much more separation between "interpreter code" and "user code"

I think this is what people are missing when they bring up the point that messages and method calls seem very similar. In a practical sense yes, but the smalltalk style message system is more about messages being a fundamental building block of the entire system.
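For illustration, the boolean-receives-the-message idea can be mimicked in Python (class and method names are invented here; real Smalltalk dispatches on its true/false singleton objects):

```python
class STrue:
    """Stand-in for Smalltalk's true object."""
    def if_true(self, block):
        return block()   # the True object responds by running the block

class SFalse:
    """Stand-in for Smalltalk's false object."""
    def if_true(self, block):
        return None      # the False object responds by ignoring it

# Conditionals become plain message sends; no `if` keyword involved:
STrue().if_true(lambda: "ran")    # returns "ran"
SFalse().if_true(lambda: "ran")   # returns None
```

Any user-defined class could answer `if_true` too, which is the point: control flow is just another message.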


> Loops and such are also similar.

While “if” doesn't work that way in Ruby, loops do (there is a for..in statement, but it's syntax sugar for a call to #each, and it's not idiomatic to use it anyway). Python does a lot of that too: putting loosely C-style syntax as sugar around method calls (which themselves support more general message-passing patterns than Java-style method calls).
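Concretely, Python's for loop is sugar over the iterator protocol's method calls; a hand-desugared sketch:

```python
data = [10, 20, 30]

# `for item in data: total += item` desugars roughly to:
total = 0
it = iter(data)            # calls data.__iter__()
while True:
    try:
        item = next(it)    # calls it.__next__()
    except StopIteration:  # the protocol's end-of-loop signal
        break
    total += item
# total is now 60
```

Any object answering `__iter__`/`__next__` works in a for loop, which is the "sugar around method calls" point.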


I should give an example.

Python:

    class MyClass:
        def __init__(self):
            self.name = ""
        def setname(self, new_name):
            self.name = new_name
        def getname(self):
            return self.name
Usage:

    obj = MyClass()
    obj.setname("joe")
    obj.getname()        #==> "joe"
Elixir:

    defmodule MyAgent do
      def init(), do: Agent.start_link(fn -> "" end)
      def setname(obj, new_name), do: Agent.update(obj, fn _ -> new_name end)
      def getname(obj), do: Agent.get(obj, & &1)
    end
usage:

    {:ok, obj} = MyAgent.init()
    MyAgent.setname(obj, "joe")
    MyAgent.getname(obj)         #==> "joe"
Honestly, superficially the difference is minor (obviously, under the hood they're way different; ironically, I think the Elixir form is far more "explicit" than the Python form).


Oh yes, I meant that sometimes a paradigm goes unused, not that one prevents the other.


Another nice thing about the pipe is that you can pipe into any function, not just the methods associated with the object you're working with.

If I write my own function four(), I end up with:

    four(x.one().two().three())

But with pipes you get to keep the left-to-right reading order:

    x | one() | two() | three() | four()


Languages that have UFCS or extension methods also support this.


I wish it were possible to do this in "native" Python (I know about Coconut). It would make numerical code (where you're changing data shapes a lot) so much nicer.
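For what it's worth, a minimal pipe helper can be sketched in plain Python; the `Pipe` class below is invented for illustration and is not how Coconut implements it:

```python
class Pipe:
    """Wrap a value; `|` applies a function and re-wraps the result."""
    def __init__(self, value):
        self.value = value

    def __or__(self, fn):
        return Pipe(fn(self.value))

square_all = lambda xs: [x * x for x in xs]
result = (Pipe([1, 2, 3]) | square_all | sum).value
# result is 14, read left to right like x |> f |> g
```

The `.value` unwrap at the end is the main wart compared to a real pipe operator.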


It's interesting that functional programming is becoming more popular while functional languages are straggling behind a bit, but I think there's a contributing factor besides imperative languages borrowing features: IO and error handling. What I find with functional languages is that once I start adding a lot of IO and error handling, things get gross really quickly; things like print-statement debugging are a "side effect", along with logging and so on. If I were more of an expert in functional languages maybe I'd find a more elegant way around this, but I guess with something like Python I get most of the benefits of a functional language without the aforementioned things becoming painful.

From a practical standpoint, debugging by setting breakpoints and watching a value also becomes a bit weird, because you have so many return values being pipelined through multiple functions without ever being assigned an alias, so how do you add a watch? Of course there are ways around that, but I wonder if dealing with practical problems like this turns a lot of people off. Especially since one of the things that draws people to functional code is its elegance, and it stops looking so elegant once you have a lot of these concerns in place.


I think both models have their strengths. I use C# quite a bit (which itself borrows some functional paradigms for LINQ, Func, etc.), but it definitely has some limitations, especially when you need immutability. I have started using F# to fill these gaps, and thanks to the CLR, interop is pretty easy. Debugging is pleasant as well.

Someone else mentioned Python/Coconut, which is another great combo that enables you to switch between functional and imperative styles as needed. I think this will probably become more common, especially in interpreted languages.


I have found great joy with Clojure (runs on the JVM), which doesn't constrain you from side effects; it just encourages you to do them in appropriate functions. And all values are immutable by default.

Debugging pure functions is quite a lot different from keeping watches on specific memory locations (or variables).


I'm concerned about the use of functional languages. I like them, but they seem to be treated as an implementation, not a model, and as such can be very inefficient. I've used Scala and like it, but basic stuff with flatMap etc. can unfold into a gross mess of code which, had I written it non-functionally, would be vastly more efficient.

I'd expect the compiler to do more work, perhaps with some additional hints from me, to get efficiency. SQL does this because it has to (not with total success), but with Scala I have had to abandon 'nice' functional stuff, which I like, to get speed. It's disappointing; I shouldn't be doing the compiler's job. With a literal implementation, functional programming just seems inevitably expensive, and that's morally wrong (just cos computers are fast doesn't mean cycles are free, and don't quote the programmer-time vs computer-time stuff at me; wasting hardware gets more expensive as the number of machines increases).

Optimising FP to the extent I want is something I'm sure the compiler can't do alone; it would require hints and pointers from the human. The necessary optimisations would range from high-level functional transformations (e.g. map f (map g x) -> map (f . g) x), down to e.g. escape analysis so as to not produce so much garbage, down to data layout to stop thrashing the cache.

Also, someone I worked with just went quite mad on it. FP should be used where it fits best; he forced us to use it exclusively, even when procedural code would simply have been simpler and clearer. And faster too: the end result just crawled (it was Scala, btw). Not a fault of the paradigm, but of overselling, and of someone in dire need of common sense.

Multiple disclaimers: I'm not a Scala expert and haven't used it for a couple of years. I'm not an FP expert. What I've said above is a brief overview of a much larger and more nuanced criticism which I don't have time to give; I hope you see what I'm getting at. I have little experience with other FP languages, so what I say is based on Scala; maybe e.g. Haskell does the clever stuff.

Edit: tidied up.


Haskell (or rather, GHC) does do clever stuff, and gets idiomatic code performance competitive with other languages [0]. The problem is that a lot of what GHC does requires going fully functional in order to be practical. For instance, GHC can easily produce garbage at 1 GB/s [1], which it handles far more easily than most languages could because everything is immutable.

GHC also does not have a stack [2]. You can get something resembling a traditional call stack, but it has a fair amount of overhead.

> map f (map g x) -> map (f . g) x

This transformation only works with pure functions.
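A miniature illustration (in Python, purely for concreteness): with pure functions the fused and unfused forms agree, but side effects make the two orderings observable:

```python
f = lambda x: x + 1
g = lambda x: x * 2
xs = [1, 2, 3]

# Pure functions: fusing the two maps is observably identical.
unfused = list(map(f, map(g, xs)))
fused = list(map(lambda x: f(g(x)), xs))
assert unfused == fused == [3, 5, 7]

# Side effects expose the ordering: the unfused form runs all the
# g's before any f; the fused form interleaves them.
trace = []
f2 = lambda x: (trace.append("f"), x)[1]
g2 = lambda x: (trace.append("g"), x)[1]
list(map(f2, list(map(g2, xs))))     # trace: g g g f f f
trace.clear()
list(map(lambda x: f2(g2(x)), xs))   # trace: g f g f g f
```

If f and g print, log, or mutate anything, the two forms are no longer interchangeable, which is why the compiler needs purity guarantees before fusing.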

I am currently using Scala, and it is (IMO) a very weird mix between FP and OOP. It is (IMO) much closer to Java-style OOP than to Haskell-style FP (not surprising considering its relationship with Java). It seems to inherit a lot of restrictions from Java that limit its ability to do the type of optimizations that are necessary for performant FP.

If you actually look at the JVM call stack of a Scala program, you will find an insane amount of indirection, more than even a naive implementation of FP in C (undoubtedly because of how it interacts with its object system and the JVM).

[0] Although reasoning about performance is still pretty hard.

[1] https://wiki.haskell.org/GHC/Memory_Management

[2] Actually, it does have a stack, but the stack is nowhere near what you would expect.


> GHC [...] idiomatic code competitive performance with other languages

I thought this was only on micro-benchmarks?

> GHC can easily produce garbage of 1GB/s [1], which it can handle far easier than most languages because everything is immutable.

How can you possibly not totally trash caches and destroy performance with that kind of churn? From your link "New data are allocated in 512kb 'nursery'. Once it's exhausted, 'minor GC' occurs". I don't know but that sounds like kissing your entire L1 data cache goodbye, and then some.

(stuff about GHC stack) - Most curious! Got a link? (cactus stack is it?)

> This transformation only works with pure functions.

For sure, FP transformations tend to require that, but it reminds me that Scala doesn't have a this-function-is-pure declaration, which I'd really like. Along with deep constness.

> It seems to inherit a lot of restrictions from Java that limit its ability to do the type of optimizations that are necessary for performant FP.

Not so sure. Scala seems to do a very literal translation. There was a library on GitHub that showed this (straightforward but messy desugaring replaced with simple stuff), but I can't find it now. OK, let me try this instead; sum of squares:

  def sumSq(lst: Seq[Int]) = lst.map(z => z * z).reduce((x, y) => x + y)
I'm pretty sure Scala would compile that literally a few years ago, and I wonder if it does any better now. Over a thousand-item list, I'd expect it to call the square lambda a thousand times to construct an intermediate collection, then call the add function a thousand times to reduce it. The above should be compilable to:

  def sumSq(lst: Seq[Int]) = { var rslt = 0; for (i <- lst) rslt += i * i; rslt }
They're not quite equivalent, as the first will fail on an empty list, but you see what I mean. The point is, there's nothing in the JVM that prevents this level of optimisation. The second does memory reads only, can operate in a couple of registers, can be unrolled, and seems maximally efficient in every way I can think of.
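The same pair sketched in Python, to make both the intermediate pass and the empty-list caveat concrete (illustrative only, not Scala semantics):

```python
from functools import reduce

def sum_sq_fp(lst):
    # Squares each element, then folds the results with +.
    return reduce(lambda x, y: x + y, map(lambda z: z * z, lst))

def sum_sq_loop(lst):
    # What a fusing compiler could lower the version above to:
    # one pass, no intermediate collection, an accumulator in a register.
    rslt = 0
    for i in lst:
        rslt += i * i
    return rslt

sum_sq_fp([1, 2, 3, 4])    # 30
sum_sq_loop([1, 2, 3, 4])  # 30
# Not quite equivalent: reduce with no initial value raises
# TypeError on [], while the loop returns 0.
```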

If FP doesn't use the very optimisations that it enables, it's going to pay for that in lack of uptake. Inefficiency is not just a technical issue; I'd say it's even a moral one. Wasting resources is immoral.


Depending on how you define "functional", it's been on the rise for decades. It's just that the goal is a horizon, not a point. For example: Does the language support recursion? It's functional by the old standard. Does it have first-class functions? Ditto, but by a newer one. And so on. These days, it seems to be about making functional composition a major theme, often with side-effect-free styles being favored. Maybe later it'll shift further, to languages derived from Coq or Agda being the things people mean when they talk about functional programming languages.


Recursion with tail-call elimination, so that you don't run out of stack, still isn't common in mainstream languages.
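Python is a case in point: it has no tail-call elimination, so the usual workaround for deep tail recursion is a hand-written trampoline. A minimal sketch:

```python
def trampoline(fn, *args):
    """Call fn; while it returns a zero-argument callable (a thunk),
    keep invoking it, reusing a single Python stack frame."""
    result = fn(*args)
    while callable(result):
        result = result()
    return result

def countdown(n):
    # The 'tail call' is returned as a thunk instead of made directly,
    # so the stack never deepens.
    return 0 if n == 0 else (lambda: countdown(n - 1))

trampoline(countdown, 10 ** 5)  # fine, well past the ~1000-frame default limit
```

Calling `countdown(10 ** 5)` recursively the normal way would raise RecursionError; the trampoline turns the tail call into a loop, which is exactly what tail-call elimination does for free.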


C++ compilers and the .NET CLR (so C# etc.) support tail-call elimination. That's a big percentage of widely used languages.


I didn't know about C++ (and I'm not familiar with .NET). I suppose it's not in the C++ standard, but apparently some compilers do it, perhaps as a side effect of optimisation. So it may work in some cases, when using certain compilers, and not in others.

https://stackoverflow.com/questions/34125/which-if-any-c-com...


> Does the language support recursion? It's functional by the old standard

That's a strange claim, that recursion alone would suffice.

I'd say first class functions are more central. Give me that, recursion and closures and I can do the rest. IMO anyway.


Which functional language is the closest to being "mainstream"?


Scala


Depending on how loosely you want to define "functional", I'd argue Javascript is more popular.


There is a lot of merit in just using Scala as a better Java; in fact, a lot of its original popularity was that kind of adoption. These days, however, Kotlin is eating that cake, so maybe it's true now?


I'd prefer something that compiles to machine code anyway. Scala also has type-erasure unpleasantness due to its dependence on the JVM.


Type erasure isn't just a JVM implementation detail. Haskell, for example, has no JVM dependency and relies entirely on type erasure (a running program can't introspect type information). Not reifying types at run time (in this case, type parameters) is a legitimate implementation decision, as reification has non-trivial costs.



