
Prior to Clojure I feel that I didn't really know how to do things well.

In the context of the domain I have the most experience with, web applications, I'd summarize the bulk of programming as defining data, transforming data, and shoving data through APIs. Once you get the hang of it, Clojure makes it trivial to transform data structures from A to B. This has enormously expanded the kinds of problems I can solve.

When Rich says, "It was impossible to avoid the sinking feeling that I had been “doing it wrong” by using C++/Java/C# my whole career,"

I feel this is the case for the majority of people, but they don't realize it yet because their experience is limited to the most popular trendy language or framework. I've seen many examples of libraries in different languages with enormous numbers of commits and hundreds of issues for problems that are trivial to solve in Clojure.

I was in the same boat: constantly grabbing for frameworks, and if one wasn't available that made my task simple, struggling to bend the frameworks or language I was using into a solution.

I'm not a language geek and I don't write code outside my work for fun. I want to spend the least amount of time possible in front of the computer and sleep knowing my web applications won't topple over. Clojure has fit me so well that I don't think I would have accomplished what I have in other languages.




Could you elaborate on a problem that illustrates this property of Clojure? This sounds awesome, but I have a hard time understanding what you're getting at without knowledge of Clojure.


Three examples of this:

1. core.async: So the guys who built golang did it because they had this cool idea for creating coroutines using channels. However, since it required very low-level functionality to be part of the core of the programming language, they thought they'd have to design a brand new language to implement their idea. However, shortly after golang was released, Rich and some other clojure folks implemented the same feature in clojure as a totally ordinary external library, proving that the core of clojure was general enough to support this. And it wasn't just a gimmick: I use core.async every day and think it is better than golang's implementation.

2. The Expression Problem: One of the core challenges in language design is designing the language so that (simplifying a bit) you can transparently add new functions (methods) to types designed by a third party. Clojure makes this easy https://www.infoq.com/presentations/Clojure-Expression-Probl...

3. Lots of languages have attempted to allow you to write a single library that you can use both on the client and the server, having your code transpiled to different languages in each case. However, this type of feature is rarely used in production, because there are usually lots of headaches involved, with many limitations. However, for clojure developers it is perfectly normal (and usually expected) that all code is written in cljc files & so that it can be used on both the client (transpiled to javascript) and the server (transpiled to java). It is easy to do this, even for cooperatively multithreaded code.
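To make the cljc point concrete, here is a minimal sketch of a .cljc namespace (the namespace and function names are made up): reader conditionals select the platform-specific pieces, and everything else is shared verbatim between the JVM and JavaScript builds.

  ;; myapp/util.cljc -- hypothetical shared namespace
  (ns myapp.util)

  (defn now-ms
    "Current time in milliseconds, on either platform."
    []
    #?(:clj  (System/currentTimeMillis)
       :cljs (.getTime (js/Date.))))

  (defn greeting [name]
    ;; plain Clojure, no conditionals needed -- identical on client and server
    (str "Hello, " name "!"))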


I use core.async a lot too and really like it. However, I do feel that it suffers from being a macro instead of part of the language. For example, you cannot nest core.async operations inside other functions, as they must be directly visible to the go macro, which cannot see inside function calls. I've also had errors where the stack trace did not reference my code files at all, because it happened inside some core.async setup (IIRC it was a variable that was meant to be a channel but was nil, inside a sub or mix or something, i.e. I connected two core.async things together, it created go blocks internally, and the exception happened inside those, so it never referenced any of my source files since it happened in an asynchronous context after my code ran). This was extremely painful to figure out.

Neither of these issues can be solved as long as core.async is implemented as a macro. However, it is extremely cool how far it was able to be taken without changing the language!


> I use core.async a lot too and really like it, however, I do feel that it suffers from being a macro instead of part of the language,

Agreed. The core abstraction of a go block is essentially a user-space thread. Because macros are limited to locally analyzing/rewriting code (and because of limitations of the JVM), go blocks are super limited in scope. As you point out, one call to a function in a go block and you can't switch out of that go block during the function's execution. And function calls are very common in Clojure code and can easily be hidden behind macros that obscure the details of the control flow.

There are 'core.async'y ways around all this, but the net effect is that 'core.async' imposes a very distinct style of writing (and one that tends to be contagious). Python's 'async/await' is contagious also, but because it's more tightly integrated into the runtime/compiler, it doesn't feel nearly as restrictive (at least to me).


I don't think it makes logical sense for there to be a "core.async function". Instead, you have each function return a channel from its own go block, then have the parent function consume data from those channels.
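Roughly, that pattern looks like this (a minimal sketch; the function names are made up):

  (require '[clojure.core.async :refer [go <!]])

  (defn fetch-user [id]
    ;; returns a channel that will eventually receive the user map
    (go {:id id :name (str "user-" id)}))

  (defn fetch-users [ids]
    (go
      (loop [ids ids, acc []]
        (if-let [[id & more] (seq ids)]
          (recur more (conj acc (<! (fetch-user id))))
          acc))))

  ;; consuming it: (go (println (<! (fetch-users [1 2 3]))))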


Sure, that's a solution, but it's necessary only because of a limitation that only exists because it's a macro that needs to look into the code to rewrite it. It also adds extra (cognitive) overhead in that such functions are always themselves asynchronous and their return values are channels - they can never just return a raw value - which limits what you can do with them, how you can call them, or where you can pass them.

I mean, yes, usually it isn’t a problem at all, but it IS a limitation of core.async that wouldn’t need to exist if it had tighter integration into the language.


> and think it is better than golang's implementation.

How is core.async better than the golang implementation? In golang, all network I/O you do is automatically asynchronous, so your goroutines won't block a thread on network I/O. That is simply something core.async can't do.


I don't know if it does, but why couldn't it? Java has non-blocking IO.


It's not just a matter of non-blocking IO. core.async uses heavyweight threads; Go uses lightweight goroutines.

The JVM would also need to incorporate a lightweight process scheduler that can wait on I/O and schedule its own lightweight routines ("j-routines") when I/O is blocked.


`core.async` does M:N scheduling, where many go blocks are scheduled across a smaller number of heavyweight threads. `core.async` also has channels and select. It pretty much works the same as in Go.

However, since the default in JVM is heavy threads and blocking behaviour, extra care needs to be taken to avoid blocking a thread that a go-block is executing on (as that would also block many other go-blocks).

So it's definitely easier in Go, since you don't need to juggle these two contexts.

The JVM has NIO, which provides non-blocking IO. You can use this with core.async's channels to resume a go block. You have all the pieces you need to write a Go-scale Clojure service.


Sorry, I completely disagree. I don't consider mere M:N thread work distribution as anywhere close to what the go scheduler offers.

This article mentions some of the fundamental problems with core.async: http://danboykis.com/posts/things-i-wish-i-knew-about-core-a...

Having to pay ridiculously careful attention to which calls block, across libraries and function boundaries, is something that golang completely takes away the need for - because it has a true cooperative scheduler built into the runtime at the boundary of every syscall.

Decision making is in the hands of the Go runtime. You can write in a standard, dumb, synchronous fashion at 3 AM within a goroutine, making all the Go library calls you wish to make, without needing to scan code with a microscope to see what blocks and what doesn't.

https://www.ardanlabs.com/blog/2018/08/scheduling-in-go-part...

core.async is a half-baked error prone solution. People will make mistakes with it. Until the JVM has native support for fibers, continuations and a compile mechanism to classify all legacy blocking calls as unsafe, core.async will continue to be hobbled.


M:N thread distribution is exactly what the go _scheduler_ offers. It's a work stealing scheduler like Java's ForkJoinPool.

The benefit lies entirely in the fact that there's no other way to do concurrency in Go. Which is great, as it's less error prone. But if you were to introduce native threads in Go, you'd have the exact same problems as you'd have with core.async. So it's not like core.async (or async/await in other languages, for that matter) is half-baked or badly implemented; it just can't automatically remove a part of the Java runtime that has existed since the beginning.

In a similar vein, Rust's actix and Java's Akka are not half-baked or bad implementations of the actor model, but they are more error-prone to use compared to languages built entirely around the actor paradigm, like Erlang and Pony.

So no, core.async doesn't magically give you non-blocking concurrency, but it does enable it. And that is a much better option than re-writing your entire codebase in Go.

> core.async is a half-baked error prone solution. People will make mistakes with it. Until the JVM has native support for fibers, continuations and a compile mechanism to classify all legacy blocking calls as unsafe, core.async will continue to be hobbled.

With Loom it won't be necessary to add a compile mechanism to classify legacy blocking calls, as there won't be any. This is why Loom is taking so long: the entire runtime is being rewritten to not block if IO is performed in a virtual thread, but to work as before if a full thread is being used.

And when you can simply create virtual threads instead of native ones, there's really not much value in core.async anymore (well, ClojureScript will still benefit) as Clojure already has great primitives for working with threads.


I don't know anything about the JVM and my experience with JVM-based languages is incredibly limited, but isn't that the essence of the Loom Project?

https://developers.redhat.com/blog/2019/06/19/project-loom-l...

https://cr.openjdk.java.net/~rpressler/loom/Loom-Proposal.ht...


Yes, however, the Loom project is still some years away and the Java standard library will need to be extensively revamped to support this. I see this easily a decade in the future.


They have been revamping the Java standard library slowly and silently over the last couple of years in anticipation of Loom. With Java 14 they rewrote the TCP socket implementation; in Java 15 they're reimplementing the UDP socket. These reimplementations are being done to support Loom seamlessly in a future update.

They also collaborated with IntelliJ to find bugs related to debugging Java apps, to make sure Loom doesn't break anything there.

Loom is well underway. I wouldn't be surprised if we saw Loom in Java 19 or 20. So in 2-3 years.


Hell, I just used loom last night on some pet projects. You can use it today if you dare.


The next LTS version of Java is planned to be Java 17 (fall 2021), and my presumption is they are trying to get it into that.


> So the guys who built golang did it because they had this cool idea for creating coroutines using channels.

Communicating sequential processes is an "old" idea. See: https://en.wikipedia.org/wiki/Communicating_sequential_proce...


That is true, but my impression is that the original syntax was cumbersome and that golang made major advancements on that front (though I'm no expert and haven't studied the original papers)


I'm no expert myself and wasn't trying to call you out or anything. (I've only read that paper once or twice before a Papers We Love event I attended.) I think it's a common misconception, though, and thought people might find the history interesting.


This Stack Overflow answer has a fascinating explanation of the history of the evolution of CSP, from Hoare's original paper, to the work done on it with Occam, up to Go. I recommend reading it if you are interested in this stuff: https://stackoverflow.com/a/32696464/172272


2. requires static typing in my personal opinion

cf. http://lambda-the-ultimate.org/node/4136#comment-62959


Spec is pretty close to being a library for static typing; Clojure core being dynamically typed isn't really a blocker.

If you really want static typing Clojure can probably do it. The community support isn't going to be as solid as for something like Haskell. But I'm pretty sure spec offers stronger control of the useful parts of a type system than something like, eg, C. Except the control over RAM; that isn't a Clojure competence.
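As a rough illustration of what spec gives you (a minimal sketch; the spec names are made up), you describe the shape of your data and get validation plus an explanation of what failed:

  (require '[clojure.spec.alpha :as s])

  (s/def ::name string?)
  (s/def ::age pos-int?)
  (s/def ::person (s/keys :req-un [::name ::age]))

  (s/valid? ::person {:name "Ada" :age 36})      ;=> true
  (s/explain-str ::person {:name "Ada" :age -1}) ;=> describes which predicate failed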


> So, the expression problem is the problem of solving another, unnamed problem, while satisfying the constraints of static typing? It seems a not very useful term then, and implies that every problem will need two names - the name for the actual problem and the name for the problem of solving that problem while satisfying type system constraints. Bleh. -- Rich Hickey

Solving the problem doesn't require static typing.


tl;dr on the above: The Expression Problem was originally expressly formulated with maintaining a Haskellian level of static type checking as one of its core requirements.

Saying you've solved it, when your solution relies on dynamic typing, or even on casts in an otherwise static language, is arguably akin to kicking the ball across the midfield line and then claiming you've scored a goal.

(See: http://homepages.inf.ed.ac.uk/wadler/papers/expression/expre...)


Actually, clojure compiles down to java classes, which are statically typed. Existing java types can be extended with clojure protocols. The link below talks about this, specifically about the expression problem.

See https://www.ibm.com/developerworks/library/j-clojure-protoco...

However, I learnt something. I didn't know that the definition of the expression problem required static typing. I prefer to think of it as adding static types to multiple dispatch.
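For a concrete (if simplified) illustration of extending an existing Java type with a Clojure protocol, a minimal sketch (the protocol and method names are made up):

  ;; add a new "method" to types we don't own, without touching their source
  (defprotocol Describable
    (describe [x]))

  (extend-protocol Describable
    java.util.Date
    (describe [d] (str "a date: " d))
    String
    (describe [s] (str "a string of length " (count s))))

  (describe (java.util.Date.)) ;=> "a date: ..."
  (describe "hello")           ;=> "a string of length 5"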


You missed a detail: or even casts in an otherwise static language.

Clojure does compile down to Java classes, but Java is only a partially statically typed language. It also permits run-time casting, which, in a language that even tries to be strongly typed, means run-time (to wit: dynamic) type checking. And Clojure relies heavily on that.

That's why I invoked Haskell. It's an example of a language that is truly statically typed, in that it doesn't (generally) permit any run-time type conversions. ALL type checks must be done statically.


Then, the conclusion can be that dynamic languages don't suffer from the expression problem.


Pretty much. Not entirely unlike how one can't implement the Y combinator in a simply typed language (without recursive types), but it's trivial to do so in a dynamic language.


“I know of no widely-used language that solves The Expression Problem while satisfying the constraints of independent compilation and static typing” - kind of implies that the problem is different from whether or not the language is statically typed, doesn’t it?

I don’t think you should be writing off Clojure’s multimethods too quickly - they are quite nice, and I haven’t seen anything like them in other dynamic languages like Python.
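For readers who haven't seen them, a minimal multimethod sketch (the dispatch key and functions are made up): dispatch can be on any function of the arguments, not just the class of the receiver.

  (defmulti area :shape)

  (defmethod area :circle [{:keys [r]}]   (* Math/PI r r))
  (defmethod area :rect   [{:keys [w h]}] (* w h))

  (area {:shape :circle :r 2.0})  ;=> ~12.57
  (area {:shape :rect :w 3 :h 4}) ;=> 12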


You've cherry-picked that sentence, though. Given the entire context of the article, I would argue that one should interpret that as a case of unclear writing, and not something that is intended to directly contradict the definition given in the second sentence: "The goal is to define a datatype by cases, where one can add new cases to the datatype and new functions over the datatype, without recompiling existing code, and while retaining static type safety (e.g., no casts)." (emphasis mine)


I'm not much of a Clojure fan myself, but I think the observation is that by relaxing that last constraint of static typing (and given the other appropriate tools), while you can't solve The Expression Problem itself, it's a bit easier to solve the real-world problem that happened to manifest itself as the expression problem in your code.

By the way, I was also enticed by multi-methods in this area, but I ultimately don’t think multiple dispatch is necessary.


Yes, absolutely, but doing so would be moving the goalpost right out of the stadium. The expression problem isn't supposed to describe a challenge for people implementing line-of-business applications. It's a problem for programming language researchers.


>core.async: So the guys who built golang did it because they had this cool idea for creating coroutines using channels. However, since it required very low-level functionality to be part of the core of the programming language, they thought they'd have to design a brand new language to implement their idea. However, shortly after golang was released, Rich and some other clojure folks implemented the same feature in clojure as a totally ordinary external library, proving that the core of clojure was general enough to support this. And it wasn't just a gimmick: I use core.async every day and think it is better than golang's implementation.

It is very limited -- it was done by a single university student as his MS thesis -- but luaproc had added this functionality to Lua before Go even existed:

http://www.inf.puc-rio.br/~roberto/docs/ry08-05.pdf

Other M.S. theses from PUC have extended the library (in particular, adding support for sending tables) but it is otherwise unmaintained:

https://www.maxwell.vrac.puc-rio.br/30267/30267.PDF https://github.com/lmillanfdez/luaproc-master https://www.maxwell.vrac.puc-rio.br/35424/35424.PDF https://github.com/fernando-ala/luaproc---messaging-tables-a...


Go was inspired by several systems that have come before it. You can write coroutine code with channels in C.

http://man.cat-v.org/plan_9/2/thread

It’s available for unix systems as well in the Plan9Port, and that predates Go by a bit.

I’m not sure what any of this has to do with Clojure though other than to say “look we have Go-like concurrency without starting over”. It’s been shown that you don’t have to start over to add features to other languages as well.

See also libmill http://libmill.org/

Go was not solely created to have goroutines and channels, but that was definitely one of the features desired.


Yes! I must point out a little factual error: JVM Clojure is not transpiled to Java, it is compiled on-demand to JVM bytecode.


core.async in Clojure is not equivalent to Go's asynchronous support. Even worse, core.async is the simplest & least-performant way to implement asynchronicity: It's just using an internal thread pool, meaning that go blocks in Clojure are not at all like goroutines in Go since they are state-machines produced by a macro that can block the entire thread when doing blocking calls. So, one has to manually "park" them by using core.async channels.

Summarizing: core.async is macro sugar on top of java.util.concurrent.

Go has actual lightweight processes that are scheduled on top of OS threads. Big difference in both design and runtime performance. Erlang too has m:n threading and offers a simpler and safer programming model (each Erlang process has its own heap).


Re "core.async is macro sugar on top of java.util.concurrent" - yes, but also no. Java does not (yet) have a way to spin light-weight processes that can park without consuming a thread OR a way to select/alt over multiple parked ops, so being able to do that in Clojure is actually novel (and enabled by having macros that can rewrite code as a state machine).

There have been a few projects that did this in the past on Java using runtime bytecode modification but none of them are widely used afaik.

This will change once Project Loom lands in Java and provides m:n fibers.


Yes, yes in the future at some undefined point java will be better.

Meanwhile go works very well today. And you don't need to be saddled with the JVM and the very bloated java ecosystem.


>very bloated java ecosystem

Why waste time reinventing what java has already solved?


Life is too short to waste on Java? Yes, some of us really find Java that bad.

Plenty of other languages have solved the same problems in much better ways (as demonstrated by this very discussion re: asynchronicity). I'll pick Go over Java, Erlang over Go and wouldn't pick Java (incl languages running on top of) for anything today.


I usually find immutability to be more important than this particular detail. Because of this my preference is usually Elixir/Erlang, followed closely by Clojure (for the potential interop with a huge ecosystem), with languages like Go and Java fairly close to each other but pretty far down the list.

I would love it if Clojure on BEAM were more mainstream.



Why not?

The java ecosystem has solved so many problems already.

Don’t you want to just focus on the business problems?


The business problem still can include things like "concurrency", which is just -painful- in Java, especially if I/O bound.

I much prefer languages that make easy stuff harder, but hard stuff easier, than languages like Java that make easy stuff easier, and hard stuff harder.


Java doesnt solve business problems.

It solves enterprise ones.

Or at least claims to.


The Java ecosystem has solved many business problems, for me and plenty of other people I personally know, both in small startups and in large companies. Just because you are personally prejudiced against Java for some reason doesn't mean that it's not a good ecosystem, because it is. I don't like Java the language (and certainly not the crazy frameworks), although it has improved, but the ecosystem, especially from Clojure, is fantastic.


Enterprises do business last time I checked.


I don't solve business problems, I solve my own problems and try to have fun along the way.


You are confusing Java with the JVM.


No I'm not, there is no escaping Java when you work with Clojure, since you'll be using Java libraries and working with Java APIs all the time.


Depends on what you're doing.


Regarding Java, sure. But in the context of Clojure, it's here right now as well, no need to wait for anything.

I also don't really understand the "bloat" comment. What are you referring to? Are you referring to the fact that the JVM supports too many features?


Sure, sometimes a simple, crude runtime will do, but if you need really good performance and low-overhead deep observability, you might want to be saddled with something that's more state-of-the-art.


Go's concurrency primitives are a less expressive version of Concurrent ML, which imo has been the nicest way of writing concurrent (but, for a long time, not parallel) applications for the last 30 years.

Guile scheme has a parallel version of it that is implemented as a library: https://github.com/wingo/fibers/wiki/Manual

Comparing concurrent systems is hard. I would suggest implementing select with negative acknowledgement (nack). If I can do that nicely, I know I will probably like the system.


I am confused. If I remember correctly, Go's runtime uses a thread pool as well. It's not like Go somehow invented a runtime that does not rely on OS threads. So at the end of the day, what difference does it make?

If you ask me, java.util.concurrent is a much better and more battle-tested concurrency infrastructure than the simplistic implementation in Go, because java.util.concurrent was actually written by the world's foremost concurrency expert, Doug Lea. I will take that over someone else's hand-rolled concurrency runtime any day. God knows how many bugs there are in your home-grown concurrency code.


It makes a huge difference. You can't use blocking calls in core.async, since they'll block the entire thread. There is no switching when you do a blocking call, the thread your code runs on will be out of commission until the blocking call returns. So this means that blocking APIs are out of the question and this translates to you not being able to write linear, straightforward code. You have to use non-blocking APIs and callbacks, and rely on core.async channels to force switching between blocks (core.async doesn't have lightweight processes). This is a total nightmare to debug and forces you to bend your programming logic to support this non-optimal programming model.
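For reference, the usual workaround looks roughly like this: push the blocking call onto a real thread with core.async's `thread` and park on the channel it returns (a minimal sketch; the URL and function name are made up):

  (require '[clojure.core.async :refer [go thread <!]])

  (defn fetch-page [url]
    ;; `thread` runs the blocking call on a dedicated thread
    ;; and returns a channel that will receive the result
    (thread (slurp url)))

  (go
    ;; parking take: the go block's carrier thread is freed while we wait
    (let [body (<! (fetch-page "https://example.com"))]
      (println "fetched" (count body) "characters")))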

Contrast with Go goroutines or Erlang processes, where you write linear, synchronous code that executes as part of lightweight processes. The built-in scheduler will automatically schedule these processes on real OS threads, and switch between them when they block (either on channels or on blocking calls). OS thread utilization is far more effective, debugging is a piece of cake, and the code that you write is easy to read and understand.

I'm bypassing your "world's foremost concurrency expert" comments. I suggest you use the concurrency offered by Go and Erlang by spending a few hours to do tasks you've done with core.async. You will learn something and you will see how Clojure falls far short of the optimal.


Hmm, what you're saying seems to contradict a lot of what I've read. We're talking about IO here, so something has to block unless we're using non-blocking IO. I think what you mean is that Go will detect the blocking call, yield your goroutine, and manage the IO in a non-blocking way automatically for you?

Under the hood, go still relies on non-blocking IO, epoll and all. And for disk and other serial IO I think it actually allocates real blocking OS threads as well.

I agree, the automatic management is convenient, but I fail to see the difference from core.async and the direct use of async IO or blocking OS threads when that's not available like for disk IO.


Go scheduler is sophisticated and it indeed intercepts all blocking system calls.

Essentially, the Go runtime will run a different goroutine if the current goroutine is blocked on: a blocking syscall (for example, opening a file), network input, channel operations, or primitives in the sync package.

http://www.cs.columbia.edu/~aho/cs6998/reports/12-12-11_Desh...

I would always prefer the runtime to do this job instead of having to scan code under a microscope across functions, libraries and module boundaries to see what blocks and what doesn't.

http://danboykis.com/posts/things-i-wish-i-knew-about-core-a...

If I wanted complete control, I would use C++


I don't disagree about the convenience of having the Go runtime handle this automatically for you. I do wish core.async could as well.

That said, all your prior comments seem misleading; at least they confused me, because it sounded like you were implying something that didn't feel accurate. core.async has lightweight processes. The core.async go macro can yield to other go blocks, and it is really cheap to do so. You can have hundreds of thousands of them concurrently parked and cooperatively switching between each other, etc.

In effect, it means that you can achieve the same scale of concurrency in Clojure as you can in Go.

Yes, it will be trickier in Clojure, because you have to manage the async IO on your own, and you have to delegate to an OS thread on your own when blocking IO is needed. In ClojureScript it won't be as much of an issue since all IO is async, but it's still not as nice as the automatic yielding the Go runtime does under the hood.


core.async doesn't have lightweight processes, for any sane definition of these terms. A code transformation spat out by a macro, subject to all the serious drawbacks I've already mentioned, is not a lightweight process.

But looking at the number of downvotes I've received, stating the obvious no less, I think I'll stop here. I will however take the completely nonsensical, misleading and downright erroneous responses illustrated in this thread as signaling me to steer well clear of Clojure and its community. Let's not forget that this thread started with someone claiming core.async is better than Go's goroutines.


> But given the number of downvotes I've received here

If I were to guess, I'd say it's because your comments feel like they come from a place of bad faith.

I'll give you the benefit of the doubt here.

> core.async doesn't have lightweight processes, for any sane definition of these terms

We can argue semantics if you want, but that leads to nothing constructive.

I understand very well that the underlying implementation is different. core.async is very clever code-rewriting machinery. Go actually has a user-level thread implementation, with each goroutine getting its own stack, and the Go runtime schedules those over real OS threads. I know all that.

The implementation doesn't matter for what I'm saying though. At the end of the day, it means you can pause and resume multiple logical code blocks and cooperatively alternate between which one is given another chunk of CPU time. All while keeping memory footprint low, and with very fast context switches.

Thus you can achieve a similar scale of concurrency, and even though they achieve this with a different implementation, Go and Clojure both achieve the same end result.

While I totally acknowledge as well that the implementation Go uses allows for a more ergonomic programmer experience.


Here is a very simple example that I think does a great job of illustrating how Clojure was designed to deal with "data" in a straight-forward and concise way:

  (defn csv-data->maps [csv-data]
    (map zipmap
      (repeat (first csv-data))
      (rest csv-data)))
This is a function that, when given the contents of a CSV file (a sequence of rows, where the first row holds the column headers), converts it to a list of maps keyed by those column headers.
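For instance, with some made-up rows:

  (csv-data->maps [["name" "age"]
                   ["Ada"   "36"]
                   ["Grace" "85"]])
  ;;=> ({"name" "Ada", "age" "36"} {"name" "Grace", "age" "85"})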

I'm sure there are clever ways of accomplishing this in other languages, but the default way a Java/C# dev would probably approach this is to create a class that represents the CSV columns, and imperatively iterate over the contents.

With Clojure, the above function is idiomatic, and 4 lines long.

When it comes to "information processing" systems, which is what a lot of us work on, having a language whose primary purpose is to provide powerful and concise tools to process that information is... I don't know, liberating? (maybe not the best choice of words).

(this example was a bit of an eye-opener for me, coming from a .NET background, as I was writing some simple Clojure code that needed to read in some data and was struck by how simple this solution is)


If you really wanted to (and you understand Clojure's destructuring and lambda syntax) you could even do it in two lines:

  (defn csv-data->maps [[head & lines]]
    (map #(zipmap head %) lines))


I prefer Perl.


It may look terse, but it's not code golf.

Any Clojure developer would understand what this does (it's actually pretty straightforward), but to an outsider, it definitely looks strange.

Part of using Clojure is becoming familiar with the language, its syntax, and its idioms.

Rich even has a section in "Simple Made Easy" where he talks about this very thing; it's on us to become familiar with the tools we are using.


Do you have any recommendations of online resources or books that can get a web programmer (experience in PHP/C#) up to speed with the Clojure ecosystem?


Hi, fellow PHP'er (and long-ago C#'er) here.

I would highly recommend learning the spirit of Clojure first:

https://changelog.com/posts/rich-hickeys-greatest-hits

https://www.youtube.com/watch?v=vK1DazRK_a0

There are a few classes of tech that are uniquely Clojure:

Data driven DSLs:

- https://github.com/noprompt/garden

- https://github.com/weavejester/hiccup

- https://github.com/seancorfield/honeysql

Hyper normalised relational databases:

- https://www.datomic.com/

- https://opencrux.com/

- https://github.com/replikativ/datahike

Advanced SPA tech (hyper normalised data driven):

- http://fulcro.fulcrologic.com/

- https://wilkerlucio.github.io/pathom/v2

Once you understand the spirit and rationale for Clojure, it becomes apparent why other communities don't have this kind of tech yet.

Once it gets down to practical things, I recommend using clj-kondo with type hints, Cursive for IntelliJ, and making sure you learn how to hot-load new code into your running program using your editor shortcuts. TDD in Clojure is also excellent and immediate: https://cursive-ide.com/userguide/testing.html

Also look out for GraalVM and Babashka; we're using them to compile fast native binaries out of Clojure.


While this is a pretty complete list of some of the most interesting things out there, it might be a little overwhelming to a newcomer.

I'd suggest starting with what everyone seems to start with:

- https://www.braveclojure.com/clojure-for-the-brave-and-true/

It's a pretty fun read, and does a good job of covering the language (plus it's free, which is probably why most people start there).

You could also take a look at Programming Clojure (written by some of the people behind the language, Alex Miller and Stuart Halloway), which I think is a better resource, but it does cost money.

It can be tempting to want to start in the deep end and try to create something like a web app from scratch as your first attempt at using the language, but there isn't really a Django or Rails for Clojure and you can easily get lost in the weeds (as I did).

My approach to learning Clojure was to start with a simple setup (for me it's VS Code with the Calva extension, and deps.edn for managing the project files) and to force myself to use Clojure for any small utility scripts that I might have otherwise done in bash or JavaScript.

This allowed me to get a feel for the language and how to work with the REPL without having to also digest a lot of information on how a specific library or framework works.


Thanks, I'll take a deeper look at Clojure. I need a way to get my Java fix and Scala isn't doing it.



You can look at Kotlin as well. Or ABCL.


If you actually supplied a Perl version maybe you could provide a convincing argument.


Here's a more or less equivalent implementation in Common Lisp, using alist-hash-table from alexandria:

    (defun csv-to-maps (csv)
      (loop for row in (rest csv) collecting
           (alist-hash-table
            (mapcar #'cons (first csv) row) :test #'equal)))
For CSVs with only a few columns (maybe less than 10?), I'd probably go with alists instead of a hash table to save memory, and it'd look like this, using curry from Alexandria:

    (defun csv-to-alists (csv)
           (map 'list (curry #'mapcar #'cons (first csv)) (rest csv)))


If you read Rich's paper, there's a section where it says:

> maps were built behind several abstractions. Thus, maps can start with a representation optimized for small maps and switch to a representation optimized for large collections if and when they ‘grow’ beyond a certain size.

Which is pretty cool, because the Clojure version of your code basically abstracts over alist-hash-table and alist, and the optimal concrete type of map will automatically be created based on the size of the map you are asking to create.


There's actually an implementation of the same HAMT data structure that Clojure uses available in QuickLisp and on GitHub: https://github.com/danshapero/cl-hamt/

Technically, since the hash-table implementation isn't specified in the CL standard an implementation could use a HAMT for the built-in hash tables, but I don't think any do.


The construction of a map is abstract in Clojure. The concrete type of map you get is not something you specify. This isn't related to the HAMT per se.

So what I mean is Clojure will use an array backed map when you ask it to construct a small map, and it will give you a HAMT backed map when the map is large.

This is also true as you add elements to a map: Clojure might automatically promote the map from an array map to a HAMT as its size grows.
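A quick REPL check of that promotion (illustrative only; the concrete classes are an implementation detail):

  (type {:a 1 :b 2})
  ;;=> clojure.lang.PersistentArrayMap

  (type (zipmap (range 100) (range 100)))
  ;;=> clojure.lang.PersistentHashMap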


"The default way a Java/C# dev would probably approach this is to create a class"

No, not for a long time. The idiomatic C# way would be to use LINQ, which actually predates Clojure by a tiny bit. Java (since version 8 in 2014) could use Streams to similar effect. Either way the solution would be as short as your example.


Assuming that you are mapping the results to a Dictionary, sure it would be similar.

My point was more that C#/Java developers (myself included at one point in time) would typically reach for classes to store the results, rather than just a Dictionary<String, String>.

It's not that other languages can't do this, it's just that this approach doesn't always jibe with their conventions the way it does with Clojure.


Right, the more idiomatic approach in java would be

- define the class

- let the framework fully handle serializing/deserializing it from your file to the neat in-memory object


That's great if you're dealing exclusively with CSV files and a fixed set of data types, but the point is that this approach in Clojure is universal regardless of where the data is coming from, or going to.

Everything comes in as a map, gets manipulated with the powerful set of core Clojure functions, and gets spit back out as a different shaped map.

It’s a very flexible approach to dealing with data, and that is one of the main appeals of Clojure.


Yup, Java in its current incarnation is definitely a better fit for larger companies that can present more standardized workloads. You want to give each team clean and explicit interfaces that they have to work with, and not have to worry about different shaped maps. Flexibility and power just lets junior developers shoot themselves in the foot.


I agree, I wouldn’t suggest Clojure for a large company with lots of junior developers.

I wouldn’t suggest Java either, though. Go would be a better fit for that situation these days.

I look at Clojure and Go as different sides of a coin.

One is a powerful language that lets a small team of experienced developers get a lot done with a small amount of code.

The other is a simple language that lets a lot of less experienced developers get a lot done with a lot of code.

I want to stress in this comparison though, that I don’t view one as better than the other.

Go was designed for development at scale, where you have lots of developers, all at different levels, all needing to contribute large amounts of code. In this respect, it does a great job at removing most of the rope that people can use to hang themselves. If I needed to pick one language to use at a large enterprise, it would probably be Go.

Clojure gives you a (different) set of limited functionality, but what it gives you is very powerful, and very foreign to most developers, so you can definitely hang yourself trying to force everyone into using it. If I got to choose one language for myself to use, it would be Clojure.


Go reminds me a lot of the old days of Java that led it to its current dominance. It used to be explicit and wordy, but there was usually just one way to do things, so a code base would stay very consistent, leading to much better portability even as code constantly changed hands between developers.

For people coming from a java background, I've had good results with other java+syntactic sugar languages as well, like Groovy and Kotlin, that have accelerated the growth of java to try to pick up some of the best parts of other more modern languages.

I would tend to disagree with the idea of "only choose one language for myself". You want to pick the right language for the job, you want to consider the business needs at hand, you want to consider the team that you have, you want to consider the experience that your team has. Even a company that would prefer to use as few languages as possible could easily end up with javascript for the frontend, ruby for the backend, go for infrastructure, python for ML. And at the end of the day, it's just code, it's just a tool to express your own skills as an engineer.

And then as an engineer personally, you need to get exposed to a range of languages, because each language teaches you a different mindset, and brings with it a different community with different priorities. Not referring to you specifically, but I've seen a lot of junior developers get into Clojure and all of a sudden it's the most amazing language because their life as a Java engineer before was horrible. But it's not Clojure specifically, it's the idea of learning new languages that is itself powerful.


A counter point is that most large companies these days operate with lots of small teams, not one big team. And most systems are composed through a service oriented architecture of some sort, not a single giant monolith worked on by all teams.

So in practice, even at a large company, you find yourself with lots of small teams.

And no such small team at a large company would be composed exclusively of junior developers. There's at minimum going to be a team lead of some sort, and most often you'll have a spectrum from senior to junior.

So in reality, Clojure can be quite a good fit even at large companies. In fact, it might be a better fit at bigger companies, because I think small startup might actually be where you could find a small team of only juniors, and a "deliver at all cost no matter the quality" business need (due to not being profitable yet).


I can't help but feel that all arguments defending how this Clojure solution supposedly can't be done as elegantly in other languages, use "idiomatic Clojure" as the judging criteria. Of course the conclusion writes itself.


Well, you can write code a lot like this in JavaScript using a library like Ramda, and get a lot of the benefits.

But then everyone that you on-board to that project has to be up to speed with how the code is being written and why.

With Clojure, that understanding is table-stakes.

So in that respect, the conclusion does write itself.


I've actually tried that: https://github.com/enumatech/sprites/blob/master/lib/__tests...

reads quite like Clojure, but looks foreign to JavaScript programmers...

I even made extensions to Ramda, so you can thread a mixture of async and non-async code with it: https://github.com/enumatech/sprites/blob/master/lib/fp.js#L...

In the end the friction was just annoying and no one else really understood the greatness of this approach, since they didn't know Clojure...


In my experience, it's not that you can't do these things in some other languages, it's that you won't, and if you try to, there will be more ceremony and friction.

That said, this is where a lot of people who took a detour through Clojure, come out of it saying when they went back to their prior language or some other language, they felt like they had suddenly become better programmers, and it helped them even in other languages to write better programs. Clojure showed them the way, and now they can recognize it and think of doing it as such in other languages as well.

Maybe a different way to think of it is that you can do OOP in Clojure if you really want, but you won't. Similarly, in Java, you could try to do FP with immutable collections and value semantics, but most likely you're not going to.

And I'm using Java here because in theory, Java and Clojure have the exact same set of available features, they run on the same JVM, same runtime, compile to the same bytecode, can share libraries, etc. So it goes to show that the language itself does influence the style that is most convenient for writing your programs in.


I am familiar with C#, and in good faith spent a bit of time trying to find examples to support this, but in the end using a class is both common and typical (omitting the many StackOverflow results with the same pattern): https://dotnetcoretutorials.com/2018/08/04/csv-parsing-in-ne...


His example is just something like .ToDictionary(column => column.First(), column => column.Skip(1).ToList())* .

The motivation to eventually use a Class would be to make use of static typing, e.g. when using an ORM like Entity Framework (a common usecase: reading a CSV file into a relational database). Static typing is usually considered a strength these days, but it does have its downsides (that debate has been done to death). Showing the parallel Clojure code for that could be very interesting.

However, reading CSVs doesn't require enforcing types, even if one wishes to use a library to deal with the complex quoting cases. CsvHelper is tilted in that direction of mapping to a static type, but it's not the only Csv reading library (I'm a pretty happy user of ExcelDataReader[0] for other reasons, and it doesn't enforce mapping. I'm sure there are more functional-style solutions out there). The original post picked the one place C# uses a powerful functional-like syntax, its generalized query language....

* From my limited understanding, not being a user of Clojure yet (I've been considering learning it for some time, just need to find time to do it seriously), each 'sequence' in the original example is a column, not a row. Were each sequence a row, in C# we could have used the positional lambda argument to get the columns.

[0] https://github.com/ExcelDataReader/ExcelDataReader


I'm a little late to the party, but I don't think I would hire an experienced C# developer that didn't immediately know this could be a one liner:

  var csv = ImmutableList<ImmutableList<string>>.Empty;
  var map = csv.ToImmutableDictionary(y => y.First(), y => y.Skip(1).ToImmutableList());
I would hope they would then tell me that this will throw an exception when there is a repeat header in the csv data. I'm not sure the functional and safe version is that much cleaner than a for loop:

  var map = csv
    .Where(x => !string.IsNullOrWhiteSpace(x.FirstOrDefault()))
    .GroupBy(x => x.First())
    .ToImmutableDictionary(x => x.Key, x => x.First().Skip(1).ToImmutableList());
Edit: But I would hire a developer that knew how to do the equivalent in Clojure without hesitation.


It's fewer lines in JS/TS, depending on how you format the parens and braces.

    let csvData = [
      ["Name", "Age", "Email"],
      ["John", 25, "john@email.com"],
      ["Mary", 30, "mary@email.com"],
      ["Anne", 40, "anne@email.com"],
    ]

    let parseCSV = ([cols, ...rows]) => 
      rows.map(row => 
        row.reduce((res, val, i) => Object.assign(res, { [cols[i]]: val }), {}))


    parseCSV(csvData)


Yes, as I said in my original post (and others in this thread), you can absolutely come up with elegant solutions to this in other languages.

In my experience though, for every one elegant solution like yours, there will be ten developers writing nested for loops.

My example was never meant to be “look at what Clojure can do! You should all be jealous! “. Only an example of how Clojure encourages a different approach to solving problems that we are all familiar with.


It's because the default abstractions provided as part of the language shape the way developers using that language approach problems. In Clojure, the sequence abstraction has been given a lot of thought and effort, which means that's what Clojure developers will try to use. Clojure also comes with a rich and powerful set of sequence functions, which makes using sequences a breeze and lets you do super powerful things out of the box.

You could simply implement the exact same functions in other languages too and end up with similar code, but since it's not built in, most people won't think in those abstractions.


> there will be ten developers writing nested for loops.

Ain't that the forever truth =/

nested for loops go brrrrrr


Aaaand this works on arrays, but would you know how to make it work for files? You would need to do something like how the essay wrapped the `rdr` into a sequence, but it would still (probably) require the whole data to be present in memory at once...

I think you would need something like this: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Refe...

but I found it quite mind-bending when I tried it once. Clojure's way of building something like this out of a `lazy-seq` call feels simpler to me...


I find that much less readable than Clojure.


Is that due to familiarity or another reason?

I, too, find the Clojure version to be easier to read, but I spend more time with Clojure than JS.


I'm not really familiar with either, I think the Clojure version just makes the intent clearer.


Here it is in Erlang, which is another language which forces you not to think in loops:

  csv_to_maps([Head | Lines]) ->
      lists:map(fun(L) -> maps:from_list(lists:zip(Head), L)) end, Lines).
Another language which works differently is Rebol:

  csv-to-maps: func [csv-data] [
      map-each line next csv-data [map zipmap first csv-data line]
  ]
What's interesting about the above is that `first csv-data` will be re-evaluated on each `map-each` iteration. Of course you can assign it to a variable beforehand. Alternatively, because code-is-data and vice-versa, you can use COMPOSE to inline it:

  csv-to-maps: func [csv-data] [
      map-each line next csv-data compose [map zipmap (first csv-data) line]
  ]
:)

NB. Rebol doesn't come with ZIPMAP, so here's a definition for the above to work:

    zipmap: func [keys vals] [
        collect [
            forall keys [
                keep reduce [keys/1 vals/(index? keys)]
            ]
        ]
    ]


Oops... a superfluous parenthesis sneaked into the Erlang code there!!

  csv_data_to_maps([Head | Lines]) ->
    lists:map(fun(L) -> maps:from_list(lists:zip(Head, L)) end, Lines).


Wouldn't this be this just as feasible and elegant in JS, Python or Scala? Maybe not as short but short enough.

Then again with Python I'd just fire up NumPy or ugh Pandas(which is basically NumPy with benefits) for this kind of transformation.


You could definitely do something like this in other languages, probably even in Java/C# using a HashMap or Dictionary, but the difference is that this is “the way” you would do this in Clojure. It's not, generally speaking, the way you'd do it in the other languages you mentioned.

I think it's one of the reasons why people always say that learning Clojure makes them better developers even when they're not using Clojure: the ideas they take away from idiomatic Clojure get used in other languages.


I think Reagent would be the easiest and most visible example.

Reagent was my intro to React; I had no experience prior, nor did I care to learn it at the time. The Reagent wrapper around React made it so intuitive to use that I was writing productive (albeit warty, because I had zero experience with functional programming/Lisp/Clojure) code on my first day. I had loads of experience writing JavaScript, but at that time React looked cryptic to me without reading any documentation, and Reagent did not.

Another commenter mentioned lodash. If I was forced to use JS directly, lodash would be the first thing I reach out for to solve problems.

But in Clojure you also get "cljc", a file extension that lets you use code in both Clojure and ClojureScript land simultaneously. With this you can write things like form validation code that's exactly the same on the front end and back end.
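A minimal sketch of that idea (the namespace and rules are made up): the same validation function compiles for both the JVM backend and the ClojureScript frontend.

  ;; myapp/validation.cljc -- hypothetical shared namespace
  (ns myapp.validation)

  (defn valid-email? [s]
    (boolean (re-matches #".+@.+\..+" (str s))))

  (defn validate-signup [{:keys [email password]}]
    ;; returns a map of field -> error message, empty when valid
    (cond-> {}
      (not (valid-email? email)) (assoc :email "invalid email")
      (< (count password) 8)     (assoc :password "must be at least 8 characters")))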


There are aspects of solution shapes that Clojure enables that are unique and mostly completely different from other languages, but it is hard to appreciate them if you don't know it. One's intellectual appreciation is literally limited to the languages one speaks (both human languages, verbal and e.g. visual like art, and computer programming languages).

I look at the benefits more concretely. It is usually possible to express a solution in Clojure with a fraction of the keystrokes another language requires, and Clojure also gives you knobs to control how many keystrokes you devote to the solution vs the stuff under the solution.

You can be all solution domain and be really lean, or you can make some reusable infrastructure to make the solution-side shorter, at the cost of something bigger overall.

The secret of good writing is usually distilled as- tell less, say more. Take away, cut, remove, until you get to the essence.

So many languages REQUIRE you to tell more, with ceremony and patterns and boilerplate. Clojure has some too, but for any given semantic intention, it generally requires the least ceremony.

It is breathtaking to wield a tool that lets you say, in a couple hundred lines, what it takes many thousands in other languages.

It is also hard, and such tools are sharp and can be humbling. But when you arrive at a place where there is no more to take away- I have not felt that "wow" in another language in quite the same way.


(not OP)

I've got a completely contrived example for you, but those are things that are typically easier to do in Clojure than in the languages I know otherwise:

Let's say you're writing a web application that consumes JSON from an external source. You have to send out responses that accept and reject requests, and you add some metadata.

You have to meet a bunch of consistency guarantees in your business logic, like: a person cannot buy more than X products at discount Y in a time period Z. Or some other arbitrary rules.

The external source decides to add an additional field in their JSON tomorrow. With Clojure this isn't considered a breaking change, you don't change a single line of code since you don't care about their new field at all.

Next week, the external source has a bug, they send out messages that are well-formed, but inconsistent with your business logic. You already wrote function specs for those and your messages automatically "explain" why your spec deems those inconsistent. Again, you don't change a single line.

A month afterwards the maintainers of the external source went through a major "refactoring". Now they changed the structure of their messages towards you and you are forced to be compatible. In Clojure this is a matter of adding a new spec and moving around some expressions, or better: composing a new (v2) top level function to meet the new spec.

Again, this is hyperbole, but there is some truth in it as well.


None of this sounds like a problem in a statically-typed system with proper data modelling. In fact for the complete refactor of the external source we would do something very similar to Clojure, i.e. update our decoders for the external data. In fact I'd argue it would be even easier because the compiler would help check all the data is accounted for.


I agree with you in the sense that these things “done proper” would be robust. It is meant as a pointer to what is specifically one strength of Clojure in comparison.

From my experience marshaling/deserialization and bubbling up errors as an introspectable useful message are things that are more involved in statically typed languages than in Clojure.

Typically statically typed languages are very good at internal consistency. But require more ceremony and maintenance regarding outside sources. This is a viable tradeoff in some cases but not so in others.


It all comes down to proper data modelling, whether in a statically-typed or a dynamically-typed codebase. If you were consuming a JSON API that gave you this data:

    {"id": 1, "name": "Bob", "age": 55}
But if your Clojure app only needed the 'id' and 'name', why would you write a parser function that also grabbed 'age'? E.g. (I don't know specific Clojure libs, so this is for illustration only):

    (defn decode-user [json-obj]
      {:id   (json/field "id" json-obj)
       :name (json/field "name" json-obj)
       :age  (json/field "age" json-obj)})
This doesn't make sense, right? Of course. And you wouldn't do it in a statically-typed language either. See https://lexi-lambda.github.io/blog/2020/01/19/no-dynamic-typ... for more on this.


Right, this makes a good point and is worth studying more closely. We have to distinguish these concepts. But from my experience this isn't done in practice. And by that I mean libraries, frameworks, idiomatic use, culture and so on, which the post at least partially acknowledges (or hints at).

This is also a reminder that type systems are not all equal at all. The range of expression matters especially in these kind of discussions.


Rich Hickey explains it at this timestamp in the video:

https://youtu.be/2V1FtfBDsLU?t=2426

"The information programming problem"

The whole video is gold; it's my favorite programming talk of all time.


I feel the other answers to your comment failed to really illustrate the property.

The history paper Rich Hickey wrote alludes to this throughout it all.

Basically, data is central: everything about the language makes modeling data simple, first class, and non-ceremonious. You can take the data your domain relies on and use it directly, in the same structure the domain structures it: no abstraction needed, no mappings, no adapters, just straight up.

It's really hard to communicate what a data centric approach feels like, but I will try. Keep in mind, there is a multitude of details and features across the entire language which all focus on this data centric approach, and they all serve a role and come together to create this property. It is not any one feature, but really the sum of all of them that enables the property to surface. I will mention only a few to give an idea.

#1 Flexible data representation

Imagine we have a business that has product listings, and its products are uniquely identified by vendor and name. As a pseudo-model we would have:

  vendor+name -> product
Now in Clojure we can model this as:

  {[vendor, name] product}
That's it. This is now the data structure we will use. We're going to write a bunch of operations over it that map to the operations the business does day to day with regard to its product listings.
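
For concreteness, a tiny sketch with made-up products and a couple of those day-to-day operations:

  (def listings
    {["Kraft" "MacNCheese"] {:price 1.99 :stock 40}
     ["Acme"  "Anvil"]      {:price 49.95 :stock 3}})

  ;; add a new listing
  (assoc listings ["Acme" "Rocket Skates"] {:price 99.00 :stock 1})

  ;; raise the price of one product
  (update-in listings [["Kraft" "MacNCheese"] :price] + 0.50)

  ;; every vendor+name the business currently lists
  (keys listings)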

Think about how you'd model that in other languages. In JavaScript, which also has a pretty flexible data model, this won't work, because keys of JS objects cannot be composite values. One would need a string encoding and some mapping from the input to the string-encoded variant and back, such as:

  {"vendor_name": product}
In Java, one would refrain from using a Map<List, Product>, and instead would model this as classes. Maybe you'd have:

  public class Listing {
    String vendor;
    String name;
    Product product;
  }
  List<Listing> listings;
But now you can't easily look up a product by vendor+name. So maybe one would instead do:

  public class ProductId {
    String vendor;
    String name;
  }
  Map<ProductId, Product> listings;
Except it turns out that for certain products, but not all, the business also distinguishes them by manufactured year. And some others are distinguished not by manufactured year, but by color. In Clojure you'd just have:

  {[vendor, name] product}
  {[vendor, name, color] product}
  {[vendor, name, manufactured-year] product}
In Java you'd now have:

  public class ProductId {
    String vendor;
    String name;
    String color;
    String manufacturedYear;
  }
  Map<ProductId, Product> listings;
Where color or manufacturedYear can be null.

And maybe you don't find the Java one that bad quite yet. So let's now talk about a second data centric feature:

#2 Value semantics

It turns out, in Java, the above code does not work. If the business says, find me the product Kraft MacNCheese, you might be tempted to do:

  ProductId productId = new ProductId("Kraft", "MacNCheese");
  listings.get(productId);
But (even assuming you added a matching constructor) the productId you created is not equal to the productId of equal value currently stored in your Map. Unless you override equals and hashCode, two objects in Java are equal only if they are the same object, not if they have the same value. In Clojure:

  (get listings ["Kraft", "MacNCheese"])
That's it. In Clojure, things of the same logical value are equal by default. As such, Clojure has value semantics. Or in other words, equal data is equal, as it should be in my opinion.
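
A couple of REPL checks of what "equal data is equal" means in practice:

  (= ["Kraft" "MacNCheese"] ["Kraft" "MacNCheese"])  ;=> true
  (= {:a 1 :b 2} {:b 2 :a 1})                        ;=> true (key order is irrelevant)
  (= (hash ["Kraft" "MacNCheese"])
     (hash ["Kraft" "MacNCheese"]))                  ;=> true, which is why the map lookup works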

#3 Data is the same in memory and out

So we've come to the point where we want to export our listings. In Clojure we'd do:

  (spit "/home/user/listings/my-listings" listings)
That's it. And when the user wants to import listings:

  (-> (slurp "/home/user/listings/my-listings")
      read-string)
Because plain Clojure data can be serialized and deserialized as-is by default. This gives us back the exact listings data structure we had.
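
For anything beyond a quick sketch you would typically print explicitly and read with clojure.edn; a minimal version of the same round trip (the path is made up):

  (require '[clojure.edn :as edn])

  ;; write the printed (EDN) form, then read it back
  (spit "/tmp/listings.edn" (pr-str listings))
  (= listings (edn/read-string (slurp "/tmp/listings.edn")))  ;=> true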

#n

And like I said, there are many more such little features, all thought of from a data centric perspective. When you add them all up, you start to realize everything is just data that needs to be manipulated from one shape to another and moved from one place to another.


In Java you can (and should) implement equality semantics however you want for objects by overriding equals/hashCode. If you just want value semantics you can use a record type, which works just like the Clojure example.


I think the point is that in Clojure you get value semantics out of the box, but in Java you have to put in extra effort to get it.


Nothing really different from writing functional JS with lodash and being conscious about immutability (or using immerjs). For macros you can use Babel.


That's kind of like saying you can write immutable data structures and then bolt on STM and try to force yourself to stay in functional territory as much as possible in any language, which the paper addresses. You can do this, and Rich did it in C#, but in my experience it's not long before you start to have clashes of philosophy with other libraries you depend on and with the language itself, not to mention people you work with.

Believe me, I tried on one project to write Ruby in as Clojure-y a style as I could, and as soon as you reach for a library you need to interact with that doesn't play by the rules, things get weird if you want to keep things functional and immutable all the way down. You need to flagrantly violate established idioms, and often you wind up reimplementing parts of the core of the language. People who aren't familiar with why you wrote it a particular way will come in later and imperative or OOP it up.


Are you speaking from experience of Clojure?


Yes, heavily. Clojure influenced (mostly through Rich Hickey's talks) many parts of the JS functional programming ecosystem. Many functions in lodash and other libraries (immutablejs, etc.) are directly or indirectly influenced by ideas Clojure helped make more mainstream. Even Java itself has been heavily influenced by Clojure.

Since these ecosystems implement the best ideas from niche, influential languages, you can use Clojure's "programming model" without giving up the productivity and practical benefits of a big ecosystem. This has been more prominent in JS since the language lends itself better to FP.


This reads to me as a rather superficial take on Clojure.

It's true that JavaScript - or any language, for that matter - can implement libraries that may have been inspired by a Clojure library. But specific libraries aren't what's being talked about here.

The characteristic of Clojure in question - which, IMO, JavaScript does not answer well - is the ability to easily create libraries and DSLs that feel like an extension of the language itself.

To poorly paraphrase a quote that I can't quite remember: Lisp isn't the best language for any problem in particular. What Lisp is the best language for is being a platform on top of which you can implement your own language, that is itself the best one for the problem you're trying to solve right now.

Or, to take a concrete example: In JavaScript, you get optional static typing from TypeScript, which is actually a whole new language that is transpiled to JavaScript. In Clojure, you get optional static typing from Typed Clojure, which is just a library.


By "experience of Clojure", I meant experience of Clojure.


"By 'X', I meant X" doesn't help to explain what you mean. Can you clarify?


The first X was in quotes, so it was a purely lexical reference to the words themselves, without referring to any conventional meaning that those words might have. The second X was unquoted and so by that I meant to convey the usual meaning in English.

Another way of saying it would have been: erm right but experience of Javascript is not experience of Clojure, regardless of any influence Clojure may have had on modern Javascript.


OK. But you didn't ask him to describe his Clojure experience, you just asked whether he was speaking from Clojure experience -- and he said that yes, he was. Asked and answered?


No, on the contrary, I understood his/her reply to be saying that he/she had not used Clojure, but that its influence on the Javascript ecosystem was such that a user of modern Javascript could consider themselves to have "experience of Clojure" in a sense.

We don't have to carry on debating it, but have another read of the reply in question -- I think you'll agree that it was in fact saying that they did not have any experience of Clojure; only of Clojure-influenced Javascript.


What stack do you use for web application development in Clojure?


The most notable things I use are Re-frame for the frontend and Datomic for the backend.


I’m in the opposite boat. I went from Clojure to Java.

I got really tired of “finishing” my work, only to discover that it was riddled with bugs from all kinds of weird edge cases. My favorite is having to use (map vec <thing>) all over the place, because the seq abstraction leaks like a sieve.
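
For readers who haven't hit this: many core functions return seqs rather than the collection type they were given, so vector behaviour quietly disappears downstream. A small illustration:

  (def v [1 2 3])
  (conj v 4)            ;=> [1 2 3 4]  (vectors add at the end)
  (conj (map inc v) 4)  ;=> (4 2 3 4)  (map returned a seq, which adds at the front)
  (mapv inc v)          ;=> [2 3 4]    (mapv, or wrapping in vec, keeps a vector)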

To be fair, I also worked in a domain that was extremely poorly suited to Clojure. The “data first” approach of Clojure falls to pieces the moment you have a bunch of data that’s syntactically similar, but semantically different. This requires a lot of cross-method coordination, creating a huge mess and a risk of missed cases.


This sounds fishy to me too... Have you actually consulted anyone with more Clojure experience about how to avoid the need for `(map vec <thing>)` all over the place? I would guess you were working with deeply nested data structures and had to do deep updates while strictly maintaining the vector data type. That's indeed kinda sucky to do using just vanilla `clojure.core`, but there is https://github.com/redplanetlabs/specter for that. Alternatively, instead of dealing with deep nesting, you could work with namespaced keys, which allow you to flatten your data; suddenly a lot of transformations become radically simpler, and such a system is more extensible too...
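
For what it's worth, the namespaced-keys idea looks roughly like this (the domain here is invented):

  ;; deeply nested shape, which forces deep updates
  {:order {:id 42
           :customer {:name "Bob"}
           :product  {:vendor "Kraft" :name "MacNCheese"}}}

  ;; the same information as one flat map with namespaced keys;
  ;; updates become plain assoc/dissoc and specs compose per key
  {:order/id       42
   :customer/name  "Bob"
   :product/vendor "Kraft"
   :product/name   "MacNCheese"}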


Your attitude is extremely patronizing.


Everyone's problems, context, and experience are different, but from where I stand it sounds like you may have had a better experience using a data spec library like Metosin's malli.

One of the pain points I had before was different kinds of data flowing through my system. Clojure spec never made complete sense to me, and it was a bit difficult to write specs that matched my data; it was easier to write data that matched the specs.

Malli handles this a lot more easily, and its human-readable errors have made my life enormously better. It's been a game changer for me.
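
A tiny sketch of the kind of thing I mean (the schema and data here are invented):

  (require '[malli.core :as m]
           '[malli.error :as me])

  (def User
    [:map
     [:id int?]
     [:name string?]])

  (-> (m/explain User {:id "1" :name "Bob"})
      (me/humanize))
  ;; => something like {:id ["should be an integer"]}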


I don't know Malli, and I haven't used Clojure since 2017, so things might have changed. If I'm misunderstanding the purpose of that library, my mistake.

The issue wasn't that we would use the wrong data in the wrong place. The issue was that we had a bunch of data that was syntactically similar (huge overlap in keys) but semantically different (key numerical results had to be calculated differently based on minor differences between the maps).

So the issue is less “the right data got to the wrong place” and more “in this data pipeline there are multiple places where the same data must be treated differently based on minor differences”. This is actually pretty tricky to do in a maintainable way in Clojure; technically possible but really hard to avoid breaking when you need to extend it.

In the end it actually turned out to be an issue that classes and objects solved brilliantly. The solution was to have the pipeline accept an interface (syntactic similarity) and have different classes for the behavior (semantic differences).
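
For readers who haven't seen the pattern: the pipeline programs against one interface, and each semantically different case gets its own implementation. A rough sketch of that shape (names invented); the Clojure analogue of the interface is a protocol, with a record per case:

  ;; one "interface" the pipeline programs against...
  (defprotocol Pricing
    (total [this]))

  ;; ...and one implementation per semantically different case
  (defrecord RetailSale [amount tax-rate]
    Pricing
    (total [_] (* amount (+ 1 tax-rate))))

  (defrecord Refund [amount fee]
    Pricing
    (total [_] (- 0 amount fee)))

  ;; the pipeline itself only needs to know about `total`
  (map total [(->RetailSale 100 0.2) (->Refund 30 5)])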


This reminds me of a discussion between Alan Kay and Rich Hickey on the role of "data" that occurred right here on HN[1]. The TLDR (though the thread is well worth reading, IMO):

Alan Kay argues that bare "data" isn't very meaningful on its own because it leaves the interpretation too open. His view is nicely summarized by the following quote:

> For important negotiations we don't send telegrams, we send ambassadors.

Rich Hickey argues that bare "data" is a fundamental primitive that, while leaving open the possibility of misinterpretation, leaves open the possibility of re-interpretation in multiple contexts. This is the strongest quote I took away from his POV:

> None of the inputs to our sensory systems are accompanied by explanations of their meaning.

Based on your above post, it looks like you're strongly in Alan Kay's camp, which may also explain (in part) your experience struggling with Clojure's approach.

[1] https://news.ycombinator.com/item?id=11945722


That is a very succinct summary, yes. I’ll add that certain applications are more amenable to one kind of use or another.

I find the possibility of re-interpretation utterly bewildering. I have never re-interpreted my data in a Clojure application, at least not in any way that I understand that word. Rewriting algorithms, yes, but you can do that in any language.

The only time I think of re-interpreting data in a different context is in a data lake or similar, which is a wildly different thing, IMHO.



