Hacker News

Who still has nightmares of infinitely nested parentheses?



The more nightmarish thing about Clojure is realizing that, in truth, you have no idea what all these maps you are passing around the terse, nil-punning functions of your codebase hold at any given time.


That was the case for me. I went all in drinking the Clojure Kool-Aid and wrote some small internal CLI tools with it. If I came back to that code a month or two later, I could only properly understand it if I opened up a REPL to debug it. I ported those tools to Java and they were dead simple to comprehend.


After a while, the less magic, the better.

One of the reasons Golang has a use case today is that there is a need for a programming language with just functions, loops, and if/else statements.


Yup. I learned Clojure just so I could use a Lisp and get paid for it, but there is some weird cult against all forms of typing. Even coming from a Common Lisp background, this was strange to me. In Common Lisp, there are implementations (like SBCL and ECL) that can make use of type declarations to produce efficient machine code and allow the compiler to catch errors that would otherwise be run-time errors. There are also other benefits, like contextual autocomplete. The autocomplete in Clojure tooling is very basic, and many Clojure libraries try to make up for this by using qualified keywords everywhere. That way, rather than seeing all keywords ever interned, you can type ":some.namespace/" and your editor shows a dozen keys instead of hundreds of unrelated keys.

Many in the Clojure community believe that occasionally validating maps against a schema "at the boundaries" is good enough. In practice, I have found this to be insufficient. Nearly every Clojure programmer I know has had to "chase nils" as a result of a map no longer including a key and several functions passing down a nil value until some function throws an exception. (Note: I don't specify which exception, because it depends on how that nil value gets used!)
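The "nil chasing" failure mode described above has a close analogue in Java whenever data is carried in untyped maps instead of typed objects. A minimal sketch (all names and the removed-key scenario are hypothetical): a key quietly disappears upstream, null passes through several layers without complaint, and the exception finally surfaces far from the cause.

```java
import java.util.HashMap;
import java.util.Map;

class NilChase {
    // What an upstream layer used to return; imagine "email" was
    // silently dropped during a refactor (hypothetical scenario).
    static Map<String, Object> loadUser() {
        Map<String, Object> user = new HashMap<>();
        user.put("name", "Ada");
        // user.put("email", "ada@example.com");  // key removed upstream
        return user;
    }

    // Several layers happily pass the null along...
    static Object extractEmail(Map<String, Object> user) {
        return user.get("email"); // absent key: returns null, no error yet
    }

    static String normalize(Object email) {
        return (String) email; // still no error: casting null is legal
    }

    public static void main(String[] args) {
        String email = normalize(extractEmail(loadUser()));
        // ...until something finally *uses* the value:
        System.out.println(email.toLowerCase()); // NullPointerException here
    }
}
```

The stack trace points at the last line, not at the refactor that removed the key, which is exactly why the exception thrown "depends on how that nil value gets used".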

Refactoring Clojure code in general is a nightmare, and I suspect it is why many in the community are reluctant to change code in existing libraries and build entirely new things in parallel instead. Backwards compatibility is one often-cited reason, but I do think another reason is that refactoring Clojure code creates an endless game of bug fixing unless you have full test coverage of your codebase and use generative testing everywhere. (I've never seen a Clojure codebase with both of these things. I can count on one hand the number of Clojure codebases where generative testing is used at all).

Function spec instrumentation provides something that feels like runtime type checks in Common Lisp, but now you have to manually run certain functions at the REPL just to ensure some change in your codebase did not introduce a type error.

On the flip side, Java has things like DTOs, which always felt too boilerplate-ish to me (though at least they provide useful names for endpoint data when generating Swagger/OpenAPI documentation). Even then, records in Java provide what are essentially maps with type safety and characteristics similar to DTOs.

I think the structural typing offered by languages like OCaml and TypeScript provides exactly what I'd want in Clojure. But when faced with feature requests in Clojure, people will state something like "I have never had a use case for X, therefore you don't need X". In the case of criticisms, the response is often "I may have run into X before, but it's so rare that I don't consider it a problem".


I still don't get how Java records can be used for anything like a DTO. Since you're a Clojure dev, you may remember the pattern Rich Hickey described as "place-oriented programming" :) Nearly every endpoint will have more than 2-3 fields, and you really don't want a Java record with more than that many fields for the same reason you don't want a Java method with that many parameters, e.g. doIt(Long, String, String, Long, String, int, int, String) <-- code smell.

And the problem I always see is something may start off as a Java record and then need to be refactored into a class as soon as 1-2 more fields are added.


> I still don't get how Java records can be used for anything like a DTO

In Clojure, we often deserialize a result set from a database to a vector of maps. These maps have different keys depending on what exactly your query was selecting. In Java, one often "projects" results to some DTO. This is one scenario where records offer identical functionality while avoiding boilerplate.
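As a concrete sketch of that projection case (the record name and fields are hypothetical, and the instance is constructed by hand here rather than from a real result set): one line of code gets you the constructor, accessors, equals/hashCode, and a readable toString.

```java
// A hypothetical projection: only the columns this query selects.
record UserSummary(long id, String name) {}

class ProjectionDemo {
    public static void main(String[] args) {
        // In real code this would be populated from a ResultSet or an
        // ORM projection; constructed directly here to show the shape.
        UserSummary u = new UserSummary(42L, "Ada");

        System.out.println(u.id());   // accessor generated for free
        System.out.println(u);        // UserSummary[id=42, name=Ada]

        // equals/hashCode are also generated, giving the value
        // semantics you'd get from comparing two Clojure maps:
        System.out.println(u.equals(new UserSummary(42L, "Ada"))); // true
    }
}
```

An equivalent hand-written DTO would need a constructor, four getters, equals, hashCode, and toString, all of which are boilerplate the record declaration replaces.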

Regarding "place-oriented programming", records are immutable, so that is one technical advantage they have over handwriting a DTO. And from my short experience using web frameworks like Quarkus, it seems that a lot of the "design patterns" I see in documentation exist to help design easy-to-test programs rather than to enable unfettered mutation.

Additionally, I have found records useful for describing the payload of endpoints that accept map-like data. Without records, I would be writing POJOs with public fields anyway.

Overall, Java records behave like TypeScript interfaces with awkward syntax. I have found them ideal for expressing type-safe, map-like data with minimal boilerplate.


I'm also using records for projections. But to be honest, we have a fairly large Quarkus app and only use projections in a few places. For immutable classes with a large number of fields, where instances have to be created manually, I usually use the builder pattern. But the analogy to TypeScript interfaces is interesting.
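For reference, the builder approach mentioned above might look like this minimal hand-rolled sketch (class and field names are hypothetical): the target class stays immutable, while the builder gives named, optional parameters instead of a long positional constructor.

```java
// Hypothetical immutable class whose field count has outgrown
// a record's positional constructor.
final class ReportConfig {
    private final String title;
    private final String owner;
    private final int pageSize;
    private final boolean landscape;

    private ReportConfig(Builder b) {
        this.title = b.title;
        this.owner = b.owner;
        this.pageSize = b.pageSize;
        this.landscape = b.landscape;
    }

    String title()      { return title; }
    String owner()      { return owner; }
    int pageSize()      { return pageSize; }
    boolean landscape() { return landscape; }

    static final class Builder {
        // Sensible defaults; callers override only what they need.
        private String title = "";
        private String owner = "";
        private int pageSize = 50;
        private boolean landscape = false;

        Builder title(String v)       { this.title = v; return this; }
        Builder owner(String v)       { this.owner = v; return this; }
        Builder pageSize(int v)       { this.pageSize = v; return this; }
        Builder landscape(boolean v)  { this.landscape = v; return this; }

        ReportConfig build() { return new ReportConfig(this); }
    }
}

class BuilderDemo {
    public static void main(String[] args) {
        ReportConfig cfg = new ReportConfig.Builder()
                .title("Quarterly")
                .owner("ops")
                .pageSize(100)
                .build();
        System.out.println(cfg.title() + " " + cfg.pageSize());
    }
}
```

Unlike a record, adding a field here only touches the class and its builder; call sites that don't care about the new field keep compiling unchanged.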



Typed Clojure is interesting. However, until two months ago, it was practically unusable for most Clojure projects because of the lack of type inference in higher-order functions. This has changed, but there's another huge problem: nobody maintains type declarations for widely used libraries. If you look at alternatives to TypeScript, such as ReScript, you will find similar issues.

I still use Clojure, but I am fully aware of the kind of bugs to expect down the road. Typed Clojure would work if I could maintain types for each library I use, but that is simply too much effort on my part.

As for clojure.spec, I already addressed this in my post, but I will state the following.

Schema, Malli, and Spec are not substitutes for a type system; each of these libraries explicitly states so. You still need to enable instrumentation and actually run the code that violates some contract. Most Clojure programmers have the habit of enabling instrumentation in dev and disabling it in production because validation is an expensive operation. I personally use Malli for data validation and coercion, but it does not make refactoring any easier, nor does it help autocomplete and other development-related tooling.

(Someone will probably link a Malli document demonstrating clj-kondo linter generation, but even that is not a substitute. At best it detects arity errors and primitive type mismatches, not the shape of data in a map.)


They seem to disappear with parinfer.



