Well, it would just require the person building the type system to write a parser themselves, and it would be an extra step in your compilation pipeline. It would be like a linter.
I have yet to see compiler hooks that are as simple to write as they are in Lisp. Also, most of them are far less powerful than they are in Clojure.
I see tons and tons of examples where people use Clojure macros in interesting ways, but not that many people who write Scala compiler hooks.
So in theory you are right; in practice I have not yet seen tons of examples.
Many (probably most?) of them don't, but I don't know that it has much to do with the syntax. And many do. GCC has a plugin infrastructure, and Clang is well known for having a better one.
Clojure has a lot of great things going for it. I don't have any objection to Lisp syntax - I like it although I don't know that I prefer it - but I don't think we should be crediting it with things more properly deserved by other pieces. Proper support for macros is a biggie.
> Many (probably most?) of them don't, but I don't know that it has much to do with the syntax.
It has to do with the syntax. Writing Lisp "plugins" is very simple because all the code is represented as plain data structures. Rewriting a program is like writing simple data-structure manipulation code.
The same is true for any AST. The datastructures get marginally more complicated, but that's rarely the limiting factor.
Strictly speaking, AST data structures are often full of junk, but that's mostly not a matter of the complexity of the syntax itself but of the variety of things being done with that AST - you can't report error locations if all you have is the nested lists representing your Lisp program.
The focus on making macros work well is absolutely a strength of Lisps. The syntax is a much smaller part of that than the lore would suggest.
I agree with you that it is generally possible in other languages, but really doing it is quite hard. As of right now I will stick with Clojure over the alternatives. I also really like working in Rust, and I have found use for macros there too.
More syntax makes things harder, and compilers that have not been designed with this goal make things harder. You can look at the papers trying to get this stuff into other languages and see the problems; Dylan [1] was doing it before most others, and now Rust is copying that stuff.
Actually, some of the coolest stuff was done by David Moon (Common Lisp, Dylan): he created a language spec that he calls PLOT [2], and it has a very innovative macro system. For anybody interested in that sort of stuff, I highly recommend it.
More syntax makes things harder, but the amount of hardness introduced by the craziest syntax is dwarfed by the rest of hardness involved in introducing a type system. It's like saying "Steve was able to walk 20 miles, because his driveway slopes down a little". It may be true that his driveway slopes down, and that may make that stretch easier, but other things are more important.
It's no different in any other language with macros; the data structures are just bigger. It's pretty easy to write a Rust macro or Scala macro that consists of simple data structure manipulation. I have a simple example here: https://github.com/jroesch/pg-typeprovider/blob/master/src/t....
No, not really. Optional type annotations are not the same thing as optional typing. F# is strongly and statically typed, and if you expect it to be something else you will be frustrated.
You are correct, but I would claim that the type hinting in F# combined with pattern matching over algebraic datatypes makes dealing with a static type zoo much more enjoyable than in some other languages.
Both languages are very similar in the way they support lists, arrays, and maps as first-class citizens among their fundamental types, and both provide expressive algorithms to operate over them 'out of the box' with more or less uniform syntax.
To me, as a novice, the main apparent practical difference between the two languages is the domains for which there is strong support and which have discoverable solutions - i.e. the platform and the community.
Hy looks really incredible, because it's like Clojure without the JVM. I really hope it becomes more popular. It might take a ClojureScript variant to compete, though.
If anybody on Arch Linux wants to give Hy a try, I put together a PKGBUILD: python-hy.
I'm experimenting on this with Hype. It's barebones and does some wacky don't-look-at-it macro "magic", but it works! I'm also trying to mimic core.typed's way of annotation.
Note: it's just using Python function annotations with a type-annotation library to provide some runtime type checking. It's rather rad if I can say so myself.
From sources... Amazon is a heavy Java -> Scala shop. They are probably looking at Clojure to diversify as Scala is hitting some growing pains, but they can still leverage their large JVM expertise.
From sources... There's not much overlap between the Scala and Clojure folk, so nothing to do with any "growing pains". Scala's seeing much more uptake than Clojure.
Yeah, I can't imagine someone who's proficient with Scala seeing anything special in Clojure. I'm just a sight-seer in those languages, but I can see there are some neat things in both.
Parentheses are always off-putting to me, even though I like a fair number of Clojure concepts. I often feel like functional programming has made a cottage industry of putting obfuscated names on relatively simple ideas (see transducers, for example).
Amazon actually has a pretty interesting history with Lisp. I believe Steve Yegge has written about it before, but I'm not sure. It was pretty much completely gone by the time I got there though. Amazon went from almost completely C++/Lisp to almost completely Java/Ruby over the course of about 5 years.
I'm not directly in the dev space, but it appears as though the choice of languages that people use has far more to do with what Builder Tools will support than what people actually want to use. Teams only use Scala or Clojure if they are willing to build it using their convoluted Ant wrapper that doesn't make any sense.
I was really into Scala and studying it and everything. I still remember all the Scala bloggers I followed: Daniel, Tim Morris, Paul Snively, etc.
Eventually I landed in a Scala job role. Oh boy, was it different. The other programmers were typecasting to Any, or whatever that most general top type is called (the unified type hierarchy).
The complexity is pretty high; it's great if you know it all, but in all seriousness it was impeding...
That's just bad usage of a language feature, don't you think? I'm still trying to figure out whether to blame the language for the way programmers use it.
Personally, I like objects + static typing + first-class functions, as it eliminates 90% of errors at compile time. If any error occurs at run time, I'm pretty sure it's logical. This is a huge plus. The only drawback: the ridiculous amount of type-specific code you have to write for mundane changes/updates.
Scala with mediocre programmers (CommodityScrumDrones writing FactoryVisitor classes) is an outright disaster.
The Java community has its flaws (oh boy, does it) but it has adapted to the hordes of mediocrities. Give Scala to mediocre engineers (who typecast to Any, use null instead of Option[T]) and you have the Java hell with worse IDE support and 5x compile times.
Amazon is literally thousands of shops with a libertarian CEO. The fastest way to end a career there is to try to force a company-wide standard language although many have tried and failed.
"CORTEX is our next generation platform that handles real-time financial data flows and notifications. Our stateless event-driven compute engine for dynamic data transforms is built entirely in Clojure and is crucial to our ability to provide a highly agile response to financial events."
I'm thinking financial flow as in the billions of credit card transactions every year and financial events as possible chargebacks, refunds, purchases, that amazon runs into every second.
The first thing I thought when I read the title is that they're going to run a Datomic-as-a-Service cluster. I'm still not sure that they're not, though a "stateless event-driven compute engine for dynamic data transforms" sounds more like something to do with AWS Lambda.
I wonder what the Sable in the job description refers to? Could it be SableVM? SableVM is a clean-room implementation of the JVM, but it is no longer maintained.
Ha, sadly no. Sable is the name of an internal Amazon product that the person posting the job must have forgotten was internal. Nothing secret afaik, just not something offered on AWS.
So you did a little Lisp in your undergraduate days and now you are an expert, compared to all the people who write Clojure on a daily basis and are happy with it?
Also, LISP (I assume you are talking about Scheme or Common Lisp and not the '60s language) is not the same as Clojure. It does share some syntax and some ideas, but it's also very different in many ways.
You might prefer your legacy Java in the simple case, but let's talk again when that Java code has to do concurrency or solve other complex problems.
> Most of the modern functional languages read closer to the spoken word.
It's actually an absolute non-goal of Clojure to 'read like the spoken word'. It's a goal of Clojure to write code in a simple way, simple meaning a separation of concerns where each part is easy to reason about.
> lets talk again when that java code has to do concurrency
For single-instance applications, sure. But for scalable systems, 90% of your concurrency is handled by whatever message broker you're using anyway. Tasks, async processing, blocking queues, etc. - they're all handled by your JMS provider.
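For in-process work, the JDK primitive behind that queueing pattern is java.util.concurrent's BlockingQueue. A minimal sketch (class and method names here are illustrative, not from any of the systems discussed above):

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class QueueDemo {
    // One producer thread feeds a bounded queue; the caller consumes and sums.
    static int drainSum(int... items) {
        BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(16);
        Thread producer = new Thread(() -> {
            for (int i : items) {
                try {
                    queue.put(i); // blocks if the queue is full
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                    return;
                }
            }
        });
        producer.start();
        int sum = 0;
        try {
            for (int i = 0; i < items.length; i++) {
                sum += queue.take(); // blocks until an item is available
            }
            producer.join();
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return sum;
    }

    public static void main(String[] args) {
        System.out.println(drainSum(1, 2, 3)); // 6
    }
}
```

The queue handles all the handoff between the two threads; neither side touches a lock directly.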
Sure, if you have a one-off solution you can take off the shelf and it works for you, then it does not matter whether you use Clojure, PHP, or Java.
However, you often find that it's not enough, and then you have to write some custom code yourself. You might think one message-passing system is enough, but then it grows, and now you have to manage messages from three subsystems in your application.
I'll use a language that makes things easy when they are easy and not very hard when they are hard.
Doing concurrency right is very, very easy in just plain Java. ExecutorService is all you need, it's even 100% FP as long as you remap "anonymous class with 1 method" to "lambda" in your head.
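A minimal sketch of that ExecutorService pattern (the class and method names are illustrative): each task is just a lambda submitted to a pool, and the results come back as Futures.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class SquareSum {
    // Sum of i*i for i in 1..n, with each square computed on a pool thread.
    static int sumOfSquares(int n) {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            List<Future<Integer>> futures = new ArrayList<>();
            for (int i = 1; i <= n; i++) {
                final int k = i;
                // The "anonymous class with 1 method" is just this lambda (a Callable).
                futures.add(pool.submit(() -> k * k));
            }
            int sum = 0;
            for (Future<Integer> f : futures) {
                sum += f.get(); // blocks until that task has finished
            }
            return sum;
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) {
        System.out.println(sumOfSquares(4)); // 30
    }
}
```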
Let's consider an example. One of the classic concurrency problems is managing a bank transfer. In this problem it's important that the debit and credit operations happen atomically. If one operation fails, then both operations should fail.
In Clojure, we'd write:
(defn transfer [from to amount]
  (dosync
    (alter from - amount)
    (alter to + amount)))
How would you write the same thing in Java?
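For comparison, one plain-Java sketch of the same transfer (class names are illustrative; it guards atomicity with two locks acquired in a fixed global order so that opposite-direction transfers cannot deadlock, and ordering by identity hash would need a tie-breaker for collisions in production):

```java
import java.util.concurrent.locks.ReentrantLock;

class Account {
    final ReentrantLock lock = new ReentrantLock();
    long balance;
    Account(long balance) { this.balance = balance; }
}

class Bank {
    // Lock both accounts in a fixed order, then do both mutations
    // inside the critical section so they succeed or fail together.
    static void transfer(Account from, Account to, long amount) {
        Account first = System.identityHashCode(from) <= System.identityHashCode(to) ? from : to;
        Account second = (first == from) ? to : from;
        first.lock.lock();
        try {
            second.lock.lock();
            try {
                from.balance -= amount;
                to.balance += amount;
            } finally {
                second.lock.unlock();
            }
        } finally {
            first.lock.unlock();
        }
    }
}

public class TransferDemo {
    public static void main(String[] args) {
        Account a = new Account(100);
        Account b = new Account(0);
        Bank.transfer(a, b, 30);
        System.out.println(a.balance + " " + b.balance); // 70 30
    }
}
```

Note that unlike the STM version, none of the deadlock-avoidance logic composes: a function that transfers across three accounts has to redo the lock ordering itself.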
A lot of the problems with dealing with concurrency are in communication between threads. BlockingQueues and Executors only deal with a small subset of concurrency problems, and blocked threads are a fairly inefficient use of resources compared to async solutions.
You have probably not done much Java concurrency. Sure, you can simulate CSP with blocking queues, but since you bind real threads you will run into tons of problems.
Why don't we ask Brian Goetz, a Java language architect, how "easy" it is: read his book [1] and then let's talk again about how easy it is.
"Implemented a scheme in my undergraduate days (in C), had a small love affair with LISP in general... It's not readable. It really isn't."
Hang on. Your opinion is that Clojure isn't readable because you didn't find other Lisps readable when you were an undergraduate? But you haven't actually tried Clojure yourself?
I have tried Clojure, and my opinion is that it's easily the most readable programming language I've found.
Clojure has very strong opinions about complexity, which in Clojure parlance is a measurement of interconnectedness between components. The language is built around the idea of reducing complexity, about making things isolated and independent. Idiomatic Clojure code therefore tends to have a very flat structure consisting of isolated functions and data structures. It has a very high degree of code reuse, because it explicitly rejects encapsulation.
Clojure's syntax may be unusual, and individual forms are more information dense than many languages, however the structure of Clojure code is often much easier to understand, and for any non-trivial piece of code, that's the hardest part of comprehension.
When I come across a new Clojure library, I'll often find myself reading a bit of the docs, then heading into the source code to get an idea of how it operates. I rarely do this with an object-oriented language, as OOP code tends to be deeper, more interconnected, and, because of encapsulation, have more methods to understand. Java in particular is terrible for this.
I've written applications using Java, Groovy, Clojure and Scala (in that order), and after Clojure, coding Scala felt especially horrible (yet another syntax to learn! ++: \ ~> and so on...).
Maybe Clojure feels odd for a few hours when coming from C-like syntax, but it becomes so obvious later on. I remember I felt terrified by the parens because I had not written anything serious in it.
> I've written applications using Java, Groovy, Clojure and Scala (in that order)
I'd say the most natural pathways for Java programmers upgrading their language skills are:
* from Java, to Groovy, to Clojure
* from Java, to Groovy, to Scala
* from Java, direct to Clojure
* from Java, direct to Scala
Once someone reaches Clojure and Scala, they seem unwilling to give up what they've gained (the simplicity and macros of Clojure, or the higher-order typing of Scala) to switch to something else, even when that something offers more.
I ran into LISP during my university days as well and was pretty much force fed it. I didn't want to touch it with a ten foot pole after that. However, for various reasons, I ended up using Clojure to prototype a rather large project, and was surprised at how different it felt to work with it compared to my university days, in a positive way. Might be that I had matured, I dunno.
I don't feel that there's anything particularly magical with the paren being on the left hand side of the verb that automatically makes it less readable than when it is on the right hand side of the verb.
I think it's also the indentation, along with the everything-is-an-expression, formatted-as-a-list kind of thing.
Especially that alternating-pairs business.
Having to know what is a macro and what is not affects understanding how arguments are evaluated, and basically everything is an argument differentiated only by position.
I've heard one should read a Lisp function not from the top down or outside in, but rather from the most indented part first, and then work your way up and out to discover the definitions.
>Having to know what is a macro and what is not affects understanding how arguments are evaluated, and basically everything is an argument differentiated only by position.
The whole point of macros is that they are written in the same syntax as everything else (because of homoiconicity you can operate on your code as native data), so you don't need to know what is a macro and what is not for 90% of the code you write. There are very few cases in Lisp languages in general where the difference between a macro and a function is evident, and in Clojure even fewer (compared to another dialect like, let's say, Common Lisp).
There are some times where that might be useful, but that's why we have a repl and macroexpand.
Yes, they have the same syntax, but the semantics are different:
"that macro function is called and is passed the unevaluated operand forms. The return value of the macro is then evaluated in its place.
If the operator is not a special form or macro, the call is considered a function call. Both the operator and the operands (if any) are evaluated, from left to right." - http://clojure.org/evaluation
There are many macros and special forms in Clojure where you have to understand that the arguments are not evaluated first - and if you try to reason about what happens as if they were eagerly evaluated, you get the wrong answer or an error.
> I've heard one should read a Lisp function not from the top down or outside in, but rather from the most indented part first, and then work your way up and out to discover the definitions.
That's why the pipeline operator (->>) is common. Then you just read it from up to down.
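For readers more at home in Java, the threading macro serves the same readability purpose as a fluent Stream pipeline, which also reads top to bottom (names here are illustrative):

```java
import java.util.List;
import java.util.stream.Collectors;

public class Pipeline {
    // Keep the evens, then add one to each; each stage reads in order,
    // much like (->> xs (filter even?) (map inc)) in Clojure.
    static List<Integer> evensPlusOne(List<Integer> xs) {
        return xs.stream()
                 .filter(x -> x % 2 == 0)
                 .map(x -> x + 1)
                 .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(evensPlusOne(List.of(1, 2, 3, 4, 5))); // [3, 5]
    }
}
```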
Grabbed at random. I think the larger indentation and fewer parens do make a difference when you squint at things.
Of course, I also think as a professional programmer, if something does the job, get over it and get on with it. I use Erlang myself, which also gets lots of syntax complaints.
Completely agree. I wrote a non-trivial amount of code in Clojure, but when I go back, it takes a long time to understand each line that I myself wrote. Java, for all its verbosity, was readable. I'm going back to Python now.
It really is readable, to me at least. I just wanted to say that because some people will perhaps read your comment and take it as true, because they too struggle with it at the moment, and then discard Clojure. But I'm just in love with it, yes, even the syntax.
I've spent some time writing Clojure recently and I like it, though readability difficulties can arise quite easily; things can at times get twisted into a functional way of doing things (the capacity for abuse here is maybe less than in OO, but it can be difficult to untangle). Obviously this varies with the coder; I've kept it to small projects and it has worked well.
I wouldn't be surprised if Clojure has more "industry" adoption than Scala at this point.
Give me Clojure semantics (anti-OO, maps, vectors, and seqs) with some optional typing and a Python syntax, and I'm there *
* that's just my preference; no need to convince me otherwise.