Rich Hickey: "Simple Made Easy" from Strange Loop 2011 [video] (infoq.com)
356 points by puredanger on Oct 20, 2011 | 99 comments



For me this was the best talk I've seen in a long while. It reminded me of how I felt when I first read PG's essays, someone articulating your own suspicions whilst going further and deeper and bringing you to a place of enlightenment and clarity.

BTW for those of you who haven't watched it, this talk is not Clojure specific.


However it's a great advertisement for Clojure, since the set of things he highlights as enabling simplicity is pretty much a subset of Clojure (or included libraries, like core.logic).

I'd be interested in seeing someone present a different and convincing set of concepts. At this point, I think Rich has put together a very good toolset.


Well, yes, IIRC he described one of his major goals with Clojure as enabling simplicity, or something like that. That Clojure's design follows his views on simplicity seems natural.

Non-Clojure examples would help make the point. He does bring up examples from Haskell (such as type classes) in the talk in places, but doesn't dive deeply into them.


SQL and Prolog were also cited, but yes, nothing very deep (which I think is fine for this talk)

I would love to see the programs that are generated from this philosophy.


Your comment made me take heart if I understand it correctly. Sometimes I feel I'm the only person out there who thinks SQL has a certain beauty and elegance.

I find it a little funny sometimes to hear all these "web scale" folks put down SQL, and then praise something like MongoDB because you can do map/reduce.

I saw a presentation by the author of the Lift framework in Scala and he made a bit of a joke saying "gosh...all this FP stuff...the folks who created SQL heard about that a long time ago".


I think the relational model has beauty and elegance, but not the SQL language itself. They are not one and the same.

On the subject of SQL and Clojure, ClojureQL provides an alternate relational data manipulation language that compiles to SQL. It's not just a different syntax; it allows some composability not found in SQL. The syntax does help though, especially in conjunction with the thrush operator.
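
To give a flavor of that composability, here is a hedged sketch in plain Clojure using the thread-first ("thrush") macro, not ClojureQL's actual API; the names are made up. Each step is an ordinary function over a collection of maps, so query fragments can be defined once and recombined freely.

  ;; Illustrative only: plain Clojure, not ClojureQL.
  (def users [{:name "ada" :age 36} {:name "bob" :age 17}])

  (defn adults [rows] (filter #(>= (:age %) 18) rows))
  (defn names  [rows] (map :name rows))

  ;; The thrush/thread-first macro pipes each result into the next step.
  (-> users adults names)   ;=> ("ada")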


    the only person out there who thinks 
    SQL has a certain beauty and elegance
You're not alone. :-)

http://news.ycombinator.com/item?id=1730320


If I'm not mistaken he also mentions LINQ, which is a wonderful extension of C# and the .NET Framework. It feels like LINQ and Clojure could be an interesting match.


Another way to look at it is Clojure is a reification of his thoughts on simplicity.


All of Rich's talks are great: http://www.infoq.com/author/Rich-Hickey


I'd like to point out in particular "Are We There Yet?" (http://www.infoq.com/presentations/Are-We-There-Yet-Rich-Hic...)

If you liked this talk, then you will definitely like "Are We There Yet?" In it, Hickey argues that most popular object-oriented languages make similar mistakes by bundling identity, time and state into objects. He discusses how we might simplify programming by separating these components and rethinking each.

It has a similar theme but focuses on one concrete issue in depth. It has a similar philosophical style while remaining clear-headed and practical. And, in my opinion, it is similarly enlightening. If you couldn't tell by now, I recommend it :)


Great presentation, but it got me thinking. Am I wasting my time trying to get software just right? Is it worth my time to learn Clojure or Haskell, when I don't even know what I'll use it for?

How many programmers do you know that are learning all kinds of languages and technologies and methodologies and other things to improve the quality of the software they write, and yet will probably sit at a desk writing code for the next 30 years? As opposed to starting a business, getting financially free, etc.

Take the guy from Duck Duck Go. He wrote all of that in Perl; talk about easy, but not always so simple (to maintain). What if he spent his time learning Lisp and Monads instead of writing an app that lots of people use?


To me, Clojure is about as productive as banging out a script in Ruby. It's way more productive than writing the same thing in any language where I can't (easily, or at all) just fire up a REPL and start hacking.

I can actually just crank out code that I use. It can be quick and messy. Clojure is not concerned with type theory and provable correctness.

But the end result of a quick hack, in Clojure, is often something (or is made up of parts) that can be applied to other problems by pulling it out into a library, or by abstracting one more argument to a function, etc.

I think someone like Gabriel Weinberg could get along just fine in Clojure, with much the same spirit as hacking Perl, but maybe with better results.


I kinda regret having chosen Clojure for my project. It's not that I can't express ideas in very little code, it's just that my workflow is different from languages like Python.

In Clojure, it seems that I can't apply the method of spike solution (http://c2.com/cgi/wiki?SpikeSolution) and then refactor it into working code like I do with languages like Python or C.


I'm wondering what about spikes doesn't work in Clojure? I usually understand a spike as a "proof of concept" that doesn't handle all of the edge cases.


I'm curious why you feel you can't create spike solutions in Clojure?


Probably because pseudocode in my head is imperative.


I think that's normal. Part of learning Clojure is learning to think in another kind of pseudocode.

What does this imply about the languages you already know?


> Is it worth my time to learn Clojure or Haskell, when I don't even know what I'll use it for?

I don't know if it's a binary decision between investing a lot of time and investing none. You could skim the reference manual, and if anything jumps out at you, you have a topic for future investigation, or at least a non-zero amount of information on which to base the decision of whether it's worth more time.

If you mean on a very abstract level, the balance between learning what others are doing and producing your own stuff, you might enjoy Richard Hamming's essay "You and Your Research"; pg hosts a copy on his site: http://www.paulgraham.com/hamming.html


I find I can't learn a language very well without having a real project to use it on anyway.


That's true.

I'm just thinking my time might better be spent thinking about business ideas, and then trying those ideas out on Heroku. As opposed to learning about some technology that may make me a better programmer but may not really help me get to where I want to be, which is having a successful product that I own equity in.

Is it really worth my time to learn the intricacies of functional programming? Am I still going to be writing code for some corporate overlord 5 or 10 years from now?


I fear you've missed the thrust of Rich's talk. Functional programming may seem "intricate" because it is unfamiliar, but if you're willing to do a little bit of up-front unlearning of the massively complex (but familiar) tools you're used to, you may find yourself able to work on projects of a much bigger scale and functionality.

See the slide at 17:20 -- easy stuff (Perl, Ruby, blurb) gives you 100% speed at the beginning of a project. If you don't expend effort on making things simple, you will "invariably slow down over the long haul."


    Am I still going to be writing code for some 
    corporate overlord 5 or 10 years from now?
Isn't that up to you?


Did you watch the video? He wasn't talking about tools and he wasn't talking about Clojure. It was about programming in general, and if you have to ask whether it's worth your time, the obvious answer is no.


Yeah, I did, but I don't take what Rich says as gospel. I think for myself, not just about code but about how I allocate my time.


> I think for myself

Then you should think that there may be others who aren't really that concerned about being their own boss and financial freedom (strange as it might sound); others who love learning new perspectives. Other than saying that Clojure may not be worth your time, you offer no insight. Thanks for making your preference clear; now can we get back to discussing the talk?


"Am I wasting my time trying to get software just right?"

Don't we all want to "get software just right"?

"Is it worth my time to learn Clojure or Haskell, when I don't even know what I'll use it for?"

Reasonable question. Pick a project, choose a language. I introduced Scala where I work and it solved a problem but wasn't optimal for our team. Then I introduced Clojure and that's working better for us. Real world problem solutions will help you validate your choices (there was a great talk at The Strange Loop on real world Haskell usage at a startup, BTW).

"What if he spent his time learning Lisp and Monads instead of writing an app that lots of people use?"

Like Paul Graham? (Viaweb)


A good talk. Leave it to a Lisper, though, to call testing and type-checking "guardrail programming". Hickey says you should instead reason about your program, without acknowledging that testing and type-checking are ways to have executable documentation of that reasoning. Testing does in fact feed back into complexity: if something is hard to test, it may be due to complexity that you realize you should get rid of.


Difficulty of writing a test can certainly be a complexity indicator, but in my experience the evidence is against testing having served this purpose very well to date, at least for the kinds of complexity addressed in this talk.

If you look at around 31:27 in the talk, you will see ten complex things, with proposed simple alternatives. If testing helped people feel the pain of complexity and discover simple things, we would see projects where people had problems writing tests, and switched strategies as a result.

Do you see people moving from the complexity column to the simplicity column in any of these ten areas as a result of writing good tests? I don't. What I see is people cheerfully enhancing testing tools to wrestle with the complexity. Consider the cottage industry around testing tools: fixtures, factories, testing lifecycle methods, mocks, stubs, matchers, code generators, monkey patching, special test environments, natural language DSLs, etc. These testing tools are valuable, but they would be a lot more valuable if they were being applied against the essential complexity of problems, rather than the incidental complexity of familiar tools.


> fixtures, factories, testing lifecycle methods, mocks, stubs, matchers, code generators, monkey patching, special test environments, natural language DSLs

STOP STOP STOP! MAKE IT STOP!

This is the clearest indication of how (dogmatic) testing has become a vehicle that introduces complexity, rather than something that alleviates complexity.


Just look at this. He keeps rolling out what is usually hard-earned wisdom, gained over years of experience while constantly striving to improve yourself and any software you work on.

Do yourself a favor and take the shortcut of listening to this talk. Not to say he may not join a cult religion at some future point and come out with crazy crackpot ideas then, but everything I've seen and read so far is something that all senior+ quality engineers should find some common agreement with.


Yes, I am constantly reducing complexity, but only after first writing tests that cover the reasoning of the program so I know that it won't change. Without some of the "cottage industry" of testing tools, it would take me multiple times longer to write tests, and I would do less reducing of complexity.

And yes, I have seen developers make large changes in their code towards simplicity because it was hard to test.

If someone is going to write complex code, they are going to do it with or without tests. If someone is going to write simple code, tests are a wonderful tool to have in that endeavour.


Why do you consider code generators, monkey patching & DSLs to be "testing tools"?


I don't. I refer here only to their use in that context.


I see this mostly in Java. Outside of it, you can deal with simple DSLs (it "should something") and simple functions (equal x, y).

But yeah, in Java and C# land, the complexities of the type system and class-based OO complect the testing, yielding big, complex testing systems (large enough to be called testing frameworks).


I'm sorry; I think I'm being dense - what is their use in that context?

(I don't disagree with your main point, but I don't quite see where those techniques fit in).


Some examples off the top of my head that use these techniques:

* code generators: Visual Studio, Eclipse

* monkey patching: RSpec

* natural language DSL: Cucumber


Thanks! Very good examples.

I can see how monkey patching could be useful in mocking or something similar. I've never really used a language that supports it though.

I'm not entirely sure what Eclipse's code generation has to do with testing, but given the other examples I'll assume I'm being stupid again ;) I'm actually working with a lot of EMF generation stuff at the minute which can be quite painful.


This is a purely philosophical debate; it's not to say the testing ecosystem in Clojure isn't well done:

http://clojure-libraries.appspot.com/category/137002

my 2 cents


Yeah. I like tests because they let me export my mental state about a codebase... and reimport it later. I can get the code back into my head faster.

I use lisp -- and half my code is tests.

http://github.com/akkartik/wart


He says relying on tests and type-checking to verify a program still does the right thing after making changes is "guardrail programming".


The slide where this comes up (~15:45) is about debugging. I think the point Rich is trying to make on that slide is that a bug in production passed the type checker and all the tests. Therefore, tests are not solving the problem of building high quality systems.

Rather, tests (and type safety) are "guardrails" that warn you when you are doing something wrong (they are reactive). As Rich said on Twitter (https://twitter.com/#!/richhickey/status/116490495500357633), "I have nothing against guard rails, but let's recognize their limited utility in trip planning and navigation."

Linking this back to the greater context, I believe Rich is saying that simplicity, and doing more to think about your problem at the beginning (proactive steps), provide value in building systems that are high quality and easy to maintain. I think he is at least implicitly challenging whether the value you get from simplicity is greater than the value you get from an extensive test suite.

I do not hear his comments as anti-testing, but rather more as pro-thinking and pro-simplicity. Personally, I find tests useful. I write them when writing Clojure code. However, I also get tremendous value from focusing on abstraction and composability and simplicity in design.


Okay, follow up, I watched half of the talk. Wow. What an insightful guy. I really enjoyed what I heard so far.

The section about testing and guardrails seems to have been blown way out of proportion. I fervently believe in Agile/XP practices, TDD and all such good things. But I'm not naive enough to say that "because I have tests, nothing can go wrong". And that seems to be his main point here.

It makes me think...it seems like all languages and methodologies have a "Way" of the language (call it the Tao of the language). The closer you get to "The Right Way of Doing Things" within a language, the more you reach the same endpoint. And I feel that's what Rich is talking about here.

What I like about this talk is that it could be useful for programmers of any caliber or toolset to hear. If I could have heard some of these principles when I was first learning BASIC, it would have been useful.


> And __I feel__ that's what Rich is talking about here. (emphasis mine)

I guess it's just that, isn't it. There's a lot of talk here about what Rich might have/probably implied. I suppose it would have been infinitely more helpful if he had just been explicit about it, as opposed to projecting a slightly philosophical [sic] point of view.


Thanks for distilling it. Your comment makes me want to watch the presentation and also to perhaps take a closer look at Clojure. Good thoughtful reply.


None of us do "guardrail driving", but we still put guardrails on roads.


However, on most highways the guardrails are only on the dangerous sections.

So sticking with this analogy, we should only need to use testing in the more intricate / complex parts of our code. However, current testing best practice seems to be to test everything possible, thus potentially wasting a lot of time and effort on aspects with a low ROI.

There could be some lesson in this...


I think to begin with it's futile to try to cover all relevant behaviour in tests as you introduce new code. Some basic functionality tests will do fine to prevent anyone from completely breaking the code, as well as providing fair documentation as to what the developer expects the code to do.

However, I think regression tests are useful. Once you find a bug and fix it, the things learned from fixing the bug can be expressed in a test, to prevent similar bugs from happening again. In such a case, the test documents an unexpected corner case that the developer was unable to predict.


Your regression tests sound very similar to what I call Perl tests. The Perl community was ahead of its time by distributing a test suite with packages on CPAN. Tests that come out of bug fixes tend (at least for me) to be complexity tests. Essentially they are a 2x2 test of pairs of conditional paths that interact with each other. This dovetails nicely with Rich's point -- keep things simple, but in those few inevitable areas where complexity will arise, make sure you can reason about them. I just write regression tests around them to ensure that my reasoning about them is correct. Rich skips the tests because he's better at remembering or re-reasoning through them again :)


That's an interesting point. I've noticed that as I grow as an engineer, I still place a high importance on tests, but the type of tests I write and how I write them has changed a lot.

When I first started testing it seemed like the world suddenly got really scary and now I had to test everything. I ended up testing ridiculous stuff, things that the language would provide by default. (I did this in many languages which is why I don't mention a specific one).

What I've found valuable as I do testing (I do TDD) is that it has made me change how I think about design and composability.

I agree that there should be a greater focus on "what is appropriate to test" but even knowing how to write tests and what to test is a skill in itself.


I enjoyed being a passenger for some "guardrail driving":

http://www.eurail.com/planning/trains-and-ferries/high-speed...

These too run on rails.


You may also want to look at this other great video:

Stuart Halloway: "Simplicity Ain't Easy"

http://blip.tv/clojure/stuart-halloway-simplicity-ain-t-easy...


This kind of arrogance has no future.


I can't add anything more to what @jsmcgd said, but Rich's Strange Loop talk really brightened my day and, more importantly, gave me the tools to express what I sometimes try to share with other developers in the clearest way possible. Thanks man, really awesome talk. It would be invaluable if more people could start thinking this way (which it sounds like Oracle/Java 7, 8, ... will also help to do, whether they like it or not, and that's also awesome for that general clump of dev brethren).


I feel smarter having watched this talk: it gives you tools to think about thinking as you come up with new designs. Also love how Rich manages to find the right metaphors to illustrate abstractions, he's a great communicator.


The interesting thing about this talk is how Hickey's (very valid) distinction between "easy/familiar" and "simple/unentangled" can be applied to argue for powerful type systems like Haskell's or OCaml's. Likewise with the "benefits vs tradeoffs" argument.


Likewise, some points in the talk brought to mind the OO SOLID principles. It's unfortunate that much of the discussion I saw on HN and twitter after this talk was argument about the value and purpose of testing. More useful would be examining the point of the talk and considering any common ground between Rich's "Simplicity Toolkit" and where things stand now with languages that aren't Clojure.

Take your language of choice. If you're not in a Lisp already, you can't do much about syntax vs. data, but what about the rest? Is it easy to work generically with data and values instead of making custom classes and mutable objects for every piece of data? Is it possible? Can we make it better? Are there persistent collections or managed refs available? Can we write them as libraries? Within the language's polymorphism construct, can we extend a type we don't control to our abstractions without creating adapters and more complexity around value and identity? What about transactions?
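
One hedged sketch of the "generic data" question in Clojure terms (the map and keys below are invented for the example): a piece of data is just a map, so every existing map and sequence function already works on it, with no class to write and no mutation involved.

  ;; Sketch only: a "customer" is a plain map, not a custom class.
  (def customer {:id 42 :name "Ada" :plan :pro})

  ;; Generic functions work on it immediately.
  (get customer :name)               ;=> "Ada"
  (assoc customer :plan :free)       ;=> a new value; the original is untouched
  (select-keys customer [:id :name]) ;=> {:id 42, :name "Ada"}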


Ugh. Why don't they release the presos as opposed to having to deal with synchronous video?


I empathize, but at the same time, 5 min. of slide browsing would not drive the point home. People are trying to skimp when they should invest the time. Relax, reserve yourself an hour and enjoy the talk. This is a good one, and getting signals through sound, visual and text will leave a better imprint in our brains.


If you mean just the slides, most of them are in this github repo: https://github.com/strangeloop/2011-slides


For some reason, these slides are only available in a flash widget that's synchronized with the video. Here's a little script to grab the flash and build a PDF for yourself.

ImageMagick and swftools required.

  #!/bin/bash
  # Fetch each of the 39 slides as an SWF from InfoQ.
  for s in {1..39}; do wget http://www.infoq.com/resource/presentations/Simple-Made-Easy/en/slides/$s.swf; done
  # Render each SWF to a PNG with swftools' swfrender, deleting the SWF on success.
  for swf in *.swf; do swfrender $swf -o $swf.png && rm $swf; done
  # Stitch the PNGs into a single PDF, in slide order, with ImageMagick's convert.
  convert `ls -1 *.png | sort -n | xargs echo` slides.pdf
  rm -f *.swf.png
[Edit: required packages]


I agree with "teach a man to fish ...", but you know, some people are far away from the sea (linux) so just providing them with the fish (pdf) is also a good option.


This is in no way Linux specific - most likely OS X can do it, and with enough time Cygwin too. Probably the BSDs and others...


Copyright.


That was Easy! (pun intended)

Great talk by Rich Hickey, and thanks for the script.


Awesome! Thx!


This was done at Rich's request. I've asked whether we can release them now that the video is available.


Any chance of releasing the videos somewhere that doesn't require Flash?


I watched it fine on my iPad, it should be viewable in an HTML5-mp4 capable browser I guess.


Sure enough, if I fake my User-Agent as an iPad, I get a <video> tag referencing a .mp4, which I can then download and watch. Now if only infoq would provide a "download" link pointing to the same .mp4.


No, sorry.


This is my number one gripe about infoq. I love skimming slides to see if it's worth investing an hour in watching a talk.


I have no interest in seeing slides, I want video.


Great talk.

Simplicity is, of course, key; but a few of his applications of these principles are misguided IMO.

Ex: The "Parens are hard!!" slide. He suggests that parens are "overloaded" in CL/Scheme because they are used to wrap functions, data, and structures all the same. However he completely misses the fact that by representing everything with parens, CL/Scheme remove a lot of complexity in software written in those languages.

AFAIK, the only languages that do macros right are homoiconic. Anything else is too complicated. Just look at Perl 6 and Haskell macros. They require learning special syntax and really crazy semantics. Using them will probably just make your program more difficult to understand.

He also "rails" against testing. He misses the virtues of a proper testing infrastructure in producing reliable software: if you don't test it, how do you know if it works? Because you reasoned about it? How do you know you're reasoning is correct?

True, "guard rail" testing isn't a crutch that will automatically produce reliable software. But I think Rich relies too much on this narrow view of testing to make his point. Testing is good and necessary.

And the jab to the Unix philosophy? ("Let's write parsers!"). Isn't that what we do on the web anyway? AFAIK, HTTP is still a text-based protocol. Any server that speaks the protocol has to parse a string at some point. So what was he getting at there? The Unix philosophy is about simplicity and has a lot to offer in terms of how one can design software to be simple.

Overall though, it's a great talk. I just think that if he wants to get pedantic then he could be a little more thorough and less opinionated. Everything he said about designing simple systems I pretty much agree with, but I think he glosses over some finer points with his own bias of what simplicity means.


Clojure is homoiconic just as much as CL and Scheme are. It just happens to use more than one data structure to represent code.

The "oneness" in other lisps does not make things simpler, nor, in my opinion, easier. The reason why parens (lists) in traditional lisps are not simple is that they complect several different purposes. In contrast Clojure uses list forms (almost) exclusively for "active" expressions such as function and macro calls. For grouping and binding, vectors (and sometimes maps) are used instead.

In this manner Clojure manages to keep the roles of different data structures in code mostly simple.
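
For example (a small sketch; the function is mine, not from the talk):

  ;; Lists are (almost) always calls; vectors group bindings.
  (defn area [width height]   ; [width height] is a vector: a binding form
    (* width height))         ; (* width height) is a list: a function call

  (let [w 3, h 4]             ; the let bindings are grouped in a vector too
    (area w h))               ;=> 12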


    The reason why parens (lists) in traditional lisps are not 
    simple is that they complect several different purposes.

You've repeated what Hickey says in the talk, but I'm not sure I buy it. [Deleted boring stuff about boring parens here.]

The trouble with the general argument is that you can add constructs that are individually simpler but nevertheless make a system more complex. It's overall complexity that we should seek to minimize.


How does adding several independent simple things make systems more complex in a way that is unavoidable (ie. incidental complexity, not problem complexity)? The interaction between simple elements will be explicit, whereas complex elements by definition interact in ways that you can't control.

Of course, sometimes the complex tool might be exactly what you need to solve your problem, making things easier. But in cases where you need only a part of the functionality of this tool, the complexity bites you, as all the unneeded functionality (along with its cost) is forced on you.

What sort of "overall complexity" does having several data structures in code introduce? As Rich Hickey says in his talk, complexity is not about cardinality.

In Clojure's model, the elements are distinct, and as such there is more simplicity in the system than in lisps where you have to understand the difference ("untangle" the complexity) between several different uses of lists to get anywhere.

I also think having several datastructures makes code easier to read due to the visual differences, but that's a separate discussion. :)


Adding things means more things, which means more complexity. Doing so is a net win only if the new things subtract more complexity than they add. To figure out whether they do, you have to consider the system as a whole. That much is clear. Do we know how to do that? Not really. Empirical studies favor the simplest measurement of program complexity: code size. So that part is roughly taken care of. But I don't know of a good definition of "system". What counts as a single system and what counts as separate interacting systems, or subsystems? It's in the eye of the beholder. If I take 90% of a program and call it a library, has my program become 1/10 as complex?

Rich says his definition of simplicity is objective, but it's not. What determines whether a construct has a single purpose? It's whether or not, on inspection, you think it does. S-expressions seem to me to have a single purpose: to group things. How those groups of things are treated by the evaluator is a separate concern. You, following Rich, say no, they have 3 purposes. Ok. Why not 4? A function call and a macro call have different purposes; why conflate those? (I have no trouble reading function calls in s-expressions, but sometimes run into trouble not knowing that they are really macro calls, so this is not a made-up point.)

We have no objective basis for this type of discussion, only emotions and beliefs. Concepts like "readability" are hopelessly relative, but appeals to "ease of future change" are no better; to put it politely, they depend on individual perspective and experience; to put it bluntly, we imagine them.


I don't agree with your assessment that adding things necessarily increases complexity. Assuming these things are simple (in the objective sense as defined by Rich), actually suitable for the problem, and used correctly, then any complexity arising from their use is necessary to solve the problem, and can't be avoided.

The definition of a system in this case is anything with one or more components that accomplishes a particular purpose.

It still seems you're using a different definition of complexity than I am. To me, complexity implies unnecessarily intertwined elements.

In your hypothetical library situation, the answer is likely no. Your program still depends on the "library" code in ways that make the two not treatable as standalone entities, so no complexity has been removed.

An inherently complex tool may solve the specific problem it's built for, but it does not combine well with other tools. A simple tool tries to keep itself standalone so that it can be freely combined with other simple tools to provide functionality that the original authors of either tool might not have envisioned. Clojure has many examples of this idea in action, but it's not the only language to exhibit simplicity.

I think Rich's definition of simple is straightforward and objective. For example, Clojure protocols fit the definition. They give you freely extensible polymorphism based on object types. Protocols don't even provide a direct way to do inheritance. That can be done by using a map of functions that you modify as needed, but requires no explicit support.
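
A minimal sketch of that open extension (the protocol and example here are invented, not from the talk): a protocol can be extended to a type we don't control, such as java.lang.String, without writing an adapter.

  (defprotocol Describable
    (describe [x] "Return a short human-readable description."))

  ;; Extend a type we don't own to our abstraction, no wrapper class needed.
  (extend-type String
    Describable
    (describe [s] (str "a string of length " (count s))))

  (describe "hello")   ;=> "a string of length 5"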

Your defense of s-expressions is rather puzzling. The evaluator defines what s-expressions (lists) mean in CL, and depending on context, there are multiple meanings that are completely separate. Certainly you can argue that Clojure conflates macro and function calls too (it is justified to me though, since they're all operators), but it has at least reduced complexity by not conflating binding and grouping with those elements.

As an added benefit, with few exceptions, whenever you see a list form in Clojure you can assume it's either a function call or a macro of some sort.


Having written a bit of Scheme and Clojure - Clojure's distinction between data structures makes many things simpler, from writing code to writing macros.

As for his comments on testing, I suggest you read this: http://blog.8thlight.com/uncle-bob/2011/10/20/Simple-Hickey....


Importantly, it should be mentioned that despite the way it distinguishes between different data structures, Clojure manages to retain the advantage of "representing everything with parens" by instead making sure that everything implements a common _interface_ (i.e., seq). In other words, as opposed to CL or Scheme, it separates logical list manipulation from the physical data structure, giving the best of both worlds. That's a big advantage for Clojure.


Precisely. Clojure was my first Lisp, and I got spoiled by the "every collection is a seq" uniformity. In Clojure, functions like 'count' can operate on all collections, whereas in, say, Racket, you have an explosion of methods like:

length

vector-length

string-length

hash-count

set-count

...

Clojure embodies Alan Perlis' idea that "It is better to have 100 functions operate on one data structure than to have 10 functions operate on 10 data structures," which is one of the many reasons why I enjoy the language so much.
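
A quick sketch of that uniformity in action (results shown in comments):

  (count [1 2 3])       ;=> 3   vector
  (count "hello")       ;=> 5   string
  (count {:a 1 :b 2})   ;=> 2   map
  (count #{:x :y})      ;=> 2   set

  ;; And since they all seq, the generic sequence functions apply as well:
  (map identity {:a 1 :b 2})   ;=> ([:a 1] [:b 2])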


> "It is better to have 100 functions operate on one data structure than to have 10 functions operate on 10 data structures."

Oddly enough though, you yourself are pointing out that it is better to have 1 function that operates on 10 data structures. ("length" operates on vectors, strings, hash tables, etc.) On the surface this seems opposed to the quote you chose.

However, it is clearer if you replace "data structure" with "interface". This is classic separation of concerns. When specifying the "what", we can get away with "100 functions operate on one interface", but the efficiency, the "how", can be specified independently based on the choice of data structure implementing that interface.


Yes, that is a better way of making the point. Thank you.


Big advantage for Clojure over what?

The point I was trying to make about parens is that I think Rich is creating a straw man from them. Parens in CL/Scheme don't have any special meaning other than demarcating lists. A Lisp compiler/interpreter just evaluates lists of symbols at the end of the day.


His point is that it puts together (complects.. ha!) lists, the data structure, and list-like operations into one and the same thing, when logically they are separate ideas. He's trying to point out that separating things brings simplicity, even if you end up with more of them, therefore unifying everything under one syntax for example is not necessarily "simpler". You may not agree, that's okay, but it's consistent with what he was saying, not a straw-man.


That's where I think he is 'discovering' complexity and in fact creating some.

That code and data can both be represented as lists is a major feature of what makes CL so compelling.

And I suppose that is the major contention -- is 'code as data' a complex idea? I say it's simple. There's no difference between the data structure of the code and the data structures the code acts on, so using the same operations on either should be trivial. With a handful of simple evaluation rules and a small number of special forms you can bootstrap an entire language written entirely in itself. Functions, classes, interfaces, namespaces, the whole nine-yards. The list is just an implementation detail and it's a very simple one that enables some very elaborate structures.
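
A tiny sketch of what "code as data" means concretely (plain Clojure/Lisp; the form is arbitrary):

  (def form '(+ 1 2))   ; a quoted form is just a list

  (first form)   ;=> +       ; inspect the code with ordinary list operations
  (rest form)    ;=> (1 2)
  (eval form)    ;=> 3       ; hand the same data back to the evaluator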


There's nothing you've expressed that doesn't apply to Clojure as well. I'm assuming you haven't done much Clojure and thus can't really illustrate what the problem is in practice.


I think the Unix philosophy jab is about data. Instead of using a string/text file with no structure, I think he's suggesting that a standard format (like JSON) could be much simpler.

HTTP is just that, a protocol: a standard way for two entities to communicate. In Unix, there's no protocol; it's just text without any standard form, which makes it difficult to write tools.
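
A hedged sketch of the contrast (the data below is made up): with free-form text every consumer re-parses, while with structured data the shape travels with the value.

  (require '[clojure.string :as str])

  ;; Free-form text: each tool needs its own ad-hoc parser.
  (def as-text "alice 32 admin\nbob 27 user")
  (map #(zipmap [:name :age :role] (str/split % #" "))
       (str/split-lines as-text))

  ;; Structured data: consumers work against the structure directly.
  (def as-data [{:name "alice" :age 32 :role :admin}
                {:name "bob"   :age 27 :role :user}])
  (map :name as-data)   ;=> ("alice" "bob")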


Without being confident enough to have voiced it earlier, I thought similar thoughts. Clojure appears to depart from something that characterizes both traditional Lisp and Unix systems – having one universal interface, with a big emphasis on one. Though by the definitions he outlines, that would perhaps be easy more than simple.


Rich is a good philosopher. Although it's often hard to strictly follow your own principles in your real-life work - and Clojure shows that often ;)


Slightly OT: What's the name of the template used in his presentation? It's beautiful and has a nice contrast of colours.


That's a stock Keynote theme called Industrial.


At 55:20 he says, if you have A calling B all the time, you should "Stick a queue in there."

What is an example of this?


I suppose he means that instead of A calling B directly, A should put work in a queue and B should consume from it, thus decoupling the two entities.
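
A minimal sketch of that decoupling, using Java's LinkedBlockingQueue from Clojure (the names here are illustrative):

  (import 'java.util.concurrent.LinkedBlockingQueue)

  (def work-queue (LinkedBlockingQueue.))

  ;; "A" just enqueues a description of the work and moves on.
  (defn submit-order [order]
    (.put work-queue order))

  ;; "B" consumes when it's ready; A and B no longer call each other directly.
  (defn start-consumer []
    (future
      (loop []
        (let [order (.take work-queue)]
          (println "processing" order)
          (recur)))))

The queue is then the only thing the two sides share, which is the decoupling being suggested.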


Can anyone explain how pattern matching causes complexity?



Good talk - reminds me of Yegge's recent rant about the Service Oriented Architecture, but coming from another angle (internal use vs. external).


I wonder what Rich would think of automake, autoconf and cousins? Are they simple, easy, or simply esoteric?

It confuses the hell out of me :)



