John Carmack: “I just dumped the C++ server I wrote for a new one in Racket” (twitter.com/id_aa_carmack)
402 points by tilt on March 17, 2015 | 275 comments



It's refreshing how humble Carmack always sounds in his Twitter feed. I've found that old school game programmers tend to be hardline and a bit belligerent about their code; Carmack, on the other hand, often tweets about going against his better instincts and exploring new technologies. Not a lot of sarcasm or negativity at all. If only more people had Twitter streams as pleasant and informative as his!


I second this. It's something we can all take a lesson from. Unfortunately I see far too many programmers who model their personas after Kanye West.

I had the great fortune of meeting Steve Bourne, inventor of /bin/sh. If ever there was a guy who could accept any and all accolades it would be him. Instead he was most gracious and humble and spent all his time showing interest in our work as well as answering our questions about the invention of the shell and Unix.

Speaking of which, did you know he was the first alpha tester of Unix? When Ken Thompson and Dennis Ritchie were writing it, they would hand the magnetic tapes over to Steve Bourne and have him test it.


Less Kanye West and more Linus. And I'm of the opinion that an attitude and approach like Linus' is fine - if you're Linus.

I think it's sort of made its way around as the way to act if you know what you're doing and that's a shame. I know a lot of great devs who are lost in the shuffle for not being boisterous enough.


Linus is a technical genius with a very abrasive personality. People tolerate the abrasive personality because he is an excellent programmer and steward for the kernel.

What's intolerable is when untalented people cargo-cult Jobs or Linus and pretend that their personality is what caused them to succeed. There is nothing more scary than an untalented manager reading one of the many 'inspirational' leadership books about Steve Jobs - they are likely to pick up every single wrong lesson.

Has anyone ever complained that Terence Tao is apparently a genuinely really nice person? People would still work with Tao if he was a misanthropic asshole because he is probably the smartest person in the world, but it's not like being a kind and generous human being detracts from his genius.


But Linus's personality is only abrasive if you screw up majorly when you should have known better.

We haven't seen Carmack when someone tries to contribute naff code to one of his game engines. He doesn't develop in the open the way Linus does.


I've found that I inherently assume I've screwed up majorly in my code. Evidence shows that's not the case, but that doesn't change how I feel. As my own strongest critic I write some pretty strong test cases, because I'm sure I did something wrong. Of course I can't test my own test plan, so I'm always worried that I missed something stupid where I should have known better.

I also don't like strong emotional confrontation.

Can you see why I wouldn't want to work with someone abrasive like Torvalds? Part of me will always be on edge expecting to be chastised, even if that were never directed at me. So it isn't only the targets of his abrasiveness who are affected, but also those who think they might be targets, even if that belief is wrongly held.


I genuinely have no special insight into what Linus is like as a human being.

I just think that his abrasiveness (even if it only manifests itself in a few circumstances) is a negative that people tolerate because he is such an exceptional project manager and programmer. The keyword is 'tolerate'.

Too often, people think that his success comes not from his technical mastery (which is very difficult to imitate) but from aggressive rants and call-outs - and try to imitate those and justify it as 'well that's how Linus runs the kernel'. It's even more prevalent with people cargo-culting Steve Jobs' personality flaws thinking it will make their company the next Apple.


People don't just tolerate him for being negative, because he's really not all that negative. He's helpful if people actually need help. However, if he trusts you to not be stupid, and you break his trust, he's going to get mad.

He needs to be able to trust people because he needs to merge thousands of patches and make sure nothing bad gets in.

This is the only reason you should ever do what Jobs or Torvalds have done: trust.


Exactly: "succeeds in spite of" rather than, "succeeds because of."


>We haven't seen Carmack when someone tries to contribute naff code to one of his game engines.

Romero


> What's intolerable is when untalented people cargo-cult Jobs or Linus and pretend that their personality is what caused them to succeed.

Yes, some less talented developers are jerks and decide to hide their deficiencies by imitating Linus Torvalds. That does happen. But I'm sure that to a far larger extent, imitating the style of the great leader is just something that happens subconsciously.

That is really the reason why I disagree with nkozyra's grandparent comment. Even somebody who really knows their stuff must be conscious of how they communicate, because of the potential problems that arise when their style is inevitably adopted by those who don't really know their stuff.


More like the way they think Linus acts. Yes, sometimes Linus has outbursts, and I think it's fair to criticize him for that. But 99% of the time he's listening to ideas from all sides and deferring to other people on the things he doesn't know. I think a big reason for Linux's success is Linus' skill as a mediator, and that's something these imitators don't seem to grasp.


Right - it's like all modern news reporting - it's incredibly distorted in favour of the dramatic and attention grabbing parts which are usually exaggerated beyond belief. The other 95-99% just doesn't get any attention because it's not news-worthy.


> And I'm of the opinion that an attitude and approach like Linus' is fine - if you're Linus.

I'm of the opinion that it's not. And, it provides a really poor example to the younger people in the field.

I've known a lot of programmers, many of them far smarter than Linus. Funnily enough, the better they were, the nicer they were.


I don't think you know anything about Linus beyond his widely publicized outbursts (which also means you have no idea how smart, or not, he is).


OMG is he a 10x engineer? 11???

Seriously, there's no productivity benchmark beyond which you shouldn't treat people with respect. There are broken cultures where you can get away with it, but you'll still be an asshole.

Linus is an asshole not because it's some deep, essential part of his nature; he's an asshole because people give him a free pass.


Parent post was asking whether your knowledge of how much of an "asshole" Linus is comes from only reading the publicized outbursts, and not interacting on a day to day basis.

While it's not preferable, I think that we can be understanding of the fact that people can sometimes have outbursts due to stressors in their life. That said, whenever I've seen people bashing a Linus outburst, I've gone and read said outburst. It's never as bad as the "tech press" make it out to be.


>While it's not preferable, I think that we can be understanding of the fact that people can sometimes have outbursts due to stressors in their life.

Sure. The question is, do the rest of us treat those incidents as mistakes where somebody went over the line and behaved poorly, or do we defend the behaviour and even try to argue that it's a sign of good project management that should be emulated because it 'tells it like it is' and keeps away the 'idiots'? Sadly the latter is a common attitude, as seen in other comments in this thread.


I'm not sure, but given his cultural background, I'd say he's less of an asshole than most people think he is.

The level of rudeness in conversation is very dependent on cultural background, and those ignoring it display cultural insensitivity.

Europeans are more straightforward than Canadians and Indians, where everything is said in a mild passive-aggressive tone that goes nowhere.


I understand and have experienced what you're talking about, but I don't think it gets to the meat of what people dislike about the way Linus (occasionally) communicates with people. Being more straightforward as in "no that's a dumb idea because X" may put people off, but can be chalked up to "cultural differences in communication". This doesn't really apply to the kinda stuff Linus has said that people complain about:

"Mauro SHUT THE FUCK UP"

"I don't _ever_ want to hear that kind of obvious garbage and idiocy from a kernel maintainer again. "

"fix your approach to kernel programming."

"There aren't enough swear-words in the English language, so now I'll have to call you perkeleen vittupää just to express my disgust and frustration with this crap."

It's ridiculous to claim that that's just avoiding unnecessary civility. Those statements aren't "blunt", they're just him being a dick.


I agree - and in fact, Linus does also - that some of those are crossing the limit.

What I find remarkable is that people always talk about the same half-dozen remarks - from a history of 20 years of kernel development completely in the open.

Linus is a fucking remarkable example of civility. He just doesn't have an office where he can shit all over people in private, like so many managers.


The reason people keep talking about those incidents is that a lot of people defend that behaviour (see other posts in this thread) or even promote it as the best way to manage a project and deal with 'inferior' programmers.

If everybody was just saying that those were unfortunate incidents where he lost his temper and agreed they were inappropriate then this debate wouldn't keep happening. Some people aren't prepared to admit that Linus is ever at fault in any way during these episodes. There is even a group of people who think that those episodes are examples of 'telling it like it is' and 'sticking it to the PC crowd' and thus should be replicated as much as possible to make sure your project isn't 'swamped by idiots'. Unfortunately some of the people with that attitude maintain open source projects (and have downvote powers on HN).

I have found buffer overflow bugs in widely used open source projects that I have not informed the developers about, because when I've made previous bug reports they acted like assholes. I don't feel inclined to open myself up for public humiliation just to contribute to their project.


> and deal with 'inferior' programmers.

Except the people he usually deals with in a harsh manner are not inferior; rather, judging from the examples, it is usually the top collaborators.

> I have found buffer overflow bugs in widely used open source projects that I have not informed the developers about, because when I've made previous bug reports they acted like assholes

I understand, but did you follow the bug reporting procedure?

Also, some projects may act polite towards outsiders and then direct criticism (and bugs) to /dev/null.


We defend that behaviour because this debate keeps happening and people keep blowing his (few) outbursts way out of proportion.

I don't think it's the best way to manage anything, but I also don't think these outbursts are a big deal.


No, he's not a remarkable example of civility. He's just a realistic example of a person who manages others. The thing is, like you pointed out, he does it in public.


> What I find remarkable is that people always talk about the same half-dozen remarks - from an history of 20 years of kernel development completely in the open.

Your comment would be relevant if my comment was the top of a thread or something. Take a look at the context of the conversation. I wasn't popping in out of context out of nowhere and saying "Linus is an asshole, look at these comments!". I was responding to the specific assertion that these asshole-ish statements can be attributed to cultural differences. I in no way disagree with the idea that "Linus was being a dick in these cases" is very very different from "Linus is a dick".


While they are blunt, and I don't disagree about "him being a dick", written communication is very different from spoken communication; I can imagine someone using such words in a certain context where they sound less harsh.

But in the written medium it really comes across in the harshest way.

But while maintaining the kernel he sometimes needs to get the point across, and saying "this is unacceptable, I am disappointed" will not work.

For those examples, when reading the context I (usually) agree with Linus (and with his attitude), and this usually happens when the situation has been building up and people keep repeating those mistakes.

Also, there are people much worse than Linus on the LKML (and subsystem lists), while some people are very polite and helpful.


Linus explains his point of view on this here: https://www.youtube.com/watch?v=JZ017D_JOPY


"Europeans"? Grumble. There is no "European" culture.

Also, Linus isn't representative of Finnish or Swedish culture (neither of which puts a lot of stock in assholishness) as much as he is representative of early-90s hacker culture that he never had to grow out of. It doesn't work as well for a guy with a lot of power, though; it isn't cool when he kicks downwards.


You know what I mean, unless you want to nitpick. Is "Continental Western European" good enough?

I'm not saying he's representative, but it certainly plays a part.

Unless you want "Continental Eastern European", which makes Linus sound polite?


He's right, and throwing up "Continental Western European" as an answer really emphasizes that your picture of Europe isn't in line with reality. You are saying the cultures of Spain, France, Germany (and many others) are all the same? That doesn't make sense. And then you lump up the Balkans and others as if somehow that's another group with a single culture?

kryptiskt is right and not nitpicking. You are trying to take widely diverse groups of people and lump them all together based on a general ignorance of the places you are talking about.


> You are saying the cultures of Spain, France, Germany (and many others) is all the same? That doesn't make sense

Of course it doesn't make sense, that's why I never said it, but the cultures share some traits, even across language borders.

Having lived in more than one country in Europe I think I know something about the differences (and similarities) between them.


It makes less sense if you've lived here. Then you ought to know "European Culture", "Continental Western European Culture" and so on are nonsense labels. It would be like talking about North American Culture.


I have no clue what you mean; other European OSS contributors don't have that reputation, and they're legion. It's a just-so story made up to excuse Linus's behavior. It has nothing to do with Europe, or your imagined "European" culture that doesn't exist.


Dane here. The stereotype here is that Swedish people, much like the British, never say directly what they mean, are very polite, politically correct and generally much more orderly and conflict averse than us. My own experience with interacting with Swedes confirms this, as much as such things can be confirmed.

Linus is of course from the Swedish-speaking minority in Finland, and I don't know if it translates. But anyways, like others I find it a little weird that you speak of a singular European stereotype. It's not something I can understand at all, really, but I suppose yours is an outsider's perspective.


Linus has a sense of humour that is often misunderstood, especially by the PC crowd. Abrasiveness is an innate Finnish quality. Just think you're in a Monty Python sketch.


What are you proposing? Because if it involves preventing him posting to mailing lists or restricting his contributions to kernel development I'm against, and on pretty fundamental ethical principles: you appear to think there's a deontological injunction against permitting someone to do harm by disrespecting others, whereas I think that harm should be weighed against the benefits that would be lost thereby.

I also think that your opinion on why Linus behaves as he does is wishful thinking, sorry. There's a decent chance he'd take his ball and go home if the above were done. That would be bad.


It's also a mass communication strategy. Concise and harsh. No time wasted.


I think you should treat Linus with more respect.


Let me first point out that pretty much every single programmer I have met in person has been pretty easy to get along with, and that I try myself to be as accommodating and friendly to other people as I can. (My day job as a sysadmin includes what is essentially helpdesk work, so this attitude comes in rather handy.)

But it seems there is a kind of tradition among hackers of people that are exceedingly smart but also jerks.

They get away with it if they are sufficiently smart, because many hackers tend to value a certain kind of cleverness above most other things. And honestly, I would rather deal with a smart jerk, as long as he/she was not trying to feed me bullshit, than a friendly but ignorant buzzword-slinger.

Having said that, let me repeat that I would prefer a friendly smart guy over both of these extremes, and so far, I have been lucky.


Linus is nice ... I have not seen him being a jerk just for the sake of it.

But if you cannot take having a new one torn when you do something stupid, you are in the wrong field. Bullshit, stupidity and thin-skinnedness should not be tolerated.


Just my €0.02, but I think it's precisely the "new ones" that should be treated carefully. The kernel (or whatever other FOSS project) won't develop itself, so it's simply bad economic strategy to scare off newcomers who might well turn out to be very talented devs. Aside from the economics, it's also simple civility not to shout at people if (you believe) they're idiots.


I agree with you about treating newcomers nicely, but do note "having a new one torn" is an expression that doesn't refer to new people :)


Right, I totally didn't parse that sentence correctly on account of not being familiar with the (arguably distasteful) idiom. Thanks for the clarification, though.


I feel absolutely idiotic to not have known that the Bourne shell was actually named after a Mr. Bourne. I just assumed it was an acronym or something.


You never saw that matryoshka doll book?

http://www.amazon.com/System-International-Computer-Science-...

It was somewhat outdated by the late '80s, but it was still "the book" for newcomers to Unix environments.

Next, look up what AWK stands for...


> I second this. It's something we can all take a lesson from. Unfortunately I see far too many programmers who model their personas after Kanye West.

Most indie devs are like that: they don't have anything to show but act like they know everything and that everything they say is pure gold.


That's because they need to raise money before starting on building their game. Hip startup people have similar traits, and inflate their idea in order to get more funding. That's how marketing works, basically.


And that's why we (developers, scientists, ...) hate marketing lessons :p


Carmack is constantly challenging himself. A trait common among masters. Louis CK is so good at comedy because he throws away all his material each year. Carmack used to practically write a new game engine every year. Now he's doing even more challenging work than that.


Arthur Whitney, author of A+, k, and kdb/q, always starts from scratch when he develops a new version of his db. From scratch means throwing away even the lowest-level routines like basic string operations and low-level file I/O.


And the source code is bizarre but... works for him.


If what you do over and over again is throw away your work with the goal of writing a better thing, then you can get really good at writing new things. It’s a clever way to think about practice.


Man, I'd love to do that. Explore game dev with different languages. Try to code a fresh web game engine purely using WebGL. Naturally I get busy with other things :)


He's certainly not afraid to leave his comfort zone and try something completely tangential, like making rockets or working with VR hardware.

He's always been humble, engaging, friendly, and most of all, passionate about what he does without becoming fanatical about it.


I think this has been a theme with him the past couple of years. He's gone into Haskell territory, and has recently been an extremely vocal proponent of immutable structures (coming from the man behind some pretttty fast programs, that's a nice cold shower for a lot of people), and generally higher-level thinking.

I think this is a consequence of a lot of these languages getting a lot nicer tooling recently, as well as other FP ideas being pulled into newer languages.


I like that he always tweets about his discoveries. Most of them you don't really understand, but some are common discoveries we all eventually make.

Either way, his tweets are always entertaining.


Back in the day it was always interesting to watch his finger posts before all this newfangled tweeting.


The only thing I've admired as much as his intelligence has been his child-like enthusiasm.


Exactly, and he takes time to consider your replies and to reply himself. No hard feelings, no shitty internet trolling, just peaceful argument. Refreshing!



It's a shame he ended up at Facebook of all the places he could've worked for - Space X, Tesla, Apple, maybe even Google X. Such a waste.


Technically he's at Oculus at Facebook. I think if the Oculus team was to go away he would probably walk. Not sure how this is any worse than Google X or Apple unless you think Google/Apple is intrinsically better than Facebook somehow.


I don't view it that way. If anything, finally some of Facebook's money is going towards an end I like. Remember that he is not working on Facebook, they just gave his company lots of cash and get to say they own it.


The guy's only 44! With luck, he's only at the midpoint of his career.


I really love that the majority of the responses to that tweet are of the form: Why not X? Use Y!

It's very illustrative to me of just how tribal we've become. As if it matters what language Carmack decides to use. I'm sure it's a boon to the Racket tribe that the others are now jealous of. To have the name recognition of John Carmack tweeting about your language! Imagine!

Racket is a fine enough language and ecosystem. I'm more curious about what he's building. Is this the VR-version of Facebook?


>As if it matters what language Carmack decides to use.

I think if we look at the history of adoption of technology, a lot of it is driven by the top 1% endorsing it. I think it's a pretty big deal when high-profile people endorse a technology. Social capital is as real as financial capital. Carmack has lots of social capital and it can get results. His celebrity helped launch Oculus from a weirdo company playing with 90s relics to a Serious Threat to The Status Quo and I'm sure drew in big investors and eventually Facebook's purchase of it.

>It's very illustrative to me of just how tribal we've become.

This is how we've always been and will forever continue to be.


> I think if we look at the history of adoption of technology, a lot of it is driven by the top 1% endorsing it.

I suppose this might be true but it's rather hard to quantify. I don't recall ever choosing to invest in and master a language from a celebrity endorsement. Some people might have -- I can't say. But it obviously does have merit because of the responses Carmack's tweet solicited so I don't disagree.

I just found that the majority of responses were of this patronizing sort. If Carmack is amongst the top 1% of programmers, as you say, and is a minor celebrity as we both know then it seems disingenuous to immediately ask him, "Why not Haskell/Erlang/Clojure/Whatever-my-favorite-X-is?" Given his previous essays on functional programming and his move to adopt C++ I think it's safe to say he knows what he's doing and picked Racket for good reasons (even if it's as simple as, "I like it."). I made the tribe observation when it became apparent to me that perhaps they were jealous that Carmack didn't pick their tribe and bring his celebrity power along to them.

I find that kind of sad and funny. I'm more curious as to what he's building than what language he's using to do it. There are interesting things to talk about wrt the system he's building and the run-time he's building it on, but that seems to go over the majority of people's heads. Even as a newcomer to the Racket ecosystem I think Carmack will have quite a lot to teach us as he develops this system: about Racket, the language VM, system architecture, his process, etc.

So when I said, "As if it matters what language Carmack decides to use," what I was implying was that he probably has reasons and it's more interesting to know what those are. He could have continued writing it in C++, Haskell, anything... it's what he does choose, as opposed to the multitude of choices he didn't make, that is interesting here in my opinion.


It is not hard to write code in Racket or Clojure; what's hard is maintaining it!

I've written a small Clojure project, maybe 3k loc. It was great fun. But when I go back to add small features I find it quite tricky. Bugs often sneak in.


I have the opposite experience with Common Lisp. Yes, I tend to make bugs in it, but I tend to fix them so quickly I don't really mind. SLIME is a godsend.


I think maintaining old software is a known problem, regardless of the language. What about lisps make them more challenging in this regard?


Lack of types.

I don't want to start a discussion here on typed vs untyped so please consider everything I say to be qualified "It is only my personal opinion and experience".

I'm writing Haskell in my day job, Clojure for a fun side project, OCaml because it's a nice language that's a bit underused, and C++ because it's useful to know and not as evil as most people say (and it's progressing fast!). I should throw Rust into the mix because it might have a bright future.

WARNING: personal opinions and anecdotal evidence ahead!

Maybe there's a level of lisp enlightenment that I haven't reached yet but I can't just get by with writing lisp without writing tests. On the other hand I can get by with writing Haskell and OCaml without writing tests.

This is particularly true when I'm coming back to a project that I haven't touched for a few weeks. I change something and something breaks. In Haskell and OCaml I change something and compiler complains.

Maybe we'll see HaLispML one day.


The types objection may apply to Clojure (I've never used it, I don't know) but one of Racket's major development thrusts has been providing a way to get the benefits of both typed and untyped languages.

* Racket has a sophisticated contract system that allows you to enforce "type-like" properties at runtime very easily (e.g., you can say "This function should behave as an ((int -> int) -> int) function" and it will do all the necessary runtime checks to make sure that contract is honored as your program executes)

* It also has an optional modern type system, Typed Racket, that you can opt into on a per-module basis. Typed Racket modules can interact with untyped modules safely via contracts. The Typed Racket type system was designed specifically so that it's easy to migrate untyped, idiomatic Racket code to the type system, so you can write untyped Racket code idiomatically, and then go back and port your untyped module to Typed Racket with a minimum of fuss and get the benefit of the type system.
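To make that concrete, here is a minimal sketch of both mechanisms (the function name is invented for illustration and isn't from anyone's actual code). First, an ordinary #lang racket module with a runtime contract matching the ((int -> int) -> int) shape mentioned above:

    #lang racket
    ;; runtime contract: the argument must behave as an integer -> integer function
    (define/contract (apply-to-ten f)
      (-> (-> exact-integer? exact-integer?) exact-integer?)
      (f 10))

    (apply-to-ten add1)                    ; => 11
    ;; (apply-to-ten (lambda (n) "oops"))  ; contract violation, caught at runtime

And the same function as a separate Typed Racket file, where the mistake would be caught at compile time instead:

    #lang typed/racket
    ;; same function, statically checked
    (: apply-to-ten (-> (-> Integer Integer) Integer))
    (define (apply-to-ten f)
      (f 10))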


Clojure has Core.Typed, which lets you add optional typing.


Core.Typed is directly inspired by typed racket :)


Does Typed Racket give you a way to run your whole program through a typechecker without executing the code? Because catching type errors at compile time is a major benefit you get from a static type checker, that you wouldn't get from just runtime type checking.


Typed Racket operates on a per-module basis -- only some of your program has to be typed. But it's entirely static -- errors are caught at compile time.


As others have noted and in my own limited experience picking up OCaml -- a static type system does not solve the problem you need unit tests and functional tests for. They're good for readability if you explicitly annotate everything (and forgo the type inference) and they're good for flushing out a whole class of run-time type errors. But that's about all.

At this point in my learning it's more about pleasing a SAT solver than getting anything useful done... but I'm sure that will change with time.

Common Lisp does have a type system... it's just dynamic so you can hit things at run-time. This is great in development because it allows you to under-specify things that aren't terribly important (like types) at that time. However when you begin to find your code is ready to be locked in place you can annotate your function to hint to the implementation what the types should be. In particularly well-tested and heavily-typed code you can even turn off dynamic type-checking entirely for your production builds and get the performance boost from that.

Regardless of your approach and requirements, tests have a completely different use beyond ensuring type consistency. They set expectations, inform API design, and catch mistakes in refactoring; they act as a specification for the module under test. I've had plenty of OCaml code compile that still failed tests. Even in the presence of strong static typing you need unit, integration, and regression tests.

Just food for thought.


At some point, I thought OCaml would be the NBL since the writing part was fun and easy due to type inference (mostly, I hate using +.) and the refactoring felt safe.

BUT those guys are much smarter than us... so creating a language that combines the qualities of dynamic and static languages is probably a hard problem.


F# isn't exactly OCaml, but it's a close relative and... I don't know if it's the next big language, but it feels like it's one of them.


Haven't you heard? Javascript is the NBL :(


> On the other hand I can get by with writing Haskell and OCaml without writing tests.

Well, I never want to have to maintain one of your Haskell or OCaml projects. If you think you can get by with any language without tests you are wrong.

That being said, yes, it is easier to deal with a lack of tests in a statically typed/compiled language than in a dynamic language like clojure/ruby/python/perl/etc.

The main point I'd like to make to you is that you aren't complaining about Clojure per se, you are complaining about dynamic languages in general. It just so happens Clojure is the one you are picking on.

Either way, you might like Shen (http://www.shenlanguage.org/). Though it is very much academic and has few tools around it, it is very much a HaLisp, though I'm not sure about the ML part.


I'm a Scala developer, I use its type system to approximately its full potential, and I prefer it to other languages precisely because of its static-ness. With a powerful type system you can effectively eliminate certain problems from happening: I once had a bug that I couldn't understand, yet I could still eliminate the possibility of it ever happening again. And I prefer it precisely because I'm not smart enough for the problems I'm working on, so I like having a static type-system holding my hand.

That said ...

> On the other hand I can get by with writing Haskell and OCaml without writing tests

Tests and static type-safety serve different purposes. If you end up writing tests for properties that should have been inferred by a static compiler, then you're using the language in a wrong way or you've picked the wrong language for the problem at hand.

When working with a dynamic language, I don't need to write tests just to see that my code works. It's because I work with a REPL and in terms of happy paths, that's just as effective as having a static type system.

And surely having a compiler is very cool when refactoring, however we tend to miss the fact that (a) the kind of refactorings we are doing are very superficial and for architectural / design refactorings the compiler doesn't save you and (b) in a dynamic language there is less need for refactoring, because you don't end up modelling the whole world through types.

> I don't want to start a discussion here on typed vs untyped

I also think you're conflating two things. Dynamic languages can be strongly typed. A language like Clojure is very much typed. The difference is in the moment those types are checked: at compile time or at runtime.

This is important, because a dynamic language like Clojure can do optional typing when you want it. Of course, something like core.typed will never be as expressive and potent as Haskell's type-system, however this leads to gradual evolution - at first you don't have a well defined shape for the data you're working with, so you can enjoy the relaxed rules and protocols of Clojure and afterwards you can start introducing type definitions with core.typed or with prismatic/schema.

As I said, I'm a developer that leans on the static side of the argument, however this debate will never be settled simply because which tool is the best depends on the problems you're trying to solve, therefore people will never agree on anything, because people are always thinking from their "personal experience".


Have you tried typed Clojure [1]? It seems promising for bringing in types.

[1]: https://github.com/clojure/core.typed/


I tried core.typed and I think it's pretty cool. However, if you're using external libraries in your typed code you'll need annotations for those libraries too.


For one, static typing is optional in Lisp -- if it's even there at all.

If you want to write maintainable code and catch bugs early, that pretty much rules out dynamically typed languages.


Are you implying that it's impossible to write maintainable code in any dynamic language?


Dynamic languages rely on test suites to catch bugs. The test suite is also software that must be maintained.

Statically typed languages apply typechecking as a "test" to the entire program at load time.

People have written and maintained code in typeless assembler, including for mission-critical systems. It's just more work and requires a different kind of rigor. The larger and more complex your system gets, the more useful typechecking becomes.

"Maintainable" is not a boolean, it's a cost function.


In lisp you have the power to easily invent abstractions. Strip things down to a bare minimum and you can see that your code is good. And write some tests just in case.

OTOH in strongly typed languages with rich type systems the actual code might be uglier and more complicated because you're constructing a proof of something. The property might be weak (tests needed!) or strong (tests? what tests?). Depends how far down the rabbit hole you want to go.

At least that's my view on this.


> uglier and more complicated because you're constructing a proof of something

It's rare for the programmer to actually produce proofs that a piece of code meets the spec given by its type[1]. Those proofs are generally produced by the type checker, possibly with occasional hints from the programmer.

1: Even in Coq, detailed specs for some code are typically given (and proven) separate from the code itself rather than in that code's own type.


Not impossible, but in my experience it takes disproportionately more work than if you have a decent type-system to help you out.

In a dynamically typed system, every expression and line of code is a liability, and requires a large weight of tests to have any confidence that it might work.


Type errors in programs don't come up nearly enough to warrant languages completely dwelling on them so much. Now, say, consistent argument order does help a lot if you are writing your own libraries, but I wouldn't place the blame there on the language.


Type errors come in more shapes than the obvious one: untyped data structures like dictionaries (KeyError, e.g.) or non-uniform lists are a frequent source of error in Python. These are pretty common bugs, I'd argue, which simply cannot happen in statically typed languages like Haskell.


No, but depending on the size of your code base and your available budget, it can get prohibitively expensive really quickly.

Static typing eliminates entire classes of bugs. The cost of the up-front inconvenience to the programmer is tiny compared to the ongoing maintenance costs of possibly-incorrect, we-won't-know-for-sure-until-that-code-path-executes-in-production dynamically-typed code.

This is dependent on many variables but a good rule of thumb is that for any project bigger than a doddle, it's a safe bet to just fucking use a statically typed language.


I tend to agree. I've had great fun banging out programs in Clojure, but the lack of types makes coming back to fix bugs or extend the program quite unpleasant.

I'm doing a lot more Scala these days, which feels a lot easier to work with in the long run.


That has been my very limited experience as well, but with Scala I'm still worried things will get out of hand. The only "new" language I feel confident would not disappoint or surprise me negatively down the road is Golang.


What makes you think that? I have had the opposite experience with Scala/Haskell.


The language is large, supports multiple paradigms and in general allows many different ways to express the same thing. That's great for short term productivity and/or small and tightly knit teams, but I worry about less ideal conditions. In theory a strict and well defined style guide should take care of that (I have used them successfully in C++), but I feel with Scala that will not be enough. Expressed differently: Scala is a great language for smart and careful programmers, but a ticking bomb for the less talented masses.

Golang, in comparison, is very transparent, WYSIWYG.


How many lines were tests?


I'll just assume that the answer to the question is expressed in points it earned.


Carmack has a secret affection for Lisps I think. A while back he wrote about programming in lisp on his iPad (which is nicely suited to editing S-expressions).

Anyway, Carmack has convinced me to take a look at Racket again. When I looked at it a few years back, it seemed nobody was using it.


> on his iPad (which is nicely suited to editing S-expressions)

How so? I'm curious.


There's a fun iPad app called Lisping that lets you play with different flavors of Lisp. Not really a programming environment, but it's sort of neat for trying out ideas.


App Store link, for anyone that's interested: https://itunes.apple.com/gb/app/lisping/id512138518?mt=8


How does that get around the iOS App store restriction on interpreted code?


I assume they got around that restriction by cunningly releasing it after that restriction was abandoned in 2010...


Ah, my inexperience with Apple's platforms is showing.


I believe that restriction was lifted quite a while ago, like in 2010 or something. It didn't last long, as game engines required the use of scripting languages for game logic, among other things. Apple simply had to cave in.


Excuse my ignorance, but what about the new situation that all apps MUST use ARC? Does that interfere with, let's say, embedded Lisps in apps, etc.?


Apple no longer allows use of Objective-C garbage collection on the OS X App Store (it was never supported on iOS). This doesn't affect third-party garbage collection, and there's no requirement for third-party languages to use ARC (it wouldn't be practical for most of them anyway, though there is an AOT-compiled Ruby that uses it).


Thank you!


You don't have to use ARC. You can reference count manually if you like. What you can't do is use the old garbage collection (which was only ever available on OS X).


And in fact you _can_ still use the old garbage collection if you really want to; you just won't be able to submit new apps to the OS X app store.


The restriction was reformulated to something like "you can't allow dynamic code loading over the net" or something. So as long as the app ships with the example snippets it runs and otherwise works only with user input, it should be politically correct.

Edit: reference: http://seattleclouds.com/ticketfiles/8665/ios_program_standa... 3.3.2 "An Application may not download or install executable code. Interpreted code may only be used in an Application if all scripts, code and interpreters are packaged in the Application and not downloaded." So in fact the Lisp app is also forbidden (but of course Apple is free to selectively enforce)


This seems to be done on a "spirit of the agreement" basis; they're okay with the various python REPL languages even though you can do things like eval(urlopen("http://bla.com/some_python.py").read()), for instance.


Ah, I didn't even think that far, I stopped at any REPL being a violation since it's running code which was not packaged with the app.


Is that still a thing? I thought they relented?


I think it might make more sense to say that S-expressions are less of a pain to edit on touch devices than most other expressions, as people have made nice tools for them.


Not really secret; he even wrote an article about trying to write functional C++.


Well, the only other thing I know of that is written in Racket is HN. So I'm not sure it's very popular today either. IMO it looks like we're getting too many damn programming languages, which is something I'm starting to really dislike. Too much shit flying around forcing you to pick up some niche language for a project, and worse, fresh developers being forced to learn like a billion languages instead of mastering 1 or 2.


While I understand your frustration with 'too many languages', Racket is older than quite a few (started in '95). I also think the danger of having this attitude is that if we all thought like this nothing would become better. If no one wanted to get out of C++ we wouldn't have had many languages.

The premise behind Racket, in broad strokes, is to provide a platform and programming language for making new programming languages. The most important identifier/keyword in Racket is '#lang' (specified at the top of a file to tell Racket which language the file is in). It's a whole ecosystem in which a multitude of languages exist and the only thing they really need to have in common is that on some level they speak Racket.

Personally, I think the above paragraph explains how exciting Racket as a language and platform is, but it doesn't sound that interesting on the surface. It does allow people to create things like this[0] and this[1], though, which displays real practical use; shaping the way you solve the problem to fit the actual problem.

[0] http://www.youtube.com/watch?v=oSmqbnhHp1c - Naughty Dog's scripting language

[1] http://pollenpub.com/ - A web book publishing language
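As a small, hypothetical illustration of the #lang idea (file names and the function are invented for this comment), two files written in different languages can still require each other. The first file, stack.rkt, is in Typed Racket:

    #lang typed/racket
    (provide push)
    (: push (-> (Listof Integer) Integer (Listof Integer)))
    (define (push stack item)
      (cons item stack))

And main.rkt, in plain untyped Racket, uses it directly:

    #lang racket
    (require "stack.rkt")
    (push '(2 3) 1)   ; => '(1 2 3); a contract guards the typed/untyped boundary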


Learn scheme and you can handle all lisps, learn C/C++ and you can handle all the imperative languages, learn ocaml/haskell and you can handle the rest. That is 3 languages to master.


Until you come to Prolog, or APL/J/K and realise that none of these languages have fully prepared you either. And then there are other things out there too, that will leave you learning all over again.


APL and friends blew my mind when I read about them. I remember thinking "You can do all that with just those characters? That's the same length as the word function"! It's a good job I read this far down, I made a mental note to learn J but, as usual, forgot it by the time I got home. Thanks!


I have this ritual every year or so, where I try to write J, sigh, and go back to pedestrian F#.


I always do one-liner stuff in J, and I keep trying to get into F#. I recently bought Dyalog's APL. I thought the symbols would get in the way, but it's a whole other door opening. I really like the array-oriented languages for math and science. Even Julia and Numpy are attempts to do what the APL/J/K family have always done. I've always had Racket on my machine, before it was called Racket. Having an IDE and a good standard library right off the bat is great for beginners and dabblers like myself. I do not program for a living. I only program when I have a math or engineering problem to solve.


Racket is amazing, I agree. :)


C/C++ is not like all imperative languages. Basic, Go, Python - those are more bog-standard imperative languages. C/C++ are off in a world of their own with implementation-specific behavior (compiler, arch, ...), undefined behavior, and a lack of memory safety.


If you know C/C++, the concepts of Go and python should be easy. Especially since you understand how python itself is implemented.


I just pity the poor person who learns C++. I pity myself for having learned it.


Why? I think following a third/fourth-year university course on C++ was one of my best investments ever.

Nearly every imperative language is a walk in the park after you know C++, and you can still write C++ for performance-intensive code (if necessary with expression templates et al.).


Totally agree. Also, I'm very happy I learned C++ as my first language (after a little bit of dabbling in QBasic, Pascal and VB in primary school). When you're learning programming for the first time, you have no reference points. You either learn it or don't. So I learned C++, not once thinking it's difficult or complicated, and it became my reference point, making other imperative programming languages trivial to learn.


You forgot Prolog, and maybe Smalltalk (although CLOS may help or hinder you in understanding message passing/"proper" OO)... but otherwise, yeah pretty much.

[ed: and perhaps PostScript/Forth...]


Yeah, forgot about logic programming. There are some specialist languages too, and hardware languages like VHDL.

Overall, I think the more languages that exist, the better. Some people think we should be moving towards more domain specific languages.

There is a balance between using the best tool for the job, and getting a large potential developer base.


I don't get the argument. Can't I say then "learn Logo and you can handle all programming"? Because you are defining an arbitrary cutoff of functionality and detail beyond which there is no effort involved in "handling" new languages, based on your experience, I suppose.


>IMO it looks like were getting too many damn programming languages which is something I'm starting to really dislike.

Are there more now? I remember a zillion niche pet languages from forever ago.

Nowadays I see fewer, probably because it's not in style to write an in-house scripting language for every single new project; they tend to just use a language that is nice off the shelf.


Hacker News is (was?) written in Arc, no?


Correct, HN is written in Arc, a dialect of Lisp.

But, the Arc compiler is written in Racket[1], and internally outputs Racket code[2]. "mzscheme"[3] is an early version of PLT Scheme, aka Racket.

[1] https://github.com/arclanguage/anarki/blob/master/ac.scm

[2] http://arclanguage.org/item?id=14608

[3] http://docs.racket-lang.org/mzscheme/


IIRC the Arc interpreter is itself written in Racket.


Always interesting to see what Carmack is up to in Lisp land. It's particularly interesting to me because he seems to have some sort of Lisp guilt. Every time he talks about it, he has to mention that he's not sure if it's useful for "serious" projects because it's not statically typed or because it might not scale.


I remember him (either in the QuakeCon 2013 talk, or one of his writings) mentioning that all these years he was so busy he never got a chance to have his lisp enlightenment. I think he might be having it now.



I've been vaguely aware of Racket, but I haven't paid much attention to the details.

If I'm reasonably familiar with Common Lisp, what are the main differences/advantages of Racket to pay attention to? What's the best resource (preferably online) to use to learn about Racket?


Two things spring to mind, there are lots more I'm sure.

Racket's macros are unlike any other. They are better thought of as a lightweight compiler API. Check out http://www.greghendershott.com/fear-of-macros/.
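For anyone who hasn't seen them, here's about the simplest possible sketch (the real power, as that article explains, comes from working with syntax objects directly; this is just the pattern-based shorthand):

    #lang racket
    ;; swap! expands at compile time; hygiene means the temporary binding
    ;; can't accidentally capture a variable named tmp at the use site
    (define-syntax-rule (swap! a b)
      (let ([tmp a])
        (set! a b)
        (set! b tmp)))

    (define x 1)
    (define y 2)
    (swap! x y)
    (list x y)   ; => '(2 1)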

Also, the whole #lang framework is a pretty unique tool for experimenting with and extending the language, see http://docs.racket-lang.org/guide/hash-languages.html. For example, used to build http://www.greghendershott.com/rackjure/.

Greg Hendershott does some great writeups, there's more to find on his site.


I think the main difference is that Racket is a Scheme, and Common Lisp is, well, Common Lisp. Racket has some great tooling like their IDE (Dr Racket, already mentioned).

Really it comes down to the old CL vs scheme thing, which seems to boil down to "has almost everything you need already but is huge" vs "lets you build anything you need yourself and is tiny." However, I think this is less so with Racket because, while I haven't used it much, from what I know it's more of a batteries-included scheme used for Getting Shit Done (and does a good job of this).

If I was going to use a scheme and I didn't have the requirement of needing to embed it, I'd probably go with Racket.


Other than what's been mentioned by others, Racket also comes with a nice way of bundling up different languages/dialects (like Typed Racket for a statically typed variant). See eg:

http://docs.racket-lang.org/scribble/

I suppose it isn't technically such a different thing from other lisps, but like most of Racket -- it's pretty well thought out, and actually works really well.

[ed: as per my other comment, see also eg:

http://pkg-build.racket-lang.org/doc/sweet/index.html ]


Racket was PLT Scheme before they renamed it, so it's got a very academic focus, documentation focuses on the DrRacket (formerly DrScheme) IDE, and it's got a kind of old-school Lisp feel.

Their documentation is great, though, so it's a good place to start: http://docs.racket-lang.org/


I've enjoyed the following resources:

SICP - http://mitpress.mit.edu/sicp/

HTDP - http://htdp.org/

Dan Grossman's Programming Languages Coursera course - https://www.coursera.org/course/proglang

The Racket docs - http://docs.racket-lang.org/

Chris Jester-Young's StackOverflow answers - http://stackoverflow.com/search?q=user:13+[racket]

This list won't be as useful to you as it was to me, because you already know Common Lisp, but hopefully other readers will find it interesting.

NB - the ProgLang Coursera course features Racket alongside SML, Ruby and quite a lot of material that will seem very basic to experienced programmers, but it's a really good introduction of some of its key features. I think this is probably my weakest recommendation to someone who already knows Common Lisp (or similar) and my strongest recommendation to someone who does not.

Beyond that, the default DrRacket IDE comes with a whole load of teaching resources bundled by default, including (iirc) resources to help build a game in Racket - this package I think http://docs.racket-lang.org/teachpack/2htdpuniverse.html.

As for differences/advantages of Racket, I would say that there are few that I'm aware of beyond the simplicity of learning the language, the great tools, and the ease with which you can get up and running. Nothing inherent to the language that I'm aware of, and I would suspect that Common Lisp is great if you're already part of that community and 'know where everything is' so-to-speak. I've heard that CLOS is better than Racket's OOP abilities.

I really do love the language, though, it's one of the easiest and most joyful experiences I've ever had with a programming language. Just maybe not a necessity if you already know CL.


As others have said, Racket is a 'batteries included' Scheme. The two things I miss from CL are CLOS (and no, tinyCLOS doesn't count) and loop/format.

Their pattern matching is easier to extend than optima though.
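For the curious, a small sketch of what racket/match looks like (the describe function here is invented for illustration); define-match-expander is the hook that makes it extensible:

    #lang racket
    ;; match ships with #lang racket; new patterns can be added
    ;; via define-match-expander
    (define (describe v)
      (match v
        [(list x y)     (format "a pair of ~a and ~a" x y)]
        [(cons head _)  (format "a list starting with ~a" head)]
        [(? string? s)  (format "the string ~s" s)]
        [_              "something else"]))

    (describe '(1 2))    ; => "a pair of 1 and 2"
    (describe "hello")   ; => "the string \"hello\""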


> What's the best resource (preferably online) to use to learn about Racket?

You could work through SICP using Racket instead of MIT/GNU Scheme. I'm currently in the midst of this and the differences thus far have been relatively minor (e.g. Racket doesn't have `inc`, `dec` or `nil`).


> e.g. Racket doesn't have `inc`, `dec` or `nil`

I'm assuming those are trivial to add?


Yeah, you could add 'nil' easily.

inc and dec I assume mean increment or decrement (I have no experience with CL) - racket/base includes (add1 .) and (sub1 .) for the same effect. If they didn't exist they would be trivial to add I think - if this isn't what inc and dec mean then I apologise.

In Scheme/Racket truth and falsity is really simple - #f is false and everything else is truthy. There's also already a null value for the empty list (which is 'true'). I don't know if nil would be useful.
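For reference, the shims mentioned above amount to a few definitions (assuming inc/dec mean add/subtract one, as in the book):

    #lang racket
    ;; minimal SICP-compatibility shims
    (define (inc n) (add1 n))
    (define (dec n) (sub1 n))
    (define nil '())   ; SICP's nil: just a name for the empty list
                       ; note '() is still truthy in Racket; only #f is false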


And it looks like someone's already done it: http://planet.racket-lang.org/package-source/neil/sicp.plt/1...


I dunno, it looks to me like they've just defined nil to be equal to null/empty, which is not exactly the same.

In Common Lisp, unlike Scheme/Racket, nil is the null value AND the false value. In Scheme/Racket, 'nil is true unless you define it to be #f. If you wrote an if or cond expression to evaluate the 'nil in the link, I think that it would return true, whereas in CL nil is falsy (I think it is the only falsy value in CL, but I would still describe it as 'falsy' rather than false, possibly incorrectly).

e: Ah, I see why I'm not understanding you now - you're talking about the differences between Racket and SICP/Scheme whereas I made the assumption that we were talking about the difference between Common Lisp and Racket (from OP's comment). To further clarify, I believe this was my fault in comprehension, not yours in communication.


> I dunno it looks to me like they've just defined nil to be equal to null/empty, which is not exactly the same.

But it is how nil is defined in SICP:

'The value of nil, used to terminate the chain of pairs, can be thought of as a sequence of no elements, the empty list. The word nil is a contraction of the Latin word nihil, which means "nothing."' -- http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-15.html...

And then in a footnote to that paragraph:

"It's remarkable how much energy in the standardization of Lisp dialects has been dissipated in arguments that are literally over nothing: Should nil be an ordinary name? Should the value of nil be a symbol? Should it be a list? Should it be a pair? In Scheme, nil is an ordinary name, which we use in this section as a variable whose value is the end-of-list marker (just as true is an ordinary variable that has a true value). Other dialects of Lisp, including Common Lisp, treat nil as a special symbol. The authors of this book, who have endured too many language standardization brawls, would like to avoid the entire issue. Once we have introduced quotation in section 2.3, we will denote the empty list as '() and dispense with the variable nil entirely." -- http://mitpress.mit.edu/sicp/full-text/book/book-Z-H-15.html...


> e: Ah, I see why I'm not understanding you now - you're talking about the differences between Racket and SICP/Scheme whereas I made the assumption that we were talking about the difference between Common Lisp and Racket (from OP's comment).

Apologies for the confusion.

In that case, I don't think it's possible to define an exact equivalent of CL's nil in Scheme, as nil means both false and the empty list in CL -- the latter being falsy in CL and, as you point out, truthy in Scheme.


You might be interested in http://try-racket.org . It's a quick online tutorial. I haven't worked through it myself, but I spent 30 seconds with it to make sure I wasn't embarrassingly wrong.


> but it is winning for development even as a newbie

Not sure Carmack qualifies as a newbie when it comes to anything programming-related.


In my opinion anyone can be a newbie once taken out of their comfort zone.

John Carmack was mostly a C programmer up until quite recently and then started experimenting with some Functional Programming languages, and I'm sure he had a few things to learn when doing so.

Not to take away from an obviously intelligent individual.


Doom 3 was written in 2004, and was in C++, not to mention the scripting languages and VMs Carmack wrote for his other game engines... not to take anything away from your comment.


The C++ used in Doom 3 was basically C with classes. Even Carmack says it's not his best work since he wasn't proficient in it and didn't take the time to learn it correctly.


I would be willing to bet he's still eight years behind the times with his C++, and that is the explanation for this post. Writing a solid and flexible server in C++ is just not very hard with modern tooling.


I think perhaps you've forgotten all the ways to shoot yourself in the foot...

C++ may be an effective language for experienced users, but it's got quite the learning curve (beginning with how exactly DO I write a correct constructor in the presence of exceptions...). Every time I have to work on C++ code I dig out my Meyers book to make sure I'm not fucking something up. C is small enough that I don't have to perform the same ritual.


I think there is space for a programming language between C and C++. C - as much as I like it - makes it far too easy to shoot yourself in the foot. C++ fixes some of these problems, but adds a whole new arsenal of appendage-mutilating features that even many experienced C++ programmers admit take a long time to master.

Is it possible to create a language that fixes what C gets wrong while keeping what C gets right, without becoming C++? There have been a number of attempts and none seem to have stuck, except for Go (IMHO). Java tried, and it did not fail altogether, IMHO, but it comes with yet another set of problems, mainly a tendency to lure programmers into over-engineering their solutions (the same, IMHO, goes for C#, which is kind of a sweet language at its core, but that becomes very hard to see amid the tangled mess that is the .NET framework).


What would you consider "modern tooling" for writing a network server in C++? Boost Asio, perhaps?


Asio is good but can frustrate people unaccustomed to the heavily templated boost approach. High learning barrier. cpp-netlib aims to wrap and simplify Asio for people doing the typical stuff, but I've no experience. POCO, Qt, and libevent are all good approaches.

Having followed Carmack a bit over the years my guess is he built something from scratch himself in C++.


> Not to take away from an obviously intelligent individual.

It's interesting to observe how programmers and other similar intellectual workers prostrate themselves in their own specific way when referring to someone of high status in their field.


[deleted]


Lots of editing on your post... I reckon it takes more effort to be politically correct and toe the popular line than to just state the obvious truth: all across the globe, software engineers of value do amazing stuff every day. They're too busy doing the amazing stuff to be praising someone for doing what they do on a daily basis. The ones that aren't doing amazing stuff can be found prostrating themselves to every tweet some high-profile person makes, because they don't grasp the normalcy of coding at that level on a regular basis. Sadly, this idolization is probably the biggest impediment to them progressing to the level of the person they praise.

For every Carmack, there's 1000s of unspoken/unheard of people doing the same thing. I try to think of them, myself, and not get too excited...

Steve Jobs was Steve Jobs because he thought different. You're not going to turn into him or Carmack by trying to emulate them or by following tweets and prostrating yourself.

P.S. The only reason I commented is because I have friends who constantly give me updates about what Carmack is doing as if they've just had an encounter with God. I can't help but think they are this way because they themselves have never coded at that level. Sadly, you don't get to that level by going gaga over another programmer. You get to that level by understanding that there is nothing amazing about what Carmack et al. do, digging into something you care about, and pushing your own personal limits... People don't want to hear these kinds of things, though, as it necessarily compels them to action. They'd much rather rally around the superstar and put him/her on a chariot. Vicarious living?

P.P.S. Constructive discussions can occur on a topic easily as long as individuals aren't overhyping it and are actually capable of conversing at a similar level. Fanfare is just as harmful, as it makes people believe there is some mystical magic to achieving things. There isn't... just dedication and passion.


For every Carmack, there's 1000s of unspoken/unheard of people doing the same thing.

Well, okay, then, there must be millions and millions of programmers writing CRUD apps and for loops, because that covers probably 95% of the general value created in programming worldwide.

I don't see anything wrong with being impressed by the accomplishments and interests of others. I'm not Carmack. I'll never achieve anything like his level of productivity. I don't care, but I'm still vaguely interested in reading about the guy.

I don't plan to climb Mt. Everest either, but I wouldn't mind hearing a story about it. And yeah, there are thousands of people planning Mt. Everest climbs, but that has zero bearing on my feelings about it.


I've deleted my post as I think it was incorrect.

Having worked with Carmack's code, two things I liked about it: it always came across as being the first obvious thing that popped into his mind, and yet it was always somehow the right thing.

Sadly, I myself can only manage one out of two.


I tend to observe that seasoned/good software engineers are too busy tackling problems of similar difficulty to be caught up praising, following, and prostrating themselves to others. It's not a habit of mine and I don't hold anyone who does it in high esteem. I tend to think someone praises another to a large extent because they don't hold themselves in high regard. If they did and were of similar skill level, it wouldn't seem a big deal to them, right? E.g.:

Tweet (high-profile software engineer): Today I cut 500ms out of a 4 second operation on a consumer device.

Me: OK... last week I cut a 10 second switch-over operation on an enterprise switch (grosses 8 billion dollars in revenue a year) down to 300ms. I thought that was just a part of my job as a software engineer.


I think that may be the most annoying post I've read on Hacker News this year.


Be nice to people, be a human being. Good lord.


The upvotes and downvotes on this post have been a true roller coaster. EDIT: someone is fast on the vote-trigger; hello, friend. ;)


Task-Relevant Maturity is a management term which recognizes that maturity (the scale that “newbie” is one end of) is fairly isolated to particular tasks/contexts. Somebody can excel in their comfort zone (e.g. C++ for Carmack?) but be a newbie in other areas until they gain that specific TRM.


Compared to his C++ wizardry, he probably counts as a newbie to Racket.


I've seen a team of incredible developers struggle pretty hard when we first used Clojure on a project.

It always takes time to learn something new. Especially the fundamentally different way of working between OO and Functional programming.


Some people find it hard to reason about passing lambdas and functions as arguments to other methods... I see this with people coming up to speed with JS (Node in particular)... Using tools like Ramda, and Node streams/pipes, feels really natural to me... I think it just depends on how you see things... I always feel that deeply OO patterns usually just add complexity to things that can be much simpler.


Not to mention, the test of a platform doesn't come in the first hour, it comes after shipping, some Saturday morning at 4am. And then over the course of the next ten years. Not saying Racket is bad, but first impressions usually give way to time.


I don't want to call that false modesty, but damn, when JC is using the language, how "newbie" can you be?!


> John Carmack @ID_AA_Carmack: @touristtam If I run into catastrophic perf problems, I may try rewriting in Go.

He apparently has lots of fun with languages; very curious what kind of server he is writing now.

He also spoke highly about Haskell before.


Hi, I did ask the following:

> thoroc @touristtam: @ID_AA_Carmack what guided your choice of Racket vs Rust and Go? Just curious. :)


I've always thought Racket to be a highly under-rated language.


I use it for everything I write. It brings the joy back after having to maintain a Java project in my day job. Well-written Racket code brings so much satisfaction :)


I wonder what kind of server it is. What is its purpose?



... after reading some of the comments here ...

I think we would be better off caring a little bit less about what language to use and a little bit more about what programs to write.


Had to read that twice. Was pretty sure he meant "RakNet" the first time ;)

Oh, and just gonna put this out there: http://benchmarksgame.alioth.debian.org/u64q/compare.php?lan...


Then again: http://benchmarksgame.alioth.debian.org/u64q/compare.php?lan...

and Erlang is certainly a valid platform for servers, so as usual benchmarks are just benchmarks... It would be interesting to see if Typed Racket made a difference in these tests, although I'm not sure if performance is the main focus of Typed Racket (as opposed to just type safety).
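For reference, Typed Racket code is mostly plain Racket plus annotations (the optimizer can use the types, though I wouldn't promise it moves these particular benchmarks). A tiny illustrative sketch, with a made-up function name:

    #lang typed/racket
    ;; illustrative only: a plain function with Typed Racket annotations
    (: sum-of-squares (-> (Listof Real) Real))
    (define (sum-of-squares xs)
      (if (null? xs)
          0
          (+ (* (car xs) (car xs)) (sum-of-squares (cdr xs)))))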



If you look at the graph, you'll see "possible mismatch - one-core program compared to multi-core program." there.

And indeed, of the ten benchmarks, four of them are single-core Racket vs quad-core Go. The single-core comparison is much less dramatic, though Racket is still slower.


Well done!


> May not scale, but it is winning for development even as a newbie

Sounds like Python.

Works great at the start, and then one day you wake up and realize you are running an MMO with 50K people online at the same time (Eve Online), your code can't take advantage of multi-core CPUs, and every game region (star system) starts to lag above ~500 people and there is nothing you can do about it.


Honestly, just about anything is going to be faster, productivity-wise, than C++. When you stop having to think about how you're going to structure your inheritance and classes, you--shockingly--get things done. But more metaprogramming is not the answer. And I've used Scheme since 2002 or so. I've implemented Scheme interpreters and compilers. But I'm no longer a cheerleader for macros and call/cc or Scheme (or Lisp, for that matter).

You know what gets things done and makes things easy to maintain? Boring ass code. IF statements. FOR loops. I mostly use Perl today. It doesn't get in the way. But getting things done is not trendy. That's where we are today.


You know what's the problem with boring code? It's boring. This means its information content is low, and its abstraction level is low. This means that you need more of it to express an algorithm.

When you have a lot of wordy, boring code to maintain, you have to make coordinated changes in more similarly boring places. A human's brain can only keep that many lines of context. So it becomes easier to make a mistake.

I understand that abstraction astronautics can leave you with puzzling, convoluted, hard-to-maintain code full of leaky unintuitive abstractions. This problem is not unique to Lisp macros; languages like C++ and even Java are known to be widely used by perpetrators of the above-mentioned atrocities.

What makes code easier to maintain is clear separation of concerns and low impedance between code's abstractions and the subject area. This is, again, attainable in a number of languages (though expressive power and minimalism help make it even nicer), given the right mindset and skills. I suppose John Carmack possesses both.


I recently watched a colleague write an elaborate system to parse a few different CSV feeds. There are a dozen different interfaces, mixed in with all the lovely Java design patterns.

I'm starting to say that I'd rather deal with poorly written code than well-planned architecture. Obviously, by "well-planned architecture" I'm referring to overly architected solutions.


When it comes to architecture, I've lately turned towards BCNF as my god. My premise is that if my data model - my internal application data, not just "the database" - is in as normalized a form as I can reasonably get it given the typical constraints of procedural/OO/functional styles, my features automatically grow into a flexible and decoupled grain because they're operating on exactly the right slice of data, no more, no less. "Guess and check" and "OO design pattern" strategies don't seem to get me there because they tend to start with whatever is language-easy or looks pretty at first glance, and then take on the problems later. And it seems to work - the thing I have right now is, indeed, incredibly flexible for the amount of code involved. And it isn't really "architected" in the usual sense otherwise - there are no grand plans.

The only problem I'm having with this tack is that it reveals all the technical debt at once, which produces an enormous amount of pain early on. My friends smirked at my woes today of trying to make a clickable button, which has to piece together stuff from the graphics layer, input events, text fields, and internal button state. An enormous variety of data, altogether, with the debt usually hidden from view at some level. It all makes sense, it's all decoupled, the lifetime of the state is automatically managed, any configuration you want will just be a matter of making the data for it. But making that first button is quite a headache.
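Roughly, and in Racket only because that's this thread's language (hypothetical names, not my actual code), the shape is: one narrow table per concern, keyed by widget id, and a "button" is just a key that shows up in several of them:

    ;; Racket sketch, illustrative only
    (struct rect     (x y w h)                   #:transparent)
    (struct label    (text font-face font-size)  #:transparent)
    (struct ui-state (hovered? pressed?)         #:transparent)

    ;; one narrow table per concern, keyed by widget id
    (define rects     (hash 'ok-button (rect 10 10 80 24)))
    (define labels    (hash 'ok-button (label "OK" "Sans" 12)))
    (define ui-states (hash 'ok-button (ui-state #f #f)))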


I had a funny feeling doing a SQL MOOC when I had to re-learn normalization, and how it was a very generic decoupling algorithm. Suddenly all OOP became tiny and ad-hoc.


Would you mind telling me which MOOC you did for SQL? I am quite rusty - (~10 years since I did any serious SQL stuff) but I am finding it is coming up quite a lot now for me.


IIRC it was Stanford's (I have memories of a mainly red interface)

http://www.erictimmons.com/node/18

I don't know if it qualifies as serious (I'd say it's challenging enough), but it was great to revisit the material with another university's course.


Thanks for that. In case anyone else is interested, they now have it set up as a self-paced course here:

https://class.stanford.edu/courses/DB/2014/SelfPaced/about


A well-planned architecture is often the smallest thing that works correctly. A well-architected car is unlikely to have 37 wheels (though a poorly-built one might).

You know, the ideal device is that which is not even there, but its function gets executed. This ideal is rarely attainable, but it's something to strive for.


Very much this. Refactoring simple code to handle more complicated situation as it develops is so much better than pre-engineering for possibilities.


This! I joined a new company recently to build out the systems. Instead of trying to predict the future and build for it, I just went ahead and built a bare minimum architecture and used TDD for it while doing so. The start was a little slow, but now when I get requests to change things entirely (eg - an entire segment of logic was requested to be shifted into the database for an administrator to manage its behaviour), I get it done fairly quick.

On a side note... Uncle Bob is my hero.


You know what's the problem with boring code? It's boring. This means its information content is low, and its abstraction level is low. This means that you need more of it to express an algorithm.

When you have a lot of wordy, boring code to maintain, you have to make coordinated changes in more similarly boring places. A human's brain can only keep that many lines of context. So it becomes easier to make a mistake.

A problem nicely summarized by Yaron Minsky (of Jane Street): "You can’t pay people enough to carefully debug boring boilerplate code. I’ve tried."


You know what's the problem with boring code? It's boring. This means its information content is low, and its abstraction level is low. This means that you need more of it to express an algorithm.

Code may also be boring simply because it is unsurprising for someone familiar with the subject matter.


"You know what gets things done and makes things easy to maintain? Boring ass code. IF statements. FOR loops." I think you are channeling some of the Go philosophy there :).


Or C philosophy perhaps?


The C philosophy is: an easier alternative to writing assembly.


"Go philosophy" apparently being an absolutist and unwavering belief in One True Way To Actually Get Stuff Done.

It's surprising that a philosophy that tries to promote simplicity also manages to come across as so elitist at the same time.


If there were multiple accepted styles, the code wouldn't be as boring. But it is, and to some of us, that's a good thing.


Who's talking about code style? Some Go users like to talk about Go as if anyone who doesn't like it just doesn't "get it", or as if they obviously don't appreciate Getting Things Done.


Racket is quite a lot more than call/cc and macros.

By the way, FOR loops invite off-by-one errors and worse. Use a combinator like map, filter, or foldr to keep things boring.
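For instance (Racket, illustrative one-liners):

    (map add1 '(1 2 3))            ; => '(2 3 4)
    (filter even? '(1 2 3 4 5 6))  ; => '(2 4 6)
    (foldr + 0 '(1 2 3 4))         ; => 10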


There are also the extremely "boring" and practical list comprehensions / iterators within Racket itself:

http://docs.racket-lang.org/reference/for.html

e.g. fizzbuzz using for and match:

    -> (for ([i (range 1 16)])
        (match (list (modulo i 3) (modulo i 5))
          [(list 0 0) (displayln "fizzbuzz")]
          [(list 0 _) (displayln "fizz")]
          [(list _ 0) (displayln "buzz")]
          [_          (displayln i)]))
    1
    2
    fizz
    4
    buzz
    fizz
    7
    8
    fizz
    buzz
    11
    fizz
    13
    14
    fizzbuzz


In for, in-range is faster.

http://docs.racket-lang.org/reference/sequences.html?q=in-ra...

An in-range application can provide better performance for number iteration when it appears directly in a for clause.

    -> (for ([i (in-range 1 16)])
        (match (list (modulo i 3) (modulo i 5))
          [(list 0 0) (displayln "fizzbuzz")]
          [(list 0 _) (displayln "fizz")]
          [(list _ 0) (displayln "buzz")]
          [_          (displayln i)]))


Oh, thanks for that :)


For loops in Perl come in two styles: the C-style loop and map. One tends to use the latter a lot more often.


That, and Perl has an actual `map` function too:

  map { $_ + 1 } (@list);


Sometimes Perl can be a beauty:

  use List::Util qw(reduce);   # reduce comes from List::Util, not a builtin

  sub sum_of_squared_pairs {
    reduce { $a + $b } map { $_ * $_ } grep { $_ % 2 == 0 } @_
  }

  sub schwartzian_transform {
    map  { $_->[0] }
    sort { $a->[1] <=> $b->[1] } # use numeric comparison
    map  { [$_, length $_] }     # calculate the length of the string
         @_
  }


Yes, but _ has dynamic scope (I believe). That's very dangerous in general.


Carmack seems to be going through his FP phase. It's a phase many programmers go through in their younger years (Carmack must have missed it, because he was occupied with Keen, Wolfenstein, Doom and Quake at the time), when they read SICP and learn Scheme, ML, etc., before the novelty wears off and they come back to plain old imperative, mutable programming.


What you're not seeing is the successes who achieve escape velocity. They don't come back, ever.


Ha. JWZ of XEmacs and Lucid fame nowadays uses Perl for the little tools he writes. I hate to admit it, but I think you're right.


I came back to Perl about 18 months ago, after working as a sysadmin for a couple of months and being fed up with Python's unicode handling (Python 2.x, I haven't given Python 3.x a try, yet).

I do not think Perl is a pretty language, but I have come to appreciate how useful it is. If all you want is a smallish application (roughly, less than 1 KLOC), especially if you're only going to use it once or maybe a handful of times, no other language I have met can keep up.

And for the kind of problem I typically use Perl for - reading, say, a CSV file or an Excel spreadsheet, filtering the data according to some criterion, fetching and adding data from an external source, say, an LDAP directory or a relational database, then inserting the result into a database or emitting another CSV file - it is also surprisingly hard to beat Perl's runtime performance, especially its regex engine. I'm not saying it can't be done, but for a program you're essentially throwing away after a week or so, it's usually not worth the hassle.


Honestly, just about anything is going to be faster, productivity-wise, than C++.

I don't think that's been true for a while. Boost went a long way towards making C++ much more productive, and now that's gone even further with C++11 and 14.


Compile times are still terrible, there's still a terribly high number of causes of undefined behavior that will burn through your time in debugging sessions, and there's terribly far to go before its feature list catches up to "just about anything" (albeit a slightly smaller set this time around).

Things are improving in C++-land, but I'd still place it near last.


> Honestly, just about anything is going to be faster, productivity-wise, than C++.

I can only assume you have not seen 300+ deep stack traces in lasagna java programs. Or GWT.


I guess this is where the anti-FP people forget their staunch scepticism for a second because a respected imperative programmer uses an FP language? Regular functional programmers haven't been able to change their view, but I wouldn't be surprised if all it took was one tweet from the right person.

Other than that, I don't see what else there is to say about this. 'Dropped some C++ for Racket server: may not scale but is more productive'. That's the most standard high-level vs. low-level dichotomy.


> I wouldn't be surprised if all it took was one tweet from the right person.

Well, the "Argument from Authority" is called out because of when it's misused (kinda how "experts are always wrong". No: those are only the times you remember), but it is a fundamental way of how social humans form opinions.


> , but it is a fundamental way of how social humans form opinions.

Wow, that's amazing: people form opinions in part based on how much they respect/trust someone. Consider my cynical views totally and irreversibly changed.

Then there are those times when it is taken too far: like 115 points on HN for a pithy message like "rewrote to another language".


A random Carmack tweet is news now?


Carmack is an interesting fellow, especially interesting to the types that frequent this forum.

What he is doing is thus interesting to this forum, though maybe, as you pointed out, not newsworthy.

But since when does everything have to be newsworthy?


So many brackets. Resolving mismatched brackets seems to be just about the most pointless developer activity possible.


If you edit s-expressions as text then yes, it's horrible.

If you edit s-expressions as data structures using something like paredit you'll actually code very quickly, and it'll also be impossible to have unbalanced parens.

Say you have this (| = cursor): (a b |c d)

If you type ( you get balanced parens: (a b ()| c d)

If you press Ctrl+Right twice you slurp in c and d: (a b (c| d))

If you then press Alt+Up you get back to: (a b c| d)

As you can see, you manipulate code on the level of data structures instead of manually placing parentheses, and you are actually prevented from making unbalanced parens in paredit. You can likewise move through code in ways similar to moving by word or paragraph in vim vs moving by character, but I only showed basic editing above.

I won't downvote you because I understand your complaint. But the problem is that you are unaware that you are using the wrong mode of editing for s-expressions. :) It's like editing a photo using a hex editor: possible, but very much suboptimal.


I know right, I totally hate having to match Map<String, Map<String, Object>> foo = someMethod(withMethodArg1(), andTwo());

Let's all just use Forth, and implement washing machines as simply as:

    : WASHER  WASH SPIN RINSE SPIN ;


Lisp uses the same number of parens as most languages per call – the only thing that's different is whether the paren goes before or after the first part of the call.

    (print "hello") 
vs

    print("hello")
So to the extent Lisp is paren-heavy, it's more a stylistic thing. Lisp programmers tend to chain up calls more.


> Lisp uses the same number of parens as most languages per call [...]

Indeed. If you want fewer parens, use Haskell or Forth.


    (-b + sqrt(b*b - 4*a*c)) / (2*a)
    (/ (+ (- b) (sqrt (- (* b b) (* 4 a c)))) (* 2 a))
It does depend on your use case.



Lisp only has infix macros so that we can say "we have that".

Nobody in their right mind uses this stuff in production code.

It just overcomes objections. "Oh, if I start using Lisp, there will be a way to use infix, should I really need it." Ten years and six Lisp projects later, you still haven't used the infix stuff; the situation never arises.


> Nobody in their right mind uses this stuff in production code.

You sure about that? I thought the lispy approach was generally pragmatic - you use what you deem handy for your application. If this weren't the case, there would be little need for macros in the first place. I can very well imagine, say, a scientific or engineering application that would share a common infix parser between user-provided expressions (in the UI, to be friendlier to non-lispers) and heavy math lifting in the source code.


Clearly you didn't even look at the links provided. :)


But! Perfect splitting across lines according to a formatting algorithm which is simple, consistent, and incrementally applicable:

1.

   (/ (+ (- b) (sqrt (- (* b b) (* 4 a c))))
      (* 2 a))
2.

   (/ (+ (- b)
         (sqrt (- (* b b) (* 4 a c))))
      (* 2 a))
3.

   (/ (+ (- b)
         (sqrt (- (* b b)
                  (* 4 a c))))
      (* 2 a))
4.

   (/ (+ (- b)
         (sqrt (- (* b
                     b)
                  (* 4 
                     a 
                     c))))
      (* 2
         a))
Now you have a sideways tree, revealing the structure of the expression, where it is immediately apparent what the operands are of the / and the + and so on.

Infix turns into a mess when it's too long for one line.

For this particular expression, I'd probably go with variant (3) in production code. Compared to the beauty of (3), the original one-liner is basically a strawman. In terms of clarity of structure, it trumps the infix also.

This is actually a very important point that is overlooked by Lisp noobs. In real Lisp code, expressions are not written all out in one line, whereby the human reader must mentally match the parentheses. Even numeric expressions that might be one-liners in Fortran or C, are split across several lines to make at least the major constituents clear in relation to the major operator.


http://srfi.schemers.org/srfi-105/srfi-105.html

    {{-(b) + sqrt((b * b) - (4 * a * c))} / (2 * a)}


> It does depend on your use case.

Yes, and the first line's use case depends on the language's built-in operator precedence rules to reduce the number of parens.

If you are using math formulas as an example of minimal paren usage, go with APL and have even fewer, since all operators have equal precedence and associate to the right.


    b fnegate b b f* 4 a c f* f* f- fsqrt f+ 2 a f* f/


Executable line noise.


Your second line hurts me inside.


> Lisp uses the same number of parens as most languages per call

Furthermore, Lisp uses zero parentheses for grouping in order to override precedence. These parentheses don't exist in Lisp.

In print(2/(2+4)), we actually have two kinds of parentheses, because two different grammar rules use the same token.

C has even more parentheses. The parentheses in for (;;) are not the same as those in 2/(2+4) which are not the same as those in (double) p, which are not the same as those in p(42).

Lisp has parentheses that do one darn thing in the read syntax---at least when they are not literal as in "(" or #\(.
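To make that concrete, here is the print(2/(2+4)) example rendered in Racket, where no paren exists merely to override precedence (illustrative):

    (displayln (/ 2 (+ 2 4)))   ; => 1/3; every paren opens a call or form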


What about a paren-less calling mechanism?

main = putStrLn "hello"


Possible, and nice. But you lose the ability to easily specify a variable number of arguments.
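For comparison, with parenthesized calls a rest argument handles variable arity trivially (illustrative Racket, made-up function name):

    (define (average . xs)          ; the rest argument gathers any number of values
      (/ (apply + xs) (length xs)))

    (average 1 2 3 4)               ; => 5/2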


Shell languages seem to do just fine using spaces to easily specify a variable number of arguments.


Yes. I forgot to mention that Haskell also supports partial application. Shell doesn't, so that's easier.

And that's just talking about syntax. The type system complicates matters further.


Well, there's always the Smalltalk approach.


> So many brackets. Resolving mismatched brackets seems to be just about the most pointless developer activity possible.

It's precisely the other way around: finicky syntax issues consume far less time in Lisp. This isn't just a surface thing—when syntax occupies a block of resident memory in your head, you have that much less capacity to spend on the problem at hand.

It takes a while to adjust to a more regular notation, but that's true of anything unfamiliar. And what you get in return is astonishing.


Unsurprisingly, working with balanced brackets is something computers are well suited for. Even in Vim, an editor not exactly built with Lisp development in mind, handling mismatched brackets is pretty trivial.


Agreed, I do all my lisping in vim and have for years. Mismatched parens has pretty much been the least of my worries from day one.


While that's a bit of an acerbic way to put it, I agree. The human brain has a very short stack depth, so we're no good at nested pattern matching.

Indentation or delimiter based languages are much easier for me to parse than lisps.


As they say, “If you think paredit is not for you then you need to become the kind of person that paredit is for.”


One of the reasons I'm falling in love with Rebol and Red. Very Lispy, but without the hassle of the parens.

Brackets, yes, but not nearly as bad.


The usual solution is to indent your lisp code.


Why repeat myself? If the indentation contains the information, why can't I just use that?


The parens are the single point of truth, the indentation is just for display and is (almost entirely) mechanically generated.


Because then you end up with things like Python's one-expression lambda.


What I hate is when my delimiters are unmatched and I can't tell where. I haven't used Lisps, but it bothers me a lot even in C-like languages.


Generally things like paredit force brackets to never go unmatched. There is also rainbow colored parens.


There are good text editors and IDEs for Lisp-like languages that will help you match braces. It stops being a problem very quickly. Also, as in other languages, you can make the structure of your code clear with indentation (good editors/IDEs will help you with that, too, of course).

Also, in C/C++/Java/C#/whatever code, you can run into the same kind of problem on a smaller scale when editing deeply nested blocks and expressions, especially when dealing with complex arithmetic/logic expressions.

(I basically only use Lisp when messing with Emacs, so I would not call myself a Lisp hacker, but when learning Lisp, the braces cease to be a problem after a month at most.)


M-x paredit-mode



As opposed to checking the indent level? (Python)


I'm guessing you mean with Python? I have to say I've never once had a bug related to number of spaces.


I billed 8 hours work to a company to fix a python bug that was exactly this problem.

After a lot of refactoring by someone inexperienced in python, something was indented inside of a loop that should have been outside it.

8 hours work for de-denting one line of code.


Nothing raises the ire of HN devotees like the most mild criticism of Lisp aesthetics... Such a sensitive lot.


I'm sure if lispers started going into Python threads and whining about "argg all the whitespace makes me want to gouge my eyes out!!!11" the response would be similar. Don't like parens? That's fine, really. Want to write off an entire class of languages because the syntax isn't plastered with curly braces? Go ahead!

Just don't expect anyone to be all that interested when you start blubbering about it.


If a post on Python has more than a handful of comments, the whitespace discussion almost inevitably comes up, in my experience.



