Honestly, just about anything is going to be faster, productivity-wise, than C++. When you stop having to think about how you're going to structure your inheritance and classes, you--shockingly--get things done. But more metaprogramming is not the answer. And I've used Scheme since 2002 or so. I've implemented Scheme interpreters and compilers. But I'm no longer a cheerleader for macros and call/cc or Scheme (or Lisp, for that matter).
You know what gets things done and makes things easy to maintain? Boring ass code. IF statements. FOR loops. I mostly use Perl today. It doesn't get in the way. But getting things done is not trendy. That's where we are today.
You know what's the problem with boring code? It's boring. This means its information content is low, and its abstraction level is low. This means that you need more of it to express an algorithm.
When you have a lot of wordy, boring code to maintain, you have to make coordinated changes in more similarly boring places. A human brain can only hold so many lines of context at once. So it becomes easier to make a mistake.
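To make that concrete, here's a minimal Perl sketch (hypothetical data; Perl only because this thread is Perl-heavy anyway): the same filter written as boring step-by-step code and as a single higher-level expression. Both are correct; the second packs more of the algorithm into each line, so there is less of it to keep in your head.

    use strict;
    use warnings;

    my @orders = (
        { id => 1, total => 250 },
        { id => 2, total =>  80 },
        { id => 3, total => 120 },
    );

    # Boring version: every step spelled out; more lines to change in sync.
    my @big_ids;
    for my $order (@orders) {
        if ( $order->{total} > 100 ) {
            push @big_ids, $order->{id};
        }
    }

    # Denser version: the same algorithm as one expression.
    my @big_ids_too = map { $_->{id} } grep { $_->{total} > 100 } @orders;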
I understand that abstraction astronautics can leave you with puzzling, convoluted, hard-to-maintain code full of leaky unintuitive abstractions. This problem is not unique to Lisp macros; languages like C++ and even Java are known to be widely used by perpetrators of the above-mentioned atrocities.
What makes code easier to maintain is clear separation of concerns and a low impedance mismatch between the code's abstractions and the subject area. This is, again, attainable in a number of languages (though expressive power and minimalism help make it even nicer), given the right mindset and skills. I suppose John Carmack possesses both.
I recently watched a colleague write an elaborate system to parse a few different CSV feeds. There were a dozen different interfaces, all mixed in with the lovely Java design patterns.
I've started using the phrase "I'd rather deal with poorly written code than with a well-planned architecture." Obviously, by "well-planned architecture" I'm referring to overly architected solutions.
When it comes to architecture, I've lately turned towards BCNF as my god. My premise is that if my data model - my internal application data, not just "the database" - is in as normalized a form as I can reasonably get it, given the typical constraints of procedural/OO/functional styles, then my features automatically grow along a flexible, decoupled grain, because each one operates on exactly the right slice of data, no more, no less. "Guess and check" and "OO design pattern" strategies don't seem to get me there, because they tend to start with whatever is language-easy or looks pretty at first glance, and take on the problems later. And it seems to work: the thing I have right now is, indeed, incredibly flexible for the amount of code involved. And it isn't really "architected" in the usual sense otherwise - there are no grand plans.
The only problem I'm having with this tack is that it reveals all the technical debt at once, which produces an enormous amount of pain early on. My friends smirked today at my woes of trying to make a clickable button, which has to piece together stuff from the graphics layer, input events, text fields, and internal button state - an enormous variety of data, all told, debt that is usually hidden from view at some level. It all makes sense, it's all decoupled, the lifetime of the state is automatically managed, and any configuration you want is just a matter of writing the data for it. But making that first button is quite a headache.
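For the curious, a minimal sketch of what I mean (hypothetical names, Perl hashes standing in for relations): each concern is its own normalized "table" keyed by entity id, and a button is nothing but an id that the graphics, text, and input systems each join against.

    use strict;
    use warnings;

    # One hash ("relation") per concern, all keyed by entity id.
    # No Button class owning everything.
    my %sprite  = ( 42 => { image => 'button.png' } );
    my %label   = ( 42 => 'Click me' );
    my %hitbox  = ( 42 => { x => 10, y => 20, w => 80, h => 24 } );
    my %pressed = ( 42 => 0 );

    # The input system joins only the relations it needs: hitbox and state.
    sub handle_click {
        my ( $mx, $my ) = @_;
        for my $id ( keys %hitbox ) {
            my $h = $hitbox{$id};
            $pressed{$id} = 1
                if $mx >= $h->{x} && $mx < $h->{x} + $h->{w}
                && $my >= $h->{y} && $my < $h->{y} + $h->{h};
        }
    }

    handle_click( 15, 25 );
    print "pressed: $pressed{42}\n";    # pressed: 1

A second button is just another row in each hash. The headache is that the first one forces all four "tables" into existence at once.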
I had a funny feeling doing a SQL MOOC when I had to re-learn normalization and realized what a generic decoupling algorithm it is. Suddenly all of OOP seemed tiny and ad hoc.
Would you mind telling me which MOOC you did for SQL? I am quite rusty - (~10 years since I did any serious SQL stuff) but I am finding it is coming up quite a lot now for me.
A well-planned architecture is often the smallest thing that works correctly. A well-architected car is unlikely to have 37 wheels (though a poorly-built one might).
You know, the ideal device is one that is not even there, yet its function still gets executed. This ideal is rarely attainable, but it's something to strive for.
This! I joined a new company recently to build out the systems. Instead of trying to predict the future and build for it, I just went ahead and built a bare-minimum architecture, using TDD as I went. The start was a little slow, but now, when I get requests to change things entirely (e.g., an entire segment of logic was to be shifted into the database so an administrator could manage its behaviour), I get it done fairly quickly.
"You know what's the problem with boring code? It's boring. This means its information content is low, and its abstraction level is low. [...] When you have a lot of wordy, boring code to maintain, you have to make coordinated changes in more similarly boring places. [...] So it becomes easier to make a mistake."
A problem nicely summarized by Yaron Minsky (of Jane Street): "You can’t pay people enough to carefully debug boring boilerplate code. I’ve tried."
"You know what's the problem with boring code? It's boring."
Code may also be boring simply because it is unsurprising for someone familiar with the subject matter.
"You know what gets things done and makes things easy to maintain? Boring ass code. IF statements. FOR loops." I think you are channeling some of the Go philosophy there :).
Who's talking about code style? Some Go users like to talk about Go as if anyone who doesn't like it just doesn't "get it", or obviously doesn't appreciate Getting Things Done.
Carmack seems to be going through his FP phase. It's a phase many programmers go through in their younger years (Carmack must have missed it, because he was occupied with Keen, Wolfenstein, Doom and Quake at the time), when they read SICP and learn Scheme, ML, etc., before the novelty wears off and they come back to plain old imperative, mutable programming.
I came back to Perl about 18 months ago, after working as a sysadmin for a couple of months and being fed up with Python's unicode handling (Python 2.x, I haven't given Python 3.x a try, yet).
I do not think Perl is a pretty language, but I have come to appreciate how useful it is. If all you want is a smallish application (roughly, less than 1 KLOC), especially if you're only going to use it once or maybe a handful of times, no other language I have met can keep up.
And for the kind of problem I typically use Perl for - reading, say, a CSV file or an Excel spreadsheet, filtering the data according to some criterion, fetching and adding data from an external source, say, an LDAP directory or a relational database, then inserting the result into a database or emitting another CSV file - it is also surprisingly hard to beat Perl's runtime performance, especially its regex engine.
I'm not saying it can't be done, but for a program you're essentially throwing away after a week or so, it's usually not worth the hassle.
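To illustrate the kind of throwaway job I mean, here's a sketch (assuming the CPAN module Text::CSV is installed; the file names and the filter criterion are made up): read a CSV, keep the rows matching a criterion, write the result back out.

    use strict;
    use warnings;
    use Text::CSV;    # CPAN module, assumed installed

    my $csv = Text::CSV->new( { binary => 1, auto_diag => 1, eol => "\n" } );

    open my $in,  '<', 'input.csv'    or die "input.csv: $!";
    open my $out, '>', 'filtered.csv' or die "filtered.csv: $!";

    # Keep only rows whose third column looks like an address in one domain.
    while ( my $row = $csv->getline($in) ) {
        next unless defined $row->[2] && $row->[2] =~ /\@example\.com\z/;
        $csv->print( $out, $row );
    }

    close $in;
    close $out or die "filtered.csv: $!";

Swap the filter or the output stage (say, a DBI insert instead of a second CSV) and the rest of the script is unchanged, which is most of why these one-week programs stay cheap.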
"Honestly, just about anything is going to be faster, productivity-wise, than C++."
I don't think that's been true for a while. Boost went a long way towards making C++ much more productive, and now that's gone even further with C++11 and 14.
Compile times are still terrible, there is still a terribly high number of sources of undefined behavior that will burn through your time in debugging sessions, and there's still terribly far to go before its feature list catches up to "just about anything" (albeit a slightly smaller set this time around).
Things are improving in C++-land, but I'd still place it near last.