> New languages have helped a lot in creating new models of social organization
Glad you agree!
> To be concrete, the significant 90s languages were forgiving enough, powerful enough, and easy enough to learn that they allowed for a kind of triumph of the programmer proletariat over the artisans.
In terms of practice, yes: the languages that would go on to be popular without dragging one down with manual memory management were created in the 90s. The underlying technology was older.
> Java as a prime exemplar of the value that the research community brings to the world
Agreed. Still not as un-innovative as Go though!
> And when I think of more recent language innovation (Go, Rust), they've only been incremental improvements that were anticipated decades ago.
Rust fully admits it is recycling old research. Its productivity gains are diminished by the fact that it targets systems programming, which is a less productive domain to begin with. If somebody made a Gc<T> and built an implicitly garbage-collected version of Rust to compete with Go and Java (with great FFI to regular Rust for free, of course), it would already be a better candidate than any other popular language, and that requires no new research.
> I'd love to hear some more details of the untapped engineering potential!
So, I am a Haskeller. I would say there are 3 tracks that interest me:
1. The "classic track" of fancier type systems. The Dependent Haskell work is interesting because it should allow the hodge-podge of language extensions we have today to be streamlined into fewer features, reducing complexity. Writing proofs means less productivity, however, combining libraries with lemmas might be extremely productive. Think being able to blindly mash stack overflow answers together with extreme fidelity. That should be a goal.
2. The "categorical strack". Lambda the ultimate....mistake :D. Our programs are higher order, but there is often a first order steady state, like the things we put on whiteboards for our managers. Our current practices utterly fail to capture that first order steady state anywhere in the code, and as languages become "more functional" (java 8+, ES whatever, etc.) we're basically throwing more lambdas at problems without principle.
Using some categorical primitives (and I mean the "realer" ones in https://hackage.haskell.org/package/categories, not the ones in https://hackage.haskell.org/package/base that need a lot of squinting to make out the original math) gives us a chance to perhaps fix this. Categories put input and output on equal footing, which helps, and the simple wire diagram makes the plumbing / dataflow way of thinking that every programmer should employ much more apparent. "Point-free" programming sucks, but I ultimately think we can get something like Scratch that makes sense for programmer-adjacent types and yet is useful for "actual work" (the second sketch below gestures at this).
3. The "incremental track". There is a decent amount of literature around incremental / reactive programming and models of computation, but it needs to be properly synthesized. I think this is will be huge because it will improve techniques that actually match what real world programs do (toy "everything terminates" stuff from school is actively harmful when students extrapolate that theory has nothing to do with practice). This will defeat an entire class of concurrency woahs that currently is a huge source of productivity loss---I don't mean just race conditions, but the more general "how can I analysis (with the ooriginal "break apart" connotations) my program into simply peaces and then synthesize a holistic understanding". In terms of how this will actually happen, this strongly ties into the above.
Let me say 2 more things:
1. We should have "one language to rule them all", in that I can write my firmware, kernel, end applications, and everything in between in it without compromise. People think this is silly ("engineering is tradeoffs, amirite?"), but it isn't. The language would become a "dialect continuum" that supports vastly different, clashing idioms for those different domains (GC everywhere? Manual memory management? No call stack even?), but with flawlessly type-safe FFI between the dialects.
2. Non-programmers think "applied math" means plugging in numbers and writing regular natural-language prose around them. Math isn't numbers, and people's notion of it must be fixed accordingly. Proof theory means the whole argument, not just the numerical processing, can be formalized (a toy example follows below). This is how many jobs should work.
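To illustrate with a toy of my own (made-up predicates, nothing domain-specific): in a proof assistant like Lean, the syllogism itself is machine-checked, not just the arithmetic inside it.

    -- "Every regulated product needs a license; widgets are regulated;
    -- therefore widgets need a license." The whole argument is formal.
    example (Product : Type) (Regulated NeedsLicense : Product → Prop)
        (everyRegulatedNeedsLicense : ∀ p, Regulated p → NeedsLicense p)
        (widget : Product) (widgetRegulated : Regulated widget) :
        NeedsLicense widget :=
      everyRegulatedNeedsLicense widget widgetRegulated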
Thanks for the meaty comment! I was expecting a bunch of stuff in the "fancier type system" category and was pleasantly surprised.
Do you have any references to share to learn more about potential practical uses of categories? Assume the knowledge of someone who isn't intimidated by the wiki page on categories but who learns by reading it.
> I was expecting a bunch of stuff in the "fancier type system" category and was pleasantly surprised.
Yeah, after enough functional programming, one's sense of where the lowest-hanging fruit is shifts.
> Do you have any references to share to learn more about potential practical uses of categories?
I'm afraid not. I suppose there are plenty of blog posts on Haskell and category theory, and the nLab for the actual math, but I don't know of a resource that zooms out from the neat tricks and tries to discuss the broad needs of better programming and architecture.
But I'm relieved to say that where I work, we are working on some things in the vein of tracks 2 and 3. It will be open-sourced when it's ready, so... stay tuned, I guess?