
Sure, I get that. I just think a language needs to be built from the ground up to allow multi-programmer collaboration, and that there are few if any valuable lessons to be taken from what works in the single-programmer case.


You're asserting that this isn't multi-programmer friendly. I'll agree that it's not "code monkey" friendly, but I disagree that it is not oriented towards multiple programmers. And the APL language has almost all the features you would expect from a modern multi-paradigm language, including branching, control structures, recursion, exceptions, objects, frameworks, interfaces to other languages, and so on and so forth.

But APL was designed from the beginning to enable human communication. I would argue that almost all programming languages fail to be a good human medium of communication. The evidence I give in support of this assertion is how people write when they think the computer won't need to see the code, such as in academic publications on computer science: look at what they use in the paper. Almost all of the people who implement their ideas in one language or another fail to include the entire code in their papers; they usually include some mathematical notation and diagrams to explain their ideas instead. They may include some small snippets of code, but they rarely if ever include the full code. Dan Friedman being an exception that proves the rule, if you will.

If you then take a look at how APLers communicate when they have ideas, you see code all the time, all day long. The APL community is the only one I've seen that can regularly write complete code and talk about it fluently on a whiteboard between humans without hand waving. Even my beloved Scheme programming language cannot boast this. When working with humans on a programming task, almost no one uses their programming language as the primary communication method between themselves and other humans outside of the presence of a computer. That signals to me that they are not, in fact, natural, expedient tools for communicating ideas to other humans. The best practices utilized in most programming languages are, instead, attempts to ameliorate the situation to make the code as tractable and as manageable as possible, but they do not, primarily, represent a demonstration of the naturalness of those languages for human communication.


Academia is its own thing with its own incentives. I wouldn't generalise from what happens in academic papers.

When I see people communicating in (my part of) the industry they use pseudocode, which is often described as looking like python. They use if anything fewer symbols (and more space) than a real programming language. They do indeed elide parts of the code - often things like error handling.

To my mind that says: we should use languages in which code looks like pseudocode/python (this idea was suggested in http://paulgraham.com/hundred.html , though he takes it in a different direction). And we should look for ways to elide in real code the parts that people like to elide when talking about programs: to e.g. have "ambient" error handling that's more-or-less invisible most of the time, without sacrificing the safety advantages of checking error cases (this is why I'm interested in e.g. effect systems).
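
Roughly what I have in mind, as a minimal sketch in Python (the URL and field names are made up for illustration, not from any real system): the happy path reads like the pseudocode people write on whiteboards, while the elided concern - errors - propagates implicitly and surfaces once at a boundary.

    import json
    import urllib.request

    def fetch_user(url):
        # No explicit error checks in the "interesting" logic: a bad URL,
        # a network failure, or malformed JSON all raise and propagate.
        with urllib.request.urlopen(url) as resp:
            return json.loads(resp.read())

    def display_name(user):
        return user["first"] + " " + user["last"]

    def main():
        try:
            user = fetch_user("https://example.com/api/user/42")
            print(display_name(user))
        except Exception as exc:
            # The single place where the elided concern becomes visible.
            print("could not load user:", exc)

    if __name__ == "__main__":
        main()

An effect system would let you keep the body looking like this while still checking, statically, that the error case is handled somewhere.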


I'd be very surprised if your industry really did use complete pseudocode and only elided error handling. On the other hand, you're sort of assuming in your conclusion that pseudocode is the "better way" for languages because that's what people use, but you're leaving out the initial bias. I would argue that if you made current industrial languages more like pseudocode, you'd probably do better, yes, but it's a local maximum derived from an assumption of what the end result will be.

In other words, people use pseudocode because it's close to the code they intend to write and represents their current notational expectations. It's an enforcement of legacy methods of thinking.

But many people have admitted that there is a problem with writing pseudocode-style programs for modern hardware, where taking advantage of parallelism is important for performance.

Furthermore, I would argue that academia is relevant because it's one of the few places where the ideas are more important than the executable. If the ideas are communicated clearly, then you've succeeded. If we really want to program for the human, then we want our programs to be focused on the communication of ideas, not on the machine. And the reality is that if you take the machine away, and focus on human-to-human communication, without any "industrial" bias (expectation of machine execution), then rigorous idea communication is almost always pictorial, visual, and ideographic. Furthermore, the notations that people develop and have developed over time to communicate ideas never end up looking like mainstream programming languages. As people work with ideas, math notation is the quintessential notation for communicating human ideas rigorously. It is highly evolved for human consumption and manipulation, rather than for machine execution.

I believe there have also been some studies on how people without any computing background describe processes, and invariably many of the core "serial" programming concepts turn out not to be "natural" in human thought, but a very acquired taste.

Again, I would be surprised if, when you put a bunch of industry or non-industry professionals at a whiteboard and had them illustrate their ideas rigorously to one another on just that whiteboard, they naturally gravitated to any real programming language. And I strongly doubt that they would actually continue to use pseudocode at scale on the whiteboard.


> I'd be very surprised if your industry really did use complete pseudocode and only elided error handling. On the other hand, you're sort of assuming in your conclusion that pseudocode is the "better way" for languages because that's what people use, but you're leaving out the initial bias. I would argue that if you made current industrial languages more like pseudocode, you'd probably do better, yes, but it's a local maximum derived from an assumption of what the end result will be.

Error handling was one example - I see concerns like serialization, permissions, transactionality commonly elided, and I look for better ways to handle them in programming languages as well.
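
For example, transactionality can be pushed into a decorator so the code people actually discuss stays close to pseudocode. A hypothetical Python sketch (sqlite3 and the accounts table are just placeholders for illustration, not a recommendation of a particular mechanism):

    import sqlite3
    from functools import wraps

    def transactional(func):
        # Commit on success, roll back on any exception; the calling
        # code never mentions transactions at all.
        @wraps(func)
        def wrapper(conn, *args, **kwargs):
            try:
                result = func(conn, *args, **kwargs)
                conn.commit()
                return result
            except Exception:
                conn.rollback()
                raise
        return wrapper

    @transactional
    def transfer(conn, src, dst, amount):
        # The whiteboard version: two updates, no visible transaction
        # plumbing or error handling.
        conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?", (amount, src))
        conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?", (amount, dst))

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100), (2, 0)])
    transfer(conn, 1, 2, 40)

The interesting question is how to get that kind of elision with language-level guarantees rather than ad hoc wrappers.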

> I would argue that academia is relevant because it's one of the few places where the ideas are more important than the executable. If the ideas are communicated clearly, then you've succeeded.

Maybe. That assumes that the successful papers (and successful academics) are those that communicate ideas clearly. I'm not convinced.

> the reality is that if you take the machine away, and focus on human-to-human communication, without any "industrial" bias (expectation of machine execution), then rigorous idea communication is almost always pictorial, visual, and ideographic.

Not my experience at all - if anything I'd say visual aspects tend to be a marker of less rigorous communication.

> Furthermore, the notations that people develop and have developed over time to communicate ideas never end up looking like mainstream programming languages. As people work with ideas, math notation is the quintessential notation for communicating human ideas rigorously.

Mathematics is one such notation; "legalese" is another, and philosophical terminology a third. I'm wary of generalising too much from mathematical notation alone.


> Not my experience at all - if anything I'd say visual aspects tend to be a marker of less rigorous communication.

I would point to the field of combinatorics, and to the traditional proofs of both the ancient Chinese mathematicians and those of the West, which took on various elements of geometry and spatial reasoning for a significant number of their proofs when other tools were not yet available. The development of algebra I see as a chiefly visual and ideographic one, even a tangible or malleable one. The development of UML diagrams is another example. Flow charts another. We have the abacus and Chinese counting sticks as well. And finally, while poetry is not specifically rigorous, it is efficient in a way that few other communication methods are, and we find a great deal of "visual cue" elements in that field. In the physical sciences and statistics, visualization is a very important tool. Mathematical notation itself is largely spatial and visual at scale.

As for legalese, I would argue that it is perhaps well designed to be complete for experts, but not to be clear. Comprehensiveness is different from clarity or rigor. And as for philosophy, vocabulary is not enough. You'll note that some of the best notational systems to arise came from philosophy departments working on logical systems, and those are usually represented using ideographic forms rather than natural language. Even some Eastern philosophers who wrote very verbosely tended to make their arguments by appealing to visualizations in the mind.

Musical notation, again, has evolved into a spatial, visual notation. A large number of traditional writing systems were ideographic, including ones we now consider alphabetic/phonetic.



