
Academia is its own thing with its own incentives. I wouldn't generalise from what happens in academic papers.

When I see people communicating in (my part of) the industry they use pseudocode, which is often described as looking like python. They use, if anything, fewer symbols (and more space) than a real programming language. They do indeed elide parts of the code - often things like error handling.

To my mind that says: we should use languages in which code looks like pseudocode/python (this idea was suggested in http://paulgraham.com/hundred.html , though he takes it in a different direction). And we should look for ways to elide in real code the parts that people like to elide when talking about programs: e.g. to have "ambient" error handling that's more or less invisible most of the time, without sacrificing the safety advantages of checking error cases (this is why I'm interested in e.g. effect systems).
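
To make the "ambient" part concrete, here's a minimal Python sketch (all names hypothetical; exceptions are only a crude approximation of what an effect system would check statically):

    class SyncError(Exception):
        pass

    def fetch_profile(user_id):   # stand-in for a fallible network call
        if user_id < 0:
            raise SyncError("no such user")
        return {"id": user_id, "name": " Ada "}

    def normalize(profile):
        return {**profile, "name": profile["name"].strip()}

    def save(profile):            # stand-in for a fallible database write
        print("saved", profile)

    # Error handling declared once, off to the side, rather than
    # threaded through every call as explicit checks.
    def sync_user(user_id):
        try:
            save(normalize(fetch_profile(user_id)))
        except SyncError as e:
            print("sync failed:", e)

The happy path in sync_user reads like the whiteboard pseudocode; an effect system would aim to keep that invisibility while still verifying statically that SyncError is handled somewhere.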




I'd be very surprised if your industry really did use complete pseudocode and only elided error handling. On the other hand, you're sort of assuming in your conclusion that pseudocode is the "better way" for languages because that's what people use, but you're leaving out the initial bias. I would argue that if you made current industrial languages more like pseudocode, you'd probably do better, yes, but it's a local maximum derived from an assumption of what the end result will be.

In other words, people use pseudocode because it's close to the code they intend to write and represents their current notational expectations. It's an enforcement of legacy methods of thinking.

But many people have acknowledged that pseudocode-style programming is a problem for modern hardware performance, where taking advantage of parallelism is important.
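
As a toy Python illustration of that tension (a sketch of the shape of the problem, not a benchmark):

    import concurrent.futures

    def work(x):        # stand-in for an expensive, independent task
        return x * x

    if __name__ == "__main__":
        items = list(range(8))

        # The "natural" pseudocode shape: one step after another.
        serial = [work(x) for x in items]

        # The shape modern hardware rewards: independent work at once.
        with concurrent.futures.ProcessPoolExecutor() as pool:
            parallel = list(pool.map(work, items))

        assert serial == parallel

The serial loop is what pseudocode (and most mainstream notation) makes natural to write; nothing in its shape tells the reader, or the machine, that the iterations are independent.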

Furthermore, I would argue that academia is relevant because it's one of the few places where the ideas are more important than the executable. If the ideas are communicated clearly, then you've succeeded. If we really want to program for the human, then we want our programs to be focused on the communication of ideas, and not machine-focused. And the reality is that if you take the machine away, and focus on human-to-human communication, without any "industrial" bias (expectation of machine execution), then rigorous idea communication is almost always pictorial, visual, and ideographic. Furthermore, the notations that people have developed over time to communicate ideas never end up looking like mainstream programming languages. Math notation is the quintessential notation for communicating human ideas rigorously: it is highly evolved for human consumption and manipulation, rather than machine-focused.

I believe there have also been some studies on how people without any computing background describe processes, and they consistently find that many of the core "serial" programming concepts are not "natural" in human thought, but a very acquired taste.

Again, if you put a bunch of industry or non-industry professionals at a whiteboard and had them illustrate their ideas rigorously to one another on just that whiteboard, I would be surprised if they naturally gravitated to any real programming language. And I doubt strongly that they would continue to use pseudocode at scale on the whiteboard.


> I'd be very surprised if your industry really did use complete pseudocode and only elided error handling. On the other hand, you're sort of assuming in your conclusion that pseudocode is the "better way" for languages because that's what people use, but you're leaving out the initial bias. I would argue that if you made current industrial languages more like pseudocode, you'd probably do better, yes, but it's a local maximum derived from an assumption of what the end result will be.

Error handling was one example - I see concerns like serialization, permissions, and transactionality commonly elided, and I look for better ways to handle them in programming languages as well.
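
For a sketch of what gets elided (all names hypothetical), here's the kind of cross-cutting machinery Python programmers currently bolt on with decorators and context managers - roughly today's halfway house toward making such concerns ambient:

    from contextlib import contextmanager

    @contextmanager
    def transaction():                 # hypothetical stand-in for real
        print("BEGIN")                 # transactional machinery
        try:
            yield
            print("COMMIT")
        except Exception:
            print("ROLLBACK")
            raise

    def require_permission(perm):      # hypothetical permission check
        def wrap(fn):
            def inner(user, *args):
                if perm not in user["perms"]:
                    raise PermissionError(perm)
                return fn(user, *args)
            return inner
        return wrap

    # The whiteboard version is just the inner print; the rest is the
    # plumbing people elide when talking about the program.
    @require_permission("transfer")
    def transfer(user, amount):
        with transaction():
            print(user["name"], "transfers", amount)

    transfer({"name": "ada", "perms": {"transfer"}}, 100)

Decorators keep the elided concerns out of the function body, but they're ad hoc and unchecked; the hope is to get the same visual quiet with the checking kept.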

> I would argue that academia is relevant because it's one of the few places where the ideas are more important than the executable. If the ideas are communicated clearly, then you've succeeded.

Maybe. That assumes that the successful papers (and successful academics) are those that communicate ideas clearly. I'm not convinced.

> the reality is that if you take the machine away, and focus on human-to-human communication, without any "industrial" bias (expectation of machine execution), then rigorous idea communication is almost always pictorial, visual, and ideographic.

Not my experience at all - if anything I'd say visual aspects tend to be a marker of less rigorous communication.

> Furthermore, the notations that people have developed over time to communicate ideas never end up looking like mainstream programming languages. Math notation is the quintessential notation for communicating human ideas rigorously.

Mathematics is one such notation; "legalese" is another, and philosophical terminology a third. I'm wary of generalising too much from mathematical notation alone.


> Not my experience at all - if anything I'd say visual aspects tend to be a marker of less rigorous communication.

I would point to the field of combinatorics, and to the traditional proofs of both the ancient Chinese mathematicians and those of the West, which drew on elements of geometry and spatial reasoning for a significant number of their proofs when other tools were not yet available. I see the development of algebra as a chiefly visual and ideographic one, even a tangible or malleable one. The development of UML diagrams is another example. Flow charts another. We have the abacus and Chinese counting sticks as well. And finally, while poetry is not specifically rigorous, it is efficient in a way that few other communication methods are, and we find a great deal of "visual cue" elements in that field. In the physical sciences and statistics, visualization is a very important tool. Mathematical notation itself is largely spatial and visual at scale.

As for legalese, I would argue that legalese is perhaps well designed for experts to be complete, but not for clarity. Comprehensiveness is different from clarity or rigor. And as for philosophy, vocabulary is not enough. You'll note that some of the best notational systems arose from philosophy departments working on logical systems, and those are usually represented using ideographic rather than natural-language forms. And even some Eastern philosophers who wrote very verbosely tended to make their arguments from visualizations in the mind.

Musical notation, again, has evolved into a spatial, visual notation. A large number of traditional writing systems were ideographic, including ones we now consider alphabetic/phonetic.



