
Why do you think some of the so-called hacker languages were pretty popular at one time (Perl, Lisp), and others weren't (Forth, APL)?


There was no Internet and knowledge was shared via magazines and computer clubs.

Forth was quite popular in Europe thanks to Jupiter Ace and ZX Spectrum extensions.

https://en.wikipedia.org/wiki/Jupiter_Ace

http://www.worldofspectrum.org/infoseekid.cgi?id=0008717


That's true. And APL and its children are still quite popular in finance.

By the way, I found a rather neat page about building a Jupiter ACE: http://searle.hostei.com/grant/JupiterAce/JupiterAce.html


Mentioning APL in this context makes me picture an 8-bit home computer with an APL interpreter. I mean, the C64 had all kinds of weird symbols on its keyboard, too. And the magazines would have saved quite a few pages on their source code listings compared to BASIC...


According to Wikipedia, the first implementation of APL on a micro was on the 8008, in '73. It wasn't very good, but it did run...

Later ('77), TIS APL ran on Z80 systems, and was apparently quite a bit better, although it wasn't a full implementation.

But I'm just quoting Wikipedia at this point, so look it up yourself.


For Forth, that's an easy answer: when you think "hacker" you're thinking of people working on the Big Iron. Forth's niche was small machines, so while it had a good bit of popularity on the micros, its influence on the mainframe hackers, with their "we hate micros" ethos, was minimal.


I suspect younger programmers can't even begin to imagine what it was like. I learned to program on the Commodore 64. The default language was (Microsoft) Basic. Line numbers. The only flow control was IF, GOTO, and GOSUB/RETURN. No loops, no ability to write functions, nothing like a define or a macro.

And then I got my hands on a Forth. Functions. Control structures. Maybe they weren't fancy by modern standards, but they were easily an order of magnitude better than Basic's, and you had the ability to write new ones fairly easily. Quick compiles, code which was both small and fast. Easy inline assembly code for when you needed even more speed. It was a dream come true.


We younger programmers started with Scratch. Sure, it makes pretty pictures, and you don't have to type, but it has only global and object-local variables, barely has function calls, and is generally awkward.

Or at least, it was last I used it.


I read through the recent thread on teaching kids to code (https://news.ycombinator.com/item?id=13499626) and was mildly perplexed at how seemingly successful Scratch has been.

It wasn't until the 2nd or 3rd time I'd used it that I actually figured out how to make sense of it and run something (for Scratch's definition of "run").

To be honest, I've progressed extremely slowly with CompSci/programming over the 18 years I've been using computers (I got my first one around 7-8) - I started with QBasic, have been shouting at PHP for way too long, have a basic understanding of C that I badly need to develop, and am moving toward playing with Lua next - and I hardly consider myself a dyed-in-the-algorithms academic type with a brain that's unable to understand Scratch. (In fact, I'd argue that the best programming teachers would be precisely those types of people, and if they were unable to understand Scratch that would be a major problem.)

Rather, I firmly believe Scratch's UI is a disaster, and horribly unintuitive to use. Other languages are beset with grammatical idiosyncrasies; with Scratch you have to learn the UI before you can learn the... few parts of the language that are actually there.

I'm concerned that systems like Scratch are so widely used; I fear that it's an even worse mind-scrambler than the bad sides of BASIC. Of course, like BASIC, there are good sides, and it teaches the basics without presenting a Mt. Everest-sized learning curve. Perhaps https://en.wikipedia.org/wiki/Dartmouth_BASIC was the Scratch of 1964, and I'm just griping about the diluting effects of educationally-targeted software in this day and age and "modern" GUI design.

Scratch is also really slow/laggy on my old laptop (Thinkpad T43), I can't imagine how bad it is for schools with limited hardware.


When I used Scratch 1.4, I don't recall the UI being too bad, and it ran pretty fast: Smalltalk is pretty good at that. But I'm generally pretty good at picking up these sorts of things, and my computer wasn't particularly slow.

If you want a Real Language presented the same way, Snap! (descended from BYOB) is essentially a Scheme in Scratch's clothing.

But the two real draws of Scratch were its hackability and its community. Back before Scratch 2.0 ruined everything, Scratch was written in Smalltalk, and using a widely-known hidden feature, you could examine the source code and make whatever changes you wanted with relative ease, resulting in a healthy community of mods and derivatives which explored new features and ideas, or those that the official team had dropped by the wayside (like Mesh, a fully-featured networking system).

Scratch's community was likewise excellent: I spent a lot of time lurking in the Scratch Advanced Topics forum - a sort of off-topic general programming section, where people far smarter than I discussed modifying Scratch, improving the website, and whatever programming projects they happened to be working on (usually web programming in PHP - it was the mid 2000s, after all).

But that's enough nostalgia for one day...


Ah, I encountered Scratch 2.0. I'll definitely check 1.4 out; it looks a lot more accessible and reasonable. I would have loved to have encountered something like this at 14 or 15.

I suspect the reason Scratch felt slow to me is that 2.0 is some kind of HTML5 and/or Flash mess now - you're right, Smalltalk is really fast. I spun up Squeak to check something on this T43 yesterday, and everything was really snappy. I have no reason to expect Scratch 1.4 will be any slower.

Also, I wouldn't be surprised if a reasonable bit of the exploration everyone did was motivated by the fact that they were "hacking" the platform :P

There seems to be a sad lack of excellent online communities nowadays; I've long looked for sites to complement HN, but without success.

I just had a look at Snap! which is interesting. It definitely flatlines this laptop though, I had to try it on a faster machine. But I'm running the tree animation demo right now, and it looks awesome....


>Also, I wouldn't be surprised if a reasonable bit of the exploration everyone did was motivated by the fact that they were "hacking" the platform :P

Nor would I. Even at the age of 8, before I was really able to understand the code, there was a thrill to it, in a cracking-open-the-toy kind of way.

And it helps that Smalltalk does exploration better than just about any other language/environment. You can just open up any Smalltalk app and extend/take it apart using the same tools the developers did to build it.

>There seems to be a sad lack of excellent online communities nowadays; I've long looked for sites to complement HN, but without success.

Lainchan (a sort-of cyberpunk/whatever chan) is quite popular with some of the people who are here on HN. It's a very different atmosphere, but it does emphasize actual good discussion. And it's got a containment board for politics, which always helps.

At the very least, their magazine (https://lainzine.neocities.org) is worth looking at, if not for the generally interesting articles, then for the outright strangeness of a lot of it.


Wow, nice! At 8 (1999) I was given a probably-6-or-7-year-old 286 running DOS 3.3 with nothing on it. That got swapped for an XT a couple years later, which I discovered QBasic on and got tangled up in for way too many years :S. Smalltalk would have been awesome to discover at that age, more so a toy with sekret doors and passages in it for me to discover :D

I was recommended Lainchan a couple months ago, actually, but nobody mentioned the magazine, which is really cool. I am not impressed that the ASCII art generation paper in Vol. 1 doesn't have any associated source code!! The rest of the magazine content and design is really interesting too.


For what it's worth, after a bit of research [1], I discovered Commodore Basic 2.0 had simple FOR loops (which I think I used and forgot), and the ability to create functions which encode a single one-variable mathematical expression (which I don't think I ever knew about).

[1] https://www.c64-wiki.com/wiki/DEF


I've heard the BBC BASIC was quite a lot better, but I've never worked in it myself...


I forget what the name was, but I used a Microsoft Basic on the IBM PC platform a few years later which was light years better than the C64 Basic. Still gave it up ASAP after learning C, mind you.



Is anyone here using ortho-k lenses? I've replaced my usual soft contacts and been wearing them for half a year now and it's so much easier on the eyes. Although I still see slight halos and by the evening my vision is noticeably worse, I can't complain. And I have an option to undo all this, unlike some poor LASIK patients.


This is macOS font rendering for you; on Linux, the normal weight looks normal.


Or rather, in macOS it looks normal and in Linux it looks too thin.


Have you tried Luxi Mono or Linux Libertine Mono?


Will try again, but it didn't look as clean and sharp as Terminus, Fixed or even DejaVu Sans Mono.


What is this FRP thing JS developers are talking about? What should I read to learn more about it?


It's been around in various forms, and implemented on top of various languages, for quite some time.

The earliest full-fledged FRP implementation for JavaScript was Flapjax – work on it dates all the way back to 2006!

http://www.flapjax-lang.org/

The paper Flapjax: A Programming Language for Ajax Applications is worth reading.

http://cs.brown.edu/~sk/Publications/Papers/Published/mgbcgb...

The principal designer and maintainer of the Elm Language (a functional reactive lang that compiles to JS) did his Master's Thesis on FRP, and that's a great resource if you want the big picture:

https://www.seas.harvard.edu/sites/default/files/files/archi...

See also Controlling Time and Space: understanding the many formulations of FRP, by the same author:

https://www.youtube.com/watch?v=Agu6jipKfYw


If you ignore the whole thing about how frp is not FRP, and how JavaScript developers really only mean frp, then a good place to start would be learning about RxJS.

FRP is deterministic and referentially transparent; frp is not. Key concepts in FRP are behaviors, events, and signals; key concepts in frp are streams, observables, and subscriptions. Although I'm no expert in either, there's a lot of overlap and term overloading, and I might be explaining this all badly anyway.


AFAICT, the simplest way of explaining the difference between FRP and "popular frp" is that in FRP things are described as functions of time, whereas in popular frp they are streams of events.

The "real" FRP is great for describing non-interactive things. They can be used to describe anumations: e.g. a FRP behavior can describe the position of a ball as a function of time:

  ballPosition :: t -> Position
whereas in "quasi" frp, the ballPosition is a stream of Position values:

  ballPosition :: (Stream Position)
Basically, quasi FRP is what you get when you sample real FRP at certain times :)
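
To make that concrete, here's a rough TypeScript sketch of the same idea. The names (Behavior, sample) and the ball's motion are purely illustrative, not taken from any particular library:

  // "Real" FRP: a behavior is a value defined at every instant in time.
  type Behavior<A> = (t: number) => A;

  // The ball's position as a continuous function of time (made-up motion).
  const ballPosition: Behavior<{ x: number; y: number }> =
    (t) => ({ x: 100 * Math.cos(t), y: 100 * Math.sin(t) });

  // "Quasi" frp: a discrete sequence of samples. Sampling a behavior at
  // chosen times is exactly the step that turns one into the other.
  function sample<A>(behavior: Behavior<A>, times: number[]): A[] {
    return times.map((t) => behavior(t));
  }

  console.log(sample(ballPosition, [0, 0.5, 1, 1.5]));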


A small part of it is that your events are basically constant streams of data (think arrays) that you can map, reduce, and filter (among other functions) and then present; there's a rough sketch of this after the links below. Here is a short article on it:

https://medium.com/@andrestaltz/2-minute-introduction-to-rx-...

This is a discussion over the controversy of calling stuff FRP in academia vs. real world:

https://medium.com/@andrestaltz/why-i-cannot-say-frp-but-i-j...
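
In RxJS terms (which is what most JS developers mean when they say FRP), the map/filter/reduce idea looks roughly like the sketch below. Treat it as illustrative against a recent RxJS version rather than exact, copy-paste API usage:

  import { fromEvent } from 'rxjs';
  import { filter, map, scan } from 'rxjs/operators';

  // Clicks arrive over time as a stream; transform it like a lazy array.
  fromEvent<MouseEvent>(document, 'click').pipe(
    filter(e => e.clientX > 100),      // keep only clicks right of x = 100
    map(() => 1),                      // each surviving click counts as 1
    scan((total, n) => total + n, 0)   // running total: a reduce over time
  ).subscribe(total => console.log('clicks so far: ' + total));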


It's a way to write applications as declarative code that defines the UI in terms of sequences of events and streams.

Basically: something incredibly powerful and clever that will not be adopted by the wider developer community until someone figures out how to present it in a way that makes it accessible to someone who isn't deeply interested in academia.


I don't think it's so much that it requires an interest in academia, but that it requires a shift in thinking towards abstractions that still aren't mainstream, relative to OOP (in its various forms) and unadorned events/listeners. Principally, that shift is toward higher-order functions. Yes, those same HOF abstractions are popular subjects for academic research and teaching, but they're not inherently academic subjects.


True, but this paradigm shift has been pending for nearly half a century and there doesn't seem to be any progress, just periodic rediscoveries.


A way to handle sequences of events - which is a really powerful abstraction. Look at this (shameless self-promotion) presentation https://slides.com/bahmutov/javascript-journey-boston-code-c... and the companion blog post http://glebbahmutov.com/blog/journey-from-procedural-to-reac... plus a repo of code that shows every style of JS, including Redux and Reactive: https://github.com/bahmutov/javascript-journey


