Shell Has a Forth-Like Quality (oilshell.org)
136 points by chubot on Jan 25, 2017 | 114 comments



I'm really excited about new shells. Fish has been mentioned here (I've used it for 2+ years). It's amazing and adds a lot of good functionality.

I also like that the author mentioned runit. I wish there was more documentation for runit, but I've been using it on Void Linux and it's an amazing, damn simple init system. If an init script is missing for any package in Void, it's trivial to write one. Most are less than three lines.
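
For illustration, a run script is just a shell script that execs the daemon in the foreground. A minimal sketch, assuming a hypothetical daemon named mydaemon:

    #!/bin/sh
    # /etc/sv/mydaemon/run -- runit restarts this script whenever it exits.
    # exec replaces the shell, so runit supervises the daemon directly.
    exec mydaemon --foreground 2>&1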

I also really like how the post author demonstrated that shell commands compose in ways that traditional functions don't; that is, commands that change the environment for the command they run (environment variables, processor scheduling, security, input/output redirection). This is the very nature of an _implicit_.
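
A quick sketch of that composition with standard tools (the jail path, file names, and command are made up):

    # each prefix command modifies the environment, then runs the rest of
    # the line; the redirections are the shell's own contribution:
    env LANG=C \
        nice -n 19 \
        chroot /var/jail \
        some-command --verbose < input.txt > output.txt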


> Fish has been mentioned here (I've used it for 2+ years). It's amazing and adds a lot of good functionality.

What does it add exactly? As far as I can tell, it's needlessly incompatible with POSIX shells, and doesn't do as much as e.g. zsh (or even bash?).

I'm not saying that zsh & bash are the end-all, be-all of shells, but I just don't see that fish improves on them.

And honestly, for config files I think we could do a lot worse than an s-expression language.


fish dev here. We love zsh and anyone pushing on what shells can do!

Here's where I think fish fits in. The main goals of fish are:

1. Thoughtful UI that just works

2. Sane scripting

zsh does more than fish, and we're happy to let it. The tradeoff is that zsh requires substantial configuration to realize that power - and once you get into the weeds, you will learn which options are incompatible with each other. zsh has 21 options controlling history, and fish has none, but you'll probably be happier with fish's defaults.

fish has a smaller feature set, but that feature set just works out of the box: tab completions, syntax highlighting, autosuggestions, etc. Customizations (prompt, colors, etc) can be done with a web UI or a nice command line interface, and not arcane syntax (PS1=...)

The hope is you can be more productive in your shell without needing to invest tons of time configuring and learning it.


I've been using fish since 2013. It's amazing how well it anticipates what I'm trying to do, and just makes life better.

The biggest amazing default is to be able to start typing and have it be searching the history. When I'm ready to start work on a project I can start typing part of the project's name, say "bar", and fish will come up with a history item for "cd /customers/foocorp/barproject".

I still do my "scripting" shell work in bash though, since it runs everywhere.


> adds a lot of good functionality.

Parent's comment is misleading, then. As someone who already has a pretty solid grasp of Bash, when I peeked at Fish it looked like there was nothing that it would gain me -- though I understand that the preconfigured shell can be useful for people who don't have the time to learn and tweak it.


If fish has a hook (har har), it would be autosuggestions. fish suggests the rest of the command as you type, like your web browser does for URLs. This feels faster than tab completions, because you can see what command you'll get ahead of time. It also lets you summon commands from history quickly, without having to think about it - just start typing.

If you're a die-hard bash user, autosuggestions are the feature most likely to pique your interest and make the command line more pleasant.


Will take a look, but 'history-search-backward' fills most of my needs in that regard.


You really should. Once I got used to the fish auto-complete history, the Bash reverse search felt terribly primitive. There is no Ctrl+R. Just type part of an old command and hit the up arrow to scroll.


> you'll probably be happier with fish's defaults.

Probably (although I use a zsh distribution to get a sane setup). I definitely appreciate the idea of sane defaults, rather than forcing the user to set everything up properly from scratch. And your autosuggestions do look pretty swell.

> Customizations (prompt, colors, etc) can be done with a web UI or a nice command line interface, and not arcane syntax (PS1=...)

You ship a web server in the shell? Is it always running, or does the user have to do something to start it up?

I kinda like setting variables, to be honest. But I'm old-fashioned.

One thing I really wish y'all'd done was use more standard shell syntax. It's really painful working in an office where someone uses fish and lets fish-isms creep into shared command libraries.


> Is it always running, or does the user have to do something to start it up?

Looks like the latter: https://fishshell.com/docs/current/commands.html#fish_config


Off topic but are you the same ridiculous_fish that used to hang around programming:1 on yahoo circa 1999/2000?


And that's exactly why I love fish, I just install it and go. Thanks!


Fish is similar to zsh in a lot of ways, but without configuration you get the best of zsh:

* Syntax highlighting

* Implicit cd to directory

* 100,000 line history

* History de-duplication

* Inline auto-suggestions

* Tab completion using man page data

* Paginated completion

* Copy/paste using X clipboard

The only things I feel the need to configure on a new fish install are PATH and my prompt, because I'm pedantic and like things exactly so.

You can get these with zsh, but it's more than apt-get install.
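
For a sense of scale, a rough sketch of the zsh config needed to approximate a few of those defaults (the option names are real zsh; the plugin names are the usual third-party ones):

    # ~/.zshrc (sketch)
    HISTSIZE=100000
    SAVEHIST=100000
    setopt HIST_IGNORE_ALL_DUPS AUTO_CD
    autoload -Uz compinit && compinit   # tab completion
    # syntax highlighting and autosuggestions come from plugins such as
    # zsh-syntax-highlighting and zsh-autosuggestions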


Why the line history limit, though?

I do not want my history to be silently deleted, ever. I would rather have random files on my disk deleted before that. Is there any shell without a line history limit?


History is limited because very large history files could make the shell slow! Shells are often used to recover from exceptional conditions (out of disk space, `rm -rf /`, etc) so it's important they be responsive no matter what.

bash's history limit is by default the last 500 commands, without considering duplicates. In practice this limit is easy to exceed, which is why bash config guides recommend upping it.

fish's history limit of 2^18 (~256k) commands is really more of a sanity check. Duplicate commands are counted only once, and commands are discarded by LRU (not oldest). This means history only starts being dropped after hundreds of thousands of unique commands. If you in fact run that many distinct commands, I'd love to hear more about your use case!


Bash is still very responsive with 150k+ history entries. The only noticeable performance impact is increased startup time when the history file is not in disk cache.


I made a hack that gives me an SQLite database of command line history for bash https://github.com/trengrj/recent


You may want to read through this [0], for a better understanding of why these limits exist, and how the high limit in fish might be enough for you.

[0] https://github.com/fish-shell/fish-shell/issues/2674


Um, I'd rather not have random files on my disk deleted. :P I'd rather have a history line limit in my shell.


In bash, you can remove the limit by setting:

    export HISTSIZE=-1        # no limit on in-memory history
    export HISTFILESIZE=-1    # no limit on the history file
When the history file is not in the disk cache, startup time will increase noticeably after a while though (think a second, maybe two).


What would more documentation for runit look like? What do you feel is missing?


If you are interested in a shell with first-class function, list and map values and pipelines that pass such complex data (instead of structureless bytes), you might be interested in elvish: https://github.com/elves/elvish

Disclaimer: I am the author of elvish.


Hi xiaq I mentioned your shell here:

https://github.com/oilshell/oil/wiki/ExternalResources

If there is a better summary, let me know, or feel free to edit the wiki page yourself. (I think it should be editable.)

I've been in contact with the authors of the oh shell, NGS shell, and mash shell, and exchanged some interesting ideas. The design of the oil language should be more concrete in 2-3 months and I'm hoping to get feedback from others who have thought about this space. I might ping you then :)

Are the elvish structured data pipelines actually separate processes, or are they goroutines? Is there a format for piping them over say SSH? I have been thinking about this a little bit for oil.


Hi chubot, I didn't realize that you are developing a shell as well. It's great to see competition. Can you share some ideas that you learned from the authors of said shells? I would love to hear them; designing shells is not an easy task.

Structured data pipelines run in the same process as elvish, in goroutines. For interacting with external commands you need to serialize and deserialize, like this:

  ~> put [foo bar lorem ipsum] | to-json |
     python3 -c 'import sys, json;print(json.dumps([x.upper() for x in json.load(sys.stdin)]))' |
     from-json
  ▶ [FOO BAR LOREM IPSUM]


OK cool, that makes sense.

I had some good discussions with Michael MacInnis and recently Ilya Sher. I had actually read Michael's master's thesis in 2011 or 2012 and found it to be a very good historical overview of shells:

https://scholar.google.com/scholar?cluster=99934401164441475...

I agree with the goal of "taking shell seriously as a programming language", which I think all projects share. I disagreed on the need to make shell a variant of Lisp. I actually tried this a bit -- I hacked on femtolisp, which is used to bootstrap Julia, and I thought I could bootstrap a good shell with it. This didn't work out as nicely as I had hoped so I dropped the idea.

Ilya was having some issues with the "two modes" of parsing -- command mode for shell-like constructs and expression mode for programming features (function calls, math, etc.). I thought this would be an issue too, but as mentioned in another comment, now that I've written a bash parser I think it's straightforward to solve, and Ilya agreed.

The thing is that bash already has many different modes of parsing, which I implement using "lexer modes" or lexical state. This same technique can be applied to a new shell and then there should be a natural way of mixing commands and expressions (quoted and unquoted words, etc.):

http://www.oilshell.org/blog/2016/10/19.html
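
To make the "modes" idea concrete, a single bash line already hops between several sublanguages, each lexed differently (example mine):

    # command mode, then ${} operator mode, then $(( )) arithmetic mode,
    # then $( ) re-enters command mode -- several lexer modes in one line:
    echo "${PATH:-/usr/bin}" $((1 + 2)) "$(basename /tmp/x.txt)"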

I'm also trying to make oil do what bash/dash/etc. do essentially syscall for syscall. I believe that is a little hard with Go because it has its own runtime. Although I think there is a way to pin a goroutine to a single thread which can mitigate those issues.

I was planning to start a thread in maybe a month to review some of these issues and exchange feedback. Let me know if you are interested!


Totally! Bernstein chaining is concatenative. So are shell pipelines, if you squint at them just right: function composition mediated by special stacks of records called stdin and stdout.

Reminds me, I have a pretty neat tail-recursive shell pipeline trick I need to share with HN!


I'm interested in the tail recursive pipeline trick!

The next post mentions that pipelines are also concatenative, although there is a slight difference and I think "point-free" is more accurate.

"Pipelines Support Vectorized, Point-Free, and Imperative Style"

http://www.oilshell.org/blog/2017/01/15.html


Nice. I'm loving this series, keep it up!


Please share it!


I think it would be interesting to have an init system and/or package manager configured using a logic programming language, such as a Prolog dialect.

For instance, one could write "is_installed(X11) :- is_installed(KDE)" to specify that X11 must be installed if KDE is installed.


I actually had this thought w.r.t. makefiles after seeing a talk on the Meson build system[1] on YouTube.

As a thought experiment:

- Let distros, package maintainers, repository maintainers and the person doing the build define boolean flags (like "UNIX", "Linux", "ARM", "ARMv7h", "x86", "x86_64", "POSIX_Filesystem", "GCC", "Clang", "C++11", "C++14"... anything you could think of). Later definitions override earlier ones (in the order above, i.e. build > repo > distro). Allow deriving flags for simplification: "ARMv7h -> ARM", "Windows && ARM -> Use_some_WinRT_Hack" etc.

- Use a Prolog-Like system in which you can state rules for internal flags deciding if/which files should be included. Same goes for compiler flags.

- Add an option to add definitions for external projects that can be included automatically from their sources/binaries (using the same system; like Meson does).

- Just implement the "download/bootstrap" part and the logic solver. Generate ninja or tup files.

That should make an interesting build system. Of course, the maintaining of source file dependencies (which can be dynamic due to header files) and the versioning of libraries would still be problems that are not easy to solve. But at least, one could get rid of the mess that is pretty much every CMake build there is.

[1]: https://github.com/mesonbuild/meson


Nice idea; it's worth noting that Prolog can be used to specify the steps for building the program, not just the configuration.


I'm not sure how much of it is left today, but there is precedent for Prolog in OS startup:

http://www.redditmirror.cc/cache/websites/web.archive.org_84...


I wish that logic programming languages (and especially the one used in that article) had more friendly syntax. Even just replacing ":-" with "<-" would be nice...


Actually, Prolog uses "-->" to define grammars[1] for parsing. Not confusing at all! :)

(Actually, parsing is one of the things I really love about Prolog. Writing a simple grammar is much easier and much more pleasurable than using things like regular expressions. The fact that it's far more powerful and readable is just a bonus.)

[1] https://en.wikipedia.org/wiki/Definite_clause_grammar


Are there two layers of caching in that URL?


Nix (and Guix) uses a declarative approach to all things OS, including package building, managing, dependencies, deployment...

It's functional and not logic though:

https://nixos.org/


I think logic programming is a better fit, because package configurations (and service unit files) are more like descriptions than functions.

But Nix is certainly an improvement over traditional, imperative package management.


Guix is implemented in Guile Scheme.

Lots of the little DSLs used are in fact the descriptive abstractions you mention.

If not, it's quite easy to implement Prolog or any other logic language like Kanren in Scheme. It's been done many times.


What would be the advantage over the current system of having dependencies as data and running solvers on it?

Turing-complete package metadata has some obvious disadvantages too.


Moreover, the use of boolean operators (edit: and higher-order predicates) in such statements would provide a great level of flexibility.


I'm sure the systemd people are working on one.


Actually, it might be possible to translate (a subset of) Prolog syntax into systemd unit files. Hmm...

(edit: incidentally, I've also had the same thought about CSS selectors.)


Easy way to do file descriptor passing ("Bernstein chaining") with any UNIX programs, not just ones written by djb:

http://skarnet.org/software/execline/pipeline.html

Execline resembles a "program launcher".
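
The idea works with ordinary tools too: each program in the chain adjusts one thing and then hands off (via exec) to the rest of its argument list. A sketch with standard utilities (the daemon name is made up):

    # env sets a variable, nice adjusts priority, setsid starts a new session;
    # each one runs the remainder of its arguments as the next command:
    env LOGDIR=/var/log nice -n 10 setsid mydaemon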

We can make a braindead launcher in a few minutes using the sh language. This is just the first way that came to mind.

   #!/bin/sh
 
   test $# -ge 1||
   exec echo usage: $0 program arguments
   test $# -le 5||
   exec echo max 5 arguments
   
   {
   a=$#;
   b=$(command -v $1);
   
   echo 'main(){
   char *b['$((a+1))'];'
   
   a=$((a-1));
   test $a -ge 0||
   exec echo 'b['$#']=(void *)0;
   execve("'$b'",b,(void *)0);
   }'  
    echo 'b[0]="'$1'";'
    
    # repeat
    
   a=$((a-1));
   test $a -ge 0||
   exec echo 'b['$#']=(void *)0;
   execve("'$b'",b,(void *)0);
   }' 
   echo 'b[1]="'$2'";'
   
   a=$((a-1));
   test $a -ge 0||
   exec echo 'b['$#']=(void *)0;
   execve("'$b'",b,(void *)0);
   }'
   echo 'b[2]="'$3'";'
   
   a=$((a-1));
   test $a -ge 0||
   exec echo 'b['$#']=(void *)0;
   execve("'$b'",b,(void *)0);
   }'
   echo 'b[3]="'$4'";'
   
   a=$((a-1));
   test $a -ge 0||
   exec echo 'b['$#']=(void *)0;
   execve("'$b'",b,(void *)0);
   }'
   echo 'b[4]="'$5'";'
   
   a=$((a-1));
   test $a -ge 0||
   exec echo 'b['$#']=(void *)0;
   execve("'$b'",b,(void *)0);
   }'
    echo 'b[5]="'$6'";'
    
    } #| exec gcc -xc - -o run -static  \
    && exec run

re: envdir, storing variables in the filesystem

"no stinkin loops"

or "stupid shell scripter" PoC, take 2

   #!/bin/sh
   # e is a dir of "environmental variables"   
   # e is mounted on tmpfs i.e., it's memory not disk
   # output:
   # ==> 0
   # ==> 1
   # ==> 2
   # ==> 3
   # ==> 4
   # ==> 5
   test -f e/$0||echo 0 > e/$0
   read n < e/$0
   test $n -le 5||exec rm e/$0
   echo ==\> $n
   echo $((n+1)) > e/$0
   test -f $0||exit # just in case
   $0


As someone not well versed in OS history / design:

Why are terminals NOT just REPLs for regular programming languages with OS_Convenience_Functions #include'd automatically?


From a design standpoint: because the ergonomics of shells and regular programming languages are very different. 99% of your interactions with a shell are going to be single commands, or short piped chains. Economy of keystrokes is something that a lot of people want here, which is why cd and ls are both that short to type. Most programming languages require at least () to run commands, since not requiring that seems sloppy. This makes your two most common interactions twice as long to type.

Shells have also been built to help glue applications together, and are more concise for the easy versions than most programming languages, sometimes heavily so. Most programming languages, (outside of maybe perl and ruby), don't make this use case anywhere close to their top priority, but shells do, trading off other things like a simple syntax for bitwise OR.
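
As a toy example of the glue use case: this is one short pipeline in sh, while most general-purpose languages would need explicit process spawning and plumbing for each stage:

    # count the login shells in /etc/passwd, most common first
    cut -d: -f7 /etc/passwd | sort | uniq -c | sort -rn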

In another sense, shells are exactly what you describe, it just happens that the most popular languages for them happen to be sh/bash/zsh/fish etc, which make a different set of tradeoffs than the 'traditional' programming languages.

I'm only partially aware of the history standpoint. My understanding is that before awk, shells (sh, mostly, IIRC) were about the only scripting level languages on various Unixen.


General programming languages and shells have different priorities. Specifically, a shell programming language should be designed to make launching external applications and processing their output the top priority. The language also needs to focus on ease of interactive input, so much so that the REPL and the language are somewhat coupled. You wouldn't have a command in a general purpose programming language to execute the same line of code as you did 5 lines ago; that would be worse than goto. However, in a shell, such a thing sounds nice. The shell also needs to make it convenient to deal with some OS level objects, such as files. This is where glob syntax comes in handy.

This all cascades to make a language that looks much different than you would want to write a full application in. One of the primary reasons is because of how shells need to handle strings. It is much more convenient to not type out quotes around string literals and it is nice to have syntax to reference environment variables, files, previous commands, etc. This leads to extremely complex quoting and expansion rules in most shells. It is not intuitive, but once learned it makes for a great interactive experience (compared with traditional languages).
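
A small illustration of the quoting trade-off described above (example mine):

    msg='hello world'
    echo $msg       # unquoted: expands, then splits into two words
    echo "$msg"     # double quotes: expands but does not split
    echo '$msg'     # single quotes: literal, prints $msg
    ls *.txt        # glob: the shell expands this to matching filenames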


Like Xonsh? http://xon.sh/

Previously discussed two years ago here: (https://news.ycombinator.com/item?id=9207738)


TempleOS does just that. What you type at the command line is the TempleOS systems language (Holy-C if I recall), it's compiled and executed. The syntax of Holy-C is such that it's not too onerous as a shell language.


I like the answer yumaikas gave. I would say that the shell IS what you're saying: a REPL for a programming language. But it is a weird programming language.

The language has evolved with compatibility since the original release in 1971, before it had loops, conditionals, or functions. That's before yacc and probably before some common parsing algorithms were even invented. It is a proper programming language by now, but it just feels old and crufty.

I think it needs a few more OS bindings as well. The "test -f / -d" stuff is a little impoverished.


Is there a good place for discussing Oil with the author (mailing list, irc &c.)? I have thought a lot about many things in the blog posts, but there's no comment section, and I'd really like to get clarification as well as dive into a bit more detail on some of these things.

[edit]

Found it here:

http://www.oilshell.org/blog/2016/12/29.html


How hard would it to be to make an entirely new shell language that doesn't feel like it's 30+ years old? Why isn't this more common?

I'm aware of, like, TermKit[0], but that didn't get finished and I don't recall hearing about any newer efforts.

[0]: https://github.com/unconed/TermKit


I admire Unix, but I also feel things are getting old.

I have a minimal Unix setup, with lots of little programs that do one thing and do it well. For example, mutt is my MUA (mail user agent). It just reads my emails, but fetching or sending is carried out by other programs. Fine.

Now, the issue is that every little Unix utility that is non-trivial comes with its own little DSL for configuration. Most are inflexible, naive and different from each other. It's hard to script mutt. If you want to do some stuff the author hasn't planned for, you are better off patching the code. Flexibility and composability are not as good as they preach.

For this reason, I am migrating everything to emacs. It offers me what Unix only promises but falls short at. Everything is an elisp function and thus easy to combine. Things are more organic. And still minimal.


> I admire Unix, but I also feel things are getting old.

" We really are using a 1970s era operating system well past its sell-by date. We get a lot done, and we have fun, but let's face it, the fundamental design of Unix is older than many of the readers of Slashdot, while lots of different, great ideas about computing and networks have been developed in the last 30 years. Using Unix is the computing equivalent of listening only to music by David Cassidy."

Rob Pike, https://interviews.slashdot.org/story/04/10/18/1153211/rob-p...


Yes, it's a very good quote. It's sad most Plan 9 and Inferno innovations, just to name two further developments of Unix, have been ignored.

But overall, I am more comfortable on live environments like Smalltalk or Emacs. There's just one fundamental abstraction, and things can be composed beautifully.

In Unix, on the other hand, everything is not a file. And even if it was, little applications like mutt are silos. The code is static, it can't be easily changed or opened. A file interface, in the form of STDIN/STDOUT is too unstructured.


> But overall, I am more comfortable on live environments like Smalltalk or Emacs.

After Amiga, Atari, Acorn and Windows, getting to learn UNIX ideas seemed quite interesting (I started with Xenix).

However, once at the university I got the opportunity to dive into the work of Xerox PARC and ETHZ, and the more I learned about them, the more I came to realize that I actually cared more about their vision of how computing should look than about anything I ever used in UNIX variants.

Of course, Plan 9, Inferno and C+@ are other Bell Labs ideas that would have improved the overall UNIX experience, but as you say, have been ignored.


Because there are many more than 30 years of learnings in the current shells that people ignore when building substitutes.


Yes I agree... that's why I took the trouble to write a very complete bash parser, which taught me a few things, and improved the Oil language design.

Some features are plain mistakes, but some of them are useful things that newer shells aren't taking into account (particularly with regard to file descriptors). I'm aiming for a superset of bash semantics, but with different syntax. Importantly, it will be backward compatible, because bash can be converted to oil automatically. (I'm working on that now, not released yet.)

http://www.oilshell.org/blog/2016/11/09.html

The rest of the blog has a bunch of shell corner cases, like:

http://www.oilshell.org/blog/2016/11/06.html

http://www.oilshell.org/blog/2016/10/28.html

Although I probably only blogged about 20% of the weird stuff I found.


Just wanted to say that I really like the ideas you've expressed on the blog. I've found myself thinking similar thoughts, such as about combining shell, awk, make into one language. Keep it up!

I'll add that my ideal "command and control language" (as I've been thinking of it) would also be one that's strong at expressing data literals. If there are a number of data types like maps, sets, lists, etc., then it should be easy to initialize them with literals (not true in languages like Java). There are so many configuration file formats that I wonder if they could be subsumed into files with literals expressed in this language (perhaps with a "data mode" to prohibit executable code).

I've been wanting to design a language along these lines as well. I'll start by learning about oil, and reach out if I'm interested and able to contribute!


Thanks! Yes I plan to have data literals -- it's basically going to be JSON, because that is sort of the least common denominator between JS, Python, Ruby, Perl, etc. Shell is a glue language, and JSON is pretty natural at this point.

There is also going to be influence from R and CSV files (which goes with awk).

I have thought about the config problem a lot -- and a configurable sandboxed "data mode" you are talking about is probably what I will go with.

If you squint, the oil shell will look not unlike nginx configs, e.g. a bunch of words with nested {} for blocks. Maybe like https://github.com/vstakhov/libucl or https://github.com/hashicorp/hcl (hm this seems to even have here docs!)

This kind of thing was done a lot with Python at my last job, to various degrees of success (e.g. https://bazel.build/ - the build language is derived from Python). Python sort of has some "data mode" sandboxing features, like being able to set __builtins__ when you exec/eval, but probably not enough. It didn't work well enough to prevent people from writing their own config file parser eventually. The syntax is close but not exactly what you want for some use cases.


I'm curious how far you are from just dropping to sexps. While I will not claim that Lisp is the be-all of languages, when you have a literal syntax that is literally made for linked structures, it becomes a lot easier to represent whatever you want to do. And it can keep you from having tons of "pseudo languages" that are used for different structures.


I've mentioned this a few places so I should probably write a post about it, but I did experiment with Lisp and shell (using femtolisp which is used to bootstrap Julia). There is an obvious similarity in that they both have prefix rather than infix syntax. Tcl explored this design space pretty fruitfully as well.

The short answer is that the experiment didn't work very well. If you want a shell based around Lisp, the new "oh shell" is closer to that. It has homoiconic syntax, which I don't agree with. I talked about that a bit here:

http://www.oilshell.org/blog/2016/10/28.html

The oh shell is influenced by the es shell, which was almost literally a Lisp. There is also Eshell (Elisp) on my GitHub wiki, and the earlier Scheme shell. So without going into too much detail, I think the idea has been tried and it has failed not by accident, but for fundamental reasons.

That's not to say that Lisp isn't hugely influential. Julia and Elixir both have very Lisp-like metaprogramming which I hope to take some inspiration from. Without having tried it in detail yet, I think the lesson is that in 2017 you don't need homoiconicity to have metaprogramming. And another lesson is that people like syntax (me included!)

The shell especially needs syntax because it will often be typed in a terminal as opposed to in an editor with help.


Actually coming back to this. Eshell is an odd example for you to pick. I use eshell daily. It's my only shell, if I can get away with it. However, it is not a "lisp" shell. It is simply a shell buffer in emacs that is heavily integrated with many emacs features. Tramp, in particular, is nicely integrated, such that I am beginning to forget scp and related notations. (That is actually a bad side effect.)

Yes, you can inline lisp expressions. But, I constantly paste in standard shell exports and they work exactly as you would expect them to.

Again, we agree that lisp is not necessarily a good shell language. Just, EShell seems a poor example to look at when making this argument.

Edit: Where is the wiki you referred to? In particular, where you discuss eshell.


I fully agree that lisp is not the answer for a shell. My question was only for your data literal syntax.

Specifically, maps as association lists are very natural in lisp. So are many other structures. And JSON is easily seen as a crappy sexp. XML is easily a complicated sexp. Python's pickling is a complicated eval.

And I get it. Some things have a more "natural" syntax that isn't postfix. Even in common lisp, the reader macro is often used for infix things. However, when the examples used are easily translated to sexps, and back, it makes me wonder why we don't think programmers could learn to write them.


The literal syntax is going to look more like JSON, like Python and JavaScript have. I don't see what's better about association lists (assoc (k v) (k2 v2)) vs. {k:v, k2: v2}.

I mentioned in that blog post that Clojure introduced meanings for [] and {} in a Lisp, and I generally view that as a good thing.

Also if that syntax is only used for data literals, and not code, then it loses the whole Lisp paradigm and I wonder why you would want that syntax.


Well, the "assoc" is only the function to traverse an alist. So, that isn't needed there. And typically, for literals you'd just quote the list. So you are comparing ((k v) (k2 v2)) with {k:v, k2:v2}.

My gripe with the second will be simply that you will retraverse a ton of corner cases and other encoding issues in order to have your literal syntax. More, I will probably be required to have yet another batch of tooling to support programmatically generating these literals.

At the end of the day, probably won't matter. Odds of success here are already low. So, I can see siding with emotional aesthetics for a syntax. I just face palm with all of the tooling that goes out the door simply because people are averse to parens.


I don't follow... {k:v, k2:v2} is not difficult to parse or generate. Based on the ubiquity of JSON parsers, it's been done dozens of times in every language under the sun.

I don't really see a paucity of tools for JSON or Python, unless there is something special you're looking for.

The larger issue of metaprogramming is important, but that would require code to be represented uniformly too, not just data. I'm looking into Julia and Elixir metaprogramming, which take advantage of a uniform Lisp-like representation, but also have rich syntax.

I would say it's 2017 and we should be able to have BOTH syntax and metaprogramming/tools. I have a lot of posts about parsing on my site. If there's a problem with parsing something, I prefer to fix the parsing tools rather than mangle the language's syntax to fit an ancient model.


If you pick JSON explicitly, that is fine. But, I will note that {k:v, k2:v2} is already not JSON. It is JSON-like, but needs quotations and whatnot.

The year is just an appeal to emotion. It would be nice if we could have and eat cake. Empirically, things haven't shaken out in that direction. And it wasn't long ago that there were a host of face palms for JSON parsing out there. Things are getting better, in some regards, but adding another JSON like parser to the fold will only make that worse.


> Clojure introduced meanings for [] and {} in a Lisp

Lisp was using [] and {} before in a lot of software.

For example this is a data structure in Connection Machine Lisp called Xapping, basically a parallel hash table:

    {moe->"Oh, a wise guy, eh?" larry->"Hey, what's the idea?" curly->"Nyuk, nyuk, nyuk!"}
Straight from the 80s.

Scheme was using [] also many years before in the standard, programs and books.


My thought is that the confusion is most people are taught that lisp has no syntax. And that they should avoid the reader macro at all costs. Few, then, actually dive into large programs in lisp to see how real programs actually look.

In this example, though, what was the advantage of the special syntax? (I'm assuming there was one. Just a lot of these parallel hash tables written in literal form?)


> taught that lisp has no syntax

Which is wrong. It's just that Lisp syntax looks and works a bit different. Especially there is a data syntax for s-expressions and the syntax for the programming language Lisp is defined on top of s-expressions.

> And that they should avoid the reader macro at all costs.

Common Lisp itself uses read macros for its implementation. Applications use them in various ways.

> In this example, though, what was the advantage of the special syntax?

Xappings were a central data structure in Connection Machine Lisp, thus it's not unusual that it had a printed representation.


Agreed on your points. Especially that it has syntax. But look at the linked post upthread. I said "no" syntax, but that was me being admittedly lazy. Quote there was "Lisp is a language with little syntax". Which has me wondering why it has that general view with everyone.

For the representation.. So this was not simply a literal syntax, but a serialization one. Which makes sense. And I can see why the serialization syntax is easily usable as a literal one.


GCC's C parser is a 550 kilobyte file that is almost 19000 lines long. The C++ one is over 1.1 megabytes long, and close to 39,000 lines. Github refuses to display it as code, only raw.

Lisps don't have anything that even comes close.


Since many/most Lisp macros implement syntax, there can be a lot of syntax in Lisp, too. The parsing of the syntax is distributed over the macro definitions, sometimes with some help of the general macro mechanism.

See for example the syntax which is implemented by macros like LOOP or ITERATE. The LOOP syntax is documented in the ANSI CL spec...

http://www.lispworks.com/documentation/lw51/CLHS/Body/m_loop...


Macros do not have to deal with syntax at the level of "how does this sequence of tokens reshape into a tree". (Not usually; exceptions are easily contrived and exist in the wild.)

Usually, the syntax is already parsed when it comes into a macro.

E.g.:

   (sentence-macro subject object verb)
sentence-macro doesn't have to guess what part of the utterance is the object and which is the verb. The verb is the third argument, and that is that. This is the case whether it is a single word, or a phrase.

Though this is still syntax, it is parsed syntax.

When we say that Lisp has no syntax, it means that it doesn't have that silly, counterproductive stuff naively imitated from natural languages which represents the tree in a way that requires mind-bending work to recover.


You should probably check out the LOOP macro sometime. Examples include:

    (loop for i from 0 to 10 
          for j from 10 downto 0 
          collect (cons i j))
That is a fully legit expression. So, if you consider LOOP contrived, then you have a point. It is a very useful and a very powerful macro, though. Not sure why it wouldn't count.


LOOP does speak to what you can do with macros. Obviously, a macro can interpret its arguments however it wants. You can implement a language that requires GLR (or worse).

LOOP is also not universally loved in the Lisp world. It's fine if you don't need to extend it. Not only is it inherently inextensible (for no good reason), but it's hostile to wrapping. If you write a MYLOOP macro which passes most of its arguments to LOOP and adds a few clauses of its own (translated to LOOP syntax), you have to parse the LOOP syntax to see where your clauses are and in which order in relation to the standard ones. (Unless you do something hacky or ugly, like recognize your extensions by some delimiters, without regard for the surrounding syntax, so that they don't actually blend in.)

LOOP is the only high level control construct I've ever seen anywhere in which you can achieve nonportable behaviors: because of the construct itself, not because of nonportable expressions plugged into it.

The treatment of clause symbols as character strings, so that CL:FOR and MYPACKAGE:FOR can both head off a for clause is a design smell. LOOP should have been designed to use symbols in the CL package, so that if someone wants to use APPENDING, they nicely import that symbol or pick it up via :use.

LOOP asks you to use different syntax for walking a vector and list, when the rest of Lisp has a sequences library that lets you use common functions for both. Supporting the specialized syntax is fine (just like the library also has list and vector specific things), but the lack of a generic sequence iterating clause shows a discord with the rest of the language. It didn't require much imagination to have, say a FOR x OVER <list-string-or-vec>.

We can have a perfectly good Lisp dialect without any sort of mini-language parsing macro like LOOP. It illustrates what you can do with macros, not what is usually done with macros. Macros usually leverage their tree structure and destructuring not to do any parsing work, and focus on the transformation and its semantics.


> Not only is it inherently inextensible (for no good reason)

Various LOOP implementations are extensible.

> Macros usually leverage their tree structure and destructuring not to do any parsing work, and focus on the transformation and its semantics.

Untrue. Many macros need custom destructuring or even walking the code in some way.

> We can have a perfectly good Lisp dialect without any sort of mini-language parsing macro like LOOP.

Sure: basic Scheme, but then Scheme has its own complex looping constructs like http://wiki.call-cc.org/eggref/4/foof-loop .

Iteration macros like LOOP, ITERATE, FOR and others are very convenient and powerful.


I mostly agree with you, except that custom destructuring and code walking do not imply parsing.

Destructuring is just how we access the tree (an already parsed object). Sometimes the tree is too complicated for the simple destructuring performed by destructuring macro lambda lists, so we have to do things like walk substructures and apply DESTRUCTURING-BIND or use some pattern matching library or whatever. Example: just because I have some MAPCAR over a list of variable-init pairs (which qualifies as "custom destructuring") doesn't mean I'm parsing syntax that hasn't been parsed. I'm just destructuring a syntax tree that hasn't been destructured. Destructuring isn't parsing. Destructuring is not even mentioned in papers and textbooks on compilers; it falls into the bucket of somehow walking the tree, which falls under semantic analysis.

Code walking is ultimately done to the expansion of every macro. COMPILE and EVAL perform code walking. No textbook on compilers will refer to code walks over an AST as part of the parsing stage.

Of course the point is valid that Lisp macros sometimes take a flat list of items and apply recursive phrase structure rules to recover a tree (or small subtree, as the case may be). This is rare, and basically a last resort device; if you're doing that, you're writing some sort of "big deal" macro. Instances of it are rare in Common Lisp, and I don't suspect it is done frequently in Lisp programs. It's great that we can do that. It's also great that because of the way the language works, we can accomplish a lot without having to do that.

Speaking of ITERATE, its parsing of clauses is trivial, because it uses Lispy syntax. Dealing with (for var = expr) versus (for var initially expr then expr) versus (for var first expr then expr) is "quasi parsing". OK, we have a FOR. What is the third item? Switch on a bunch of cases: =, INITIALLY, FIRST, .... If it's unrecognized, then signal an error. In each of these cases, certain things are fixed by position in the syntax already. This is "parsing" only in the sense that Unix people refer to trivial command-line argument processing as "parsing". (Only a few utils do actual parsing of a recursive grammar, examples being find and tcpdump.)


> Macros do not have to deal with syntax at the level of "how does this sequence of tokens reshape into a tree"

That's not syntax. Syntax is concerned with whether a sequence of words forms a valid expression in a language, and determines syntactic categories for these. You can look up a better definition of syntax; I'm too lazy.

These are all DEFUN forms. Some are valid Lisp, some are not.

   (defun foo () ()) is valid

   (defun () foo ()) is invalid

   (defun () () foo) is invalid

   (defun (foo) () foo) is invalid

   (defun foo () foo) is valid

   (defun () foo foo) is invalid

   (defun foo foo ()) is invalid

   (defun (setf foo) (setf foo) foo) is valid

   (defun foo (setf foo) (setf foo)) is invalid

   (defun (setf foo) foo (setf foo)) is invalid
Just by changing the order of subforms in a macro form we can produce valid Lisp and invalid Lisp forms.

   (defmacro defun (name args &body forms) ...)
Does not tell you that. You need to implement that logic in the macro somewhere.

> sentence-macro doesn't have guess what part of the utterance is the object and which is the verb. The verb is the third argument, and that is that.

Macros implement more complex syntax. Check the ANSI CL specification and its EBNF syntax declarations some time.

> When we say that Lisp has no syntax

Then it's just wrong and misleading.

Take the DEFUN macro:

the EBNF (Extended Backus-Naur Form) syntax definition for DEFUN is:

    defun function-name lambda-list [[declaration* | documentation]] form*
Is

    (defun (bla blub) (&rest foo &optional bar)
                  (declare fixnum) "oops" ((((fooo))))))
a valid defun expression ????????????????

The macro has to check that. It had better reject invalid expressions. It also has to look at the elements of the form to destructure them in the right way, so that it can process them and create a new valid form.

The Lisp implementation provides for the implementation of DEFUN as much as:

    (defmacro defun (spec args &body body ...) ...)
The macro language does not allow further specifications of the name, the arglist, or the body in the macro interface. All it gets are spec, args and body. Now the macro has to implement the syntax for spec, args and body.

Questions the macro has to answer:

* is the function name a symbol or a list of the form (setf foo)?

* is the arglist a valid lambda-list? Now check the EBNF syntax for lambda lists with whole/optional/key/rest options with default values and what have you.

* now it has to parse the body:

* is the declaration valid? Now check the EBNF syntax for declarations to see what needs to be done.

* is the documentation a string at the right position?

* is the body a sequence of forms? Now check the EBNF syntax for FORM.

This all has to be baked into the DEFUN macro somehow or checked from there. And not all implementations are good at it.

Various syntax errors:

The name is not valid:

    CL-USER 14 > (defun (foo bar) baz (list))

    Error: (FOO BAR) is neither of type SYMBOL nor a list of the form (SETF SYMBOL).
      1 (abort) Return to level 0.
      2 Return to top loop level 0.

A symbol is not a valid lambda list:

    CL-USER 16 > (defun (setf bar) baz (list))
    (SETF BAR)

    CL-USER 17 > (compile '(setf bar))

    Error: Invalid lambda list: BAZ

Keyword argument is wrong:

    CL-USER 19 > (defun foo (&key ((aa))) (list))
    FOO

    CL-USER 20 > (compile 'foo)

    Error: Malformed (keyword variable) form in &key argument ((AA))

Wrong declaration:

    CL-USER 24 > (defun foo (&key (aa)) (declare inline-function foo))

    Error: Alist element INLINE-FUNCTION is not a cons or NIL

Wrong form:

    CL-USER 26 > (defun foo (&key (aa))
                   (declare (inline foo))
                   (((lambda ()
                       (lamda () ())))))
    FOO



    CL-USER 27 > (compile 'foo)

    Error: Illegal car ((LAMBDA NIL (LAMDA NIL NIL)))
              in compound form (((LAMBDA NIL #))).

And so on.

An INFIX macro:

    (infix 3 + 2 ^ 5)
It has to implement infix syntax.

If it is still not clear, below is the syntax for the LOOP macro. The LOOP implementation has to implement the syntax, so that

    (loop for i below 70 do (print i))
is recognized as a valid program and it better detect that

    (loop do (print i) for i below 70 )
is not a valid program, because it violates the syntax below.

    The ``simple'' loop form:

    loop compound-form* => result*

    The ``extended'' loop form:

    loop [name-clause] {variable-clause}* {main-clause}* => result*

    name-clause::= named name 
    variable-clause::= with-clause | initial-final | for-as-clause 
    with-clause::= with var1 [type-spec] [= form1] {and var2 [type-spec] [= form2]}* 
    main-clause::= unconditional | accumulation | conditional | termination-test | initial-final 
    initial-final::= initially compound-form+ | finally compound-form+ 
    unconditional::= {do | doing} compound-form+ | return {form | it} 
    accumulation::= list-accumulation | numeric-accumulation 
    list-accumulation::= {collect | collecting | append | appending | nconc | nconcing} {form | it}  
                         [into simple-var] 
    numeric-accumulation::= {count | counting | sum | summing | } 
                             maximize | maximizing | minimize | minimizing {form | it} 
                            [into simple-var] [type-spec] 
    conditional::= {if | when | unless} form selectable-clause {and selectable-clause}*  
                   [else selectable-clause {and selectable-clause}*]  
                   [end] 
    selectable-clause::= unconditional | accumulation | conditional 
    termination-test::= while form | until form | repeat form | always form | never form | thereis form 
    for-as-clause::= {for | as} for-as-subclause {and for-as-subclause}* 
    for-as-subclause::= for-as-arithmetic | for-as-in-list | for-as-on-list | for-as-equals-then | 
                        for-as-across | for-as-hash | for-as-package 
    for-as-arithmetic::= var [type-spec] for-as-arithmetic-subclause 
    for-as-arithmetic-subclause::= arithmetic-up | arithmetic-downto | arithmetic-downfrom 
    arithmetic-up::= [[{from | upfrom} form1 |   {to | upto | below} form2 |   by form3]]+ 
    arithmetic-downto::= [[{{from form1}}1  |   {{{downto | above} form2}}1  |   by form3]] 
    arithmetic-downfrom::= [[{{downfrom form1}}1  |   {to | downto | above} form2 |   by form3]] 
    for-as-in-list::= var [type-spec] in form1 [by step-fun] 
    for-as-on-list::= var [type-spec] on form1 [by step-fun] 
    for-as-equals-then::= var [type-spec] = form1 [then form2] 
    for-as-across::= var [type-spec] across vector 
    for-as-hash::= var [type-spec] being {each | the}  
                   {{hash-key | hash-keys} {in | of} hash-table  
                    [using (hash-value other-var)] |  
                    {hash-value | hash-values} {in | of} hash-table  
                    [using (hash-key other-var)]} 
    for-as-package::= var [type-spec] being {each | the}  
                      {symbol | symbols | 
                       present-symbol | present-symbols | 
                       external-symbol | external-symbols} 
                      [{in | of} package] 
    type-spec::= simple-type-spec | destructured-type-spec 
    simple-type-spec::= fixnum | float | t | nil 
    destructured-type-spec::= of-type d-type-spec 
    d-type-spec::= type-specifier | (d-type-spec . d-type-spec) 
    var::= d-var-spec 
    var1::= d-var-spec 
    var2::= d-var-spec 
    other-var::= d-var-spec 
    d-var-spec::= simple-var | nil | (d-var-spec . d-var-spec) 
    Arguments and Values:

    compound-form---a compound form.

    name---a symbol.

    simple-var---a symbol (a variable name).

    form, form1, form2, form3---a form.

    step-fun---a form that evaluates to a function of one argument.

    vector---a form that evaluates to a vector.

    hash-table---a form that evaluates to a hash table.

    package---a form that evaluates to a package designator.

    type-specifier---a type specifier. This might be either an atomic type specifier or a compound type specifier, which introduces some additional complications to proper parsing in the face of destructuring; for further information, see Section 6.1.1.7 (Destructuring).

    result---an object.


If you think that's bad, just representing C++, not even parsing it, in Clang, takes 61K lines of headers!

If you look at these files it's mostly declarations and one-line functions. It's not even doing anything. The lexers and parsers are a whole other story, certainly greater than 19K lines combined.

    ~/src/cfe-3.8.0.src/include/clang/AST$ wc -l *.h
       2742 DeclObjC.h
       2809 RecursiveASTVisitor.h
       2927 DeclTemplate.h
       3198 OpenMPClause.h
       3249 DeclCXX.h
       3800 Decl.h
       4154 ExprCXX.h
       4942 Expr.h
       5723 Type.h
      61525 total


But that is a different claim, isn't it? I would expect that the general parser for a CL implementation is also quite large.

My question is why the teaching so often goes that there is practically no syntax in Lisp. It is often followed quickly with "in Lisp, you can write your own language."

But then, those two thoughts are hard to justify next to each other. Again, just see this thread. The complaint is that there is not enough syntax in Lisp; the argument seems to be that you have to have another language to express some ideas. And again, I'm not even disagreeing with this claim. Just trying to understand why it has the foothold that it does: that you drop to Lisp if you want to use no syntax.


Just checked an old clisp-2.32 source tree I have laying around on an old hard drive. The raw line count in the src/ directory over all .d and .lisp files is 212K. Compiler, CLOS, all in there. The io.d module (preprocessed C source) which contains the reader and printer, plus some other cruft, is 10K.


Just wow. I would expect it to be smaller. But I am surprised it is that much smaller.


As an interesting note, Lua actually started out as a data entry language (called DEL) that eventually morphed into Lua due to demand for more complex logic and such.


Awesome posts, best of luck in this! Any suggestions/requests for users to help?

And I realize I made an omission in my post. I meant to say not just learnings, but use. I would think the biggest hurdle for you will be building a user base.


Thanks a lot! I'm hoping to set things up for contributions in the next couple months.

I want to publish the test matrix and get some help, because filling out the corners takes a lot of time. The general architecture is pretty sound though -- I hit all the "major areas" of shell, although I may need to make another pass at globbing.

If you are adventurous you should be able to clone it and follow the instructions: run ./pybuild.sh, then ./spec.sh install-shells, and then './spec.sh smoke'. That test should pass, but there are like 15-20 other categories in spec.sh in various states of disarray!
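
Collected as a sketch (the repo URL is taken from the wiki link elsewhere in this thread):

    git clone https://github.com/oilshell/oil
    cd oil
    ./pybuild.sh              # build the Python prototype
    ./spec.sh install-shells  # install the shells the spec tests compare against
    ./spec.sh smoke           # this category should pass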

Of course this is in Python, but to me it makes sense to fill out a sketch before porting it to C++ (only the runtime; I'm hoping to bootstrap the parsers in oil itself). Not sure if that will inhibit or help contribution.


That's exactly what I'm doing :). The rest of the blog goes into some detail, although it's still in the early stages:

http://www.oilshell.org/blog/

You can also see a list of other shells on the GitHub wiki page titled ExternalResources: https://github.com/oilshell/oil/wiki/ExternalResources

I'm focusing on the shell as a programming language at first, in contrast to TermKit.


What about fish? I use it on the terminal but still write my shell scripts in bash so they work across systems (including windows with minGW). I think a big issue with new shell adoption is the prevalence of bash basically everywhere.


I've been using fish for over two years and love it. But the thing about fish is that it does totally break things that expect a sh/bash environment. It's seeing more use out there, though: I've seen a number of shell tools/plugins include what to add to the fish config right under the bash and zsh sections (that's how I originally found out about fish).
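
A minimal sketch of the kind of breakage meant here; fish (at least at this point) rejects POSIX assignment syntax and uses set instead:

    # POSIX sh / bash:
    FOO=bar
    export FOO
    
    # fish equivalents:
    #   set FOO bar
    #   set -x FOO bar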

Fish does have bugs though. I've seen enough stack traces while running early builds to question the security (although I'm sure we all remember the bash Shellshock bug that was lying around for years).

If you haven't tried fish, I highly recommend it. It takes a little getting used to, but overall it adds a lot of really amazing functionality that I find really useful.


> If you haven't tried fish, I highly recommend it. It takes a little getting used to, but overall it adds a lot of really amazing functionality that I find really useful.

Serious question (that I've asked elsethread): what does fish offer than zsh & bash lack? A quick perusal of the docs reveals nothing obvious, but … I could be missing stuff.


Other people indeed chimed in on your question in other comments. But here's my take about the most important fish features:

* Instant auto suggestions, you just type stuff, like in a web browser.

* Sane defaults, no need to bring around your own config. On a new machine I just install fish, run fish_config and customize the prompt with the web UI.

* Scripting with simple syntax, especially for one-liners. Yes, fish is incompatible with bash, but it's less of an issue in this case.


> I think a big issue with new shell adoption is the prevalence of bash basically everywhere.

Which is better than the days when csh and different Bourne flavors were common.


I think that sweet-expressions (http://readable.sourceforge.net/) & CL-INTERPOL (http://weitz.de/cl-interpol/) could be the basis for a new kind of shell:

    let (foo bar)
      ls foo "bim/baz" ${foo}
Might be turned into the equivalent of 'ls foo bim/baz bar' in sh.

In fact, with a minimum of work I think this could be turned into a Lisp shell …


Unless you drop the requirement that the shell supports this type of usage:

    command arg1 arg2 ...
then whatever is built to this requirement is going to feel decades old to some extent.


Why? There's nothing old about that syntax. On the contrary, I don't think anyone would use a shell WITHOUT the "cmd arg1 arg2" syntax. I certainly wouldn't want to.

What feels ancient is the lack of good parse error messages and runtime error messages in bash.

And syntax like a=X but not a = X or a = 'X' -- the difference being the spaces.

And (echo hi) is OK for a subshell, but you need { echo hi; } for a group -- space and ; are required in the latter case, but not the former.

The lack of proper functions in shell also feels ancient. For example, writing a completion function in bash is a pretty horrible experience. As another example, a simple function to escape HTML is either 'sed' or three statements like ${foo/&/&amp;}, both of which are obtuse.
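
To spell out those gotchas (examples mine):

    a=X           # assignment
    a = X         # tries to run the command 'a' with arguments '=' and 'X'
    
    (echo hi)     # subshell: no space or semicolon needed
    { echo hi; }  # group: the space and the trailing ';' are both required
    
    # escaping HTML via parameter expansion takes three statements
    # (and '&' must be replaced first):
    foo=${foo//&/&amp;}; foo=${foo//</&lt;}; foo=${foo//>/&gt;}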


There's Microsoft PowerShell, an OO shell recently MIT-licensed and ported to Linux. Just throwing that one out there.


Building it is not the "hard" part. Anyway, you could say a lot of languages are almost there (Python, Ruby, Perl, Forth, REBOL).

The hard part is that developers are THE WORST. They will refuse to switch to anything better if it means even a small change to their old ways.

That is why C/C++/JS/Bash rule the world...


Beh, I guess I'm getting old then, because I spent ~10 years of my life (over 10 years ago) learning dozens of scripting and programming languages, and programming professionally in quite a number of them.

The takeaway after all that was: rarely is one language better than another; it's really about trade-offs.

And frankly the older ones are a lot better at following the KISS principle and actually providing an uncluttered framework for writing _EFFICIENT_ code. For example, everyone knows that Java is a "better" and more productive language than COBOL, yet when you start talking to people trying to port old COBOL apps to Java, what you discover is that COBOL might actually do a better job in the business/transaction space. Same with Python and Fortran in the HPC space, and strangely enough the core infrastructure all these more modern, efficient languages are running on tends to be C based. Which, despite the laundry list of issues that people like to parade around, seems to solve systems programming problems better than anything else.

None of this would really be a problem except for the fact that the new up and coming language that is cool this year changes from year to year leaving a wasteland of abandoned poorly maintained projects using languages that no one really knew, and are despised by the people who are hired and have to learn them for that one job before moving on to the next one.

So, yes bash is a POS, but its universal at this point, isn't really that bad at solving the core problems its designed to solve, and any developer with 1/2 a brain can write a sys V init script (or a lot of similar things) with little more than a couple hours and without the baggage of yet another 1/2GB language/framework/etc pulled in to sit alongside the dozen others already in the project.

</rant> Now get off my yard.


It's somewhat difficult to come up with a good syntax. You need to support: unquoted strings, executing commands, pipes, file redirects, etc.

Adoption would be even harder, but any resulting language is not going to look like Python or any other "mainstream" language, if it's intended to actually be usable.


Yeah this is a problem, but after writing a bash parser I think it's straightforward to solve.

Oil is parsed line by line like any other shell, and there will be a simple rule to determine whether you're in command mode or expression mode.

The "lexer modes" technique (previously referred to as "lexical state") should handle the two sublanguages easily. You basically need it to parse bash, and you will need much less of it to parse oil. My bash parser has 13 lexical states (up from 8), but I expect oil to have somewhere between 4 and 8.

How it's used in the bash parser: http://www.oilshell.org/blog/2016/10/19.html


I'm not talking about implementation, but rather designing a clear and consistent syntax. It sounds like your intent is to hew pretty closely to existing shells, in which case you're probably doing more detail work.


The strategy is to break compatibility with bash but provide an upgrade path, as mentioned here:

https://news.ycombinator.com/item?id=13478291

It is a design issue to combine command and expression syntax (unquoted and vs. quoted literals). But I think I've figured it out ... we'll see in a few months! The blog will have updates on these kinds of issues.


But are you maintaining POSIX compatibility? I thought I'd read that somewhere on your blog, but I might misremember.


Yes, the osh language is basically bash, which is a very compatible superset of POSIX sh. The oil language is a totally different language, but you will be able to upgrade to it automatically.

Once you upgrade it's of course not POSIX compatible anymore :)


BTW, I have been thinking about building a shell/IPython hybrid. I think it's pointless to try to emulate a shell (too much baggage); instead, build something that is like a shell, but modern.

However, developers still will not buy it. I think maybe it could work for the subset of users who need to do data science, and for a subset of system automation.


Have you heard about xonsh?

http://xon.sh/


I'm using xonsh at work, but currently the main thing I've noticed is that I still prefer IPython Qt for exploring data, and that xonsh starts up too slowly.

Last I checked, I think the thing I'm missing from using IPython purely as a shell is the ability to pipe?


As I pointed out at https://news.ycombinator.com/item?id=13397381 , you forgot rtprio, idprio, chrt, and numactl; the s6, perp, daemontools-encore, and nosh toolsets; and execline. And TCL and the Thompson shell.

* http://wiki.tcl.tk/15088

* http://jdebp.eu./FGA/daemontools-family.html

* http://v6shell.org/


Thanks, I did add s6 in response to your comment, but it wasn't meant to be exhaustive, so I left out the others. I didn't add Linux "perf" or a whole host of other such tools either.

I'm not sure how TCL and the Thompson shell relate though.


Apart from the fact that you are re-using the Thompson shell's name, consider things like its if command. Yes, command; not reserved word.

And my list is not exhaustive, either. But it makes the point to those that would erroneously dismiss this (as you yourself do) as something that "doesn't see much use today", that this, given how long the list of non-daemontools-family commands that operate this way actually turns out to be when one looks around, is not really a daemontools, or even a "Bernstein", thing.


In addition, rather than docker, try sometime running a process with cgexec. It helps to know some of the underlying interfaces to get the full scope of what advanced compositions of those interfaces are doing.
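
For example, with the libcgroup tools (a sketch; the group name and share value are arbitrary):

    # create a cgroup, cap its CPU share, and run a command inside it:
    sudo cgcreate -g cpu,memory:/throttled
    sudo cgset -r cpu.shares=100 throttled
    cgexec -g cpu,memory:throttled ./heavy-job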


Calling anything “oil shell” seems like a pretty good way to invite a trademark dispute from Shell Oil Company.


point free forth like does not make



