Nushell: Introduction to a new kind of shell (dataswamp.org)
319 points by hucste on Nov 1, 2022 | 247 comments



I just ported a bunch of scripts to nushell.

I hit some bugs, a couple of which are a bit sharp, but wow, the list of quirks I have to remember for nushell is so much shorter than for bash.

Constantly impressed at the errors it catches at parse time, kinda crazy sometimes.

Oh my god I could cry, strings are sane to work with. I may never write a bash script again (it's okay, I use Nix so I get nushell everywhere I might need it, for free)
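
A tiny example of what I mean (current Nushell syntax, from memory, so treat it as a sketch):

    let dir = "My Documents"
    ls $dir    # the variable is passed as one argument; no word splitting, no quoting dance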

Everyone is asking about PowerShell. I never hated it, but lord, it makes some awful, stupid, infuriating decisions. I've lost hair and sanity to some stupid list/single-item-list behavior. Numerous operators and bits of syntax are just different. Everything about the script syntax is just slightly odd and confusing. There's basically none of this with nushell. Jonathan Turner is a gifted person with an eye for language design and it really shows.

Edit: I do think it's missing some important output redirection functionality. You can work around it by using "complete" but that feels non-ideal.
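
For anyone curious, the "complete" workaround looks roughly like this (a sketch from memory; `cat` stands in for any external command):

    do { ^cat missing.txt } | complete
    # => a record with stdout, stderr, and exit_code fields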


> I may never write a bash script again

I wrote two 200-line-ish shell scripts recently, and it was simply terrible. Although I'm increasingly convinced we need a shell-like abstraction, I can't believe `bash` is the best we can do. Really hope I find one of these alt shells that suits me. Not a fan of Python but xonsh looks really cool too.


Bash is not the best we can do!

If you want a traditional Unix-like shell that is mostly sensible in the places where Bash is not, check out Zsh. It has a ton of complicated features, but most Bash scripts can be ported easily (if not outright copied and pasted). Zsh has fewer footguns by default than Bash, and it has more "safety" settings that you can enable.

There is also the Oil shell, whose creator often posts on HN, and which I think is meant to be a superset of Bash, but I have not used it myself and can't vouch for it.

As for the alt shells, I was specifically interested in Elvish, but I dropped it as soon as I saw that they don't support parameter interpolation in string literals, like `"${HOME}/.local"`. This is such a common operation in shell scripts that I have no interest in a shell that doesn't support it, and I can't imagine why Elvish doesn't.
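
Nushell, for comparison, does support this with its own interpolation syntax; if I remember it right, the equivalent is:

    $"($env.HOME)/.local"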


I would recommend Oil!

http://www.oilshell.org/release/latest/doc/idioms.html

It removes the need to quote every variable, and this is fantastic


FWIW this is also the main selling point of Zsh. I should go through this document in detail (thank you for linking it!) but a quick search shows that Zsh is not mentioned even once, and I think a Zsh comparison would be really valuable for people like me.

Or is it like Neovim vs. Emacs at that point, where neither one is "better" and it's just a matter of taste and/or whichever one you happened to try first?


Well I actually use zsh (I always mean to switch to oil though), and.. in zsh, you don't need to quote variables? Are you sure?

Oh.. I just tested here. It works!

  $ mkdir -p /q/a\ b/x
  $ a="a b"
  $ ls /q/$a
  x
And in bash:

  $ a="a b"
  $ ls /q/$a
  ls: cannot access '/q/a': No such file or directory
  ls: cannot access 'b': No such file or directory
The weird thing is, my shell is zsh but my shell scripts are all either #!/bin/bash or straight #!/bin/sh, so I never took advantage of this

Anyway, I want to link also to

http://www.oilshell.org/release/latest/doc/upgrade-breakage....

http://www.oilshell.org/release/latest/doc/warts.html

http://www.oilshell.org/release/latest/doc/known-differences... <- here it talks about zsh a bit


The only thing to keep in mind about Zsh parameter expansion is that unquoted empty values will be dropped entirely, while quoted empty values will be treated like the empty string '':

    show_nargs() { print $# }

    q=
    show_nargs $q    # 0
    show_nargs "$q"  # 1
So it doesn't completely solve the need for defensive quoting, but at least it mitigates it for the most part.


Ouch, that is a sharp corner:

    $ echo 'aaa' > a.txt
    $ echo 'bbb' > b.txt
    $ TMP_DIR="" # Mistake! Missing arg, failed search, typo'd name, etc.
    $ cp a.txt b.txt $TMP_DIR
    $ cat b.txt
    aaa
    $ # The contents of b.txt are lost.


cp itself should have been two commands, or at least accept two flags for those two very different semantics.

Actually there's a wave of unixy tools being written in Go and Rust and I think there might be a suitable cp replacement already, but it's up to distros to package it.


>If you want a traditional Unix-like shell that is mostly sensible in the places where Bash is not, check out

PERL. Sh, awk, sed and mini-C all at once.


I tried to like Perl. I really really did. `while (<>)` and regex literals are amazing. Everything else is kind of rough for me, especially the way arrays and hash tables work, and I much prefer Zsh, AWK, or Python for the same niche.


Since I can't edit my post anymore, here are some additional thoughts:

Traditional shell scripting languages are great at exactly three things: typing commands interactively, running other programs, and sourcing other shell scripts. They are also distinct in that they are "stringly-typed" (i.e. everything is a string), and moreover that syntactically bare symbols are also strings.

Typing commands interactively is essential because... it's a shell. That's what it's for. Most programming languages do not and should not optimize for this. But it's literally the purpose of a shell, so a shell should be good at it.

"Running other programs" includes invoking them (literally 0 extra syntax), piping them (one letter: |), and redirecting the standard input and output streams to files (>, <, <<, etc.). This is one of the great innovations of Unix and nothing holds a candle to its elegance and convenience, even if sometimes we get frustrated that pipes are "dumb" streams of bytes and not something more structured.

"Sourcing other shell scripts" practically isn't much different from "running another shell process", except that source'd shell scripts can set shell parameters and environment variables, which is an important part of e.g. the X11 startup system: the global /etc/X11/Xsession script sources the user's ~/.xsession script, so any environment variables set in the latter are propagated to the former, and thereby are inherited by the X11 window manager when it eventually starts. If you wrote the /etc/X11/Xsession script in any other programming language, you'd have to inspect the environment variables of the ~/.xsession process and "merge" those values back into the current process' environment.

On being stringly-typed, I think it's mostly good in the context of the three things shells are good at, described above. It cuts down on syntactical noise (otherwise "everything" "would" "always" "be" "quoted" "everywhere"), makes string interpolation painless, and generally supports the "typing commands interactively" use case. Make and Tcl are the only other popular languages in this category. Perl and Ruby allow it in specially-delineated areas of code. Moreover, Bash, Ksh, and Zsh all have arrays, which helps fix some of the biggest problems of everything being a string.

In short, if your program can make good use of the above features, then a traditional shell is a great choice. If your program does not need those features, do not write it as a shell script, because shell languages are awkward at best in pretty much all other areas.

And if you are considering an "alt" or "neo" shell language, in my opinion it must excel in those same categories. Being stringly-typed is a matter of taste, but the language should also probably support bare-symbols-as-strings and string interpolation.

Python, for example, is not a good shell scripting language because it is not easy to type nontrivial commands interactively, it lacks tidy syntax for input/output redirection (even though it's actually pretty easy using the standard library), and it lacks bare-symbols-as-strings. So instead of:

    #!/bin/sh
    foo -x 1 -y 2 "$ABC" | bar --json -
you have to write something like:

    #!/usr/bin/env python3
    import os
    from subprocess import Popen, PIPE, run

    foo_cmd = ['foo', '-x', '1', '-y', '2', os.environ['ABC']]
    bar_cmd = ['bar', '--json', '-']
    with Popen(foo_cmd, stdout=PIPE) as foo_proc:
        # only stdout was piped, so foo_proc.stdin is None; nothing to close there
        run(bar_cmd, stdin=foo_proc.stdout)
        foo_proc.stdout.close()
You can of course write your own library that abstracts this and uses symbols like > >> < << | to mimic what a shell does. Maybe that's what Xonsh is, I haven't looked at it myself. But this hopefully demonstrates why the base language Python is not a good shell, despite it being portable, being nearly ubiquitous nowadays, being relatively easy to use and learn, being "safe" in many ways that traditional shell languages are not, and having a huge and useful standard library.
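
For comparison, a Nushell version of the same pipeline stays close to the sh one (a sketch, assuming `foo` and `bar` are real executables on the PATH; the caret forces an external command):

    ^foo -x 1 -y 2 $env.ABC | ^bar --json -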


I would also like to add for a better shell:

Please add another standard output stream.

stdout is for output

stderr is for error messages

stdlog for status/tracking, verbose output, event streams. Too often this gets shunted to stderr, and I don't want to logfile hunt. Other examples: download status from curl, verbose output flags in things like ssh and other comm programs that need configuration debugging on the regular, and others.

Nushell seems to also focus on json output for commands. I think this should be a requirement pushed into all unix commands that do output (ls! please please please a json output flag for plain ls, whose output CANNOT BE PARSED RELIABLY).
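
Nushell's builtin `ls` already behaves roughly this way (a quick sketch from memory):

    ls | to json
    # structured records (name, type, size, modified) instead of text columns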


I find the lack of string interpolation to be the one thing that kills me about javascript. The lack of it feels wrong and I can't get behind languages that don't have easy interpolation.

Yes Ruby has ruined me.


  `template${Math.random() > 0.5 ? 'strings' : 'literals'} have been part of the language since 2015!`


Elvish:

    $E:HOME'/.local'


Xonsh is very posix/bash-like but you can call python functions in an integrated way. Even if you don’t love python it’s a more sane language than bash. I really like it but I’ve been using zsh for too long.


Bash is not the best we can do, but it is definitely the best that most people agree on.

I fear there will be no defacto bash replacement for many years.


What do you mean by "agree on"? We never agreed on the programming language, which means ... that I can choose whatever I like.


what's the default/defacto scripting language on all unix or unix-like operating systems?

Bash. (except MacOS I guess)

why? because it is the best thing that everyone can agree on as being the stock shell that should always be installed by default.

until we collectively agree on a replacement, bash will remain, along with all of its problems and everyone will write for it because it's the standard shell that's always installed.

many much better shells exist. we can't agree on what should supplant bash, and so it remains. everywhere.


Honestly I’ve enjoyed working with PowerShell for scripts to work for all developers on my team regardless of what OS they are using.


I wanted to like Powershell, but it is missing some features that I think are essential. For example, apparently checking the return status of an arbitrary command is not trivial, and there's no equivalent of `-eu -o pipefail`. Yes, I know `-eu -o pipefail` is imperfect too, but it covers most common cases. I also struggled badly to stop it from mangling the output encoding of my applications (no, I don't want a goddamn BOM, thank you), and the solutions I found all either had no effect or required me to write what looked like C# code embedded in my shell script.


The upcoming 7.3 release has PSNativeCommandErrorActionPreference, which was an experimental feature to implement pipefail: https://learn.microsoft.com/en-us/powershell/scripting/learn.... It is no longer experimental and will be part of the actual release. This allows you to set `$PSNativeCommandUseErrorActionPreference = $true` rather than rely on `$ErrorActionPreference = 'Stop'`, which does more than just treat non-zero return codes as failures, which is nice.

Otherwise for older versions you can just do `my.exe; if ($lastexitcode) { "failure code here" }`. It's not nice but certainly not impossible to do.
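
In Nushell, for comparison, the rough equivalent would be something like this (a sketch, reusing the hypothetical `my.exe`):

    ^my.exe
    if $env.LAST_EXIT_CODE != 0 { print "failure code here" }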


Oh, that is awesome. Finally.


nounset is Set-StrictMode, errexit/pipefail is $ErrorActionPreference = 'stop' with $PSNativeCommandUseErrorActionPreference = $true. With those you can try/catch, which is infinitely better than the $?-based approach, but you can also use $? without them. PSCore encoding is always UTF-8 without BOM; you can enforce that the script is running on Core with #Requires -Version 6, or pass ($content | Out-String) to New-Item's -Value parameter for a v5-compatible hack.


> For example, apparently checking the return status of an arbitrary command is not trivial

You mean the fact that it gets put into the $LASTEXITCODE variable instead of putting the command directly into the conditional?


pipefail is trivial: `$ErrorActionPreference = 'Stop'`

It's even better than that, as you can fail individual commands with their -ErrorAction argument.

The BOM is not there any more.


It feels very different from unix shell. I like them both!

I think PowerShell's more advanced ecosystem, with a default output more like the sqlite-style tables of nushell, would be wonderful: the objects handled by PS would be naturally suited to this, and would compose more easily.


yep been using powershell here too..great stuff


> strings are sane to work with

Okay, I'll bite - how does it handle filenames with spaces and newlines in them? I have a strong case of Stockholm syndrome with bash because I figured out how to deal with the first one, but I'd love a system that deals with that sanely.


I learned PowerShell some time ago. I really like the concept and the power that comes with it. But, coming from Bash, I can't get a grip of the syntax.

I am in the process of switching most of my shell scripts over to Python; I wonder whether nushell would be the better option.


You are absolutely not alone with this. There appears to be so much power, so much potential, but it /feels/ so awkward syntactically.


> I do think it's missing some important output redirection functionality. You can work around it by using "complete" but that feels non-ideal.

Yeah, this is a very fair criticism. We're working on it :)


bash + lua, and life is good again.


How do you use bash and lua together?


Should probably be switched to the official project page rather than a subpage on an unrelated party website: https://www.nushell.sh/

It’s a lot better at introducing Nushell and feels like a proper landing page.


So HN is not supposed to allow blog posts about projects but must always swap out for the project site itself? As someone who writes blog posts about projects, I hope not! It's not like this blog post is affiliate spam.


If the project site is better written and has the same content, yeah, I would prefer to see it.

But this is not the case here.


Deeply disagree. The linked page barely introduces what the tool is. The part above the fold is an unrelated bio, and the "what is" paragraph actually talks about what it's not while barely explaining what it does.

The project page is better in every respect. I was going to dismiss it and was writing a comment about how the landing page should be improved before realising it was just a blog page. That's why I'm linking it.


These are 2 different types of posts.

1. A new product

2. An independent party's pro/con opinions about that product

This thread is about #2. If you feel #1 has value on its own, post that.

Read both your link and the OP's. While the topic is the same, the content is markedly different.


Better not. In this case the blog post provides more complex and interesting examples than the simplistic ones on the official page.


Isn't https://www.nushell.sh/ the official website?


It's a blog post but somehow the styling doesn't make it look like one.


It doesn’t need to be switched, you can submit the project page anytime.


haha, thanks for this. i was looking at the examples on the other page and was like, "jfc, no thanks"


It's someone's personal blog post. No need to be mean. The styles are a little brutalist but also readable to me.


i wasn't talking about the css…

i mistook the blog for an official introduction and was turned off by the complexity of the examples. which i stand by—that's a poor introduction to the shell.


I use it as my primary driver, for the sole reason that it's the only cross-platform shell that properly supports Windows (without resorting to Cygwin/MSYS, which have performance issues and many pain points). I'm still getting used to the syntax though...

(Edit: forgot to mention Powershell, since nowadays it also works on Macs and Linux)


Why use this over PowerShell for example?


Well, it is a good point, since nowadays PowerShell is also cross-platform, and seems to have more features than nushell does.

Though the true reason is: I haven't really found the time to. Changing a shell is really stressful since you have to unlearn / relearn lots of things from muscle memory, and PowerShell's huge deviance from the rest of the POSIX-y world doesn't help. At least in nushell `rm -rf` works; the same can't be said of PowerShell.


> PowerShell's huge deviance from the rest of the POSIX-y world doesn't help

In PowerShell (on Windows), `rm` is an alias to `Remove-Item`[0].

Therefore,

    rm -r -fo <thing>
An extra dash, extra space, and extra letter isn't too bad in my book. Furthermore, in scripts, aliases are discouraged by PSScriptAnalyzer[1]; IDEs (PowerShell ISE, VS Code PowerShell extension) also support code completion, so:

    Remove-Item -Recurse -Force <thing>
makes things clearer.

[0]: https://learn.microsoft.com/en-us/powershell/scripting/learn...

[1]: https://github.com/PowerShell/PSScriptAnalyzer/blob/master/d...


why do they alias all this stuff. Remove-Item is _not_ rm. It's so stupid. Like when they aliased wget and curl to some nonsense web request command and everybody complained that it didn't work.

I'll never use PowerShell. Just way too many bad decisions all over the place.


> why do they alias all this stuff

The alias only exists on Windows. Did you read the linked page?

    The aliases in this table are Windows-specific. Some aliases aren't available on other platforms. This is to allow the native command to work in a PowerShell session. For example, ls isn't defined as a PowerShell alias on macOS or Linux so that the native command is run instead of Get-ChildItem.
`rm` was never an executable on a pure (aka non-MinGW, non-Cygwin, non-MSYS2, non-WSL) Windows console; the equivalents were/are `rmdir` and `del`. Microsoft's position is clear that PowerShell is meant to supersede these old commands, and hence the aliases, but again, only on Windows. I agree that Microsoft made some strange decisions to alias `curl`, etc on other platforms too in PS6, which were reversed for PS7.

These aliases are meant as stepping-stones for POSIX-first people to get their feet wet with the Windows command-line.


>why do they alias all this stuff

To make interactive usage easier? PowerShell essentially replaces the text-oriented environment provided by coreutils with an integrated object-oriented one.


In hindsight it probably would have been better if they just had a help message with something like "Oh you must be a *nix user, thanks for trying Powershell! You can do something similar with command xyz. Read that command's documentation -here- , and read a guide on the differences between posix and powershell -here-"


And if you want to do multiple things you have to separate them with commas:

rm -r -fo thing1, thing2

Or in the non-recursive way:

rm thing1, thing2, thing3


When PS has pushd, popd and dirs -v I’ll switch.


It has Push-Location and Pop-Location, which are aliased to pushd and popd. IDK about dirs -v.


`dirs -v` would be `Get-Location -Stack`. Also in newish versions of PowerShell `Set-Location` (aliased to `cd`) supports -/+ to move backwards/forwards in its own history, so usually you don't even need to bother with `pushd`/`popd` unless you need a named stack.


what about pushd 7 to go to the 7th path in the stack?


Just get the 7th path from `Get-Location` and `Set-Location` there, e.g.

  function Set-StackLocation ($Position) {
    (gl -Stack).Path | select -Index $Position | cd
  }


Seems like a lot of work for something that's just built into bash and is extremely useful.


Like all shells, PowerShell is split between being a scripting language and an interactive shell.

PowerShell leans a bit more towards being a scripting language than an interactive shell, so things are more verbose, as befits a shell script that needs to be read and updated later by a different programmer.

But yeah, lots of little trade offs like that permeate the language.


Less work than trying to add named stacks to bash. If it's so central to your workflow that you're willing to put up with the rest of bash just for it, adding a couple lines to your profile can't be that big of a deal.


Well, it's the first time I saw the magic incantation listed above, so I may give it another try now.


I am a C# programmer at heart, and I use powershell a good bit. I can honestly say I can never use powershell without my cheat sheets or my list of favorite commands (and especially the arguments to use).

I looked at this and kinda get it and think I could do some things with it. I don't think it's as powerful and can _definitely_ say it won't be capable of the same automations we use.

That said, the text parsing people do with bash makes me cringe. It's so repulsive and sketchy. Anything to get linux world off of bash would be a good thing.


Do you use pwsh as your daily driver? I find that its commands are as easily memorable as any other shell.

That being said, you should enable these:

    Set-PSReadLineKeyHandler -Key Tab -Function MenuComplete
    Set-PSReadLineOption -PredictionSource History
MenuComplete will give you a menu of arguments that each command takes, so you can easily see them when pressing tab for completions, and the prediction source will try to predict the commands you want based on your history.


Also make sure you're on the latest version of PSReadLine (I had some problems with it not updating properly and had to do a manual `Install-Module PSReadLine -Force`) and try

  Set-PSReadLineOption -PredictionSource HistoryAndPlugin -PredictionViewStyle ListView
You can also toggle between the default inline and listview with F2. Also if you install

  Install-Module CompletionPredictor
and add this to your profile:

  Import-Module -Name CompletionPredictor
  Set-PSReadLineOption -PredictionSource HistoryAndPlugin -PredictionViewStyle ListView
you also get the normal intellisense autocompletions in the listview. And remember that if you have all the help files installed locally you can use F1 to view help for the current parameter/command.


Been using the latest PSReadline but CompletionPredictor is awesome and exactly what I've been looking for.

One other PowerShell protip

Ctrl+Space is also another great shortcut for completing commands, it lets you see what type a parameter is expecting etc.

    $ Get-ChildItem -<ctrl+space>
    Path                 Depth                File                 ErrorAction    
    <snip>
    [string[]] Path


Thanks! I haven't seen CompletionPredictor before. I'll give it a shot.


/mind blown/

very nice.


I use pwsh as my daily driver, and the verbosity actually reduces my need for a cheatsheet.

What I'd love is for Powershell to _stop_ adding .\ to my tab completed files. Just quote it and leave it alone, unless it's an executable.

Once I got used to stuff parsing as objects, it's really hard to go back to everything-as-a-string in bash. I've gotten to writing a few personal scripts/modules.. processing stuff in JSON is just really nice.


> can _definitely_ say it won't be capable of the same automations we use.

Anything in particular you think would be difficult/impossible in Nushell?

(I'm one of the Nushell developers, might be able to help or put features on the roadmap)


Honestly, I believe this because of the maturity of PowerShell and not any inability in nushell. Having the sheer number of cmdlets it does, and having the entire .NET framework available, gives PowerShell more options.

Additionally, pretty much every Windows Server feature is commandable with PowerShell. Some (much?) of the UI stuff in Windows is just using PowerShell under the hood.

That said, I am _very much_ looking forward to using nushell when I use Linux or WSL. And I wish your team great success. Linux needs it.


Yeah, we're probably never going to be able to compete with PowerShell's sheer volume of Windows integration points. Shelling out to pwsh is probably the best we can do in some situations.


This is a common complaint among implementers of POSIX shells.

They are not LR-parsed languages, and cannot be expressed with yacc grammars.

Debian dash replaced bash as /bin/sh. It gains speed, but no better syntax.

https://archive.fosdem.org/2018/schedule/event/code_parsing_...


Some reasons I prefer Nushell over PowerShell:

- less verbose syntax

- better cross-platform support (PowerShell is technically cross-platform, but it has some baggage from its Windows-first history)

- way faster to start up

I'm a little biased (I'm a member of the Nushell core team), but those are all things that drew me to start contributing to Nushell.

On the other hand, Nushell is certainly less mature+stable than PowerShell.


Something I've been curious about - are there any plans to sink serious effort into massively expanding the stdlib? Powershell's syntax and object-orientation I could take or leave, but access to the entire .NET Framework is pretty hard to beat, and the same draw exists for xonsh. Nushell is neat but there just aren't enough builtins.


I don't think we're going to be able to compete with PowerShell's bajillions of developer-hours invested in deep Windows integration. But we are looking at revamping the Nu language to make it much more pleasant+powerful as a scripting language.

What kinds of features are you looking for in the stdlib?


It's less any specific missing features and more the confidence that I'll literally never run into a missing feature - the things you almost never need, until you do. E.g. upcasing a Turkish string, or printing a number that does the correct thing with `,` and `.` in the user's current locale. PowerShell's support for ACLs is also hugely helpful - being able to do the structured-data thing with the icacls command would go a long way.


No need for (even a minimal set of) the dotnet runtime. This means Nushell can possibly run in slightly resource-constrained environments such as a Raspberry Pi, of course you don't expect it to run well with 256MB of RAM though. Also, Nushell is batteries-included: say bye bye to your Oh-my-ZSH, because the features you have with OMZ (like shell autocompletion, suggestions, theming and formatting) are built into Nushell. I have also used it to parse `zfs list` output on my Proxmox machine, so I don't have to reach for `zfs list | less` as often.
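
Something like this is what I mean for the `zfs list` case (a sketch; assuming `zfs` is on the PATH):

    ^zfs list | detect columns

where `detect columns` turns the whitespace-aligned text table into structured rows.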

That said, I hope Powershell Core can be packed with dotnet 7 native AOT mode [1] so we don't have to screw around with 100 different MSIL DLL files to run a single shell...

[1]: https://learn.microsoft.com/en-us/dotnet/core/deploying/nati...


>of course you don't expect it to run well with 256MB of RAM though

I've run Perl scripts on 32MB...


It's dumb but I personally can't get over Camel-Kebab-Case commands. I hate typing caps and hyphens.


PowerShell is case-insensitive, you don't need to type caps for variable/function references if you don't want to. You'll still have hyphens, but in an interactive shell (as opposed to a script) there are many aliases you could use that avoid hyphens (or you can make your own).

https://learn.microsoft.com/en-us/powershell/module/microsof...


It's more Unix-y. Smaller footprint, opens faster, short commands by default. Otherwise the PowerShell influence is vastly understated in the project page and docs.


Doesn’t xonsh run correctly on windows? I used it but hated the environment due to the lack of utilities on windows (it’s just a shell, not an environment).


xonsh (Python based shell) works fine on Windows. Been using it for over 4 years.


I'm curious what you mean by "properly supports Windows." Are you on a version of Windows that supports WSL2 (Windows Subsystem for Linux)?


I really don't want to start up a separate Linux VM just to open up a shell. (I already crossed out Cygwin / MSYS on the list because of performance issues...)


Also, I need to compile actual Windows binaries using the shell (MSVC, clang-cl), and I obviously can’t do that under a Linux VM.


MSVC's command line tools kick off just fine from shell scripts running under WSL2. Much to my surprise. I assume it doesn't run in the VM.


Busybox for Windows has a very accessible shell.


Yeah, I've used that before. But it was a bit too minimal for my tastes.


WSL is great but it’s basically a convenient VM. It’s no longer windows, as far as I’m concerned.


It’s a lightweight Hyper-V VM, just like Windows itself when you turn Hyper-V support on (boots hypervisor first, then the Windows VM in the root or parent partition)[0].

But if that’s an issue, WSL1 is still an option.[1] It’s a thin translation layer between Linux kernel calls and NT kernel calls, which was the original concept of a subsystem from the early NT days, one that allowed OS/2 apps to run on top of ntdll.dll.

WSL2 didn’t replace WSL1.

[0] https://learn.microsoft.com/en-us/virtualization/hyper-v-on-...

[1] https://learn.microsoft.com/en-us/windows/wsl/compare-versio...


It’s not an issue, in fact as a Linux fan it’s magnificent! However it is not windows so if you want to use it for windows stuff / software development there’s going to be some edge cases, performance penalties and hiccups. So long as you stay inside the WSL2 VM and filesystem it’s fantastic.

Also, WSL1 was a subsystem in the NT kernel, but WSL2 is not like this - it runs a separate Linux kernel, with some convenient integrations. WSL1 never did support all the features - I recall having to use Windows .exe executables for some software packages even though they have apt-installable versions.


> WSL2 didn’t replace WSL1.

It kind of did, IMO. You can still use WSL1, but my understanding is that it's a dead end; MS has given up on the translation layer approach. WSL1 will still get bugfixes but it seems pretty clear that WSL2 will get the lion's share of investment going forward.


It's literally a subsystem. It's integrated in many ways you would not use to describe a "VM". I don't understand the aversion nor the confusion.


Here’s an example: if I build a Python app with PyInstaller (which can’t cross-compile), and if I want a Windows executable, I have to build on Windows, not WSL, which defeats the whole purpose of “bash on Windows” because now at best I have to use Cygwin or Powershell or something and deal with a completely different environment.


You can run windows programs from whatever Linux shell (in WSL) you want, because the Windows filesystem is mounted and Windows executables run in the Windows environment.

OTOH, if you are doing anything complicated, it gets weird because (e.g.) file paths have to be passed as windows paths to windows programs, not using the linux mount path that you would access the path from in linux. But, you can, in principle, use a Linux shell in WSL to run Windows PyInstaller to build for Windows.


I thought since WSL2 it’s just a Hyper-V VM with some nice upfront configuration (like mounting all windows disks)?


I'm pretty sure Windows itself, as in your desktop, runs virtualized under many circumstances; and nobody seems to complain about Hyper-V then.

And to be clear, WSL2 is very much integrated into windows via the filesystem and networking. I like to think of the networking more akin to docker than anything else.


It can also cross-run programs and UIs. I can type in `code .` in my Ubuntu image, and VSCode will open in Windows at that directory. I can also run UI programs in Ubuntu and their windows pop up in the Windows desktop environment.


While the «code .» integration is great, it’s running as client-server. You can do similar development over ssh to another vm or physical machine, too.


Having written a lot of shell scripts, the single greatest thing I've ever experienced is shell-friendly outputs.

For example, consider if ls had a "--shell" option that output each entry as a single line of shell-quoted variables safe for eval:

    # cd /usr/share/dict
    # ls --shell words
    path="/usr/share/dict" file="words" user="root" group="root" size=985084 ...
Then all sorts of things become easy.

    # eval $(ls --shell words); echo "$size" "$file"
Nushell doesn't really change anything important about this biggest scripting problem. You're still parsing human-readable output, and it's still dependent on particular versions and options passed.

If I had a genie wish of nushell being installed everywhere or even just coreutils having a --shell option I'd take the latter any day.


Calling an external tool to list files is an anti-pattern; every programming language or shell language should have that built in, just doing it via the OS API (opendir/readdir/closedir).

The POSIX shell has that in the form of globbing. That only gives you names, and has quirks that are only adequately worked around with Bash options.

The fix is to use some real programming language for complex work.

I made a language for myself in this space, geared toward C and Unix people who are willing to try Lisp: TXR.

  1> (stat "/usr/share/dict/words")
  #S(stat dev 2306 ino 884986 mode 33188 nlink 1 uid 0 gid 0 rdev 0 size 931708
          blksize 4096 blocks 1832 atime 1667315034 atime-nsec 0 mtime 1238396423
          mtime-nsec 0 ctime 1405615444 ctime-nsec 0 path "/usr/share/dict/words")
  2> (flow "/usr/share/dict/words"
            stat
            [callf list .path .size])
  ("/usr/share/dict/words" 931708)
  3> (flow "/usr/share/dict/words"
            stat
            [callf list .path .size [chain .uid getpwuid .name]])
  ("/usr/share/dict/words" 931708 "root")


> Having written a lot of shell scripts, the single greatest thing I've ever experienced is shell-friendly outputs.

This is basically the idea of PowerShell. Output is well formatted data that can be passed around and manipulated in a regular manner. No slicing on columns or anything like that.

PowerShell has other issues (many of which are documented in these very comments!) but shell friendly output is one of its central ideas.


Well in nushell you wouldn't be using the coreutils commands, you'd be using the nushell builtin replacements. Those builtins return structured data.

Now for the 1000 other non-coreutils commands on your system, that's not very helpful. --shell is a good idea, though it seems like JSON is becoming the most common alternative structured output implemented by commands. Nushell and PowerShell can turn JSON into a structured variable easily, though using jq in shell is alright most of the time.
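
For example, with the builtin `ls` (a quick sketch):

    ls | where size > 1mb | get name

which filters on a real file-size value instead of cutting text columns.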


jc [0] is dedicated to providing this externally for common commands, but I agree that it would be better if command authors built it in as an option. It would make shell scripts shorter, more readable, and less buggy.

[0] https://github.com/kellyjonbrazil/jc


I like the idea, but if it was interpreted directly, that would be a security nightmare unless we also had something like Python's ast.literal_eval(). Which makes us come back to JSON-like outputs because we need some form of serialization anyway, I guess.


Of course there could be some tweaks like a scoped eval to prevent stomping on the script's variables, but it's really not very hard for a C program to escape shell variables correctly and safely.

You could have an eval that only read variables, but trusting a program to only return variables is a really low bar; it's very hard to mess that up.


Part of it is knowing the tools. In this case, 'stat' is the command you probably want to get parsable details about a file, not more options to 'ls'. I often see convoluted shell scripts that could be a lot simpler with the use of more appropriate utilities.


It's just an example. With shell-friendly outputs you can recreate "ls -l" with any field order you want with just a few lines of readable shell script. Many of the unix tools only exist because it's so hard to parse the human-readable output in the shells.


Wouldn't 'find .. -exec ' and friends get you this?

    find . -type f -depth 1 -exec ls -la {} + | awk {'print "file="$9 " user=" $3 " group=" $4 " size=" $5'}


Your awk has to put out file="..." notation, and if double quotes occur, they have to be escaped. So you're looking at a substantially longer command.

  | awk 'function esc(str) {
         ...
         }
         { print "file=\"" esc($9) " user=\"" ... }'


Many tools have some switch which will make them output JSON, which is easy to process if you have jq. For example `docker something --format='{{json .}}'`.
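
In Nushell that slots straight into a pipeline (a sketch, assuming Docker is installed; that format flag emits one JSON object per line):

    ^docker ps --format '{{json .}}' | lines | each { |row| $row | from json }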


Its dataframe support (based on Apache Arrow and the Polars Rust library) looks very interesting. https://www.nushell.sh/book/dataframes.html


Yes, it really should be called Polarshell.


I still believe that [xonsh](https://xon.sh/) is the best of its kind, because I do not want to remember yet another language's syntax


Is there something like this but with Lua?



Just was trying it. A couple minutes in, I discovered it doesn't support suspended jobs.

:( seems like that would be a very basic feature.

https://github.com/nushell/nushell/issues/1329

https://github.com/nushell/nushell/issues/1796

https://github.com/nushell/nushell/discussions/5239


In the past, I've toured these alternate, non-POSIX shells like Nushell. A lot of them (e.g. Powershell, Elvish) don't provide job control. I looked into how job control works and it's kind of a bother, so I see why they might have elided it. I wonder if multiplexing the terminal using tmux or screen is a good enough alternative to job control for many use-cases. You do lose state (i.e. environment variables, working directory), but if all you want to do is run something else maybe it's good enough.

I personally haven't tried living without job control though.


Sorry, but you're incorrect here. PowerShell does most definitely provide job control.

    Get-Help about_Jobs
Alternatively, online documentation from Microsoft regarding jobs: https://learn.microsoft.com/en-us/powershell/module/microsof...


I think you must be unfamiliar with job control, but I could be wrong. How do you Ctrl+Z a foreground job, maybe several, and then switch between them?

All of what “powershell” provides depends on knowing you want to background something ahead of time, and even then I don’t think you can make it the foreground task (maybe Wait-Job comes close). There is even a GitHub issue on it, which I’m afraid I can’t be bothered to go find again.


My data point: I prefer tmux/multiplexing over job control because the process hierarchy leads me to manage them better (I never accidentally quit tmux). Also I don't have to worry about std stream usage. I'm actually not sure of a case where I'd want job control.


The problem with tmux is that I often start a long process without really realizing it, then Ctrl+Z and `bg` to continue working with the shell. With tmux, I'd need to either realize up front I'm about to start a long-living thing, or lose the context of the current shell to continue my interactive work.


I also use tmux for this. Personally, since starting to use tmux years ago, I've never even thought of wanting job control in an interactive shell process.


You might be interested in Cat9, https://github.com/letoram/cat9

It's another reimagination of a shell, but built around asynchronous jobs


Does anyone have an ELI5 for what this, BASH, ZSH, Powershell etc do? It seems it's a conflation of #1: A CLI. #2: A specialized programming language that mainly does file-system operations. Is there a compelling reason to use something like this (or Powershell's scripting system, from the comments here), vice Python or Rust's etc standard library filesystem tools?

Context: I use Windows Terminal with embedded powershell on Windows, and Terminal on Linux. Useful for compiling programs, running Python scripts, and executing CLI-based programs (eg clippy, ipython, rust, various programming-language-specific formatters etc, some CLI-based text editors like VIM). Also, troubleshooting Linux problems.


Shells are a combination of programming language, user interface, and standard library. The programming language and stdlib aspects are just like Python or Rust, but with a different focus:

- Terseness. I can very realistically type hundreds, if not thousands, of commands per day. Extra words or punctuation marks add up quickly.

- Shells are focused on executing things right now, as opposed to most other languages where the output is an abstract program to be run multiple times. This means using literals more often than variables, for example, which is why unquoted words in Bash are strings (except the first word of a command, because terseness is more important than consistency).

- They have more interactive features, such as warnings and prompts. Powershell detects if you're missing required params and prompts you one by one, for example.

- The purpose of a shell is to enable interactivity with OS primitives, be they Linux's files/processes or Windows' objects.

- Because most commands are typed interactively and only run once, glue code is especially painful and wasteful to write. So these languages pick a simple data structure for all function communication, and stick with it (e.g. Powershell's typed objects, Bash's newline separated strings, Lua's everything-is-a-table).

- Quality-of-life features. For example, Bash aliases (e.g. `alias wget='wget -c '`) are a pretty good idea that is non-trivial in other programming languages.

- Easy to plug stuff into other stuff. I don't mean just piping, but also things like temporary files via Bash's process substitution `<()`.


Very clear explanation; thank you!


A shell is pretty much what you described. More accurately, it’s a program for interacting with the operating system, and it usually ships with a scripting language and utility programs. Hence the conflation.

If you’re using Windows Terminal, you’re using a sort of predecessor to Powershell (somebody please correct me on this); if you’re using a terminal on Linux, you’re most likely using bash. Enter ps -p $$ at your Linux terminal to find out which shell you’re using.

> Useful for compiling programs, running Python scripts, and executing CLI-based programs

Yeah, so from here, it’s possible to automate those commands, if you wanted to. In Windows you’re talking about .bat or .ps1 files. Shells are the language you use to write those scripts, and they typically ship with access to some useful (file/text oriented) commands.

The only problem is these shells were invented in the '80s and '90s, and have awful conventions that make some people miserable enough that they go and write a new shell that tosses previous conventions out the window. And IMO they did a great job.


> If you’re using Windows Terminal, you’re using a sort of predecessor to Powershell (somebody please correct me on this)

Not necessarily. Windows Terminal is just a terminal emulator; you can run any shell in it: cmd.exe, PowerShell, Nushell, bash, fish...


And what a great Terminal emulator it is.


Agreed; it’s a massive step forward for Windows and the team behind it is great.


It blows my mind that they have a proper team behind WT now (for a while now), and it has already (in my mind) surpassed macOS Terminal (unchanged for years) and most Linux terminal emulators.

If Microsoft can just get a proper team behind the other tech...


Oh shit. I’m thinking of command prompt.


Appreciate those details! And as ripley said, Windows Terminal is a housing for shells like Powershell or various WSL OSes, but has tabs.


Yes, scripting languages all support both:

* A REPL for performing common operations (moving around the file system, running programs, etc.)

* An interpreted environment for scripting these operations.

The main reason to use these instead of Python/Rust is the ease of interfacing with other programs.

In Python, and most other non-scripting languages, spawning and interacting with external processes is quite verbose.


It's verbose in Rust too: error handling, the various ways to handle the program's stdout/stderr, etc.


One issue I have with nushell is that on macOS it uses the `Library/Application Support` path for the config instead of `.config`, making it awkward. Their reason is that the former is the correct path for application data, but none of the other CLI apps I use do this; every other app just points to the `$HOME` dir or the `.config` dir.

I used it for a while but I switched back to zsh; maybe in the future when it has a stable release I'll try it again. It was too much hassle to update the config on almost every release. It also didn't work well with some utils I use that have shell-specific initialisation (e.g. rbenv).

Otherwise nushell seems to be a good contender for an alternative shell.


it is the correct path for application data. Configs go in ~/.config on Linux, ~/Library/Application Support on Mac, and ~\AppData\Roaming on Windows; some other tools doing the wrong thing isn't a reason for nushell to do the wrong thing, and your preference for nushell doing the wrong thing shouldn't mean I have to suffer it. If you want it to be visible in your home dir, you can easily put a symlink there.


When you work on multiple OSes (mainly macOS and Linux), having your dot files normalized is IMHO the best experience. Yes, I have symlinked it, but if I need to set up a new Mac (which does not happen often) I could forget about it, and it takes an extra step to set up. As I said, that's just my preference.

Also, the ~/Library/Application Support directory only works well for desktop apps, where it's easy to get rid of the configuration of an app you want to uninstall (I use AppCleaner). But terminal apps are most likely installed via curl|bash or brew, so there's no point in using that directory IMHO. It feels to me like the maintainers are not Mac users, so they don't care about improving that; for me, it's a pain point.

But that does not take away from the merit of nushell; of all the other new ones (fish, oil, etc.) I think it's the one I enjoyed the most.


iterm2 puts config in ~/.config, while also having a directory in ~/Library/Application\ Support/ (habit). The author of iTerm has more credibility than the person you're replying to imho.

If a program is putting text files a user is supposed to edit in that directory, that's a mistake. If the application doesn't live in ~/Applications and provide a GUI for editing configuration, the file belongs in ~/.config and the app should follow XDG standards in general.


The XDG standard is a Linux standard, for compatibility between Linux distributions. Writing to ${XDG_CONFIG_HOME:-~/.config} on a non-Linux platform is like writing to ${APPDATA:-~/AppData/Roaming} on a non-Windows platform. I will never understand where the habit of assuming Linux standards are universal standards comes from.

There is nothing in Apple's documentation about Application Support being incorrect for CLI applications. It is an ex post facto justification for apps that do the wrong thing on Linux too and don't want to go to the effort. They are almost guaranteed to do the same thing on Windows and not even go to the effort of setting the folder as hidden.


That's just, like, your opinion, man.

It's also objectively Wrong, the X stands for X Windows, what you are referring to as Linux is actually ABunchOfStuff/Linux, or as I've recently started calling it, ABunchOfStuff+Linux. macOS has an X server, still, to this day.

It's also pragmatically wrong, there are more than a dozen directories in my ~/.config directory. Be the old man yelling at the cloud all you want, or, maybe, follow the available standard like other sane, polite programs.

The nice thing about XDG? Just point XDG_CONFIG_HOME and XDG_DATA_HOME at Application\ Support if you want stuff to show up there. A program which doesn't check for those flags is not supporting the standard, file a bug.

So, to repeat myself: use XDG. That's what it's there for.


And the D stands for Desktop, and no platforms other than GNU/Linux, or as I've taken to calling it, Linux, use X for the DE. It is a Linux standard. Specifically, the role the XDG spec plays on Linux is the role that the official documentation plays on Mac and Windows: to define where well-behaved programs should place their files. The consequences of ignoring it on Linux are the same as ignoring the official guidelines on Mac or Windows, as is the Correctness(tm). If you ignore it on Linux, your program is not well-behaved, and if you follow it on Mac and ignore Apple's official guidelines, your program is not well-behaved. Pointing out many instances of bad behavior does not make behavior less bad.

This specifically breaks down on Windows. The purpose of sticking dotfolders in $HOME is because you can't see them so they don't clutter your user folder, but they're also not hard to find. One of three things must be true:

1. The universal value is in them being hidden, so a program should go out of its way to hide them on Windows, and thus have platform specific code, despite not having a platform specific directory.

2. The universal value is in them not requiring platform specific code (other than maybe checking $USERPROFILE instead of $HOME), so a program should not bother hiding them on Windows, despite this resulting in a user folder with more junk folders than meaningful folders.

3. The whole concept doesn't translate very well to Windows, being a Linux standard designed for Linux. Windows software should be written to conform to Windows standards.

The answer is very clearly 3, and it is no great leap from there to notice that Mac is the same way. It is also no great leap to notice that not a single program that stores stuff in ~/.config on Mac pays the least bit of notice to XDG variables. Your declaration of it being a de facto standard falls completely flat on that point alone: well written software hardcodes ~/Library unless XDG_CONFIG_HOME is defined, poorly written software unconditionally hardcodes ~/.config, no Mac software that I have ever seen hardcodes ~/.config but listens to $XDG_CONFIG_HOME. So it is not standard outside of Linux. What is standard for Linux software, especially cross-platform ports of Linux software, is total abject laziness on the subject of well-behavedness.


surely something as basic as the location of the config folder can somehow be changed to a custom path, right?


Configuring where configuration data lives can be tricky. (Chicken/egg problem.)

But you can certainly create your own symlink.


Yeah, the issue is mainly if you set up a new machine with your dot files, you need to remember to set up the symlink, and because you don't do that so often it's easy to forget. Also I pointed that out because I share my dot files between Mac and Linux, and it's certainly annoying to have two paths for both.


>This is just an example from YAML to JSON, but you can convert much more formats into other formats.

>open dev/home-impermanence/tests/impermanence.yml | to json

I can do it with PowerShell:

$os_list = (Get-Content -Path "C:\temp\operating-systems.yml" | ConvertFrom-Yaml)

Set-Content -Path "C:\temp\PowerShell_operating-systems.json" -Value ($os_list | ConvertTo-Json)


That is a great example, but for different reasons. I think PowerShell is amazing, but it never feels like something I want to work in as a shell. Writing scripts in an IDE, sure. But to convert some yaml to json I’d much rather type open file.yml | to json


This is where I’m at with PS as well. It’s stuck in this no man’s land between bash and python. It’s a better scripting language than bash and a better shell than python. But when compared to their main purposes it’s a worse shell than bash and a worse scripting language than python.

The caveat of course is Windows system administration. I’m sure it’s excellent in that domain.


I end up using C# as a scripting language, the GUI on Windows, and ZSH on Linux Servers…


OP here wrote this in just about the most verbose way possible. Most commands are pre-aliased in PowerShell and arguments are fuzzy matched as long as they're not ambiguous.

For example:

   Remove-Item $directory -Recurse -Force
   rm $directory -r -fo

For this specific example ConvertFrom-Yaml actually doesn't exist as a standard cmdlet and ConvertTo-Json isn't aliased by default:

    gc ./something.yml | ConvertFrom-Yaml | ConvertTo-Json > /some/file.json


With any shell:

    $ brew install yq
    $ yq -o json . < config.yml


This... yq eats yaml. And this is all I need to know (really). I don't know yaml. I don't WANT to know yaml.

Two letter command which I can remember, in the same way that I know jq. And, as a short-form jo. I would not use jo to eat json. I do think that yq is "misnamed" slightly, in that there is no yo command.

(note that I do have yq installed, but not nushell or PowerShell).

shell is glue, not oop -- json is an object notation, converted to stringyness. Which makes for oop->shell. Since "objects" are programmer-think, this should not be the primary command interface. People (not programmers) recognize this, and thus PowerShell is not used by "non-programmers". Ok, you want something above bash (sh, shell), for more programming structure? This is why the #! construct in exec() is so important. That lets you use... um awk, sh, (and should support PowerShell -- not sure). Even C (with tiny-c). I would go with javascript, myself, because it fits with json, and thus jq and jo in the eco-system.

Now, for a criticism of yq -- jq is 30K for a front-end and 350K (or so) for a library. jo is 120K, yq is (gulp) 12MB. Ok, I have the disk space. And it is GO, so ok. Compared with the other GO programs I use commonly: minio (100MB), mcli (mc, 20MB), doctl (20MB), it isn't bad at all. But, I guess that is what GO demands...

Can I teach "functional shell"? Frankly, no. I can get through simple shell in a semester.


I mean, yeah? As much as I hate to admit it (and believe me, I hate to admit it), PowerShell is better than bash at composing tools made by all sorts of people on the fly in easy to write pipelines. I have my issues with it as a language, but it's not like I'm thrilled to write bash. Nushell is an attempt at making a scripting language which builds on bash, rather than supplanting it. PowerShell is much less nice on Linux, especially when you'd have to translate every existing resource.


Yeah, PowerShell eliminated the output parsing overhead which was the significant portion of any composed shell operation. That's where all shells need to be heading I believe.


Probably better to write this with the standard aliases so people don't complain about verbosity:

  gc ./something.yml | ConvertFrom-Yaml | ConvertTo-Json > /some/file.json
Also I don't think ConvertFrom-Yaml actually exists as a standard cmdlet.


Powershell is also quite anti-semantic, non-portable and verbose


> anti-semantic

Why?

> non-portable

PowerShell 6 and later run on Windows, macOS, and Linux.

> verbose

This is by design.


>Why?

Case insensitive, weird parameter passing, no Posix-like interface.

>Windows, macOS, and Linux

Yet it is still mostly suitable for Windows, where it makes sense with weird (read: mostly bad and outdated) platform-wide proprietary decisions.

>This is by design

Well, humans should not spend too much focus on verbosity, maybe I'm wrong, at least that's what I've learnt from a university HCI course.


> Case insensitive, weird parameter passing, no Posix-like interface

Parameters are passed like any other shell?

> Well, humans should not spend too much focus on verbosity, maybe I'm wrong, at least that's what I've learnt from a university HCI course.

While it's verbose by design, everything comes aliased and arguments are fuzzy-matched. As a human you don't need to spend time on verbosity if you don't want to.

For example:

   Remove-Item $directory -Recurse -Force

   rm $directory -r -fo


> > anti-semantic

> Why?

I can't speak to the other poster, but I found the PowerShell example misleading because it required the presence of a variable that the Nushell one did not. I had to read through the code three times to understand:

- Whether the code was modifying some fundamental environment variable that would affect the running of other programs?

- If the variable would need to be accessed after the command was run?

- Why the variable was named `os_list`? This was admittedly stupidity on my part, but it wasn't even a concern with Nushell.

> > non-portable

> PowerShell 6 and later run on Windows, macOS, and Linux.

PowerShell is absolutely multi-platform and that should never be held against it. However, my personal experience has been that most PowerShell guides assume that you are running Windows and will attempt to call commands not available on other platforms (e.g. OS services, other applications). Granted, scripting tutorials for Linux and OSX also assume the presence of certain services, though it's easier to search my path for `7z` than figure out whether the Get-Acme-Version command is available. Nonetheless, the issue is more of documentation than implementation.

> > verbose

> This is by design

As for the verbosity, I'll fully agree that PowerShell is highly verbose by design. However, that's a design decision that makes it inappropriate for some applications. An aircraft carrier is larger than a kayak "by design", but that doesn't make it any easier to strap to the top of your car.


> I can't speak to the other poster, but I found the PowerShell example misleading because it required the presence of a variable that the Nushell one did not.

This was just a choice of OP and also isn’t a concern in PowerShell:

  gc ./some.yml | ConvertFrom-Yaml | ConvertTo-Json > some.json


Lol so what happens if I run 'curl' in my PS6 script? Asking for a friend with PTSD.


Assuming you have curl installed, it'll say

  curl: try 'curl --help' for more information
I assume you're thinking of the old Windows PowerShell that used to have curl/wget aliased to its own commands, but that's not the case with the new cross-platform PowerShell.


It's not verbose. It's just that people who use it like to share the verbose forms, as they are more helpful to others.


When I'm using PowerShell as a shell, I use all the aliases and shortcuts.

When I'm using PowerShell as a scripting environment, I use the full name for every command and parameter, and even include the names of positional parameters.

I love that it can do both.


I thought the original Nushell was written by Jonathan Turner.

Is this different? Some attribution in the article would help.

(Thanks for pointing that out; there are links to the original in the article)

https://www.nushell.sh/


> I packaged it for OpenBSD, so it's available on -current (and will be in releases after 7.3 is out), the port could be used on 7.2 with no effort.

I think this is what makes it sound like the author of the article is the author of nushell. My first reading led me to believe the same, but she is "only" responsible for the OpenBSD package.


oh, I'm indeed not the author of nushell, I'll try to make it more clear!


It doesn't seem so. While not immediately clear, this article doesn't seem to be claiming ownership of the project. It is a third-party review/introduction. They link to the project after the first section.


I get the impression that she packaged it for OpenBSD and wrote this as an intro to the shell for others. The 'written by Solene' bit reads initially as though she's claiming to have written Nushell, but when reading the text, it's pretty clear she didn't:

> With nushell, it feels like I finally have a better tool to create more reliable, robust, portable and faster command pipelines. The learning curve didn't feel too hard, but maybe it's because I'm already used to functional programming.


Previous discussions about Nushell (https://hn.algolia.com/?q=Nushell), for example:

https://news.ycombinator.com/item?id=27525031 (763 points | 1 year ago | 398 comments)

https://news.ycombinator.com/item?id=20783006 (1477 points | 3 years ago | 366 comments)


Thanks! Macroexpanded:

Nushell 0.60 Released - https://news.ycombinator.com/item?id=30772042 - March 2022 (40 comments)

Nushell: A New Kind of Shell - https://news.ycombinator.com/item?id=30718349 - March 2022 (8 comments)

Nushell – a new type of shell written in Rust - https://news.ycombinator.com/item?id=28968966 - Oct 2021 (3 comments)

Nushell 0.38 - https://news.ycombinator.com/item?id=28841901 - Oct 2021 (1 comment)

GitHub – nushell/nushell: A new type of shell - https://news.ycombinator.com/item?id=27525031 - June 2021 (398 comments)

Nushell: A New Type of Shell - https://news.ycombinator.com/item?id=26712741 - April 2021 (2 comments)

Cbsh – a couchbase shell in rust on top of nushell - https://news.ycombinator.com/item?id=24767244 - Oct 2020 (4 comments)

One Year of Nushell - https://news.ycombinator.com/item?id=24259914 - Aug 2020 (47 comments)

Nushell – A New Type of Shell - https://news.ycombinator.com/item?id=22880320 - April 2020 (37 comments)

Nushell – a modern shell written in Rust - https://news.ycombinator.com/item?id=20845584 - Aug 2019 (4 comments)

Introducing nushell - https://news.ycombinator.com/item?id=20783006 - Aug 2019 (366 comments)


How did I not know about this? I will never parse text in Bash again.


But... Bash (sh) is not meant to parse text!

Bash (sh) is glue. awk can parse text. As can SNOBOL4 if you REALLY want to go crazy. Or sed. Math is done by expr, dc and bc.


'In a way, Nu is a way of saying “what if we didn’t need tools like awk so often?” Since you’re working with structured data, as we add more support for file types, it’s less often you need to reach for “awk”, “jq”, “grep”, and the array of other tools to open and work with common file types.'

From this interview: https://www.notamonadtutorial.com/nushell-the-shell-where-tr...


Please correct me if I'm wrong, but doesn't nushell suffer from the same problem as PowerShell, that all the nice fancy stuff works only for in-process commands and external programs are a bit second-class citizens?

To me the interesting question is whether it's even possible to build a rich structure-aware shell-like CLI environment that allows seamless integration of external polyglot programs, in the same vein as Unix shells, where only a minimal amount of stuff is required to be builtins (/bin/[ is a cute example).

It is a difficult problem; at minimum it probably requires all compatible software to be explicitly made so. But for the best experience, even the basic Unix fundamentals of stdin/stdout/stderr + argv/env might need re-evaluation.


nushell seems to have a better way of parsing the string-only outputs of other commands.

https://www.nushell.sh/book/commands/parse.html

I don't know how to do a similar thing in powershell.

> To me interesting question is that is it even possible to build rich strucure-aware shell-like cli environment that would allow seamless integration of external polyglot programs...

The problem is that most classic Unix programs output in some messy format. It would be nice if we could agree that each command has a --json flag (for example) which makes the output trivial to parse programmatically in any other advanced shell.
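
A few tools already do this; e.g. iproute2's ip accepts -json, which composes nicely with a structured shell today (a Linux-only sketch, in Nushell):

    > ip -json addr | from json | select ifname mtu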


> It would be nice if we could agree that each command has --json flag (for example) which will make the output trivial to parse programmatically in any other advanced shell.

That's part of the solution, but I think a signaling layer of sorts is needed too so that the reading side knows that the data is structured instead of just bytes. So basically instead of needing explicit

    > foo --json | from json | ...
You could have

    > foo --json | ...
From there it would then be easy to have an env var or something instead of an explicit cmdline flag, to eventually end up with the (almost) seamless

    > foo | ...
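
On the tool side, such a convention could be as simple as this sh sketch (SHELL_ACCEPTS is an invented name, and emit_json/emit_text are placeholders for the tool's own output paths):

    # hypothetical: emit structured output when the calling shell asks for it
    if [ "$SHELL_ACCEPTS" = "json" ]; then
        emit_json
    else
        emit_text
    fi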


PowerShell's Select-String has something similar to nu's parse. See https://devblogs.microsoft.com/powershell/parsing-text-with-...

There is also the very powerful ConvertFrom-StringData:

https://learn.microsoft.com/en-us/powershell/module/microsof...
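
A quick sketch of both (regex named captures via Select-String, and key=value lines via ConvertFrom-StringData):

    # named capture groups, roughly analogous to nu's parse
    'alice 42' | Select-String '(?<user>\w+) (?<score>\d+)' |
        ForEach-Object { $_.Matches[0].Groups['user'].Value }

    # key=value lines into a hashtable
    "user=alice`nhost=example.org" | ConvertFrom-StringData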


> nushell seems to have better way of parsing the string-only outputs of other commands.

Yup. In addition to that `parse` command, Nu also has a suite of `from` commands that trivially convert other data formats to Nu tables. CSV, JSON, TOML, XML, YAML, and many more.

So if you're working with a program that can emit, say, JSON, you just do `some_external_command | from json` and boom, you've got a Nu table.
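
If I remember right, Nu's `open` even picks the converter from the file extension, so for files you can often skip the explicit `from`:

    > open Cargo.toml | get package.name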


Parsing JSON in powershell is not that bad either, `ConvertFrom-Json` pretty much does what you need. AFAIK there is nothing as convenient as nu's `parse`
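
e.g., a minimal sketch (assuming some.json is an object with a name field):

    Get-Content -Raw ./some.json | ConvertFrom-Json | Select-Object name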


Has anybody switched to any newer shell from Fish?


The output being in tables is nice and being able to act on it as a table is also pretty nice. At first glance, it seems it's well documented as well. I don't know that I would switch but I'll definitely read up on it.


Should have been called nutshell


That's what I read it as and was a bit disappointed when it turned out not to be the case.


same


Nushell has the huge advantage of being searchable, which I think is really important for new projects where you need to look up docs often.

It is cumbersome to always add a postfix: rust (lang), go (lang), etc...


Nushell looks pretty good!

But, how to automatically convert all existing shell scripts -- to Nushell?

I think it would be highly interesting if someone created "an LLVM for shell scripts"...

In other words -- given a shell script that runs on one shell, turn that into a parse tree -- then transpile that parse tree back into a script such that the transpiled script runs on a different shell -- with guaranteed full compatibility...
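
To illustrate the kind of mapping such a tool would need (a hand-written sketch, not the output of any existing transpiler):

    # bash: everything is text, so filter by column position
    ps aux | awk '$3 > 50 {print $11}'

    # a rough Nushell equivalent: filter by named column
    ps | where cpu > 50 | get name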

Maybe some work in this area is being done, or already has been done...

I'd be interested if anyone knew of anything like this...

If so -- then links would be appreciated!


Calling things “new something” is usually a bad idea. It might be new now, but it won’t stay new for long. And then it becomes a lie.

“Newcastle”, “Newport”, “The New iPad” (which was later renamed to a much more logical 3.0)


Nushell is PowerShell done right.


It felt less powerful than PowerShell to me, and didn’t feel like it brought anything different to the table that would warrant switching. It also has far less tooling and plugin support.


PowerShell is powerful because of its deep integration with the .NET ecosystem and, by extension, the OS.

This new shell lacks that, I agree. But I think it's a way harder problem to solve on linux.


What are the features that are done differently, or in other words, why would I choose Nushell over PowerShell?


I'm one of the Nushell developers, and one of the main reasons I joined the project was that I couldn't find a cross-platform shell I was happy with.

PowerShell has a _lot_ of baggage. I don't like a lot of their design decisions (ex: the super verbose naming, the continue-on-error default that makes scripting a pain), and it's unlikely that those decisions will ever be revisited.

I tried pretty seriously to use PowerShell as my daily driver on Linux and macOS 2 years ago and was disappointed. Too many rough edges and Windows-first features.

PowerShell's startup performance is also pretty rough; I was seeing startup times around 2s after some minor profile customization.

I think Nushell addresses all of those points well; it's got nice concise syntax, solid support for Mac/Windows/Linux, and instant startup times. On the other hand, Nushell's less polished+stable than PowerShell; we're still making breaking changes to the syntax+commands.


> the super verbose naming

Aren’t long names plus short aliases the pragmatic way of solving the naming problem? “ls” or “dir” is perfect when typing interactively, but in a long program I don’t mind typing something longer out.
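
And you can define your own short names for the verbose ones, e.g. in your profile (rmi is just an invented short name):

    # shorten a verbose cmdlet for interactive use
    Set-Alias -Name rmi -Value Remove-Item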

Startup perf is definitely an issue but otoh that doesn’t feel like a design decision that couldn’t be fixed.


But then you have 2 names to remember.

PowerShell's approach to naming is divisive. I'm not saying it's objectively bad, but I'm in the camp that doesn't like it.


Error messages are great, autocomplete works out of the box, and commands ArentCamelCaseMonstrocitiesOnlyMicrosoft --CouldLove.


> autocomplete works out of the box

> Error messages are great

Some improvements in pwsh would be nice here, I agree, but they have moved away from the more verbose error messages to a more concise view. I'd personally prefer something like Python 3.11's new error messages in PowerShell.

> autocomplete works out of the box

Not sure what you mean, autocomplete works out of the box with PowerShell as well. It can even auto complete object properties and methods. You can also adjust the auto completion method to be like the shell you desire; i.e. bash/zsh modes.

> and commands ArentCamelCaseMonstrocitiesOnlyMicrosoft --CouldLove

Sure, the builtin ones use the `Verb-Noun` syntax, but you don't need to follow this if you don't want to. You can certainly use snake_case if that's your personal preference.


> ArentCamelCaseMonstrocitiesOnlyMicrosoft --CouldLove

Personally, CamelCase makes more sense than everythinglowercasesmushedinto -onetwothreefourparametersSometimesCapitalised and --sometimesdoubledashed and -Isometimesspacedoesn'tmatter -and --sometimes - "dashlivesalone" --AND -o -- "twodasheslivealone".


Not having used nushell, is that what it is like?


nushell is like most POSIX shells - terse.

   ls | where type == file


The equivalent PowerShell for that is...

    ls -file


You don't.

The good thing is you have an alternative in case you need it. Performance might differ in specific cases.


Well, they were replying to a post that said this is powershell done right. Seems very relevant to ask what the done right part is.


Anybody can say anything; that doesn't mean it should be taken seriously.

"Done right" is probably that it is not done by MS and that it is developed in Rust. That ignores universe of other aspects.


Yes, let's go with your guess instead of trying to get an answer.


Indeed :) I have seen all the answers.


IDK, my first reaction was that it was PowerShell except worse.


Yeah, let's live in Hollywood movies where a decades-old project is replaced overnight with the right solution :)


I tried nushell for a few days, but sh syntax is just too ingrained... It slowed me down a bunch and couldn't get over the initial learning curve. Ended up going back to zsh. Also, losing a bunch of completions for tools I used was a bummer. I wish nushell could parse bash/zsh completion scripts or something to help ease the transition.

Maybe I'll revisit it at some point in the future.


It took me about a decade after fish came out for me to finally switch from bash. Nu looks like an improvement over fish, but I expect it will also take me about a decade to bite the bullet and use it. :P



It'd be useful if there was some site detailing and benchmarking the size, overhead, performance etc. of the various popular shells on different systems. This looks interesting but the binaries seem rather large relative to bash. For example, what do the developers view as the minimal system parameters needed to avoid annoying latency?


It looks promising, but why does it flicker every time I input a key? Using WSL & Terminal.


We (Nushell core team) think it's a Windows Terminal bug, please upvote the issue: https://github.com/microsoft/terminal/issues/13710


Open an issue?


Looks pretty pragmatic and readable


“Ask and ye shell receive”: I swear I was just complaining about bash, zsh and the lot the other day saying that we needed a modern alternative shell without ridiculous names and design choices. Installing this tonight! So excited!


Hope you enjoy it! The Nushell Discord is pretty active if you have any questions: https://discord.gg/NtAbbGn


Seems like a missed naming opportunity, it should've been "Nutshell" :D


I used nushell's data frame support a year ago to do some data munging. It was a pretty pleasant experience. That was before I learned pandas, though, so I don't know if I'd reach for it again.


> In a nutshell, nushell is non-POSIX shell, so most of your regular shells knowledge (zsh, bash, ksh, etc…) can't be applied on it, and using it feels like doing functional programming.

Is there a reason some shells like fish (and now nushell) are non-POSIX? What is the benefit? I really like fish but I've kept away from it because it's non-POSIX and won't necessarily play well with years of POSIX scripts and knowledge I have.


Is there a reason every PL designer in the world didn't just make a new implementation of Pascal rather than design Python or Java or Rust? :)


> Is there a reason some shells like fish (and now nushell) are non-POSIX?

Some things are different than other things. It's the same reason that not all desserts are pie, and not all automobiles are trucks.


I guess I was more curious about the cost/benefit. Given a well defined standard like POSIX, and the difficulty of swapping in a new shell into Windows, I wasn't seeing why you'd buck the well established trend.


fish = "friendly INTERACTIVE shell"

Fish is my daily driver. I use it because it makes things I do regularly either easier, more memorable, or less error prone.

The primary use for fish is not scripting, afaik.

From a quick glance, nushell, like PowerShell, is strong at working with pipelines of not-just-text data. That's better for your scripting/automation needs.

[Rant: Afaik, POSIX standardized what people did and agreed on at one point in time, warts and all.

For example, whitespace is a nightmare... in sh, var=$value vs var="$value"; look at warts like "$@" and the ever-present annoyance of handling paths with whitespace. I'm not even going to go look to see which of these are mandated by POSIX. Multi-OS scripts are different; doing something like Oil (oilshell.org) or PowerShell makes a lot of sense... ]
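
To make the "$@" wart concrete, a tiny sh sketch:

    # "$@" preserves each argument intact; bare $@ (or $*) re-splits on whitespace
    for arg in "$@"; do printf '%s\n' "$arg"; done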


Again, it's just because sometimes things are different. That's why I used pies and trucks as an analogy. Both pies and trucks are well established, serve their needs well, have plenty of variety, and room for improvement and innovation. That does not mean all work on desserts and automobiles should contain crusts and beds. Those are design decisions that serve some purposes very well, but not everything.

Most importantly: programming, like baking and automobile design, is an artistic craft, not just a solution to real world problems. We as a society already have well established solutions to tasty treats, transportation, and interacting with a computer. Trying something new will not harm anything, but there is a chance it may revolutionize everything. If it meets the authors requirements, while satisfying their creative urge, then the cost is well worth the benefit. Even if they are the only ones to ever use it.

That said, if you are asking why is it worth it for you or anyone else to invest their time into learning how to use it, that is a very different question, and I have no idea. I will not be using this any time soon, but I wish them luck, and I am glad that they are trying something new.


Benefit is a better syntax...?


On Windows, a command interpreter is just a console application like any other console application: you just run and use it, no "swapping" involved.


So is Unix.

xterm -e /usr/games/nethack

xterm -e /bin/sh


Correct me if I’m wrong, but doesn’t POSIX dictate a text stream as STDIN/STDOUT?

If my understanding is correct, this basically rules out any attempts to create richer CLI pipelines that necessitate other kinds of data interchange. IIRC this was one of the motivating factors in the design of PowerShell, though I don't know PS well enough to say if it's POSIX compliant or not.


Because POSIX shell syntax is an unreadable and dangerous mess.


I found Nushell to be too buggy for day to day use, but I sure am looking forward to the day it's stable!


I would kill for a different shell language but I'm so deep into ohmyzsh it's hard to switch


A comparison to fish (shell) would be useful too.


finally. a unix powershell. everything is an object.


Do we need a new shell or a few utilities that help produce structured output? For example, `ls | gen_structure | limit 50 | head 5`


Eww. This is basically screen-scraping command output to enable structure-oriented commands. That means you need a scraper for every output, the scrapers need to be aware of all the options, and they have to handle filenames with whitespace correctly (not sure that's even possible in general).

And once you've done that, you have this new batch of commands that are structure-oriented, so you are using bash (for example) as a boot loader for your new shell. Why not just use the new shell?

I have a project in this space: https://marceltheshell.org. It is based on Python, so Python values (strings, numbers, tuples, lists, etc.) are piped between commands. No sublanguages (as in awk); instead you write Python functions.

Marcel has an ls command, but instead of producing a list of strings, it generates a stream of File objects which can be piped downstream to other commands that operate on Files.

Example: sum of file sizes under directory foobar:

    ls -f foobar | (f: f.size) | red +
-f means files only. In parens is a Python function (you can omit "lambda") that maps a file to its size. red + means reduce using addition, i.e., add up all the file sizes.

Or to compute size by file extension:

    ls -f foobar | (f: (f.suffix, f.size)) | red . +
f.suffix is the extension. "red . +" means to group by the first field of the tuple (the extension) and add up the sizes within each group.



That's an interesting project to attempt. An issue to consider is that OS pipes can only pass plain bytes, so every command would have to import/export structured text, which may impact performance and limit capabilities compared to a shell that can pipe objects in-process.


The structure can be done with ASCII: csv and tsv manage to pass data in plain text (the latter while remaining readable)

Relational data would have the extra advantage over json-like property-based data of being both easier to present on screen (tables) and to interact with (SQL-like: where + order + limit would cover most use cases)
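
In Nushell terms that might look like this sketch (assuming a data.csv with name and size columns; depending on type inference, the size column may need an explicit cast to a number first):

    > open data.csv | where size > 100 | sort-by name | first 5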



