PSKoans: A simple, fun, and interactive way to learn the PowerShell language (github.com/vexx32)
135 points by thunderbong on April 19, 2023 | 53 comments



I'll share a bit of personal experience for those who hate PowerShell to maybe make them look at it from a new point of view.

As a devoted Unix fanatic, I used to hate PowerShell, because I initially came in with the expectation that it's going to be “just another POSIX-like shell”. After all, it's right in the name! And it uses dollar signs! So every little thing that differed from my unixey expectations made me angry. (Them using awful aliases like wget and curl didn't help. Seriously, what were they thinking?)

But once I realized that PowerShell is much closer to AWK and Perl, I kind of started to almost enjoy it. Don't get me wrong, it's not my favourite language or anything. But these days I think that it is a formidable scripting language with most of the stuff you'd normally need in place, at least in the recent versions.

One thing that I still dislike is the absence of something like the Unix manual pages. For some reason, help pages require additional installation, and the learn.microsoft.com search is so awful that it's often quicker to use DDG with "site:learn.microsoft.com".


> "Them using awful aliases like wget and curl didn't help. Seriously, what were they thinking?

Seriously, they were thinking "if people open our new shell, type a command they expect to work, and get an error, they will quit it and never come back. Let's make some aliases for common commands to give people a helping hand, so that `cd` and `dir` and `type` and `move` and so on from command prompt, and `echo` and `ls` and `cat` and `md` and `ps` and `mv` and so on from Unix shell, all do something useful. If people have to start right in with PowerShell-specific `gci` (Get-ChildItem) and `sl` (Set-Location) and `mi` (Move-Item), that will be too unfriendly and too steep a learning curve."

It was a great idea; if you're a Linux user you're well aware that the same command can behave differently on different systems (see: grep, egrep, fgrep, POSIX standards vs GNU utilities) and both aware and comfortable with customising your environment with startup scripts (removing aliases if you don't want them). On the other hand if they didn't do that, they were going to get grief from the Windows side (breaking backwards compatibility) and from the Linux side ('all shells must be POSIX'). Damned if they do, damned if they don't. At the time (~2006) PowerShell was Windows only and CURL was not a common Windows tool.


Besides, it's just faster to write "ls" than "Get-ChildItem". Linux command names come from a time when pressing keys on a keyboard was very expensive [1], so they're very economical. Nowadays we can press keys as much and as fast as we want, and shell command names just get longer and longer.

________________

[1] In the 1960's it cost 10 XP to press one key. 15 for the Return key.


PowerShell has its own alias, "gci", for Get-ChildItem, and even just "childitem" would automagically work (when a command isn't found, the engine retries with a "Get-" prefix).

> "shell command names just get longer and longer."

This is another rarely mentioned problem with CLIs; computers get bigger, software gets more complex, people want it to do more, and that means any command needs more context disambiguation and more typing to narrow down the more degrees of freedom. GUIs and TUIs can bring up related things to try and help you out.

btw. what is the benefit of Louise learning the Ackermann function with successor arithmetics? Ok the successor notation is structurally simple enough to 'learn' ... what does 'learning it' mean and how is that practical or useful?


Oooh, I remember now. You are talking about that discussion we had a while ago. OK.

Yeah, so,"learning" in that context means you're given a bunch of atoms as examples and you learn a logic programs that entails those atoms. "Atoms" here in the logical sense, of atomic formulae (not in the Prolog sense, where "atoms" are constants).

Ultimately, a sound and complete inductive algorithm like the one in Louise is capable of learning any program, given one or more example atoms it entails, but in practice some programs are harder to learn than others. This is more so if a program must essentially be executed before it can be learned. Which, for the Ackermann function, is a bit of a problem, given it's very expensive. So "learning" can't be done in a PAC-learning sense, of learning from randomly chosen examples, because then you end up with combinatorial explosion, and fast. So this "learning" I show in the Ackermann example is more like teaching: I chose, manually, an example that I know is informative and that the learner can use to learn.

Btw, I like necromancing old conversations too :D


>> PowerShell has its own alias, "gci" for get-childitem and just "childitem" would automagically work.

Yeah, I totally keep forgetting those and just use the unix-like aliases.

>> btw. what is the benefit of Louise learning the Ackermann function with successor arithmetics? Ok the successor notation is structurally simple enough to 'learn' ... what does 'learning it' mean and how is that practical or useful?

Holy unexpected question, Batman! :D

The point is that it much simplifies notation, which in turn simplifies the background knowledge. The alternative is to give is/2 as background knowledge, but then you also need to somehow specify all the arithmetic functions it can use, so it doesn't make for a simple example anymore (although I don't think the data/examples/ackermann.pl example is simple, given the flattening; I really need to sit my bum down and write a flattening/unflattening program transformation).

An alternative, while still keeping things simple, is the way it's done in the data/examples/even_odd.pl example, where I just hard-code the prev/1 predicate. But if you do it that way, you then need to extend prev/1 after learning, or the whole learned program, along with the background knowledge, is incomplete (it works only for the numbers in the interval [1,4]).

Generally, those data/examples/ files are meant to be simple examples and I tried to simplify the notation. One could certainly do a better job than what I've done, I think. But they're not supposed to be very useful, they're demonstrations.

I might have some time this month and the next to improve as much as I can in the documentation and the examples. But real-world applications will have to wait. I might have something in the works. Wish me luck :)


What is XP in this context?


Windows Experience Points? Sorry, it's all tongue-in-cheek.


> Seriously, they were thinking "if people open our new shell, type a command they expect to work, and get an error, they will quit it and never come back. Let's make some aliases for common commands to give people a helping hand, so that `cd` and `dir` and `type` and `move` and so on from command prompt, and `echo` and `ls` and `cat` and `md` and `ps` and `mv` and so on from Unix shell, all do something useful. If people have to start right in with PowerShell-specific `gci` (Get-ChildItem) and `sl` (Set-Location) and `mi` (Move-Item), that will be too unfriendly and too steep a learning curve."

There are lots of problems with this, not the least of which is that these aliases prevent you from using ports of those actual commands in your PowerShell session, even if you have them installed. And they are absolutely worlds apart from the commands they alias: pretty much totally incompatible. The differences between those PowerShell commands and their GNU equivalents aren't at all in the same ballpark as the differences between GNU coreutils and their counterparts on FreeBSD.

> customising your environment with startup scripts (removing aliases if you don't want them)

Doing this in a concise way is annoying as hell. And you can't even do it with a uniform interface, because some of those things are aliases and some of them are built-in functions, and those have to be disabled via different APIs. To avoid totally ruining your shell startup time, you also have to precompute a generated list of external programs to unfuck instead of just piping PowerShell commands to each other in your profile because PowerShell is insanely, insanely slow— especially on a Windows machine where half a dozen grubby little agents want to get their fingers into the filesystem's filters... which is how you're going to be using the damn thing.
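For reference, the two removal paths look something like this (a sketch; which names are aliases vs. functions varies by PowerShell version and platform):

```powershell
# Aliases live in the Alias: provider drive...
Remove-Item -Path Alias:ls, Alias:curl -Force -ErrorAction SilentlyContinue
# ...but mkdir is a function, so it has to go through the Function: drive instead:
Remove-Item -Path Function:mkdir -Force -ErrorAction SilentlyContinue
```

Two different drives, two different invocations, which is the "no uniform interface" complaint in miniature.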

I would way rather have a steeper learning curve then have all my muscle memory screwed over with no easy, performant way to toggle the disagreeable behavior. I'm fine typing `gci` instead of `ls` most of the time. When I type `ls`, it's because I want the damn `ls` I went out of my way to install!

> At the time (~2006) PowerShell was Windows only and CURL was not a common Windows tool.

Even among developers and sysadmins, Windows users hardly use the CLI. The difference in CLI usage at Microsoft shops vs. everywhere else is night and day. Let the Windows CLI users damn you, all six of them.

And of course there's already a sane solution to this, by the way: a command-not-found handler. All PowerShell has to do is tell you when you run `grep` and it's not present that there's this really neat tool built into PowerShell called 'Select-String'. And if you do have `grep` installed, it can stay out of your damn way.

As an aside: most of these issues have nothing to do with being POSIX. Most of those aliases are for external programs that have nothing to do with the compliance of the shell with POSIX. And lots of times, deliberately non-POSIX shells get them right where PowerShell doesn't.

FTR, I actually like PowerShell. I think it has made a huge contribution to the future of the command line, scripting, and shells in general. I'd much rather write a PowerShell script than a bash script, for anything important and non-interactive. But the things people will say in defense of PowerShell's very bad aspects drive me up a wall.


> "not the least of which is that these aliases prevent you from using ports of those actual commands in your PowerShell session, even if you have them installed."

No they don't. a) you can remove the aliases, b) you can fully specify the name e.g. "curl.exe" wouldn't trigger the alias, or "c:\path\to\curl" wouldn't either. I don't remove the aliases, so I don't know why "remove-item alias:* -force" is so bad. The functions are, what, mkdir and more?

> "And they are absolutely worlds apart from the commands they alias— pretty much totally incompatible."

This is a complaint that "all things must be exactly how I expect them" which is not going to happen in a completely different shell/language/environment.

> "And of course there's already a sane solution to this, by the way: a command-not-found handler."

And of course PowerShell has one:

https://learn.microsoft.com/en-us/dotnet/api/system.manageme...

https://stackoverflow.com/a/67453249/
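The hook from those links boils down to something like this (a sketch based on the linked API; the exact event-args shape is per the CommandLookupEventArgs docs):

```powershell
# Suggest Select-String when someone types 'grep' and it isn't installed.
# The normal "command not found" error still fires afterwards.
$ExecutionContext.InvokeCommand.CommandNotFoundAction = {
    param($CommandName, $EventArgs)
    if ($CommandName -eq 'grep') {
        Write-Warning "grep not found; the built-in Select-String (alias: sls) may do what you want."
    }
}
```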

> "All PowerShell has to do is tell you when you run `grep` and it's not present that there's this really neat tool built into PowerShell called 'Select-String'"

Yeah, no, command line utilities where I type "thing -h" and they go "-h doesn't exist. If you want help, type -help" suck. Thanks for nothing, technically correct but unhelpful nitpicking junk.[1]

What would you do if PowerShell had used "ls" and "grep" for its actual command names, not aliases, but they behaved differently from Unix shell `ls`?

[1] yes I am aware that it's not safe to type `-h` to random commands and expect it to be helpful and non destructive.


> No they don't. a) you can remove the aliases, b) you can fully specify the name e.g. "curl.exe"

I'm aware you can remove them, as my earlier comment clearly indicated by its description of how I remove them. Even in the example you give, adding '.exe' to the command name doubles the number of characters typed. Very ergonomic. They absolutely do prevent you from invoking those commands by their names, just like any ordinary obstacle prevents you from doing things until you remove it.

> I don't know why "remove-item alias:* -force" is so bad. The functions are, what, mkdir and more?

There are 16 I remove on my system, but more if you also want to remove the aliases that clobber DOS commands and other legacy Windows stuff.

> This is a complaint that "all things must be exactly how I expect them" which is not going to happen in a completely different shell/language/environment.

lol. Nope.

> > "All PowerShell has to do is tell you when you run `grep` and it's not present that there's this really neat tool built into PowerShell called 'Select-String'"

> Yeah, no, command line utilities where I type "thing -h" and they go "-h doesn't exist. If you want help, type -help" suck. Thanks for nothing, technically correct but unhelpful nitpicking junk.[1]

1. What you described is not what a command-not-found handler does. What you described is a shitty usage message.

2. What you described there is literally what actually happens if you run `mkdir -p`, `rm -rf`, `ls --help`, `ps -ef`, or `tee -a` in a vanilla PowerShell configuration. lmfao

And (2) is why aliases that clobber common external utilities with radically incompatible alternatives are only even potentially useful to people who (a) never use a command with any flags or options at all but (b) already 'know' how to use it, which are damn near non-overlapping groups.

> What would you do if PowerShell had used "ls" and "grep" for its actual command names, not aliases, but they behaved differently from Unix shell `ls`?

They sort of do this already with `mkdir`, which is a whole shell function and not an alias (and has to be removed differently). But if it was like, say, Get-ChildItem, and distributed as a built-in Cmdlet, it'd be harder to deal with. Truthfully, I would probably just not use PowerShell. But one could also rig up aliases for all of the GNU coreutils and other clobbered externals with a super short prefix, like ',' or something like that.
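A sketch of that prefix idea (hedged: a literal ',' prefix is awkward in practice because the comma is PowerShell's array operator, so this uses a 'g' prefix instead, like Homebrew's coreutils convention on macOS, and assumes the .exe ports are already on PATH):

```powershell
# Re-expose clobbered externals under a short prefix (gls, gcp, ...).
foreach ($name in 'ls', 'cp', 'mv', 'rm', 'cat') {
    $exe = Get-Command "$name.exe" -CommandType Application -ErrorAction SilentlyContinue |
        Select-Object -First 1
    if ($exe) { Set-Alias -Name "g$name" -Value $exe.Source -Scope Global }
}
```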

I don't really mind unconventional shells. I've been running a deliberately non-POSIX shell as my daily login shell on every system I touch for more than a decade, and some time in the next decade I'll probably switch to an even more exotic shell (inspired by PowerShell, even!) like Elvish or Nushell.

I even think it could be great for such a shell to bundle replacements for everything in GNU coreutils and more, so that scripts can be more portable. But they'd probably have to be clones of the GNU implementation for me to really get onboard with that. I know them pretty well and I like them, to the point that I always take them with me and put them directly on my PATH when I have to use macOS, clobbering the slightly-different native implementations. (But those slight differences are way easier to contend with, and a much smaller annoyance for me than PowerShell's default behavior on Windows.)

> yes I am aware that it's not safe to type `-h` to random commands and expect it to be helpful and non destructive.

That's also a major annoyance to me, to the point that I won't use some software that acts that way. Unix commands that do things other than print a help message for `--help` frankly don't belong on any system I administer. I don't fuck with Bernstein's daemontools because I don't like its argument parsing (no long options, -h does things other than print a help message), for example. I'm not a big fan of some of the classics either, like `tar` and `ps`, and will probably some day try to get used to some alternatives that use more 'normal', argparse-y flags and options.


> I've realized that PowerShell is much closer to AWK and Perl [than to POSIX shell]

For me, what distinguishes[1,2] the former category from the latter one is that the latter one is quick and comfortable to use interactively, as a shell for the OS, to iterate at the command line and not in the editor, while the former one... isn’t. The cost is that a couple hundred lines of AWK are usually fairly OK, while the same amount of shell visibly strains the language.

Are we still in the world where the interactive text-based environment (not programming environment) with the best available ergonomics is based on a general framework set in 1979? I’m aware Lisp machines existed and did things differently, but the best you can get in that respect now is Emacs and it still cannot reach that seamless ad-hoc composability of Bourne shell.

[1] http://yosefk.com/blog/i-cant-believe-im-praising-tcl.html

[2] https://scsh.net/docu/html/man-Z-H-1.html#node_toc_node_sec_...


> Are we still in the world where the interactive text-based environment (not programming environment) with the best available ergonomics is based on a general framework set in 1979?

Yes, because it's not broken (certainly not for interactive use), so it's not in need of fixing. People who don't understand it are doomed to reinvent it poorly (vide curl/wget aliases in PowerShell).


People who don't understand PowerShell are doomed to bikeshed two aliases from an entire shell, programming language, and ecosystem of OS integration and tooling.

> "Yes, because it's not broken (certainly not for interactive use)"

It's not /broken/, it's dated; limited, in the way that "turn everything into CSV or JSON and turn it back" is becoming common as a clunky workaround; limited and weak as a programming/scripting environment, with hard boundaries making it difficult to extend or build on without switching to a completely different thing like Python/Perl; and inconsistent and arbitrary, with every command doing its own thing for argument parsing, input/output formatting, and style of human interface, each an isolated silo. It's easy to do badly and difficult to do well.

PowerShell parses command parameters at the shell level, so they're all consistently handled; there isn't the same "this command has weird argument parsing". That means the engine can look into the cmdlets, interrogate them in standard ways, and present hints about usage from their help comments. PowerShell is a scripting language that lets you reach for the .NET framework when its built-in commands aren't enough, then lets you build commands in C# for more flexibility or performance while still using the PowerShell engine for parameter processing, and from there lets you build modules for ease of deployment to different machines. The PowerShell engine has a scriptable tokeniser and AST, meaning your code can process PowerShell code, and editors can lean on the engine for syntax highlighting and autocompletion instead of using regexes from before LSPs.
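To make the "engine parses the parameters" point concrete, a minimal advanced function (a sketch; `Find-Text` is a made-up name, not a built-in cmdlet):

```powershell
# The engine, not the command, does the parsing: positional binding,
# pipeline input, tab completion, and Get-Help all fall out of this
# declarative param block.
function Find-Text {
    [CmdletBinding()]
    param(
        [Parameter(Mandatory, Position = 0)]
        [string]$Pattern,

        [Parameter(ValueFromPipeline)]
        [string]$Line
    )
    process {
        if ($Line -match $Pattern) { $Line }
    }
}

'alpha', 'beta', 'gamma' | Find-Text '^g'   # -> gamma
```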


Powershell is very cool, and the object model and .NET integration are awesome. I just wish that Microsoft had adopted Lua or JavaScript, or massaged Python, or whatever, instead of inventing a language with yet another weird syntax.


It's not two aliases, it's also that Verb-Noun two-word rule instead of the more sane two-letter rule, which is better suited for interactive use. That this decision was made for the purpose of noise abatement (every character was loudly hammered into paper by a printer) doesn't make it any less valid today: we still prefer fewer keystrokes, even on today's silent terminals.

I don't care much about parsing CLI parameters, because after some time they get into muscle memory. Yes, I know `ps` and `tar` have some different legacy conventions but I don't care. I just type what I've remembered. In this area, only the number of keystrokes (corrected for tab completion) counts.

Also, does "consistent argument parsing" work for regular programs written in normal (non-.NET) programming languages like C, Python or Rust? Because if not, then it's not really consistent, only consistent within a specific ecosystem, which can also be said about `getopt(1)`, `getopt(3)`, click or any other language's argv parser.


A ton of common pwsh commands have built-in short aliases, sometimes even two-letter ones. They're intended for interactive use, with the verbose forms expected in scripts.

  New-Item -> ni
  Set-Location -> sl
  Get-ChildItem -> gci
  Invoke-WebRequest -> iwr 
  Get-Help -> help
  Select-Object -> select


> instead of the more sane two-letter rule

The what rule? The one that "printf", "curl", "tar", "gunzip" and pretty much every other command don't follow except for "cd", "ls", "cp", "mv", and "rm"? Those two-letter commands are only really needed because UNIX started on paper ttys and follows "ed" in its "don't show current context to the user" philosophy; launch "mc", or use "fzf", and you can forget about them as if they were a scary dream.


bc, cc, dd, df, du, ip, jq, nc, ps, ss, su, tr, wc, xz are some I use regularly. There are others. But it's not coincidence that the basic file operations you've mentioned are very short: they're the most common used commands in the interactive shell.

> Those two-letter commands are only really needed because UNIX started on paper ttys and follows "ed" in its "don't show current context to the user" philosophy

Today they need to be short so they're just quick to type. You can set $PS1 just fine and it shows you as much (or as little) context as you'd like.

It's only scary to people watching over the shoulder, because it's just faster than anything else. Close second place goes to the Windows 95/98 "Start" menu and "Explorer", both of which can also be operated with the keyboard, though only for browsing deeper into directories, and some Linux DEs like the Cinnamon menu can also be worked in similar fashion.


> [Bourne shell is] not broken (certainly not for interactive use), so it's not in need of fixing.

From the perspective of writing command-line utilities, it’s kind of broken though?

Builtin or forked pagers gated on isatty(), similar situation with syntax highlighting, a dozen line editors (from the readline juggernaut to whatever less has inside it), most programs rolling their own command systems rather than reuse the shell—all of that stinks of wrong factoring at the terminal layer.

(To be clear, I’ve used Emacs, Acme, and the Plan 9 not-a-terminal; they served to point out some of these problems, but I don’t believe any of them counts as a full solution.)

Move away from the terminal, and you still see individual programs being much larger than they need to be, up to and including plugin systems and embedded scripting. Because, and I hate to say it, moving structured data across the process boundary is a pain!

It’s certainly possible; if you dumb it down enough and resign yourself to writing bad half-parsers, you can get qbe or noweb or perhaps even roff (although that one’s got a scripting language inside it). But most probably you’ll turn each part into a library, link everything into a hulking monolith and call it good design. Maybe it is! But it’s working around the environment instead of taking advantage of it. Which means the environment is not doing so great.

I think the table stakes should be ls. Can the proposed model make ls simple, but allow the user to easily do most of the things that the current humongous ls is forced to have inside (at the very least, table layout, multicolumn display, colouring, ls -l, ls -t, ls -R)? PowerShell can, admittedly, but it’ll make my fingers bleed in the process. And it’ll still, as far as I know, throw up its hands at stitching together (not as a single program!) something like lstree or ncdu, let alone wget -r.
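For what it's worth, the decomposition of ls the parent asks about does exist in PowerShell, verbosity and all (a sketch using standard cmdlets; colouring is handled separately by the host):

```powershell
Get-ChildItem                                                   # ls
Get-ChildItem | Format-Wide Name -AutoSize                      # multicolumn display
Get-ChildItem | Format-Table Mode, LastWriteTime, Length, Name  # roughly ls -l
Get-ChildItem | Sort-Object LastWriteTime -Descending           # roughly ls -t
Get-ChildItem -Recurse                                          # ls -R
```

Which rather supports the "make my fingers bleed" point, at least before aliases.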

I don’t mean to say any of this is simple. A full solution would probably end up saying interesting things about hypertext, component systems, incremental programming, and other theoretically nifty things people have tried and mostly failed to make work. But I don’t see progress towards a solution, either, and if what we have is the best possible option, it still sucks.


Writing bash is fun the same way doing crosswords is fun. And I say that as someone who enjoys writing bash scripts.

But you are right, POSIX shells are not in need of fixing. They need a complete overhaul.


> Writing bash is fun the same way doing crosswords is fun. And I say that as someone who enjoys writing bash scripts.

Most of the time people who say they think the Unix shell and tools are difficult haven’t bothered to learn them. But not all the time, and that small minority makes me distinctly uncomfortable, because I simply can’t understand how that can be.

Given you say you’re part of that minority, can you try to explain or demonstrate the problem? Perhaps give me a problem I can try my hand at, if you have one that makes sense out of context? Because while I’ll admit there are plenty of tasks for which I’ve found shell scripting unsuitable, for those it does work on I’ve rarely found it difficult.

(I feel like I also have to point out that “writing bash scripts”, especially writing them to last, is very much not what my original comment was about. What it was about is more like this thing I just typed at the terminal because I couldn’t be bothered to write a script:

  curl -fsSL "$URL" | pup -p 'a[href$=".pdf"]' attr{href} |
  sed "s,^,$URL," | aria2c -i -

It doesn't do things Properly™ and wouldn't work with an arbitrary $URL, but I knew which one I meant, and it was certainly simpler and faster than digging up requests and beautifulsoup and whatnot.)


> One thing that I still dislike is the absence of something like the Unix manual pages. For some reason, help pages require additional installation, and the learn.microsoft.com search is so awful that it's often quicker to use DDG with "site:learn.microsoft.com".

I haven't found a case where `Get-Help` hasn't met most of my needs for something like Unix `man`. `Get-Help` is even by default "helpfully" aliased to `man` in PowerShell.

It is a bit weird that you sometimes need to run `Update-Help` fairly frequently, especially if you install a lot of modules regularly, but some tools have gotten better at calling it automatically after module installs, and I believe the PowerShell 7+ installers have gotten better than before about making sure the built-in modules are all up to date at install time and not just uninstalled/missing. It's also not that weird when you look at how complicated man page installation is in most distributions of Unix (even if it feels like magic when it works).

Also, the `-Online` argument to `Get-Help` sends your default browser to the relevant learn.microsoft.com documentation page for most commands. That's a very handy shortcut to searching in my experience.
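The workflow in practice, for anyone following along (help content downloads on demand, hence the Update-Help step):

```powershell
Update-Help -ErrorAction SilentlyContinue   # fetch/refresh the local help content
Get-Help Get-ChildItem -Full                # man-page-style local help
Get-Help Get-ChildItem -Online              # open the learn.microsoft.com page
help Get-ChildItem                          # 'help' wraps Get-Help with paging
```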


There is no fun in a language with little to no help from intellisense/autocomplete, and with the verb starting the command, like "get-item". If you're used to autocomplete, you'll type "item<tab>" to get the available commands or just explore the API; not so in PowerShell. I've been a C# dev for over 20 years and have trouble liking the PS language.


I just type "*item" then ctrl+space and I get this: https://imgur.com/a/bmLWHnU


That is a useful tip: ctrl+space/tab completion supports wildcards.

Also, older versions of PowerShell used to include the ISE which offered better auto-completion UI and I found really helpful for learning at times.

Today the suggestion is to install the PowerShell extension in VS Code if you want the best auto-completion support and a lot of features lit up for .ps1 files. (That's the closest to a modern ISE.) https://marketplace.visualstudio.com/items?itemName=ms-vscod...


A strange game. The only winning move is to write scripts in powershell.


Greetings, professor Falken.


Thank you for this, I've avoided learning powershell for as long as possible but recent circumstances have left me no choice in the matter.


Powershell has plenty of warts and weird design choices, but hopefully once you immerse yourself in it you'll find it's actually fairly nice to work with. The "object pipeline" nature of it means that working with complex structured data is far easier than trying to mangle the same data as lines of text in a traditional shell.
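A small example of what the object pipeline buys you: filtering and sorting on real properties instead of cutting columns out of text (a sketch; property names as in current PowerShell):

```powershell
# Top five processes by memory. Objects flow through the pipeline,
# so there is no awk/cut-style column surgery and no re-parsing.
Get-Process |
    Where-Object { $_.WorkingSet64 -gt 100MB } |
    Sort-Object WorkingSet64 -Descending |
    Select-Object -First 5 Name, Id, WorkingSet64
```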


I’m with you. Powershell is a jarring experience for someone not used to it. Heck, it’s still jarring for me and I’ve been working at Microsoft for over 5 years. I truly dislike writing it.

That said, it’s so much more “usable” than any other shell or scripting language. It’s hard to explain to someone who hasn’t used it, but I even reach for Powershell over Python in my daily work.


I think only Microsoft could make software that, after 5 years of experience, you still absolutely hate, but also still use.


Microsoft wrote Java?


It is the dominant player in this category, but a few others are there, too.


Sun


A tangent comment: is there some document with guidelines/list for emoji headers to use in commit messages? I mean stuff like "bug" for bug fixes, "upwards black arrow" for upgrading dependencies, "artist palette" for stylistic changes etc — I like it more than those sad +/-/* headers but I'd like to see this custom written out explicitly.


I think a lot of the emoji choices are personal, but there are plenty of customs written out.

One of the most common I see referred to is gitmoji.dev. It's somewhat popular because it also offers a CLI tool to help write commit messages in that style and to set up hooks to do some basic enforcement, if you wish.


I love PowerShell. I've built CLI apps, desktop GUI apps, web apps, and integrations using it. Now that I can use it on Mac, Windows and Linux it's even more valuable to me. It isn't without a few shortcomings, but I've tried picking up other languages and keep returning to what makes me productive and happy: PowerShell. I've read the criticisms, and experienced some of them firsthand, but considering all that I've been able to do with PowerShell the good far outweighs the bad.


I'm curious... are the shortcomings that you encounter the type where elements of your scripts need to differ between Mac, Windows, and Linux?

I ask because I really don't like PowerShell... but I am crafting more and more CLI apps on Linux, and now (sadly) am starting to spend more time across both Linux and Windows (forced onto Windows due to work)... so it makes sense to consider a cross-platform shell.


Anecdotally, I haven't noticed too much pain in crafting cross-platform scripts in PowerShell 7+. Most of the platform differences I've seen in PowerShell are at the "module level" (a library of "commands/cmdlets"), and commands that live in platform-specific modules generally make it pretty obvious that they are platform-specific. A lot of modules are cross-platform.

The biggest shift is learning to trust PowerShell commands/cmdlets and operators over platform-specific binaries like awk/sed/grep and so forth. A lot of those tools and manipulation steps can be done with native PowerShell commands/cmdlets/operators, but it's definitely a journey to thinking in PowerShell terms (that the linked repo looks like it can help with) and using/trusting its object-based tools, especially if you are used to the extended Linux shell text manipulation tool chain as "conversion" between the two paradigms isn't always straightforward.
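A few of the common translations look like this (a sketch of rough equivalents, not drop-in replacements):

```powershell
Select-String -Path *.log -Pattern 'ERROR'        # roughly: grep ERROR *.log
Get-Content app.log -Tail 20                      # roughly: tail -n 20 app.log
(Get-Content notes.txt) -replace 'foo', 'bar'     # roughly: sed 's/foo/bar/g'
(Get-ChildItem).Count                             # roughly: ls | wc -l
```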


Aaahhh, ok, I understand now. Thanks a lot; very, very helpful!


Before version 6, PowerShell would default to writing output files as UTF-16LE. Good to know before reading those in to something expecting UTF-8.
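A habit that sidesteps the surprise in either direction is to pin the encoding explicitly (PowerShell 7+ defaults to UTF-8 everywhere, but Windows PowerShell 5.1 does not):

```powershell
# Windows PowerShell (pre-6): Out-File and '>' default to UTF-16LE.
# Being explicit makes the script behave the same on both:
'hello' | Out-File -FilePath out.txt -Encoding utf8
Get-Content out.txt -Encoding utf8
```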


Nice :). I would have loved it if this had existed a few years back (story below).

A few years back, when I was consulting, my manager threw me into a meeting with a client wanting to use Ping Identity to synchronize their identity data for SSO. The client wanted a cheap solution, as their Ping SSO budget was already overblown, so I tried making a call to Ping SSO with PowerShell, which worked. We agreed on a day the next week to write this client in PowerShell (which I didn't have much clue about at the time).

I took a book called PowerShell in a Month of Lunches and just started doing the lessons on my train rides between work and home over the week (2 hours in total a day).

The next week, when I met the client, we finished writing the sync client in less than a day, including testing. The client later told my ex-manager that it was the cheapest and fastest IT solution they had gotten.

Tl;dr: amazing book to learn PowerShell: PowerShell in a Month of Lunches.


Don Jones - Learn Windows PowerShell in a Month of Lunches


You messed up there, buddy. You should have charged for all the time you spent reading up on it, not just the day with the customer.


Just today I used ChatGPT to turn our existing batch scripts into PowerShell, and convert the existing environment variables into command line options. Worked flawlessly and saved a ton of time.


Nice! GPTs have proven invaluable for PS stuff. I have it add comment-based help blocks when I don't feel like writing them myself :) super easy.
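For anyone unfamiliar, a comment-based help block is just a specially formatted comment inside the function; Get-Help picks it up automatically (Get-Greeting here is a made-up example):

```powershell
function Get-Greeting {
    <#
    .SYNOPSIS
        Returns a greeting for a name.
    .EXAMPLE
        Get-Greeting -Name 'World'
    #>
    param([string]$Name)
    "Hello, $Name!"
}

Get-Help Get-Greeting -Examples   # the comment block above is the help source
```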


yes, that is very useful


I only wish it was easier to put c# code within a powershell script. Would be a great mix of two worlds.


It's actually pretty easy to "import" a C# class, either from a string or from a separate .cs file, with `Add-Type`; you don't even need to compile it first.

See: https://stackoverflow.com/a/24881346
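The shape of it, per that answer (a sketch; Greeter is a made-up class, and the C# sticks to an older dialect since Windows PowerShell 5.1 compiles with an older compiler):

```powershell
# Compile a C# class in-memory and call it from script:
Add-Type -TypeDefinition @'
public static class Greeter
{
    public static string Hello(string name) { return "Hello, " + name + "!"; }
}
'@
[Greeter]::Hello('PowerShell')   # -> Hello, PowerShell!
```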


Goodness, PowerShell is gross. Why would I want to do get-command or some junk. Ah well, I just need to work with it more. Tried the Month of Lunches book, but it got rather in-depth pretty quickly for me. Maybe ChatGPT can break it down a bit more.


I suspect you find Powershell "gross" only in a sense that "it is not a sh-inspired shell", but that's quite subjective. Now, cmd.exe, that is objectively gross. But Powershell? Unless your first shell experience was bash and it imprinted on you, no, it's not gross.


> Why would I want to do get-command or some junk.

One reason might be to make commands consistent and improve discoverability.
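Concretely, the discoverability angle (standard cmdlets and parameters):

```powershell
Get-Command -Noun Item          # every *-Item command, whatever the verb
Get-Command -Verb Stop          # every Stop-* command
Get-Command *childitem*         # wildcard search across all commands
Get-Verb                        # the approved verb list itself
```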



