IIRC you could resize the standard cmd console easily, just type "mode 160" (or whatever width you want). I don't have a Windows installation around to check it, but maybe someone can confirm?
All functionality used to reside in conhost.exe, no matter which character mode application you used (command prompt or PowerShell), which is why a lot of commands would work for both character modes. This is not how things work now (though a lot of commands are still shared, like “mode 120,120”) as conhost.exe now just decides if it should give you the legacy console host or the new one (with things like buffer improvements and word wrapping).
There are a lot of subtle improvements I think many are not aware of, like, if you paste in text with smart quotes they will be changed to straight quotes.
Most Windows terminal people use ConEmu rather than the inbuilt terminal apps - it's like iTerm2 vs Terminal.app on MacOS. ConEmu adds Unix style cut and paste, tabs, etc. Add openssh, PSReadLine and PSCX and you've got a proper terminal setup.
Also MS should really improve the inbuilt apps to do this stuff.
I've used iTerm and Terminal on Mac, and the default Windows Terminal (conhost), PuTTY and MinTTY on Windows.
Terminal.app is much, much better than conhost, even after the Windows 10 update. It's fast, supports a ton of thoughtful features (such as customizable title bars via extended ANSI codes, real line wrapping during resize, good Unicode support from the very beginning, etc.). With Terminal.app I can be quite happy and productive even with someone else's Mac or with the default settings. I've used iTerm2 but their features weren't compelling enough for me, and the iTerm font rendering is noticeably worse in many ways. Apple has also been very good about giving Terminal.app continuous updates.
On Windows though I have almost always had to install MinTTY or something just to get a halfway usable terminal emulator. The default emulator is just so limited - Unicode support is very plug and pray, the title bar is totally locked, and having to use Windows APIs to change text formatting is a pain. Conhost is also amazingly slow when it comes to large amounts of text, so much so that printf can be extremely detrimental to program performance just because it has to wait for the terminal window to catch up.
PowerShell's terminal experience is better but not quite there. And, PS suffers from extremely long load times - I've seen it take upwards of 10 seconds to start without any extensions. That means that in practice I often pop open cmd.exe even if PS is a better choice, just because I don't want to sit through a long load time.
Microsoft really should get their Terminal story in order. I'll definitely try out ConEmu the next time I sit in front of a Windows box, but like you I wish MS would improve their own apps!
I once started a Chinese character flash card program in Python on Windows.
The yak shaving was unbelievable. I never got a working version, but I learned a lot about code pages, and why "building a console from scratch" is not actually the right answer to "I want to be able to test a toy program I'm making before it's finished."
> PowerShell's terminal experience is better but not quite there. And, PS suffers from extremely long load times - I've seen it take upwards of 10 seconds to start without any extensions.
Small price to pay to be able to pipe objects, man!
Sorry about that - we lost control of our startup times in PS V3 and have been working to get it back under control. PowerShell V5 had substantial improvements but we keep working on it and V5.1 is even faster. Give it a try - I think you'll like it.
I'm not entirely sure if you're joking or not (one of the hazards of the textual communication medium). Yes, piping objects is fun and occasionally very useful, but waiting ten seconds to pop a terminal so I can run `ipconfig` or something is not so useful.
Are you actually able to not be confused by piping objects? In Unix everything is a string so I just pipe text to sort and then uniq...etc. PS always complains relentlessly. The GUI features and .NET integration are nice though.
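For what it's worth, the complaints usually stop once you sort and filter on named properties instead of piping raw text. A minimal sketch (the cmdlets are standard; the particular pipeline is only an example):

    # Show the five processes using the most memory - no text parsing involved.
    Get-Process |
        Sort-Object -Property WorkingSet64 -Descending |
        Select-Object -First 5 -Property Name, Id, WorkingSet64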
⌃⇥ actually does work in Terminal.app. And in fact, ⌃⇥ and ⌃⇧⇥ are the official keyboard shortcuts in the menu, with ⌘{ and ⌘} being undocumented aliases. This is also how Safari works too.
Whilst I am glad to hear about ConEmu, I have to disagree: I use Windows cmd.exe very regularly, and know lots of other people who do, and I have never used ConEmu before! I might give it a try now that I know about it.
ConEmu is really nice simply because you can have tabs, split panes and good color schemes. I've never gotten to the level that some people need, so ConEmu + bash = everything I was using on OSX.
My work laptop still runs Windows 7 - and will for the foreseeable future - but that is one of the things that make me curious about Windows 10. It is, of course, easy to ridicule Microsoft for how long it took them to change this, but I think better late than never.
(On older versions of Windows, ConEmu[1] does that, too, but it's third-party software, of course. It also supports tabs!)
> If you have trouble with anything I am happy to help. But you will have much better chances to find solutions on the pages of the upstream projects. Those are:
Console emulator ~ Conemu (https://conemu.github.io/)
Cmd.exe enhancements ~ clink (https://mridgers.github.io/clink/)
Unix tools on windows ~ git for windows (https://git-for-windows.github.io/)
Correct me if I am wrong, but all I remember is that you could edit the width somewhere in the options. But there was no way to change the width _on the fly_ (assuming that's something it does now).
Thanks for that suggestion. I am well aware of these options, and if I used Windows at home, I could utilize them. However, I only use Windows when I have to, which is at work. In the environment in which I work, you absolutely cannot install software like this, as you would be breaking security protocols (and the law).
So this explains the joy I feel over small improvements like those mentioned above.
You could do it at least in Win7, maybe earlier - I don't have anything older to test with. Click on the icon (right-click on the title bar) --> Properties --> Layout. You can edit both the window size and the buffer size.
It also supports transparency and a lot more features.
Install sysinternals, chocolatey (a package manager, https://chocolatey.org) and python, you are good to go.
You just needed to go in to the properties of the window and increase the screen buffer size (both width and height). It's one of the first things I'd do upon installing a Windows machine.
I have been waiting for that for years - now if they would only add TABS to real powershell (as opposed to powershell ISE) or if they can add colors to powershell ISE I will be super-duper-bonus happy.
I've found ConEmu's xterm emulation to be frustratingly buggy if I'm ssh-ing into a Linux machine and running tmux, vim, or really anything with colors or a status bar. mintty [1] works a lot better for me. (mintty is also the console that's packaged with Git for Windows.)
also CTRL+V pastes by default now, but yea you're no longer locked into half the screen by default. I wonder why that's been like that for so long? Must be some reason...
> Every idea for a feature starts out with an imaginary deficit of -100 points. That means it has to demonstrate a significant net-positive effect on the product as a whole in order to emerge as being truly worthy of consideration.
That is actually pretty interesting, thanks for linking!
I still don't understand how this was not "fixed" years ago. Sure, some config windows have fixed widths because it makes sense, but something like a terminal obviously needs the ability to grow (or shrink), especially after they introduced sticky windows and people actually used "tiling".
The idea that the console screen is an MxN matrix is pretty embedded in the API... it's an MDA display adapter metaphor, not a Unix-style typewriter. I'm guessing they just felt they had bigger fish to fry.
(Everybody I know that uses the console window a lot has always just set the window to 120x9999 or 160x9999 or whatever and forgotten about it. It's one of those things that's obviously kind of lame, but after a while you just forget about it.)
You can't use it as grep in bash, because it isn't that. It outputs [MatchInfo] objects, not text, and Out-File (>) formats complex objects for console viewing (for ??? reasons), which is one reason > isn't a great PS habit.
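Assuming the "it" here is Select-String (the closest thing PowerShell ships to grep), a small sketch of the point - you unwrap the MatchInfo objects yourself rather than letting > format them (the file names are made up):

    # Pull just the matching text out of the MatchInfo objects and write plain lines.
    Select-String -Path .\app.log -Pattern 'error' |
        ForEach-Object { $_.Line } |
        Set-Content errors.txt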
On a somewhat related note, one of PowerShell's biggest weaknesses is memory utilization. And since they make it so simple to pass around hefty object collections it tends to bite you in the ass early on. Particularly when you're using cmdlets. When doing real scripting that actually needs to be run as a job I am surprised when I can get away with cmdlets and don't need to manage .net APIs/objects directly. Especially third party cmdlets.
One performance tip I picked up early on: If you're passing around large object sets you need to operate on try to keep them in a pipeline. Populating a variable and interacting with it is arguably more readable and maintainable, especially for junior people, but it has a heavy cost in terms of memory.
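A rough illustration of that tip, with made-up paths; the streaming form touches one FileInfo at a time, while the buffered form holds the whole collection in memory first:

    # Streaming: each object flows through the pipeline and can be released as it goes.
    Get-ChildItem C:\Logs -Recurse -File |
        Where-Object { $_.Length -gt 100MB } |
        ForEach-Object { $_.FullName } |
        Set-Content big-logs.txt

    # Buffered: arguably more readable, but the entire result set now lives in $files.
    $files = Get-ChildItem C:\Logs -Recurse -File
    $big   = $files | Where-Object { $_.Length -gt 100MB }
    $big | ForEach-Object { $_.FullName } | Set-Content big-logs.txt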
That said every version of posh is better than the last so I may be wrong or overestimating the impact for the newest versions. YMMV, don't believe everything you read on the internet, etc.
Upon reading that again, maybe I should have said its weakness is that it's easy for everyone involved to create objects that use a lot of memory. I'm not sure it's fair to say PowerShell itself is bad at utilizing or managing memory. It's more of a possible negative side effect of one of its great strengths.
That is odd, powershell shells out to cmd.exe if you try to pipe binary or text data. It goes back to powershell only if you use powershell commands in the pipeline.
If you don't pass the flags to handle it as binary in those cases, stuff breaks.
While the first rule of backup is "do them", the second rule is "test them". If your backups are bad and you only notice it when it's time to restore, you only have yourself to blame.
"The backups fail to restore" is such a common occurrence, it's really scary.
Something that irks me is how PowerShell’s default aliases take precedence over binaries in the PATH. To be able to use the GNU utils, you have to put this in your profile.ps1:
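Presumably something along these lines - a sketch of the usual approach, which is to drop the built-in aliases so the .exe versions on the PATH win again (which aliases you remove is up to you):

    # Remove the aliases that shadow the GNU binaries; -Force covers the read-only ones.
    Remove-Item Alias:ls  -Force -ErrorAction SilentlyContinue
    Remove-Item Alias:cat -Force -ErrorAction SilentlyContinue
    Remove-Item Alias:cp  -Force -ErrorAction SilentlyContinue
    Remove-Item Alias:mv  -Force -ErrorAction SilentlyContinue
    Remove-Item Alias:rm  -Force -ErrorAction SilentlyContinue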
Isn't that always the case, that shell built-ins take precedence over things elsewhere in the system? On Unix you have to call them with their full path, e.g. to get the system echo instead of bash's you need /bin/echo, I guess. On Windows you can add the extension, i.e. use ls.exe instead. Even cmd shadows programs that have the same name as a built-in, e.g. echo.exe must be called as such and just echo will use the built-in command.
I have noticed this with echo too. Specifically, the bash echo built-in does not drop support for -n and -e when bash is forced into standards-compliance mode as /bin/sh. That causes breakage on systems that use more strictly compliant shells when code was written against bash as sh under the assumption that either -n or -e works. I spotted a regression in Linus' tree a while back where this very thing happened in the build system for perf. Sadly, my patch to fix it was ignored:
If it is still an issue you want to fix, you may need to resend the patch periodically. It doesn't look like you got any response at all.
LKML gets hundreds of messages every day. When I used to read it I had a set of search filters to prioritize it. Anything I didn't get to I simply marked Read. There's no way to catch everything on there.
Maybe get it passed through by someone who is in Linus's email filters.
Although it's been a year and you probably don't care anymore. :)
No, this is in powershell not the linux subsystem. In powershell they defined a bunch of aliases to internal powershell functions like cp, mv, cat ... that don't work like the original unix version. If you install the GNU version on the system and try to use them in powershell that won't work by default. You either remove the alias or use the trick above to force powershell to use the .exe of the utility.
Its GUI shell does. Theoretically you can create a process from any binary blob. But to invoke it the GUI way, you need it to have either an .exe extension or another self-executing extension defined in the registry.
"is executable" is different from "is an executable". Your other comment was worded to say the former, while this one seems to refer to the latter.
For example, a .jpg image can be marked as executable, but it is not an executable. This can happen on both Linux and Windows, and I believe it is handled by the filesystem. I suppose the term "marked as executable" could be used for this, while "is executable" could mean the OS will grant it a PID using the correct system call, but this is uncommon.
On Linux, you can attempt to execute any file using a system call like execve(). Similarly, on Windows you can also attempt to run a JPEG using CreateProcess(). I know the Windows shell probably won't run a JPEG even if it is marked executable (there is a collection of Registry entries that map file extensions to the programs that run them, for the shell), but I don't know if the CreateProcess() API function rejects files without the correct extension.
If both CreateProcess() and GetBinaryType() fail to detect a PE file with the incorrect extension, then you can reasonably say that Windows uses the extension to determine whether a file is an executable (though not whether it is executable). The behavior of the shell isn't sufficient.
CreateProcess() doesn't care what the file extension is when you pass the executable filename into the lpApplicationName parameter.
I used this feature when I worked on a VR app written in Unity, and I had a little launcher program that checked for updates and then started the main program. I didn't want people to bypass the launcher, so I renamed the main program to have a .VR extension instead of .exe, preventing it from being run directly from File Explorer. The launcher had a .exe extension and could be run normally, and then it used CreateProcess to run the FooBar.VR executable.
The key here is not so much the ".exe" suffix as that it's not "pwd". Think:
alias some-command='some-command.sh --arg'
and then running "some-command" versus "some-command.sh". (The Window's shell does interpret stuff with the ".exe" suffix differently, just it's not the key factor here.)
That's a bad idea - the inbuilt 'ls' (aka get-childitem - it also works on registry paths) will output real file objects, with properties you can access with 'select' and 'forEach' etc, whereas the GNU tool will output text you'll have to scrape.
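A quick sketch of what that buys you - properties you can filter and sort on directly, instead of columns of text to scrape (the directory is just an example):

    Get-ChildItem C:\Windows\System32 -File |
        Where-Object { $_.Length -gt 5MB } |
        Sort-Object Length -Descending |
        Select-Object Name, Length, LastWriteTime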
I don't think this is a good default for posh on nix platforms but I would argue that it's a good thing on Windows. The output of PowerShell ls is an object with typed properties and methods that you can interact with in a much richer way.
Not to mention user experience. Adding these aliases made moving back-and-forth between nix and Windows much less jarring for me. They also really helped drive home the difference between something like bash and posh because investigating the differences between the aliases and their nix namesakes revealed their respective strengths and weaknesses.
I imagine changing aliases would be out of the question since they should have priority over PATH but perhaps a special conditional case for these the first time they're run interactively would be to check for the nix equivalent in your PATH and prompt to see if you'd like to remove them.
It's not that weird; they put the aliases in so new admins on Windows fresh from Linux don't hit a complete roadblock at the first command they type. It's to make PowerShell less punishing to learn, not to emulate a different system.
So imagine when they packaged that for Mac, that assumption is no longer true.
I would argue that for scripting portability, this should be the same. With PowerShell's wget equivalent being so different, for example, scripts that use the aliases will be broken.
Granted, don't use aliases in scripts and all that.
FWIW, I always thought of it as being for portability of humans, and was very grateful for it on Windows. A large chunk of my time is spent on *NIX systems, so my fingers automatically type "ls" and "rm" instead of "dir" and "del". The fact that Powershell worked, rather than throwing a "eh, what?" error made my time on Windows suprisingly more pleasant.
They should have implemented these with a missing-command handler. So if ls, wget, etc. exist, they work. But just before the command line complains of an unknown program name, it should check that list and execute the alias.
A new "bottom of the stack" alias instead of the current one that takes precedent over executables.
Possibly because by the time the makers put out the Mac version, they were already aware of the issue, and also because the Mac has the GNU coreutils as part of the main OS, rather than being third-party installs, so the problem was immediately obvious to the first tester who actually used it.
But are all cmd commands portable verbatim to powershell? With all the options / special characters? (I don't know the answer, but https://en.wikipedia.org/wiki/PowerShell#Comparison_of_cmdle... lists quite a few missing commands, including "taskkill" and "find")
With the amount of online posts that tell you "to achieve X, open command prompt and run ...", it would be a bad idea to break any of them that go beyond a simple command. (so for example any "FOR ..." lines)
There can be several aliases for a single command.
ls=dir=gci="Get-ChildItem"
But you can't just run "dir /s". You'd have to say "Get-ChildItem -Recurse" which can be shortened to "dir -Recurse" which can be shortened to "dir -r".
When scripting, you try to use the longform versions. With one-liners, you just sort of fall into whatever paradigm you were previously accustomed to for your first argument.
For example, bcdedit with its {}-delimited GUIDs breaks in PowerShell without additional quoting, but works fine in cmd.exe. Makes following online instructions tricky if you're not familiar with that nuance.
Well, the nice thing in PowerShell is that you have explicit control over "stop parsing here and just pass the rest to the application verbatim" with --%. Solves a lot of quoting nightmares.
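For instance, with the bcdedit example from upthread (the specific arguments are only illustrative) - everything after --% is handed to the native command untouched, braces and all:

    bcdedit --% /set {current} nx OptIn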
Well, it won't work with Start-Process, and it's also buried deep in the documentation and impossible to google. I constantly forget about the exact character sequence, too.
Isn't the -- a feature of the command to stop option parsing and treat the remaining arguments as file names? AFAIK it's not a shell feature that solves quoting (and it's not that easy on Unix either, since running a program involves argument parsing instead of just passing a string).
You could define `--` as "perform no expansions for the rest of the line", then you wouldn't have to worry about globs or `{}` which is where most of my annoyances come from.
BTW `--` isn't treated specially in version 4 of Bash.
I know, you could also just put quotation marks around the {} and it would work. My and parent's point was that it will probably flood MS's support channels if you just force every cmd user into PS.
Funny, that was my first thought as well. Sometimes I start cmd.exe instead of powershell on rotating-disk machines because of the startup time. On SSDs I find the startup time is negligible.
In particular if you have left the prompt window untouched for a few hours and it has been paged out. Just pressing <tab> to complete a file name will hang for 3-6 seconds on my computer while PS wakes back up.
CMD is always instantaneous.
I realize that PS is doing a lot more work to match commands but sometimes simple is all that is needed.
They are aliases for Get-ChildItem. Just like you can have an arbitrary alias for anything on UNIX. If I so pleased I could make an alias so that when I wanted to delete a file I'd type
emacs some-old-file-i-dont-want.c
Emacs would be an alias for rm on my system.
That being said I think it was quite offensive of MS to use wget and curl as aliases, even though they may have had no ill intentions. I'm ok with using ls and echo and names of other such basic commands as aliases, but with tools like wget and curl that are known for their vast array of useful features you either implement their features and options or you don't use their names.
> I'm ok with using ls and echo and names of other such basic commands as aliases, but with tools like wget and curl that are known for their vast array of useful features
c:\temp>timecmd dir c:\
Volume in drive C is Windows
Volume Serial Number is A0A8-6684
Directory of c:\
01/11/2016 12:45 <DIR> AMD
...
17/11/2016 12:52 <DIR> Windows
9 File(s) 25,337 bytes
18 Dir(s) 71,423,356,928 bytes free
command took 0:0:0.08 (0.08s total)
I have had a quick look through here [1] and most of the answers involve timing precision of two significant digits. By that metric, PowerShell's 2.4ms clocks in at 0.00 seconds. Which means cmd.exe (at a precision of two significant digits) can't beat it.
and find that command prompt dir is faster because it's doing less, because it's less capable. It's not creating System.IO.FileInfo and DirectoryInfo objects for each thing in the directory.
I just wrote batch files that serve as aliases for those. I type ls in my command shell and it does a dir /w behind the scenes, which makes it look more like how it looks in Linux anyway. Here are the ones I have:
* cat.bat: type %1
* clear.bat: cls
* diff.bat: fc /n /w %1 %2
* ls.bat: dir /w %1
* pwd.bat: cd
* touch.bat: echo . >> %1
* top.bat: tasklist
All have an @echo off at the top too. Touch is a little janky, because I couldn't figure out how to actually create an empty file with nothing in it from the command prompt.
I'm actually not really a Linux person, but on Mac you kinda have to use these, so I got more used to using them than the Windows counterparts.
to get an empty file. It's not a replacement for touch, though, as touch will do something different to existing files. Your attempt will silently overwrite the file in that case.
Being verbose would be a good thing (more readable than a cryptic acronym) if there was auto-complete support. A console should really be a small IDE that gives live feedback on what arguments are possible in the current context, etc. But that's not the way powershell was designed.
Actually, that's exactly how PowerShell was designed. Have you tried pressing <Tab>? Every command can be inspected and you get auto-completion support by default, for commands, parameters, even properties on returned objects (since the shell often knows what type is returned from a command).
That's not what I mean by autocomplete. What I mean is more like Visual Studio's intellisense, ie a drop down that lets you know what all the options are from there. You can type DIR <TAB>, but that won't tell you that you can apply the parameters "/p" or "/w", nor what these parameters mean. ISE does a slightly better job but is based on static specs, rather than the current state of the system. So it won't list the VM names currently active in the system, for instance.
Yes you can type -?, but that's as bad as having to go on the web to read the documentation. It's disruptive. The point of a good auto-complete is to have a list at your fingertips with a short description of what it does, without interrupting your train of thought.
You can press ctrl+space and it pops out a list of options to choose from, then you move with the arrow keys. It works for parameters and values (if there's a set to choose from):
I you type "Start-VM -name <TAB>", it won't list the list of the VMs, but rather the list of the files in the current directory. That's not exactly what I would call insightful.
I didn't know about the CTRL+SPACE. That being said, it doesn't seem much more insightful than TAB. And no description of what the argument does.
The issue with "Start-VM -name <TAB>" is not an issue with Powershell itself, but with the Start-VM cmdlet; the Start-VM cmdlet could have been designed to complete from the list of VMs.
The ISE can do that if you've got the object at the time you're writing; it can't run a command for you as you're typing it.
You'll never get a drop-down list in the host terminal window as it doesn't have a GUI layer like that. Wrap it in an Electron app and make one.
you can type `dir -<tab>` and cycle through the options. type `get-help dir` to see what they do or install posh-git and use `dir -<ctrl-space>` to get a menu of options to choose. I didn't know about -?
Sorry, yes, that's PSReadLine, not posh-git like I commented. I wouldn't call that a drop down though, that's just adding to the output, isn't it? You know, I've no idea how that works; it could come in handy actually, I should learn.
It's as close to a dropdown as you can get in the CLI without putting GUI elements over the top. Try
gci - then ctrl+space
The available parameters appear on the screen, you can select one with the arrow keys, and tab, when you choose one and press space, they all disappear again. It's interactive, dynamic, it changes depending on how much of the parameter name you've typed - it's not just printing to the screen and stuck there.
> Starting with Windows 10 build 14971, Microsoft is trying to make PowerShell the main command shell in the operating system.
> As a result, PowerShell officially replaces the Command Prompt in the Win + X menu, so when you right-click the Start menu, you’ll only be allowed to launch the more powerful app.
This is an older change - I have this behavior in 10.0.14393.0 (ie, current stable).
There's an option in the Taskbar settings: »Replace Command Prompt with Windows PowerShell in the menu when I right-click the start button or press Windows key+X«. I suspect the default value for that setting has been false, so far.
I have that here in stable. I suspect you're right and changing the default might be what they mean (I think I already changed it because cmd is gross).
I know we have machines with lots of RAM now, but at 90-100MB (PS) vs 4MB (CMD) [1] RAM usage per instance (not counting the conhost instance that also spawns), I think I would still want CMD around.
Does that really affect you in any way? (And I'm really asking, not trying to be snarky, I've got 64gb on my main dev machine so I really don't know...)
I can't imagine any scenario in my actual usage where I'll have 4mb (plus whatever is needed to actually do the work) but not 100mb (plus whatever is needed to actually do the work).
Also, on my machine (windows 10 stable branch) each PS window takes up about 20mb (at least the task manager is telling me that, I didn't dive any deeper)
My work machine is not nearly that beefy. :| I do often have 8-10 cmd windows open. According to task manager on my machine, each PS instance on start is 90-100MB. I didn't dig deeper to see if that RAM usage would shrink over time.
I'm pretty sure the Softpedia article is incorrect in saying that "cmd" is now aliased to powershell. I installed the update, and running cmd from search bar (win cmd enter) and from run dialog (win+r cmd enter) both launch cmd.exe.
The release announcement post says "Typing “cmd” (or “powershell”) in File Explorer’s address bar will remain a quick way to launch the command shell at that location."
I honestly kind of wish it was aliases - I regularly launch a cmd window due to decades of muscle memory, then remember I really probably wanted powershell.
PS is pretty robust, but it's a bit verbose, to say the least.
As noted elsewhere, command processor internals like 'dir' and 'copy' are supported, but aliased to something else. In the case of dir it's aliased to Get-ChildItem, and copy to Copy-Item.
The latter breaks my favourite quick file create method:
C:\temp>copy con example.bat [return]
commands go here^Z
1 file(s) copied.
I had a quick look at Copy-Item and there is no obvious way to use it to the same effect that I can see.
[edit: added the [return] to make it more obvious]
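Not the same keystroke-for-keystroke trick, but a rough PowerShell stand-in is to pipe the lines to Set-Content with the encoding made explicit (the contents here are just placeholders):

    # Batch files want ANSI/ASCII, so state the encoding rather than rely on defaults.
    '@echo off', 'echo commands go here' | Set-Content example.bat -Encoding Ascii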
Bash has its warts too. Treating everything as strings can cause headaches when filenames have spaces. As a long-time Linux user who recently learned some powershell, I find Bash rather primitive after getting accustomed to an environment where everything is an object.
I wonder how feasible it would be to build a powershell-like environment for Linux on top of Python.
It's probably doable with time... and (mostly because nobody's directly linked to it yet) the official Powershell is now on github under the MIT license:
That was the approach I took before inventing PowerShell. We didn't WANT to invent a shell - we were forced into it.
The problem was that Bash on Windows wasn't effective. At the heart of the matter is the difference in architecture between Unix and Windows. In Unix, most everything is a file so if you can modify files and restart processes, you can manage everything. In Windows, most everything is an API so tools that manipulate files don't do much for you.
Ergo - we needed an admin automation model which supported an API oriented architecture. Thus PowerShell.
NOTE - Bash on Windows today has a very different focus - it is not about managing Windows (which it still doesn't do) - it is about using OSS tools to develop OSS Software. It does a great job at that.
I don't see PowerShell as a Bash replacement, I see Bash as more of a cmd replacement. I'm a user, but I'm also a developer so Bash works for my needs. PowerShell, not so much. It's powerful, no doubt, but it's not the solution I was looking for. Admins can always fire up PowerShell if they need it.
I didn't WANT to run Windows - I was forced into it.
Par for the course, perhaps. After 30+ years of fighting what was always a "rear guard action" against the prevailing dominant Unix customs and methods, Nadella has switched Microsoft and Windows into a full-on conformist mode by removing the immediate monetization of the OS and thus the need to be "different".
It would not surprise me, in a few years to come (or even sooner than expected), to find that "Windows" had become another variant of Unix in effect if not entirely in fact.
Or maybe a compatibility layer, where they integrate the most common bash commands and Unix utilities. I prefer Git Bash when on Windows for this reason.
Nothing. Bat is a file type, and that file type is bound to execute in CMD's context. You can even execute it from PS and have it run the .bat perfectly normally (using CMD).
Ditto with VBS. You can execute VBS files from any context on Windows (e.g. double click, PS, CMD, etc) and they'll always use the cscript engine.
> Typing cmd in the run dialog will launch PowerShell as well, so Microsoft has made a significant step towards phasing out the traditional Command Prompt.
Imho that sounds like stupid idea (and I like PS).
> Microsoft is expected to get rid of [Command Prompt] completely at some point in the future.
This is extremely unlikely to ever happen. As long as people have .cmd and .bat files they need to run, cmd.exe will still be around. They're not going to just remove it and break all those scripts.
By the way, where does the capitalized kebab case come from? Prior art or just Microsoft's general hankering for capitalization (e.g. C#'s Class.DoSomething() vs Java's Class.doSomething())?
Note that it's not quite "kebab case". The dash only separates the initial verb from the object, but the rest of it is regular PascalCase. So Get-ChildItem, not Get-Child-Item.
PascalCase is well-established in the Microsoft developer ecosystem. It goes at least all the way back to Win16 and Windows 1.0 (it using the Pascal calling convention might have something to do with it, perhaps).
More importantly, it is used uniformly in .NET, and PowerShell builds on top of .NET, and deals with .NET objects directly, which have properties like e.g. `FullName`. So it makes sense for consistency.
I agree it's annoying¹, but I wouldn't call it a show-stopper. I have resorted to Out-File -Encoding <foo> wherever I need file output.
_________________
¹ The redirection syntax cannot easily accommodate an encoding, so in general you cannot really use it everywhere anyway. I often need to use the system's legacy codepage instead of Unicode, so UTF-8 by default would be just as useless there. GCJ, however, could just accept text in any common encoding instead of insisting on ASCII. Detecting UTF-16 isn't hard, even though common Unix tools tend to treat it as arbitrary binary data instead of text.
Generally I'd say > is a convenience feature more than an actually useful construct, at least in a shell like PowerShell. On Unix-likes > simply dumps bytes since that's what the shell is built around. For PowerShell you could just as well say you'd dump CLIXML instead, since the shell works with objects. Since you have a few valid options you could either try to shoehorn them into the syntax, either with various funny characters
all of which are options I'd say don't really fit into PowerShell, nor should they be entertained. Does it really matter whether the last part of a pipeline is a redirection operator or just a terminating command that writes the data to a file? Conceptually I'd say having the pipeline end in a pipeline element instead of something entirely else is preferable since it reduces the number of distinct concepts.
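Concretely, the "terminating command that writes the data to a file" shape might look like this, with the encoding stated explicitly rather than left to > (the pipeline itself is only an example):

    Get-ChildItem C:\Windows -File |
        Select-Object Name, Length |
        Export-Csv .\inventory.csv -NoTypeInformation -Encoding UTF8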
Yeah, I haven't liked many of the changes in Gnome over the past few years. Terminator as a terminal emulator has been rock solid though. Highly recommend it.
I think this is probably a step in the right direction, but it still only has 16 colors. When are we going to get a command prompt with at least 256 colors on Windows? There are already a bunch of linux consoles out there with true color, but nothing for Windows. Am I crazy for caring about this?
Windows fan here. I hate PowerShell and I'd much rather have Bash built into Windows. Things I hate about PowerShell:
- You can't even run your own scripts without performing the Set-ExecutionPolicy ceremony first or signing your scripts.
- It's way too verbose.
- It's a strange bird that next to nobody uses, so there's zero motivation to learn it.
- It's not "old reliable". You can't depend on it working due to the first point and also due to the fact that they're still working on it and even in 2016 they broke some PowerShell stuff with updates that needed to be uninstalled (KB3176934).
It is strange to claim no one really uses powershell. It is widely used in the Windows world.
The main complaint I see about powershell is that it isn't bash. The object pipeline in powershell can be so much more powerful than parsing text between commands. After all, that's why we have data types instead of storing everything in strings.
In an attempt to understand and empathize with people who do not share my opinions, I looked back through some of your post history and you said this 5 days ago:
"It's never been clear to me why anyone would want to use a completely opaque and undiscoverable interface such as a commandline over a nice GUI for anything. You didn't have to read anything or search around in order to change all of those settings in the GUI."
Maybe it's possible you're not familiar with people who use powershell because you tend to stick to the GUI?
That's a good point - almost everywhere I go as a consultant...their IT people use the GUI to manage things and not PowerShell.
Most places that I've seen automate things with a .NET program whether it's a service, a console app run via Task Scheduler, an SSIS package and in a few cases - a desktop app.
EDIT: I'm a programmer and not a sysadmin except for on my own network where I do prefer "the easy way" as opposed to the way where you have to do rote memorization to perform simple tasks. If I automate anything, it's also done with .NET or Node.js because I like the tools better and my stuff runs everywhere without performing any ceremonies. I do visit a lot of client sites though and I work with client sysadmins regularly. I'm on the East coast and I work in the NY/NJ/PA area. Also the reason I like bash better is because if I have to memorize anything (and I do, for Linux work)...I'd rather have it be something terse.
You seem to be missing the fact that powershell is just designed better. Take powershell vs bash. Bash operates primarily on strings. In powershell everything that is interacted with is an object. This won't really affect you in day-to-day interactions, but it is incredibly useful when you decide that you want to automate something. This, combined with the fact that powershell is built on the .net framework, means that there is a simple way to extend it from C#. You should give it an honest try sometime. It's not really that much more verbose in everyday practice and it has lots of tools around it that make life simpler.
> Walk into any office and ask an IT guy to run "ipconfig". They'll open up cmd.exe without even thinking about it.
Absolutely because of twenty years of muscle memory.
But once Powershell is the default and they learn about the wonderful "gip" alias, they'll be hooked. "gip" in case people don't know is an alias for Get-NetIpConfiguration which is a more powerful version of IPConfig.
PS - "gip -all" is useful. "gip -all -d" for more detailed results.
For simple tasks such as running a command, sure. But if that admin needs to do anything more complex, such as parsing the output of ipconfig, they're most likely going to use powershell.
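A rough example of what "parsing the output" looks like with objects rather than text scraping - Get-NetIPAddress ships with modern Windows; the filter is only illustrative:

    Get-NetIPAddress -AddressFamily IPv4 |
        Where-Object { $_.InterfaceAlias -notlike 'Loopback*' } |
        Select-Object InterfaceAlias, IPAddress, PrefixLength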
Every windows admin uses it but all the admins I know don't particularly like it or fully understand it. They do like what they can do with it, that they can script almost everything, and it's far better than cmd.exe but honestly the syntax and semantics are pretty baffling.
There's a lot of cargo-cult copy and pasting in the powershell community.
That's all correct and it is the right way to do things (especially from Microsoft's perspective). However, the main goal of Powershell seems to be providing a scripting language that can automate Windows internals, not a shell in which you live. Powershell is just a nice REPL, not so much a shell as you might be used to from Bash.
We have had Powershell for 10 years now. Has it taken off in a big way? No. The admins you mentioned use it because simple automation is the one thing GUIs cannot provide and MS had to face this after over a decade of denial.
Do people get out Powershell instead of Python to do cool new projects that are unrelated to devops or admin work? Some maybe. But the real music plays somewhere else. And some of the reasons for that have been mentioned by the GP.
I've never quite understood the concern about needing to Set-ExecutionPolicy before running scripts.
In Unix, you have to chmod a+x a file before running it.
And you have to do it for every script you want to run.
So to run 1 cmdlet to enable all scripts seems a bargain.
(Honestly I could be missing something and I probably am because you are not the first person to mention it. I just can't connect the dots.)
My problem with the Execution Policy is that it's useless in practice since it doesn't actually prevent anything (so it annoys me every time). What was the motivation for adding it since it's not a real security feature? If it actually prevented executing stuff full stop it would be cool. The signed script concept is cool... I wish I could do that with Python.
Was it just to prevent accidental script execution?
I think the problem is that when you chmod a file it stays chmoded. With the Set-ExecutionPolicy I find that I have to do it every time I start a shell. I also run a lot of headless scripts (AWS cloud-init, services, etc.) and it's always a pain resetting everything every time. With Unix, I know that a script is executable or not, with PS I don't.
Also, I might not want every script to be executable. With Unix I can choose which ones are executable and by whom. With PS it's all or nothing. :/
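For reference, the policy can be persisted per scope rather than re-set each session, and headless jobs can override it per invocation (the values and script name here are just common choices):

    # Persist for the current user (survives new shells).
    Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
    # Or override for a single headless run.
    powershell.exe -ExecutionPolicy Bypass -File .\cloud-init-task.ps1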
I totally get where you are coming from. When having to do work on Windows, I used to go out of my way to avoid PowerShell.
I finally gave it a solid chance one day and I've found that it can be surprisingly nice and powerful to work with.
You can globally disable policy enforcement. Maybe not the most "secure", but it's not something I wish to bother with. There is also a really nice package management system for PowerShell modules with things like git integration and etc.
Java developer here, multi-OS fan. I'm using both PowerShell and bash from Windows Ubuntu subsystem. I prefer PS on Windows because I do like its object pipeline, but have to use bash for some cross-platform automation.
Powershell users don't age gracefully, that's for sure. You'll learn some arcane long-winded way of doing something in v3 and it will be replaced with something far better. Unfortunately, you'll still have to know the old way.
As for longwinded, I hate using ACL in Powershell. Let's say you have to reestablish FullControl on a bunch of files as an Administrator and you don't have Write access. It's so convoluted and it can even fail silently. So much easier to just use icacls in a single line.
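By way of illustration, the icacls one-liner versus a (much simplified) PowerShell version; the path and account name are made up, and the PowerShell form still won't take ownership for you if you lack write access:

    icacls C:\Data /grant "BUILTIN\Administrators:(F)" /T

    # Roughly equivalent PowerShell, minus ownership handling and error checking.
    Get-ChildItem C:\Data -Recurse | ForEach-Object {
        $acl  = Get-Acl $_.FullName
        $rule = [System.Security.AccessControl.FileSystemAccessRule]::new('BUILTIN\Administrators', 'FullControl', 'Allow')
        $acl.AddAccessRule($rule)
        Set-Acl -Path $_.FullName -AclObject $acl
    }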
- Set-ExecutionPolicy can be set with a single click from the "For developers" settings page. Just scroll down and hit Apply three times and your machine has sane developer defaults (incl. ExecutionPolicy).
- The verbosity means that you can "guess" PS commands. Each PS command follows a set layout with an action word (e.g. Add, Clear, Get, Write, etc.) and then a target (e.g. Set-Alias, Set-Date, Set-Service, etc.).
- A ton of people use PS. In particular SysAdmins are moving from VBS/Bat to PS in droves. No clue what communities you hang out with where nobody uses it?
- It is definitely still a work in progress. But most of the core parts of the language haven't changed much; if you wrote a PS file three years ago it likely still works today. All they've done is add new cmdlets, new libraries, and new functionality which doesn't hurt backwards compatibility.
> If you wrote a PS file three years ago it likely still works today.
That's a really low bar.
Isn't Windows supposed to have a legendary commitment to backwards compatibility? 3 years of backwards compatibility would be nice for a bleeding-edge development environment like node.js that you can tear down and replace whenever you want, but not for your operating system shell.
Hey, I'm still on Windows desktop after having tried Linux desktop more times than I can count. But to claim that bash (or most other shells) is only preferable to cmd and powershell if you're a Linux zealot is what is truly laughable.
If you have ever worked with both there would be no question.
I have worked with both and I would not claim that bash is superior to Powershell. I wouldn't even claim that my favorite shell, zsh, which is mostly better than bash, is superior to Powershell.
They're different approaches and the only real benefit of bash is that it's super entrenched and extremely widely supported. With the same level of support Powershell would run circles around it. But since we're comparing things in the real world, Powershell is roughly equivalent to bash for most tasks. Better for some, worse for others.
> They're different approaches and the only real benefit of bash is that it's super entrenched and extremely widely supported. With the same level of support Powershell would run circles around it.
I'm not sure what you mean with this.
If you're referring to feature set and robustness with typing in scripting, I might agree with you.
I have however a much tougher time agreeing if you also think that powershell could come even close to bash/zsh when working from the shell, in both speed of use (and as you said) support.
> I have however a much tougher time agreeing if you also think that powershell could come even close to bash/zsh when working from the shell, in both speed of use (and as you said) support.
At the end of the day, both bash/zsh are "dumb" shells, they don't have the introspection capabilities Powershell has regarding commands they execute. That's both a benefit (they're flexible and you can throw anything at them) but also a disadvantage (they can't really help you much out of the box when you want to run a command; they are only able to help you regarding things they find out from the kernel, such as files, but not arbitrary info from the commands themselves).
The reasons that bash/zsh are faster and better supported boil down to history: tens of years of production usage by millions of people. As a result there's this cottage industry of tips and tricks, autocompletion scripts, etc. that makes bash and zsh usable.
I'm willing to bet that with a similar amount of man-hours poured into its ecosystem Powershell would beat both of them.
But as I said, we live in the real world and both bash/zsh are faster and a bit more robust than Powershell.
In essence, the speed when working in it (compared to powershell). While powershell does have more features, at that point I'm just more likely to use python or similar for portability.
Dude. I have worked with UNIX-like systems most of my professional career. I migrated to the "dark side" just recently (3 years ago).
They are both just tools to get the job done.
Your .bat files still work, and you can use the legacy terminal if you insist. All they did was change the default terminal from the archaic one to the newer one they first introduced as a non-default option a mere decade ago.
How is that any different from Linux distributions changing their default shell? Like back when Debian changed their default shell to dash, I don't remember anyone saying you should stay away from Debian, even though it broke a lot of existing scripts.
That's a bit different. The change only exposed that some scripts were already broken and worked by accident. Specifying that your script should be run with #!/bin/sh and then using bash syntax is what broke them. It's a bit like naming your files something.c, writing the code in C++ and compiling with `cc` just because `cc` happened to be linked to `c++` on your system. It will work for you, but don't be surprised when you distribute the source and people have problems using it.
Scripts that used #!/bin/bash still worked, scripts that actually used only sh syntax and #!/bin/sh still worked.
That's the difference - the extensions continue to work, but the gui link to the application changes. Essentially now people will open powershell instead of cmd without necessarily realising.
I am a fan of change. Likely similar to the parent of your comment, I am a fan of change when I can control when those changes happen.
With Windows 10 I am sadly realising that I have very little control over when things change and even less knowledge of what those changes will be. For software, this is a nightmare. As a maintainer, I will never know what changes are going to affect my software if I am not informed.
As a user, I no longer know when a feature of my computer that I use daily will disappear. If I used cmd yesterday, will it disappear tomorrow?
Who wants to be the passenger let alone the driver of a car if it steers and accelerates randomly despite your best efforts to steer it and drive it? Would you get in a car that removed the gearstick without warning? Would you really want to be in a car that accelerated and steered at whim?
This is what Windows 10 feels like, particularly compared to my 20 years of using Windows prior to it (Win3.11, 95, 98, Me for a day, XP, Vista for as little as possible, 7 for a lengthy time). With each of those releases, updates were applied by me in a timely manner, with the crucial point being that I could apply updates when I found it applicable for my own personal machine. I could read up on the release notes for each update and service pack to see what was changed.
With Windows 10, I get updates applied in a phantom fashion and do not know what they contain. It's like Apple's vague "fixed an issue" release notes.
EDIT: I must state that the removal of features is not unique to Microsoft, given that sweeping changes affected my Mac after every release since Snow Leopard. However, the key difference is that with Windows 10 you will never have control over when things are installed (unless you use some of the finer controls for updates available with GPOs in an enterprise; still what's the point of having an AD when you don't have full control over the endpoint systems?)
In technology, there's points where certain things stabilize and change infrequently. You can only build great things when your foundation isn't changing all the time.
I'm not really a heavy shell programmer. I mostly use it for git and simple scripts.
IMO, nothing's really "sold" me on powershell. It's always come across as some tool that some people like, but I've never really seen a reason to use it. (As I don't do a lot of complicated things in the shell.)
Hopefully this is something that's easy to get used to.
Not sure how that is related. It's been years since I last used Windows for anything development related. And honestly I would prefer the old cmd because it's easy to install Gow into it. It seemed much harder to have my common environment working properly in PowerShell without actually learning at least some of that syntax stuff, last time I tried.
You say that; and yet [32-bit] Windows 10 can still run 16-bit GUI applications originally compiled for Windows 1.0 back in 1985, more than 30 years ago. We aren't even talking about a recompile here, but straight up binary compatibility.
How many other OS/DE combos do you know that can boast anything similar?
It's not compatible at the syntax level, but it will also not break anything either. Batch files written for CMD will still be executed by CMD, not PowerShell. Unless you have an application that prompts the user to open a command prompt and then send keystrokes there to run commands, nothing will be broken here.
"Compatible because we run the other program" isn't really compatibility, it's running the other program.
Let's be clear: The syntax and idiosyncrasies and just plain stupidity of cmd.exe are very, very difficult to be compatible with. There's stuff in cmd.exe that beggars the words "awful and random". Deprecating the terrible thing that is cmd is a good move, it should have been done 20 years ago.
As I understand it, cmd.exe largely stagnated for a couple of decades because a senior dev at Microsoft "owned" the code and took any attempt to improve it as a personal affront. It couldn't be changed without bad political fallout. (Frankly I don't care how smart or politically powerful you are: If you're doing damage to customers, you should be told not to, and lose your job if you keep doing it).
So Microsoft is finally getting it. I just wish that PowerShell wasn't their answer, because it's got significant problems of its own.
Powershell is very insecure and is heavily abused by potential attackers as it's more flexible than CMD.exe. Of course, arguing about CMD.exe vs PS.exe is pointless, but it still needs to be argued.
There has been a recent spate of talks at the Black Hat conference and other confs about the versatility of PS.exe - how it is used to achieve persistence in comparatively fewer characters or lines of script than CMD.exe.
Some of my Win10 deployments even contain a script which silently disables PS.exe in the installation, and removes every reference to that executable in the registry. There are a few cases where I caught PS.exe re-spawning itself when a Win10 update arrives, so a PS-free deployment is hard to enforce.
I'm really confused here. What attack vectors apply to PowerShell, but not to a situation where I can invoke cmd.exe? I can tell PowerShell to only run signed scripts, and specify what signatures to honor, which is more than I can do with either cmd.exe or WSH. You're citing compactness of scripts, but a straight-up binary would be more compact still. I think I'm missing something pretty basic here.
Which is a post exploitation tool. Assuming you have a payload in Windows ready to execute, one typically wants to leverage tools already in Windows itself, like Powershell, which can make rootkits and other payloads have a lot less footprint, and make them difficult to spot using heuristics. Most crap payloads are actually easy to spot because their payload is massive.
Essentially my point is that you don't want to make it easy for attackers. For context, one would not want Powershell installed on 1000 Windows 10 installations.
I happen to get paid good money for deploying Win10 kiosks in different offices in my area and Powershell is one of many tools I routinely remove from Windows to decrease the attack surface in Win
Anything VBS can do, Powershell can do ("everything" that a running user has the security for). Number of characters isn't relevant to scenarios where you're running PS/VBS/CMD on a local machine.
PS - Your "PS breaker" machine configuration is going to cause problems. Several of Microsoft's installers already run PS scripts behind the scenes. Plus you're doing it for absolutely no technical reason (just ignorant fear).