This is off topic, but I just learned from the article that PowerShell will soon replace cmd.exe as the default in Windows 10. I welcome this change, as I found the experience of using PS superior to that of bash/zsh for general use.
But I hope they have figured out the performance problem. As of writing, in the stable version of Windows 10, PS is perceptibly slower than cmd, so I have been using PS only when needed. Funnily enough, it was even slower in Windows 8, so the current state of affairs is better than ever. But to truly be the default, I think the performance of PS should at least match that of cmd.exe.
On that note, if you are a .NET developer you owe it to yourself to learn PS and how to write cmdlets. Compared to stdio and command-line argument parsing, it's an incredibly powerful and trivial way to expose a CLI for your .NET app.
Unfortunately, most .NET devs are programmers who grew up with VB, RAD and GUI tools; they don't understand the value of exposing small, UNIX-like CLI commands over big monolithic services, GUI apps, etc.
I learned C# with a big fat book, and the command line compiler... I understand the usefulness of command line applications.
However, when I need more than what a simple shell script gives me, I'm more inclined to reach for node, ruby or python than C#/.NET... the overhead of a quick scripting environment is quite a bit lower than having to set up a project and build requirements. I've done both...
Why node? Simply npm... create a directory, run npm init, write my script, and reference it from an alias or .cmd in my ~/bin directory (added to my path). Then it works on Windows, Mac and Linux, all of which I use regularly. PS is mostly Windows.
Have you ever written a cmdlet? This is my point: regardless of how you feel about PS vs Bash or Python vs C#, exposing your application logic through cmdlets is incredibly simple and powerful - much simpler than writing a CLI app in other languages, even with the argument-parsing frameworks in Python and the like. Cmdlets let you pipe and return/accept .NET objects.
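For anyone who hasn't tried it: you don't even need C# for this - an "advanced function" in plain PowerShell behaves like a cmdlet. A minimal sketch, with made-up names:

```powershell
function Get-Greeting {
    [CmdletBinding()]
    param(
        # Accept values from the pipeline, like any built-in cmdlet
        [Parameter(Mandatory, ValueFromPipeline)]
        [string]$Name
    )
    process {
        # Emit a real object, not text, so downstream cmdlets can filter on properties
        [pscustomobject]@{ Name = $Name; Greeting = "Hello, $Name" }
    }
}

'World', 'HN' | Get-Greeting | Where-Object Name -eq 'World'
```

Drop that in a .psm1 and Import-Module it, and you get pipeline input, -Verbose, -ErrorAction, etc. for free.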
So my comment isn't "PS everything" or "C# everything"; it's that if you're using C# to write a big fat monolithic app with multiple isolated functionalities, you can expose those fairly easily with PS.
I'm a .NET developer, and I still install cygwin/bash on my computer because I can write stuff quicker and better with it. PS gets incredibly verbose for anything non-trivial. I learned PowerShell long before bash, too.
Calling Powershell incredibly verbose is about the same as calling C# incredibly verbose.
Sure, perhaps they're not as compact as some of the incredibly information-dense perl scripts that people come up with.
I'd wager, though, that it's easier to understand what a random PowerShell script is doing than a random perl or bash script that pipes output through a dozen utilities.
I've got an example of PowerShell getting verbose here: http://flukus.github.io/2015/03/13/2015_03_13_Powershell-is-...
Ignore the rest and look at the example where I'm trying to copy a directory recursively with a filter. In bash the same task is a relatively simple one-liner:
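Something along these lines (a sketch from the description above; `cp --parents` is GNU-specific):

```shell
# Recursively copy only the .html files, recreating the directory tree
cd src && find . -name '*.html' -exec cp --parents {} ../dest \;
```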
It is a bit shorter than your bash equivalent, while still doing the same thing.
I think you're mistaken about what `cp` (`Copy-Item`) is intended to do. Its main purpose is to copy, not to filter. Yes, `Copy-Item` supports filtering because it exposes the filesystem provider's parameters, but to stay closer to the cmdlets' original purposes, you should `ls` (`Get-ChildItem`) first, because it has more filtering capability, and then pipe the results to `cp`.
Is it complex? No; in fact its complexity is exactly the same as that of your `find` example! `find` is more or less equivalent to `ls` in that both gather the list of files that meet certain criteria. And then, just as `find` invokes `cp` multiple times to do the actual copying, `ls` (`Get-ChildItem`) feeds its results to `cp` (`Copy-Item`). They are structured in a similar way.
I'd even say the PS one-liner is more akin to "the UNIX philosophy". In the bash one-liner, there is a direct parent-child relationship between `find` and `cp`, which doesn't utilize pipes at all. Whereas the PS one-liner connects two equivalent processes (`ls` and `cp`) with a pipe. This is exactly what I'd call "small processes work together to get the job done", which is again the UNIX way.
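A sketch of that pipeline (paths are just examples, and note that piping straight into `Copy-Item` like this flattens the directory tree, so a faithful recursive copy takes a bit more work):

```powershell
# Gather the matching files first, then hand each one to Copy-Item
Get-ChildItem -Path .\src -Recurse -Filter *.html |
    Copy-Item -Destination .\dest
```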
Well yes, that's an example relying on the find utility, whose equivalent in the Windows world (at least for file management) is robocopy.
robocopy $pathFrom $pathTo *.html /s
If you wanted to exclude, say, 'main.js', then it'd be:
robocopy $pathFrom $pathTo *.html /s /xf main.js
...and /xd excludes directories, too.
I guess copy-item should be smarter to be able to handle this.
Also, for what it's worth, your example in the linked post is a bit more verbose than it really needs to be.
You're comparing a full-blown programming language to nant there, so yeah, it's a bit more verbose. But I bet I can come up with a counter-example where nant is a giant nightmare to get right (or requires dropping straight to executing external commands). There's a reason I killed off all usage of nant years ago and went to PowerShell for build scripts.
Perl scripts can be vastly more powerful than PS. Perl is a full blown general purpose programming language with a massive collection of libs and frameworks.
It is, however, very powerful that Perl can be used in bash pipelines. But that is also true of any Unix tool that does I/O.
So Bash is also very powerful with the help from all its friends that can be used with pipes, natively in the shell.
And you don't HAVE to code Bash in the most convoluted way possible. Sane code structure and naming goes a long way.
> Perl is a full blown general purpose programming language with a massive collection of libs and frameworks.
> So Bash is also very powerful with the help from all its friends
> you don't HAVE to code Bash in the most convoluted way possible. Sane code structure and naming goes a long way.
All of these things are also true of PowerShell. PowerShell can use any .NET assembly and, if you really want, make native Windows system calls too.
I've been dabbling with PS (using it as my primary console) but my workflow just isn't that complex, so I don't have a compelling reason to write cmdlets (yet!).
The MSDN docs are great; it's just difficult to navigate them. Perhaps this would help? https://msdn.microsoft.com/en-us/library/dd878294(v=vs.85).a... There are tutorials in there as well, but I didn't find them very useful compared to the overview/concepts documentation.
Yeah, that's what I went through as well, but usually tutorials are shorter and more to the point; this is more reference/in-depth style. I'm guessing OP was looking for something along those lines.
For one, I can do away with all the text parsing that is prone to breakage when a tool changes its output somewhat (or, god help me, tools that can't handle Unicode properly).
The power of bash actually comes from the GNU coreutils and other userland software. It has very little to do with bash itself.
Try out bash on a BusyBox system and feel how crippled it is.
So you mean PS as a scripting language, not as a shell, is superior. That's not a surprise, since it's newer and designed to do away with many annoyances of Bash. As a shell, however, it is sadly only slightly more usable than the REPL of, e.g., Python.
I had been a full-time Linux (desktop) user for more than 10 years, and only recently made the transition to Windows. As I'm nowhere near proficient at using PowerShell, I might be overrating it a bit; the grass is always greener on the other side. Anyway, what I found satisfying while using PS was:
1. Input/output is done using objects. I know that "inter-process communication should be done with text" is the UNIX philosophy, and I appreciated that when using Linux, but after using PS I started to have mixed feelings about it. When using bash/zsh, I typically used awk to extract the data I wanted from the text emitted by an external process. Doing so isn't hard, mostly as simple as `awk '{print $3}'` or something like that, but it is still a bit of an annoyance and, more importantly, vulnerable to changes in the output format.
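The kind of text scraping I mean, as a small illustration (the data is made up):

```shell
# Pull the second whitespace-separated column out of tabular output
printf 'alice 42 running\nbob 7 sleeping\n' | awk '{print $2}'
```

This works until the tool's maintainers reorder or reformat the columns.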
PS cmdlets communicate with each other using objects, so it is very easy to extract some columns from the command results. For example, when I query a process in PS:
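Something like this, reconstructed from memory (the process name and the numbers in the output are just illustrative):

```powershell
PS> Get-Process -Name explorer | Select-Object Name, Id, CPU

Name          Id    CPU
----          --    ---
explorer    4321  12.34
```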
As you can see, I can simply specify the column name(s).
This is probably why many Linux commands have detailed options to limit the displayed information. For example, `uname` has -s, -r, -m, -p, and many others that are just portions of -a. If it were a PS cmdlet, there would be no need for options other than -a; users could just pick the properties they want. Likewise, `ps` has many options just to control the output, which again isn't necessary on the PS side.
Also, because some scripts may rely on column order (e.g. my script assumes the third column is always the one I want, because I hard-coded `awk '{print $3}'` in there), it is very hard to change the output layout of Linux commands. In PS there is no layout in the first place, so this backward-compatibility concern doesn't exist.
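To illustrate with the `uname` case (a hedged sketch; the WMI class and property names are what I remember them being on Windows):

```powershell
# One command, and the caller picks the "columns" - no -s/-r/-m flags needed
Get-CimInstance Win32_OperatingSystem |
    Select-Object Caption, Version, OSArchitecture
```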
2. Command names are much clearer. Many names are descriptive enough that I don't have to remember the exact abbreviated forms, but at the same time they provide shorter aliases. For example, `Get-Process` can also be called `ps`. bash/zsh can also benefit from this by manually assigning aliases, but I believe "sane defaults" should be long descriptive names first and abbreviated forms later.
3. A much more object-oriented design. Say you want to get the last modified date of a file. In Linux I'd use `stat` and somehow extract the information from it. Or `stat` may have some option to print mtime, so I may have to google for it. In PowerShell, I can use this instead:
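Something along these lines (the filename is just an example):

```powershell
PS> (Get-Item .\report.txt).LastWriteTime
```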
All of this benefits from tab completion, so you can easily find what properties any object has. This greatly improves discoverability, so I don't need to rely on documentation (man pages on Linux, MSDN on Windows).
Not only that, but PS is much closer to a general-purpose programming language than bash/zsh. It has built-in arithmetic (no need to rely on expr/bc), and it even has some basic type safety; for example:
PS> 1 + 2
3
PS> 1 + "a"
Cannot convert value "a" to type "System.Int32". Error: "Input string was not in a correct format."
+ 1 + "a"
+ ~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [], RuntimeException
+ FullyQualifiedErrorId : InvalidCastFromStringToInteger
which might be silently ignored in bash/zsh in most cases. As you can see it even has fixed-width integer types (Int32), which is rarely seen in dynamic languages!
When the logic of my scripts got complicated, I tended to abandon shell scripts and start programming in Python. But after learning PowerShell I'm starting to gain confidence that typical workflows can be implemented in PowerShell in a readable way. I even think PS can be used as a general-purpose programming language, a sort of "Python without dependencies", because PS is installed by default on Windows nowadays.
Whoa, my response got unintentionally huge O.o. Hope this helps anyone.
You seem to have articulated all my thoughts perfectly.
The thing about parsing text is a huge pain point for me, because I've had tools change their output (and Unicode issues) which broke some scripts.
The structured nature of PS makes it very powerful and allowed me to write a script that checks the latest upstream versions of some software and tells me if there are updates. I have that in bash as well, but it's comparatively unmaintainable.
I agree that PS's object-oriented communication between commands is much better than text. But I disagree with your point (2) saying that command names are more discoverable. With Linux-style conventions, there is a hierarchy that helps you navigate between a command's features. For example, with `docker image ls` you can type docker, see that there is an image subtree, type docker image, see that there is an ls command, and run it. With PowerShell you kind of have to guess and type `get-docker` and tab through commands. Also, sometimes the verb is not easy to guess. So in terms of not relying on documentation, as you called it, I think PS is worse.
That being said, once you do know the command you need, using it is much easier with PS, as you nicely described. Tip: if you liked tab completion, try ctrl+space :)
I was thinking more about the "proper noun" aspect of the UNIX commands. I mean, what do `awk`, `sed`, `tar`, `xargs`, `df` mean? Why does `free` only print the remaining memory, not the disk space? Why is `top` even related to processes? They are all like that because Unix has had a long history. In the beginning `grep` was enough, but then someone wanted to improve the state of affairs and made a new command named `awk`. Probably there was only `ar`, and later the need for `tar` arose. `free` is not `mf` and `df` is not `free`, because the original designer thought the free space of main memory was more important. All this inconsistency and idiosyncrasy makes learning the UNIX command hierarchy harder. We developers don't feel it that way because we're all very used to these commands, but somewhere in our memory there may be a time when we tried really hard to memorize all the useful commands just to do basic things.
PowerShell didn't have this backward-compatibility concern, so it built up its own vocabulary from scratch. While it is nowhere near perfect (as your example shows, the VERB-NOUN naming scheme can be a bit cumbersome when some functionalities need to be grouped), I'd say it is at least much more consistent regarding basic file/device management, because there was simply no baggage to consider when PowerShell was first designed.
Ah, and thanks for suggesting `Ctrl-Space`! I thought it would be better if PowerShell had a GUI widget listing possible candidates, and was considering sending a patch. It turns out the MS people are definitely more clever than me. :)
I have used the majority of mainstream shells, and Posh is a spaceship compared to any other in existence. I won't reiterate why again and again; there are plenty of places, either on HN, reddit or SO, that explain why. I guess you will have to give it a serious try.
PS is great if everything you are dealing with is built for the .NET ecosystem.
It's less great otherwise.
This makes it often great for working on Windows, and definitely great for working with Windows and other MS software that is designed for the .NET/PS world.
Perhaps, but I find it very easy to interact with various webservices. We use powershell to call API methods on our load balancers, change AWS configuration, change DNS records and a variety of other things.
I go back and forth between Windows/Unix for my day job, and I also would be curious about any actual bullet points. I find that even in a situation where PS could potentially prove more useful or has an extra feature, what you end up with is a new set of Chrome tabs open just trying to figure out how to do the thing.
I suppose this might(?) be mitigated if you're embedded in the .NET world, but I just can't seem to commit the silly cmdlet naming to memory, and since they also have the admittedly interesting object-piping thing going on, you're always battling two pain points at a time instead of just one (e.g. syntax/naming in bash).
One of the best bits of PS1 advice I've seen was to get used to the Verb-Noun naming pattern [1] and in particular, the discoverability of the very standardized set of verbs [2].
Once you can guess the verb you want, then it can often be a simple matter of finding the right noun, and often nouns will be useful in sets similar to verbs.
Get-Verb, Get-Command, and Get-Help are all quite useful for looking for a command.
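For example (the search terms here are just illustrative):

```powershell
PS> Get-Verb                          # the standardized list of approved verbs
PS> Get-Command -Verb Stop -Noun *    # every command using a given verb
PS> Get-Help Stop-Process -Examples   # worked examples for a specific command
```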
For dealing with the object pipeline I tend to find ConvertTo-Json very handy because as a developer I'm already quite used to reading objects in JSON already. (ConvertFrom-Json can similarly be used to bootstrap a PS1 pipeline with test data or remote data.)
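For instance (output shape illustrative; the Id will differ per session):

```powershell
PS> Get-Process -Id $PID | Select-Object Name, Id | ConvertTo-Json
{
    "Name":  "powershell",
    "Id":  1234
}
```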
I've used both extensively and I think they are fairly different, with different strengths and weaknesses. Bash is more pragmatic and more concise on the command line. PS is more uniform in its design and nicer for scripting IMO. It allows passing objects through pipes, has built-in JSON reading and writing, built-in parameter handling with defaults, mandatory and optional params, switches etc., and such niceties.
Bash is not more concise; it's the other way around, of course, if you use the default aliases. The reason is logical: you almost never use text parsing in Posh, while you almost never do anything without it in bash.
cmd.exe has a limited set of builtins and relies on external applications with a plain-text interface.
PowerShell has builtins, can import code from any .NET assembly or native DLL, can invoke external processes, and offers a full-featured programming language and structured data.
GP was replying to a comment about the performance of PS compared to cmd.exe, stating that PS could never be on par with cmd.exe because it needs to load more.
Yeah, this is the one thing with PowerShell: with great power comes, erm, not so great performance. I've never been a full-time shell user, so after years of occasionally automating stuff using bash, discovering PowerShell really was like a breath of fresh air. No more trying and failing to come up with the correct regex to parse the plain text spat out by tools. Command-line completion of arguments. A whole bunch of sane defaults and command names making things easy to discover. Etc. At first I only ran PS on a beefy workstation and didn't notice it was kinda slow (even for common operations). But on not-so-beefy machines: yeah, not nice. Then again, cmd is worthless in comparison with PS, so I stopped caring. Thing is also: the time gained by how fast I can get stuff done in PS probably makes up for the time lost trying to figure out the weird syntax of other shells, especially cmd.