Hacker News

PowerShell was clearly tailored for system scripting, sysadmin work, filesystem manipulation, etc., and it does a very good job at that.

Where I find it lacking is in the manipulation of large (tens of GBs) text files, which is what I do most of the time. If I try to cat a large file into wc -l, I have to kill the shell because it starts eating all my memory. Even when memory is not a problem, it is considerably slower than Cygwin or MinGW. I think the lines are converted to System.String and cached in memory; if that's the case, it might be very hard to fix without an architectural change.

I haven't found any good equivalent of less or grep, things like head and tail are unnecessarily verbose, etc.
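For reference, the closest built-in analogues I'm aware of (verbose compared to the Unix tools, as noted; file name is a placeholder):

```powershell
Select-String -Pattern 'error' -Path app.log   # roughly: grep error app.log
Get-Content app.log -TotalCount 10             # roughly: head -n 10 app.log
Get-Content app.log -Tail 10                   # roughly: tail -n 10 app.log
```

There's still no real equivalent of less; more works as a basic pager.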

Also, the text input is not nearly as advanced as readline-based shells (no Ctrl-K, Ctrl-Y, Alt-Backspace, history search, decent completion...).

I sincerely hope that someone (MSFT or others) comes up with a better solution. Cygwin/MinGW are a reasonable trade-off but PowerShell proved that a much better integration with the system is possible.



cat is an alias for Get-Content, which is notoriously slow. If you do some googling, people suggest using System.IO.StreamReader instead. Have you tried that?
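A minimal sketch of that suggestion (the path is a placeholder): reading through the .NET reader line by line avoids emitting one pipeline object per line the way Get-Content does.

```powershell
# Count lines in a large file via System.IO.StreamReader.
$reader = [System.IO.StreamReader]::new('C:\logs\big.log')
try {
    $count = 0
    while ($null -ne $reader.ReadLine()) { $count++ }
    $count
}
finally {
    $reader.Dispose()   # release the file handle even on error
}
```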


When I say cat I mean cat.exe, the unix utility. My usual workflow is

     <exe writing to stdout> | <filtering/manipulation> | <exe reading from stdin>
Whenever I run this inside PowerShell, whatever flows through the pipeline seems to be cached in memory, and it is orders of magnitude slower than when I run the same pipeline inside bash.
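One workaround I've seen for exactly this pattern (the .exe names below are placeholders) is to hand the entire native pipeline to cmd.exe, so the bytes flow directly between the processes instead of through PowerShell's object pipeline:

```powershell
# PowerShell launches cmd.exe once; the piping between the
# native executables happens inside cmd, not in PowerShell.
cmd /c 'producer.exe | findstr /i "error" | consumer.exe'
```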



