Hacker News

For me, I always used for loops and only recently (after a decade of using Linux daily) learned about the power of piped loops. It’s strange to me that you’re more comfortable with those than with for loops, but I think it does make sense, since you’re letting a program generate the list to iterate over. A pain point in for loops is getting that right, e.g. there isn’t a good way to iterate over files with spaces in them using a for loop (this is why I learned about piped loops recently).
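A minimal sketch of such a piped loop, pairing `find -print0` with `read -d ''` so spaces (and even newlines) in names survive; the scratch directory and filenames here are just for illustration:

```shell
# Create a scratch directory with an awkward filename (illustrative paths).
mkdir -p /tmp/hn-pipe-demo
touch "/tmp/hn-pipe-demo/a file.txt" "/tmp/hn-pipe-demo/plain.txt"

# find -print0 emits NUL-terminated names; read -r -d '' consumes them,
# so spaces and newlines in filenames pass through intact.
find /tmp/hn-pipe-demo -type f -print0 |
while IFS= read -r -d '' f; do
    printf 'found: %s\n' "$f"
done
```

One caveat: the loop body runs in a subshell (because of the pipe), so variables set inside it don't persist afterwards.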


> A pain point in for loops is getting that right, e.g. there isn’t a good way to iterate over files with spaces in them using a for loop

If those files came as arguments, you can use a for-loop as long as they're kept in an array:

  for f in "${files[@]}";
That handles even newlines in the filenames, while I'm not sure if you can handle that with a while-read-loop. IFS=$'\0' doesn't seem to cut it (bash variables can't hold a NUL byte, so that assignment is effectively empty).

for-loops over an array seem preferable for working with filenames. If a command is generating the list, then something like `xargs -0` is the safer choice.
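A small sketch of the array form; the filenames are made up, and one deliberately contains a newline to show it survives:

```shell
# An array keeps each name as a single element, regardless of content.
files=("first file.txt" $'second\nname.txt')

# "${files[@]}" expands each element as exactly one word, so spaces and
# newlines in names never get split.
for f in "${files[@]}"; do
    printf '<%s>\n' "$f"
done
```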


My problem was that I had a directory with probably 200+ subdirectories, and each one, plus the files and subdirectories below them, had a couple of spaces in the name. I typically use

    for f in `ls`;

for operations like that, but the tree was obviously built on Windows (I run Steam on Ubuntu) and I never interact with Windows, so tbh I had never thought of this problem before.


The GNU way for handling files that have inconvenient characters in their names is:

    find ... -print0 | xargs -0 ...
It makes all the problems go away.
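For example (scratch directory and filenames are illustrative):

```shell
# Set up two files, one with a space in its name.
mkdir -p /tmp/hn-xargs-demo
touch "/tmp/hn-xargs-demo/has space" "/tmp/hn-xargs-demo/plain"

# -print0 separates names with NUL bytes; xargs -0 splits only on NUL,
# so every name reaches the command as exactly one argument.
find /tmp/hn-xargs-demo -type f -print0 | xargs -0 ls -ld
```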


Also you can use readarray to store the found filenames in a bash array (to use with a for loop).
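A sketch of that combination (`readarray -d ''` needs bash 4.4+; paths are illustrative):

```shell
# Two files, one with a space in its name.
mkdir -p /tmp/hn-readarray-demo
touch "/tmp/hn-readarray-demo/one two" "/tmp/hn-readarray-demo/three"

# readarray -d '' splits on NUL, pairing cleanly with find -print0.
# Process substitution (not a pipe) keeps the array in the current shell.
readarray -d '' files < <(find /tmp/hn-readarray-demo -type f -print0)

for f in "${files[@]}"; do
    printf 'got: %s\n' "$f"
done
```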


You can also have your script change the bash field separator (IFS).

https://bash.cyberciti.biz/guide/$IFS

Something I wish I'd learned 23 years ago instead of 3 years ago :(
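A sketch of the IFS approach; note it only helps when the names contain no newlines, and the unquoted expansion still performs globbing (paths are illustrative):

```shell
# Two files, one with a space in its name.
mkdir -p /tmp/hn-ifs-demo
touch "/tmp/hn-ifs-demo/a b" "/tmp/hn-ifs-demo/c"

# Restrict word-splitting to newlines so spaces survive the unquoted
# $(...) expansion; restore IFS afterwards.
OLDIFS=$IFS
IFS=$'\n'
for f in $(find /tmp/hn-ifs-demo -type f); do
    printf 'f=%s\n' "$f"
done
IFS=$OLDIFS
```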



