I created a simple alias for xargs, with the intent of piping into it when needed. It simply runs a command for each line of input. My question to you is: is this useful, or are there better ways of doing this? This is just a little bit of brainstorming, basically. Maybe I have a knot in my head.
# Pipe each line and execute a command. The "{}" will be replaced by the line.
# Example:
# find . -maxdepth 2 -type f -name 'M*' | foreach grep "USB" {}
alias foreach='xargs -d "\n" -I{}'
For commands that already operate on every line from stdin, this won’t be very useful. But in other cases, it might be. A simpler usage example (and a useless one) would be:
find . -maxdepth 1 | foreach echo "File" {}
It’s important to use the `{}` as a placeholder for the “current line” being processed. What do you think about the usefulness? Have you any idea how to use it?
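One detail worth knowing: GNU xargs substitutes every occurrence of the replace string, even inside a larger word, so the placeholder can appear more than once. A small sketch (assumes GNU xargs; `shopt -s expand_aliases` is needed because aliases are off in non-interactive shells):

```shell
#!/usr/bin/env bash
# Aliases are disabled in scripts by default; enable them for the demo.
shopt -s expand_aliases
alias foreach='xargs -d "\n" -I{}'

# Every {} in the argument list is replaced, even mid-word:
printf 'a\nb\n' | foreach echo "copy {} -> {}.bak"
# → copy a -> a.bak
# → copy b -> b.bak
```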
Great minds, lol. I have almost the exact same command set up as a little script. Mine has an extra modification for my use case, and I named mine iter, but foreach is a good name for it too.
Oh iter is a good one too. :D
Here’s the code:
#!/bin/bash
cmd=$@
if echo $cmd | grep '/$'
then
    xargs -rd '\n' -I {} $cmd{}
else
    xargs -rd '\n' -I {} $cmd {}
fi
Usage is like:
ls *zip | iter shasum
or
ls *zip | iter shasum ../zipfile_archive/
The second one would get the shasum of zip files that have the same names as the ones in the cwd. This assumes, of course, that the input files have sane names, i.e., no spaces, etc.
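To see the two branches in action without any zip files around, the same trailing-slash dispatch can be wrapped in a function, with echo standing in for shasum (a sketch; the quoting is tightened a bit here, and `../archive/` is never actually read since echo only prints):

```shell
#!/usr/bin/env bash
# Same dispatch idea as the script above, as a function for a quick demo.
iter() {
    if [[ "${*}" =~ /$ ]]; then
        xargs -rd '\n' -I {} "${@}"{}    # last arg ends in /: prepend the path
    else
        xargs -rd '\n' -I {} "${@}" {}   # otherwise: append the line as an argument
    fi
}

printf 'a.zip\nb.zip\n' | iter echo
# → a.zip
# → b.zip
printf 'a.zip\nb.zip\n' | iter echo ../archive/
# → ../archive/a.zip
# → ../archive/b.zip
```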
Thanks for posting. I find the echo part and the extra variable a little bit flaky. Here is a modified version, though I’m not 100% sure it does exactly what your script does.

I skipped the extra variable, the echo, and the grep by comparing the arguments as `${*}`, which is similar to `${@}` but doesn’t separate the arguments and instead creates a single string. The `=~ /$` is a regex comparison, which Bash supports natively. Then I use `${@}` for the call, which keeps each argument separate. Maybe this could be done with `${*}` instead; I’m not sure which of them is correct for this case. At least filenames with spaces seem to work. Otherwise, I’m not claiming it’s better, just giving some food for thought.

#!/usr/bin/bash
if [[ "${*}" =~ /$ ]]; then
    xargs -rd '\n' -I {} "${@}"{}
else
    xargs -rd '\n' -I {} "${@}" {}
fi
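For readers unsure about the difference, here is a minimal demonstration of `"$*"` versus `"$@"` inside a function (the `show` helper is hypothetical, just for illustration):

```shell
#!/usr/bin/env bash
# "$*" joins all arguments into one string (using the first char of IFS),
# while "$@" preserves each argument, including embedded spaces.
show() {
    printf 'joined: <%s>\n' "$*"
    for arg in "$@"; do
        printf 'word: <%s>\n' "$arg"
    done
}

show grep "two words"
# → joined: <grep two words>
# → word: <grep>
# → word: <two words>
```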
Very nice; I will use this. Thanks!
Don’t use `ls` if you want to get filenames; it does a bunch of stuff to them. Use a shell glob or `find`.

Also, because filenames can contain newlines, if you want to loop over them, it’s best to use one of these:

for x in *; do do_stuff "$x"; done        # can be any shell glob, not just *
find . -exec do_stuff {} \;
find . -print0 | xargs -0 do_stuff        # not POSIX but widely supported
find . -print0 | xargs -0 -n1 do_stuff    # same, but for single-arg commands
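A quick sketch of why the `-print0` / `-0` pair matters: a filename containing a space comes through the pipeline intact, where whitespace-splitting would break it in two (assumes GNU findutils; the temp directory exists only for the demo):

```shell
#!/usr/bin/env bash
# Create a file whose name contains a space, then pass it through the
# NUL-delimited pipeline; the name survives as a single argument.
dir=$(mktemp -d)
touch "$dir/has space.txt"

find "$dir" -type f -print0 | xargs -0 -n1 basename
# → has space.txt

rm -r "$dir"
```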
When reading newline-delimited stuff with `while read`, you want to use:

cat filenames.txt | while IFS= read -r x; do do_stuff "$x"; done

The `IFS=` prevents trimming of whitespace, and `-r` prevents interpretation of backslashes.

Some additional thoughts to be aware of from looking closer at each line (previously I just glanced over it).
This point doesn’t directly affect your example, but I want to make you aware of something I’ve fallen into myself. It’s one of those Bash quirks; other shells might handle it differently, I’m only speaking about Bash here. In a regular for loop over files, if no file matches, the variable is set to the search pattern itself. So with `for x in *.png; do`, if no .png file is found, then `x` will be set to `*.png` literally. Depending on what you do in the loop, this could be catastrophic. But Bash has an option for this specifically: `shopt -s nullglob`. With this option, if no file matches, the glob expands to nothing and the loop body doesn’t run at all. More about Bash options: https://www.gnu.org/software/bash/manual/html_node/The-Shopt-Builtin.html

for x in *.abcdefg; do echo "$x"; done
shopt -s nullglob
for x in *.abcdefg; do echo "$x"; done
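A sketch in a guaranteed-empty temp directory makes the behavior visible: without nullglob the loop runs once with the literal pattern, and with nullglob the body is skipped entirely (rather than running with an empty string):

```shell
#!/usr/bin/env bash
# Run both loops somewhere the glob is guaranteed not to match.
cd "$(mktemp -d)"   # empty directory, so *.png never matches

for x in *.png; do echo "without nullglob: <$x>"; done
shopt -s nullglob
for x in *.png; do echo "with nullglob: <$x>"; done
# → prints only: without nullglob: <*.png>
```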
BTW one can also read line by line without cat, by redirecting the file directly:

while IFS= read -r line; do echo "Line: ${line}"; done < filenames.txt
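As a side note on what `IFS=` and `-r` are actually protecting against, a small sketch (the sample line has leading spaces and a backslash; without the two options, the whitespace is trimmed and the backslash is eaten):

```shell
#!/usr/bin/env bash
# The sample line is:  "  indented\tab"  (leading spaces, literal backslash).
printf '  indented\\tab\n' | while IFS= read -r line; do printf '<%s>\n' "$line"; done
# → <  indented\tab>

printf '  indented\\tab\n' | while read line; do printf '<%s>\n' "$line"; done
# → <indentedtab>
```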
Those find and ls commands were just there to illustrate how the actual alias works, to give a sense of it. I’m aware of those issues with filenames. It’s not about ls or find here.
Yeah, sorry then. It would still be good not to use `ls` in your example though; someone who doesn’t know about that might read this discussion and think it’s reasonable.

As for your original question, having `foreach` as a personal alias is fine. I wouldn’t use it in any script, since anyone else reading it probably already knows about `xargs`, so your `foreach` would just be more confusing to a potential reader, I think.

I guess you are right. I even point such things out to others, so fair enough. I will update the example, as I don’t want someone to see this and take it as a good one. As for the alias, I never use aliases in scripts anyway; those are for interactive usage only, at least for me. In scripts I follow some other rules, such as using the long form of options (unless it is a really common one).
I just use xargs -n1. Or -exec with find.
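For reference, a minimal sketch of the `-n1` variant (it runs the command once per argument, which means once per line as long as the input contains no spaces or quotes for xargs to split on):

```shell
# One invocation of echo per input token:
printf 'a.txt\nb.txt\nc.txt\n' | xargs -n1 echo item:
# → item: a.txt
# → item: b.txt
# → item: c.txt
```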
A bit of a tangent, but I almost never use `xargs` in the shell anymore, and instead use `while read line; do *SOMETHING* "$line"; done`, because `xargs` doesn’t have access to the shell’s local variables, aliases, or functions.

How to call `xargs` is typically one of those things I always forget. The foreach alias is a great solution!

My current solution was to use
`tldr` for all of these tools, but yeah, if I find myself having to do a for-each-line, I’ll definitely steal your alias. Luckily (knocks on wood) I almost exclusively work with YAML and JSON nowadays, so I should just learn `yq`.