Node.js devs, so picture this: you run `npm install` and you get a bunch of packages with audit errors.

The only thing I want to know at that point is: which root package do these vulnerable dependencies belong to? (Running `npm audit fix` is a last resort, as I don’t like it fiddling with the dependencies of nested packages.)

It’s also not a straightforward thing to do, but it’s nothing jq and a bit of piping can’t fix:

```bash
npm audit --json | jq -r '.vulnerabilities[].name' | xargs -n1 npm ls
```

If you’re using fish shell, create an abbr(eviation) or an alias for it with a name like `npm-audit-tree` and you’re golden ;)

```bash
abbr --add --global npm-audit-tree 'npm audit --json | jq -r ".vulnerabilities[].name" | xargs -n1 npm ls'
```

(I usually prefer abbreviations to aliases as I like to remember/see the actual command being executed.)

Enjoy 💕

#NodeJS #npm #audit #security #JavaScript #JSON #jq #xargs #dev #tip

Already more than 10 years old, but I just stumbled over this great blog post showing an insanely fast local data-processing pipeline using nothing but #find #xargs #awk .
I really like the use of xargs for parallelization here.

Command-line Tools can be 235x Faster than your Hadoop Cluster - Adam Drake
https://adamdrake.com/command-line-tools-can-be-235x-faster-than-your-hadoop-cluster.html

#aprende #studia #linux #xargs #español #espanol

Example of using the xargs command:

find / -name 'nftables*' -type f 2>errors.txt | xargs -I % cp % /home/someUser/

The first command is find, which generates a list of 9 files (see the image). These are then piped to xargs. The -I option defines the placeholder token, which is then used in the cp command.

The 9 files were copied to the destination directory (see the other image)

Surely you know that country song by Roger Miller about #xargs:

King of the Quote

🤓

Feels a bit dangerous, but I just dealt with #git complaining about local untracked files that would be overwritten by a `git pull` by selecting the list, copying it, then doing `pbpaste | xargs rm`. I assumed I'd get asked for confirmation for each one... but it just deleted them all! Eek! #bash #xargs
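A safer variant of that last step (a sketch with made-up file names, not the actual paste buffer from the post): prefix the command with `echo` as a dry run, and only then run it for real. GNU and BSD xargs also support `-p` to prompt before each invocation.

```shell
# list.txt stands in for the pasted list of untracked files
# (hypothetical names, for illustration only).
mkdir -p /tmp/rm-demo && cd /tmp/rm-demo
touch old1.txt old2.txt
printf 'old1.txt\nold2.txt\n' > list.txt

# Dry run: with `echo` in front, xargs prints the rm command it
# would execute instead of running it.
xargs echo rm < list.txt

# Only after checking the preview, delete for real.
xargs rm < list.txt
```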

Ok. Remotely cleaning a huge (>2 TB, many many files and subdirs) #Nextcloud-hosted folder (not the whole user) is *painful*. Without access to the host it runs on, I am limited to either the web interface (which breaks) or using #webdav with a tool like #rclone.

#rclone purge breaks (timeout), so rclone delete it is. Which is *slow*, really slow. Probably because the remote moves a deleted file into the (for this case) useless trashbin which can't be turned off.

At least one can use #xargs to run multiple rclones in parallel - first get a list of entries of the to-be-deleted-dir (rclone lsf), format them the way rclone expects (basically put name of remote in front) and use something like `xargs -n 1 -P0 rclone delete -v --rmdirs` on it.
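Spelled out, that pipeline might look like the sketch below. The remote name `nc:` and folder `big` are hypothetical; since the rclone part can't be demonstrated without a remote, a harmless local stand-in shows the same `-P` fan-out actually running.

```shell
# The rclone version (not run here), with hypothetical remote/dir names:
#
#   rclone lsf nc:big \
#     | sed 's|^|nc:big/|' \
#     | xargs -n 1 -P 0 rclone delete -v --rmdirs
#
# The same fan-out pattern with a harmless local stand-in:
mkdir -p /tmp/par-demo && cd /tmp/par-demo
touch f1 f2 f3 f4
# list entries, prefix them, then delete one per process, 4 in parallel
ls | sed 's|^|/tmp/par-demo/|' | xargs -n 1 -P 4 rm
```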

Still, it’s been running since yesterday afternoon and we are down to 1.4 TB left, of 2 TB. Even in parallel, the webdav shit manages to delete only 2 to 4 files a second.

#TIL: #xargs has an option -a to read items from a file instead of from standard input. This is perfect if you need to run xargs without a surrounding shell to provide the commonly used pipe mechanism, e.g. calling it just via exec() or similar.

My case today was with Perl's Test::Command::Simple::run_ok() in pxzgrep's TAP based test suite.
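A minimal sketch of `-a` (GNU xargs; macOS/BSD xargs doesn’t have it), with made-up file names:

```shell
# victims.txt stands in for whatever item list the program wrote out.
mkdir -p /tmp/arga-demo && cd /tmp/arga-demo
touch one two three
printf 'one\ntwo\n' > victims.txt

# Same effect as `cat victims.txt | xargs rm`, but xargs opens the
# file itself -- no pipe, hence no surrounding shell needed when
# calling it directly via exec().
xargs -a victims.txt rm
```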

#linuxcli #cli #climagic #commandlinemagic

#unix is awesome #oneliner #otd

"Hey UNIX, give me a list of all our infra projects that create databases sorted by linecount"

rg -l OLAmazonDB | xargs wc -l | sort

This message brought to you by the awesome power of #ripgrep and #xargs :)

More `xargs -d'\n'`, please!

#xargs #shell #bash
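Seconded: by default xargs splits its input on any whitespace, so filenames containing spaces get torn into separate arguments; `-d'\n'` (GNU xargs) makes each input line a single argument. A small sketch with a made-up filename:

```shell
mkdir -p /tmp/dnl-demo && cd /tmp/dnl-demo
touch 'my file.txt'

# Default whitespace splitting hands rm the two bogus arguments
# "my" and "file.txt", so the real file survives and rm errors out:
printf '%s\n' 'my file.txt' | xargs rm 2>/dev/null || true

# With -d'\n' the whole line is passed as one argument:
printf '%s\n' 'my file.txt' | xargs -d'\n' rm
```

(On macOS/BSD xargs, which lacks `-d`, the usual workaround is `find … -print0 | xargs -0`.)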

#grep, #xargs, and #sed are the holy trinity for batch processing text on #linux:

https://rm-o.dev/til/sed-batch-processing-files/
