Process Substitution

Treat command output as files — compare, join, and loop without temp files.

Comparing Output (diff/comm/join)

Compare output of two commands — no temp files needed
diff <(cmd1) <(cmd2)

Each <(cmd) expands to a file-like path, typically /dev/fd/N (a named FIFO in a temp directory on systems without /dev/fd). No temp files touch disk.
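
A quick way to see the expansion (a minimal sketch; the exact path varies by OS):

```shell
# <(cmd) expands to a path that bash wires to cmd's stdout
echo <(true)             # prints something like /dev/fd/63
# The path is readable like any ordinary file:
greeting=$(cat <(printf 'hello\n'))
echo "$greeting"
```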

Compare remote files — diff configs across servers without copying
diff <(ssh host1 cat /etc/hosts) <(ssh host2 cat /etc/hosts)
Sorted diff — find content differences regardless of order
diff <(sort file1) <(sort file2)

Sort both inputs before comparing so line order does not produce false differences.
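
For example, two files holding the same lines in different order diff clean once both sides are sorted (illustrative data):

```shell
f1=$(mktemp); f2=$(mktemp)
printf 'beta\nalpha\n' > "$f1"
printf 'alpha\nbeta\n' > "$f2"
diff "$f1" "$f2" > /dev/null || echo "raw diff: differ"
diff <(sort "$f1") <(sort "$f2") > /dev/null && echo "sorted diff: identical"
rm -f "$f1" "$f2"
```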

Compare DNS responses — verify internal and external DNS agree
diff <(dig @10.50.1.50 example.com +short) <(dig @8.8.8.8 example.com +short)
Diff remote vs local — compare API response with local file interactively
vimdiff <(curl -s https://api/v1/config) local-config.json
Three-column comparison — col1: only in list1, col2: only in list2, col3: both
comm <(sort list1.txt) <(sort list2.txt)
Items in expected but not actual — show missing items only
comm -23 <(sort expected.txt) <(sort actual.txt)

-23 suppresses columns 2 and 3, leaving only items unique to the first file.
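
A toy run with inline data (already sorted, as comm requires):

```shell
# gamma appears only in the first input, so only it survives -23
comm -23 <(printf 'alpha\nbeta\ngamma\n') <(printf 'alpha\nbeta\n')
# → gamma
```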

Intersection — items present in both files
comm -12 <(sort group1.txt) <(sort group2.txt)

-12 suppresses unique-to-each columns, showing only shared items.
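
The same toy data shows the intersection:

```shell
# b and c appear in both sorted inputs
comm -12 <(printf 'a\nb\nc\n') <(printf 'b\nc\nd\n')
# → b
#   c
```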

Join on sorted key — relational join of two files
join <(sort file1) <(sort file2)

Both files must be sorted on the join field.
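
A minimal join on an id column (made-up rows; field 1 is the default join key):

```shell
# Both inputs are already sorted on field 1
join <(printf '1 ana\n2 bo\n') <(printf '1 admin\n2 dev\n')
# → 1 ana admin
#   2 bo dev
```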

Joining & Pasting

Join columns from same file — extract two fields and display side by side
paste <(cut -d: -f1 /etc/passwd) <(cut -d: -f7 /etc/passwd)
Build computed table — numbers and their squares merged as columns
paste <(seq 5) <(seq 5 | awk '{print $1*$1}')

paste merges parallel streams as tab-separated columns.
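
A self-contained illustration with two inline streams:

```shell
# Two streams zipped line-by-line into tab-separated columns
paste <(printf '1\n2\n3\n') <(printf 'one\ntwo\nthree\n')
```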

Loop Input

Loop over command output — process substitution feeds stdin to while
while IFS= read -r line; do echo "processing: $line"; done < <(find . -name "*.py")

< <(cmd) feeds process substitution as stdin to the while loop. Runs in the current shell, so variable assignments persist.
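
The persistence point is easy to verify with a counter (toy data):

```shell
# count survives the loop because no subshell is involved;
# `printf ... | while ...` would lose the increments
count=0
while IFS= read -r line; do
  count=$((count + 1))
done < <(printf 'a\nb\nc\n')
echo "$count"   # → 3
```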

Iterate filtered file — awk strips comments, process sub feeds clean lines
while IFS= read -r host; do ssh "$host" uptime; done < <(awk '/^[^#]/' hosts.txt)

Output Splitting (tee)

Parallel output streams — write compressed and checksum simultaneously
tee >(gzip > backup.gz) >(sha256sum > backup.sha256) > /dev/null < data.bin

One read pass produces both outputs.

Split output by pattern — route to multiple filtered destinations
cmd | tee >(grep ERROR > errors.log) >(grep WARN > warnings.log) > full.log

Single command’s output routed to multiple filtered log files.
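
A self-contained run (temp paths via mktemp; note that >() consumers finish asynchronously, hence the brief pause before reading their output):

```shell
tmp=$(mktemp -d)
printf 'ok\nERROR boom\nok\n' |
  tee >(grep ERROR > "$tmp/errors.log") > "$tmp/full.log"
sleep 1   # >() children are not waited for; give them time to flush
errors=$(cat "$tmp/errors.log")
full_lines=$(wc -l < "$tmp/full.log")
echo "$errors"   # → ERROR boom
rm -rf "$tmp"
```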

Sort CSV preserving header — header row first, then sorted data
cat <(head -n 1 data.csv) <(tail -n +2 data.csv | sort -t, -k2)

Loading & Sourcing

Source command output — load shell completions without a temp file
source <(kubectl completion bash)
Load secrets into shell — process substitution avoids writing secrets to disk
source <(vault read -field=data secret/env | base64 -d)
Load command output into array — each line becomes an array element
mapfile -t results < <(find . -name "*.conf" -type f)

Handles spaces in filenames correctly. -t strips trailing newlines.
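
A small demonstration with illustrative space-containing names:

```shell
# mapfile splits on newlines only, so spaces inside entries survive
mapfile -t files < <(printf 'report final.txt\nnotes 2024.txt\n')
echo "${#files[@]}"   # → 2
echo "${files[0]}"    # → report final.txt
```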

Count matches — process sub feeds grep output to wc
wc -l < <(grep -r "TODO" --include="*.py" .)

Note: as with a plain pipe, $? here is wc's exit status and grep's is discarded. Process substitution matters most when the reading command must run in the current shell (read, mapfile, source).

See Also

  • Pipes — linear pipelines; process substitution enables branching

  • Streams — file descriptors underlying <() and >()