CLI Data Processing: Shell Patterns for ISE Automation
Context
This document captures CLI data processing patterns discovered during the Linux AD Authentication deployment for Xianming Ding's workstation. The goal: complete testing in the home enterprise environment, then hand off the workstation by tomorrow noon.
Project Structure
Shared team runbooks (CHLA internal):

- `linux-workstation-deployment-runbook.adoc` - Main deployment guide
- `executive-status-review.adoc` - Status reporting
- Team views: `cloud-ad-tasks.adoc`, `infosec-tasks.adoc`, `syseng-tasks.adoc`

Internal runbooks using netapi are personal intellectual property and remain in `.internal/` - not shared externally.
Working Method
All reference material comes from docs.domusdigitalis.dev - eating my own dog food. No external documentation. If it’s not in my docs, it gets added.
Objectives
- Master jq patterns for ISE API output formatting
- Document shell color escape sequences for professional output
- Create reusable examples for cross-project includes
- Validate Linux AD Auth deployment on modestus-aw
- Document fleeting ideas for future roadmap items
Shell Color Patterns
ANSI Escape Codes
| Code | Color | Usage |
|---|---|---|
| `\e[1;36m` | Cyan | Hostnames, names |
| `\e[1;33m` | Yellow | Numbers, MACs, ranks |
| `\e[1;32m` | Green | Enabled, success, checkmarks |
| `\e[1;31m` | Red | Disabled, errors |
| `\e[1;35m` | Magenta | Borders, decorations |
| `\e[1;37m` | White | Headers |
| `\e[90m` | Gray | Dimmed, disabled items |
| `\e[0m` | Reset | End color sequence |
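A quick sanity check for the table above - a minimal snippet that prints a labeled sample in each color (the codes follow standard SGR sequences; `\033` is the portable form of `\e`):

```shell
# Print a labeled sample for each SGR code used in the table above.
# "1;" selects the bold/bright variant of each color.
for entry in "1;36:cyan" "1;33:yellow" "1;32:green" "1;31:red" \
             "1;35:magenta" "1;37:white" "90:gray"; do
  code="${entry%%:*}"   # text before the first ':' -> SGR code
  name="${entry#*:}"    # text after the first ':'  -> color name
  printf '\033[%sm%-8s\033[0m  \\e[%sm\n' "$code" "$name" "$code"
done
```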
Pattern: Status Table with Indicators
echo -e "\e[1;37m #  POLICY SET                    HITS  STATE\e[0m"
echo -e "\e[90m ─────────────────────────────────────────────────\e[0m"
netapi ise api-call openapi GET '/api/v1/policy/network-access/policy-set' | \
  jq -r '.response[] | "\(.rank)|\(.name)|\(.hitCounts // 0)|\(.state)"' | \
  while IFS='|' read -r rank name hits state; do
    if [[ "$state" == "enabled" ]]; then
      printf " \e[1;33m%d\e[0m \e[1;36m%-28s\e[0m \e[1;32m%6s\e[0m \e[1;32m✓\e[0m\n" "$rank" "$name" "$hits"
    else
      printf " \e[90m%d\e[0m \e[90m%-28s\e[0m \e[90m%6s\e[0m \e[1;31m✗\e[0m\n" "$rank" "$name" "$hits"
    fi
  done
 #  POLICY SET                    HITS  STATE
 ─────────────────────────────────────────────────
 0  Domus-Wired MAB                 15  ✓
 1  Domus-Wired 802.1X             103  ✓
 2  Domus-Secure 802.1X            100  ✓
 3  Domus-IoT iPSK                  17  ✓
 4  Domus-Guest                      0  ✗
 5  Default                          1  ✓
jq Patterns
Extract and Format
# Basic extraction
jq -r '.response[] | .name'
# Multiple fields with delimiter (for while read)
jq -r '.response[] | "\(.hostname)|\(.roles | join(", "))"'
# Filter with select
jq -r '.response[] | select(.name | startswith("Domus-")) | .name'
# Default value for missing field
jq -r '.response[] | "\(.hitCounts // 0)"'
Color Note
- `jq -r` = raw output, no quotes, no color
- `jq -C` = colored JSON with quotes
- `jq -C -r` = contradictory flags; `-r` wins (no color)
For colored output with raw strings, use shell printf with escape codes.
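A minimal sketch of that split: let `jq -r` emit plain delimited fields, then color them with `printf`. The two sample lines below stand in for real API output:

```shell
# Stand-in for `jq -r '.response[] | "\(.name)|\(.state)"'` output.
printf '%s\n' 'Domus-Wired 802.1X|enabled' 'Domus-Guest|disabled' |
while IFS='|' read -r name state; do
  if [ "$state" = "enabled" ]; then
    printf '\033[1;36m%-22s\033[0m \033[1;32m%s\033[0m\n' "$name" "$state"
  else
    printf '\033[90m%-22s\033[0m \033[1;31m%s\033[0m\n' "$name" "$state"
  fi
done
```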
Utility Patterns
Logging with tee
LOGFILE="/tmp/ise-validation-$(date +%Y%m%d-%H%M%S).log"
netapi ise api-call openapi GET '/api/v1/deployment/node' | \
tee -a "$LOGFILE" | jq '.response[].hostname'
echo "Log saved to: $LOGFILE"
Column Alignment
netapi ise api-call openapi GET '/api/v1/deployment/node' | \
jq -r '.response[] | "\(.hostname)\t\(.roles | join(", "))"' | \
column -t -s$'\t'
Counting Results
COUNT=$(netapi ise api-call openapi GET '/api/v1/policy/network-access/policy-set' | jq '.response | length')
printf "\e[1;37mTotal Policy Sets:\e[0m \e[1;32m%s\e[0m\n" "$COUNT"
POSIX Power Tools
cut - Field Extraction
# Extract specific fields (delimiter = |)
echo "user|role|status" | cut -d'|' -f2
# Output: role
# Extract columns 1 and 3
netapi ise mnt sessions | cut -d'|' -f1,3
# Character positions (first 17 chars = MAC address)
echo "AA-BB-CC-DD-EE-FF extra" | cut -c1-17
tr - Translate/Delete Characters
# Lowercase to uppercase
echo "aa:bb:cc:dd:ee:ff" | tr '[:lower:]' '[:upper:]'
# Output: AA:BB:CC:DD:EE:FF
# Replace colons with dashes (ISE MAC format)
echo "aa:bb:cc:dd:ee:ff" | tr ':' '-' | tr '[:lower:]' '[:upper:]'
# Output: AA-BB-CC-DD-EE-FF
# Delete characters
echo "AA-BB-CC-DD-EE-FF" | tr -d '-'
# Output: AABBCCDDEEFF
# Squeeze repeated characters
echo "too many spaces" | tr -s ' '
# Output: too many spaces
xargs - Build Commands from Input
# Delete multiple endpoints by MAC
netapi ise api-call openapi GET '/api/v1/endpoint' | \
jq -r '.response[] | select(.groupId == "rejected") | .id' | \
xargs -I {} netapi ise api-call openapi DELETE '/api/v1/endpoint/{}'
# Parallel execution (4 at a time)
cat macs.txt | xargs -P4 -I {} netapi ise mnt session {}
# Build command with multiple args
echo "file1 file2 file3" | xargs rm -v
Heredocs - Multi-line Input
# Create dACL content file
cat > /tmp/dacl-onboard.txt << 'EOF'
permit udp any any eq 67
permit udp any any eq 68
permit tcp any any eq 22
permit ip any any
EOF
# Variable expansion (no quotes on EOF)
HOSTNAME="modestus-aw"
cat > /tmp/config.txt << EOF
hostname=${HOSTNAME}
domain=inside.domusdigitalis.dev
EOF
# Heredoc to command
netapi ise api-call openapi POST '/api/v1/policy/network-access/downloadable-acl' << 'EOF'
{
"name": "Linux-Research-AD-Auth-dACL",
"dacl": "permit ip any any"
}
EOF
sed - Stream Editor
The stream editor - transforms text line by line.
# Basic substitution (first occurrence per line)
echo "hello world world" | sed 's/world/universe/'
# Output: hello universe world
# Global substitution (all occurrences)
echo "foo foo foo" | sed 's/foo/bar/g'
# Output: bar bar bar
# Case-insensitive (GNU sed)
echo "Hello HELLO hello" | sed 's/hello/hi/gi'
# Output: hi hi hi
# In-place edit
sed -i 's/old/new/g' file.txt
# In-place with backup (safer)
sed -i.bak 's/old/new/g' file.txt
Line Selection
# Delete lines matching pattern
sed '/^#/d' file.txt # Remove comments
sed '/^$/d' file.txt # Remove empty lines
sed '/DEBUG/d' logfile.txt # Remove debug lines
# Print only matching lines (like grep)
sed -n '/pattern/p' file.txt
# Print line numbers
sed -n '/ERROR/=' logfile.txt
# Address ranges
sed '1,10s/foo/bar/g' file.txt # Lines 1-10 only
sed '5,$s/foo/bar/g' file.txt # Line 5 to end
sed '/START/,/END/d' file.txt # Delete between patterns
Advanced Substitution
# Capture groups (extended regex -E)
echo "2026-02-13" | sed -E 's/([0-9]{4})-([0-9]{2})-([0-9]{2})/\3\/\2\/\1/'
# Output: 13/02/2026
# Backreferences
echo "hello hello" | sed -E 's/(\w+) \1/\1/'
# Output: hello
# Substitute only on lines matching another pattern
sed '/ERROR/s/foo/bar/g' file.txt
# Multiple commands
sed -e 's/foo/bar/g' -e 's/baz/qux/g' file.txt
# Command file (for complex transforms)
sed -f commands.sed input.txt
Insert, Append, Change
# Insert line BEFORE match
sed '/pattern/i\New line before' file.txt
# Append line AFTER match
sed '/pattern/a\New line after' file.txt
# Replace entire line
sed '/pattern/c\Replacement line' file.txt
# Add header to file
sed '1i\# Header comment' file.txt
# Add footer
sed '$a\# End of file' file.txt
ISE-Specific sed Patterns
# Convert MAC format (aa:bb:cc → AA-BB-CC)
echo "aa:bb:cc:dd:ee:ff" | sed 's/://g; s/../&-/g; s/-$//' | tr '[:lower:]' '[:upper:]'
# Output: AA-BB-CC-DD-EE-FF
# Extract IPs from log
sed -n 's/.*\([0-9]\{1,3\}\.[0-9]\{1,3\}\.[0-9]\{1,3\}\.[0-9]\{1,3\}\).*/\1/p' auth.log
# Clean up JSON for readability (simple)
echo '{"a":1,"b":2}' | sed 's/,/,\n/g; s/{/{\n/g; s/}/\n}/g'
# Remove ANSI color codes from log
sed 's/\x1b\[[0-9;]*m//g' colored-output.txt
Process Substitution - Diff Two Commands
# Compare two API responses
diff <(netapi ise get-policy-sets | jq -r '.response[].name' | sort) \
<(cat expected-policies.txt | sort)
# Feed command output as file
while read line; do
echo "Processing: $line"
done < <(netapi ise api-call openapi GET '/api/v1/endpoint' | jq -r '.response[].mac')
Pipelines - Chain Everything
The Unix philosophy: small tools, combined powerfully.
# Full pipeline: API → filter → format → color → log
netapi ise api-call openapi GET '/api/v1/policy/network-access/policy-set' | \
jq -r '.response[] | "\(.rank)|\(.name)|\(.state)"' | \
grep -v "Default" | \
sort -t'|' -k1 -n | \
while IFS='|' read rank name state; do
printf "\e[1;33m%d\e[0m \e[1;36m%-30s\e[0m \e[1;32m%s\e[0m\n" "$rank" "$name" "$state"
done | \
tee /tmp/policy-sets.log
Pipeline Patterns
# Pattern 1: Extract → Transform → Filter → Format
command | jq '.field' | tr 'a-z' 'A-Z' | grep PATTERN | column -t
# Pattern 2: API β Process β Multiple Outputs (tee)
netapi ise mnt sessions | \
tee >(jq '.[] | .username' > /tmp/users.txt) \
>(jq '.[] | .calling_station_id' > /tmp/macs.txt) \
>(wc -l > /tmp/count.txt) | \
jq '.'
# Pattern 3: Parallel Processing (xargs -P)
cat endpoints.txt | xargs -P4 -I {} sh -c 'netapi ise mnt session {} > /tmp/{}.json'
# Pattern 4: Conditional Pipeline
netapi ise dc failed | jq -r '.[] | .FAILURE_REASON' | \
sort | uniq -c | sort -rn | \
while read count reason; do
if [[ $count -gt 10 ]]; then
printf "\e[1;31m%4d\e[0m %s\n" "$count" "$reason" # Red for >10
else
printf "\e[1;33m%4d\e[0m %s\n" "$count" "$reason" # Yellow
fi
done
Multi-stage Pipelines
# Stage 1: Collect
DATA=$(netapi ise api-call openapi GET '/api/v1/endpoint?size=100')
# Stage 2: Transform (reuse $DATA multiple times)
MACS=$(echo "$DATA" | jq -r '.response[].mac')
COUNT=$(echo "$DATA" | jq '.response | length')
GROUPS=$(echo "$DATA" | jq -r '.response[].groupId' | sort -u)
# Stage 3: Report
printf "Endpoints: %d\nGroups: %s\n" "$COUNT" "$(echo "$GROUPS" | paste -sd, -)"
Pipeline Debugging
# Insert tee to see intermediate values
command1 | tee /dev/stderr | command2 | tee /dev/stderr | command3
# Log each stage to files
command1 | tee /tmp/stage1.txt | command2 | tee /tmp/stage2.txt | command3
# Use pv for progress (if installed)
cat largefile.txt | pv | grep pattern | wc -l
# Debug with head (don't process entire file)
command1 | head -5 | command2 | command3
Pipeline Error Handling
# pipefail - fail on any pipeline command failure
set -o pipefail
netapi ise mnt sessions | jq '.[]' | process_data || echo "Pipeline failed"
# Check PIPESTATUS array (bash)
cmd1 | cmd2 | cmd3
echo "Exit codes: ${PIPESTATUS[@]}"
# Trap errors
trap 'echo "Error at line $LINENO"' ERR
set -e
netapi ise api-call ... | jq ... | process
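The netapi pipelines above are hard to dry-run; the same mechanics can be checked with stand-in commands:

```shell
#!/usr/bin/env bash
# A pipeline's exit status is normally that of its LAST command.
false | true
echo "without pipefail: $?"                # 0 - the `false` failure is masked

# PIPESTATUS holds per-stage codes; capture it immediately, before any
# other command overwrites it.
false | true
stages=("${PIPESTATUS[@]}")
echo "per-stage exit codes: ${stages[*]}"  # 1 0

# With pipefail, the rightmost FAILING stage sets the exit status.
set -o pipefail
false | true
echo "with pipefail: $?"                   # 1
```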
Globbing & Brace Expansion
Shell Globs
# Match any .adoc files
ls *.adoc
# Recursive glob (zsh/bash 4+)
shopt -s globstar # bash
ls **/*.adoc # all .adoc files recursively
# Match single character
ls file?.txt # file1.txt, file2.txt, etc.
# Character classes
ls file[0-9].txt # file0.txt through file9.txt
ls file[!0-9].txt # NOT digits
# Extended globs (bash)
shopt -s extglob
ls !(*.log) # everything except .log files
ls *.@(jpg|png) # .jpg or .png files
Brace Expansion
# Generate sequences
echo {1..10} # 1 2 3 4 5 6 7 8 9 10
echo {a..z} # a b c ... z
echo {01..05} # 01 02 03 04 05 (zero-padded)
# Multiple options
echo file.{txt,md,adoc} # file.txt file.md file.adoc
mkdir -p project/{src,lib,bin,docs}
# Combine with commands
cp config.{yaml,yaml.bak} # backup a file
mv report.{txt,md} # rename extension
# Nested braces
echo {a,b}{1,2} # a1 a2 b1 b2
Parameter Expansion
FILE="/path/to/document.adoc"
# Extract parts
echo ${FILE##*/} # document.adoc (basename)
echo ${FILE%/*} # /path/to (dirname)
echo ${FILE%.adoc} # /path/to/document (remove extension)
echo ${FILE##*.} # adoc (extension only)
# Default values
echo ${VAR:-default} # use default if VAR unset
echo ${VAR:=default} # set VAR to default if unset
# String manipulation
NAME="hello world"
echo ${NAME^^} # HELLO WORLD (uppercase)
echo ${NAME,,} # hello world (lowercase)
echo ${NAME^} # Hello world (capitalize first)
# Substring
echo ${NAME:0:5} # hello (first 5 chars)
echo ${NAME: -5} # world (last 5 chars)
Perl One-Liners
Perl: the Swiss Army chainsaw.
Text Processing
# Replace (like sed but more powerful)
echo "hello world" | perl -pe 's/world/universe/'
# In-place edit with backup
perl -i.bak -pe 's/old/new/g' file.txt
# Print matching lines (like grep)
perl -ne 'print if /pattern/' file.txt
# Print specific field (like awk)
echo "a:b:c:d" | perl -F: -ane 'print $F[2]' # c
# Multiple operations
echo "HELLO" | perl -pe '$_ = lc; s/e/3/g' # h3llo
Regex Power
# Named captures
echo "user=admin" | perl -ne 'print "$1\n" if /user=(\w+)/'
# Non-greedy matching
echo "<tag>content</tag>" | perl -pe 's/<.*?>/[]/g'
# Output: []content[]
# Lookahead/lookbehind
echo 'price: $100' | perl -pe 's/(?<=\$)\d+/200/'
# Output: price: $200
# Multi-line mode
perl -0777 -pe 's/start.*?end/REPLACED/gs' file.txt
JSON Processing (alternative to jq)
# Pretty print JSON
echo '{"a":1}' | perl -MJSON -e 'print to_json(from_json(<>), {pretty=>1})'
# Extract field
echo '{"name":"test"}' | perl -MJSON -ne '$d=from_json($_); print $d->{name}'
yq - YAML Processing
yq: jq for YAML (and JSON, XML, TOML).
Basic Operations
# Read YAML field
yq '.site.title' antora-playbook.yml
# Read nested field
yq '.content.sources[0].url' antora-playbook.yml
# List all source URLs
yq '.content.sources[].url' antora-playbook.yml
# Count sources
yq '.content.sources | length' antora-playbook.yml
Editing YAML
# Update field
yq -i '.site.title = "New Title"' playbook.yml
# Add new field
yq -i '.site.author = "EvanusModestus"' playbook.yml
# Delete field
yq -i 'del(.site.keys.google_analytics)' playbook.yml
# Add to array
yq -i '.content.sources += [{"url": "https://github.com/new/repo"}]' playbook.yml
Format Conversion
# YAML to JSON
yq -o json antora.yml
# JSON to YAML
cat data.json | yq -P
# YAML to XML
yq -o xml config.yml
# Pretty print YAML
yq '.' messy.yml
Antora Playbook Examples
# List all component names
yq '.content.sources[].url | split("/") | .[-1]' antora-playbook.yml
# Check if kroki is enabled
yq '.asciidoc.extensions[] | select(. == "asciidoctor-kroki")' antora-playbook.yml
# Get start paths
yq '.content.sources[] | .url + " → " + .start_path' antora-playbook.yml
Ruby One-Liners
Ruby: elegant and expressive.
Text Processing
# Replace (like sed)
echo "hello world" | ruby -pe '$_.gsub!(/world/, "universe")'
# Print matching lines
ruby -ne 'print if /pattern/' file.txt
# Field extraction (like awk)
echo "a:b:c:d" | ruby -F: -ane 'puts $F[2]' # c
# Transform each line
cat file.txt | ruby -pe '$_.upcase!'
# Number lines
ruby -ne 'puts "#{$.}: #{$_}"' file.txt
JSON Processing
# Pretty print
echo '{"a":1}' | ruby -rjson -e 'puts JSON.pretty_generate(JSON.parse(STDIN.read))'
# Extract field
echo '{"name":"test"}' | ruby -rjson -e 'puts JSON.parse(STDIN.read)["name"]'
# Transform JSON
netapi ise api-call openapi GET '/api/v1/endpoint' | \
ruby -rjson -e 'd=JSON.parse(STDIN.read); d["response"].each{|e| puts e["mac"]}'
Quick Calculations
# Calculator
ruby -e 'puts 2**10' # 1024
ruby -e 'puts Math.sqrt(144)' # 12.0
# Sum numbers from stdin
seq 1 100 | ruby -e 'puts STDIN.map(&:to_i).sum' # 5050
# Average
echo -e "10\n20\n30" | ruby -e 'a=STDIN.map(&:to_f); puts a.sum/a.size'
ISE Playbook Patterns
Patterns for building ISE automation at work.
Extract Policy Set IDs for Ansible
# Get policy set ID by name (for Ansible vars)
POLICY_ID=$(netapi ise api-call openapi GET '/api/v1/policy/network-access/policy-set' | \
jq -r '.response[] | select(.name == "Wired-802.1X") | .id')
# Generate Ansible vars file
netapi ise api-call openapi GET '/api/v1/policy/network-access/policy-set' | \
jq -r '.response[] | " - name: \(.name)\n id: \(.id)\n state: \(.state)"' > policy_sets.yml
Bulk Endpoint Operations
# Export all endpoints to CSV
netapi ise api-call openapi GET '/api/v1/endpoint?size=100' | \
jq -r '.response[] | [.mac, .groupId, .staticGroupAssignment] | @csv' > endpoints.csv
# Import from CSV (for Ansible loop)
while IFS=, read mac group static; do
echo "mac: $mac, group: $group"
done < endpoints.csv
dACL Content for Ansible Templates
# Extract dACL rules as Jinja2 template input
netapi ise api-call openapi GET '/api/v1/policy/network-access/downloadable-acl' | \
jq -r '.response[] | select(.name | startswith("Linux-")) |
"dacl_\(.name | gsub("-"; "_")):\n name: \(.name)\n rules: |\n \(.dacl)"'
Session Monitoring for Alerting
# Failed auths in last hour (for monitoring script)
netapi ise dc query "SELECT USERNAME, FAILURE_REASON, TIMESTAMP_TIMEZONE
FROM RADIUS_AUTHENTICATIONS
WHERE PASSED = 0
AND TIMESTAMP_TIMEZONE > SYSDATE - 1/24
ORDER BY TIMESTAMP_TIMEZONE DESC" | \
jq -r '.[] | "\(.TIMESTAMP_TIMEZONE) | \(.USERNAME) | \(.FAILURE_REASON)"'
# Count by failure reason (for dashboards)
netapi ise dc query "SELECT FAILURE_REASON, COUNT(*) as CNT
FROM RADIUS_AUTHENTICATIONS
WHERE PASSED = 0 AND TIMESTAMP_TIMEZONE > SYSDATE - 1
GROUP BY FAILURE_REASON ORDER BY CNT DESC"
Config Drift Detection
# Snapshot current policy sets
netapi ise api-call openapi GET '/api/v1/policy/network-access/policy-set' | \
jq -S '.response | sort_by(.name)' > policy_sets_$(date +%Y%m%d).json
# Compare to baseline
diff <(jq -S '.' baseline.json) <(jq -S '.' policy_sets_$(date +%Y%m%d).json)
Fleeting Ideas
Capture ideas as they arise during deployment - park for future roadmap.
Idea: Shell Pattern Library
Create a standalone shell script library with reusable functions:
# ~/.local/lib/ise-patterns.sh
ise_policy_sets_table() {
# Full implementation from above
}
ise_sessions_colored() {
# Session listing with colors
}
Source in scripts: source ~/.local/lib/ise-patterns.sh
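A minimal, self-contained cut of what that library could contain - the function name matches the sketch above, but the body here is an illustrative reduction, not the final implementation:

```shell
# ~/.local/lib/ise-patterns.sh (sketch)
# Render "rank|name|state" lines from stdin as one colored row each;
# callers pipe `jq -r '"\(.rank)|\(.name)|\(.state)"'` output into this.
ise_policy_sets_table() {
  local rank name state
  while IFS='|' read -r rank name state; do
    if [ "$state" = "enabled" ]; then
      printf ' \033[1;33m%s\033[0m \033[1;36m%-28s\033[0m \033[1;32m%s\033[0m\n' \
        "$rank" "$name" "$state"
    else
      printf ' \033[90m%s %-28s\033[0m \033[1;31m%s\033[0m\n' "$rank" "$name" "$state"
    fi
  done
}
```

Then any `rank|name|state` stream can be piped through `ise_policy_sets_table` after sourcing the file.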
Idea: Neovim Snippet for Shell Colors
Add LuaSnip snippets for common escape code patterns:
- `color` → `printf "\e[1;36m%s\e[0m" "$var"`
- `colorloop` → full while read loop with colors
Idea: Pre-commit Hook for dACL Validation
Before committing ISE policy changes, validate:

- dACL syntax
- No duplicate rules
- Required permit/deny present
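A sketch of the syntax-check portion, with a deliberately simplified rule grammar (real dACL syntax allows more than `permit`/`deny`/`remark`):

```shell
# Reject any non-blank, non-comment line that doesn't start with a
# recognized dACL keyword. Reads rules from stdin.
validate_dacl() {
  local lineno=0 line
  while IFS= read -r line; do
    lineno=$((lineno + 1))
    case "$line" in
      ''|'#'*) continue ;;                 # blank lines and comments pass
      permit\ *|deny\ *|remark\ *) ;;      # looks like a rule
      *) echo "line $lineno: unrecognized rule: $line" >&2; return 1 ;;
    esac
  done
}

# Example: the onboarding dACL from earlier passes the check.
validate_dacl <<'EOF' && echo "dACL OK"
permit udp any any eq 67
permit ip any any
EOF
```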
Deployment Progress
Track validation steps as completed:
- Phase 0: Discovery (ISE nodes, policy sets, dACLs)
- Phase 1: Verification (existing config check)
- Phase 2: Creation (dACLs, profiles, rules)
- Phase 3: Testing (CoA, authentication)
- Phase 4: Validation (zero-trust tests)
Notable commands
cd /home/evanusmodestus/atelier/_bibliotheca/domus-ise-linux && gach << 'EOF'
refactor(runbook): Use netapi example includes for Phase 0 Discovery
Replace inline jq commands with cross-project includes:
- 0.2: include::netapi::example$ise-api-patterns.adoc[tag=deployment-nodes]
- 0.3: include::netapi::example$ise-api-patterns.adoc[tag=policy-sets-table]
- 0.4: include::netapi::example$ise-api-patterns.adoc[tag=dacls-list]
- 0.5: include::netapi::example$ise-api-patterns.adoc[tag=authz-profiles-list]
- 0.6: include::netapi::example$ise-api-patterns.adoc[tag=endpoint-mac]
Colorized output patterns now maintained in one place (netapi-docs),
reusable across all runbooks.
EOF
examples/outputs/
├── ise-discovery.adoc    # deployment-nodes, policy-sets, dacls, authz-profiles
├── ise-dc-queries.adoc   # recent-auths, failed-auths, eaptls-auths, top-failures, auth-stats
├── ise-sessions.adoc     # active-sessions, session-detail, coa-result, coa-bounce
├── ise-validation.adoc   # ping-ad, ping-denied, kerberos-test, cert-verify, wpa-status
└── linux-endpoint.adoc   # endpoint-mac, nmcli-wired, sssd-status, journalctl-8021x
Usage:
// ISE discovery outputs
.Policy Sets
[source]
 #  POLICY SET           HITS  STATE
 ---------------------------------------------------
 0  Domus-Wired MAB        15  *
 1  Domus-Wired 802.1X    103  *
 2  Domus-Secure 802.1X   100  *
 3  Domus-IoT iPSK         17  *
 4  Domus-Guest             0  o
 5  Default                 1  *
// DataConnect query results
.EAP-TLS Authentications
[source]
USERNAME         POLICY_SET          AUTHZ_PROFILE           PASSED
evanusmodestus   Domus-Wired 802.1X  Linux-Research-AD-Auth  true
modestus-razer$  Domus-Wired 802.1X  Linux-Research-AD-Auth  true
modestus-p50$    Domus-Wired 802.1X  Linux-Research-AD-Auth  true
// Session/CoA results
.CoA Result (Reauthenticate)
[source]
CoA Request sent to: 10.50.1.1
Session ID: 0A320101000000123456789A
Result: SUCCESS
New Session State: AUTHENTICATED
New dACL: Linux-Research-AD-Auth-EapTls
// Validation tests
.Zero-Trust: External Ping Blocked
[source]
$ ping -c 3 8.8.8.8
PING 8.8.8.8 (8.8.8.8) 56(84) bytes of data.

--- 8.8.8.8 ping statistics ---
3 packets transmitted, 0 received, 100% packet loss, time 2045ms
// Linux endpoint commands
.NetworkManager Wired Connection
[source]
$ nmcli connection show "Wired-802.1X-EAP-TLS"
connection.id:              Wired-802.1X-EAP-TLS
connection.type:            802-3-ethernet
connection.interface-name:  enp44s0
802-1x.eap:                 tls
802-1x.identity:            evanusmodestus@inside.domusdigitalis.dev
802-1x.client-cert:         /etc/ssl/certs/modestus-aw-eaptls.pem
802-1x.private-key:         /etc/ssl/private/modestus-aw-eaptls.key
802-1x.ca-cert:             /etc/ssl/certs/domus-ca-chain.crt
GENERAL.STATE:              activated
Commit:

cd /home/evanusmodestus/atelier/_bibliotheca/domus-captures && gach << 'EOF'
feat(examples): Add modular output capture files

examples/outputs/ structure:
- ise-discovery.adoc: nodes, policy-sets, dacls, authz-profiles
- ise-dc-queries.adoc: auth queries, failures, stats, profiler
- ise-sessions.adoc: active sessions, CoA results
- ise-validation.adoc: zero-trust tests, kerberos, certs
- linux-endpoint.adoc: MAC, nmcli, sssd, journalctl

Usage: include::example$outputs/FILE.adoc[tag=TAG_NAME]
EOF
References
Internal documentation only (eating my own dog food):
Revision History
| Date | Author | Changes |
|---|---|---|
| 2026-02-13 | EvanusModestus | Initial draft - shell color patterns, jq basics |
| 2026-02-13 | EvanusModestus | Expanded: comprehensive sed (line selection, capture groups, ISE patterns), pipeline composition (patterns, multi-stage, debugging, error handling), POSIX tools (cut, tr, xargs, heredocs), globbing, brace/parameter expansion, Perl/Ruby one-liners, yq, ISE Playbook patterns |