Transform Types
Transforms modify logs as they flow through pipelines. Each transform type serves a specific purpose - from simple field additions to complex jq-based routing logic.
Transform Categories
| Category | Types | Purpose |
|---|---|---|
| Filtering | `drop_record_where_value_eq` | Remove low-value logs |
| Enrichment | `add`, `timestamp`, `add_identifier`, `duplicate_key_value_to_key`, `create_key_value_if_key_value` | Add fields for routing/analysis |
| Normalization | `rename_key`, `convert_timestamp`, `flatten`, `mutate_value_where_key_eq` | Standardize field names/formats |
| Transformation | `jq` | Complex logic, routing |
| Removal | `drop_key` | Remove sensitive fields |
1. drop_record_where_value_eq
Purpose: Filter out entire records where a field equals a specific value.
Use Case: Remove info-level logs, drop health checks.
{
"operation": "drop_record_where_value_eq",
"arguments": {
"key": "severity",
"value": "info"
}
}
Infrastructure Examples:
# Drop ISE accounting success (high volume, low value)
key: "ise.auth_result"
value: "PASSED"
# Drop FTD permit logs
key: "action"
value: "permit"
# Drop health checks
key: "event_type"
value: "heartbeat"
Volume Reduction: 40-60% typical for security logs.
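The filtering behavior can be sketched in Python (the function name and record shapes are illustrative, not the platform's actual API):

```python
def drop_record_where_value_eq(records, key, value):
    """Drop any record whose `key` field equals `value`; keep the rest."""
    return [r for r in records if r.get(key) != value]

logs = [
    {"severity": "info", "msg": "health check ok"},
    {"severity": "critical", "msg": "auth failure"},
    {"severity": "info", "msg": "heartbeat"},
]

kept = drop_record_where_value_eq(logs, "severity", "info")
print(kept)  # only the "critical" record survives
```

Records missing the key entirely are kept, matching the "drop only on an exact match" semantics of the config above.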
2. add
Purpose: Add a static field to all records.
Use Case: Tag logs for routing, add environment labels.
{
"operation": "add",
"arguments": {
"key": "destination",
"value": "sentinel"
}
}
Infrastructure Examples:
# Tag for Sentinel routing
key: "route"
value: "sentinel"
# Add environment label
key: "environment"
value: "production"
# Add source identifier
key: "log_source"
value: "ise-radius"
3. rename_key
Purpose: Rename a field (for schema normalization).
Use Case: Standardize field names across sources.
{
"operation": "rename_key",
"arguments": {
"old_key": "src_ip",
"new_key": "source.ip"
}
}
Normalization Examples (original field names are representative of each source; verify against your actual log schema):
| Source | Original | Normalized (OCSF) |
|---|---|---|
| ISE | `NAS-IP-Address` | `src_endpoint.ip` |
| FTD | `SrcIP` | `src_endpoint.ip` |
| ASA | `src` | `src_endpoint.ip` |
| Syslog | `src_ip` | `src_endpoint.ip` |
4. drop_key
Purpose: Remove a field from records.
Use Case: Remove sensitive data before SIEM ingestion.
{
"operation": "drop_key",
"arguments": {
"key": "password"
}
}
Security Examples:
# Remove passwords
key: "password"
# Remove internal IPs
key: "internal_mgmt_ip"
# Remove debug data
key: "raw_packet"
# Remove PII
key: "ssn"
5. timestamp
Purpose: Add ingestion timestamp to records.
Use Case: Track pipeline latency, correlate events.
{
"operation": "timestamp",
"arguments": {
"key": "ingested_at"
}
}
Result: "ingested_at": "2026-03-15T14:30:45.123Z"
6. add_identifier
Purpose: Add a unique identifier (UUID) to each record.
Use Case: Deduplication, audit trails.
{
"operation": "add_identifier",
"arguments": {
"key": "monad_id"
}
}
Result: "monad_id": "550e8400-e29b-41d4-a716-446655440000"
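A minimal sketch of the same idea using Python's standard `uuid` module (function name and record shape are illustrative):

```python
import uuid

def add_identifier(record, key):
    """Attach a freshly generated random UUID (v4) under the given key."""
    record[key] = str(uuid.uuid4())
    return record

event = add_identifier({"msg": "login"}, "monad_id")
print(event["monad_id"])  # e.g. 550e8400-e29b-41d4-a716-446655440000
```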
7. convert_timestamp
Purpose: Convert timestamp formats.
Use Case: Normalize epoch to ISO8601.
{
"operation": "convert_timestamp",
"arguments": {
"key": "timestamp",
"from_format": "epoch_seconds",
"to_format": "iso8601"
}
}
Supported Formats (the two used in the example above; the platform may accept additional formats):
| Format | Example |
|---|---|
| `epoch_seconds` | `1773585045` |
| `iso8601` | `2026-03-15T14:30:45Z` |
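The epoch-to-ISO8601 path from the config above can be sketched with the standard library (only this one conversion is shown; the helper is illustrative):

```python
from datetime import datetime, timezone

def convert_timestamp(record, key, from_format, to_format):
    """Convert record[key] between timestamp formats.

    Only the epoch_seconds -> iso8601 path is sketched here.
    """
    if from_format == "epoch_seconds" and to_format == "iso8601":
        dt = datetime.fromtimestamp(int(record[key]), tz=timezone.utc)
        record[key] = dt.strftime("%Y-%m-%dT%H:%M:%SZ")
    return record

event = {"timestamp": 1773585045}
convert_timestamp(event, "timestamp", "epoch_seconds", "iso8601")
print(event)  # {'timestamp': '2026-03-15T14:30:45Z'}
```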
8. flatten
Purpose: Flatten nested JSON structures.
Use Case: Simplify deeply nested logs for SIEM indexing.
{
"operation": "flatten",
"arguments": {
"separator": "."
}
}
Before:
{
"user": {
"name": "jdoe",
"domain": "CHLA"
}
}
After:
{
"user.name": "jdoe",
"user.domain": "CHLA"
}
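A recursive flatten producing the before/after shown above can be sketched as (helper is illustrative; the `separator` mirrors the config argument):

```python
def flatten(obj, separator=".", prefix=""):
    """Recursively flatten nested dicts into separator-joined keys."""
    flat = {}
    for key, value in obj.items():
        full_key = f"{prefix}{separator}{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, separator, full_key))
        else:
            flat[full_key] = value
    return flat

nested = {"user": {"name": "jdoe", "domain": "CHLA"}}
print(flatten(nested))  # {'user.name': 'jdoe', 'user.domain': 'CHLA'}
```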
9. duplicate_key_value_to_key
Purpose: Copy a field’s value to a new field.
Use Case: Preserve original while transforming.
{
"operation": "duplicate_key_value_to_key",
"arguments": {
"source_key": "src_ip",
"destination_key": "original_source_ip"
}
}
10. create_key_value_if_key_value
Purpose: Conditionally add a field based on another field’s value.
Use Case: MITRE ATT&CK tagging, conditional enrichment.
{
"operation": "create_key_value_if_key_value",
"arguments": {
"condition_key": "event_type",
"condition_value": "auth_failure",
"new_key": "mitre_technique",
"new_value": "T1078"
}
}
MITRE ATT&CK Tagging (rows beyond the documented `auth_failure` example are illustrative):
| Condition Key | Condition Value | MITRE Tag |
|---|---|---|
| `event_type` | `auth_failure` | `T1078` (Valid Accounts) |
| `event_type` | `brute_force` | `T1110` (Brute Force) |
| `event_type` | `privilege_escalation` | `T1068` (Exploitation for Privilege Escalation) |
| `event_type` | `data_exfil` | `T1041` (Exfiltration Over C2 Channel) |
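The conditional add can be sketched as follows (illustrative helper; argument names mirror the config fields above):

```python
def create_key_value_if_key_value(record, condition_key, condition_value,
                                  new_key, new_value):
    """Add new_key only when the condition field matches exactly."""
    if record.get(condition_key) == condition_value:
        record[new_key] = new_value
    return record

tagged = create_key_value_if_key_value(
    {"event_type": "auth_failure", "user": "jdoe"},
    "event_type", "auth_failure", "mitre_technique", "T1078")
print(tagged["mitre_technique"])  # T1078

# Non-matching records pass through unchanged.
untouched = create_key_value_if_key_value(
    {"event_type": "logout"},
    "event_type", "auth_failure", "mitre_technique", "T1078")
```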
11. mutate_value_where_key_eq
Purpose: Change a field’s value conditionally.
Use Case: Normalize severity levels, remap values.
{
"operation": "mutate_value_where_key_eq",
"arguments": {
"key": "severity",
"match_value": "emergency",
"new_value": "critical"
}
}
Severity Normalization (a common syslog-style mapping; only `emergency` → `critical` comes from the example above):
| Source Value | Normalized |
|---|---|
| `emergency` | `critical` |
| `alert` | `critical` |
| `error` | `high` |
| `warning` | `medium` |
| `notice` | `low` |
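Conditional value rewriting can be sketched as (illustrative helper; argument names mirror the config keys):

```python
def mutate_value_where_key_eq(record, key, match_value, new_value):
    """Rewrite record[key] only when it currently equals match_value."""
    if record.get(key) == match_value:
        record[key] = new_value
    return record

alarm = {"severity": "emergency", "msg": "link down"}
mutate_value_where_key_eq(alarm, "severity", "emergency", "critical")
print(alarm)  # {'severity': 'critical', 'msg': 'link down'}
```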
12. jq
Purpose: Full JSON transformation using jq expressions.
Use Case: Complex routing logic, multi-condition filtering, OCSF normalization.
{
"operation": "jq",
"arguments": {
"query": "if .severity == \"critical\" then . + {\"route\": \"sentinel\"} else . + {\"route\": \"s3\"} end"
}
}
See jq Transforms for comprehensive examples.
Transform Chaining
Transforms execute in order. Chain them for complex workflows:
1. drop_record_where_value_eq → Filter out info logs
2. rename_key → Normalize field names
3. add → Add environment tag
4. create_key_value_if_key_value → Add MITRE tag
5. timestamp → Add ingestion time
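The five-step chain can be sketched as one function that applies each operation in order (field values follow the earlier examples; all names are illustrative):

```python
from datetime import datetime, timezone

def pipeline(record):
    """Apply the chained transforms listed above, in order."""
    # 1. drop_record_where_value_eq: filter out info logs
    if record.get("severity") == "info":
        return None
    # 2. rename_key: normalize field names
    if "src_ip" in record:
        record["source.ip"] = record.pop("src_ip")
    # 3. add: environment tag
    record["environment"] = "production"
    # 4. create_key_value_if_key_value: MITRE tag on auth failures
    if record.get("event_type") == "auth_failure":
        record["mitre_technique"] = "T1078"
    # 5. timestamp: ingestion time
    record["ingested_at"] = datetime.now(timezone.utc).isoformat()
    return record

out = pipeline({"severity": "critical", "src_ip": "10.0.0.5",
                "event_type": "auth_failure"})
```

Because step 1 runs first, dropped records never pay the cost of the later enrichment steps, which is why filters generally belong at the front of a chain.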
Transform vs Edge Routing
Two ways to route logs:
| Method | Use Case | Example |
|---|---|---|
| Transform + Edge | Add routing field, then split by edge condition | Add `route: sentinel`, then edge matches on `route` |
| Multiple Pipelines | Separate pipelines per source | ISE pipeline, FTD pipeline, each with own logic |
Trial Limitations
| Feature | Trial Access |
|---|---|
| Create via API | ✗ Blocked (403) |
| Create via UI | ✓ Works |
| View existing | ✓ Works |
| jq transforms | ✓ Limited (may require full license) |
Practical Workflow
1. List Available Transform Types
monad_transform_types | jq '.[].type_id'
2. View Transform Config
monad_transform <id> | jq '.config.operations'
3. Create Transform (UI)
- Navigate to app.monad.com → Transforms
- Click "Create Transform"
- Add operations in sequence
- Save and attach to pipeline
Key Takeaways
- 12 transform types - from simple `add` to complex `jq`
- Chain for complex logic - order matters
- `drop_record` reduces volume - 40-60% cost savings typical
- `create_key_value` for conditional enrichment - MITRE tagging
- `jq` for complex routing - full JSON transformation
- Trial limits API creation - use the UI
Next Module
jq Transforms - Deep dive into complex routing with jq.