Structured Logging: Stop Parsing Log Lines

Unstructured logs are technical debt. Structured logs are queryable, parseable, and actually useful when things break.

The Problem

Unstructured: good luck parsing this

2026-02-28 10:15:23 INFO User logged in from 192.168.1.1
2026-02-28 10:15:24 ERROR Failed processing order 12345: connection timeout
2026-02-28 10:15:25 INFO Request completed in 234ms

Regex hell when you need to extract user, IP, order ID, or duration. ...
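As a sketch of the payoff, here are the same three events emitted as one JSON object per line and queried with jq. The field names (level, order_id, and so on) are illustrative, not from the post:

```shell
# Hypothetical structured versions of the same three events, one JSON object per line
cat > /tmp/app.log <<'EOF'
{"ts":"2026-02-28T10:15:23Z","level":"info","event":"login","user":"alice","ip":"192.168.1.1"}
{"ts":"2026-02-28T10:15:24Z","level":"error","event":"order_failed","order_id":12345,"error":"connection timeout"}
{"ts":"2026-02-28T10:15:25Z","level":"info","event":"request_done","duration_ms":234}
EOF

# Extract the order ID from every error -- no regex required
order_id=$(jq -r 'select(.level == "error") | .order_id' /tmp/app.log)
echo "$order_id"   # 12345
```

Every field is addressable by name, so the "extract the order ID" question is one filter instead of a fragile regular expression.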

February 28, 2026 · 4 min · 744 words · Rob Washington

jq Patterns for JSON Processing on the Command Line

JSON is everywhere. APIs return it, configs use it, logs contain it. jq is the Swiss Army knife for processing it all from the command line.

Basic Selection

# Pretty print
echo '{"name":"alice","age":30}' | jq .

# Extract a field
echo '{"name":"alice","age":30}' | jq '.name'   # Output: "alice"

# Raw output (no quotes)
echo '{"name":"alice","age":30}' | jq -r '.name'   # Output: alice

# Nested fields
echo '{"user":{"name":"alice"}}' | jq '.user.name'

Working with Arrays

# Get all elements
echo '[1,2,3]' | jq '.[]'
# Output:
# 1
# 2
# 3

# Get specific index
echo '["a","b","c"]' | jq '.[1]'   # Output: "b"

# Slice
echo '[1,2,3,4,5]' | jq '.[2:4]'   # Output: [3,4]

# First/last
echo '[1,2,3]' | jq 'first'   # 1
echo '[1,2,3]' | jq 'last'    # 3

Filtering

# Select objects matching condition
echo '[{"name":"alice","age":30},{"name":"bob","age":25}]' | \
  jq '.[] | select(.age > 27)'
# Output: {"name":"alice","age":30}

# Multiple conditions
jq '.[] | select(.status == "active" and .role == "admin")'

# Contains
jq '.[] | select(.tags | contains(["important"]))'

# Regex matching
jq '.[] | select(.email | test("@company\\.com$"))'

Transforming Data

# Create new object
echo '{"first":"Alice","last":"Smith"}' | \
  jq '{fullName: (.first + " " + .last)}'
# Output: {"fullName":"Alice Smith"}

# Map over array
echo '[1,2,3]' | jq 'map(. * 2)'   # Output: [2,4,6]

# Transform array of objects
echo '[{"name":"alice"},{"name":"bob"}]' | \
  jq 'map({user: .name, active: true})'

API Response Processing

# Extract data from GitHub API
curl -s https://api.github.com/users/torvalds/repos | \
  jq '.[] | {name, stars: .stargazers_count, language}' | \
  jq -s 'sort_by(.stars) | reverse | .[0:5]'

# Get just names
curl -s https://api.github.com/users/torvalds/repos | jq -r '.[].name'

# Count items
curl -s https://api.github.com/users/torvalds/repos | jq 'length'

Aggregation

# Sum
echo '[{"value":10},{"value":20},{"value":30}]' | \
  jq '[.[].value] | add'   # Output: 60

# Average
jq '[.[].value] | add / length'

# Group by
echo '[{"type":"a","n":1},{"type":"b","n":2},{"type":"a","n":3}]' | \
  jq 'group_by(.type) | map({type: .[0].type, total: [.[].n] | add})'

# Count by field
jq 'group_by(.status) | map({status: .[0].status, count: length})'

Building Output

# Concatenate to string
echo '{"host":"db","port":5432}' | \
  jq -r '"\(.host):\(.port)"'   # Output: db:5432

# Create CSV
echo '[{"name":"alice","age":30},{"name":"bob","age":25}]' | \
  jq -r '.[] | [.name, .age] | @csv'
# Output:
# "alice",30
# "bob",25

# Create TSV
jq -r '.[] | [.name, .age] | @tsv'

# URI encode
jq -r '@uri'

# Base64
jq -r '@base64'

Conditional Logic

# If/then/else
echo '{"score":85}' | \
  jq 'if .score >= 90 then "A" elif .score >= 80 then "B" else "C" end'

# Alternative operator (default values)
echo '{"name":"alice"}' | jq '.age // 0'   # Output: 0

# Missing nested paths fall back too
echo '{"a":1}' | jq '.b.c.d // "missing"'   # Output: "missing"

Modifying JSON

# Update field
echo '{"name":"alice","age":30}' | jq '.age = 31'

# Add field
echo '{"name":"alice"}' | jq '. + {active: true}'

# Delete field
echo '{"name":"alice","temp":123}' | jq 'del(.temp)'

# Update nested
echo '{"user":{"name":"alice"}}' | jq '.user.name = "bob"'

# Recursive update
jq '(.. | objects | .timestamp) |= (. // now)'

Multiple Files

# Combine objects from files
jq -s '.[0] * .[1]' defaults.json overrides.json

# Process files independently
jq -r '.name' file1.json file2.json

# Slurp into array
jq -s '.' file1.json file2.json   # Output: [{...}, {...}]

Stream Processing

For large files, use streaming: ...
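The streaming section is cut off above; for reference, a common idiom from the jq manual splits a huge top-level array into its elements without loading the whole document into memory:

```shell
# The jq manual's streaming idiom: emit each top-level array element
# one at a time instead of parsing the entire array at once
echo '[{"a":1},{"a":2}]' | jq -cn --stream 'fromstream(1|truncate_stream(inputs))'
# {"a":1}
# {"a":2}
```

On a multi-gigabyte JSON array this keeps memory use proportional to one element rather than the whole file.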

February 28, 2026 · 5 min · 1029 words · Rob Washington

Structured Logging: Stop Grepping, Start Querying

Unstructured logs are a trap. They look simple until you need to find something.

[2026-02-27 05:30:15] INFO User john@example.com logged in from 192.168.1.50
[2026-02-27 05:30:16] ERROR Failed to process order 12345: connection timeout
[2026-02-27 05:30:17] WARN High memory usage detected: 87%

Quick: find all login failures from a specific IP range in the last hour. Now try parsing the order ID from error messages. Hope you enjoy regex. ...
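With structured logs, the "login failures from an IP range" question is one select(). A small sketch with hypothetical JSON log lines (field names invented for illustration):

```shell
# Hypothetical structured equivalents of login events
cat > /tmp/auth.log <<'EOF'
{"ts":"2026-02-27T05:30:15Z","level":"info","event":"login","user":"john@example.com","ip":"192.168.1.50"}
{"ts":"2026-02-27T05:31:02Z","level":"error","event":"login","user":"eve@example.com","ip":"10.0.9.7"}
{"ts":"2026-02-27T05:32:41Z","level":"error","event":"login","user":"mallory@example.com","ip":"10.0.9.12"}
EOF

# Login failures from the 10.0.9.0/24 range: a filter expression, not a regex
users=$(jq -r 'select(.level == "error" and .event == "login"
                      and (.ip | startswith("10.0.9."))) | .user' /tmp/auth.log)
echo "$users"
# eve@example.com
# mallory@example.com
```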

February 27, 2026 · 6 min · 1228 words · Rob Washington

jq: The Swiss Army Knife for JSON on the Command Line

If you work with APIs, logs, or configuration files, you work with JSON. And if you work with JSON from the command line, jq is indispensable. Here are the patterns I use daily.

The Basics

# Pretty print
echo '{"name":"alice","age":30}' | jq '.'

# Extract a field
echo '{"name":"alice","age":30}' | jq '.name'   # "alice"

# Raw output (no quotes)
echo '{"name":"alice","age":30}' | jq -r '.name'   # alice

The -r flag is your friend. Use it whenever you want the actual value, not a JSON string. ...
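One place -r earns its keep is feeding values back into the shell; a minimal sketch:

```shell
# Without -r you capture the JSON string, quotes included
host=$(echo '{"host":"db","port":5432}' | jq -r '.host')
quoted=$(echo '{"host":"db","port":5432}' | jq '.host')
echo "$host"     # db
echo "$quoted"   # "db"
```

Only the -r form is safe to splice into a connection string or a hostname argument.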

February 26, 2026 · 6 min · 1200 words · Rob Washington

jq Mastery: JSON Processing on the Command Line

Every API returns JSON. Every config file is JSON. If you’re not fluent in jq, you’re copying data by hand like it’s 1995.

The Basics

# Pretty print
echo '{"name":"test","value":42}' | jq '.'

# Extract a field
echo '{"name":"test","value":42}' | jq '.name'   # "test"

# Raw output (no quotes)
echo '{"name":"test","value":42}' | jq -r '.name'   # test

Working with APIs

# GitHub API
curl -s https://api.github.com/users/torvalds | jq '.login, .public_repos'

# Extract specific fields
curl -s https://api.github.com/repos/stedolan/jq | jq '{name, stars: .stargazers_count, language}'

# AWS CLI (already outputs JSON)
aws ec2 describe-instances | jq '.Reservations[].Instances[] | {id: .InstanceId, state: .State.Name}'

Array Operations

# Sample data
DATA='[{"name":"alice","age":30},{"name":"bob","age":25},{"name":"carol","age":35}]'

# First element
echo "$DATA" | jq '.[0]'

# Last element
echo "$DATA" | jq '.[-1]'

# Slice
echo "$DATA" | jq '.[0:2]'

# All names
echo "$DATA" | jq '.[].name'

# Array of names
echo "$DATA" | jq '[.[].name]'

# Length
echo "$DATA" | jq 'length'

Filtering

# Select by condition
echo "$DATA" | jq '.[] | select(.age > 28)'

# Multiple conditions
echo "$DATA" | jq '.[] | select(.age > 25 and .name != "carol")'

# Contains
echo '[{"tags":["web","api"]},{"tags":["cli"]}]' | jq '.[] | select(.tags | contains(["api"]))'

# Has key
echo '{"a":1,"b":null}' | jq 'has("a"), has("c")'

Transformation

# Add/modify fields
echo '{"name":"test"}' | jq '. + {status: "active", count: 0}'

# Update existing field
echo '{"count":5}' | jq '.count += 1'

# Delete field
echo '{"a":1,"b":2,"c":3}' | jq 'del(.b)'

# Rename key
echo '{"old_name":"value"}' | jq '{new_name: .old_name}'

# Map over array
echo '[1,2,3,4,5]' | jq 'map(. * 2)'

# Map with objects
echo "$DATA" | jq 'map({username: .name, birth_year: (2026 - .age)})'

String Operations

# Concatenation
echo '{"first":"John","last":"Doe"}' | jq '.first + " " + .last'

# String interpolation
echo '{"name":"test","ver":"1.0"}' | jq '"\(.name)-\(.ver).tar.gz"'

# Split
echo '{"path":"/usr/local/bin"}' | jq '.path | split("/")'

# Join
echo '["a","b","c"]' | jq 'join(",")'

# Upper/lower
echo '"Hello World"' | jq 'ascii_downcase'
echo '"Hello World"' | jq 'ascii_upcase'

# Test regex
echo '{"email":"test@example.com"}' | jq '.email | test("@")'

# Replace
echo '"hello world"' | jq 'gsub("world"; "jq")'

Conditionals

# If-then-else
echo '{"status":200}' | jq 'if .status == 200 then "ok" else "error" end'

# Alternative operator (default value)
echo '{"a":1}' | jq '.b // "default"'

# Null handling
echo '{"a":null}' | jq '.a // "was null"'

# Error handling
echo '{}' | jq '.missing.nested // "not found"'

Grouping and Aggregation

LOGS='[
  {"level":"error","msg":"failed"},
  {"level":"info","msg":"started"},
  {"level":"error","msg":"timeout"},
  {"level":"info","msg":"completed"}
]'

# Group by field
echo "$LOGS" | jq 'group_by(.level)'

# Count per group
echo "$LOGS" | jq 'group_by(.level) | map({level: .[0].level, count: length})'

# Unique values
echo "$LOGS" | jq '[.[].level] | unique'

# Sort
echo "$DATA" | jq 'sort_by(.age)'

# Reverse sort
echo "$DATA" | jq 'sort_by(.age) | reverse'

# Min/max
echo '[5,2,8,1,9]' | jq 'min, max'

# Sum
echo '[1,2,3,4,5]' | jq 'add'

# Average
echo '[1,2,3,4,5]' | jq 'add / length'

Constructing Output

# Build new object
curl -s https://api.github.com/users/torvalds | jq '{
  username: .login,
  repos: .public_repos,
  profile: .html_url
}'

# Build array
echo '{"users":[{"name":"a"},{"name":"b"}]}' | jq '[.users[].name]'

# Multiple outputs to array
echo '{"a":1,"b":2}' | jq '[.a, .b, .a + .b]'

# Key-value pairs
echo '{"a":1,"b":2}' | jq 'to_entries'   # [{"key":"a","value":1},{"key":"b","value":2}]

# Back to object
echo '[{"key":"a","value":1}]' | jq 'from_entries'

# Transform keys
echo '{"old_a":1,"old_b":2}' | jq 'with_entries(.key |= ltrimstr("old_"))'

Real-World Examples

Parse AWS Instance List

aws ec2 describe-instances | jq -r '
  .Reservations[].Instances[] |
  [.InstanceId, .State.Name, (.Tags[]? | select(.Key=="Name") | .Value) // "unnamed"] |
  @tsv
'

Filter Docker Containers

docker inspect $(docker ps -q) | jq '.[] | {
  name: .Name,
  image: .Config.Image,
  status: .State.Status,
  ip: .NetworkSettings.IPAddress
}'

Process Log Files

# Count errors by type
cat app.log | jq -s 'group_by(.error_type) | map({type: .[0].error_type, count: length}) | sort_by(.count) | reverse'

# Extract errors from last hour
cat app.log | jq --arg cutoff "$(date -d '1 hour ago' -Iseconds)" '
  select(.timestamp > $cutoff and .level == "error")
'

Transform Config Files

# Merge configs
jq -s '.[0] * .[1]' base.json override.json

# Update nested value
jq '.database.host = "newhost.example.com"' config.json

# Add to array
jq '.allowed_ips += ["10.0.0.5"]' config.json

Generate Reports

# Kubernetes pod status
kubectl get pods -o json | jq -r '
  .items[] |
  [.metadata.name, .status.phase, (.status.containerStatuses[0].restartCount // 0)] |
  @tsv
' | column -t

Useful Flags

# Compact output (no pretty print)
jq -c '.'

# Raw output (no quotes on strings)
jq -r '.name'

# Raw input (treat input as string, not JSON)
jq -R 'split(",")'

# Slurp (read all inputs into array)
cat *.json | jq -s '.'

# Pass variable
jq --arg name "test" '.name = $name'

# Pass JSON variable
jq --argjson count 42 '.count = $count'

# Read from file
jq --slurpfile users users.json '.users = $users'

# Exit with error if output is null/false
jq -e '.important_field' && echo "exists"

# Sort keys in output
jq -S '.'

Output Formats

# Tab-separated
echo "$DATA" | jq -r '.[] | [.name, .age] | @tsv'

# CSV
echo "$DATA" | jq -r '.[] | [.name, .age] | @csv'

# URI encoding
echo '{"q":"hello world"}' | jq -r '.q | @uri'

# Base64
echo '{"data":"secret"}' | jq -r '.data | @base64'

# Shell-safe
echo '{"cmd":"echo hello"}' | jq -r '.cmd | @sh'

Debugging

# Show type
echo '{"a":[1,2,3]}' | jq '.a | type'

# Show keys
echo '{"a":1,"b":2}' | jq 'keys'

# Debug output (shows intermediate values)
echo '{"x":{"y":{"z":1}}}' | jq '.x | debug | .y | debug | .z'

# Path to value
echo '{"a":{"b":{"c":1}}}' | jq 'path(.. | select(. == 1))'

Quick Reference

# Identity
.

# Field access
.field
.field.nested

# Array access
.[0]  .[-1]  .[2:5]

# Iterate array
.[]

# Pipe
.[] | .name

# Collect into array
[.[] | .name]

# Object construction
{newkey: .oldkey}

# Conditionals
if COND then A else B end
VALUE // DEFAULT

# Comparison
==, !=, <, >, <=, >=
and, or, not

# Array functions
map(f), select(f), sort_by(f), group_by(f), unique, length,
first, last, nth(n), flatten, reverse, contains(x), inside(x),
add, min, max

# String functions
split(s), join(s), test(re), match(re), gsub(re;s),
ascii_downcase, ascii_upcase, ltrimstr(s), rtrimstr(s),
startswith(s), endswith(s)

# Object functions
keys, values, has(k), in(o), to_entries, from_entries, with_entries(f)

# Type functions
type, and the type filters: numbers, strings, booleans, nulls, arrays, objects

jq turns JSON from a data format into a query language. Once you internalize the pipe-and-filter model, you’ll wonder how you ever survived without it. ...
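A minimal illustration of that pipe-and-filter model, on made-up data: filter, project, and aggregate compose inside a single jq program.

```shell
# Filter active users, keep their scores, average them -- one pipeline
avg=$(echo '[{"user":"a","active":true,"score":80},
             {"user":"b","active":false,"score":10},
             {"user":"c","active":true,"score":90}]' |
  jq '[.[] | select(.active) | .score] | add / length')
echo "$avg"   # 85
```

Each stage consumes the previous stage's output, exactly like a shell pipeline, but inside one jq invocation.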

February 25, 2026 · 7 min · 1346 words · Rob Washington

jq: Command-Line JSON Processing

If you work with APIs, logs, or config files, you work with JSON. And jq is how you make that not painful. It’s like sed and awk had a baby that speaks JSON fluently.

The Basics

# Pretty-print JSON
curl -s https://api.example.com/data | jq .

# Get a field
echo '{"name": "Alice", "age": 30}' | jq '.name'   # "Alice"

# Get raw string (no quotes)
echo '{"name": "Alice"}' | jq -r '.name'   # Alice

Navigating Objects

# Nested fields
echo '{"user": {"name": "Alice", "email": "alice@example.com"}}' | jq '.user.name'
# "Alice"

# Multiple fields
echo '{"name": "Alice", "age": 30, "city": "NYC"}' | jq '{name, city}'
# {"name": "Alice", "city": "NYC"}

# Rename fields
echo '{"firstName": "Alice"}' | jq '{name: .firstName}'
# {"name": "Alice"}

Working with Arrays

# Get first element
echo '[1, 2, 3]' | jq '.[0]'   # 1

# Get last element
echo '[1, 2, 3]' | jq '.[-1]'   # 3

# Slice
echo '[1, 2, 3, 4, 5]' | jq '.[1:3]'   # [2, 3]

# Get all elements
echo '[{"name": "Alice"}, {"name": "Bob"}]' | jq '.[].name'
# "Alice"
# "Bob"

# Wrap results in array
echo '[{"name": "Alice"}, {"name": "Bob"}]' | jq '[.[].name]'
# ["Alice", "Bob"]

Filtering

# Select where condition is true
echo '[{"name": "Alice", "age": 30}, {"name": "Bob", "age": 25}]' | \
  jq '.[] | select(.age > 26)'
# {"name": "Alice", "age": 30}

# Multiple conditions
echo '[{"name": "Alice", "active": true}, {"name": "Bob", "active": false}]' | \
  jq '.[] | select(.active == true and .name != "Admin")'

# Contains
echo '[{"tags": ["dev", "prod"]}, {"tags": ["staging"]}]' | \
  jq '.[] | select(.tags | contains(["prod"]))'

# String matching
echo '[{"name": "Alice"}, {"name": "Bob"}, {"name": "Alicia"}]' | \
  jq '.[] | select(.name | startswith("Ali"))'

Transforming Data

# Map over array
echo '[1, 2, 3]' | jq 'map(. * 2)'   # [2, 4, 6]

# Map with objects
echo '[{"name": "Alice", "score": 85}, {"name": "Bob", "score": 92}]' | \
  jq 'map({name, passed: .score >= 90})'
# [{"name": "Alice", "passed": false}, {"name": "Bob", "passed": true}]

# Add fields
echo '{"name": "Alice"}' | jq '. + {role: "admin"}'
# {"name": "Alice", "role": "admin"}

# Update fields
echo '{"name": "alice"}' | jq '.name |= ascii_upcase'
# {"name": "ALICE"}

# Delete fields
echo '{"name": "Alice", "password": "secret"}' | jq 'del(.password)'
# {"name": "Alice"}

Aggregation

# Length
echo '[1, 2, 3, 4, 5]' | jq 'length'   # 5

# Sum
echo '[1, 2, 3, 4, 5]' | jq 'add'   # 15

# Min/Max
echo '[3, 1, 4, 1, 5]' | jq 'min, max'
# 1
# 5

# Unique
echo '[1, 2, 2, 3, 3, 3]' | jq 'unique'   # [1, 2, 3]

# Group by
echo '[{"type": "a", "val": 1}, {"type": "b", "val": 2}, {"type": "a", "val": 3}]' | \
  jq 'group_by(.type) | map({type: .[0].type, total: map(.val) | add})'
# [{"type": "a", "total": 4}, {"type": "b", "total": 2}]

# Sort
echo '[{"name": "Bob"}, {"name": "Alice"}]' | jq 'sort_by(.name)'
# [{"name": "Alice"}, {"name": "Bob"}]

String Operations

# Split
echo '{"path": "/usr/local/bin"}' | jq '.path | split("/")'
# ["", "usr", "local", "bin"]

# Join
echo '["a", "b", "c"]' | jq 'join("-")'   # "a-b-c"

# Replace
echo '{"msg": "Hello World"}' | jq '.msg | gsub("World"; "jq")'
# "Hello jq"

# String interpolation
echo '{"name": "Alice", "age": 30}' | jq '"Name: \(.name), Age: \(.age)"'
# "Name: Alice, Age: 30"

Working with Keys

# Get keys
echo '{"a": 1, "b": 2, "c": 3}' | jq 'keys'
# ["a", "b", "c"]

# Get an object's values (jq's `values` filter selects non-null inputs, not object values)
echo '{"a": 1, "b": 2, "c": 3}' | jq '[.[]]'
# [1, 2, 3]

# Convert object to array of key-value pairs
echo '{"a": 1, "b": 2}' | jq 'to_entries'
# [{"key": "a", "value": 1}, {"key": "b", "value": 2}]

# Convert back
echo '[{"key": "a", "value": 1}]' | jq 'from_entries'
# {"a": 1}

# Transform keys
echo '{"firstName": "Alice", "lastName": "Smith"}' | \
  jq 'with_entries(.key |= ascii_downcase)'
# {"firstname": "Alice", "lastname": "Smith"}

Conditionals

# If-then-else
echo '{"status": 200}' | jq 'if .status == 200 then "OK" else "Error" end'
# "OK"

# Alternative operator (default value)
echo '{"name": "Alice"}' | jq '.age // 0'   # 0
echo '{"name": "Alice", "age": 30}' | jq '.age // 0'   # 30

# Try (ignore errors)
echo '{"data": "not json"}' | jq '.data | try fromjson'
# (no output -- the parse error is suppressed)

Real-World Examples

Parse API Response

# Extract users from paginated API
curl -s 'https://api.example.com/users' | \
  jq '.data[] | {id, name: .attributes.name, email: .attributes.email}'

# Get specific fields as TSV
curl -s 'https://api.example.com/users' | \
  jq -r '.data[] | [.id, .attributes.name] | @tsv'

Process Log Files

# Parse JSON logs, filter errors
cat app.log | jq -c 'select(.level == "error")'

# Count by status code
cat access.log | jq -s 'group_by(.status) | map({status: .[0].status, count: length})'

# Get unique IPs
cat access.log | jq -s '[.[].ip] | unique'

Transform Config Files

# Merge configs
jq -s '.[0] * .[1]' base.json override.json

# Update nested value
jq '.database.host = "newhost.example.com"' config.json

# Add to array
jq '.allowed_ips += ["10.0.0.5"]' config.json

AWS CLI Output

# List EC2 instance IDs and states
aws ec2 describe-instances | \
  jq -r '.Reservations[].Instances[] | [.InstanceId, .State.Name] | @tsv'

# Get running instances
aws ec2 describe-instances | \
  jq '.Reservations[].Instances[] | select(.State.Name == "running") | .InstanceId'

# Format as table
aws ec2 describe-instances | \
  jq -r '["ID", "Type", "State"], (.Reservations[].Instances[] | [.InstanceId, .InstanceType, .State.Name]) | @tsv' | \
  column -t

Kubernetes

# Get pod names and statuses
kubectl get pods -o json | \
  jq -r '.items[] | [.metadata.name, .status.phase] | @tsv'

# Find pods not running
kubectl get pods -o json | \
  jq '.items[] | select(.status.phase != "Running") | .metadata.name'

# Get container images
kubectl get pods -o json | \
  jq -r '[.items[].spec.containers[].image] | unique | .[]'

Output Formats

# Compact (one line)
echo '{"a": 1}' | jq -c .   # {"a":1}

# Tab-separated
echo '[{"a": 1, "b": 2}]' | jq -r '.[] | [.a, .b] | @tsv'   # 1   2

# CSV
echo '[{"a": 1, "b": 2}]' | jq -r '.[] | [.a, .b] | @csv'   # 1,2

# URI encode
echo '{"q": "hello world"}' | jq -r '.q | @uri'   # hello%20world

# Base64
echo '{"data": "hello"}' | jq -r '.data | @base64'   # aGVsbG8=

Quick Reference

Pattern         Description
.field          Get field
.[]             Iterate array
.[0]            First element
.[-1]           Last element
select(cond)    Filter
map(expr)       Transform array
. + {}          Add fields
del(.field)     Remove field
// default      Default value
@tsv, @csv      Output format
-r              Raw output
-c              Compact output
-s              Slurp (read all input as array)

jq has a learning curve, but it pays off quickly. Once you internalize the patterns, you’ll wonder how you ever worked with JSON without it. Start with .field, .[].field, and select() — those three cover 80% of use cases. ...
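A quick sketch of those three patterns together, on a made-up API response:

```shell
# Hypothetical API response
RESP='{"items":[{"name":"web","ready":true},{"name":"db","ready":false}]}'

# .field -- pull out the collection
echo "$RESP" | jq '.items'

# .[].field -- one value per element
echo "$RESP" | jq -r '.items[].name'
# web
# db

# select() -- keep only what matters
not_ready=$(echo "$RESP" | jq -r '.items[] | select(.ready | not) | .name')
echo "$not_ready"   # db
```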

February 24, 2026 · 7 min · 1338 words · Rob Washington

JSON API Design: Conventions That Make Integration Easy

A well-designed API feels obvious. Endpoints are predictable, responses are consistent, errors are helpful. A poorly designed API creates confusion, support tickets, and workarounds. These conventions create APIs that are intuitive to integrate.

Resource Naming

Use nouns, not verbs. Plural for collections:

✅ Good:
GET    /users        # List users
GET    /users/123    # Get user 123
POST   /users        # Create user
PUT    /users/123    # Update user 123
DELETE /users/123    # Delete user 123

❌ Bad:
GET  /getUsers
POST /createUser
GET  /user/123
POST /users/123/delete

Nested resources for relationships: ...
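For instance, nested routes for a relationship might look like this (illustrative paths, not taken from the truncated section):

```
GET  /users/123/orders        # Orders belonging to user 123
GET  /users/123/orders/456    # One order within that collection
POST /users/123/orders        # Create an order for user 123
```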

February 23, 2026 · 13 min · 2611 words · Rob Washington

Structured Logging: Making Logs Queryable and Actionable

Plain text logs are for humans. Structured logs are for machines. In production, machines need to read your logs before humans do. When your service handles thousands of requests per second, grep stops working. You need logs that can be indexed, queried, aggregated, and alerted on. That means structure.

The Problem with Text Logs

[2026-02-16 08:30:15] INFO: User jimmy@example.com logged in from 192.168.1.50
[2026-02-16 08:30:16] ERROR: Payment failed for order 12345 - insufficient funds
[2026-02-16 08:30:17] WARN: High memory usage detected: 87%

Looks readable. But try answering: ...

February 16, 2026 · 7 min · 1406 words · Rob Washington