tmux Patterns for Remote Work and Long-Running Tasks

SSH connections drop. Scripts need to run overnight. You need six terminals for one task. tmux solves all of this.

Why tmux?

- Persistence: sessions survive disconnects
- Multiplexing: multiple windows and panes in one connection
- Sharing: pair programming on the same session
- Automation: scriptable terminal control

Getting Started

```shell
# Start a new session
tmux

# Start a named session
tmux new -s work

# Detach (inside tmux)
Ctrl+b d

# List sessions
tmux ls

# Reattach
tmux attach -t work
```

The prefix key is Ctrl+b by default. Press it, then the command key. ...

February 28, 2026 · 6 min · 1276 words · Rob Washington

jq Patterns for JSON Processing on the Command Line

JSON is everywhere. APIs return it, configs use it, logs contain it. jq is the Swiss Army knife for processing it all from the command line.

Basic Selection

```shell
# Pretty print
echo '{"name":"alice","age":30}' | jq .

# Extract a field
echo '{"name":"alice","age":30}' | jq '.name'
# Output: "alice"

# Raw output (no quotes)
echo '{"name":"alice","age":30}' | jq -r '.name'
# Output: alice

# Nested fields
echo '{"user":{"name":"alice"}}' | jq '.user.name'
```

Working with Arrays

```shell
# Get all elements
echo '[1,2,3]' | jq '.[]'
# Output:
# 1
# 2
# 3

# Get specific index
echo '["a","b","c"]' | jq '.[1]'
# Output: "b"

# Slice
echo '[1,2,3,4,5]' | jq '.[2:4]'
# Output: [3,4]

# First/last
echo '[1,2,3]' | jq 'first'  # 1
echo '[1,2,3]' | jq 'last'   # 3
```

Filtering

```shell
# Select objects matching a condition
echo '[{"name":"alice","age":30},{"name":"bob","age":25}]' | \
  jq '.[] | select(.age > 27)'
# Output: {"name":"alice","age":30}

# Multiple conditions
jq '.[] | select(.status == "active" and .role == "admin")'

# Contains
jq '.[] | select(.tags | contains(["important"]))'

# Regex matching
jq '.[] | select(.email | test("@company\\.com$"))'
```

Transforming Data

```shell
# Create a new object
echo '{"first":"Alice","last":"Smith"}' | \
  jq '{fullName: (.first + " " + .last)}'
# Output: {"fullName":"Alice Smith"}

# Map over array
echo '[1,2,3]' | jq 'map(. * 2)'
# Output: [2,4,6]

# Transform array of objects
echo '[{"name":"alice"},{"name":"bob"}]' | \
  jq 'map({user: .name, active: true})'
```

API Response Processing

```shell
# Extract data from the GitHub API
curl -s https://api.github.com/users/torvalds/repos | \
  jq '.[] | {name, stars: .stargazers_count, language}' | \
  jq -s 'sort_by(.stars) | reverse | .[0:5]'

# Get just names
curl -s https://api.github.com/users/torvalds/repos | \
  jq -r '.[].name'

# Count items
curl -s https://api.github.com/users/torvalds/repos | \
  jq 'length'
```

Aggregation

```shell
# Sum
echo '[{"value":10},{"value":20},{"value":30}]' | \
  jq '[.[].value] | add'
# Output: 60

# Average
jq '[.[].value] | add / length'

# Group by
echo '[{"type":"a","n":1},{"type":"b","n":2},{"type":"a","n":3}]' | \
  jq 'group_by(.type) | map({type: .[0].type, total: [.[].n] | add})'

# Count by field
jq 'group_by(.status) | map({status: .[0].status, count: length})'
```

Building Output

```shell
# Concatenate to string
echo '{"host":"db","port":5432}' | \
  jq -r '"\(.host):\(.port)"'
# Output: db:5432

# Create CSV
echo '[{"name":"alice","age":30},{"name":"bob","age":25}]' | \
  jq -r '.[] | [.name, .age] | @csv'
# Output:
# "alice",30
# "bob",25

# Create TSV
jq -r '.[] | [.name, .age] | @tsv'

# URI encode
jq -r '@uri'

# Base64
jq -r '@base64'
```

Conditional Logic

```shell
# If/then/else
echo '{"score":85}' | \
  jq 'if .score >= 90 then "A" elif .score >= 80 then "B" else "C" end'

# Alternative operator (default values)
echo '{"name":"alice"}' | jq '.age // 0'
# Output: 0

# Suppress errors with a fallback
echo '{"a":1}' | jq '.b.c.d // "missing"'
# Output: "missing"
```

Modifying JSON

```shell
# Update field
echo '{"name":"alice","age":30}' | jq '.age = 31'

# Add field
echo '{"name":"alice"}' | jq '. + {active: true}'

# Delete field
echo '{"name":"alice","temp":123}' | jq 'del(.temp)'

# Update nested
echo '{"user":{"name":"alice"}}' | jq '.user.name = "bob"'

# Recursive update
jq '.. | objects | .timestamp |= (. // now)'
```

Multiple Files

```shell
# Combine objects from files
jq -s '.[0] * .[1]' defaults.json overrides.json

# Process files independently
jq -r '.name' file1.json file2.json

# Slurp into array
jq -s '.' file1.json file2.json
# Output: [{...}, {...}]
```

Stream Processing

For large files, use streaming: ...
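The excerpt truncates before its streaming examples. One standard idiom from the jq manual, sketched here with a throwaway file: `--stream` parses input incrementally as `[path, value]` events, so a huge top-level array can be processed one element at a time without loading it all into memory.

```shell
# A small stand-in for a file too large to slurp
echo '[{"a":1},{"a":2}]' > /tmp/stream_demo.json

# Emit each top-level array element as its own compact JSON line
jq -cn --stream 'fromstream(1 | truncate_stream(inputs))' /tmp/stream_demo.json
# Output:
# {"a":1}
# {"a":2}
```

`truncate_stream(1; ...)`-style truncation drops the leading array index from each event path, and `fromstream` reassembles the events into whole values, one per element.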

February 28, 2026 · 5 min · 1029 words · Rob Washington

SSH Config Patterns for Managing Multiple Servers

If you’re still typing ssh -i ~/.ssh/mykey.pem ec2-user@ec2-54-123-45-67.compute-1.amazonaws.com, you’re working too hard. SSH config transforms verbose commands into simple ssh prod invocations.

The Basics

Create or edit ~/.ssh/config:

```
Host prod
    HostName 54.123.45.67
    User ec2-user
    IdentityFile ~/.ssh/prod-key.pem
```

Now ssh prod connects with the right key and user. No more remembering IP addresses. ...
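Beyond a single alias, the same format supports shared defaults, wildcards, and jump hosts. A sketch (host names, users, and addresses here are made up):

```
# Shared defaults for every host
Host *
    ServerAliveInterval 60
    ServerAliveCountMax 3

# Wildcard: ssh web1, ssh web2, ...
Host web*
    User deploy
    IdentityFile ~/.ssh/deploy-key

# Reach a private host through a bastion
Host internal-db
    HostName 10.0.2.15
    User admin
    ProxyJump bastion.example.com
```

More specific Host blocks should come first: ssh uses the first value it finds for each option, with `Host *` acting as the fallback.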

February 28, 2026 · 16 min · 3216 words · Rob Washington

Makefiles for Modern Development Workflows

Makefiles are ancient. They’re also incredibly useful for modern development. Here’s how to use them as your project’s command center.

Why Make in 2026?

Every project has commands you run repeatedly:

- Start development servers
- Run tests
- Build containers
- Deploy to environments
- Format and lint code

You could remember them all. Or document them in a README that gets stale. Or put them in a Makefile where they’re executable documentation.

```make
.PHONY: dev test build deploy

dev:
	docker-compose up -d
	npm run dev

test:
	pytest -v

build:
	docker build -t myapp:latest .

deploy:
	kubectl apply -f k8s/
```

Now make dev starts everything. New team member? Run make help. ...
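The excerpt points newcomers at make help without showing it. A common self-documenting pattern, sketched here on the assumption that each target carries a trailing `## description` comment:

```make
.PHONY: help
help:  ## Show available targets
	@grep -E '^[a-zA-Z_-]+:.*## ' $(MAKEFILE_LIST) | \
		awk 'BEGIN {FS = ":.*## "} {printf "%-12s %s\n", $$1, $$2}'

dev:  ## Start the dev environment
	docker-compose up -d
```

Running make help then prints each annotated target name alongside its description, so the Makefile documents itself.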

February 28, 2026 · 6 min · 1075 words · Rob Washington

Environment Variable Patterns for 12-Factor Apps

Environment variables are the 12-factor way to configure applications. But “just use env vars” glosses over real complexity. Here’s how to do it well.

The Basics Done Right

Type Coercion

Environment variables are always strings. Handle conversion explicitly:

```python
import os

def get_env_bool(key: str, default: bool = False) -> bool:
    value = os.getenv(key, "").lower()
    if value in ("true", "1", "yes", "on"):
        return True
    if value in ("false", "0", "no", "off"):
        return False
    return default

def get_env_int(key: str, default: int = 0) -> int:
    try:
        return int(os.getenv(key, default))
    except ValueError:
        return default

# Usage
DEBUG = get_env_bool("DEBUG", False)
PORT = get_env_int("PORT", 8080)
WORKERS = get_env_int("WORKERS", 4)
```

Required vs Optional

Fail fast on missing required config: ...
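The excerpt breaks off at the fail-fast advice. A minimal helper in the same style (the function name is my own, not necessarily the post's):

```python
import os

def require_env(key: str) -> str:
    """Fail fast when a required setting is missing or empty."""
    value = os.getenv(key)
    if not value:
        raise RuntimeError(f"Missing required environment variable: {key}")
    return value

# Required settings crash at startup with a clear message,
# not at first use deep inside a request handler.
```

Calling require_env("DATABASE_URL") at module import time means a misconfigured deployment dies immediately, where orchestrators and humans can see it.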

February 28, 2026 · 5 min · 994 words · Rob Washington

Container Health Check Patterns That Actually Work

Your container says it’s healthy. Your users say the app is broken. Sound familiar? Basic health checks only tell you if a process is running. Here’s how to build checks that catch real problems.

Beyond “Is It Alive?”

Most health checks look like this:

```dockerfile
HEALTHCHECK CMD curl -f http://localhost:8080/health || exit 1
```

This tells you the HTTP server responds. It doesn’t tell you:

- Can the app reach the database?
- Is the cache connected?
- Are critical background workers running?
- Is the disk filling up?

Layered Health Checks

Implement three levels: ...
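The excerpt stops before spelling the levels out. One way to sketch a liveness / readiness / deep-diagnostics split (the probe callables here are hypothetical stand-ins, not the post's API):

```python
from typing import Callable, Dict

Probe = Callable[[], bool]

def check_liveness() -> bool:
    # Process is up and can execute code; restart the container if this fails.
    return True

def check_readiness(deps: Dict[str, Probe]) -> bool:
    # Ready to serve traffic: every critical dependency responds.
    return all(probe() for probe in deps.values())

def check_deep(deps: Dict[str, Probe], disk_free_ratio: float) -> dict:
    # Detailed diagnostics for dashboards and alerting, not for restarts.
    return {
        "dependencies": {name: probe() for name, probe in deps.items()},
        "disk_ok": disk_free_ratio > 0.10,
    }
```

The split matters because each answer drives a different action: failed liveness restarts the container, failed readiness drains traffic, and the deep check only pages a human.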

February 28, 2026 · 4 min · 744 words · Rob Washington

Practical Patterns for Building Autonomous AI Agents

The gap between “AI demo” and “AI that runs reliably” is enormous. Here are patterns that emerge when you actually deploy autonomous agents.

The Heartbeat Pattern

Agents need periodic check-ins, not just reactive responses. A heartbeat system provides:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class HeartbeatState:
    last_email_check: datetime
    last_calendar_check: datetime
    last_service_health: datetime

async def heartbeat(state: HeartbeatState):
    now = datetime.now()
    # Compare elapsed time against a timedelta (timedelta has no .hours attribute)
    if now - state.last_service_health >= timedelta(hours=2):
        await check_services()
        state.last_service_health = now
    if now - state.last_email_check >= timedelta(hours=4):
        await check_inbox()
        state.last_email_check = now
```

The key insight: batch periodic tasks into a single heartbeat rather than creating dozens of scheduled jobs. This reduces API calls and keeps context coherent. ...

February 28, 2026 · 4 min · 642 words · Rob Washington

htop: Process Monitoring for Humans

top works. htop works better. It’s colorful, interactive, and actually pleasant to use. Here’s how to get the most from it.

Installation

```shell
# Debian/Ubuntu
sudo apt install htop

# RHEL/CentOS/Fedora
sudo dnf install htop

# macOS
brew install htop
```

The Interface

Launch with htop. You’ll see:

Top section:
- CPU bars (one per core)
- Memory and swap usage
- Tasks, load average, uptime

Process list: ...

February 27, 2026 · 4 min · 664 words · Rob Washington

grep: Pattern Matching That Actually Works

You know grep "error" logfile.txt. But grep can do so much more — recursive searches, context lines, inverse matching, and regex patterns that turn hours of manual searching into seconds.

The Basics

```shell
# Search for pattern in file
grep "error" app.log

# Case-insensitive
grep -i "error" app.log

# Show line numbers
grep -n "error" app.log

# Count matches
grep -c "error" app.log

# Only show filenames with matches
grep -l "error" *.log
```

Recursive Search

```shell
# Search all files in directory tree
grep -r "TODO" ./src

# With line numbers
grep -rn "TODO" ./src

# Include only certain files
grep -r --include="*.py" "import os" .

# Exclude directories
grep -r --exclude-dir=node_modules "console.log" .

# Multiple excludes
grep -r --exclude-dir={node_modules,.git,dist} "function" .
```

Context Lines

When you find a match, you often need surrounding context: ...
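The excerpt cuts off before its context examples. The standard flags are -A (lines after), -B (lines before), and -C (both sides); a quick sketch with a throwaway file:

```shell
# Build a small sample log to search
printf 'setup\nconnect\nerror: timeout\nretry\ncleanup\n' > /tmp/ctx_demo.log

# 2 lines After each match
grep -A 2 "error" /tmp/ctx_demo.log

# 2 lines Before each match
grep -B 2 "error" /tmp/ctx_demo.log

# 1 line of Context on each side
grep -C 1 "error" /tmp/ctx_demo.log
```

With -C 1 the match arrives sandwiched between its neighbors (connect, the error line, retry), which is usually enough to see what the code was doing when it failed.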

February 27, 2026 · 6 min · 1087 words · Rob Washington

sed: Edit Files Without Opening Them

You need to change a config value across 50 files. You could open each one, or:

```shell
sed -i 's/old_value/new_value/g' *.conf
```

Done. sed is the stream editor — it transforms text as it flows through. Master it, and you’ll never manually edit repetitive files again.

The Basics

```shell
# Replace first occurrence per line
echo "hello hello" | sed 's/hello/hi/'  # hi hello

# Replace all occurrences (g = global)
echo "hello hello" | sed 's/hello/hi/g'  # hi hi

# Replace in file (print to stdout)
sed 's/foo/bar/g' file.txt

# Replace in place (-i)
sed -i 's/foo/bar/g' file.txt

# Backup before in-place edit
sed -i.bak 's/foo/bar/g' file.txt
```

The -i flag is powerful and dangerous. Always test without it first. ...
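That test-first workflow can be sketched end to end with a throwaway file (the file name and values are made up):

```shell
# A stand-in config file for the demo
printf 'mode=old_value\n' > /tmp/sed_demo.conf

# 1. Preview the substitution on stdout; the file is untouched
sed 's/old_value/new_value/g' /tmp/sed_demo.conf

# 2. Once the output looks right, apply in place, keeping a backup
sed -i.bak 's/old_value/new_value/g' /tmp/sed_demo.conf

cat /tmp/sed_demo.conf       # mode=new_value
cat /tmp/sed_demo.conf.bak   # mode=old_value
```

The -i.bak spelling has a second advantage: it works on both GNU sed and the BSD sed shipped with macOS, where a bare -i without a suffix argument is an error.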

February 27, 2026 · 5 min · 911 words · Rob Washington