tmux: Terminal Sessions That Survive Disconnects

SSH into a server, start a long-running process, lose connection, lose your work. tmux solves this. It creates persistent terminal sessions that survive disconnects and lets you manage multiple terminals from one window.

Basic Usage

```
# Start new session
tmux

# Start named session
tmux new -s mysession

# List sessions
tmux ls

# Attach to session
tmux attach -t mysession
tmux a -t mysession   # short form

# Attach to last session
tmux attach
tmux a

# Detach from session (inside tmux)
Ctrl-b d

# Kill session
tmux kill-session -t mysession
```

The Prefix Key

All tmux commands start with a prefix key (default: Ctrl-b), then another key: ...
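As an illustration of the prefix key, here is a minimal ~/.tmux.conf sketch that remaps the prefix from Ctrl-b to Ctrl-a — a common preference, not a recommendation; the key choice is just an example:

```
# ~/.tmux.conf — remap the prefix key (example preference)
unbind C-b
set -g prefix C-a
bind C-a send-prefix

# Reload this file from inside tmux: prefix, then :source-file ~/.tmux.conf
```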

February 24, 2026 Β· 10 min Β· 1992 words Β· Rob Washington

xargs: Turning Output into Arguments

Many commands output lists. Many commands need arguments. xargs connects them. It reads input and runs a command with that input as arguments.

Basic Usage

```
# Without xargs (doesn't work)
find . -name "*.txt" | rm   # rm doesn't read stdin

# With xargs (works)
find . -name "*.txt" | xargs rm

# What's happening
echo "file1 file2 file3" | xargs rm
# Becomes: rm file1 file2 file3
```

Input Handling

```
# Default: split on whitespace
echo "a b c" | xargs echo
# Output: a b c

# One item per line
echo -e "a\nb\nc" | xargs echo
# Output: a b c

# Handle spaces in filenames (-0 with null delimiter)
find . -name "*.txt" -print0 | xargs -0 rm

# Treat each line as one argument
cat list.txt | xargs -d '\n' command
```

Argument Placement

```
# Default: append to end
echo "file.txt" | xargs wc -l
# Becomes: wc -l file.txt

# Custom placement with -I
echo "file.txt" | xargs -I {} cp {} {}.bak
# Becomes: cp file.txt file.txt.bak

# Multiple uses of placeholder
echo "file.txt" | xargs -I {} sh -c 'echo "Processing {}"; wc -l {}'
```

Limiting Arguments

```
# One argument per command execution
find . -name "*.txt" | xargs -n 1 rm
# Runs: rm file1.txt, rm file2.txt, rm file3.txt (separately)

# Two arguments per execution
echo "a b c d e f" | xargs -n 2 echo
# Output:
# a b
# c d
# e f

# Limit by size (bytes)
echo "a b c d e f" | xargs -s 10 echo
```

Parallel Execution

```
# Run 4 processes in parallel
find . -name "*.jpg" | xargs -P 4 -I {} convert {} -resize 800x600 {}.resized.jpg

# All available CPUs
find . -name "*.log" | xargs -P 0 gzip

# Combined with -n
cat urls.txt | xargs -n 1 -P 10 curl -O
```

Confirmation (-p) and Verbose (-t)

```
# Ask before each execution
find . -name "*.bak" | xargs -p rm

# Show command before running
find . -name "*.txt" | xargs -t wc -l
```

Handling Empty Input

```
# Don't run if no input
find . -name "*.missing" | xargs --no-run-if-empty rm

# Short form
find . -name "*.missing" | xargs -r rm
```

Practical Examples

Bulk File Operations

```
# Delete files matching pattern
find . -name "*.tmp" -print0 | xargs -0 rm -f

# Move files to directory
find . -name "*.jpg" -print0 | xargs -0 -I {} mv {} ./images/

# Change permissions
find . -type f -name "*.sh" | xargs chmod +x

# Compress multiple files
find . -name "*.log" -mtime +7 | xargs gzip
```

Search and Process

```
# Search in found files
find . -name "*.py" | xargs grep "import os"

# Count lines in all matching files
find . -name "*.js" | xargs wc -l

# Replace text in multiple files
find . -name "*.txt" | xargs sed -i 's/old/new/g'
```

Git Operations

```
# Add specific files
git status --short | awk '{print $2}' | xargs git add

# Remove deleted files from tracking
git ls-files --deleted | xargs git rm

# Checkout specific files
echo "file1.txt file2.txt" | xargs git checkout --
```

Download Multiple URLs

```
# Download sequentially
cat urls.txt | xargs -n 1 curl -O

# Download in parallel (10 at a time)
cat urls.txt | xargs -n 1 -P 10 curl -O

# wget version
cat urls.txt | xargs -n 1 -P 5 wget -q
```

Docker Operations

```
# Stop all containers
docker ps -q | xargs docker stop

# Remove all stopped containers
docker ps -aq | xargs docker rm

# Remove images by pattern
docker images | grep "none" | awk '{print $3}' | xargs docker rmi

# Pull multiple images
echo "nginx redis postgres" | xargs -n 1 docker pull
```

Process Management

```
# Kill processes by name
pgrep -f "pattern" | xargs kill

# Kill processes by name (safer: -r skips the call on empty input)
pgrep -f "my-app" | xargs -r kill -9

# Send signal to multiple PIDs
cat pids.txt | xargs kill -HUP
```

Package Management

```
# Install multiple packages
echo "vim git curl wget" | xargs sudo apt install -y

# Uninstall packages from list
cat remove.txt | xargs sudo apt remove -y

# pip install from list
cat requirements.txt | xargs -n 1 pip install
```

Combining with Other Tools

With find

```
# Process found files
find . -name "*.md" -print0 | xargs -0 -I {} pandoc {} -o {}.html

# Archive old files
find /var/log -name "*.log" -mtime +30 -print0 | xargs -0 tar -czvf old-logs.tar.gz
```

With grep

```
# Files containing pattern -> process
grep -l "TODO" *.py | xargs -I {} echo "File with TODOs: {}"

# Extract matches and process
grep -oh "https://[^ ]*" urls.txt | xargs -n 1 -P 5 curl -sI | grep "HTTP"
```

With awk

```
# Select column and process
ps aux | awk '$3 > 50 {print $2}' | xargs kill

# Format output for xargs
cat data.csv | awk -F, '{print $1}' | xargs -I {} echo "ID: {}"
```

Error Handling

```
# Continue on errors
find . -name "*.txt" | xargs -I {} sh -c 'command {} || true'

# Stop on first error (default behavior)
find . -name "*.txt" | xargs command

# Exit codes
echo "a b c" | xargs false
echo $?   # Non-zero
```

Performance Comparison

```
# Slow: one process per file
for f in *.txt; do wc -l "$f"; done

# Faster: batch arguments
ls *.txt | xargs wc -l

# Fastest: parallel execution
ls *.txt | xargs -P 4 -n 10 wc -l
```

GNU Parallel Alternative

For complex parallel jobs, GNU Parallel offers more features: ...
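The batching (-n) and substitution (-I) flags above behave quite differently; a small self-contained sketch makes the contrast concrete (printf stands in for any list-producing command):

```shell
# -n 2 groups the input into batches of two arguments per echo invocation
printf 'a\nb\nc\n' | xargs -n 2 echo
# Output:
# a b
# c

# -I {} runs the command once per input line, substituting {} wherever it appears
printf 'x\ny\n' | xargs -I {} echo "item: {} -> {}.bak"
# Output:
# item: x -> x.bak
# item: y -> y.bak
```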

February 24, 2026 Β· 6 min Β· 1237 words Β· Rob Washington

sed: Stream Editing for Text Transformation

sed (stream editor) processes text line by line, applying transformations as data flows through. It's the scalpel to awk's Swiss army knife — focused on text substitution and line manipulation.

Basic Substitution

```
# Replace first occurrence per line
sed 's/old/new/' file.txt

# Replace all occurrences per line
sed 's/old/new/g' file.txt

# Case insensitive
sed 's/old/new/gi' file.txt

# Replace Nth occurrence
sed 's/old/new/2' file.txt   # Second occurrence only
```

Delimiters

When patterns contain slashes, use different delimiters: ...
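A quick sketch of the delimiter trick: whatever character follows the s becomes the delimiter, which keeps path substitutions readable:

```shell
# Using | as the delimiter avoids escaping every / in the pattern
echo "/usr/local/bin/tool" | sed 's|/usr/local|/opt|'
# Output: /opt/bin/tool

# The same substitution with the default delimiter needs escapes
echo "/usr/local/bin/tool" | sed 's/\/usr\/local/\/opt/'
# Output: /opt/bin/tool
```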

February 24, 2026 Β· 5 min Β· 1016 words Β· Rob Washington

awk One-Liners: Text Processing Power

awk is a programming language disguised as a command-line tool. It processes text line by line, splitting each into fields. Most tasks need just one line.

The Basics

```
# Print entire line
awk '{print}' file.txt

# Print specific field (space-delimited)
awk '{print $1}' file.txt        # First field
awk '{print $2}' file.txt        # Second field
awk '{print $NF}' file.txt       # Last field
awk '{print $(NF-1)}' file.txt   # Second to last

# Print multiple fields
awk '{print $1, $3}' file.txt

# Custom output format
awk '{print $1 " -> " $2}' file.txt
```

Field Separators

```
# Colon-separated (like /etc/passwd)
awk -F: '{print $1}' /etc/passwd

# Tab-separated
awk -F'\t' '{print $2}' data.tsv

# Multiple separators
awk -F'[,;]' '{print $1}' file.txt

# Set output separator
awk -F: 'BEGIN{OFS=","} {print $1,$3}' /etc/passwd
```

Filtering Lines

```
# Lines matching pattern
awk '/error/' logfile.txt

# Lines NOT matching pattern
awk '!/debug/' logfile.txt

# Field matches value
awk '$3 == "ERROR"' logfile.txt

# Numeric comparison
awk '$2 > 100' data.txt

# Multiple conditions
awk '$2 > 100 && $3 == "active"' data.txt

# Line number range
awk 'NR >= 10 && NR <= 20' file.txt
```

Built-in Variables

```
NR    # Current line number (total)
NF    # Number of fields in current line
FNR   # Line number in current file
FS    # Field separator (input)
OFS   # Output field separator
RS    # Record separator (default: newline)
ORS   # Output record separator
```

```
# Print line numbers
awk '{print NR, $0}' file.txt

# Print lines with more than 3 fields
awk 'NF > 3' file.txt

# Print total lines at end
awk 'END{print NR}' file.txt
```

Arithmetic

```
# Sum a column
awk '{sum += $2} END{print sum}' data.txt

# Average
awk '{sum += $2; count++} END{print sum/count}' data.txt

# Min/Max
awk 'NR==1{min=max=$2} $2>max{max=$2} $2<min{min=$2} END{print min, max}' data.txt

# Calculate percentage
awk '{print $1, $2, ($2/$3)*100 "%"}' data.txt
```

String Operations

```
# Length of field
awk '{print length($1)}' file.txt

# Substring
awk '{print substr($1, 1, 3)}' file.txt   # First 3 chars

# Convert case
awk '{print toupper($1)}' file.txt
awk '{print tolower($1)}' file.txt

# String concatenation
awk '{print $1 $2}' file.txt       # No space
awk '{print $1 " " $2}' file.txt   # With space

# Split string
awk '{split($1, arr, "-"); print arr[1]}' file.txt
```

Conditional Logic

```
# If-else
awk '{if ($2 > 100) print "high"; else print "low"}' data.txt

# Ternary
awk '{print ($2 > 100 ? "high" : "low")}' data.txt

# Multiple conditions
awk '{
  if ($2 > 100) status = "high"
  else if ($2 > 50) status = "medium"
  else status = "low"
  print $1, status
}' data.txt
```

BEGIN and END

```
# Header and footer
awk 'BEGIN{print "Name\tScore"} {print $1"\t"$2} END{print "---\nTotal: " NR}' data.txt

# Initialize variables
awk 'BEGIN{count=0} /error/{count++} END{print count " errors"}' logfile.txt
```

Practical One-Liners

Log Analysis

```
# Count occurrences of each status code
awk '{print $9}' access.log | sort | uniq -c | sort -rn

# Or all in awk
awk '{count[$9]++} END{for (code in count) print count[code], code}' access.log

# Requests per IP
awk '{count[$1]++} END{for (ip in count) print count[ip], ip}' access.log | sort -rn | head

# Slow requests (response time > 1s)
awk '$NF > 1.0 {print $7, $NF}' access.log
```

CSV Processing

```
# Print specific columns
awk -F, '{print $1","$3}' data.csv

# Skip header
awk -F, 'NR > 1 {print $2}' data.csv

# Sum a column
awk -F, 'NR > 1 {sum += $3} END{print sum}' data.csv

# Filter by value
awk -F, '$4 == "active"' data.csv
```

System Administration

```
# Disk usage over 80%
df -h | awk '$5+0 > 80 {print $6, $5}'

# Memory by process
ps aux | awk '{mem[$11] += $6} END{for (proc in mem) print mem[proc], proc}' | sort -rn | head

# Users with bash shell
awk -F: '$7 ~ /bash/ {print $1}' /etc/passwd

# Show listening ports
netstat -tlnp | awk '$6 == "LISTEN" {print $4}'
```

Data Transformation

```
# Transpose rows to columns
awk '{for (i=1; i<=NF; i++) a[i,NR]=$i} END{for (i=1; i<=NF; i++) {for (j=1; j<=NR; j++) printf a[i,j] " "; print ""}}' file.txt

# Remove duplicate lines (preserving order)
awk '!seen[$0]++' file.txt

# Print unique values from column
awk '{print $2}' file.txt | awk '!seen[$0]++'

# Join lines with comma
awk '{printf "%s%s", sep, $0; sep=","} END{print ""}' file.txt
```

Text Manipulation

```
# Remove blank lines
awk 'NF' file.txt

# Remove leading/trailing whitespace
awk '{$1=$1}1' file.txt

# Replace field value
awk '{$2 = "REDACTED"; print}' file.txt

# Add line numbers
awk '{print NR": "$0}' file.txt

# Print every Nth line
awk 'NR % 5 == 0' file.txt
```

Combining with Other Tools

```
# Filter then process
grep "ERROR" logfile.txt | awk '{print $5}'

# Process then sort
awk -F: '{print $3, $1}' /etc/passwd | sort -n

# Use in pipeline
cat data.txt | awk '{print $2}' | sort | uniq -c
```

Multi-line Scripts

For complex logic, use a script file: ...
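For the multi-line case, the one-liners above can move into a file passed with awk -f; a minimal sketch (the /tmp path and filename are just for illustration):

```shell
# Write a small awk program to a file
cat > /tmp/sum.awk <<'EOF'
# Accumulate column 1, report the total at the end
{ sum += $1 }
END { print "total:", sum }
EOF

# Run it against stdin with -f
printf '1\n2\n3\n' | awk -f /tmp/sum.awk
# Output: total: 6
```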

February 24, 2026 Β· 6 min Β· 1151 words Β· Rob Washington

Cron Jobs Done Right: Scheduling That Doesn't Break

Cron has been scheduling tasks on Unix systems since 1975. It's simple, reliable, and available everywhere. But that simplicity hides gotchas that break jobs in production.

Cron Syntax

```
┌───────────── minute (0-59)
│ ┌───────────── hour (0-23)
│ │ ┌───────────── day of month (1-31)
│ │ │ ┌───────────── month (1-12)
│ │ │ │ ┌───────────── day of week (0-7, 0 and 7 are Sunday)
│ │ │ │ │
* * * * * command
```

Common Schedules

```
# Every minute
* * * * * /path/to/script.sh

# Every hour at minute 0
0 * * * * /path/to/script.sh

# Every day at midnight
0 0 * * * /path/to/script.sh

# Every day at 2:30 AM
30 2 * * * /path/to/script.sh

# Every Monday at 9 AM
0 9 * * 1 /path/to/script.sh

# Every 15 minutes
*/15 * * * * /path/to/script.sh

# Every weekday at 6 PM
0 18 * * 1-5 /path/to/script.sh

# First day of every month at midnight
0 0 1 * * /path/to/script.sh
```

Special Strings

```
@reboot    # Run once at startup
@yearly    # 0 0 1 1 *
@annually  # Same as @yearly
@monthly   # 0 0 1 * *
@weekly    # 0 0 * * 0
@daily     # 0 0 * * *
@midnight  # Same as @daily
@hourly    # 0 * * * *
```

Editing Crontabs

```
# Edit current user's crontab
crontab -e

# List current user's crontab
crontab -l

# Edit another user's crontab (as root)
crontab -u username -e

# Remove all cron jobs (careful!)
crontab -r
```

System Crontabs

```
# System-wide crontab (includes user field)
/etc/crontab

# Drop-in directory (same format as /etc/crontab, user field included)
/etc/cron.d/

# Periodic directories (scripts run by run-parts)
/etc/cron.hourly/
/etc/cron.daily/
/etc/cron.weekly/
/etc/cron.monthly/
```

/etc/crontab format includes username: ...
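To make the extra user field concrete, here is a sketch of a system-crontab entry; the script path, log file, and drop-in filename are hypothetical:

```
# /etc/cron.d/backup — the sixth field names the user to run as
30 2 * * * root /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1
```

Redirecting stdout and stderr as above is worth the habit: cron otherwise mails or silently discards output, which is one of the gotchas that makes failing jobs invisible.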

February 24, 2026 Β· 7 min Β· 1349 words Β· Rob Washington

SSH Config: Stop Typing Long Commands

If you're still typing ssh -i ~/.ssh/prod-key.pem -p 2222 ubuntu@ec2-54-123-45-67.compute-1.amazonaws.com, you're working too hard. SSH config files let you define aliases with all your connection details.

Basic Config

Create or edit ~/.ssh/config:

```
Host prod
    HostName ec2-54-123-45-67.compute-1.amazonaws.com
    User ubuntu
    Port 2222
    IdentityFile ~/.ssh/prod-key.pem
```

Now connect with just: ...
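Aliases can also share defaults: a Host * block at the end of the file applies to every connection that hasn't set the option already. A minimal sketch (the staging alias and hostname are examples; the keepalive options are standard OpenSSH client settings):

```
Host staging
    HostName staging.example.com
    User deploy

# Defaults for all hosts — keep idle connections alive
Host *
    ServerAliveInterval 60
    ServerAliveCountMax 3
```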

February 24, 2026 Β· 16 min Β· 3205 words Β· Rob Washington

grep and find: Search Patterns for the Command Line

Two commands solve 90% of search problems on Unix systems: grep for text patterns and find for file locations. Master these and you'll navigate any codebase.

grep Basics

```
# Search for pattern in file
grep "error" logfile.txt

# Case insensitive
grep -i "error" logfile.txt

# Show line numbers
grep -n "error" logfile.txt

# Count matches
grep -c "error" logfile.txt

# Invert match (lines NOT matching)
grep -v "debug" logfile.txt
```

grep in Multiple Files

```
# Search all files in directory
grep "TODO" *.py

# Recursive search
grep -r "TODO" src/

# Show only filenames
grep -l "TODO" *.py

# Show filenames with no match
grep -L "TODO" *.py
```

grep with Context

```
# 3 lines before match
grep -B 3 "error" logfile.txt

# 3 lines after match
grep -A 3 "error" logfile.txt

# 3 lines before and after
grep -C 3 "error" logfile.txt
```

grep Regular Expressions

```
# Basic regex (default)
grep "error.*failed" logfile.txt

# Extended regex
grep -E "error|warning|critical" logfile.txt

# Or use egrep
egrep "error|warning" logfile.txt

# Perl regex (most powerful)
grep -P "\d{4}-\d{2}-\d{2}" logfile.txt
```

Common Patterns

```
# IP addresses
grep -E "\b[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\b" access.log

# Email addresses
grep -E "[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}" users.txt

# URLs
grep -E "https?://[^\s]+" document.txt

# Timestamps
grep -P "\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}" app.log

# Word boundaries
grep -w "error" logfile.txt   # Won't match "errors" or "error_code"
```

grep with Exclusions

```
# Exclude directories
grep -r "TODO" --exclude-dir=node_modules --exclude-dir=.git .

# Exclude file patterns
grep -r "TODO" --exclude="*.min.js" --exclude="*.map" .

# Include only certain files
grep -r "TODO" --include="*.py" --include="*.js" .
```

find Basics

```
# Find by name
find . -name "*.py"

# Case insensitive name
find . -iname "readme*"

# Find directories
find . -type d -name "test*"

# Find files
find . -type f -name "*.log"

# Find links
find . -type l
```

find by Time

```
# Modified in last 7 days
find . -mtime -7

# Modified more than 30 days ago
find . -mtime +30

# Modified in last 60 minutes
find . -mmin -60

# Accessed in last day
find . -atime -1

# Changed (metadata) in last day
find . -ctime -1
```

find by Size

```
# Files larger than 100MB
find . -size +100M

# Files smaller than 1KB
find . -size -1k

# Files exactly 0 bytes (empty)
find . -size 0

# Size units: c (bytes), k (KB), M (MB), G (GB)
find . -size +1G
```

find by Permissions

```
# Executable files
find . -perm /u+x -type f

# World-writable files
find . -perm -002

# SUID files
find . -perm -4000

# Files owned by user
find . -user root

# Files owned by group
find . -group www-data
```

find with Actions

```
# Print (default)
find . -name "*.log" -print

# Delete (careful!)
find . -name "*.tmp" -delete

# Execute command for each file
find . -name "*.py" -exec wc -l {} \;

# Execute command with all files at once
find . -name "*.py" -exec wc -l {} +

# Interactive delete
find . -name "*.bak" -ok rm {} \;
```

find Logical Operators

```
# AND (implicit)
find . -name "*.py" -size +100k

# OR
find . -name "*.py" -o -name "*.js"

# NOT
find . ! -name "*.pyc"

# Grouping
find . \( -name "*.py" -o -name "*.js" \) -size +10k
```

Combining grep and find

```
# Find files and grep in them
find . -name "*.py" -exec grep -l "import os" {} \;

# More efficient with xargs
find . -name "*.py" | xargs grep -l "import os"

# Handle spaces in filenames
find . -name "*.py" -print0 | xargs -0 grep -l "import os"

# Find recently modified files with pattern
find . -name "*.log" -mtime -1 -exec grep "ERROR" {} +
```

Practical Examples

Find Large Files

```
# Top 10 largest files
find . -type f -exec du -h {} + | sort -rh | head -10

# Files over 100MB, sorted
find . -type f -size +100M -exec ls -lh {} \; | sort -k5 -h
```

Find and Clean

```
# Remove old log files
find /var/log -name "*.log" -mtime +30 -delete

# Remove empty directories
find . -type d -empty -delete

# Remove Python cache
find . -type d -name "__pycache__" -exec rm -rf {} +
find . -name "*.pyc" -delete
```

Search Code

```
# Find function definitions
grep -rn "def " --include="*.py" src/

# Find TODO comments
grep -rn "TODO\|FIXME\|XXX" --include="*.py" .

# Find imports
grep -r "^import\|^from" --include="*.py" src/ | sort -u
```

Search Logs

```
# Errors in last hour
find /var/log -name "*.log" -mmin -60 -exec grep -l "ERROR" {} \;

# Count errors per file
find /var/log -name "*.log" -exec sh -c 'echo "$1: $(grep -c ERROR "$1")"' _ {} \;

# Unique IPs from access log
grep -oE "\b[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+\b" access.log | sort -u
```

Find Duplicates

```
# Find files with same name
find . -type f -name "*.py" | xargs -I{} basename {} | sort | uniq -d

# Find by content hash (requires md5sum)
find . -type f -exec md5sum {} \; | sort | uniq -w32 -d
```

Performance Tips

```
# Stop at first match (faster)
grep -m 1 "pattern" largefile.txt

# Use fixed strings when possible (faster than regex)
grep -F "exact string" file.txt

# Limit find depth
find . -maxdepth 2 -name "*.py"

# Prune directories
find . -path ./node_modules -prune -o -name "*.js" -print
```

ripgrep (Modern Alternative)

If available, rg is faster than grep: ...
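The -print0/-0 pairing above deserves a self-contained sketch, since it is the one pattern that survives spaces in filenames (the /tmp paths here are throwaway examples):

```shell
# Create a file whose name contains a space
rm -rf /tmp/grepdemo && mkdir -p /tmp/grepdemo
printf 'hello world\n' > "/tmp/grepdemo/a file.txt"

# NUL-delimited pipeline: find emits \0-terminated names, xargs -0 splits on \0,
# so the space in "a file.txt" never splits the argument in two
find /tmp/grepdemo -name "*.txt" -print0 | xargs -0 grep -l "hello"
# Output: /tmp/grepdemo/a file.txt
```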

February 24, 2026 Β· 6 min Β· 1235 words Β· Rob Washington

Systemd Service Files: Running Apps Reliably

Systemd replaced init scripts on most Linux distributions. Instead of shell scripts with start/stop logic, you write declarative unit files that tell systemd what to run and how.

Basic Service File

```
# /etc/systemd/system/myapp.service
[Unit]
Description=My Application
After=network.target

[Service]
Type=simple
User=myapp
WorkingDirectory=/opt/myapp
ExecStart=/opt/myapp/bin/server
Restart=always

[Install]
WantedBy=multi-user.target
```

Enable and start: ...
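Once the basic unit works, the [Service] section can also sandbox the process. A hedged sketch of common additions — the directives are standard systemd options, but the ReadWritePaths value is a placeholder you would adjust for your app:

```
[Service]
# Deny privilege escalation and give the service a private /tmp
NoNewPrivileges=true
PrivateTmp=true
# Mount most of the filesystem read-only for this service
ProtectSystem=strict
ReadWritePaths=/opt/myapp/data
```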

February 24, 2026 Β· 5 min Β· 1020 words Β· Rob Washington

YAML Gotchas: The Traps Everyone Falls Into

YAML is everywhere — Kubernetes, Docker Compose, Ansible, GitHub Actions, CI/CD pipelines. It looks friendly until you spend an hour debugging why on became true or your port number turned into octal. Here are the traps and how to avoid them.

The Norway Problem

```
# What you wrote
country: NO

# What YAML parsed
country: false
```

YAML interprets NO, no, No, OFF, off, Off as boolean false. Same with YES, yes, Yes, ON, on, On as true. ...
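The reliable fix is to quote any scalar a YAML 1.1 parser might reinterpret; a short sketch (the keys are illustrative):

```yaml
# Quoted scalars always stay strings
country: "NO"      # not the boolean false
debug: "on"        # not the boolean true
version: "1.20"    # not the float 1.2
```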

February 24, 2026 Β· 5 min Β· 965 words Β· Rob Washington

Git Workflow Patterns: From Solo to Team

Git is flexible enough to support almost any workflow. That flexibility is both a blessing and a curse — without clear patterns, teams end up with merge conflicts, lost work, and frustration. Here are workflows that work.

Solo Developer: Simple Trunk

When you're working alone, keep it simple:

```
# Work directly on main
git checkout main
git pull

# Make changes
git add .
git commit -m "Add feature X"
git push
```

That's it. No branches, no PRs, no ceremony. Save complexity for when you need it. ...
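When a change is too big to land directly on main, the next step up from plain trunk is a short-lived feature branch merged back fast-forward; a sketch (the branch name is illustrative):

```shell
# Branch off main, commit there, merge back without a merge commit
git checkout -b feature-x
git add -A && git commit -m "Implement feature X"
git checkout main
git merge --ff-only feature-x   # fails loudly if main has moved underneath you
git branch -d feature-x
```

The --ff-only guard is the point: it keeps history linear and forces you to rebase or reconsider if main advanced while you worked.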

February 24, 2026 Β· 7 min Β· 1451 words Β· Rob Washington