Terraform Basics: Infrastructure as Code

Clicking through cloud consoles doesn’t scale. Terraform lets you define infrastructure in code, track changes in git, and deploy the same environment repeatedly.

Core Concepts

- Provider: Plugin for a platform (AWS, GCP, Azure, etc.)
- Resource: A thing to create (server, database, DNS record)
- State: Terraform’s record of what exists
- Plan: Preview of changes before applying
- Apply: Make the changes happen

Basic Workflow

```bash
terraform init     # Download providers
terraform plan     # Preview changes
terraform apply    # Create/update resources
terraform destroy  # Tear everything down
```

First Configuration

```hcl
# main.tf

# Configure the AWS provider
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

# Create an EC2 instance
resource "aws_instance" "web" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = "t3.micro"

  tags = {
    Name = "WebServer"
  }
}
```

```bash
terraform init   # Downloads AWS provider
terraform plan   # Shows: 1 to add
terraform apply  # Creates the instance
```

Variables

```hcl
# variables.tf
variable "environment" {
  description = "Deployment environment"
  type        = string
  default     = "dev"
}

variable "instance_type" {
  description = "EC2 instance type"
  type        = string
  default     = "t3.micro"
}

variable "allowed_ips" {
  description = "IPs allowed to SSH"
  type        = list(string)
  default     = ["0.0.0.0/0"]
}

# main.tf - use variables
resource "aws_instance" "web" {
  ami           = "ami-0c55b159cbfafe1f0"
  instance_type = var.instance_type

  tags = {
    Name        = "web-${var.environment}"
    Environment = var.environment
  }
}
```

Setting Variables

```bash
# Command line
terraform apply -var="environment=prod"

# File (terraform.tfvars)
environment   = "prod"
instance_type = "t3.small"

# Environment variables
export TF_VAR_environment="prod"
```

Outputs

```hcl
# outputs.tf
output "instance_ip" {
  description = "Public IP of the instance"
  value       = aws_instance.web.public_ip
}

output "instance_id" {
  description = "Instance ID"
  value       = aws_instance.web.id
}
```

After apply: ...
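Outputs that carry credentials can also be marked `sensitive` so they are redacted in plan and apply logs (still retrievable on demand with `terraform output`). A minimal sketch — `var.db_password` and the connection string are hypothetical, not from the example above:

```hcl
# outputs.tf - redact secret-bearing outputs from terminal logs
output "db_connection_string" {
  description = "Connection string (contains credentials)"
  value       = "postgres://admin:${var.db_password}@db.internal:5432/app" # hypothetical
  sensitive   = true
}
```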

March 5, 2026 · 7 min · 1424 words · Rob Washington

Bash Scripting Essentials: From One-Liners to Real Scripts

Bash scripts automate everything from deployments to backups. Here’s how to write them properly.

Script Structure

```bash
#!/bin/bash
set -euo pipefail

# Your code here
```

The shebang (#!/bin/bash) tells the system which interpreter to use. The set options make scripts safer:

- -e: Exit on error
- -u: Error on undefined variables
- -o pipefail: Catch errors in pipelines

Variables

```bash
# Assignment (no spaces!)
NAME="Alice"
COUNT=42

# Usage
echo "Hello, $NAME"
echo "Count is ${COUNT}"

# Command substitution
DATE=$(date +%Y-%m-%d)
FILES=$(ls -1 | wc -l)

# Default values
ENVIRONMENT=${1:-production}   # First arg, default "production"
LOG_LEVEL=${LOG_LEVEL:-info}   # Env var with default
```

Variable Scope

```bash
# Global by default
GLOBAL="I'm everywhere"

# Local to function
my_function() {
    local LOCAL="I'm only here"
    echo "$LOCAL"
}
```

Conditionals

```bash
# String comparison
if [ "$NAME" = "Alice" ]; then
    echo "Hi Alice"
elif [ "$NAME" = "Bob" ]; then
    echo "Hi Bob"
else
    echo "Who are you?"
fi

# Numeric comparison
if [ "$COUNT" -gt 10 ]; then
    echo "More than 10"
fi

# File tests
if [ -f "$FILE" ]; then echo "File exists"; fi
if [ -d "$DIR" ]; then echo "Directory exists"; fi
if [ -z "$VAR" ]; then echo "Variable is empty"; fi
if [ -n "$VAR" ]; then echo "Variable is not empty"; fi
```

Comparison Operators

```
String   Numeric   Meaning
=        -eq       Equal
!=       -ne       Not equal
         -lt       Less than
         -le       Less or equal
         -gt       Greater than
         -ge       Greater or equal
```

Modern Syntax

```bash
# Double brackets (safer, more features)
if [[ "$NAME" == "Alice" ]]; then
    echo "Hi"
fi

# Pattern matching
if [[ "$FILE" == *.txt ]]; then
    echo "Text file"
fi

# Regex
if [[ "$EMAIL" =~ ^[a-z]+@[a-z]+\.[a-z]+$ ]]; then
    echo "Valid email"
fi

# Logical operators
if [[ "$A" == "yes" && "$B" == "yes" ]]; then
    echo "Both yes"
fi
if [[ "$A" == "yes" || "$B" == "yes" ]]; then
    echo "At least one yes"
fi
```

Loops

```bash
# For loop over list
for NAME in Alice Bob Charlie; do
    echo "Hello, $NAME"
done

# For loop over files
for FILE in *.txt; do
    echo "Processing $FILE"
done

# For loop with range
for i in {1..10}; do
    echo "Number $i"
done

# C-style for loop
for ((i=0; i<10; i++)); do
    echo "Index $i"
done

# While loop
while [ "$COUNT" -gt 0 ]; do
    echo "$COUNT"
    COUNT=$((COUNT - 1))
done

# Read lines from file
while IFS= read -r line; do
    echo "Line: $line"
done < file.txt

# Read from command
while read -r file; do
    echo "Found: $file"
done < <(find . -name "*.log")
```

Functions

```bash
# Basic function
greet() {
    echo "Hello, $1"
}
greet "World"

# With local variables
calculate() {
    local a=$1
    local b=$2
    echo $((a + b))
}
result=$(calculate 5 3)
echo "Result: $result"

# Return values (0 = success, non-zero = failure)
is_valid() {
    if [[ "$1" =~ ^[0-9]+$ ]]; then
        return 0
    else
        return 1
    fi
}
if is_valid "42"; then
    echo "Valid number"
fi
```

Arguments

```bash
#!/bin/bash

# Positional arguments
echo "Script: $0"
echo "First arg: $1"
echo "Second arg: $2"
echo "All args: $@"
echo "Arg count: $#"

# Shift through args
while [ $# -gt 0 ]; do
    echo "Arg: $1"
    shift
done

# Getopts for flags
while getopts "vf:o:" opt; do
    case $opt in
        v) VERBOSE=true ;;
        f) FILE="$OPTARG" ;;
        o) OUTPUT="$OPTARG" ;;
        ?) echo "Usage: $0 [-v] [-f file] [-o output]"; exit 1 ;;
    esac
done
```

Error Handling

```bash
#!/bin/bash
set -euo pipefail

# Trap errors
trap 'echo "Error on line $LINENO"; exit 1' ERR

# Trap cleanup on exit
cleanup() {
    echo "Cleaning up..."
    rm -f "$TEMP_FILE"
}
trap cleanup EXIT

# Create temp file
TEMP_FILE=$(mktemp)

# Check command success
if ! command -v docker &> /dev/null; then
    echo "Docker not installed"
    exit 1
fi

# Or with ||
docker ps || { echo "Docker not running"; exit 1; }
```

Useful Patterns

Check if root

```bash
if [ "$EUID" -ne 0 ]; then
    echo "Please run as root"
    exit 1
fi
```

Confirm before proceeding

```bash
read -p "Are you sure? (y/n) " -n 1 -r
echo
if [[ ! $REPLY =~ ^[Yy]$ ]]; then
    exit 1
fi
```

Logging

```bash
LOG_FILE="/var/log/myscript.log"

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $1" | tee -a "$LOG_FILE"
}

log "Starting script"
log "Task completed"
```

Config file

```bash
# config.sh
DB_HOST="localhost"
DB_PORT=5432
DB_NAME="myapp"

# main.sh
source ./config.sh
echo "Connecting to $DB_HOST:$DB_PORT"
```

Lock file (prevent concurrent runs)

```bash
LOCKFILE="/tmp/myscript.lock"

if [ -f "$LOCKFILE" ]; then
    echo "Script already running"
    exit 1
fi

trap "rm -f $LOCKFILE" EXIT
touch "$LOCKFILE"
# Note: the check-then-touch above has a small race window;
# mkdir or flock gives an atomic alternative.

# Your script here
```

Arrays

```bash
# Define array
SERVERS=("web1" "web2" "web3")

# Access elements
echo "${SERVERS[0]}"    # First element
echo "${SERVERS[@]}"    # All elements
echo "${#SERVERS[@]}"   # Length

# Loop over array
for server in "${SERVERS[@]}"; do
    echo "Deploying to $server"
done

# Add element
SERVERS+=("web4")

# Associative array (bash 4+)
declare -A PORTS
PORTS[web]=80
PORTS[api]=8080
echo "${PORTS[web]}"
```

String Operations

```bash
STR="Hello, World!"

# Length
echo "${#STR}"              # 13

# Substring
echo "${STR:0:5}"           # Hello

# Replace
echo "${STR/World/Bash}"    # Hello, Bash!

# Replace all
echo "${STR//o/0}"          # Hell0, W0rld!

# Remove suffix
FILE="document.txt"
echo "${FILE%.txt}"         # document

# Split a path (don't reassign PATH itself -- that breaks command lookup)
FULLPATH="/home/user/file.txt"
echo "${FULLPATH##*/}"      # file.txt   (strip longest prefix up to /)
echo "${FULLPATH%/*}"       # /home/user (strip shortest suffix from /)
```

Real Script Example

```bash
#!/bin/bash
set -euo pipefail

# Deploy script with logging and error handling

SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
LOG_FILE="$SCRIPT_DIR/deploy.log"
ENVIRONMENT="${1:-staging}"

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] [$ENVIRONMENT] $1" | tee -a "$LOG_FILE"
}

die() {
    log "ERROR: $1"
    exit 1
}

cleanup() {
    log "Cleanup complete"
}
trap cleanup EXIT

# Validate environment
if [[ ! "$ENVIRONMENT" =~ ^(staging|production)$ ]]; then
    die "Invalid environment: $ENVIRONMENT"
fi

# Production confirmation
if [ "$ENVIRONMENT" = "production" ]; then
    read -p "Deploy to PRODUCTION? (yes/no) " confirm
    [ "$confirm" = "yes" ] || die "Aborted"
fi

log "Starting deployment to $ENVIRONMENT"

# Pull latest code
log "Pulling latest code..."
git pull origin main || die "Git pull failed"

# Build
log "Building..."
npm run build || die "Build failed"

# Deploy
log "Deploying..."
rsync -avz ./dist/ "deploy@$ENVIRONMENT.example.com:/var/www/" || die "Deploy failed"

log "Deployment complete!"
```

Quick Reference

```bash
# Variables
VAR="value"
VAR=${VAR:-default}
VAR=$(command)

# Conditionals
[[ -f file ]]        # File exists
[[ -d dir ]]         # Dir exists
[[ -z "$var" ]]      # Empty
[[ "$a" == "$b" ]]   # String equal
[[ $n -eq 5 ]]       # Numeric equal

# Loops
for i in list; do ...; done
while cond; do ...; done

# Functions
func() { local x=$1; echo $x; }

# Error handling
set -euo pipefail
trap 'cleanup' EXIT
command || { echo "failed"; exit 1; }
```

Bash isn’t pretty, but it’s everywhere. Master these patterns and you can automate anything. ...
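The parameter-expansion operators compose nicely without spawning external processes. A small sketch — the function name and paths are illustrative, not from the post:

```bash
# backup_name: derive "name-backup.ext" from a full path using only
# parameter expansion (no basename/sed needed).
backup_name() {
    local file=${1##*/}     # strip directories -> notes.txt
    local base=${file%.*}   # strip extension   -> notes
    local ext=${file##*.}   # keep extension    -> txt  (assumes the name has one)
    echo "${base}-backup.${ext}"
}

backup_name "/home/user/notes.txt"   # notes-backup.txt
```

Note the caveat in the comment: a filename with no dot would make `${file##*.}` return the whole name, so real scripts should validate input first.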

March 5, 2026 · 7 min · 1438 words · Rob Washington

grep, awk, and sed: Text Processing Power Tools

grep, awk, and sed are the foundational text processing tools in Unix. They’re old, they’re cryptic, and they’re incredibly powerful once you learn them.

grep: Search and Filter

grep searches for patterns in text.

```bash
# Basic search
grep "error" logfile.txt

# Case insensitive
grep -i "error" logfile.txt

# Show line numbers
grep -n "error" logfile.txt

# Count matches
grep -c "error" logfile.txt

# Invert (lines NOT matching)
grep -v "debug" logfile.txt

# Recursive search
grep -r "TODO" ./src/

# Only filenames
grep -l "password" *.conf
```

Regex with grep

```bash
# Extended regex (-E or egrep)
grep -E "error|warning|critical" logfile.txt

# Word boundary
grep -w "fail" logfile.txt    # Matches "fail" not "failure"

# Line start/end
grep "^Error" logfile.txt     # Lines starting with Error
grep "done$" logfile.txt      # Lines ending with done

# Any character
grep "user.name" logfile.txt  # user1name, user_name, user-name (the dot matches one character)

# Character classes
grep "[0-9]" logfile.txt      # Lines with digits
grep "[A-Za-z]" logfile.txt   # Lines with letters
```

Context

```bash
# Lines before match
grep -B 3 "error" logfile.txt

# Lines after match
grep -A 3 "error" logfile.txt

# Lines before and after
grep -C 3 "error" logfile.txt
```

Real Examples

```bash
# Find IP addresses
grep -E "\b[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\.[0-9]{1,3}\b" access.log

# Find function definitions
grep -n "^def \|^function " *.py *.js

# Exclude directories
grep -r "config" . --exclude-dir={node_modules,.git}

# Find files NOT containing pattern
grep -L "copyright" *.py
```

sed: Stream Editor

sed transforms text line by line. ...
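The counting and inverting flags are easy to sanity-check against a throwaway file. A quick sketch (the log lines and path are made up):

```bash
# Build a tiny sample log, then count and filter it
LOG=$(mktemp)
printf 'INFO start\nERROR disk full\nDEBUG tick\nERROR timeout\n' > "$LOG"

grep -c "ERROR" "$LOG"           # 2 -- number of matching lines
grep -v "DEBUG" "$LOG" | wc -l   # 3 -- lines NOT matching DEBUG
grep -n "^ERROR" "$LOG"          # matches anchored to line start, with line numbers
```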

March 5, 2026 · 6 min · 1216 words · Rob Washington

tmux: Terminal Multiplexing for Productivity

SSH into a server, start a long-running process, lose connection, lose everything. tmux solves this by keeping sessions alive independently of your terminal.

Why tmux?

- Persistence: Sessions survive disconnections
- Multiplexing: Multiple windows and panes in one terminal
- Remote pairing: Share sessions with teammates
- Scriptable: Automate complex layouts

Basic Concepts

```
tmux
└── Session (named collection of windows)
    ├── Window 1
    │   ├── Pane 1
    │   └── Pane 2
    └── Window 2
```

Getting Started

```bash
# Start new session
tmux

# Start named session
tmux new -s myproject

# List sessions
tmux ls

# Attach to session
tmux attach -t myproject

# Attach to last session
tmux attach
```

The Prefix Key

All tmux commands start with a prefix (default: Ctrl+b), then a key. ...
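Most people pair these basics with a small ~/.tmux.conf. A starting sketch — every setting here is a common preference (all real tmux options), not something the post prescribes:

```
# ~/.tmux.conf
set -g prefix C-a          # remap prefix from Ctrl+b to Ctrl+a
unbind C-b
bind C-a send-prefix

set -g mouse on            # click to select panes, scroll through history
set -g history-limit 10000 # keep more scrollback
set -g base-index 1        # number windows from 1 instead of 0
```

Reload it into a running server with `tmux source-file ~/.tmux.conf`.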

March 5, 2026 · 6 min · 1131 words · Rob Washington

curl Tips and Tricks for API Work

Every developer ends up using curl. It’s installed everywhere, works with any API, and once you learn the patterns, it’s faster than any GUI tool.

Basic Requests

```bash
# GET (default)
curl https://api.example.com/users

# POST with data
curl -X POST https://api.example.com/users \
  -d '{"name":"Alice"}'

# Other methods
curl -X PUT ...
curl -X DELETE ...
curl -X PATCH ...
```

Headers

```bash
# Set content type
curl -H "Content-Type: application/json" ...

# Auth header
curl -H "Authorization: Bearer TOKEN" ...

# Multiple headers
curl -H "Content-Type: application/json" \
  -H "Authorization: Bearer TOKEN" \
  -H "X-Custom-Header: value" ...
```

JSON Data

```bash
# Inline JSON
curl -X POST https://api.example.com/users \
  -H "Content-Type: application/json" \
  -d '{"name":"Alice","email":"alice@example.com"}'

# From file
curl -X POST https://api.example.com/users \
  -H "Content-Type: application/json" \
  -d @data.json

# Pretty output (pipe to jq)
curl -s https://api.example.com/users | jq
```

Response Info

```bash
# Show response headers
curl -i https://api.example.com/users

# Only headers (no body)
curl -I https://api.example.com/users

# HTTP status code only
curl -s -o /dev/null -w "%{http_code}" https://api.example.com/users

# Timing info
curl -w "Time: %{time_total}s\n" -o /dev/null -s https://api.example.com
```

Authentication

```bash
# Basic auth
curl -u username:password https://api.example.com

# Bearer token
curl -H "Authorization: Bearer eyJ..." https://api.example.com

# API key in header
curl -H "X-API-Key: abc123" https://api.example.com

# API key in query
curl "https://api.example.com?api_key=abc123"
```

Form Data

```bash
# URL-encoded form
curl -X POST https://example.com/login \
  -d "username=alice&password=secret"

# Multipart form (file upload)
curl -X POST https://example.com/upload \
  -F "file=@photo.jpg" \
  -F "description=My photo"
```

Following Redirects

```bash
# Follow redirects (3xx)
curl -L https://example.com/redirect

# Show redirect chain
curl -L -v https://example.com/redirect 2>&1 | grep "< HTTP\|< Location"
```

SSL/TLS

```bash
# Ignore SSL errors (dev only!)
curl -k https://self-signed.example.com

# Use specific cert
curl --cacert /path/to/ca.crt https://api.example.com

# Client cert authentication
curl --cert client.crt --key client.key https://api.example.com
```

Timeouts

```bash
# Connection timeout (seconds)
curl --connect-timeout 5 https://api.example.com

# Max time for entire operation
curl --max-time 30 https://api.example.com

# Both
curl --connect-timeout 5 --max-time 30 https://api.example.com
```

Debugging

```bash
# Verbose output (request/response headers, TLS handshake)
curl -v https://api.example.com

# Trace everything to file (full wire-level detail)
curl --trace trace.log https://api.example.com
```

Saving Output

```bash
# Save to file
curl -o response.json https://api.example.com/data

# Save with remote filename
curl -O https://example.com/file.zip

# Silent mode (no progress)
curl -s https://api.example.com
```

Cookies

```bash
# Send cookie
curl -b "session=abc123" https://example.com

# Save cookies to file
curl -c cookies.txt https://example.com/login -d "user=alice"

# Send cookies from file
curl -b cookies.txt https://example.com/dashboard
```

Retry Logic

```bash
# Retry on failure
curl --retry 3 https://api.example.com

# Retry with delay
curl --retry 3 --retry-delay 2 https://api.example.com

# Retry on specific errors
curl --retry 3 --retry-connrefused https://api.example.com
```

Useful Aliases

Add to your .bashrc: ...
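One way to keep the flag combinations above at your fingertips is to wrap them in aliases. A possible set (the names are arbitrary suggestions, not from the post):

```bash
# ~/.bashrc -- curl shortcuts built from the flags covered above
alias curljson='curl -sS -H "Content-Type: application/json"'      # silent, but still show errors
alias curlcode='curl -s -o /dev/null -w "%{http_code}\n"'          # status code only
alias curltime='curl -s -o /dev/null -w "total: %{time_total}s\n"' # timing only
```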

March 5, 2026 · 4 min · 810 words · Rob Washington

jq: Command-Line JSON Processing

APIs return JSON. Config files use JSON. Logs are often JSON. The jq tool lets you slice, filter, and transform JSON from the command line—no scripting required.

Basic Syntax

```bash
jq 'filter' input.json
# or
cat input.json | jq 'filter'
# or
curl -s api.example.com | jq 'filter'
```

Pretty Print

```bash
# Ugly JSON
echo '{"name":"Alice","age":30}' | jq
# Output:
# {
#   "name": "Alice",
#   "age": 30
# }

# Compact (remove whitespace)
echo '{"name": "Alice"}' | jq -c
# {"name":"Alice"}
```

Extract Fields

```bash
echo '{"name":"Alice","age":30}' | jq '.name'
# "Alice"

# Without quotes
echo '{"name":"Alice","age":30}' | jq -r '.name'
# Alice
```

Nested Objects

```bash
echo '{"user":{"name":"Alice","email":"alice@example.com"}}' | jq '.user.name'
# "Alice"
```

Arrays

```bash
# Get element by index
echo '[1,2,3,4,5]' | jq '.[0]'
# 1

# Get slice
echo '[1,2,3,4,5]' | jq '.[1:3]'
# [2, 3]

# Get all elements
echo '[{"name":"Alice"},{"name":"Bob"}]' | jq '.[].name'
# "Alice"
# "Bob"

# Wrap in array
echo '[{"name":"Alice"},{"name":"Bob"}]' | jq '[.[].name]'
# ["Alice", "Bob"]
```

Filtering

```bash
# Select objects matching condition
echo '[{"name":"Alice","age":30},{"name":"Bob","age":25}]' | jq '.[] | select(.age > 26)'
# {"name": "Alice", "age": 30}

# Multiple conditions
jq '.[] | select(.status == "active" and .role == "admin")'
```

Transform Data

```bash
# Create new object
echo '{"first":"Alice","last":"Smith"}' | jq '{fullName: "\(.first) \(.last)"}'
# {"fullName": "Alice Smith"}

# Map over array
echo '[1,2,3]' | jq 'map(. * 2)'
# [2, 4, 6]

# Add field
echo '{"name":"Alice"}' | jq '. + {role: "admin"}'
# {"name": "Alice", "role": "admin"}
```

Real-World Examples

Parse API response: ...
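These pieces compose with pipes inside a single filter. A sketch over a made-up payload (assumes jq is installed):

```bash
USERS='[{"name":"Alice","age":30,"active":true},
        {"name":"Bob","age":25,"active":false}]'

# Names of active users, joined into one string
echo "$USERS" | jq -r '[.[] | select(.active) | .name] | join(",")'   # Alice

# Average age across all users
echo "$USERS" | jq '[.[].age] | add / length'   # 27.5
```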

March 5, 2026 · 5 min · 955 words · Rob Washington

Git Hooks: Automate Your Workflow at the Source

Git hooks are scripts that run automatically at specific points in the Git workflow. They’re perfect for enforcing standards, running tests, and automating tedious tasks—all before code leaves your machine.

Hook Locations

Hooks live in .git/hooks/. Git creates sample files on git init:

```bash
ls .git/hooks/
# applypatch-msg.sample      pre-commit.sample
# commit-msg.sample          pre-push.sample
# post-update.sample         pre-rebase.sample
# pre-applypatch.sample      prepare-commit-msg.sample
# pre-merge-commit.sample    update.sample
```

Remove .sample to activate. Hooks must be executable: ...
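As a taste of what goes in one of these files, here is a minimal pre-commit sketch that blocks staged debugger statements. The pattern list is just an example, and the logic is wrapped in a function so it can be exercised outside a hook:

```bash
# Body of .git/hooks/pre-commit (make the file executable with chmod +x)
check_staged() {
    # Reject the commit if any staged addition contains a debugger statement
    if git diff --cached | grep -qE '^\+.*(debugger;|pdb\.set_trace\(\))'; then
        echo "pre-commit: remove debugger statements before committing" >&2
        return 1
    fi
}
```

In the real hook file, call `check_staged` as the last line so its status becomes the hook’s exit code — a non-zero exit aborts the commit.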

March 5, 2026 · 5 min · 970 words · Rob Washington

Nginx Essentials: From Basic Proxy to Production Config

Nginx powers a significant portion of the internet, yet its configuration syntax trips up even experienced engineers. Here’s a practical guide to the patterns you’ll actually use.

Basic Structure

```nginx
# /etc/nginx/nginx.conf
user nginx;
worker_processes auto;
error_log /var/log/nginx/error.log;

events {
    worker_connections 1024;
}

http {
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    # Include site configs
    include /etc/nginx/conf.d/*.conf;
}
```

Site configs go in /etc/nginx/conf.d/ or /etc/nginx/sites-enabled/. ...
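The single most common site config is a reverse proxy in front of an app server. A sketch using standard directives (the domain and upstream port are placeholders):

```nginx
# /etc/nginx/conf.d/myapp.conf
server {
    listen 80;
    server_name app.example.com;          # placeholder domain

    location / {
        proxy_pass http://127.0.0.1:3000; # app server, placeholder port
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

Check syntax with `nginx -t` before reloading.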

March 5, 2026 · 5 min · 965 words · Rob Washington

Cron Jobs Done Right: Scheduling Without the Pain

Cron has been scheduling Unix tasks since 1975. It’s simple, reliable, and will silently fail in ways that waste hours of debugging. Here’s how to use it properly.

Cron Syntax

```
┌───────────── minute (0-59)
│ ┌───────────── hour (0-23)
│ │ ┌───────────── day of month (1-31)
│ │ │ ┌───────────── month (1-12)
│ │ │ │ ┌───────────── day of week (0-7; 0 and 7 are Sunday)
│ │ │ │ │
* * * * * command
```

Common Schedules

```
# Every minute
* * * * * /script.sh

# Every 5 minutes
*/5 * * * * /script.sh

# Every hour at minute 0
0 * * * * /script.sh

# Daily at 3 AM
0 3 * * * /script.sh

# Weekly on Sunday at midnight
0 0 * * 0 /script.sh

# Monthly on the 1st at 6 AM
0 6 1 * * /script.sh

# Every weekday at 9 AM
0 9 * * 1-5 /script.sh

# Every 15 minutes during business hours
*/15 9-17 * * 1-5 /script.sh
```

The Silent Failure Problem

Cron’s default behavior: run command, discard output, send errors to email (which you probably haven’t configured). ...
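The first defense against silent failures is simply to stop discarding output. A sketch of a crontab entry that keeps a log (the script and log paths are placeholders):

```
# crontab -e
# Nightly backup at 03:00; append stdout AND stderr to a log
# instead of relying on unconfigured local mail
0 3 * * * /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1
```

The `2>&1` matters: without it, errors still vanish into cron’s mail path while normal output goes to the log.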

March 5, 2026 · 5 min · 1035 words · Rob Washington

Environment Variable Management: Patterns That Scale

Environment variables seem trivial—just key-value pairs. Then you have 50 of them across 4 environments with secrets mixed in, and suddenly you’re in configuration hell. Here’s how to stay sane.

The Hierarchy

Configuration should flow from least to most specific:

```
Defaults (code) → Config files → Env vars → Command-line flags
```

Each layer overrides the previous. Environment variables sit near the top—easy to change per environment without touching code. ...
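In a shell script, the "env var overrides default" layer is one line of parameter expansion, and a flag layer can override both. A sketch (variable and flag names are illustrative):

```bash
# Layer 1+2: env var wins over the hard-coded 8080 default
PORT="${PORT:-8080}"

# Layer 3: a command-line flag wins over both
if [ "${1:-}" = "--port" ]; then
    PORT="$2"
fi

echo "listening on $PORT"
```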

March 5, 2026 · 5 min · 975 words · Rob Washington