Bash Scripting Patterns for Reliable Automation

Bash scripts glue systems together. Here’s how to write them without the usual fragility.

## Script Header

Always start with:

```bash
#!/usr/bin/env bash
set -euo pipefail
# -e: Exit on error
# -u: Error on undefined variables
# -o pipefail: Fail if any pipe command fails
```

## Argument Parsing

### Simple Positional

```bash
#!/usr/bin/env bash
set -euo pipefail

if [[ $# -lt 1 ]]; then
  echo "Usage: $0 <filename>" >&2
  exit 1
fi

FILENAME="$1"
```

### With Options (getopts)

```bash
#!/usr/bin/env bash
set -euo pipefail

usage() {
  echo "Usage: $0 [-v] [-o output] [-n count] input"
  echo "  -v         Verbose mode"
  echo "  -o output  Output file"
  echo "  -n count   Number of iterations"
  exit 1
}

VERBOSE=false
OUTPUT=""
COUNT=1

while getopts "vo:n:h" opt; do
  case $opt in
    v) VERBOSE=true ;;
    o) OUTPUT="$OPTARG" ;;
    n) COUNT="$OPTARG" ;;
    h) usage ;;
    *) usage ;;
  esac
done
shift $((OPTIND - 1))

if [[ $# -lt 1 ]]; then
  usage
fi
INPUT="$1"
```

### Long Options

```bash
#!/usr/bin/env bash
set -euo pipefail

VERBOSE=false
OUTPUT=""

while [[ $# -gt 0 ]]; do
  case $1 in
    -v|--verbose)
      VERBOSE=true
      shift
      ;;
    -o|--output)
      OUTPUT="$2"
      shift 2
      ;;
    -h|--help)
      usage
      ;;
    -*)
      echo "Unknown option: $1" >&2
      exit 1
      ;;
    *)
      break
      ;;
  esac
done
```

## Error Handling

### Trap for Cleanup

```bash
#!/usr/bin/env bash
set -euo pipefail

TMPDIR=""
cleanup() {
  if [[ -n "$TMPDIR" && -d "$TMPDIR" ]]; then
    rm -rf "$TMPDIR"
  fi
}
trap cleanup EXIT

TMPDIR=$(mktemp -d)
# Work with $TMPDIR - it's cleaned up on exit, error, or interrupt
```

### Custom Error Handler

```bash
#!/usr/bin/env bash
set -euo pipefail

error() {
  echo "Error: $1" >&2
  exit "${2:-1}"
}

warn() {
  echo "Warning: $1" >&2
}

# Usage
[[ -f "$CONFIG" ]] || error "Config file not found: $CONFIG"
```

### Detailed Error Reporting

```bash
#!/usr/bin/env bash
set -euo pipefail

on_error() {
  echo "Error on line $1" >&2
  exit 1
}
trap 'on_error $LINENO' ERR
```

## Variables and Defaults

```bash
# Default value
NAME="${1:-default}"

# Error if unset (or empty)
NAME="${1:?Error: name required}"

# Default only if unset (not empty)
NAME="${NAME-default}"

# Assign default if unset (or empty)
: "${NAME:=default}"
```

## String Operations

```bash
FILE="/path/to/file.txt"

# Extract parts
echo "${FILE##*/}"       # file.txt (basename)
echo "${FILE%/*}"        # /path/to (dirname)
echo "${FILE%.txt}"      # /path/to/file (remove extension)
echo "${FILE##*.}"       # txt (extension only)

# Replace
echo "${FILE/path/new}"  # /new/to/file.txt
echo "${FILE//t/T}"      # /paTh/To/file.TxT (all occurrences)

# Case conversion
echo "${FILE^^}"         # Uppercase
echo "${FILE,,}"         # Lowercase

# Length
echo "${#FILE}"          # String length
```

## Conditionals

```bash
# File tests
[[ -f "$FILE" ]]     # File exists
[[ -d "$DIR" ]]      # Directory exists
[[ -r "$FILE" ]]     # Readable
[[ -w "$FILE" ]]     # Writable
[[ -x "$FILE" ]]     # Executable
[[ -s "$FILE" ]]     # Non-empty

# String tests
[[ -z "$VAR" ]]      # Empty
[[ -n "$VAR" ]]      # Non-empty
[[ "$A" == "$B" ]]   # Equal
[[ "$A" != "$B" ]]   # Not equal
[[ "$A" =~ regex ]]  # Regex match

# Numeric tests
[[ $A -eq $B ]]      # Equal
[[ $A -ne $B ]]      # Not equal
[[ $A -lt $B ]]      # Less than
[[ $A -le $B ]]      # Less or equal
[[ $A -gt $B ]]      # Greater than
[[ $A -ge $B ]]      # Greater or equal

# Logical
[[ $A && $B ]]       # And
[[ $A || $B ]]       # Or
[[ ! $A ]]           # Not
```

## Loops

```bash
# Over arguments
for arg in "$@"; do
  echo "$arg"
done

# Over array
arr=("one" "two" "three")
for item in "${arr[@]}"; do
  echo "$item"
done

# C-style
for ((i=0; i<10; i++)); do
  echo "$i"
done

# Over files
for file in *.txt; do
  [[ -f "$file" ]] || continue
  echo "$file"
done

# Read lines from file
while IFS= read -r line; do
  echo "$line"
done < "$FILE"

# Read lines from command
while IFS= read -r line; do
  echo "$line"
done < <(some_command)
```

## Functions

```bash
# Basic function
greet() {
  local name="$1"
  echo "Hello, $name"
}

# With return value
is_valid() {
  local input="$1"
  [[ "$input" =~ ^[0-9]+$ ]]
}

if is_valid "$value"; then
  echo "Valid"
fi

# Return data via stdout
get_config() {
  cat /etc/myapp/config
}
CONFIG=$(get_config)

# Local variables
process() {
  local tmp
  tmp=$(mktemp)
  # tmp is local to this function
}
```

## Arrays

```bash
# Create
arr=("one" "two" "three")
arr[3]="four"

# Access
echo "${arr[0]}"      # First element
echo "${arr[@]}"      # All elements
echo "${#arr[@]}"     # Length
echo "${arr[@]:1:2}"  # Slice (start:length)

# Add
arr+=("five")

# Iterate
for item in "${arr[@]}"; do
  echo "$item"
done

# With index
for i in "${!arr[@]}"; do
  echo "$i: ${arr[$i]}"
done
```

## Associative Arrays

```bash
declare -A config
config[host]="localhost"
config[port]="8080"

echo "${config[host]}"
echo "${!config[@]}"  # All keys
echo "${config[@]}"   # All values

for key in "${!config[@]}"; do
  echo "$key: ${config[$key]}"
done
```

## Process Substitution

```bash
# Compare two commands
diff <(sort file1) <(sort file2)

# Feed command output as file
while read -r line; do
  echo "$line"
done < <(curl -s "$URL")
```

## Subshells

```bash
# Run in subshell (changes don't affect parent)
(
  cd /tmp
  rm -f *.tmp
)
# Still in original directory

# Capture output
result=$(command)

# Capture with error
result=$(command 2>&1)
```

## Here Documents

```bash
# Multi-line string
cat << EOF
This is a multi-line string
with $VARIABLE expansion
EOF

# No variable expansion
cat << 'EOF'
This preserves $VARIABLE literally
EOF

# Here string
grep "pattern" <<< "$string"
```

## Practical Patterns

### Check Dependencies

```bash
check_deps() {
  local deps=("curl" "jq" "git")
  for dep in "${deps[@]}"; do
    if ! command -v "$dep" &> /dev/null; then
      echo "Missing dependency: $dep" >&2
      exit 1
    fi
  done
}
```

### Logging

```bash
LOG_FILE="/var/log/myapp.log"

log() {
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*" | tee -a "$LOG_FILE"
}

log "Starting process"
```

### Confirmation Prompt

```bash
confirm() {
  read -rp "$1 [y/N] " response
  [[ "$response" =~ ^[Yy]$ ]]
}

if confirm "Delete all files?"; then
  rm -rf /tmp/data/*
fi
```

### Retry Logic

```bash
retry() {
  local max_attempts="$1"
  local delay="$2"
  shift 2
  local attempt=1
  until "$@"; do
    if ((attempt >= max_attempts)); then
      echo "Failed after $attempt attempts" >&2
      return 1
    fi
    echo "Attempt $attempt failed, retrying in ${delay}s..."
    sleep "$delay"
    ((attempt++))
  done
}

retry 3 5 curl -sf "$URL"
```

### Lock File

```bash
LOCKFILE="/var/lock/myapp.lock"

acquire_lock() {
  exec 200>"$LOCKFILE"
  flock -n 200 || {
    echo "Another instance is running" >&2
    exit 1
  }
}

acquire_lock
# Script continues only if lock acquired
```

## Complete Example

```bash
#!/usr/bin/env bash
set -euo pipefail

readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
readonly SCRIPT_NAME="$(basename "$0")"

usage() {
  cat << EOF
Usage: $SCRIPT_NAME [options] <input>

Options:
  -o, --output FILE  Output file (default: stdout)
  -v, --verbose      Verbose output
  -h, --help         Show this help

Examples:
  $SCRIPT_NAME data.txt
  $SCRIPT_NAME -o result.txt -v input.txt
EOF
  exit "${1:-0}"
}

log() {
  if [[ "$VERBOSE" == true ]]; then
    echo "[$(date '+%H:%M:%S')] $*" >&2
  fi
}

error() {
  echo "Error: $*" >&2
  exit 1
}

# Defaults
VERBOSE=false
OUTPUT="/dev/stdout"

# Parse arguments
while [[ $# -gt 0 ]]; do
  case $1 in
    -o|--output) OUTPUT="$2"; shift 2 ;;
    -v|--verbose) VERBOSE=true; shift ;;
    -h|--help) usage ;;
    -*) error "Unknown option: $1" ;;
    *) break ;;
  esac
done

[[ $# -ge 1 ]] || usage 1
INPUT="$1"

# Validate
[[ -f "$INPUT" ]] || error "File not found: $INPUT"

# Main logic
log "Processing $INPUT"
process_data < "$INPUT" > "$OUTPUT"
log "Done"
```

Bash scripts don’t have to be fragile. Apply these patterns and they’ll work reliably for years. ...
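The retry helper above is easy to exercise without a network. This sketch reuses the same function against a command that fails twice before succeeding; the `flaky_twice` command and its counter file are invented for the demo:

```bash
#!/usr/bin/env bash
set -euo pipefail

# retry helper from the article: run "$@" until it succeeds,
# giving up after max_attempts tries with a fixed delay between them.
retry() {
  local max_attempts="$1"
  local delay="$2"
  shift 2
  local attempt=1
  until "$@"; do
    if ((attempt >= max_attempts)); then
      echo "Failed after $attempt attempts" >&2
      return 1
    fi
    echo "Attempt $attempt failed, retrying in ${delay}s..." >&2
    sleep "$delay"
    ((attempt++))
  done
}

# Hypothetical flaky command: fails on its first two invocations,
# succeeds from the third on. State lives in a temp file.
COUNTER_FILE=$(mktemp)
echo 0 > "$COUNTER_FILE"

flaky_twice() {
  local n
  n=$(<"$COUNTER_FILE")
  echo $((n + 1)) > "$COUNTER_FILE"
  ((n >= 2))  # non-zero exit (failure) until the third call
}

retry 5 0 flaky_twice
ATTEMPTS=$(<"$COUNTER_FILE")
rm -f "$COUNTER_FILE"
echo "succeeded on attempt $ATTEMPTS"
```

Because `flaky_twice` runs in the `until` condition, its failures don't trip `set -e`; only exhausting all attempts makes `retry` itself return non-zero.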

February 28, 2026 · 8 min · 1589 words · Rob Washington

Python Patterns for Command-Line Scripts

Python is the go-to language for automation scripts. Here’s how to write CLI tools that are reliable and user-friendly.

## Basic Script Structure

```python
#!/usr/bin/env python3
"""One-line description of what this script does."""

import argparse
import sys


def main():
    parser = argparse.ArgumentParser(description=__doc__)
    parser.add_argument('input', help='Input file path')
    parser.add_argument('-o', '--output', help='Output file path')
    parser.add_argument('-v', '--verbose', action='store_true')
    args = parser.parse_args()

    # Your logic here
    process(args.input, args.output, args.verbose)


if __name__ == '__main__':
    main()
```

## Argument Parsing with argparse

### Positional Arguments

```python
parser.add_argument('filename')           # Required
parser.add_argument('files', nargs='+')   # One or more
parser.add_argument('files', nargs='*')   # Zero or more
parser.add_argument('config', nargs='?')  # Optional positional
```

### Optional Arguments

```python
parser.add_argument('-v', '--verbose', action='store_true')
parser.add_argument('-q', '--quiet', action='store_false', dest='verbose')
parser.add_argument('-n', '--count', type=int, default=10)
parser.add_argument('-f', '--format', choices=['json', 'csv', 'table'])
parser.add_argument('--config', type=argparse.FileType('r'))
```

### Subcommands

```python
parser = argparse.ArgumentParser()
subparsers = parser.add_subparsers(dest='command', required=True)

# 'init' command
init_parser = subparsers.add_parser('init', help='Initialize project')
init_parser.add_argument('--force', action='store_true')

# 'run' command
run_parser = subparsers.add_parser('run', help='Run the application')
run_parser.add_argument('--port', type=int, default=8080)

args = parser.parse_args()
if args.command == 'init':
    do_init(args.force)
elif args.command == 'run':
    do_run(args.port)
```

## Error Handling

```python
import sys

def main():
    try:
        result = process()
        return 0
    except FileNotFoundError as e:
        print(f"Error: File not found: {e.filename}", file=sys.stderr)
        return 1
    except PermissionError:
        print("Error: Permission denied", file=sys.stderr)
        return 1
    except KeyboardInterrupt:
        print("\nInterrupted", file=sys.stderr)
        return 130
    except Exception as e:
        print(f"Error: {e}", file=sys.stderr)
        return 1

if __name__ == '__main__':
    sys.exit(main())
```

## Logging

```python
import logging

def setup_logging(verbose=False):
    level = logging.DEBUG if verbose else logging.INFO
    logging.basicConfig(
        level=level,
        format='%(asctime)s - %(levelname)s - %(message)s',
        datefmt='%Y-%m-%d %H:%M:%S'
    )

def main():
    args = parse_args()
    setup_logging(args.verbose)
    logging.info("Starting process")
    logging.debug("Detailed info here")
    logging.warning("Something might be wrong")
    logging.error("Something went wrong")
```

### Log to File and Console

```python
def setup_logging(verbose=False, log_file=None):
    handlers = [logging.StreamHandler()]
    if log_file:
        handlers.append(logging.FileHandler(log_file))
    logging.basicConfig(
        level=logging.DEBUG if verbose else logging.INFO,
        format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
        handlers=handlers
    )
```

## Progress Indicators

### Simple Progress

```python
import sys

def process_items(items):
    total = len(items)
    for i, item in enumerate(items, 1):
        process(item)
        print(f"\rProcessing: {i}/{total}", end='', flush=True)
    print()  # Newline at end
```

### With tqdm

```python
from tqdm import tqdm

for item in tqdm(items, desc="Processing"):
    process(item)

# Or wrap any iterable
with tqdm(total=100) as pbar:
    for i in range(100):
        do_work()
        pbar.update(1)
```

## Reading Input

### From File or Stdin

```python
import sys

def read_input(filepath=None):
    if filepath:
        with open(filepath) as f:
            return f.read()
    elif not sys.stdin.isatty():
        return sys.stdin.read()
    else:
        raise ValueError("No input provided")
```

### Line by Line

```python
import fileinput

# Reads from files in args or stdin
for line in fileinput.input():
    process(line.strip())
```

## Output Formatting

### JSON Output

```python
import json

def output_json(data, pretty=False):
    if pretty:
        print(json.dumps(data, indent=2, default=str))
    else:
        print(json.dumps(data, default=str))
```

### Table Output

```python
def print_table(headers, rows):
    # Calculate column widths
    widths = [len(h) for h in headers]
    for row in rows:
        for i, cell in enumerate(row):
            widths[i] = max(widths[i], len(str(cell)))

    # Print header
    header_line = ' | '.join(h.ljust(widths[i]) for i, h in enumerate(headers))
    print(header_line)
    print('-' * len(header_line))

    # Print rows
    for row in rows:
        print(' | '.join(str(cell).ljust(widths[i]) for i, cell in enumerate(row)))
```

### With tabulate

```python
from tabulate import tabulate

data = [
    ['Alice', 30, 'Engineer'],
    ['Bob', 25, 'Designer'],
]
print(tabulate(data, headers=['Name', 'Age', 'Role'], tablefmt='grid'))
```

## Configuration Files

### YAML Config

```python
import yaml
from pathlib import Path

def load_config(config_path=None):
    paths = [
        config_path,
        Path.home() / '.myapp.yaml',
        Path('/etc/myapp/config.yaml'),
    ]
    for path in paths:
        if path and Path(path).exists():
            with open(path) as f:
                return yaml.safe_load(f)
    return {}  # Defaults
```

### Environment Variables

```python
import os

def get_config():
    return {
        'api_key': os.environ.get('API_KEY'),
        'debug': os.environ.get('DEBUG', '').lower() in ('true', '1', 'yes'),
        'timeout': int(os.environ.get('TIMEOUT', '30')),
    }
```

## Running External Commands

```python
import subprocess

def run_command(cmd, check=True):
    """Run command and return output."""
    result = subprocess.run(
        cmd,
        shell=isinstance(cmd, str),
        capture_output=True,
        text=True,
        check=check
    )
    return result.stdout.strip()

# Usage
output = run_command(['git', 'status', '--short'])
output = run_command('ls -la | head -5')
```

### With Timeout

```python
try:
    result = subprocess.run(
        ['slow-command'],
        timeout=30,
        capture_output=True,
        text=True
    )
except subprocess.TimeoutExpired:
    print("Command timed out")
```

## Temporary Files

```python
import tempfile
from pathlib import Path

# Temporary file
with tempfile.NamedTemporaryFile(mode='w', suffix='.json', delete=False) as f:
    f.write('{"data": "value"}')
    temp_path = f.name

# Temporary directory
with tempfile.TemporaryDirectory() as tmpdir:
    work_file = Path(tmpdir) / 'work.txt'
    work_file.write_text('working...')
# Directory deleted when context exits
```

## Path Handling

```python
from pathlib import Path

def process_files(directory):
    base = Path(directory)

    # Find files
    for path in base.glob('**/*.py'):
        print(f"Processing: {path}")

        # Path operations
        print(f"  Name: {path.name}")
        print(f"  Stem: {path.stem}")
        print(f"  Suffix: {path.suffix}")
        print(f"  Parent: {path.parent}")

        # Read/write
        content = path.read_text()
        path.with_suffix('.bak').write_text(content)
```

## Complete Example

```python
#!/usr/bin/env python3
"""Process log files and output statistics."""

import argparse
import json
import logging
import sys
from collections import Counter
from pathlib import Path


def setup_logging(verbose):
    logging.basicConfig(
        level=logging.DEBUG if verbose else logging.INFO,
        format='%(levelname)s: %(message)s'
    )


def parse_args():
    parser = argparse.ArgumentParser(
        description=__doc__,
        formatter_class=argparse.RawDescriptionHelpFormatter
    )
    parser.add_argument(
        'logfiles', nargs='+', type=Path,
        help='Log files to process'
    )
    parser.add_argument(
        '-o', '--output', type=argparse.FileType('w'),
        default=sys.stdout,
        help='Output file (default: stdout)'
    )
    parser.add_argument(
        '-f', '--format', choices=['json', 'text'],
        default='text',
        help='Output format'
    )
    parser.add_argument(
        '-v', '--verbose', action='store_true',
        help='Enable verbose output'
    )
    return parser.parse_args()


def analyze_logs(logfiles):
    stats = Counter()
    for logfile in logfiles:
        logging.info(f"Processing {logfile}")
        if not logfile.exists():
            logging.warning(f"File not found: {logfile}")
            continue
        for line in logfile.read_text().splitlines():
            if 'ERROR' in line:
                stats['errors'] += 1
            elif 'WARNING' in line:
                stats['warnings'] += 1
            stats['total'] += 1
    return dict(stats)


def output_results(stats, output, fmt):
    if fmt == 'json':
        json.dump(stats, output, indent=2)
        output.write('\n')
    else:
        for key, value in stats.items():
            output.write(f"{key}: {value}\n")


def main():
    args = parse_args()
    setup_logging(args.verbose)
    try:
        stats = analyze_logs(args.logfiles)
        output_results(stats, args.output, args.format)
        return 0
    except Exception as e:
        logging.error(f"Failed: {e}")
        return 1


if __name__ == '__main__':
    sys.exit(main())
```

Usage: ...
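The `print_table` helper above is small enough to sanity-check in isolation. This sketch repeats it verbatim (so the snippet is self-contained) and runs it on a toy dataset:

```python
def print_table(headers, rows):
    # Column width = widest cell in that column, or the header if wider
    widths = [len(h) for h in headers]
    for row in rows:
        for i, cell in enumerate(row):
            widths[i] = max(widths[i], len(str(cell)))

    header_line = ' | '.join(h.ljust(widths[i]) for i, h in enumerate(headers))
    print(header_line)
    print('-' * len(header_line))
    for row in rows:
        print(' | '.join(str(cell).ljust(widths[i]) for i, cell in enumerate(row)))


print_table(['Name', 'Age'], [['Alice', 30], ['Bob', 7]])
```

Each column is padded to the width of its widest cell, so short values like `7` stay aligned under their header.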

February 28, 2026 · 6 min · 1202 words · Rob Washington

Shell Scripting Patterns That Prevent 3 AM Pages

Every ops engineer has a story about a shell script that worked perfectly — until it didn’t. Usually at 3 AM. Usually in production. These patterns won’t make your scripts bulletproof, but they’ll stop the most common failures.

## Start Every Script Right

```bash
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
```

This preamble should be muscle memory:

- `set -e`: Exit on any error (non-zero return code)
- `set -u`: Exit on undefined variables
- `set -o pipefail`: Catch errors in pipes (not just the last command)
- `IFS=$'\n\t'`: Safer word splitting (no spaces)

Without these, a typo like `rm -rf $UNSET_VAR/` could wipe your root filesystem. ...
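The `rm -rf $UNSET_VAR/` failure mode is easy to reproduce safely. This sketch runs the dangerous expansion in child shells, substituting `echo` for `rm` so nothing is deleted:

```bash
#!/usr/bin/env bash
set -euo pipefail

unset UNSET_VAR  # make sure the demo variable really is unset

# Without set -u, the unset variable expands to an empty string,
# so the command quietly becomes "rm -rf /".
WITHOUT_U=$(bash -c 'echo rm -rf "$UNSET_VAR/"')
echo "no -u:   $WITHOUT_U"

# With set -u, the same expansion is a fatal "unbound variable"
# error and the shell dies before anything runs.
if bash -c 'set -u; echo rm -rf "$UNSET_VAR/"' 2>/dev/null; then
  WITH_U="allowed"
else
  WITH_U="aborted"
fi
echo "with -u: $WITH_U"
```

The first child shell happily prints `rm -rf /`; the second exits non-zero before reaching the command at all.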

February 27, 2026 · 5 min · 1001 words · Rob Washington

Bash Scripting Patterns That Prevent Disasters

Bash scripts have a reputation for being fragile. They don’t have to be. Here are the patterns that separate scripts that work from scripts that work reliably.

## Start Every Script Right

```bash
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
```

What each does:

- `set -e` - Exit on any command failure
- `set -u` - Error on undefined variables
- `set -o pipefail` - Pipelines fail if any command fails
- `IFS=$'\n\t'` - Safer word splitting (no space splitting)

## Error Handling

### Basic Trap

```bash
#!/usr/bin/env bash
set -euo pipefail

cleanup() {
  echo "Cleaning up..."
  rm -f "$TEMP_FILE"
}
trap cleanup EXIT

TEMP_FILE=$(mktemp)
# Script continues...
# cleanup runs automatically on exit, error, or interrupt
```

### Detailed Error Reporting

```bash
#!/usr/bin/env bash
set -euo pipefail

error_handler() {
  local line=$1
  local exit_code=$2
  echo "Error on line $line: exit code $exit_code" >&2
  exit "$exit_code"
}
trap 'error_handler $LINENO $?' ERR

# Now errors report their line number
```

### Log and Exit

```bash
die() {
  echo "ERROR: $*" >&2
  exit 1
}

# Usage
[[ -f "$CONFIG_FILE" ]] || die "Config file not found: $CONFIG_FILE"
```

## Argument Parsing

### Simple Positional

```bash
#!/usr/bin/env bash
set -euo pipefail

usage() {
  echo "Usage: $0 <environment> <version>"
  echo "  environment: staging|production"
  echo "  version: semver (e.g., 1.2.3)"
  exit 1
}

[[ $# -eq 2 ]] || usage
ENVIRONMENT=$1
VERSION=$2

[[ "$ENVIRONMENT" =~ ^(staging|production)$ ]] || die "Invalid environment"
[[ "$VERSION" =~ ^[0-9]+\.[0-9]+\.[0-9]+$ ]] || die "Invalid version format"
```

### Flags with getopts

```bash
#!/usr/bin/env bash
set -euo pipefail

VERBOSE=false
DRY_RUN=false
OUTPUT=""

usage() {
  cat <<EOF
Usage: $0 [options] <file>

Options:
  -v       Verbose output
  -n       Dry run (don't make changes)
  -o FILE  Output file
  -h       Show this help
EOF
  exit 1
}

while getopts "vno:h" opt; do
  case $opt in
    v) VERBOSE=true ;;
    n) DRY_RUN=true ;;
    o) OUTPUT=$OPTARG ;;
    h) usage ;;
    *) usage ;;
  esac
done
shift $((OPTIND - 1))

[[ $# -eq 1 ]] || usage
FILE=$1
```

### Long Options (Manual Parsing)

```bash
#!/usr/bin/env bash
set -euo pipefail

VERBOSE=false
CONFIG=""

while [[ $# -gt 0 ]]; do
  case $1 in
    -v|--verbose)
      VERBOSE=true
      shift
      ;;
    -c|--config)
      CONFIG=$2
      shift 2
      ;;
    -h|--help)
      usage
      ;;
    --)
      shift
      break
      ;;
    -*)
      die "Unknown option: $1"
      ;;
    *)
      break
      ;;
  esac
done
```

## Variable Safety

### Default Values

```bash
# Substitute a default if unset or empty
NAME=${NAME:-"default"}

# Assign a default if unset or empty
NAME=${NAME:="default"}

# Error if unset (or empty)
: "${REQUIRED_VAR:?'REQUIRED_VAR must be set'}"
```

### Safe Variable Expansion

```bash
# Always quote variables
rm "$FILE"  # Good
rm $FILE    # Bad - breaks on spaces

# Check before using
if [[ -n "${VAR:-}" ]]; then
  echo "$VAR"
fi
```

### Arrays

```bash
FILES=("file1.txt" "file with spaces.txt" "file3.txt")

# Iterate safely
for file in "${FILES[@]}"; do
  echo "Processing: $file"
done

# Pass to commands
cp "${FILES[@]}" /destination/

# Length
echo "Count: ${#FILES[@]}"

# Append
FILES+=("another.txt")
```

## File Operations

### Safe Temporary Files

```bash
TEMP_DIR=$(mktemp -d)
trap 'rm -rf "$TEMP_DIR"' EXIT

TEMP_FILE=$(mktemp)
trap 'rm -f "$TEMP_FILE"' EXIT
```

### Check Before Acting

```bash
# File exists
[[ -f "$FILE" ]] || die "File not found: $FILE"

# Directory exists
[[ -d "$DIR" ]] || die "Directory not found: $DIR"

# File is readable
[[ -r "$FILE" ]] || die "Cannot read: $FILE"

# File is writable
[[ -w "$FILE" ]] || die "Cannot write: $FILE"

# File is executable
[[ -x "$FILE" ]] || die "Cannot execute: $FILE"
```

### Safe File Writing

```bash
# Atomic write with temp file
write_config() {
  local content=$1
  local dest=$2
  local temp
  temp=$(mktemp)
  echo "$content" > "$temp"
  mv "$temp" "$dest"  # Atomic on same filesystem
}
```

## Command Execution

### Check Command Exists

```bash
require_command() {
  command -v "$1" >/dev/null 2>&1 || die "Required command not found: $1"
}

require_command jq
require_command aws
require_command docker
```

### Capture Output and Exit Code

```bash
# Capture output
output=$(some_command 2>&1)

# Capture exit code without exiting (despite set -e)
exit_code=0
output=$(some_command 2>&1) || exit_code=$?
if [[ $exit_code -ne 0 ]]; then
  echo "Command failed with code $exit_code"
  echo "Output: $output"
fi
```

### Retry Logic

```bash
retry() {
  local max_attempts=$1
  local delay=$2
  shift 2
  local cmd=("$@")
  local attempt=1

  while [[ $attempt -le $max_attempts ]]; do
    if "${cmd[@]}"; then
      return 0
    fi
    echo "Attempt $attempt/$max_attempts failed. Retrying in ${delay}s..."
    sleep "$delay"
    ((attempt++))
  done
  return 1
}

# Usage
retry 3 5 curl -f https://api.example.com/health
```

## Logging

```bash
LOG_FILE="/var/log/myscript.log"

log() {
  local level=$1
  shift
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] [$level] $*" | tee -a "$LOG_FILE"
}

info() { log INFO "$@"; }
warn() { log WARN "$@"; }
error() { log ERROR "$@" >&2; }

# Usage
info "Starting deployment"
warn "Deprecated config option used"
error "Failed to connect to database"
```

## Confirmation Prompts

```bash
confirm() {
  local prompt=${1:-"Continue?"}
  local response
  read -r -p "$prompt [y/N] " response
  [[ "$response" =~ ^[Yy]$ ]]
}

# Usage
if confirm "Delete all files in $DIR?"; then
  rm -rf "$DIR"/*
fi

# With default yes
confirm_yes() {
  local prompt=${1:-"Continue?"}
  local response
  read -r -p "$prompt [Y/n] " response
  [[ ! "$response" =~ ^[Nn]$ ]]
}
```

## Parallel Execution

```bash
# Simple background jobs
for server in server1 server2 server3; do
  deploy_to "$server" &
done
wait  # Wait for all background jobs

# With job limiting
MAX_JOBS=4
job_count=0
for item in "${ITEMS[@]}"; do
  process_item "$item" &
  job_count=$((job_count + 1))  # arithmetic expansion, safe under set -e
  if [[ $job_count -ge $MAX_JOBS ]]; then
    wait -n  # Wait for any one job to complete
    job_count=$((job_count - 1))
  fi
done
wait  # Wait for remaining jobs
```

## Complete Script Template

```bash
#!/usr/bin/env bash
#
# Description: What this script does
# Usage: ./script.sh [options] <args>
#
set -euo pipefail
IFS=$'\n\t'

# Constants
readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
readonly SCRIPT_NAME="$(basename "${BASH_SOURCE[0]}")"

# Defaults
VERBOSE=false
DRY_RUN=false

# Logging
log() { echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*"; }
info() { log "INFO: $*"; }
warn() { log "WARN: $*" >&2; }
error() { log "ERROR: $*" >&2; }
die() { error "$@"; exit 1; }

# Cleanup
cleanup() {
  # Add cleanup tasks here
  :
}
trap cleanup EXIT

# Usage
usage() {
  cat <<EOF
Usage: $SCRIPT_NAME [options] <argument>

Description:
  What this script does in more detail.

Options:
  -v, --verbose  Enable verbose output
  -n, --dry-run  Show what would be done
  -h, --help     Show this help message

Examples:
  $SCRIPT_NAME -v input.txt
  $SCRIPT_NAME --dry-run config.yml
EOF
  exit "${1:-0}"
}

# Parse arguments
parse_args() {
  while [[ $# -gt 0 ]]; do
    case $1 in
      -v|--verbose) VERBOSE=true; shift ;;
      -n|--dry-run) DRY_RUN=true; shift ;;
      -h|--help) usage 0 ;;
      --) shift; break ;;
      -*) die "Unknown option: $1" ;;
      *) break ;;
    esac
  done

  [[ $# -ge 1 ]] || die "Missing required argument"
  ARGUMENT=$1
}

# Main logic
main() {
  parse_args "$@"

  info "Starting with argument: $ARGUMENT"

  if $VERBOSE; then
    info "Verbose mode enabled"
  fi

  if $DRY_RUN; then
    info "Dry run - no changes will be made"
  fi

  # Your script logic here

  info "Done"
}

main "$@"
```

Bash scripts don’t need to be fragile. `set -euo pipefail` catches most accidents. Proper argument parsing makes scripts usable. Traps ensure cleanup happens. These patterns transform one-off hacks into reliable automation. Use the template, adapt as needed, and stop being afraid of your own scripts. ...
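The atomic-write pattern above can be verified end to end. This sketch adapts `write_config` slightly (staging the temp file in the destination directory, so the final `mv` never crosses a filesystem boundary) and overwrites a config twice; the file names are invented:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Atomic write: stage content in a temp file, then rename into place.
# rename(2) is atomic on a single filesystem, so readers see either
# the old file or the new one, never a half-written mix.
write_config() {
  local content=$1
  local dest=$2
  local temp
  temp=$(mktemp "$(dirname "$dest")/.tmp.XXXXXX")
  printf '%s\n' "$content" > "$temp"
  mv "$temp" "$dest"
}

WORKDIR=$(mktemp -d)
trap 'rm -rf "$WORKDIR"' EXIT

write_config '{"version": 1}' "$WORKDIR/config.json"
write_config '{"version": 2}' "$WORKDIR/config.json"

FINAL=$(<"$WORKDIR/config.json")
echo "final config: $FINAL"
```

Staging in the destination directory matters because `mktemp` alone defaults to `/tmp`, which may be a different filesystem (often tmpfs), turning the rename into a non-atomic copy.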

February 26, 2026 · 7 min · 1491 words · Rob Washington

Bash Scripting Patterns: Write Scripts That Don't Embarrass You

Bash scripts have a reputation for being fragile, unreadable hacks. They don’t have to be. These patterns will help you write scripts that are maintainable, debuggable, and safe to run in production.

## The Essentials: Start Every Script Right

```bash
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
```

What these do:

- `set -e` — Exit on any error
- `set -u` — Error on undefined variables
- `set -o pipefail` — Catch errors in pipelines
- `IFS=$'\n\t'` — Safer word splitting (no spaces)

## Script Template

```bash
#!/usr/bin/env bash
set -euo pipefail

# Script metadata
readonly SCRIPT_NAME="$(basename "$0")"
readonly SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
readonly SCRIPT_VERSION="1.0.0"

# Default values
VERBOSE=false
DRY_RUN=false
CONFIG_FILE=""

# Colors (if terminal supports them)
if [[ -t 1 ]]; then
  readonly RED='\033[0;31m'
  readonly GREEN='\033[0;32m'
  readonly YELLOW='\033[0;33m'
  readonly NC='\033[0m'  # No Color
else
  readonly RED='' GREEN='' YELLOW='' NC=''
fi

# Logging functions
log_info() { echo -e "${GREEN}[INFO]${NC} $*"; }
log_warn() { echo -e "${YELLOW}[WARN]${NC} $*" >&2; }
log_error() { echo -e "${RED}[ERROR]${NC} $*" >&2; }
log_debug() { [[ "$VERBOSE" == true ]] && echo -e "[DEBUG] $*" || true; }

# Cleanup function
cleanup() {
  local exit_code=$?
  # Add cleanup tasks here
  log_debug "Cleaning up..."
  exit $exit_code
}
trap cleanup EXIT

# Usage
usage() {
  cat <<EOF
Usage: $SCRIPT_NAME [OPTIONS] <argument>

Description of what this script does.

Options:
  -h, --help         Show this help message
  -v, --verbose      Enable verbose output
  -n, --dry-run      Show what would be done
  -c, --config FILE  Path to config file
  --version          Show version

Examples:
  $SCRIPT_NAME -v input.txt
  $SCRIPT_NAME --config /etc/myapp.conf data/
EOF
}

# Parse arguments
parse_args() {
  while [[ $# -gt 0 ]]; do
    case $1 in
      -h|--help)
        usage
        exit 0
        ;;
      -v|--verbose)
        VERBOSE=true
        shift
        ;;
      -n|--dry-run)
        DRY_RUN=true
        shift
        ;;
      -c|--config)
        CONFIG_FILE="$2"
        shift 2
        ;;
      --version)
        echo "$SCRIPT_NAME version $SCRIPT_VERSION"
        exit 0
        ;;
      -*)
        log_error "Unknown option: $1"
        usage
        exit 1
        ;;
      *)
        ARGS+=("$1")
        shift
        ;;
    esac
  done
}

# Validation
validate() {
  if [[ ${#ARGS[@]} -eq 0 ]]; then
    log_error "Missing required argument"
    usage
    exit 1
  fi

  if [[ -n "$CONFIG_FILE" && ! -f "$CONFIG_FILE" ]]; then
    log_error "Config file not found: $CONFIG_FILE"
    exit 1
  fi
}

# Main logic
main() {
  log_info "Starting $SCRIPT_NAME"
  log_debug "Arguments: ${ARGS[*]}"

  for arg in "${ARGS[@]}"; do
    if [[ "$DRY_RUN" == true ]]; then
      log_info "[DRY RUN] Would process: $arg"
    else
      log_info "Processing: $arg"
      # Actual work here
    fi
  done

  log_info "Done"
}

# Entry point
ARGS=()
parse_args "$@"
validate
main
```

## Error Handling

### Check Command Existence

```bash
require_command() {
  if ! command -v "$1" &> /dev/null; then
    log_error "Required command not found: $1"
    exit 1
  fi
}

require_command curl
require_command jq
require_command docker
```

### Retry Pattern

```bash
retry() {
  local max_attempts=$1
  local delay=$2
  shift 2
  local cmd=("$@")
  local attempt=1

  while (( attempt <= max_attempts )); do
    if "${cmd[@]}"; then
      return 0
    fi
    log_warn "Attempt $attempt/$max_attempts failed, retrying in ${delay}s..."
    sleep "$delay"
    ((attempt++))
  done

  log_error "Command failed after $max_attempts attempts: ${cmd[*]}"
  return 1
}

# Usage
retry 3 5 curl -sf https://api.example.com/health
```

### Timeout Pattern

```bash
with_timeout() {
  local timeout=$1
  shift
  timeout "$timeout" "$@" || {
    local exit_code=$?
    if [[ $exit_code -eq 124 ]]; then
      log_error "Command timed out after ${timeout}s"
    fi
    return $exit_code
  }
}

# Usage
with_timeout 30 curl -s https://api.example.com
```

## Safe File Operations

### Temporary Files

```bash
# Create temp file that's automatically cleaned up
TEMP_FILE=$(mktemp)
trap 'rm -f "$TEMP_FILE"' EXIT

# Or temp directory
TEMP_DIR=$(mktemp -d)
trap 'rm -rf "$TEMP_DIR"' EXIT
```

### Safe Write (Atomic)

```bash
safe_write() {
  local target=$1
  local content=$2
  local temp_file
  temp_file=$(mktemp)
  echo "$content" > "$temp_file"
  mv "$temp_file" "$target"
}

# Usage
safe_write /etc/myapp/config.json '{"key": "value"}'
```

### Backup Before Modify

```bash
backup_file() {
  local file=$1
  local backup="${file}.bak.$(date +%Y%m%d_%H%M%S)"
  if [[ -f "$file" ]]; then
    cp "$file" "$backup"
    log_info "Backup created: $backup"
  fi
}
```

## Input Validation

### File Checks

```bash
validate_file() {
  local file=$1
  [[ -e "$file" ]] || { log_error "File not found: $file"; return 1; }
  [[ -f "$file" ]] || { log_error "Not a regular file: $file"; return 1; }
  [[ -r "$file" ]] || { log_error "File not readable: $file"; return 1; }
}

validate_directory() {
  local dir=$1
  [[ -e "$dir" ]] || { log_error "Directory not found: $dir"; return 1; }
  [[ -d "$dir" ]] || { log_error "Not a directory: $dir"; return 1; }
  [[ -w "$dir" ]] || { log_error "Directory not writable: $dir"; return 1; }
}
```

### Input Sanitization

```bash
sanitize_filename() {
  local input=$1
  # Remove path components and dangerous characters
  echo "${input##*/}" | tr -cd '[:alnum:]._-'
}

# Usage
safe_name=$(sanitize_filename "$user_input")
```

## Configuration

### Config File Loading

```bash
load_config() {
  local config_file=$1
  if [[ -f "$config_file" ]]; then
    log_debug "Loading config: $config_file"
    # shellcheck source=/dev/null
    source "$config_file"
  else
    log_warn "Config file not found, using defaults: $config_file"
  fi
}

# Config file format (config.sh):
# DB_HOST="localhost"
# DB_PORT=5432
# ENABLE_FEATURE=true
```

### Environment Variable Defaults

```bash
: "${DB_HOST:=localhost}"
: "${DB_PORT:=5432}"
: "${LOG_LEVEL:=info}"

# Or with validation
DB_HOST="${DB_HOST:?DB_HOST environment variable required}"
```

## Parallel Execution

### Using xargs

```bash
process_files() {
  find . -name "*.txt" -print0 | \
    xargs -0 -P 4 -I {} bash -c 'process_single_file "$1"' _ {}
}
```

### Using Background Jobs

```bash
parallel_tasks() {
  local pids=()

  for item in "${items[@]}"; do
    process_item "$item" &
    pids+=($!)
  done

  # Wait for all and check results
  local failed=0
  for pid in "${pids[@]}"; do
    if ! wait "$pid"; then
      failed=$((failed + 1))  # safe under set -e, unlike ((failed++))
    fi
  done

  [[ $failed -eq 0 ]] || log_error "$failed tasks failed"
  return $failed
}
```

## Locking

### Prevent Concurrent Runs

```bash
LOCK_FILE="/var/run/${SCRIPT_NAME}.lock"

acquire_lock() {
  if ! mkdir "$LOCK_FILE" 2>/dev/null; then
    log_error "Another instance is running (lock: $LOCK_FILE)"
    exit 1
  fi
  trap 'rm -rf "$LOCK_FILE"' EXIT
}

acquire_lock
```

### With flock

```bash
LOCK_FD=200
LOCK_FILE="/var/run/${SCRIPT_NAME}.lock"

acquire_lock() {
  eval "exec $LOCK_FD>$LOCK_FILE"
  if ! flock -n $LOCK_FD; then
    log_error "Another instance is running"
    exit 1
  fi
}

acquire_lock
```

## Testing

### Assertions

```bash
assert_equals() {
  local expected=$1 actual=$2 message=${3:-"Assertion failed"}
  if [[ "$expected" != "$actual" ]]; then
    log_error "$message: expected '$expected', got '$actual'"
    return 1
  fi
}

assert_file_exists() {
  local file=$1
  if [[ ! -f "$file" ]]; then
    log_error "File should exist: $file"
    return 1
  fi
}

# Usage
assert_equals "200" "$status_code" "HTTP status check"
assert_file_exists "/tmp/output.txt"
```

### Test Mode

```bash
if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  # Script is being run directly
  main "$@"
else
  # Script is being sourced (for testing)
  log_debug "Script sourced, not executing main"
fi
```

## Debugging

### Debug Mode

```bash
[[ "${DEBUG:-}" == true ]] && set -x
```

### Trace Function Calls

```bash
trace() {
  log_debug "TRACE: ${FUNCNAME[1]}(${*})"
}

my_function() {
  trace "$@"
  # function body
}
```

## Common Pitfalls

### Quote Everything

```bash
# Bad
if [ $var == "value" ]; then

# Good
if [[ "$var" == "value" ]]; then
```

### Use Arrays for Lists

```bash
# Bad
files="file1.txt file2.txt file with spaces.txt"
for f in $files; do echo "$f"; done

# Good
files=("file1.txt" "file2.txt" "file with spaces.txt")
for f in "${files[@]}"; do echo "$f"; done
```

### Check Command Success

```bash
# Bad
output=$(some_command)
echo "$output"

# Good
if output=$(some_command 2>&1); then
  echo "$output"
else
  log_error "Command failed: $output"
  exit 1
fi
```

Good bash scripts are defensive, verbose when needed, and fail fast. Start with the template, add what you need, and resist the urge to be clever. ...
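The `sanitize_filename` helper deserves a quick test against hostile input; both path traversal and shell metacharacters should come out inert (the sample inputs are invented):

```bash
#!/usr/bin/env bash
set -euo pipefail

# Strip any leading path, then delete every character that is not
# alphanumeric, dot, underscore, or hyphen.
sanitize_filename() {
  local input=$1
  echo "${input##*/}" | tr -cd '[:alnum:]._-'
}

TRAVERSAL=$(sanitize_filename '../../etc/passwd')
INJECTION=$(sanitize_filename 'report $(rm -rf ~).txt')
echo "traversal -> $TRAVERSAL"
echo "injection -> $INJECTION"
```

The traversal attempt collapses to a bare `passwd`, and the command-substitution payload loses its `$`, parentheses, spaces, and tilde, leaving only harmless filename characters.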

February 25, 2026 · 8 min · 1525 words · Rob Washington

Bash Scripting Best Practices: Writing Scripts That Don't Bite

Bash scripts start simple and grow complex. A quick automation becomes critical infrastructure. Here’s how to write scripts that don’t become maintenance nightmares.

## Start Every Script Right

```bash
#!/usr/bin/env bash
set -euo pipefail
IFS=$'\n\t'
```

What these do:

- `#!/usr/bin/env bash` — Portable shebang, finds bash in PATH
- `set -e` — Exit on any error
- `set -u` — Error on undefined variables
- `set -o pipefail` — Pipeline fails if any command fails
- `IFS=$'\n\t'` — Safer word splitting (no spaces)

## The Debugging Version

```bash
#!/usr/bin/env bash
set -euxo pipefail
```

The `-x` prints each command before execution. Remove for production. ...
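Switching between the production and debugging preambles by editing the file gets old fast. A common alternative, sketched here with an invented `TRACE` variable, is to gate `-x` behind the environment:

```bash
#!/usr/bin/env bash
set -euo pipefail

# Gate xtrace behind an environment variable instead of editing the
# preamble: "TRACE=1 ./script.sh" prints each command to stderr,
# a plain "./script.sh" stays quiet.
SNIPPET='if [[ "${TRACE:-0}" == "1" ]]; then set -x; fi; echo done'

QUIET=$(bash -c "$SNIPPET" 2>&1)
TRACED=$(TRACE=1 bash -c "$SNIPPET" 2>&1)

echo "quiet run:  $QUIET"
echo "traced run: $TRACED"
```

The traced run interleaves `+ echo done` (xtrace output, prefixed with the default `PS4` of `+ `) with the normal output, while the quiet run prints only `done`.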

February 24, 2026 · 7 min · 1281 words · Rob Washington