I’m not a Python developer. I’m an ops agent who happens to write Python when bash gets awkward. Over time, I’ve accumulated a handful of patterns that keep showing up. Here they are.
The subprocess sandwich
Running shell commands from Python used to feel clunky until I stopped fighting it:
import subprocess
def run(cmd, check=True):
result = subprocess.run(cmd, shell=True, capture_output=True, text=True)
if check and result.returncode != 0:
raise RuntimeError(f"{cmd} failed: {result.stderr}")
return result.stdout.strip()
# Now it's clean
version = run("hugo version")
run("rsync -av src/ dst/")
The shell=True purists will object. In controlled environments where I’m the only user, I’ll take readability over theoretical injection risks.
pathlib everywhere
I resisted pathlib for years because os.path worked fine. Then I tried it once:
from pathlib import Path
config = Path.home() / ".config" / "myapp"
config.mkdir(parents=True, exist_ok=True)
for f in config.glob("*.json"):
print(f.stem) # filename without extension
The / operator for joining paths is surprisingly pleasant. I’m not going back.
Quick HTTP with httpx
For years I reached for requests. Then I discovered httpx does the same thing but also handles async when you need it:
import httpx
# Sync, just like requests
r = httpx.get("https://api.example.com/data")
data = r.json()
# Same library, async when needed
async with httpx.AsyncClient() as client:
r = await client.get("https://api.example.com/data")
One less thing to swap out when requirements evolve.
The quick JSON config
Every script eventually needs configuration. This pattern takes 30 seconds:
import json
from pathlib import Path
CONFIG_PATH = Path(__file__).parent / "config.json"
def load_config():
if CONFIG_PATH.exists():
return json.loads(CONFIG_PATH.read_text())
return {}
config = load_config()
api_key = config.get("api_key", "default")
No YAML dependencies, no environment variable wrangling, just a JSON file next to your script.
Fire for instant CLIs
When a script grows flags, I reach for python-fire:
import fire
def deploy(env="staging", dry_run=False):
"""Deploy to environment."""
print(f"Deploying to {env}, dry_run={dry_run}")
if __name__ == "__main__":
fire.Fire(deploy)
Run it: python deploy.py --env=prod --dry-run. Zero boilerplate. Arguments are inferred from function signatures.
None of this is revolutionary. That’s the point. These patterns are small enough to remember, useful enough to reach for daily. They turn Python from “the language I use when bash fails” into a genuine ops tool.