I spent way too long hardcoding database URLs before learning this stuff. Here's everything I wish I knew about configuration in Python.
## Why Configuration Matters
When I started coding, my config looked like this:
```python
# settings.py - don't do this
DATABASE_URL = "postgres://admin:supersecret@localhost:5432/myapp"
API_KEY = "sk-1234567890abcdef"
DEBUG = True
```

This breaks immediately when you:
- Share your repo (oops, exposed credentials)
- Deploy to production (different database)
- Run in CI (no local database)
- Rotate a secret (hardcoded everywhere)
Configuration needs to come from outside your code.
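One way to see what "outside" means: the same script, launched with different values on the command line. (`app.py` is a stand-in for your entry point, not a file from this post.)

```shell
# Same code, different environments - values injected at launch
DATABASE_URL=postgres://localhost:5432/dev_db python app.py
DEBUG=true PORT=9000 python app.py
```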
## The Configuration Hierarchy
Here's the pattern I use on every project:
```
Environment variables (highest priority)
        ↓
.env file (local development)
        ↓
Defaults in code (lowest priority)
```
This means:
- Production sets real env vars
- Local dev uses a `.env` file
- Sensible defaults catch everything else
Let's build this up step by step.
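The three layers boil down to an ordered lookup. Here's a stdlib-only sketch of the precedence rule (`lookup` and `dotenv_values` are illustrative names, not a real API):

```python
import os

def lookup(key: str, dotenv_values: dict[str, str], default: str) -> str:
    # 1. A real environment variable (set by the platform) always wins
    if key in os.environ:
        return os.environ[key]
    # 2. A value loaded from a .env file comes next
    if key in dotenv_values:
        return dotenv_values[key]
    # 3. The code default is the last resort
    return default
```

You rarely write this yourself: python-dotenv gives you the same rule for free, because `load_dotenv()` refuses to overwrite variables that are already set.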
## Level 1: os.environ
The simplest approach—read directly from environment:
```python
import os

# Required values - crash if missing
database_url = os.environ["DATABASE_URL"]

# Optional values - use .get() with default
debug = os.environ.get("DEBUG", "false")
port = os.environ.get("PORT", "8000")
```

**The catch:** everything is a string. You need manual conversion:
```python
# String to bool
debug = os.environ.get("DEBUG", "false").lower() in ("true", "1", "yes")

# String to int
port = int(os.environ.get("PORT", "8000"))

# String to list
allowed_hosts = os.environ.get("ALLOWED_HOSTS", "localhost").split(",")
```

This works but gets tedious fast.
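One way to tame the repetition is a pair of tiny helpers (a sketch; `env_bool` and `env_int` are my names, not stdlib functions):

```python
import os

def env_bool(name: str, default: bool = False) -> bool:
    """Read a boolean env var, accepting common truthy spellings."""
    raw = os.environ.get(name)
    if raw is None:
        return default
    return raw.strip().lower() in ("true", "1", "yes", "on")

def env_int(name: str, default: int) -> int:
    """Read an integer env var, falling back to a default."""
    raw = os.environ.get(name)
    return int(raw) if raw is not None else default
```

Still, once you have more than a handful of values, a dedicated library does this better.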
## Level 2: python-dotenv for Local Dev
Running `export VAR=value` for every variable is painful. Enter `.env` files:

```shell
pip install python-dotenv
```

Create a `.env` file in your project root:
```
# .env - NEVER commit this file
DATABASE_URL=postgres://localhost:5432/myapp_dev
API_KEY=dev-key-for-testing
DEBUG=true
PORT=8000
LOG_LEVEL=debug
```

Load it at startup:
```python
# config.py
from dotenv import load_dotenv
import os

# Load .env file into environment
load_dotenv()

# Now os.environ has your values
DATABASE_URL = os.environ["DATABASE_URL"]
DEBUG = os.environ.get("DEBUG", "false").lower() == "true"
```

### The .env.example Pattern
Commit a template so others know what to set:
```
# .env.example - commit this
DATABASE_URL=postgres://user:pass@localhost:5432/dbname
API_KEY=your-api-key-here
DEBUG=false
PORT=8000
```

Add `.env` to `.gitignore`:
```
# .gitignore
.env
.env.local
.env.*.local
```

New developers just copy and fill in:

```shell
cp .env.example .env
# Edit with real values
```

## Level 3: Pydantic Settings (Production-Grade)
For real projects, use `pydantic-settings`. It gives you:
- Automatic type conversion
- Validation
- Clear error messages
- IDE autocomplete
```shell
pip install pydantic-settings
```

```python
# config.py
from pydantic_settings import BaseSettings
from pydantic import Field

class Settings(BaseSettings):
    # Required - no default means it must exist
    database_url: str
    api_key: str

    # Optional with defaults
    debug: bool = False
    port: int = 8000
    log_level: str = "info"

    # With validation
    max_connections: int = Field(default=10, ge=1, le=100)

    # Lists
    allowed_hosts: list[str] = ["localhost"]

    class Config:
        env_file = ".env"
        env_file_encoding = "utf-8"

# Create a singleton
settings = Settings()
```

Now use it throughout your app:
```python
from config import settings

# Fully typed, validated, converted
print(settings.database_url)   # str
print(settings.debug)          # bool (not "true")
print(settings.port)           # int (not "8000")
print(settings.allowed_hosts)  # list[str]
```

### Nested Configuration
For complex apps, nest your settings:
```python
from pydantic_settings import BaseSettings
from pydantic import BaseModel

class DatabaseSettings(BaseModel):
    url: str
    pool_size: int = 5
    echo: bool = False

class CacheSettings(BaseModel):
    host: str = "localhost"
    port: int = 6379
    ttl: int = 3600

class Settings(BaseSettings):
    database: DatabaseSettings
    cache: CacheSettings
    debug: bool = False

    class Config:
        env_file = ".env"
        env_nested_delimiter = "__"

settings = Settings()

# Access nested config
print(settings.database.url)
print(settings.cache.port)
```

Environment variables use the double underscore as a delimiter:

```
DATABASE__URL=postgres://...
DATABASE__POOL_SIZE=10
CACHE__HOST=redis.example.com
```

## The 12-Factor App Principles
The 12-Factor App methodology defines best practices. Here's what matters for config:
### 1. Store Config in the Environment
Config that changes between deploys (dev/staging/prod) goes in env vars:
```python
# Good - reads from environment
database_url = os.environ["DATABASE_URL"]

# Bad - environment-specific code
if env == "production":
    database_url = "postgres://prod-server/..."
else:
    database_url = "postgres://localhost/..."
```

### 2. Strict Separation
The same code should run in all environments. Only config changes:
```python
# Same code everywhere
app = create_app(
    database_url=settings.database_url,
    debug=settings.debug
)

# Environment provides the values
# Dev:  DEBUG=true,  DATABASE_URL=postgres://localhost/...
# Prod: DEBUG=false, DATABASE_URL=postgres://prod-server/...
```

### 3. No Config "Groups"
Avoid `config/development.py` and `config/production.py`. Each env var is independent:
```python
# Bad - grouped config files
from config.production import *
```

```
# Good - flat env vars
DATABASE_URL=...
REDIS_URL=...
DEBUG=...
```

## Secrets Handling
Secrets need extra care. Here's my approach:
### Local Development
Use `.env` files (gitignored):

```
# .env
API_KEY=dev-key-safe-to-use-locally
```

### Production
Never put production secrets in files. Use your platform's secrets management:
**AWS: Secrets Manager or Parameter Store**
```python
import boto3

def get_secret(name: str) -> str:
    client = boto3.client("secretsmanager")
    response = client.get_secret_value(SecretId=name)
    return response["SecretString"]

# Fetch at startup, not per-request
DATABASE_URL = get_secret("myapp/database_url")
```

**Docker: Inject at runtime**

```shell
docker run -e API_KEY="$API_KEY" myapp
```

**Kubernetes: Use Secrets**
```yaml
env:
  - name: API_KEY
    valueFrom:
      secretKeyRef:
        name: myapp-secrets
        key: api-key
```

### The secrets Module for Generation
Python's `secrets` module generates secure tokens:
```python
import secrets

# Generate a secure token
api_key = secrets.token_urlsafe(32)

# Generate a secure password
password = secrets.token_hex(16)

# Secure comparison (timing-attack safe)
secrets.compare_digest(user_token, stored_token)
```

## Putting It All Together
Here's my production config setup:
```python
# config.py
from pydantic_settings import BaseSettings
from pydantic import Field, SecretStr
from functools import lru_cache

class Settings(BaseSettings):
    # Database
    database_url: str
    database_pool_size: int = Field(default=5, ge=1, le=50)

    # API
    api_key: SecretStr  # Won't print in logs
    api_timeout: int = 30

    # Application
    debug: bool = False
    log_level: str = "info"
    allowed_hosts: list[str] = ["localhost"]

    # Feature flags
    enable_new_feature: bool = False

    class Config:
        env_file = ".env"
        env_file_encoding = "utf-8"

    def validate_production(self) -> None:
        """Extra checks for production."""
        # Raise instead of assert: asserts disappear under `python -O`
        if not self.debug and "localhost" in self.database_url:
            raise ValueError("Production should not use localhost database")

@lru_cache
def get_settings() -> Settings:
    """Cached settings singleton."""
    settings = Settings()
    settings.validate_production()
    return settings

# Usage
settings = get_settings()
```

Usage in FastAPI:
```python
from fastapi import FastAPI, Depends
from config import Settings, get_settings

app = FastAPI()

@app.get("/health")
def health(settings: Settings = Depends(get_settings)):
    return {
        "debug": settings.debug,
        "database": "connected"  # Don't expose URL
    }
```

## Common Patterns I Use
### Fail Fast at Startup
```python
# In your main.py or app startup
from config import get_settings

def main():
    # Validate config immediately
    settings = get_settings()
    # If we get here, config is valid
    run_app(settings)

if __name__ == "__main__":
    main()
```

### Environment-Specific .env Files
```python
from dotenv import load_dotenv
import os

# Load base config, then override
load_dotenv(".env")                       # defaults
load_dotenv(".env.local", override=True)  # local overrides
```

### Testing with Config
```python
# tests/conftest.py
import pytest
from config import Settings

@pytest.fixture
def test_settings():
    return Settings(
        database_url="sqlite:///:memory:",
        api_key="test-key",
        debug=True
    )

# Or use monkeypatch
@pytest.fixture(autouse=True)
def env_setup(monkeypatch):
    monkeypatch.setenv("DATABASE_URL", "sqlite:///:memory:")
    monkeypatch.setenv("API_KEY", "test-key")
```

### Config as Documentation
Your `Settings` class documents what your app needs:
```python
class Settings(BaseSettings):
    """Application configuration.

    Required environment variables:
    - DATABASE_URL: PostgreSQL connection string
    - API_KEY: External service API key

    Optional:
    - DEBUG: Enable debug mode (default: false)
    - PORT: Server port (default: 8000)
    """
    database_url: str
    api_key: SecretStr
    debug: bool = False
    port: int = 8000
```

## Mistakes I've Made
**Committing `.env` files:** Set up `.gitignore` before you start.

**Using `os.environ` directly everywhere:** Centralize in a config module.

**Not validating at startup:** Find config errors immediately, not at 3am.
**Mixing config with logic:**

```python
# Bad
if os.environ.get("ENV") == "production":
    use_cache = True

# Good
use_cache = settings.enable_cache  # Set by env var
```

**Forgetting defaults:**

```python
# Crashes if missing
timeout = int(os.environ["TIMEOUT"])

# Safe
timeout = int(os.environ.get("TIMEOUT", "30"))
```

## My Checklist
For every new project:
- ✅ Create `.env.example` with all vars
- ✅ Add `.env` to `.gitignore`
- ✅ Create `config.py` with a `Settings` class
- ✅ Validate config at startup
- ✅ Use `SecretStr` for sensitive values
- ✅ Document what each variable does
Configuration done right means you can deploy anywhere with confidence. Your code stays the same—only the environment changes.