add brain

2026-03-12 15:17:52 +07:00
parent fd9f558fa1
commit e7821a7a9d
355 changed files with 93784 additions and 24 deletions

---
name: python-backend
description: >
Python backend development expertise for FastAPI, security patterns, database operations,
Upstash integrations, and code quality. Use when: (1) Building REST APIs with FastAPI,
(2) Implementing JWT/OAuth2 authentication, (3) Setting up SQLAlchemy/async databases,
(4) Integrating Redis/Upstash caching, (5) Refactoring AI-generated Python code (deslopification),
(6) Designing API patterns, or (7) Optimizing backend performance.
---
# python-backend
Production-ready Python backend patterns for FastAPI, SQLAlchemy, and Upstash.
## When to Use This Skill
- Building REST APIs with FastAPI
- Implementing JWT/OAuth2 authentication
- Setting up SQLAlchemy async databases
- Integrating Redis/Upstash caching and rate limiting
- Refactoring AI-generated Python code
- Designing API patterns and project structure
## Core Principles
1. **Async-first** - Use async/await for I/O operations
2. **Type everything** - Pydantic models for validation
3. **Dependency injection** - Use FastAPI's Depends()
4. **Fail fast** - Validate early, use HTTPException
5. **Security by default** - Never trust user input
## Quick Patterns
### Project Structure
```
src/
├── auth/
│   ├── router.py         # endpoints
│   ├── schemas.py        # pydantic models
│   ├── models.py         # db models
│   ├── service.py        # business logic
│   └── dependencies.py
├── posts/
│   └── ...
├── config.py
├── database.py
└── main.py
```
### Async Routes
```python
import asyncio
import time

# BAD - blocks event loop
@router.get("/")
async def bad():
    time.sleep(10)  # Blocking!

# GOOD - runs in threadpool
@router.get("/")
def good():
    time.sleep(10)  # OK in sync function

# BEST - non-blocking
@router.get("/")
async def best():
    await asyncio.sleep(10)  # Non-blocking
```
### Pydantic Validation
```python
from pydantic import BaseModel, EmailStr, Field

class UserCreate(BaseModel):
    email: EmailStr
    username: str = Field(min_length=3, max_length=50, pattern="^[a-zA-Z0-9_]+$")
    age: int = Field(ge=18)
```
### Dependency Injection
```python
async def get_current_user(token: str = Depends(oauth2_scheme)) -> User:
    payload = decode_token(token)
    user = await get_user(payload["sub"])
    if not user:
        raise HTTPException(401, "User not found")
    return user

@router.get("/me")
async def get_me(user: User = Depends(get_current_user)):
    return user
```
### SQLAlchemy Async
```python
from collections.abc import AsyncGenerator
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

engine = create_async_engine(DATABASE_URL, pool_pre_ping=True)
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)

async def get_session() -> AsyncGenerator[AsyncSession, None]:
    async with SessionLocal() as session:
        yield session
```
### Redis Caching
```python
from fastapi import FastAPI
from upstash_redis import Redis

app = FastAPI()
redis = Redis.from_env()

@app.get("/data/{id}")
def get_data(id: str):
    cached = redis.get(f"data:{id}")
    if cached:
        return cached
    data = fetch_from_db(id)
    redis.setex(f"data:{id}", 600, data)
    return data
```
### Rate Limiting
```python
from fastapi import FastAPI, HTTPException, Request
from upstash_ratelimit import Ratelimit, SlidingWindow
from upstash_redis import Redis

app = FastAPI()

ratelimit = Ratelimit(
    redis=Redis.from_env(),
    limiter=SlidingWindow(max_requests=10, window=60),
)

@app.get("/api/resource")
def protected(request: Request):
    result = ratelimit.limit(request.client.host)
    if not result.allowed:
        raise HTTPException(429, "Rate limit exceeded")
    return {"data": "..."}
```
## Reference Documents
For detailed patterns, see:
| Document | Content |
|----------|---------|
| `references/fastapi_patterns.md` | Project structure, async, Pydantic, dependencies, testing |
| `references/security_patterns.md` | JWT, OAuth2, password hashing, CORS, API keys |
| `references/database_patterns.md` | SQLAlchemy async, transactions, eager loading, migrations |
| `references/upstash_patterns.md` | Redis, rate limiting, QStash background jobs |
## Resources
- [FastAPI Documentation](https://fastapi.tiangolo.com/)
- [SQLAlchemy 2.0 Documentation](https://docs.sqlalchemy.org/)
- [Upstash Documentation](https://upstash.com/docs)
- [Pydantic Documentation](https://docs.pydantic.dev/)

# Database Patterns
SQLAlchemy and Alembic patterns for Python backends.
## Async Engine + Session
Use async engine + async session per request:
```python
from collections.abc import AsyncGenerator
from sqlalchemy.ext.asyncio import AsyncSession, async_sessionmaker, create_async_engine

DATABASE_URL = "postgresql+asyncpg://user:pass@localhost:5432/app"

engine = create_async_engine(DATABASE_URL, pool_pre_ping=True)
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)

async def get_session() -> AsyncGenerator[AsyncSession, None]:
    async with SessionLocal() as session:
        yield session
```
---
## Commit/Rollback Pattern
Wrap write operations in try/except:
```python
from sqlalchemy.ext.asyncio import AsyncSession

async def create_user(session: AsyncSession, user: dict) -> User:
    try:
        db_user = User(**user)
        session.add(db_user)
        await session.commit()
        await session.refresh(db_user)
        return db_user
    except Exception:
        await session.rollback()
        raise
```
### Nested Transaction with Savepoint
```python
async def transfer_funds(session: AsyncSession, from_id: int, to_id: int, amount: float):
    async with session.begin_nested():  # Creates a savepoint
        from_account = await session.get(Account, from_id)
        to_account = await session.get(Account, to_id)
        if from_account.balance < amount:
            raise InsufficientFunds()
        from_account.balance -= amount
        to_account.balance += amount
    await session.commit()
```
---
## Naming Conventions
Be consistent with names:
1. lower_case_snake
2. singular form (e.g. post, post_like, user_playlist)
3. group similar tables with module prefix (e.g. payment_account, payment_bill)
4. stay consistent across tables
5. _at suffix for datetime, _date suffix for date
### Constraint Naming
```python
from sqlalchemy import MetaData

NAMING_CONVENTION = {
    "ix": "%(column_0_label)s_idx",
    "uq": "%(table_name)s_%(column_0_name)s_key",
    "ck": "%(table_name)s_%(constraint_name)s_check",
    "fk": "%(table_name)s_%(column_0_name)s_fkey",
    "pk": "%(table_name)s_pkey",
}

metadata = MetaData(naming_convention=NAMING_CONVENTION)
```
---
## Alembic Migration Naming
Use human-readable file template:
```ini
# alembic.ini
file_template = %%(year)d-%%(month).2d-%%(day).2d_%%(slug)s
# Results in: 2024-08-24_add_users_table.py
```
---
## Eager Loading - Avoid N+1
### selectinload for collections
```python
from sqlalchemy import select
from sqlalchemy.orm import selectinload

# Load all users with their orders in 2 queries
stmt = select(User).options(selectinload(User.orders))
users = await session.scalars(stmt)
for user in users:
    print(user.orders)  # No additional query
```
### joinedload for single relationships
```python
from sqlalchemy.orm import joinedload
# Load orders with their user in 1 query (JOIN)
stmt = select(Order).options(joinedload(Order.user))
orders = await session.scalars(stmt)
```
### raiseload to detect N+1
```python
from sqlalchemy.orm import raiseload

stmt = select(User).options(
    selectinload(User.orders),
    raiseload("*"),  # Raise on any other lazy load
)
```
---
## Cascade Delete
Configure at both ORM and database level:
```python
from sqlalchemy import ForeignKey
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, relationship

class Base(DeclarativeBase):
    pass

class Parent(Base):
    __tablename__ = "parent"

    id: Mapped[int] = mapped_column(primary_key=True)
    children: Mapped[list["Child"]] = relationship(
        back_populates="parent",
        cascade="all, delete",
        passive_deletes=True,
    )

class Child(Base):
    __tablename__ = "child"

    id: Mapped[int] = mapped_column(primary_key=True)
    parent_id: Mapped[int] = mapped_column(
        ForeignKey("parent.id", ondelete="CASCADE")
    )
    parent: Mapped["Parent"] = relationship(back_populates="children")
```
---
## Soft Delete Pattern
Use a deleted_at timestamp instead of hard deletes:
```python
from datetime import datetime

from sqlalchemy import DateTime
from sqlalchemy.ext.hybrid import hybrid_property
from sqlalchemy.orm import Mapped, mapped_column

class SoftDeleteMixin:
    deleted_at: Mapped[datetime | None] = mapped_column(
        DateTime(timezone=True),
        default=None,
        index=True,
    )

    @hybrid_property
    def is_deleted(self) -> bool:
        return self.deleted_at is not None

    def soft_delete(self) -> None:
        self.deleted_at = datetime.utcnow()

    def restore(self) -> None:
        self.deleted_at = None
```
Filter soft-deleted:
```python
# Only get non-deleted users
stmt = select(User).where(User.deleted_at.is_(None))
# Only deleted
stmt_deleted = select(User).where(User.deleted_at.isnot(None))
```
---
## Optimistic Locking
Use version_id_col to prevent lost updates:
```python
from sqlalchemy import Integer, String
from sqlalchemy.orm.exc import StaleDataError

class Article(Base):
    __tablename__ = "article"

    id: Mapped[int] = mapped_column(primary_key=True)
    title: Mapped[str] = mapped_column(String(200))
    version_id: Mapped[int] = mapped_column(Integer, nullable=False, default=1)

    __mapper_args__ = {"version_id_col": version_id}

# Handle StaleDataError
async def update_article(session: AsyncSession, article_id: int, data: dict):
    try:
        article = await session.get(Article, article_id)
        for key, value in data.items():
            setattr(article, key, value)
        await session.commit()
    except StaleDataError:
        await session.rollback()
        raise ConcurrentModificationError("Article was modified by another user.")
```
---
## Timestamp Mixin
Automatic created_at and updated_at:
```python
from datetime import datetime

from sqlalchemy import DateTime, func
from sqlalchemy.orm import Mapped, mapped_column

class TimestampMixin:
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        nullable=False,
    )
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        onupdate=func.now(),
        nullable=False,
    )
```
---
## Bulk Operations
### Bulk Insert
```python
from sqlalchemy import insert

async def bulk_create_users(session: AsyncSession, users: list[dict]):
    # e.g. users = [{"name": "Alice", "email": "alice@example.com"}, ...]
    await session.execute(insert(User), users)
    await session.commit()
```
### Bulk Update
```python
from datetime import datetime, timedelta

from sqlalchemy import update

async def deactivate_old_users(session: AsyncSession, days: int = 365):
    cutoff = datetime.utcnow() - timedelta(days=days)
    result = await session.execute(
        update(User)
        .where(User.last_login < cutoff)
        .values(is_active=False)
    )
    await session.commit()
    return result.rowcount
```
---
## Upsert (PostgreSQL)
```python
from sqlalchemy import func
from sqlalchemy.dialects.postgresql import insert as pg_insert

async def upsert_user(session: AsyncSession, user_data: dict):
    stmt = pg_insert(User).values(**user_data)
    stmt = stmt.on_conflict_do_update(
        index_elements=[User.email],
        set_={
            "name": stmt.excluded.name,
            "updated_at": func.now(),
        },
    )
    await session.execute(stmt)
    await session.commit()
```
---
## UUID Primary Keys
```python
import uuid

from sqlalchemy import Uuid
from sqlalchemy.orm import Mapped, mapped_column

class User(Base):
    __tablename__ = "user"

    id: Mapped[uuid.UUID] = mapped_column(
        Uuid,
        primary_key=True,
        default=uuid.uuid4,
    )
```
### Dual ID Pattern (internal + public)
```python
class User(Base):
    __tablename__ = "user"

    # Internal ID for joins (faster)
    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    # Public ID for API (safe to expose)
    public_id: Mapped[uuid.UUID] = mapped_column(
        Uuid,
        unique=True,
        default=uuid.uuid4,
        index=True,
    )
```
---
## Repository Pattern
```python
from typing import Generic, Type, TypeVar

from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession

ModelType = TypeVar("ModelType", bound=Base)

class BaseRepository(Generic[ModelType]):
    def __init__(self, session: AsyncSession, model: Type[ModelType]):
        self.session = session
        self.model = model

    async def get(self, id: int) -> ModelType | None:
        return await self.session.get(self.model, id)

    async def get_all(self, skip: int = 0, limit: int = 100) -> list[ModelType]:
        result = await self.session.scalars(
            select(self.model).offset(skip).limit(limit)
        )
        return list(result.all())

    async def create(self, obj: ModelType) -> ModelType:
        self.session.add(obj)
        await self.session.commit()
        await self.session.refresh(obj)
        return obj

class UserRepository(BaseRepository[User]):
    def __init__(self, session: AsyncSession):
        super().__init__(session, User)

    async def get_by_email(self, email: str) -> User | None:
        return await self.session.scalar(
            select(User).where(User.email == email)
        )
```

# FastAPI Best Practices
Complete FastAPI patterns from [zhanymkanov/fastapi-best-practices](https://github.com/zhanymkanov/fastapi-best-practices).
## Project Structure - Domain-Driven
Domain-driven project structure inspired by Netflix's Dispatch. Store all domain directories inside `src` folder.
```
fastapi-project
├── alembic/
├── src
│   ├── auth
│   │   ├── router.py        # core endpoints
│   │   ├── schemas.py       # pydantic models
│   │   ├── models.py        # db models
│   │   ├── dependencies.py
│   │   ├── config.py
│   │   ├── constants.py
│   │   ├── exceptions.py
│   │   ├── service.py
│   │   └── utils.py
│   ├── posts
│   │   ├── router.py
│   │   ├── schemas.py
│   │   ├── models.py
│   │   └── ...
│   ├── config.py            # global configs
│   ├── models.py            # global models
│   ├── exceptions.py
│   ├── pagination.py
│   ├── database.py
│   └── main.py
├── tests/
├── templates/
└── requirements/
```
Import from other packages with explicit module names:
```python
from src.auth import constants as auth_constants
from src.notifications import service as notification_service
from src.posts.constants import ErrorCode as PostsErrorCode
```
---
## Async Routes
### I/O Intensive Tasks
FastAPI handles sync routes in threadpool, async routes on event loop. Never use blocking operations in async routes.
```python
# BAD - blocks event loop
@router.get("/terrible-ping")
async def terrible_ping():
    time.sleep(10)  # I/O blocking operation, whole process blocked
    return {"pong": True}

# GOOD - runs in threadpool
@router.get("/good-ping")
def good_ping():
    time.sleep(10)  # Blocking but in separate thread
    return {"pong": True}

# PERFECT - non-blocking async
@router.get("/perfect-ping")
async def perfect_ping():
    await asyncio.sleep(10)  # Non-blocking I/O
    return {"pong": True}
```
### CPU Intensive Tasks
CPU-bound work gains nothing from `await` (it never yields) and little from the threadpool (the GIL serializes it). Send it to workers in another process.
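A minimal sketch of the process-worker approach using only the standard library; the hypothetical `fib` function stands in for real CPU-bound work:

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor


def fib(n: int) -> int:
    # Stand-in for real CPU-bound work
    return n if n < 2 else fib(n - 1) + fib(n - 2)


async def compute(n: int) -> int:
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor() as pool:
        # Runs in a separate process: both the GIL and the event loop stay free
        return await loop.run_in_executor(pool, fib, n)


if __name__ == "__main__":
    print(asyncio.run(compute(20)))  # 6765
```

In a real app you would create the pool once at startup (e.g., in a lifespan handler) rather than per call.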
### Sync SDK in Thread Pool
If you must use a library that's not async, use `run_in_threadpool`:
```python
from fastapi.concurrency import run_in_threadpool
from my_sync_library import SyncAPIClient

@app.get("/")
async def call_my_sync_library():
    my_data = await service.get_my_data()
    client = SyncAPIClient()
    await run_in_threadpool(client.make_request, data=my_data)
```
---
## Pydantic Patterns
### Excessively Use Pydantic
Pydantic has rich features for validation:
```python
from enum import Enum

from pydantic import AnyUrl, BaseModel, EmailStr, Field

class MusicBand(str, Enum):
    AEROSMITH = "AEROSMITH"
    QUEEN = "QUEEN"
    ACDC = "AC/DC"

class UserBase(BaseModel):
    first_name: str = Field(min_length=1, max_length=128)
    username: str = Field(min_length=1, max_length=128, pattern="^[A-Za-z0-9-_]+$")
    email: EmailStr
    age: int | None = Field(ge=18, default=None)
    favorite_band: MusicBand | None = None
    website: AnyUrl | None = None
```
### Custom Base Model
Create a controllable global base model:
```python
from datetime import datetime
from zoneinfo import ZoneInfo

from fastapi.encoders import jsonable_encoder
from pydantic import BaseModel, ConfigDict

def datetime_to_gmt_str(dt: datetime) -> str:
    if not dt.tzinfo:
        dt = dt.replace(tzinfo=ZoneInfo("UTC"))
    return dt.strftime("%Y-%m-%dT%H:%M:%S%z")

class CustomModel(BaseModel):
    model_config = ConfigDict(
        json_encoders={datetime: datetime_to_gmt_str},
        populate_by_name=True,
    )

    def serializable_dict(self, **kwargs):
        """Return a dict which contains only serializable fields."""
        default_dict = self.model_dump()
        return jsonable_encoder(default_dict)
```
### Decouple BaseSettings
Split BaseSettings across different modules:
```python
# src.auth.config
from pydantic_settings import BaseSettings

class AuthConfig(BaseSettings):
    JWT_ALG: str
    JWT_SECRET: str
    JWT_EXP: int = 5  # minutes
    REFRESH_TOKEN_KEY: str
    SECURE_COOKIES: bool = True

auth_settings = AuthConfig()

# src.config
from pydantic import PostgresDsn, RedisDsn
from pydantic_settings import BaseSettings

class Config(BaseSettings):
    DATABASE_URL: PostgresDsn
    REDIS_URL: RedisDsn
    SITE_DOMAIN: str = "myapp.com"
    ENVIRONMENT: Environment = Environment.PRODUCTION

settings = Config()
```
### ValueError Becomes ValidationError
```python
import re

from pydantic import BaseModel, field_validator

# STRONG_PASSWORD_PATTERN is defined elsewhere in the project
class ProfileCreate(BaseModel):
    username: str
    password: str

    @field_validator("password", mode="after")
    @classmethod
    def valid_password(cls, password: str) -> str:
        if not re.match(STRONG_PASSWORD_PATTERN, password):
            raise ValueError(
                "Password must contain at least "
                "one lower character, "
                "one upper character, "
                "digit or "
                "special symbol"
            )
        return password
```
---
## Dependencies
### Request Validation
Dependencies are excellent for request validation:
```python
# dependencies.py
async def valid_post_id(post_id: UUID4) -> dict[str, Any]:
    post = await service.get_by_id(post_id)
    if not post:
        raise PostNotFound()
    return post

# router.py
@router.get("/posts/{post_id}", response_model=PostResponse)
async def get_post_by_id(post: dict[str, Any] = Depends(valid_post_id)):
    return post

@router.put("/posts/{post_id}", response_model=PostResponse)
async def update_post(
    update_data: PostUpdate,
    post: dict[str, Any] = Depends(valid_post_id),
):
    updated_post = await service.update(id=post["id"], data=update_data)
    return updated_post
```
### Chain Dependencies
Dependencies can use other dependencies:
```python
async def valid_post_id(post_id: UUID4) -> dict[str, Any]:
    post = await service.get_by_id(post_id)
    if not post:
        raise PostNotFound()
    return post

async def parse_jwt_data(
    token: str = Depends(OAuth2PasswordBearer(tokenUrl="/auth/token"))
) -> dict[str, Any]:
    try:
        payload = jwt.decode(token, "JWT_SECRET", algorithms=["HS256"])
    except JWTError:
        raise InvalidCredentials()
    return {"user_id": payload["id"]}

async def valid_owned_post(
    post: dict[str, Any] = Depends(valid_post_id),
    token_data: dict[str, Any] = Depends(parse_jwt_data),
) -> dict[str, Any]:
    if post["creator_id"] != token_data["user_id"]:
        raise UserNotOwner()
    return post
```
### Dependency Caching
FastAPI caches a dependency's result within a request's scope by default:
```python
# parse_jwt_data is used by both dependencies but called only once per request
@router.get("/users/{user_id}/posts/{post_id}", response_model=PostResponse)
async def get_user_post(
    worker: BackgroundTasks,
    post: Mapping = Depends(valid_owned_post),      # uses parse_jwt_data
    user: Mapping = Depends(valid_active_creator),  # uses parse_jwt_data
):
    # parse_jwt_data is called only once, cached for this request
    worker.add_task(notifications_service.send_email, user["id"])
    return post
```
### Prefer Async Dependencies
Sync dependencies run in the thread pool, which costs a worker-thread hop per call. Prefer async dependencies unless they perform blocking I/O.
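A framework-free sketch of the difference: FastAPI dispatches sync dependencies to the threadpool, much like `asyncio.to_thread` below, while async ones run directly on the event loop.

```python
import asyncio
import threading


async def async_dep() -> str:
    # Runs directly on the event-loop thread
    return threading.current_thread().name


def sync_dep() -> str:
    # Would be dispatched to a worker thread, like FastAPI's threadpool
    return threading.current_thread().name


async def main() -> tuple[str, str]:
    return await async_dep(), await asyncio.to_thread(sync_dep)


loop_thread, worker_thread = asyncio.run(main())
assert loop_thread != worker_thread  # the sync path paid a thread hop
```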
---
## API Design
### Follow REST Conventions
Use consistent variable names in paths:
```python
@router.get("/profiles/{profile_id}", response_model=ProfileResponse)
async def get_user_profile_by_id(profile: Mapping = Depends(valid_profile_id)):
    return profile

@router.get("/creators/{profile_id}", response_model=ProfileResponse)
async def get_creator_profile_by_id(
    creator_profile: Mapping = Depends(valid_creator_id)
):
    return creator_profile
```
### Hide Docs by Default
Unless your API is public:
```python
from fastapi import FastAPI
from starlette.config import Config

config = Config(".env")
ENVIRONMENT = config("ENVIRONMENT")

SHOW_DOCS_ENVIRONMENT = ("local", "staging")

app_configs = {"title": "My Cool API"}
if ENVIRONMENT not in SHOW_DOCS_ENVIRONMENT:
    app_configs["openapi_url"] = None

app = FastAPI(**app_configs)
```
### Detailed API Documentation
```python
from fastapi import APIRouter, status

@router.post(
    "/endpoints",
    response_model=DefaultResponseModel,
    status_code=status.HTTP_201_CREATED,
    description="Description of the well documented endpoint",
    tags=["Endpoint Category"],
    summary="Summary of the Endpoint",
    responses={
        status.HTTP_200_OK: {
            "model": OkResponse,
            "description": "Ok Response",
        },
        status.HTTP_201_CREATED: {
            "model": CreatedResponse,
            "description": "Creates something from user request",
        },
    },
)
async def documented_route():
    pass
```
---
## Testing
### Async Test Client from Day 0
Set up the async test client from day one:
```python
from collections.abc import AsyncGenerator

import pytest
from httpx import ASGITransport, AsyncClient

from src.main import app

@pytest.fixture
async def client() -> AsyncGenerator[AsyncClient, None]:
    async with AsyncClient(
        transport=ASGITransport(app=app),
        base_url="http://test",
    ) as client:
        yield client

@pytest.mark.asyncio
async def test_create_post(client: AsyncClient):
    resp = await client.post("/posts")
    assert resp.status_code == 201
```
---
## Tooling
### Use Ruff
Ruff is a blazingly fast linter and formatter that replaces black, autoflake, and isort:
```bash
#!/bin/sh -e
set -x
ruff check --fix src
ruff format src
```
---
## Deslop - Remove Emojis
AI-generated code often includes emojis. Remove them:
```python
import re

def remove_emoji(text: str) -> str:
    """Remove emoji characters from text."""
    emoji_pattern = re.compile(
        "["
        "\U0001F600-\U0001F64F"  # emoticons
        "\U0001F300-\U0001F5FF"  # symbols & pictographs
        "\U0001F680-\U0001F6FF"  # transport & map symbols
        "\U0001F1E0-\U0001F1FF"  # flags (iOS)
        "\U00002702-\U000027B0"
        "\U000024C2-\U0001F251"
        "]+",
        flags=re.UNICODE,
    )
    return emoji_pattern.sub("", text)
```

# Security Patterns
Authentication and security patterns for Python/FastAPI backends.
## Password Hashing (passlib + bcrypt)
Never store plaintext passwords. Hash with bcrypt:
```python
from passlib.context import CryptContext

pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")

def hash_password(password: str) -> str:
    return pwd_context.hash(password)

def verify_password(password: str, password_hash: str) -> bool:
    return pwd_context.verify(password, password_hash)
```
---
## JWT Create/Verify (python-jose)
Issue short-lived access tokens and validate them on each request:
```python
from datetime import datetime, timedelta, timezone

from jose import JWTError, jwt

JWT_ALG = "HS256"
JWT_SECRET = "change-me"  # load from env in production

def create_access_token(*, subject: str, expires_minutes: int = 15) -> str:
    now = datetime.now(timezone.utc)
    payload = {
        "sub": subject,
        "iat": int(now.timestamp()),
        "exp": int((now + timedelta(minutes=expires_minutes)).timestamp()),
    }
    return jwt.encode(payload, JWT_SECRET, algorithm=JWT_ALG)

def decode_access_token(token: str) -> dict:
    try:
        return jwt.decode(token, JWT_SECRET, algorithms=[JWT_ALG])
    except JWTError as e:
        raise ValueError("Invalid token") from e
```
---
## FastAPI OAuth2 Bearer Dependency
Use OAuth2PasswordBearer to parse `Authorization: Bearer <token>`:
```python
from fastapi import Depends, HTTPException, status
from fastapi.security import OAuth2PasswordBearer

oauth2_scheme = OAuth2PasswordBearer(tokenUrl="/auth/token")

def get_current_user(token: str = Depends(oauth2_scheme)) -> dict:
    try:
        payload = decode_access_token(token)
    except ValueError:
        raise HTTPException(
            status_code=status.HTTP_401_UNAUTHORIZED,
            detail="Invalid token",
        )
    return {"user_id": payload["sub"]}
```
---
## API Key Auth via Header
Protect internal endpoints with an API key header:
```python
from fastapi import Header, HTTPException, status

API_KEY = "change-me"  # load from env in production

def require_api_key(
    x_api_key: str | None = Header(default=None, alias="X-API-Key")
) -> None:
    if not x_api_key or x_api_key != API_KEY:
        raise HTTPException(
            status_code=status.HTTP_403_FORBIDDEN,
            detail="Invalid API key",
        )
```
---
## CORS Configuration
Lock CORS down to known origins:
```python
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware

app = FastAPI()

app.add_middleware(
    CORSMiddleware,
    allow_origins=["https://example.com"],
    allow_credentials=True,
    allow_methods=["GET", "POST"],
    allow_headers=["Authorization", "Content-Type"],
)
```
---
## Hide OpenAPI Docs by Default
Disable docs endpoints in production:
```python
from fastapi import FastAPI

ENV = "production"  # e.g., from env vars

app = FastAPI(
    title="My API",
    openapi_url=None if ENV == "production" else "/openapi.json",
)
```

# Upstash Patterns
Redis, QStash, and Rate Limiting patterns for Python backends.
## Redis Client Setup
```python
# Sync client from env
from upstash_redis import Redis

redis = Redis.from_env()

# Async client from env
from upstash_redis.asyncio import Redis

redis = Redis.from_env()

# Explicit credentials
redis = Redis(
    url="UPSTASH_REDIS_REST_URL",
    token="UPSTASH_REDIS_REST_TOKEN",
)
```
---
## Key Expiration (TTL)
```python
import datetime
redis.set("session", "data")
redis.expire("session", 300) # 5 minutes
redis.expire("token", datetime.timedelta(hours=1))
# Set with inline expiration
redis.set("key", "value", ex=300) # expires in 300s
redis.setex("key", 300, "value") # alternative syntax
# Check TTL
ttl = redis.ttl("session") # seconds remaining
```
---
## Hash Operations
Store structured data:
```python
# Set multiple fields
redis.hset("user:1", values={
    "name": "Alice",
    "email": "alice@example.com",
    "status": "active",
})

# Get single field
name = redis.hget("user:1", "name")

# Get all fields
user = redis.hgetall("user:1")
```
---
## Transactions (Atomic)
```python
tx = redis.multi()
tx.set("account:1", 1000)
tx.decrby("account:1", 100)
tx.set("account:2", 500)
tx.incrby("account:2", 100)
results = tx.exec() # all or nothing
```
---
## Pipeline (Batch)
Send multiple commands in a single roundtrip:
```python
pipeline = redis.pipeline()
pipeline.set("foo", 1)
pipeline.incr("foo")
pipeline.get("foo")
result = pipeline.exec()
print(result) # [True, 2, '2']
```
---
## Lists (Queues)
```python
# Push to list
redis.lpush("queue", "task1", "task2") # Add to head
redis.rpush("queue", "task3") # Add to tail
# Get range
items = redis.lrange("queue", 0, -1) # All items
recent = redis.lrange("queue", 0, 9) # First 10
# Pop items
first = redis.lpop("queue") # Remove from head
# Recent activity feed (keep last 100)
redis.lpush("user:123:activity", activity_json)
redis.ltrim("user:123:activity", 0, 99)
```
---
## Sets (Unique Values)
```python
redis.sadd("tags:article:1", "python", "redis", "backend")
tags = redis.smembers("tags:article:1")
is_member = redis.sismember("tags:article:1", "python")
# Set operations
common = redis.sinter("user:1:skills", "user:2:skills")
all_skills = redis.sunion("user:1:skills", "user:2:skills")
```
---
## Sorted Sets (Leaderboards)
```python
# Add scores
redis.zadd("leaderboard", {"alice": 100, "bob": 85, "charlie": 92})
# Get top 3 (highest first)
top3 = redis.zrevrange("leaderboard", 0, 2, withscores=True)
# Get rank
rank = redis.zrevrank("leaderboard", "bob")
# Increment score
redis.zincrby("leaderboard", 10, "bob")
```
---
## FastAPI Caching
```python
from fastapi import FastAPI
from upstash_redis import Redis

app = FastAPI()
redis = Redis.from_env()

CACHE_TTL = 600  # 10 minutes

@app.get("/data/{id}")
def get_data(id: str):
    cache_key = f"data:{id}"

    # Check cache
    cached = redis.get(cache_key)
    if cached:
        return {"source": "cache", "data": cached}

    # Fetch from source
    data = fetch_from_database(id)

    # Cache and return
    redis.setex(cache_key, CACHE_TTL, data)
    return {"source": "db", "data": data}
```
---
## FastAPI Session Management
```python
import uuid

from fastapi import Cookie, FastAPI, HTTPException, Response
from upstash_redis import Redis

redis = Redis.from_env()
app = FastAPI()

SESSION_TTL = 900  # 15 minutes

@app.post("/login")
async def login(username: str, response: Response):
    session_id = str(uuid.uuid4())
    redis.hset(f"session:{session_id}", values={
        "user": username, "status": "active"
    })
    redis.expire(f"session:{session_id}", SESSION_TTL)
    response.set_cookie("session_id", session_id, httponly=True)
    return {"message": "Logged in"}

@app.get("/profile")
async def profile(session_id: str = Cookie(None)):
    if not session_id:
        raise HTTPException(403, "No session")
    session = redis.hgetall(f"session:{session_id}")
    if not session:
        raise HTTPException(401, "Session expired")
    redis.expire(f"session:{session_id}", SESSION_TTL)  # sliding expiration
    return session
```
---
## Rate Limiting
### Basic Setup
```python
from upstash_ratelimit import FixedWindow, Ratelimit, SlidingWindow, TokenBucket
from upstash_redis import Redis

redis = Redis.from_env()

# Fixed window: 10 requests per 10 seconds
ratelimit = Ratelimit(
    redis=redis,
    limiter=FixedWindow(max_requests=10, window=10),
)

# Sliding window (smoother)
ratelimit = Ratelimit(
    redis=redis,
    limiter=SlidingWindow(max_requests=10, window=10),
)

# Token bucket (allows bursts)
ratelimit = Ratelimit(
    redis=redis,
    limiter=TokenBucket(max_tokens=10, refill_rate=5, interval=10),
)
```
### FastAPI Rate Limiting
```python
from fastapi import FastAPI, HTTPException, Request

app = FastAPI()

@app.get("/api/resource")
def protected(request: Request):
    result = ratelimit.limit(request.client.host)
    if not result.allowed:
        raise HTTPException(429, "Rate limit exceeded")
    return {"data": "..."}
```
### Multi-Tier Rate Limits
```python
ratelimits = {
    "free": Ratelimit(
        redis=redis,
        limiter=SlidingWindow(max_requests=10, window=60),
        prefix="ratelimit:free",
    ),
    "pro": Ratelimit(
        redis=redis,
        limiter=SlidingWindow(max_requests=100, window=60),
        prefix="ratelimit:pro",
    ),
}
```
### Rate Limit Headers
```python
from fastapi import Request, Response

@app.middleware("http")
async def rate_limit_middleware(request: Request, call_next):
    result = ratelimit.limit(request.client.host)
    if not result.allowed:
        return Response(
            content="Rate limit exceeded",
            status_code=429,
            headers={
                "X-RateLimit-Limit": str(result.limit),
                "X-RateLimit-Remaining": "0",
                "Retry-After": str(result.reset),
            },
        )
    response = await call_next(request)
    response.headers["X-RateLimit-Limit"] = str(result.limit)
    response.headers["X-RateLimit-Remaining"] = str(result.remaining)
    return response
```
---
## QStash - Background Jobs
### Setup
```python
from qstash import QStash
client = QStash("<QSTASH_TOKEN>")
# Or from env
client = QStash.from_env()
```
### Publish Messages
```python
# Simple publish
res = client.message.publish_json(
    url="https://my-api.com/webhook",
    body={"event": "user_signup", "user_id": 123},
)

# With delay
res = client.message.publish_json(
    url="https://my-api.com/process",
    body={"task": "heavy_computation"},
    delay="5m",
)
```
### Schedule Recurring Jobs
```python
# Daily at midnight
client.schedule.create(
    destination="https://my-api.com/daily-report",
    cron="0 0 * * *",
)

# Every hour
client.schedule.create(
    destination="https://my-api.com/sync",
    cron="0 * * * *",
)
```
### Signature Verification
```python
from fastapi import FastAPI, HTTPException, Request
from qstash import Receiver

receiver = Receiver(
    current_signing_key="...",
    next_signing_key="...",
)

@app.post("/webhook")
async def webhook(request: Request):
    signature = request.headers.get("Upstash-Signature")
    body = await request.body()
    try:
        receiver.verify(
            body=body.decode(),
            signature=signature,
            url=str(request.url),
        )
    except Exception:
        raise HTTPException(401, "Invalid signature")
    data = await request.json()
    return await process_task(data)
```
### Batch Messages
```python
result = client.message.batch_json([
    {"url": "https://api.com/user/1/notify", "body": {"message": "Hello 1"}},
    {"url": "https://api.com/user/2/notify", "body": {"message": "Hello 2"}},
])
```
---
## Best Practices
1. Use environment variables for credentials
2. Always set TTLs to avoid memory bloat
3. Use key prefixes: `user:123`, `session:abc`, `cache:weather:london`
4. Choose rate limit algorithm based on needs
5. Use async client for async routes
6. Verify QStash signatures for webhook security
7. Use transactions for atomic operations
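Practices 2 and 3 can be wrapped in a tiny helper; the names below (`cache_set`, `FakeRedis`) are hypothetical, and the fake client only stands in for `upstash_redis.Redis` in this sketch:

```python
# Hypothetical helper combining key prefixes with a mandatory TTL so that
# every cache entry is namespaced and expires.
def cache_set(redis, namespace: str, key: str, value: str, ttl: int = 600) -> str:
    full_key = f"{namespace}:{key}"
    redis.setex(full_key, ttl, value)
    return full_key


class FakeRedis:  # stand-in for upstash_redis.Redis in this sketch
    def __init__(self):
        self.store = {}

    def setex(self, key, ttl, value):
        self.store[key] = (value, ttl)


r = FakeRedis()
assert cache_set(r, "cache:weather", "london", "12C") == "cache:weather:london"
```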