Local Development¶
This guide covers the day-to-day development workflow for the 2Sigma Backend.
Prerequisites¶
Before starting, ensure you've completed the Getting Started Tutorial.
Development Environment Setup¶
IDE Configuration¶
The project works well with:
- VS Code - Recommended extensions: Python, Pylance, SQLAlchemy
- PyCharm - Configure async support and FastAPI run configurations
- Vim/Neovim - Use an LSP setup with pyright
Environment Variables¶
Keep a .env file in the project root (never commit it):
# Development .env example
APP_NAME="2Sigma Backend"
DEBUG=true
ENVIRONMENT=development
DATABASE_URL=postgresql+asyncpg://ai_tutor_user:password@localhost:5432/ai_tutor_dev
DATABASE_ECHO=false # Set to true for SQL logging
SECRET_KEY=dev-secret-key-change-in-production
ACCESS_TOKEN_EXPIRE_MINUTES=10080 # 7 days for dev convenience
REFRESH_TOKEN_EXPIRE_DAYS=30
BACKEND_CORS_ORIGINS=["http://localhost:3000", "http://localhost:8000"]
# LLM Provider (choose one)
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=your-key-here
ANTHROPIC_MODEL=claude-3-5-sonnet-20241022
# Alternative: AWS Bedrock
# LLM_PROVIDER=aws_bedrock
# AWS_BEDROCK_MODEL_ID=anthropic.claude-3-5-sonnet-20241022-v2:0
# AWS_REGION=eu-west-2
# AWS_ACCESS_KEY_ID=your-key
# AWS_SECRET_ACCESS_KEY=your-secret
# Optional: Langfuse for LLM observability
# LANGFUSE_PUBLIC_KEY=pk-...
# LANGFUSE_SECRET_KEY=sk-...
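These variables are consumed by app/config.py (listed as the Pydantic BaseSettings module in the project structure below). A rough sketch of how that mapping might look, assuming pydantic-settings; the real Settings class defines more fields and may differ in detail:

from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    # Env var names map onto these fields case-insensitively;
    # the defaults shown here are illustrative only.
    model_config = SettingsConfigDict(env_file=".env")

    app_name: str = "2Sigma Backend"
    debug: bool = False
    environment: str = "development"
    database_url: str
    database_echo: bool = False
    secret_key: str
    access_token_expire_minutes: int = 30
    refresh_token_expire_days: int = 30

settings = Settings()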
Daily Workflow¶
Starting Development¶
# 1. Activate virtual environment
source venv/bin/activate # Linux/macOS
# or
venv\Scripts\activate # Windows
# 2. Pull latest changes
git pull origin main
# 3. Update dependencies (if requirements.txt changed)
pip install -r requirements.txt
# 4. Run migrations (if new migrations exist)
alembic upgrade head
# 5. Start the server
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
Hot Reload¶
The --reload flag enables automatic reloading when code changes: the server restarts whenever you modify Python files in app/. Changes to configuration files such as .env are not picked up automatically and require a manual restart.
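If the reloader restarts on files you don't care about (for example the logs/ directory), uvicorn's --reload-dir option limits which directories are watched:

uvicorn app.main:app --reload --reload-dir app --port 8000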
Multiple Terminals¶
Common multi-terminal setup:
# Terminal 1: Run server
uvicorn app.main:app --reload --port 8000
# Terminal 2: Database operations
psql -U ai_tutor_user -d ai_tutor_dev
# Terminal 3: Tests, re-run on change (requires the pytest-watch plugin)
ptw
# Terminal 4: Alembic migrations
alembic revision --autogenerate -m "description"
Code Organization¶
Project Structure¶
ai-tutor-backend/
├── app/
│ ├── __init__.py
│ ├── main.py # FastAPI app entry point
│ ├── config.py # Settings (Pydantic BaseSettings)
│ ├── database.py # SQLAlchemy setup
│ ├── dependencies.py # FastAPI dependencies
│ │
│ ├── models/ # SQLAlchemy ORM models
│ │ ├── __init__.py
│ │ ├── user.py
│ │ ├── course.py
│ │ └── ...
│ │
│ ├── schemas/ # Pydantic request/response models
│ │ ├── __init__.py
│ │ ├── user.py
│ │ └── ...
│ │
│ ├── crud/ # Database operations (repository pattern)
│ │ ├── base.py
│ │ ├── user.py
│ │ └── ...
│ │
│ ├── api/
│ │ └── v1/ # API version 1
│ │ ├── router.py # Main router
│ │ ├── auth.py
│ │ ├── users.py
│ │ └── ...
│ │
│ ├── core/ # Core utilities
│ │ ├── enums.py
│ │ └── security.py # JWT & password handling
│ │
│ └── services/ # Business logic & external services
│ └── llm_service.py
│
├── alembic/ # Database migrations
│ ├── versions/
│ └── env.py
│
├── tests/ # Test suite
│ ├── conftest.py
│ └── ...
│
├── requirements.txt
├── .env.example
├── alembic.ini
└── pytest.ini
Naming Conventions¶
Files:
- Models: Singular nouns (user.py, course.py)
- Routers: Plural nouns (users.py, courses.py)
- CRUD: Singular nouns matching model (user.py)
- Schemas: Singular nouns matching model (user.py)
Classes:
- Models: PascalCase singular (User, Course)
- Schemas: PascalCase with suffix (UserCreate, UserResponse, UserUpdate)
- CRUD: PascalCase singular (CRUDUser, CRUDCourse)
Functions:
- Routes: Verb-noun (get_user, create_course, update_enrollment)
- CRUD methods: CRUD verbs (get, create, update, delete, get_multi)
Common Development Tasks¶
Adding a New Endpoint¶
- Define the schema (app/schemas/your_model.py):
from datetime import datetime
from pydantic import BaseModel

class ItemCreate(BaseModel):
    name: str
    description: str | None = None

class ItemUpdate(BaseModel):
    name: str | None = None
    description: str | None = None

class ItemResponse(ItemCreate):
    id: int
    created_at: datetime

    class Config:
        from_attributes = True
- Create CRUD operations (app/crud/your_model.py):
from app.crud.base import CRUDBase
from app.models.your_model import Item
from app.schemas.your_model import ItemCreate, ItemUpdate

class CRUDItem(CRUDBase[Item, ItemCreate, ItemUpdate]):
    pass

item = CRUDItem(Item)
- Add the route (app/api/v1/items.py):
from fastapi import APIRouter, Depends
from sqlalchemy.ext.asyncio import AsyncSession

from app import crud, schemas
from app.api import dependencies as deps

router = APIRouter()

@router.post("/", response_model=schemas.ItemResponse)
async def create_item(
    *,
    db: AsyncSession = Depends(deps.get_db),
    item_in: schemas.ItemCreate,
    current_user = Depends(deps.get_current_active_user),
):
    item = await crud.item.create(db, obj_in=item_in)
    return item
- Register the router (app/api/v1/router.py):
from app.api.v1 import items
api_router.include_router(items.router, prefix="/items", tags=["items"])
Adding a Database Model¶
See the Database Migrations Guide for full details.
Quick steps:
# 1. Create model in app/models/
# 2. Import in app/models/__init__.py
# 3. Generate migration
alembic revision --autogenerate -m "add item model"
# 4. Review generated migration in alembic/versions/
# 5. Apply migration
alembic upgrade head
Debugging¶
Using Print Statements¶
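A temporary print() in a handler or CRUD function is the quickest option; output appears in the terminal running uvicorn. A throwaway sketch (the /debug-demo route below is hypothetical, not part of the API):

from fastapi import APIRouter

router = APIRouter()

@router.get("/debug-demo")
async def debug_demo():
    value = {"step": "before_db_call"}
    print(f"debug_demo value={value!r}")  # shows up in the uvicorn console
    return value

Remember to remove these prints before committing.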
Using Python Debugger¶
# Add breakpoint
import pdb; pdb.set_trace()
# Or use built-in breakpoint (Python 3.7+)
breakpoint()
Logging¶
import logging
logger = logging.getLogger(__name__)
logger.debug("Detailed debug info")
logger.info("General information")
logger.warning("Warning message")
logger.error("Error occurred", exc_info=True)
Application Logging¶
The application uses structured JSON logging with the following features:
- Request tracking: Each request gets a unique request_id
- Performance metrics: Duration, DB query count, and DB query time per request
- File logging: Logs are saved to logs/app.log with rotation
Log Format (JSON - production):
{
  "timestamp": "2026-02-12T13:40:07.003Z",
  "level": "INFO",
  "request_id": "req_8badd9d102c6",
  "method": "GET",
  "path": "/api/v1/courses/20",
  "status_code": 200,
  "duration_ms": 5.32,
  "db_query_count": 4,
  "db_query_time_ms": 0.85
}
Log Format (Text - development): with LOG_FORMAT=text, the same fields are written as human-readable lines instead of JSON.
Configure logging in .env:
LOG_LEVEL=DEBUG # DEBUG, INFO, WARNING, ERROR
LOG_FORMAT=text # "text" for dev, "json" for production
LOG_FILE_ENABLED=True # Save to logs/app.log
Viewing logs with lnav (recommended):
# Install lnav
sudo apt install lnav # or brew install lnav
# View logs interactively
lnav logs/app.log
# Keyboard shortcuts in lnav:
# e - Jump to next error
# w - Jump to next warning
# / - Search
# :filter-in level=ERROR - Filter by level
Viewing logs with jq:
# Pretty print
cat logs/app.log | jq .
# Filter errors
cat logs/app.log | jq 'select(.level == "ERROR")'
# Find slow requests (>100ms)
cat logs/app.log | jq 'select(.duration_ms > 100)'
SQL Query Logging¶
Set DATABASE_ECHO=true in .env; this prints all SQL queries to the console.
Testing Your Changes¶
# Run all tests
pytest
# Run specific test file
pytest tests/test_users.py
# Run specific test
pytest tests/test_users.py::test_create_user
# Run with output
pytest -v -s
# Run with coverage
pytest --cov=app --cov-report=html
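As a reference point, a minimal test might look like the following, assuming the suite provides an async client fixture (e.g. an httpx.AsyncClient wired to the app in tests/conftest.py) and uses pytest-asyncio; adapt the fixture name and expectations to the actual conftest:

import pytest

@pytest.mark.asyncio
async def test_users_me_requires_auth(client):
    # Without a token the endpoint should reject the request
    # (exact status depends on the auth dependency, typically 401 or 403).
    response = await client.get("/api/v1/users/me")
    assert response.status_code in (401, 403)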
Database Management¶
Connecting to the Database¶
# PostgreSQL CLI
psql -U ai_tutor_user -d ai_tutor_dev
# Common commands
\dt # List tables
\d users # Describe users table
\q # Quit
Resetting the Database¶
# Drop and recreate database
dropdb -U postgres ai_tutor_dev
createdb -U postgres ai_tutor_dev
psql -U postgres -d ai_tutor_dev -c "GRANT ALL PRIVILEGES ON DATABASE ai_tutor_dev TO ai_tutor_user;"
# Run migrations
alembic upgrade head
Inspecting Data¶
-- Check users
SELECT id, email, full_name, is_active FROM users;
-- Check enrollments
SELECT e.id, u.email, c.title
FROM course_enrollments e
JOIN users u ON e.user_id = u.id
JOIN course_offerings co ON e.course_offering_id = co.id
JOIN courses c ON co.course_id = c.id;
API Testing¶
Using FastAPI Docs¶
Navigate to http://localhost:8000/docs for interactive testing.
Prometheus Metrics¶
The application exposes Prometheus metrics at /metrics:
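To check the endpoint from the command line:

curl http://localhost:8000/metrics | head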
Available metrics:
| Metric | Type | Description |
|---|---|---|
| http_requests_total | Counter | Total requests by method, path, status |
| http_request_duration_seconds | Histogram | Request latency |
| http_requests_in_progress | Gauge | Active requests |
| db_query_total | Counter | Database queries by operation |
| db_query_duration_seconds | Histogram | DB query latency |
| errors_total | Counter | Errors by type and path |
Using curl¶
# Register user
curl -X POST http://localhost:8000/api/v1/auth/register \
-H "Content-Type: application/json" \
-d '{"email":"test@example.com","password":"test123","full_name":"Test User"}'
# Login
curl -X POST http://localhost:8000/api/v1/auth/login \
-H "Content-Type: application/json" \
-d '{"email":"test@example.com","password":"test123"}'
# Get current user (with token)
curl -X GET http://localhost:8000/api/v1/users/me \
-H "Authorization: Bearer YOUR_TOKEN"
Using HTTPie¶
# Install httpie
pip install httpie
# Register user
http POST localhost:8000/api/v1/auth/register \
email=test@example.com password=test123 full_name="Test User"
# Login
http POST localhost:8000/api/v1/auth/login \
email=test@example.com password=test123
# Authenticated request
http GET localhost:8000/api/v1/users/me \
Authorization:"Bearer YOUR_TOKEN"
Performance Profiling¶
Timing Endpoints¶
Use FastAPI middleware or manual timing:
import time
from fastapi import Request

# Register in app/main.py, where app = FastAPI()
@app.middleware("http")
async def timing_middleware(request: Request, call_next):
    start_time = time.time()
    response = await call_next(request)
    duration = time.time() - start_time
    response.headers["X-Process-Time"] = str(duration)
    return response
Database Query Performance¶
Enable SQL logging (DATABASE_ECHO=true in .env) and review slow queries.
Use EXPLAIN for specific queries:
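For example, inside psql, using the tables from the inspection queries above (an illustrative query, not one taken from the codebase):

EXPLAIN ANALYZE
SELECT e.id, u.email
FROM course_enrollments e
JOIN users u ON e.user_id = u.id
WHERE u.is_active = true;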
Troubleshooting Common Issues¶
Port Already in Use¶
# Find process using port 8000
lsof -i :8000 # macOS/Linux
netstat -ano | findstr :8000 # Windows
# Kill the process
kill -9 <PID> # macOS/Linux
taskkill /PID <PID> /F # Windows
Module Import Errors¶
# Ensure virtual environment is activated
which python # Should show venv path
# Reinstall dependencies
pip install -r requirements.txt
Database Connection Issues¶
# Check PostgreSQL is running
pg_isready
# Check connection details
psql postgresql://ai_tutor_user:password@localhost:5432/ai_tutor_dev
Migration Conflicts¶
# Check current migration
alembic current
# Reset to base (WARNING: drops all data)
alembic downgrade base
alembic upgrade head
Git Workflow¶
Branch Naming¶
- Features: feature/short-description
- Bugs: fix/short-description
- Docs: docs/short-description
Commit Messages¶
Follow conventional commits:
feat: add user profile endpoint
fix: resolve authentication bug
docs: update API documentation
refactor: simplify CRUD operations
test: add unit tests for users
Pre-Commit Checklist¶
- Run tests: pytest
- Check code style: black app/ tests/
- Run migrations: alembic upgrade head
- Update docs if needed
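The first three checks can be chained into a single command if that suits your workflow (a simple sketch, run from the project root):

pytest && black app/ tests/ && alembic upgrade head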