Deployment Guide
Docker Compose (Recommended)
Prerequisites
- Docker Engine 24+
- Docker Compose v2
Quick Start
git clone <repo-url>
cd smart-support
# Configure environment
cp .env.example .env
# Edit .env: set ANTHROPIC_API_KEY (or OPENAI_API_KEY / GOOGLE_API_KEY)
# Start all services
docker compose up -d
# Verify health
docker compose ps
curl http://localhost/api/health
The app is available at http://localhost (frontend) and http://localhost:8000 (backend API).
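If you script the quick start, you may want to block until the health endpoint responds before proceeding. A small retry wrapper can be sketched as follows; `wait_for` is a hypothetical helper, not part of the repo:

```shell
# wait_for: retry a command once per second, up to 30 attempts,
# returning non-zero if it never succeeds.
wait_for() {
  attempts=0
  until "$@" >/dev/null 2>&1; do
    attempts=$((attempts + 1))
    [ "$attempts" -ge 30 ] && return 1
    sleep 1
  done
}
# Real use: wait_for curl -fsS http://localhost/api/health
wait_for true && echo "stack is up"
```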
Services
| Service | Port | Description |
|---|---|---|
| postgres | 5432 | PostgreSQL 16 database |
| backend | 8000 | FastAPI + LangGraph backend |
| frontend | 80 | React SPA served by nginx |
Stopping
docker compose down # Stop services, keep data
docker compose down -v # Stop services and delete database volume
Production Considerations
Environment Variables
Set these in production (never commit secrets):
| Variable | Required | Description |
|---|---|---|
| POSTGRES_PASSWORD | Yes | Strong random password |
| ANTHROPIC_API_KEY | Yes* | LLM provider API key |
| LLM_PROVIDER | Yes | anthropic, openai, or google |
| LLM_MODEL | Yes | Model name for your provider |
| ADMIN_API_KEY | Recommended | API key for admin endpoints (analytics, replay, openapi, WS). Leave empty to disable auth (dev mode only) |
| WEBHOOK_URL | No | Escalation notification endpoint |
| SESSION_TTL_MINUTES | No | Session timeout (default: 30) |
*Or OPENAI_API_KEY / GOOGLE_API_KEY depending on LLM_PROVIDER.
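A minimal production .env could be sketched as below; every value is a placeholder, and the exact model name depends on your provider:

```ini
# Sketch of a production .env -- all values are placeholders
POSTGRES_PASSWORD=<strong-random-password>
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=<your-provider-key>
LLM_MODEL=<model-name-for-your-provider>
ADMIN_API_KEY=<strong-random-key>
SESSION_TTL_MINUTES=30
```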
Authentication
When ADMIN_API_KEY is set, all admin REST endpoints require the X-API-Key header,
and WebSocket connections require a ?token=<key> query parameter.
When unset or empty, authentication is disabled (suitable for local development only).
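From the client side, the two auth mechanisms above can be sketched as follows; the key value and the /api/v1/analytics path are illustrative placeholders:

```shell
# Sketch: attaching the admin key as a client.
ADMIN_API_KEY="changeme"
# REST endpoints take the key in a header, e.g.:
#   curl -H "X-API-Key: $ADMIN_API_KEY" http://localhost:8000/api/v1/analytics
# WebSocket clients pass the same key as a query parameter:
ws_url="ws://localhost:8000/ws?token=$ADMIN_API_KEY"
echo "$ws_url"
```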
HTTPS
For production, place a reverse proxy (nginx, Caddy, or a load balancer) in front of the frontend container and configure TLS termination there.
The WebSocket endpoint at /ws must be proxied with Upgrade: websocket headers.
The frontend nginx.conf handles this internally for the backend connection.
Example Caddy configuration:
example.com {
reverse_proxy localhost:80
}
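If you terminate TLS with nginx instead of Caddy, a minimal sketch looks like the following; the server name and certificate paths are placeholders, and the Upgrade headers are what keep /ws working through the proxy:

```nginx
server {
    listen 443 ssl;
    server_name example.com;
    ssl_certificate     /etc/ssl/certs/example.com.pem;
    ssl_certificate_key /etc/ssl/private/example.com.key;

    location / {
        proxy_pass http://localhost:80;
        # Forward WebSocket upgrade handshakes for /ws
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```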
Database Backups
# Backup
docker compose exec postgres pg_dump -U smart_support smart_support > backup.sql
# Restore
docker compose exec -T postgres psql -U smart_support smart_support < backup.sql
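For scheduled backups you will usually want date-stamped filenames. A sketch (the cron wiring is left to you; the pg_dump line is the same command as above and assumes the compose stack is running):

```shell
# Sketch: build a date-stamped backup filename, e.g. backup-20250101.sql
backup_file="backup-$(date +%Y%m%d).sql"
# docker compose exec postgres pg_dump -U smart_support smart_support > "$backup_file"
echo "$backup_file"
```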
Scaling
The backend supports multi-worker deployments. LangGraph session state is
persisted in PostgreSQL via PostgresSaver. For full horizontal scaling, use
PgSessionManager and PgInterruptManager (instead of the default in-memory
managers) to share session and interrupt state across workers.
WebSocket connections are session-specific. Use sticky sessions or a shared session backend if load balancing WebSockets across multiple instances.
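A multi-worker deployment can be sketched as a compose override; the service name comes from the table above, but the replica count is illustrative and the port change is needed because multiple replicas cannot all publish host port 8000:

```yaml
# docker-compose.override.yml (sketch)
services:
  backend:
    deploy:
      replicas: 3
    # Drop the fixed host port so replicas don't collide; route traffic
    # through the frontend proxy or an external load balancer instead.
    ports: []
```

Alternatively, `docker compose up -d --scale backend=3` achieves the same replica count from the CLI. Either way, remember to switch to PgSessionManager and PgInterruptManager so workers share state.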
Manual / Development Setup
Backend
cd backend
python -m venv .venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
pip install -e ".[dev]"
# Set environment variables
cp .env.example .env
# Edit .env
# Start database
docker compose up postgres -d
# Run backend
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
Frontend
cd frontend
npm install
npm run dev # Dev server on http://localhost:5173
Running Tests
cd backend
pytest --cov=app --cov-report=term-missing
Health Checks
Backend health
GET /api/health
Response:
{"status": "ok", "version": "0.6.0"}
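A liveness probe only needs the status field out of that JSON. One way to extract it without jq is sketched below; parse_status is a hypothetical helper, not part of the app:

```shell
# Sketch: pull the "status" field out of the health response JSON.
parse_status() {
  printf '%s' "$1" | sed -n 's/.*"status": *"\([^"]*\)".*/\1/p'
}
# Real use: parse_status "$(curl -s http://localhost:8000/api/health)"
parse_status '{"status": "ok", "version": "0.6.0"}'
```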
WebSocket health
Connect to ws://localhost:8000/ws and send:
{"type": "message", "thread_id": "health-check", "content": "ping"}
A message_complete or error response confirms the WebSocket is alive.