Deployment Guide
Docker Compose (Recommended)
Prerequisites
- Docker Engine 24+
- Docker Compose v2
Quick Start
git clone <repo-url>
cd smart-support
# Configure environment
cp .env.example .env
# Edit .env: set ANTHROPIC_API_KEY (or OPENAI_API_KEY / GOOGLE_API_KEY)
# Start all services
docker compose up -d
# Verify health
docker compose ps
curl http://localhost/api/health
The app is available at http://localhost (frontend) and http://localhost:8000 (backend API).
Services
| Service | Port | Description |
|---|---|---|
| postgres | 5432 | PostgreSQL 16 database |
| backend | 8000 | FastAPI + LangGraph backend |
| frontend | 80 | React SPA served by nginx |
Stopping
docker compose down # Stop services, keep data
docker compose down -v # Stop services and delete database volume
Production Considerations
Environment Variables
Set these in production (never commit secrets):
| Variable | Required | Description |
|---|---|---|
| POSTGRES_PASSWORD | Yes | Strong random password |
| ANTHROPIC_API_KEY | Yes* | LLM provider API key |
| LLM_PROVIDER | Yes | anthropic, openai, or google |
| LLM_MODEL | Yes | Model name for your provider |
| WEBHOOK_URL | No | Escalation notification endpoint |
| SESSION_TTL_MINUTES | No | Session timeout (default: 30) |
*Or OPENAI_API_KEY / GOOGLE_API_KEY depending on LLM_PROVIDER.
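A fail-fast startup check can catch missing configuration before the backend boots. This is a minimal sketch (not code from the repository) that validates the variables listed in the table above; the model name in the example is a placeholder.

```python
import os
import sys

# Required in every deployment (see the table above).
REQUIRED = ["POSTGRES_PASSWORD", "LLM_PROVIDER", "LLM_MODEL"]

# Exactly one provider key must be present, matching LLM_PROVIDER.
PROVIDER_KEYS = {
    "anthropic": "ANTHROPIC_API_KEY",
    "openai": "OPENAI_API_KEY",
    "google": "GOOGLE_API_KEY",
}

def validate_env(env: dict) -> list:
    """Return a list of configuration error messages (empty if OK)."""
    errors = [f"{name} is not set" for name in REQUIRED if not env.get(name)]
    provider = env.get("LLM_PROVIDER", "")
    key = PROVIDER_KEYS.get(provider)
    if key is None:
        errors.append(f"LLM_PROVIDER must be one of {sorted(PROVIDER_KEYS)}")
    elif not env.get(key):
        errors.append(f"{key} is required when LLM_PROVIDER={provider}")
    return errors

# Example: a complete anthropic configuration passes the check.
ok_env = {
    "POSTGRES_PASSWORD": "change-me",
    "LLM_PROVIDER": "anthropic",
    "LLM_MODEL": "claude-sonnet",  # placeholder model name
    "ANTHROPIC_API_KEY": "sk-...",
}
assert validate_env(ok_env) == []
```

Calling `sys.exit("\n".join(validate_env(dict(os.environ))))` early in startup turns a misconfigured deployment into an immediate, readable failure instead of a runtime error on the first request.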
HTTPS
For production, place a reverse proxy (nginx, Caddy, or a load balancer) in front of the frontend container and configure TLS termination there.
The WebSocket endpoint at /ws must be proxied with Upgrade: websocket headers.
The frontend nginx.conf handles this internally for the backend connection.
Example Caddy configuration:
example.com {
reverse_proxy localhost:80
}
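Caddy forwards WebSocket upgrades automatically. If you terminate TLS with nginx instead, the Upgrade headers must be set explicitly. The following is a sketch, assuming certificates are already provisioned (paths and server name are placeholders):

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    ssl_certificate     /etc/letsencrypt/live/example.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/example.com/privkey.pem;

    location / {
        proxy_pass http://localhost:80;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-Proto https;
    }

    # WebSocket traffic needs the explicit HTTP/1.1 upgrade handshake.
    location /ws {
        proxy_pass http://localhost:80;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```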
Database Backups
# Backup
docker compose exec postgres pg_dump -U smart_support smart_support > backup.sql
# Restore
cat backup.sql | docker compose exec -T postgres psql -U smart_support smart_support
Scaling
The backend is stateless (session state is in PostgreSQL via LangGraph's PostgresSaver). You can run multiple backend replicas behind a load balancer.
The WebSocket connections are session-specific. Use sticky sessions or a shared session backend if load balancing WebSockets across multiple instances.
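To illustrate why sticky sessions matter: a deterministic hash of the thread ID always routes a given conversation to the same replica, so its open WebSocket state stays local. This is an illustration only (real deployments would use a load balancer feature such as nginx's ip_hash or cookie affinity); the replica names are hypothetical.

```python
import hashlib

def pick_replica(thread_id: str, replicas: list) -> str:
    """Deterministically map a conversation thread to one backend replica."""
    digest = hashlib.sha256(thread_id.encode()).digest()
    index = int.from_bytes(digest[:8], "big") % len(replicas)
    return replicas[index]

replicas = ["backend-1:8000", "backend-2:8000", "backend-3:8000"]  # hypothetical
# The mapping is stable across calls, which is the "sticky" property:
assert pick_replica("thread-42", replicas) == pick_replica("thread-42", replicas)
```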
Manual / Development Setup
Backend
cd backend
python -m venv .venv
source .venv/bin/activate # Windows: .venv\Scripts\activate
pip install -e ".[dev]"
# Set environment variables
cp .env.example .env
# Edit .env
# Start database
docker compose up postgres -d
# Run backend
uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
Frontend
cd frontend
npm install
npm run dev # Dev server on http://localhost:5173
Running Tests
cd backend
pytest --cov=app --cov-report=term-missing
Health Checks
Backend health
GET /api/health
Response:
{"status": "ok", "version": "0.5.0"}
WebSocket health
Connect to ws://localhost:8000/ws and send:
{"type": "message", "thread_id": "health-check", "content": "ping"}
A message_complete or error response confirms the WebSocket is alive.
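The check above can be scripted. This sketch only builds the ping payload and classifies a reply, assuming the message schema shown in this guide; in practice you would send it over a WebSocket client library (not shown here).

```python
import json

def health_check_payload(thread_id: str = "health-check") -> str:
    """Serialize the ping message shown above."""
    return json.dumps({"type": "message", "thread_id": thread_id, "content": "ping"})

def is_alive(raw_reply: str) -> bool:
    """Treat any well-formed message_complete or error frame as 'alive'."""
    try:
        reply = json.loads(raw_reply)
    except json.JSONDecodeError:
        return False
    return reply.get("type") in ("message_complete", "error")
```

An `error` reply still counts as alive here because it proves the socket accepted, parsed, and answered the frame; only silence or malformed output indicates a dead connection.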