feat: complete phase 5 -- error hardening, frontend, Docker, demo, docs
Backend:
- ConversationTracker: Protocol + PostgresConversationTracker for lifecycle tracking
- Error handler: ErrorCategory enum, classify_error(), with_retry() exponential backoff
- Wire PostgresAnalyticsRecorder + ConversationTracker into ws_handler
- Rate limiting (10 msg/10s per thread), edge case hardening
- Health endpoint GET /api/health, version 0.5.0
- Demo seed data script + sample OpenAPI spec

Frontend (all new):
- React Router with NavBar (Chat / Replay / Dashboard / Review)
- ReplayListPage + ReplayPage with ReplayTimeline component
- DashboardPage with MetricCard, range selector, zero-state
- ReviewPage for OpenAPI classification review
- ErrorBanner for WebSocket disconnect handling
- API client (api.ts) with typed fetch wrappers

Infrastructure:
- Frontend Dockerfile (multi-stage node -> nginx)
- nginx.conf with SPA routing + API/WS proxy
- docker-compose.yml with frontend service + healthchecks
- .env.example files (root + backend)

Documentation:
- README.md with quick start and architecture
- Agent configuration guide
- OpenAPI import guide
- Deployment guide
- Demo script

48 new tests, 449 total passing, 92.87% coverage
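The error-handler bullet names an ErrorCategory enum, classify_error(), and a with_retry() helper with exponential backoff. The commit does not show their implementation; the sketch below is a minimal illustration of that pattern, with hypothetical category names and signatures:

```python
import asyncio
import random
from enum import Enum
from typing import Awaitable, Callable, TypeVar

T = TypeVar("T")

class ErrorCategory(Enum):
    TRANSIENT = "transient"   # worth retrying (timeouts, connection resets)
    PERMANENT = "permanent"   # fail fast (auth errors, bad requests)

def classify_error(exc: Exception) -> ErrorCategory:
    # Illustrative classification only; the real rules are not in the diff.
    if isinstance(exc, (asyncio.TimeoutError, ConnectionError)):
        return ErrorCategory.TRANSIENT
    return ErrorCategory.PERMANENT

async def with_retry(
    fn: Callable[[], Awaitable[T]],
    max_attempts: int = 3,
    base_delay: float = 0.5,
) -> T:
    """Retry fn with exponential backoff; give up early on permanent errors."""
    for attempt in range(max_attempts):
        try:
            return await fn()
        except Exception as exc:
            if classify_error(exc) is ErrorCategory.PERMANENT:
                raise
            if attempt == max_attempts - 1:
                raise
            # delays of base, 2*base, 4*base, ... plus jitter
            await asyncio.sleep(base_delay * 2 ** attempt + random.uniform(0, 0.1))
    raise RuntimeError("unreachable")
```

Jitter on the backoff delay is a common addition so that many clients failing at once do not retry in lockstep; whether the actual backend does this is not visible from the commit.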
@@ -1,19 +1,34 @@
-# Database
+# Smart Support Backend -- environment variables
+# Copy to .env and fill in your values
+
+# Required: PostgreSQL connection string
 DATABASE_URL=postgresql://smart_support:dev_password@localhost:5432/smart_support
 
-# LLM Provider: anthropic | openai | google
+# Required: LLM provider configuration
+# provider: anthropic | openai | google
 LLM_PROVIDER=anthropic
 LLM_MODEL=claude-sonnet-4-6
 
-# API Keys (set the one matching your LLM_PROVIDER)
+# API keys -- provide the one matching LLM_PROVIDER
 ANTHROPIC_API_KEY=
 OPENAI_API_KEY=
 GOOGLE_API_KEY=
 
-# Session
+# Optional: webhook endpoint for escalation notifications
+# The backend will POST a JSON payload when a conversation is escalated.
+WEBHOOK_URL=
+WEBHOOK_TIMEOUT_SECONDS=10
+WEBHOOK_MAX_RETRIES=3
+
+# Session management
 SESSION_TTL_MINUTES=30
 INTERRUPT_TTL_MINUTES=30
 
-# Server
+# Optional: load a named agent template instead of agents.yaml
+# Leave blank to use the default agents.yaml in the backend directory.
+# Available templates: ecommerce, saas, generic
+TEMPLATE_NAME=
+
+# Server binding
 WS_HOST=0.0.0.0
 WS_PORT=8000
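The diff introduces three webhook knobs (WEBHOOK_URL, WEBHOOK_TIMEOUT_SECONDS, WEBHOOK_MAX_RETRIES) and says the backend POSTs a JSON payload on escalation. How the backend actually consumes them is not shown; below is one plausible stdlib-only sketch, with a hypothetical function name and payload shape:

```python
import json
import os
import time
import urllib.request
from urllib.error import URLError

def notify_escalation(payload: dict) -> bool:
    """POST a JSON payload to WEBHOOK_URL, honoring the timeout/retry env vars.

    Illustrative only: the real backend's payload shape and HTTP client
    are not visible in the diff.
    """
    url = os.environ.get("WEBHOOK_URL", "")
    if not url:
        return False  # webhook is optional; skip silently when unset
    timeout = float(os.environ.get("WEBHOOK_TIMEOUT_SECONDS", "10"))
    max_retries = int(os.environ.get("WEBHOOK_MAX_RETRIES", "3"))

    body = json.dumps(payload).encode()
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    for attempt in range(max_retries):
        try:
            with urllib.request.urlopen(req, timeout=timeout) as resp:
                return 200 <= resp.status < 300
        except URLError:
            if attempt == max_retries - 1:
                return False
            time.sleep(2 ** attempt)  # back off between delivery attempts
    return False
```

Returning False rather than raising keeps a down webhook endpoint from breaking the escalation path itself, which matches the "Optional" framing in the comments above.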