feat: complete phase 1 -- core framework with chat loop, agents, and React UI
Backend:
- FastAPI WebSocket /ws endpoint with streaming via LangGraph astream
- LangGraph Supervisor connecting 3 mock agents (order_lookup, order_actions, fallback)
- YAML Agent Registry with Pydantic validation and immutable configs
- PostgresSaver checkpoint persistence via langgraph-checkpoint-postgres
- Session TTL with 30-min sliding window and interrupt extension
- LLM provider abstraction (Anthropic/OpenAI/Google)
- Token usage + cost tracking callback handler
- Input validation: message size cap, thread_id format, content length
- Security: no hardcoded defaults, startup API key validation, no input reflection

Frontend:
- React 19 + TypeScript + Vite chat UI
- WebSocket hook with reconnect + exponential backoff
- Streaming token display with agent attribution
- Interrupt approval/reject UI for write operations
- Collapsible tool call viewer

Testing:
- 87 unit tests, 87% coverage (exceeds 80% requirement)
- Ruff lint + format clean

Infrastructure:
- Docker Compose (PostgreSQL 16 + backend)
- pyproject.toml with full dependency management
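For reference, the `Settings` class added in this commit loads configuration from a `.env` file via pydantic-settings (field names map case-insensitively to environment variables). A sample file, with all values illustrative:

```ini
# Sample .env -- values are placeholders, not defaults from the commit
DATABASE_URL=postgresql://app:app@localhost:5432/app
LLM_PROVIDER=anthropic
ANTHROPIC_API_KEY=sk-ant-...
SESSION_TTL_MINUTES=30
WS_PORT=8000
```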
backend/app/config.py (new file, 46 lines)
@@ -0,0 +1,46 @@
"""Centralized application configuration via pydantic-settings."""

from __future__ import annotations

from typing import Literal

from pydantic import model_validator
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        extra="ignore",
    )

    database_url: str

    llm_provider: Literal["anthropic", "openai", "google"] = "anthropic"
    llm_model: str = "claude-sonnet-4-6"

    session_ttl_minutes: int = 30
    interrupt_ttl_minutes: int = 30

    ws_host: str = "0.0.0.0"
    ws_port: int = 8000

    anthropic_api_key: str = ""
    openai_api_key: str = ""
    google_api_key: str = ""

    @model_validator(mode="after")
    def validate_provider_key(self) -> Settings:
        key_map = {
            "anthropic": self.anthropic_api_key,
            "openai": self.openai_api_key,
            "google": self.google_api_key,
        }
        key = key_map.get(self.llm_provider, "")
        if not key:
            raise ValueError(
                f"API key for provider '{self.llm_provider}' is required. "
                f"Set the corresponding environment variable."
            )
        return self
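The startup key validation above fails fast inside pydantic's model validation. The same check can be exercised standalone; this is a stdlib-only sketch, where the helper name `check_provider_key` and the explicit env mapping are illustrative rather than part of the commit:

```python
# Stdlib-only sketch of the fail-fast provider-key check performed by
# Settings.validate_provider_key. The helper name and the explicit env
# dict are illustrative; the real check runs inside pydantic validation.
def check_provider_key(provider: str, env: dict[str, str]) -> str:
    """Return the API key for `provider`, raising if it is unset."""
    key_map = {
        "anthropic": env.get("ANTHROPIC_API_KEY", ""),
        "openai": env.get("OPENAI_API_KEY", ""),
        "google": env.get("GOOGLE_API_KEY", ""),
    }
    key = key_map.get(provider, "")
    if not key:
        raise ValueError(
            f"API key for provider '{provider}' is required. "
            f"Set the corresponding environment variable."
        )
    return key
```

Because the check runs in a `mode="after"` model validator, a missing key aborts `Settings()` construction at process startup rather than surfacing later on the first LLM call.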