# ChatSBT - Multi-Model Chat Application

A modern chat application supporting multiple AI models through the OpenRouter API.

## Features

- Chat with multiple AI models (Qwen, DeepSeek, Kimi)
- Real-time streaming responses
- Conversation history
- Simple REST API backend
- Modern Svelte frontend

## Tech Stack

### Frontend

- Svelte
- DaisyUI (Tailwind component library)
- Vite

### Backend

- Starlette (async Python web framework)
- LangChain (LLM orchestration)
- LangGraph (for potential future agent workflows)
- OpenRouter API (multi-model provider)

## API Endpoints

| Method | Path | Description |
|--------|------|-------------|
| POST | /chats | Create a new chat session |
| GET | /chats/{chat_id} | Get chat history |
| POST | /chats/{chat_id}/messages | Post a new message |
| GET | /chats/{chat_id}/stream | Stream the AI response |

(Sample `curl` requests are sketched at the end of this README.)

## Prerequisites

- Python 3.11+
- Deno
- UV (Python package manager)
- OpenRouter API key (set in `.env` file)

## Installation

1. Clone the repository
2. Set up environment variables:

   ```bash
   echo "OPENROUTER_API_KEY=your_key_here" > .env
   echo "OPENROUTER_BASE_URL=https://openrouter.ai/api/v1" >> .env
   ```

3. Install frontend dependencies:

   ```bash
   cd chatsbt
   deno install
   ```

## Running

1. Start the backend server:

   ```bash
   uv run app.py
   ```

2. Start the frontend (in another terminal):

   ```bash
   cd chatsbt
   deno run dev
   ```

The application will be available at `http://localhost:5173`.

## Configuration

Available models:

- `qwen/qwen3-235b-a22b-2507`
- `deepseek/deepseek-r1-0528`
- `moonshotai/kimi-k2`
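
The model list above is only as current as this README. As a quick sanity check (not part of the application itself), OpenRouter's public model index can be queried to confirm these identifiers are still served; the response format is not guaranteed, so the `grep` below is only a rough filter.

```bash
# Fetch OpenRouter's model list and check that the models used here still appear in it
curl -s https://openrouter.ai/api/v1/models \
  | tr ',' '\n' \
  | grep -E 'qwen3-235b|deepseek-r1-0528|kimi-k2'
```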
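
## Example Requests

A minimal sketch of exercising the backend with `curl`, assuming the Starlette server listens on port 8000 (the uvicorn default) and that messages are posted as JSON with a `content` field. Neither the port nor the request schema is documented above, so adjust these to match the actual backend.

```bash
# Create a new chat session (the response is assumed to contain a chat id)
curl -X POST http://localhost:8000/chats

# Post a message to the chat; the "content" field name is an assumption
curl -X POST http://localhost:8000/chats/<chat_id>/messages \
  -H "Content-Type: application/json" \
  -d '{"content": "Hello!"}'

# Stream the AI response; -N disables curl's output buffering
curl -N http://localhost:8000/chats/<chat_id>/stream

# Fetch the conversation history
curl http://localhost:8000/chats/<chat_id>
```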