# ChatSBT - Multi-Model Chat Application

A modern chat application supporting multiple AI models through the OpenRouter API.
## Features
- Chat with multiple AI models (Qwen, Deepseek, Kimi)
- Real-time streaming responses
- Conversation history
- Simple REST API backend
- Modern Svelte frontend
## Tech Stack

### Frontend
- Svelte
- DaisyUI (Tailwind component library)
- Vite
### Backend
- Starlette (async Python web framework)
- LangChain (LLM orchestration)
- LangGraph (for potential future agent workflows)
- OpenRouter API (multi-model provider)
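Because OpenRouter exposes an OpenAI-compatible API, the LangChain layer ultimately issues plain chat-completion requests against it. The following is a hypothetical stdlib-only sketch of such a call; the function names and payload shape follow the OpenAI chat API, and nothing here is taken from this repo's actual code:

```python
# Hypothetical sketch: calling OpenRouter's OpenAI-compatible
# chat-completions endpoint directly, without LangChain.
import json
import os
import urllib.request

OPENROUTER_BASE_URL = os.environ.get(
    "OPENROUTER_BASE_URL", "https://openrouter.ai/api/v1"
)

def build_payload(model, user_text):
    """Build an OpenAI-style chat-completion request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_text}],
        "stream": False,
    }

def complete(model, user_text):
    """POST the payload to OpenRouter and return the first reply text."""
    req = urllib.request.Request(
        f"{OPENROUTER_BASE_URL}/chat/completions",
        data=json.dumps(build_payload(model, user_text)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

In the real backend, LangChain's chat-model abstraction handles this request/response cycle (plus streaming) on the app's behalf.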
## API Endpoints
| Method | Path | Description |
|---|---|---|
| POST | /chats | Create new chat session |
| GET | /chats/{chat_id} | Get chat history |
| POST | /chats/{chat_id}/messages | Post new message |
| GET | /chats/{chat_id}/stream | Stream response from AI |
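The endpoints above can be exercised with a small client. This is a minimal stdlib sketch: only the paths come from the table; the host, port, JSON field names, and response shapes are assumptions:

```python
# Hypothetical client for the chat API. Host/port and JSON field
# names ("chat_id", "content") are assumptions, not from the repo.
import json
import urllib.request

BASE_URL = "http://localhost:8000"  # assumed backend address

def chat_url(chat_id=None, suffix=""):
    """Build the URL for a chat resource from the table above."""
    path = "/chats" if chat_id is None else f"/chats/{chat_id}{suffix}"
    return BASE_URL + path

def create_chat():
    """POST /chats -> new session; assumes a {"chat_id": ...} reply."""
    req = urllib.request.Request(chat_url(), method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["chat_id"]

def post_message(chat_id, text):
    """POST /chats/{chat_id}/messages with an assumed {"content": ...} body."""
    body = json.dumps({"content": text}).encode()
    req = urllib.request.Request(
        chat_url(chat_id, "/messages"),
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

A typical flow would be `create_chat()`, then `post_message(...)`, then reading the AI's reply from `GET /chats/{chat_id}/stream`.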
## Prerequisites

- Python 3.11+
- Deno
- uv (Python package manager)
- OpenRouter API key (set in a `.env` file)
## Installation

- Clone the repository
- Set up environment variables:

  ```sh
  echo "OPENROUTER_API_KEY=your_key_here" > .env
  echo "OPENROUTER_BASE_URL=https://openrouter.ai/api/v1" >> .env
  ```

- Install frontend dependencies:

  ```sh
  cd chatsbt
  deno install
  ```
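The backend presumably reads the two variables written to `.env` above. How it loads them is not shown in this README (it may use a package such as python-dotenv); a minimal stdlib-only loader would look like:

```python
# Hypothetical .env loader -- an assumption about how the backend
# might pick up OPENROUTER_API_KEY and OPENROUTER_BASE_URL.
import os

def load_env(path=".env"):
    """Parse simple KEY=VALUE lines into os.environ (no overwrite)."""
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line and not line.startswith("#") and "=" in line:
                key, _, value = line.partition("=")
                os.environ.setdefault(key.strip(), value.strip())
```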
## Running

- Start the backend server:

  ```sh
  uv run app.py
  ```

- Start the frontend (in another terminal):

  ```sh
  cd chatsbt
  deno run dev
  ```

The application will be available at http://localhost:5173.
## Configuration

Available models:

- `qwen/qwen3-235b-a22b-2507`
- `deepseek/deepseek-r1-0528`
- `moonshotai/kimi-k2`
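These identifiers are OpenRouter model ids. A backend commonly keeps them in a small registry mapping UI-friendly names to ids; the mapping below is an illustrative assumption, not the repo's actual configuration:

```python
# Hypothetical registry of the models listed above.
# Keys (display names) are assumptions; values are the OpenRouter ids.
MODELS = {
    "Qwen": "qwen/qwen3-235b-a22b-2507",
    "Deepseek": "deepseek/deepseek-r1-0528",
    "Kimi": "moonshotai/kimi-k2",
}

def resolve_model(name):
    """Map a display name to its OpenRouter id, defaulting to Qwen."""
    return MODELS.get(name, MODELS["Qwen"])
```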