ChatSBT
ChatSBT is a full-stack chat application that allows users to interact with multiple AI models through a web interface. The application features real-time streaming responses, chat history persistence, and support for various language models.
Features
- Real-time chat with multiple AI models
- Server-Sent Events (SSE) for streaming responses
- Chat history persistence
- Responsive web interface built with Svelte
- Docker support for easy deployment
Technology Stack
Backend
- Python 3.11+
- Starlette - ASGI framework for building asynchronous web applications
- Langchain - Framework for developing applications with LLMs
- Masonite ORM - Database ORM for Python
- SQLite - Default database (configurable)
Frontend
- Svelte 5 - Reactive UI framework
- Vite - Next-generation frontend tooling
- DaisyUI - Tailwind CSS components
- Marked - Markdown parser
Infrastructure
- Deno - JavaScript/TypeScript runtime for frontend build
- uv - Python package installer and resolver
Prerequisites
Before you begin, ensure you have the following installed:
- Python 3.11 or higher
- Deno 2.4 or higher
- uv (Python package installer)
- Docker (optional, for containerized deployment)
Installation
Backend Setup
- Install Python dependencies using uv:

  ```bash
  uv sync
  ```

- Set up environment variables: create a `.env` file in the project root with the necessary configuration:

  ```bash
  # Database configuration
  DB_CONNECTION=sqlite
  DB_DATABASE=chatsbt.db

  # AI Provider API keys (add as needed)
  OPENAI_API_KEY=your_openai_api_key
  ANTHROPIC_API_KEY=your_anthropic_api_key
  ```
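How the application actually reads these values is not shown here; the sketch below illustrates one way to load them with Starlette's `Config` helper. The variable names match the `.env` example above; everything else (the `config.py` file name, the defaults) is illustrative.

```python
# config.py - illustrative sketch; not necessarily how ChatSBT loads its settings
from starlette.config import Config
from starlette.datastructures import Secret

# Starlette checks real environment variables first, then falls back to .env
config = Config(".env")

DB_CONNECTION = config("DB_CONNECTION", default="sqlite")
DB_DATABASE = config("DB_DATABASE", default="chatsbt.db")

# Secret keeps the keys from being printed in logs or tracebacks
OPENAI_API_KEY = config("OPENAI_API_KEY", cast=Secret, default="")
ANTHROPIC_API_KEY = config("ANTHROPIC_API_KEY", cast=Secret, default="")
```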
Frontend Setup
- Navigate to the frontend directory:

  ```bash
  cd frontend
  ```

- Install frontend dependencies:

  ```bash
  deno install --allow-scripts
  ```

- Build the frontend:

  ```bash
  deno run build
  ```
Running the Application
Development Mode
To run the application in development mode with hot reloading:
- Start the backend server:

  ```bash
  uv run app.py
  ```

  The backend will be available at http://localhost:8000

- In a separate terminal, start the frontend development server:

  ```bash
  cd frontend
  deno run dev
  ```

  The frontend will be available at http://localhost:5173
Production Mode
To run the application in production mode:
- Build the frontend:

  ```bash
  cd frontend
  deno run build
  cd ..
  ```

- Start the backend server:

  ```bash
  uv run app.py
  ```

The application will be available at http://localhost:8000
Using Docker
To run the application using Docker:
- Build the Docker image:

  ```bash
  docker build -t chatsbt .
  ```

- Run the container:

  ```bash
  docker run -p 8000:8000 chatsbt
  ```

The application will be available at http://localhost:8000
API Endpoints
The backend exposes the following RESTful API endpoints:
Models
- `GET /api/models` - Retrieve list of available AI models
  - Response: `{"models": ["model1", "model2", ...]}`
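For example, fetching the model list from Python (assuming the default local address and the `requests` library):

```python
import requests

# Ask the backend which models are available
resp = requests.get("http://localhost:8000/api/models")
resp.raise_for_status()
print(resp.json()["models"])  # e.g. ["model1", "model2", ...]
```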
Chats
- `POST /api/chats` - Create a new chat session
  - Request: `{"model": "model_name"}`
  - Response: `{"id": "chat_id", "model": "model_name"}`

- `GET /api/chats/{chat_id}` - Retrieve chat history
  - Response: `{"messages": [{"role": "human|assistant", "content": "message_text"}, ...]}`
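A short Python sketch exercising both chat endpoints (the model name is illustrative and error handling is omitted):

```python
import requests

BASE = "http://localhost:8000"

# Create a new chat session bound to a model
chat = requests.post(f"{BASE}/api/chats", json={"model": "model_name"}).json()
chat_id = chat["id"]

# Fetch the stored history for that chat
history = requests.get(f"{BASE}/api/chats/{chat_id}").json()
for msg in history["messages"]:
    print(f"{msg['role']}: {msg['content']}")
```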
Messages
- `POST /api/chats/{chat_id}/messages` - Send a message to a chat
  - Request: `{"message": "user_message", "model": "model_name"}`
  - Response: `{"status": "queued", "message_id": "message_id"}`

- `GET /api/chats/{chat_id}/stream?message_id={message_id}` - Stream AI response
  - Server-Sent Events (SSE) endpoint that streams the AI response token by token
  - Events:
    - `data: token_content` - Individual tokens from the AI response
    - `event: done` - Indicates the response is complete
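Putting the two message endpoints together, a client queues a message and then reads the SSE stream. The Python sketch below uses `requests` with a deliberately simplified SSE parser (it only looks at `data:` and `event:` lines); the chat id, message text, and model name are placeholders:

```python
import requests

BASE = "http://localhost:8000"
chat_id = "chat_id"  # obtained from POST /api/chats

# 1. Queue a message for the chat
queued = requests.post(
    f"{BASE}/api/chats/{chat_id}/messages",
    json={"message": "Hello!", "model": "model_name"},
).json()
message_id = queued["message_id"]

# 2. Stream the AI response token by token over SSE
with requests.get(
    f"{BASE}/api/chats/{chat_id}/stream",
    params={"message_id": message_id},
    stream=True,
) as resp:
    for line in resp.iter_lines(decode_unicode=True):
        if not line:
            continue  # blank lines separate SSE events
        if line.startswith("event: done"):
            break  # server signals that the response is complete
        if line.startswith("data: "):
            print(line[len("data: "):], end="", flush=True)
print()
```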