# Welcome to OpenChat
OpenChat is an open-source AI chat platform built on TanStack Start, Convex, and OpenRouter. It supports both hosted usage at osschat.dev and full self-hosting.

- **Quickstart**: Get OpenChat running locally in minutes.
- **Architecture**: Understand the web, API, Convex, and workflow layers.
- **Core Features**: Learn chat, model switching, exports, and settings.
- **Self-Hosting**: Deploy with Vercel + Convex Cloud or Docker.
## What You Get
| Capability | What it means in OpenChat |
|---|---|
| Model access | 100+ models via OpenRouter plus direct provider credentials |
| Real-time sync | Conversations and sidebar state update across devices via Convex |
| Advanced chat controls | Edit, retry, fork, stop generation, and export chats |
| Search and reasoning | Optional web search and model reasoning support |
| Secure credentials | Encrypted provider secrets stored server-side |
| Self-hosting | Deploy in cloud or containers with first-party docs |
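Model access goes through OpenRouter's OpenAI-compatible chat-completions API. As a minimal sketch (the helper name and model ID are illustrative, not OpenChat's actual code), a request looks like this:

```typescript
// Sketch: build a request against OpenRouter's chat-completions endpoint.
// The endpoint URL and payload shape follow OpenRouter's OpenAI-compatible
// API; buildOpenRouterRequest and the model ID are illustrative assumptions.

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

function buildOpenRouterRequest(
  model: string,
  messages: ChatMessage[],
  apiKey: string
) {
  return {
    url: "https://openrouter.ai/api/v1/chat/completions",
    init: {
      method: "POST",
      headers: {
        Authorization: `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      // stream: true requests incremental tokens for streaming replies
      body: JSON.stringify({ model, messages, stream: true }),
    },
  };
}

// Usage:
//   const { url, init } = buildOpenRouterRequest(
//     "openai/gpt-4o-mini",
//     [{ role: "user", content: "Hello" }],
//     apiKey
//   );
//   const res = await fetch(url, init);
```

Because the API is OpenAI-compatible, switching among the 100+ available models is a matter of changing the `model` string.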
## Core Product Areas
- Chat UX: streaming replies, message actions, branching, read state.
- Model UX: searchable model selector, favorites, provider switching.
- Settings: account, providers, chat preferences, models cache, shortcuts.
- Workflows: asynchronous chat title generation, export, cleanup, and account deletion.
- Platform: Better Auth + GitHub OAuth, Convex schema/functions, Upstash limits and queues.
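The "secure credentials" capability above means provider API keys are encrypted before they are stored server-side. A minimal sketch of that idea using AES-256-GCM from Node's built-in `crypto` module (the function names and storage format are assumptions for illustration, not OpenChat's actual implementation):

```typescript
// Illustrative only: authenticated encryption of a provider secret with
// AES-256-GCM. OpenChat's real scheme may differ; this shows the concept.
import { randomBytes, createCipheriv, createDecipheriv } from "node:crypto";

function encryptSecret(plaintext: string, key: Buffer): string {
  const iv = randomBytes(12); // 96-bit nonce, standard for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const ct = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  const tag = cipher.getAuthTag(); // 16-byte authentication tag
  // Pack iv + tag + ciphertext into one base64 string for storage
  return Buffer.concat([iv, tag, ct]).toString("base64");
}

function decryptSecret(encoded: string, key: Buffer): string {
  const raw = Buffer.from(encoded, "base64");
  const iv = raw.subarray(0, 12);
  const tag = raw.subarray(12, 28);
  const ct = raw.subarray(28);
  const decipher = createDecipheriv("aes-256-gcm", key, iv);
  decipher.setAuthTag(tag); // decryption throws if the tag doesn't verify
  return Buffer.concat([decipher.update(ct), decipher.final()]).toString("utf8");
}
```

GCM's authentication tag means a tampered ciphertext fails to decrypt rather than silently yielding garbage, which is why authenticated modes are the usual choice for stored credentials.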