A modern, full-stack AI chat application built with TanStack AI and TanStack Start, featuring multi-provider AI support, real-time streaming, and a beautiful UI.
- 🤖 Multi-Provider AI - OpenAI, Anthropic, Google Gemini, Ollama (local)
- 💬 Real-time Streaming - Natural typing animation for AI responses (see the sketch after this list)
- 📚 Persistent History - Chat history stored in PostgreSQL
- 🔍 Full-Text Search - Search across chats (⌘K / Ctrl+K)
- 📱 Responsive Design - Mobile-first with collapsible sidebar
- 🌓 Dark Mode - System-aware theme with manual override
- 🐳 Docker Ready - Easy deployment with Docker Compose
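To give a sense of what the streaming feature involves on the client, here is a minimal, framework-free sketch that reads tokens off a streamed response as they arrive. The `/api/chat` endpoint and payload shape are assumptions for illustration; the app's real plumbing goes through TanStack AI.

```ts
// Minimal sketch: consume a streamed chat reply token-by-token.
// "/api/chat" and the request body are illustrative, not the app's actual API.
async function streamReply(prompt: string, onToken: (text: string) => void) {
  const res = await fetch("/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt }),
  });
  if (!res.ok || !res.body) throw new Error(`Request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    onToken(decoder.decode(value, { stream: true })); // append chunk to the message in the UI
  }
}
```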
| Layer | Technologies |
|---|---|
| Frontend | TanStack Start, TanStack Router, TanStack Query, React, Tailwind CSS, shadcn/ui |
| Backend | TanStack AI, Drizzle ORM, PostgreSQL |
| Runtime | Bun (recommended) or Node.js |
| Provider | Models |
|---|---|
| OpenAI | GPT-4o, GPT-4o Mini, GPT-4 Turbo, GPT-3.5 Turbo |
| Anthropic | Claude Sonnet 4.5, Claude 3.5 Sonnet, Claude 3.5 Haiku |
| Google | Gemini Pro, Gemini 2.0 Flash |
| Ollama | Llama 3 (+ any local model) |
See TanStack AI docs for more details.
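For orientation, the model selector only needs a provider-to-models mapping along these lines; the identifiers below are illustrative, not necessarily the strings used in the repo.

```ts
// Hypothetical provider registry shape (identifiers are illustrative).
type ProviderId = "openai" | "anthropic" | "google" | "ollama";

interface ModelOption {
  id: string;    // model identifier sent to the provider
  label: string; // display name in the model selector
}

const MODELS: Record<ProviderId, ModelOption[]> = {
  openai: [
    { id: "gpt-4o", label: "GPT-4o" },
    { id: "gpt-4o-mini", label: "GPT-4o Mini" },
  ],
  anthropic: [{ id: "claude-3-5-sonnet-latest", label: "Claude 3.5 Sonnet" }],
  google: [{ id: "gemini-2.0-flash", label: "Gemini 2.0 Flash" }],
  ollama: [{ id: "llama3", label: "Llama 3 (local)" }],
};
```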
- Bun 1.0+ or Node.js 18+
- PostgreSQL 14+ (or use Docker)
- At least one AI provider API key (OpenAI, Anthropic, or Gemini), or Ollama running locally
With Bun (recommended):

```bash
git clone https://github.com/rs-4/tanstack-ai-demo.git
cd tanstack-ai-demo
bun install
cp .env.example .env.local
# Edit .env.local with your DB and API keys
bun run db:push
bun run dev
```

With npm:

```bash
npm install
cp .env.example .env.local
npm run db:push
npm run dev
```

With Docker Compose:

```bash
cp .env.example .env.local
# Edit .env.local with your API keys (DATABASE_URL is auto-configured)
docker-compose up -d --build
```

Visit http://localhost:3000
```env
# Database
DATABASE_URL=postgresql://user:password@localhost:5432/chatapp

# AI Provider API Keys (at least one required)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GEMINI_API_KEY=...

# Ollama (optional)
OLLAMA_BASE_URL=http://localhost:11434

# Server (optional)
PORT=3000
```
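Since at least one provider key is required, a startup guard along these lines can fail fast with a clear message. This is a hypothetical sketch, not code from the repo; the variable names match the env file above.

```ts
// Hypothetical startup guard: ensure at least one AI provider is configured.
const providerKeys = {
  openai: process.env.OPENAI_API_KEY,
  anthropic: process.env.ANTHROPIC_API_KEY,
  gemini: process.env.GEMINI_API_KEY,
  ollama: process.env.OLLAMA_BASE_URL, // local provider needs no API key
};

const configured = Object.entries(providerKeys)
  .filter(([, value]) => Boolean(value))
  .map(([name]) => name);

if (configured.length === 0) {
  throw new Error("No AI provider configured. Set at least one key in .env.local.");
}
console.log(`Configured providers: ${configured.join(", ")}`);
```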
Run AI models locally without API keys:

```bash
# Install (macOS)
brew install ollama

# Install (Linux)
curl -fsSL https://ollama.com/install.sh | sh

# Pull and run a model
ollama pull llama3
ollama serve
```

Then select "Ollama (Local)" in the model selector.
See TanStack AI Ollama docs for more details.
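If local models don't show up in the selector, it's worth confirming Ollama is reachable from the app's environment. This small check hits Ollama's `/api/tags` endpoint, which lists the models you've pulled:

```ts
// Verify the local Ollama server is reachable and list pulled models.
const base = process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";

const res = await fetch(`${base}/api/tags`); // Ollama's "list local models" endpoint
if (!res.ok) throw new Error(`Ollama not reachable at ${base}: ${res.status}`);

const { models } = (await res.json()) as { models: { name: string }[] };
console.log(models.map((m) => m.name)); // e.g. ["llama3:latest"]
```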
```
src/
├── components/
│   ├── ui/        # shadcn/ui components
│   ├── chat/      # Chat components
│   └── ...
├── db/            # Database (Drizzle schema)
├── lib/           # Server functions & utilities
├── routes/        # TanStack Router routes
└── types/         # TypeScript types
```
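The schema in `src/db/` isn't reproduced here, but a Drizzle schema for this kind of app typically boils down to a chats table plus a messages table. A hypothetical sketch (table and column names are illustrative, not the repo's actual schema):

```ts
import { pgTable, uuid, text, timestamp } from "drizzle-orm/pg-core";

// Hypothetical schema sketch: one row per conversation.
export const chats = pgTable("chats", {
  id: uuid("id").defaultRandom().primaryKey(),
  title: text("title").notNull(),
  createdAt: timestamp("created_at").defaultNow().notNull(),
});

// One row per message, linked to its chat.
export const messages = pgTable("messages", {
  id: uuid("id").defaultRandom().primaryKey(),
  chatId: uuid("chat_id")
    .references(() => chats.id, { onDelete: "cascade" })
    .notNull(),
  role: text("role").notNull(), // "user" | "assistant"
  content: text("content").notNull(),
  createdAt: timestamp("created_at").defaultNow().notNull(),
});
```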
| Command | Description |
|---|---|
| `bun run dev` | Start dev server |
| `bun run build` | Build for production |
| `bun run db:push` | Push schema to database |
| `bun run db:studio` | Open Drizzle Studio |
| `docker-compose up -d` | Start with Docker |

Replace `bun` with `npm` if using Node.js.
Edit `src/lib/store.ts` and `src/lib/chat-actions.ts`.
- Theme colors: `src/styles.css`
- Components: `src/components/ui/`
- Fonts: `src/routes/__root.tsx`
MIT License - see LICENSE