Libre WebUI Documentation

Self-hosted AI chat that works with 9+ providers, including Ollama, OpenAI, and Anthropic.

Quick Start

The fastest way to get started:

npx libre-webui

Opens at http://localhost:8080. That's it.

Requirements: Ollama for local AI, or API keys for cloud providers.
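
If you don't have a local model yet, a minimal sketch (assumes Ollama is installed from https://ollama.com; llama3.2 is just an example model):

ollama pull llama3.2
ollama list   # confirm the model is available locally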

Installation Options

  • npx: npx libre-webui (quick start, testing)
  • Docker: docker-compose up -d (production; includes Ollama)
  • Docker with external Ollama: docker-compose -f docker-compose.external-ollama.yml up -d (when Ollama is already running)
  • Kubernetes: helm install libre-webui oci://ghcr.io/libre-webui/charts/libre-webui (enterprise, scaling)
  • From source: npm install && npm run dev (development; see the sketch below)
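
For the from-source route, a slightly fuller sketch (the repository URL here is an assumption; substitute the actual repo):

git clone https://github.com/libre-webui/libre-webui.git
cd libre-webui
npm install    # install dependencies
npm run dev    # start frontend and backend in development mode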

Core Features

  • Real-time streaming chat with dark/light themes
  • Document Chat (RAG) - Upload PDFs and chat with your docs
  • Custom Personas - AI personalities with memory
  • Interactive Artifacts - Live HTML, SVG, code preview
  • Text-to-Speech - Multiple voices and providers

AI Providers

Local:

  • Ollama (full integration)

Cloud (via plugins):

  • OpenAI, Anthropic, Google, Groq, Mistral, OpenRouter, and more

Documentation

  • Getting Started
  • Deployment
  • Features
  • Administration
  • Troubleshooting

Configuration

Edit backend/.env:

# Local AI
OLLAMA_BASE_URL=http://localhost:11434

# Cloud providers (optional)
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
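
To confirm the backend can reach Ollama at the configured URL, query Ollama's /api/tags endpoint, which lists locally available models:

curl http://localhost:11434/api/tags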

Enterprise

Kroonen AI provides professional services:

  • On-premise & cloud deployment
  • SSO integration (Okta, Azure AD, SAML)
  • Custom development
  • SLA-backed support

Contact: [email protected]