
Ollama + Open WebUI
Self-hosted AI stack: Ollama LLM runtime with Open WebUI — a ChatGPT-like interface. Pre-wired together, zero config.
Deploy Ollama + Open WebUI in 3 Steps
Connect Your VPS
Add your server credentials to Server Compass
Select Ollama + Open WebUI
Choose from our template library
Deploy & Configure
Fill in settings and click Deploy
Learn How to Deploy Ollama + Open WebUI
Deploy Ollama + Open WebUI Yourself
Want full control? Here's how to deploy Ollama + Open WebUI yourself using Docker Compose.
Access Your VPS Terminal
Use your terminal to securely access your server. You'll need your server's IP address.
# Connect via SSH
ssh root@your-vps-ip
# Alternative with key file
ssh -i /path/to/key root@your-vps-ip
First time? Make sure Docker is installed on your VPS. Run: curl -fsSL https://get.docker.com | sh
Set Up the Deployment Folder
Initialize a project folder on your server.
# Create and navigate to project directory
mkdir -p ~/apps/ollama-webui
cd ~/apps/ollama-webui
Set Up the Stack Definition
Use this Docker Compose configuration for your deployment:
services:
  ollama:
    image: ollama/ollama:latest
    volumes:
      - ollama_data:/root/.ollama
    environment:
      - OLLAMA_KEEP_ALIVE=5m
    restart: unless-stopped
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"
    environment:
      - OLLAMA_BASE_URL=http://ollama:11434
      - WEBUI_SECRET_KEY=<your-secret-key>
      - ENABLE_SIGNUP=true
    volumes:
      - open_webui_data:/app/backend/data
    depends_on:
      - ollama
    restart: unless-stopped
volumes:
  ollama_data:
  open_webui_data:
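The Compose file above leaves WEBUI_SECRET_KEY as a placeholder; Open WebUI uses it to sign session tokens, so it should be a long random value. One simple way to generate one (assuming openssl is installed, as it is on nearly all Linux distributions):

```shell
# Generate a random 64-character hex string to use as WEBUI_SECRET_KEY
openssl rand -hex 32
```

Paste the output in place of <your-secret-key> before starting the stack.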
- PORT: Host port for Open WebUI (default: 3000)
- SECRET_KEY: Session secret key
- KEEP_ALIVE: Model keep-alive duration (default: 5m)
Start the Containers
Bring up your containers in detached mode.
# Start all services
docker compose up -d
# List running containers
docker compose ps
# Watch the logs
docker compose logs -f
Set Up Firewall Rules
Allow the application port through your server's firewall.
# Allow the application port through firewall
sudo ufw allow 3000/tcp
sudo ufw reload
# Access your app at:
# http://your-server-ip:3000
Let Server Compass do the heavy lifting.
Skip the terminal and deploy Ollama + Open WebUI with a visual interface. Configure everything with clicks, not commands.
- Beautiful interface
- One-click deploys
- Let's Encrypt SSL
- Zero downtime
- Container monitoring
- Easy rollbacks
After Deployment
After deploying Ollama + Open WebUI with Server Compass, complete these steps to finish setup:
Visit the Open WebUI URL and register the first account (becomes admin)
Open the Ollama tab in Server Compass to pull models
Start chatting — models are instantly available in the interface
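If you prefer the terminal, models can also be pulled directly through the running Ollama container; the model name below (llama3.2) is only an example, and the commands assume you are in the project directory with the stack running:

```shell
# Pull a model inside the ollama container (llama3.2 is an example name)
docker compose exec ollama ollama pull llama3.2
# List the models available locally
docker compose exec ollama ollama list
```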
Need help? Check out our documentation for detailed guides.
Ollama + Open WebUI FAQ
Common questions about self-hosting Ollama + Open WebUI
How do I deploy Ollama + Open WebUI with Server Compass?
Simply download Server Compass, connect to your VPS, and select Ollama + Open WebUI from the templates list. Fill in the required configuration and click Deploy. The entire process takes under 3 minutes.
What are the system requirements for Ollama + Open WebUI?
Ollama + Open WebUI requires a minimum of 8 GB (8192 MB) of RAM. We recommend a VPS with at least 16 GB (16384 MB) for optimal performance. Any modern Linux server with Docker support will work.
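A quick way to check a candidate server against these requirements from the shell (Linux only; reads /proc/meminfo):

```shell
# Report total RAM in MB
awk '/MemTotal/ {printf "%d MB RAM\n", $2/1024}' /proc/meminfo
# Report CPU core count
nproc
```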
Can I migrate my existing Ollama + Open WebUI data?
Yes! Server Compass provides volume mapping that allows you to import existing data. You can also use standard Ollama + Open WebUI backup and restore procedures.
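For a manual backup outside Server Compass, the named volumes from the Compose file can be archived with a throwaway container. Note that Docker prefixes volume names with the project directory name, so the `ollama-webui_` prefix below assumes the `~/apps/ollama-webui` folder from the steps above:

```shell
# Stop the stack so data files are not being written to
docker compose down
# Archive the Open WebUI data volume into the current directory
docker run --rm -v ollama-webui_open_webui_data:/data -v "$PWD":/backup \
  alpine tar czf /backup/open-webui-data.tgz -C /data .
# Bring the stack back up
docker compose up -d
```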
How do I update Ollama + Open WebUI to the latest version?
Server Compass makes updates easy. Simply click the Update button in your deployment dashboard, and the latest Ollama + Open WebUI image will be pulled and deployed with zero downtime.
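Outside Server Compass, the equivalent manual update is a pull followed by a recreate; unlike the dashboard update, this briefly restarts the containers:

```shell
# Fetch the latest images referenced in docker-compose.yml
docker compose pull
# Recreate only the containers whose image changed
docker compose up -d
# Remove superseded images to reclaim disk space
docker image prune -f
```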
Is Ollama + Open WebUI free to self-host?
Ollama + Open WebUI is open-source software. You only pay for your VPS hosting (typically $5-20/month) and optionally Server Compass ($29 one-time). No subscription fees or per-seat pricing.
Related Templates
View all Development
PocketBase
Open-source backend in a single file with realtime database, auth, and file storage

Appwrite
Open-source backend-as-a-service - self-hosted Firebase alternative

Parse Server
Open-source backend framework with dashboard

Supabase
Full Supabase self-hosted with Kong, GoTrue Auth, Realtime, and Studio
Ready to Self-Host Ollama + Open WebUI?
Download Server Compass and deploy Ollama + Open WebUI to your VPS in under 3 minutes. No Docker expertise required.
Download Server Compass