
LiteLLM

Development · 1024MB+ RAM

LLM API proxy - unified interface to 100+ LLM providers

ai · llm · proxy · api-gateway · openai-compatible

Deploy LiteLLM in 3 Steps

1

Connect Your VPS

Add your server credentials to Server Compass

2

Select LiteLLM

Choose from our template library

3

Deploy & Configure

Fill in settings and click Deploy

No Docker knowledge required

Why Self-Host LiteLLM?

LiteLLM is a unified API proxy that lets you call 100+ LLM providers through a single OpenAI-compatible interface. Self-hosting your LLM proxy means centralizing API key management, enforcing rate limits, and tracking costs across all your AI applications — without exposing credentials to a third-party service.

Single API for OpenAI, Anthropic, Google, Azure, and 100+ providers
Centralized API key management — developers never see raw keys
Built-in spend tracking, rate limiting, and usage analytics
Automatic fallback between providers for higher reliability
OpenAI-compatible API — works with any existing OpenAI SDK integration
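Because the proxy speaks the OpenAI wire format, an existing client only needs its base URL pointed at LiteLLM. A minimal sketch using only the standard library; the port (4000), model name, and key below are placeholders for your deployment's actual values:

```python
# Sketch: an OpenAI-compatible chat completion request aimed at a
# LiteLLM proxy. URL, model name, and key are placeholders.
import json
import urllib.request

LITELLM_URL = "http://localhost:4000/v1/chat/completions"  # assumed default port

payload = {
    "model": "gpt-4o",  # must match a model_name in your LiteLLM config
    "messages": [{"role": "user", "content": "Hello from the proxy"}],
}
headers = {
    "Authorization": "Bearer sk-1234",  # a LiteLLM virtual key, not a provider key
    "Content-Type": "application/json",
}

request = urllib.request.Request(
    LITELLM_URL, data=json.dumps(payload).encode(), headers=headers
)
# Uncomment once the proxy is running:
# with urllib.request.urlopen(request) as resp:
#     print(json.loads(resp.read())["choices"][0]["message"]["content"])
print(request.full_url)
```

Swapping `model` between, say, an OpenAI and an Anthropic entry from your config requires no other client changes, which is the point of the unified interface.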

LiteLLM vs Alternatives

LiteLLM vs OpenRouter

OpenRouter is a hosted proxy with markup fees. Self-hosted LiteLLM uses your own API keys with zero intermediary costs.

LiteLLM vs Direct API calls

Direct calls scatter API keys across apps. LiteLLM centralizes management, adds fallback logic, and provides unified logging.
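That fallback logic lives in the proxy's config file rather than in application code. A sketch following the `router_settings` pattern from LiteLLM's docs; the model names are placeholders:

```yaml
# Hypothetical fallback routing: if gpt-4o errors, retry on the Claude entry.
router_settings:
  fallbacks:
    - gpt-4o: ["claude-3-5-sonnet"]
```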

LiteLLM vs Amazon Bedrock

Bedrock locks you into AWS. LiteLLM is provider-agnostic and lets you switch between any LLM without code changes.

Why Deploy LiteLLM with Server Compass?

Server Compass deploys LiteLLM with secure environment variable handling for your API keys. It sets up the proxy with persistent configuration so your model routing rules and spend limits survive container updates.

Download Server Compass

After Deployment

After deploying LiteLLM with Server Compass, complete these steps to finish setup:

1

Configure model providers in the config file

2

Test the proxy endpoint

3

Use OpenAI-compatible API calls
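For step 1, model providers are declared in LiteLLM's YAML config (commonly `config.yaml`). A sketch following the documented `model_list` pattern; the model names and environment-variable names are examples, not requirements:

```yaml
# Example model_list: two providers behind one proxy.
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY    # read from the container environment
  - model_name: claude-3-5-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
```

For step 2, a quick smoke test is to list the configured models through the OpenAI-compatible endpoint, e.g. `curl http://localhost:4000/v1/models -H "Authorization: Bearer <your-key>"` (assuming the default port 4000).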

Need help? Check out our documentation for detailed guides.

LiteLLM FAQ

Common questions about self-hosting LiteLLM

How do I deploy LiteLLM with Server Compass?

Simply download Server Compass, connect to your VPS, and select LiteLLM from the templates list. Fill in the required configuration and click Deploy. The entire process takes under 3 minutes.

What are the system requirements for LiteLLM?

LiteLLM requires a minimum of 1024MB RAM. We recommend a VPS with at least 2048MB RAM for optimal performance. Any modern Linux server with Docker support will work.

Can I migrate my existing LiteLLM data?

Yes! Server Compass provides volume mapping that allows you to import existing data. You can also use standard LiteLLM backup and restore procedures.

How do I update LiteLLM to the latest version?

Server Compass makes updates easy. Simply click the Update button in your deployment dashboard, and the latest LiteLLM image will be pulled and deployed with zero downtime.

Is LiteLLM free to self-host?

LiteLLM is open-source software. You only pay for your VPS hosting (typically $5-20/month) and optionally Server Compass ($29 one-time). No subscription fees or per-seat pricing.

Ready to Self-Host LiteLLM?

Download Server Compass and deploy LiteLLM to your VPS in under 3 minutes. No Docker expertise required.

Download Server Compass