Version: dev

Ollama

Configure DB-GPT to use Ollama for running models locally. Ollama makes it easy to run open-source models on your own machine.

Prerequisites

  • Ollama installed and running
  • DB-GPT installed with the `proxy_ollama` extra

Install Ollama

# Download from https://ollama.ai or use Homebrew:
brew install ollama

Pull models

# Pull a chat model
ollama pull deepseek-r1:1.5b

# Pull an embedding model
ollama pull bge-m3:latest

tip

Use `ollama list` to see all downloaded models.

Install DB-GPT dependencies

uv sync --all-packages \
--extra "base" \
--extra "proxy_ollama" \
--extra "rag" \
--extra "storage_chromadb" \
--extra "dbgpts"

Configuration

Edit configs/dbgpt-proxy-ollama.toml:

[models]
[[models.llms]]
name = "deepseek-r1:1.5b"
provider = "proxy/ollama"
api_base = "http://localhost:11434"
api_key = ""

[[models.embeddings]]
name = "bge-m3:latest"
provider = "proxy/ollama"
api_url = "http://localhost:11434"
api_key = ""

info

The `api_key` can be left empty for local Ollama. If Ollama runs on a different machine, update `api_base` (and `api_url` for embeddings) to point at that host.

Chat models

| Model | Pull command | Size | Notes |
| --- | --- | --- | --- |
| DeepSeek-R1 1.5B | `ollama pull deepseek-r1:1.5b` | ~1 GB | Small, fast, reasoning |
| Qwen2.5 7B | `ollama pull qwen2.5:7b` | ~4.7 GB | Good balance |
| Llama 3.1 8B | `ollama pull llama3.1:8b` | ~4.7 GB | Meta's latest |
| Mistral 7B | `ollama pull mistral:7b` | ~4.1 GB | Fast general use |
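Behind the proxy configuration above, requests ultimately reach Ollama's native HTTP API. A minimal sketch of the request shape against Ollama's `/api/chat` endpoint, using only the standard library (this is illustrative, not how DB-GPT constructs its requests internally):

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a POST request for Ollama's native /api/chat endpoint."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for a single JSON response instead of a stream
    }
    # Passing `data` makes urllib default the method to POST
    return urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_chat_request("deepseek-r1:1.5b", "Hello!")
print(req.full_url)                    # http://localhost:11434/api/chat
print(json.loads(req.data)["model"])   # deepseek-r1:1.5b
```

Sending the request with `urllib.request.urlopen(req)` requires a running Ollama instance with the model already pulled.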

Embedding models

| Model | Pull command | Notes |
| --- | --- | --- |
| bge-m3 | `ollama pull bge-m3:latest` | Multilingual |
| nomic-embed-text | `ollama pull nomic-embed-text` | English-focused |

Start the server

Make sure Ollama is running first:

# Start Ollama (if not running as a service)
ollama serve

Then start DB-GPT:

uv run dbgpt start webserver --config configs/dbgpt-proxy-ollama.toml

Troubleshooting

| Issue | Solution |
| --- | --- |
| Connection refused | Ensure Ollama is running: `ollama serve` |
| Model not found | Pull the model first: `ollama pull model-name` |
| Slow responses | Try a smaller model or ensure the GPU is being used |
| Out of memory | Use a smaller quantized model (e.g., `qwen2.5:7b-q4_0`) |
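For the connection-refused case, a quick programmatic probe can distinguish "Ollama is down" from other failures. A standalone sketch (the helper name and wording are not part of DB-GPT or Ollama); it queries Ollama's `/api/tags` endpoint, which lists downloaded models:

```python
import urllib.request

def check_ollama(base_url: str, timeout: float = 3.0) -> str:
    """Probe an Ollama endpoint and return a human-readable status string."""
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout) as resp:
            return f"ok ({resp.status})"
    # URLError is a subclass of OSError, so this also covers refused
    # connections and timeouts
    except OSError as exc:
        return f"unreachable: {exc} (is `ollama serve` running?)"

print(check_ollama("http://localhost:11434"))
```

If this prints `unreachable`, start Ollama (`ollama serve`) before launching the DB-GPT webserver.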

What's next