
Proxy LLMs

DB-GPT can be deployed on servers with lower hardware requirements by using proxy LLMs. DB-GPT supports many proxy LLMs, including OpenAI, Azure, DeepSeek, and Ollama.

Installation and Configuration

Install DB-GPT with proxy LLM support using the uv package manager, which provides faster and more stable dependency management.

Install Dependencies

# Use uv to install dependencies needed for OpenAI proxy
uv sync --all-packages \
--extra "base" \
--extra "proxy_openai" \
--extra "rag" \
--extra "storage_chromadb" \
--extra "dbgpts"

Configure OpenAI

Edit the configs/dbgpt-proxy-openai.toml configuration file to specify your OpenAI API key:

# Model Configurations
[models]
[[models.llms]]
name = "gpt-3.5-turbo"
provider = "proxy/openai"
api_key = "your-openai-api-key"
# Optional: To use GPT-4, change the name to "gpt-4" or "gpt-4-turbo"

[[models.embeddings]]
name = "text-embedding-ada-002"
provider = "proxy/openai"
api_key = "your-openai-api-key"

Run Webserver

uv run dbgpt start webserver --config configs/dbgpt-proxy-openai.toml
Note: If you are in the China region, you can append --index-url=https://pypi.tuna.tsinghua.edu.cn/simple to the uv sync command for faster package downloads.

Visit Website

After starting the webserver, open your browser and visit http://localhost:5670
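
If you are scripting the startup (for example in CI), you can poll the port until the webserver accepts connections instead of opening a browser. A minimal sketch using only the Python standard library; 5670 is the default port shown above, and the timeout value is an arbitrary choice:

```python
# wait_for_port.py - poll a TCP port until it accepts connections.
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 60.0) -> bool:
    """Return True once host:port accepts a TCP connection, False on timeout."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # Attempt a plain TCP connect; success means the server is listening.
            with socket.create_connection((host, port), timeout=2.0):
                return True
        except OSError:
            time.sleep(1.0)  # not up yet; retry until the deadline
    return False
```

For example, `wait_for_port("localhost", 5670)` blocks until the DB-GPT webserver is reachable, then returns True.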