# Proxy LLMs
Through proxy LLMs, DB-GPT can be deployed on servers with lower hardware requirements, since model inference runs on a remote service. DB-GPT supports many proxy LLMs, such as OpenAI, Azure, DeepSeek, Ollama, and more.
## Installation and Configuration
Installing DB-GPT with proxy LLM support requires the `uv` package manager, which provides faster and more stable dependency management.
The supported proxy providers include:

- OpenAI
- Azure
- DeepSeek
- Ollama
- Qwen
- ChatGLM
- WenXin

The steps below use OpenAI as an example.
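Other providers follow the same configuration shape as the OpenAI example shown later on this page. For instance, a DeepSeek setup might look roughly like the fragment below; note that the `proxy/deepseek` provider string and the model name are assumptions based on the OpenAI example, so check the configuration files shipped with DB-GPT for the exact values:

```toml
# Hypothetical fragment for a DeepSeek proxy (names are assumptions)
[models]
[[models.llms]]
name = "deepseek-chat"
provider = "proxy/deepseek"
api_key = "your-deepseek-api-key"
```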
### Install Dependencies
```bash
# Use uv to install dependencies needed for the OpenAI proxy
uv sync --all-packages \
  --extra "base" \
  --extra "proxy_openai" \
  --extra "rag" \
  --extra "storage_chromadb" \
  --extra "dbgpts"
```
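The only provider-specific extra in that command is `proxy_openai`; the remaining extras are shared infrastructure. A minimal sketch of how a list of extras maps onto the `uv sync` flags, which makes it easy to see what changes when you target a different provider (the helper function is illustrative, not a DB-GPT API):

```python
# Build a `uv sync` command line from a list of optional extras.
# Swapping "proxy_openai" for another provider's extra (if available)
# would select a different proxy backend.
def uv_sync_command(extras):
    cmd = ["uv", "sync", "--all-packages"]
    for extra in extras:
        cmd += ["--extra", extra]
    return " ".join(cmd)

extras = ["base", "proxy_openai", "rag", "storage_chromadb", "dbgpts"]
print(uv_sync_command(extras))
```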
### Configure OpenAI
Edit the `configs/dbgpt-proxy-openai.toml` configuration file to specify your OpenAI API key:
```toml
# Model Configurations
[models]
[[models.llms]]
name = "gpt-3.5-turbo"
provider = "proxy/openai"
api_key = "your-openai-api-key"
# Optional: to use GPT-4, change the name to "gpt-4" or "gpt-4-turbo"

[[models.embeddings]]
name = "text-embedding-ada-002"
provider = "proxy/openai"
api_key = "your-openai-api-key"
```
### Run Webserver
```bash
uv run dbgpt start webserver --config configs/dbgpt-proxy-openai.toml
```