
ProxyLLMs

DB-GPT can be deployed on servers with lower hardware requirements by using proxy LLMs. DB-GPT currently supports many proxy LLMs, such as OpenAI, Azure, Wenxin, Tongyi, Zhipu, and others.

Proxy model

Install dependencies

pip install -e ".[openai]"
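
To confirm that the OpenAI extra was installed correctly, a quick check (a minimal sketch; it assumes pip and Python are on your PATH) is:

# Show the installed openai package and print its version
pip show openai
python -c "import openai; print(openai.__version__)"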

Download embedding model

cd DB-GPT
mkdir models && cd models
git clone https://huggingface.co/GanymedeNil/text2vec-large-chinese
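
The text2vec-large-chinese repository stores its weight files with Git LFS. If the clone finished but the model files are only small pointer files, you can pull the real weights as sketched below (this assumes git-lfs is installed on your system):

# Fetch the actual model weights tracked by Git LFS
cd text2vec-large-chinese
git lfs install
git lfs pull
cd ..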

Configure the proxy by modifying LLM_MODEL, PROXY_SERVER_URL and PROXY_API_KEY in the .env file

# .env
LLM_MODEL=chatgpt_proxyllm
PROXY_API_KEY={your-openai-sk}
PROXY_SERVER_URL=https://api.openai.com/v1/chat/completions
# If you use gpt-4
# PROXYLLM_BACKEND=gpt-4
note

⚠️ Be careful not to overwrite the contents of the .env configuration file
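
Before starting DB-GPT, you can sanity-check that the key and endpoint configured in .env are valid by calling the proxy API directly. The sketch below assumes the OpenAI endpoint shown above; replace the placeholder with your real PROXY_API_KEY value, and note that the model name is only an example:

# Send a minimal chat request to verify the key and endpoint work
curl https://api.openai.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer {your-openai-sk}" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "hello"}]}'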