- Embedding
- Chat
- Completion
This section covers Completion, which corresponds to the package langchain/llms in langchain.
API Initialization
To use the LLM services based on Baidu Qianfan, you have to initialize these parameters. You can either set the AK and SK in environment variables or pass them as init params; see the sketch after the model list below.
Current supported models
- ERNIE-Bot-turbo (default model)
- ERNIE-Bot
- BLOOMZ-7B
- Llama-2-7b-chat
- Llama-2-13b-chat
- Llama-2-70b-chat
- Qianfan-BLOOMZ-7B-compressed
- Qianfan-Chinese-Llama-2-7B
- ChatGLM2-6B-32K
- AquilaChat-7B
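Here is a minimal sketch of both initialization options. It assumes the QianfanLLMEndpoint class from langchain_community.llms and the QIANFAN_AK / QIANFAN_SK environment variable names used by the Qianfan SDK; adjust these to the version you have installed.

```python
import os

from langchain_community.llms import QianfanLLMEndpoint

# Option 1: provide the AK/SK through environment variables
# (QIANFAN_AK / QIANFAN_SK are the names used by the Qianfan SDK).
os.environ["QIANFAN_AK"] = "your_ak"
os.environ["QIANFAN_SK"] = "your_sk"

llm = QianfanLLMEndpoint(streaming=True)
res = llm.invoke("hi")
print(res)

# Option 2: pass the credentials as init params instead
# (field names assumed from the langchain_community integration).
# llm = QianfanLLMEndpoint(qianfan_ak="your_ak", qianfan_sk="your_sk")
```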
Use different models in Qianfan
If you want to deploy your own model based on EB (ERNIE-Bot) or several open-source models, you can follow these steps:
- (Optional; skip this step if your model is included in the default models.) Deploy your model in the Qianfan Console and get your own customized deployment endpoint.
- Set up the field called endpoint in the initialization, as shown in the sketch below:
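A sketch of setting the endpoint field, assuming the same QianfanLLMEndpoint class as above; the value "eb-instant" is only a placeholder for the deployment endpoint shown in your Qianfan Console.

```python
from langchain_community.llms import QianfanLLMEndpoint

# "eb-instant" is a placeholder; use the customized deployment
# endpoint from your own Qianfan Console instead.
llm = QianfanLLMEndpoint(
    streaming=True,
    model="ERNIE-Bot-turbo",
    endpoint="eb-instant",
)
res = llm.invoke("hi")
print(res)
```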
Model Params
For now, only ERNIE-Bot and ERNIE-Bot-turbo support the model params below; we might support more models in the future. A sketch of passing them follows the list.
- temperature
- top_p
- penalty_score
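A sketch of passing these params, assuming they can be forwarded as extra keyword arguments to generate, as in the current langchain_community integration:

```python
from langchain_community.llms import QianfanLLMEndpoint

llm = QianfanLLMEndpoint(streaming=True)

# temperature, top_p and penalty_score are forwarded to the Qianfan API.
res = llm.generate(
    prompts=["hi"],
    **{"top_p": 0.4, "temperature": 0.1, "penalty_score": 1},
)
print(res.generations)
```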