change gpt-4-turbo maxResponse configuration template (#814)

With the latest 4.6.8 configuration file, selecting gpt-4-turbo in a conversation raises an error:
null max_tokens is too large: 62500. This model supports at most 4096 completion tokens, whereas you provided 62500. (request id: 20240202110253407344738SmDnkwX1)
The cause is that the official gpt-4-turbo model returns at most 4096 completion tokens.
This commit is contained in:
Robin Wang
2024-02-02 12:20:16 +08:00
committed by GitHub
parent 34602b25df
commit ec8e2512bc


@@ -224,7 +224,7 @@ llm模型全部合并
     "model": "gpt-4-0125-preview",
     "name": "gpt-4-turbo",
     "maxContext": 125000,
-    "maxResponse": 125000,
+    "maxResponse": 4000,
     "quoteMaxToken": 100000,
     "maxTemperature": 1.2,
    "inputPrice": 0,