mirror of https://github.com/labring/FastGPT.git, synced 2025-07-23 13:03:50 +00:00
change gpt-4-turbo maxResponse configuration template (#814)
With the latest config template (4.6.8), selecting gpt-4-turbo in a chat returns the error: "null max_tokens is too large: 62500. This model supports at most 4096 completion tokens, whereas you provided 62500." (request id: 20240202110253407344738SmDnkwX1). The cause is that the official gpt-4-turbo model returns at most 4096 completion tokens, so maxResponse must not exceed that limit.
@@ -224,7 +224,7 @@ llm模型全部合并
       "model": "gpt-4-0125-preview",
       "name": "gpt-4-turbo",
       "maxContext": 125000,
-      "maxResponse": 125000,
+      "maxResponse": 4000,
       "quoteMaxToken": 100000,
       "maxTemperature": 1.2,
       "inputPrice": 0,
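For context, a minimal sketch of how the corrected llmModels entry might look after this change, using only the fields visible in this diff (the full 4.6.8 template contains additional fields that are omitted here). The key point is that maxResponse is lowered to 4000 so it stays under the 4096 completion-token cap of gpt-4-0125-preview, while maxContext keeps the model's large context window:

  {
    "model": "gpt-4-0125-preview",
    "name": "gpt-4-turbo",
    "maxContext": 125000,
    "maxResponse": 4000,
    "quoteMaxToken": 100000,
    "maxTemperature": 1.2,
    "inputPrice": 0
  }

Here maxContext/quoteMaxToken bound the prompt side (context and quoted reference tokens), whereas maxResponse is passed through as the completion limit, which is why it is the only value that has to respect the 4096-token ceiling.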