fix: package plus request (#4492)

* fix plus request (#4476)

* perf: package plus request

* perf: plus request fix

* fix: doc

---------

Co-authored-by: heheer <heheer@sealos.io>
Archer
2025-04-09 23:44:14 +08:00
committed by GitHub
parent e4629a5c8c
commit c02864facc
21 changed files with 231 additions and 363 deletions


@@ -34,6 +34,94 @@ weight: 852
### Request
{{< tabs tabTotal="3" >}}
{{< tab tabName="基础请求示例" >}}
{{< markdownify >}}
```bash
curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
--header 'Authorization: Bearer fastgpt-xxxxxx' \
--header 'Content-Type: application/json' \
--data-raw '{
"chatId": "my_chatId",
"stream": false,
"detail": false,
"responseChatItemId": "my_responseChatItemId",
"variables": {
"uid": "asdfadsfasfd2323",
"name": "张三"
},
"messages": [
{
"role": "user",
"content": "导演是谁"
}
]
}'
```
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="图片/文件请求示例" >}}
{{< markdownify >}}
*`messages`有部分区别,其他参数一致。
* 目前不支持上传文件,需上传到自己的对象存储中,获取对应的文件链接。
```bash
curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
--header 'Authorization: Bearer fastgpt-xxxxxx' \
--header 'Content-Type: application/json' \
--data-raw '{
"chatId": "abcd",
"stream": false,
"messages": [
{
"role": "user",
"content": [
{
"type": "text",
"text": "导演是谁"
},
{
"type": "image_url",
"image_url": {
"url": "图片链接"
}
},
{
"type": "file_url",
"name": "文件名",
"url": "文档链接,支持 txt md html word pdf ppt csv excel"
}
]
}
]
}'
```
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="参数说明" >}}
{{< markdownify >}}
{{% alert context="info" %}}
- headers.Authorization: Bearer {{apikey}}
- chatId: string | undefined 。
-`undefined` 时(不传入),不使用 FastGpt 提供的上下文功能,完全通过传入的 messages 构建上下文。
-`非空字符串`时,意味着使用 chatId 进行对话,自动从 FastGpt 数据库取历史记录,并使用 messages 数组最后一个内容作为用户问题,其余 message 会被忽略。请自行确保 chatId 唯一长度小于250通常可以是自己系统的对话框ID。
- messages: 结构与 [GPT接口](https://platform.openai.com/docs/api-reference/chat/object) chat模式一致。
- responseChatItemId: string | undefined 。如果传入,则会将该值作为本次对话的响应消息的 IDFastGPT 会自动将该 ID 存入数据库。请确保,在当前`chatId`下,`responseChatItemId`是唯一的。
- detail: 是否返回中间值(模块状态,响应的完整结果等),`stream模式`下会通过`event`进行区分,`非stream模式`结果保存在`responseData`中。
- variables: 模块变量,一个对象,会替换模块中,输入框内容里的`{{key}}`
{{% /alert %}}
{{< /markdownify >}}
{{< /tab >}}
{{< /tabs >}}
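Below is a minimal TypeScript sketch of the basic request above. It is not an official SDK: it assumes Node 18+ (built-in `fetch`), and the base URL and API key are placeholders.

```ts
// Minimal sketch of the "basic request" curl example above.
// Assumptions: Node 18+ (global fetch); replace BASE_URL and API_KEY with your own values.
const BASE_URL = 'http://localhost:3000';
const API_KEY = 'fastgpt-xxxxxx';

async function chatCompletion(question: string) {
  const res = await fetch(`${BASE_URL}/api/v1/chat/completions`, {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${API_KEY}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      chatId: 'my_chatId', // omit to build the context purely from messages
      stream: false,
      detail: false,
      responseChatItemId: 'my_responseChatItemId',
      variables: { uid: 'asdfadsfasfd2323', name: '张三' },
      messages: [{ role: 'user', content: question }]
    })
  });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}

chatCompletion('导演是谁').then((data) => console.log(data.choices?.[0]?.message?.content));
```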
<!-- #### v2
The v1 and v2 endpoints take the same request parameters; only the request URL differs.
@@ -128,93 +216,7 @@ curl --location --request POST 'http://localhost:3000/api/v2/chat/completions' \
#### v1
{{< tabs tabTotal="3" >}}
{{< tab tabName="基础请求示例" >}}
{{< markdownify >}}
```bash
curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
--header 'Authorization: Bearer fastgpt-xxxxxx' \
--header 'Content-Type: application/json' \
--data-raw '{
"chatId": "my_chatId",
"stream": false,
"detail": false,
"responseChatItemId": "my_responseChatItemId",
"variables": {
"uid": "asdfadsfasfd2323",
"name": "张三"
},
"messages": [
{
"role": "user",
"content": "导演是谁"
}
]
}'
```
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="图片/文件请求示例" >}}
{{< markdownify >}}
* `messages`有部分区别,其他参数一致。
* 目前不支持上传文件,需上传到自己的对象存储中,获取对应的文件链接。
```bash
curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
--header 'Authorization: Bearer fastgpt-xxxxxx' \
--header 'Content-Type: application/json' \
--data-raw '{
"chatId": "abcd",
"stream": false,
"messages": [
{
"role": "user",
"content": [
{
"type": "text",
"text": "导演是谁"
},
{
"type": "image_url",
"image_url": {
"url": "图片链接"
}
},
{
"type": "file_url",
"name": "文件名",
"url": "文档链接,支持 txt md html word pdf ppt csv excel"
}
]
}
]
}'
```
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="参数说明" >}}
{{< markdownify >}}
{{% alert context="info" %}}
- headers.Authorization: Bearer {{apikey}}
- chatId: string | undefined 。
- `undefined` 时(不传入),不使用 FastGpt 提供的上下文功能,完全通过传入的 messages 构建上下文。
- `非空字符串`时,意味着使用 chatId 进行对话,自动从 FastGpt 数据库取历史记录,并使用 messages 数组最后一个内容作为用户问题,其余 message 会被忽略。请自行确保 chatId 唯一长度小于250通常可以是自己系统的对话框ID。
- messages: 结构与 [GPT接口](https://platform.openai.com/docs/api-reference/chat/object) chat模式一致。
- responseChatItemId: string | undefined 。如果传入,则会将该值作为本次对话的响应消息的 IDFastGPT 会自动将该 ID 存入数据库。请确保,在当前`chatId`下,`responseChatItemId`是唯一的。
- detail: 是否返回中间值(模块状态,响应的完整结果等),`stream模式`下会通过`event`进行区分,`非stream模式`结果保存在`responseData`中。
- variables: 模块变量,一个对象,会替换模块中,输入框内容里的`{{key}}`
{{% /alert %}}
{{< /markdownify >}}
{{< /tab >}}
{{< /tabs >}}
### Response
@@ -745,8 +747,6 @@ curl --location --request POST 'https://api.fastgpt.in/api/v1/chat/completions'
### Request example
#### v1
```bash
curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
--header 'Authorization: Bearer test-xxxxx' \
@@ -760,25 +760,8 @@ curl --location --request POST 'http://localhost:3000/api/v1/chat/completions' \
}'
```
#### v2
```bash
curl --location --request POST 'http://localhost:3000/api/v2/chat/completions' \
--header 'Authorization: Bearer test-xxxxx' \
--header 'Content-Type: application/json' \
--data-raw '{
"stream": false,
"chatId": "test",
"variables": {
"query":"你好"
}
}'
```
### Response example
#### v1
{{< tabs tabTotal="3" >}}
{{< tab tabName="detail=true,stream=false 响应" >}}
@@ -937,151 +920,6 @@ event取值
{{< /tab >}}
{{< /tabs >}}
#### v2
{{< tabs tabTotal="3" >}}
{{< tab tabName="detail=true,stream=false 响应" >}}
{{< markdownify >}}
* 插件的输出可以通过查找`responseData`中, `moduleType=pluginOutput`的元素,其`pluginOutput`是插件的输出。
* 流输出,仍可以通过`choices`进行获取。
```json
{
"responseData": [
{
"id": "bsH1ZdbYkz9iJwYa",
"nodeId": "pluginInput",
"moduleName": "workflow:template.plugin_start",
"moduleType": "pluginInput",
"runningTime": 0
},
{
"id": "zDgfqSPhbYZFHVIn",
"nodeId": "h4Gr4lJtFVQ6qI4c",
"moduleName": "AI 对话",
"moduleType": "chatNode",
"runningTime": 1.44,
"totalPoints": 0,
"model": "GPT-4o-mini",
"tokens": 34,
"inputTokens": 8,
"outputTokens": 26,
"query": "你好",
"reasoningText": "",
"historyPreview": [
{
"obj": "Human",
"value": "你好"
},
{
"obj": "AI",
"value": "你好!有什么我可以帮助你的吗?"
}
],
"contextTotalLen": 2
},
{
"id": "uLLwKKRZvufXzgF4",
"nodeId": "pluginOutput",
"moduleName": "common:core.module.template.self_output",
"moduleType": "pluginOutput",
"runningTime": 0,
"totalPoints": 0,
"pluginOutput": {
"result": "你好!有什么我可以帮助你的吗?"
}
}
],
"newVariables": {
},
"id": "test",
"model": "",
"usage": {
"prompt_tokens": 1,
"completion_tokens": 1,
"total_tokens": 1
},
"choices": [
{
"message": {
"role": "assistant",
"content": "你好!有什么我可以帮助你的吗?"
},
"finish_reason": "stop",
"index": 0
}
]
}
```
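For illustration, a small TypeScript sketch that pulls the plugin output out of a response with the shape shown above (the helper type exists only for this example):

```ts
// Sketch: extract the plugin output from a detail=true, stream=false response.
type ResponseDataItem = {
  moduleType: string;
  pluginOutput?: Record<string, unknown>;
};

function getPluginOutput(response: { responseData: ResponseDataItem[] }) {
  // The plugin output is carried by the element whose moduleType is "pluginOutput".
  const item = response.responseData.find((r) => r.moduleType === 'pluginOutput');
  return item?.pluginOutput;
}
```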
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="detail=true,stream=true 响应" >}}
{{< markdownify >}}
* 插件的输出可以通过获取`event=flowResponses`中的字符串,并将其反序列化后得到一个数组。同样的,查找 `moduleType=pluginOutput`的元素,其`pluginOutput`是插件的输出。
* 流输出,仍和对话接口一样获取。
```bash
data: {"event":"flowNodeResponse","data":"{\"id\":\"q8ablUOqHGgqLIRM\",\"nodeId\":\"pluginInput\",\"moduleName\":\"workflow:template.plugin_start\",\"moduleType\":\"pluginInput\",\"runningTime\":0}"}
data: {"event":"flowNodeStatus","data":"{\"status\":\"running\",\"name\":\"AI 对话\"}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"你好\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"有什么\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"我\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"可以\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"帮助\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"你\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"的吗\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":\"\"},\"index\":0,\"finish_reason\":null}]}"}
data: {"event":"flowNodeResponse","data":"{\"id\":\"rqlXLUap8QeiN7Kf\",\"nodeId\":\"h4Gr4lJtFVQ6qI4c\",\"moduleName\":\"AI 对话\",\"moduleType\":\"chatNode\",\"runningTime\":1.79,\"totalPoints\":0,\"model\":\"GPT-4o-mini\",\"tokens\":137,\"inputTokens\":111,\"outputTokens\":26,\"query\":\"你好\",\"reasoningText\":\"\",\"historyPreview\":[{\"obj\":\"Human\",\"value\":\"[{\\\"renderTypeList\\\":[\\\"reference\\\"],\\\"selectedTypeInde\\n\\n...[hide 174 chars]...\\n\\ncanSelectImg\\\":true,\\\"required\\\":false,\\\"value\\\":\\\"你好\\\"}]\"},{\"obj\":\"AI\",\"value\":\"你好!有什么我可以帮助你的吗?\"},{\"obj\":\"Human\",\"value\":\"你好\"},{\"obj\":\"AI\",\"value\":\"你好!有什么我可以帮助你的吗?\"}],\"contextTotalLen\":4}"}
data: {"event":"flowNodeResponse","data":"{\"id\":\"lHCpHI0MrM00HQlX\",\"nodeId\":\"pluginOutput\",\"moduleName\":\"common:core.module.template.self_output\",\"moduleType\":\"pluginOutput\",\"runningTime\":0,\"totalPoints\":0,\"pluginOutput\":{\"result\":\"你好!有什么我可以帮助你的吗?\"}}"}
data: {"event":"answer","data":"{\"id\":\"\",\"object\":\"\",\"created\":0,\"model\":\"\",\"choices\":[{\"delta\":{\"role\":\"assistant\",\"content\":null},\"index\":0,\"finish_reason\":\"stop\"}]}"}
data: {"event":"answer","data":"[DONE]"}
```
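For illustration, a TypeScript sketch that reads the SSE stream above, prints the answer text, and collects the plugin output. It assumes Node 18+ `fetch` and the `data: {"event": "...", "data": "..."}` message shape shown in the example.

```ts
// Sketch: consume the detail=true, stream=true SSE response of a plugin run.
// Assumptions: Node 18+ (global fetch); URL, API key and body are placeholders.
async function streamPluginRun(body: unknown) {
  const res = await fetch('http://localhost:3000/api/v2/chat/completions', {
    method: 'POST',
    headers: {
      Authorization: 'Bearer fastgpt-xxxxxx',
      'Content-Type': 'application/json'
    },
    body: JSON.stringify(body)
  });

  const reader = res.body!.getReader();
  const decoder = new TextDecoder();
  let buffer = '';
  let pluginOutput: unknown;

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // Each SSE message of interest is a single line starting with "data: ".
    const lines = buffer.split('\n');
    buffer = lines.pop() ?? ''; // keep the last, possibly incomplete, line
    for (const line of lines) {
      if (!line.startsWith('data: ')) continue;
      const msg = JSON.parse(line.slice('data: '.length)) as { event: string; data: string };
      if (msg.data === '[DONE]') continue; // end-of-stream marker

      if (msg.event === 'answer') {
        const chunk = JSON.parse(msg.data);
        process.stdout.write(chunk.choices?.[0]?.delta?.content ?? '');
      } else if (msg.event === 'flowNodeResponse') {
        const node = JSON.parse(msg.data);
        if (node.moduleType === 'pluginOutput') pluginOutput = node.pluginOutput;
      }
    }
  }
  return pluginOutput;
}
```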
{{< /markdownify >}}
{{< /tab >}}
{{< tab tabName="输出获取" >}}
{{< markdownify >}}
event取值
- answer: 返回给客户端的文本(最终会算作回答)
- fastAnswer: 指定回复返回给客户端的文本(最终会算作回答)
- toolCall: 执行工具
- toolParams: 工具参数
- toolResponse: 工具返回
- flowNodeStatus: 运行到的节点状态
- flowNodeResponse: 单个节点详细响应
- updateVariables: 更新变量
- error: 报错
{{< /markdownify >}}
{{< /tab >}}
{{< /tabs >}}
# Chat CRUD
{{% alert icon="🤖 " context="success" %}}


@@ -62,4 +62,5 @@ curl --location --request POST 'https://{{host}}/api/admin/initv494' \
## 🐛 Fixes
1. When searching apps/datasets, clicking a folder did not open the next level.
2. When retraining, parameters were not initialized correctly.
3. Some package/service requests were inconsistent across multiple apps.

packages/service/common/api/type.d.ts (new file)

@@ -0,0 +1,29 @@
import { APIFileItem, ApiFileReadContentResponse, FeishuServer, YuqueServer } from '@fastgpt/global/core/dataset/apiDataset';
import {
DeepRagSearchProps,
SearchDatasetDataResponse
} from '../../core/dataset/search/controller';
import { AuthOpenApiLimitProps } from '../../support/openapi/auth';
import { CreateUsageProps, ConcatUsageProps } from '@fastgpt/global/support/wallet/usage/api';
import {
GetProApiDatasetFileContentParams,
GetProApiDatasetFileListParams,
GetProApiDatasetFilePreviewUrlParams
} from '../../core/dataset/apiDataset/proApi';
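// Handlers for plus/Pro-service requests. Default HTTP-based implementations are
// registered at app startup (see initPlusRequest in this commit), so package code can
// call them without importing the request layer directly.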
declare global {
var textCensorHandler: (params: { text: string }) => Promise<{ code: number; message?: string }>;
var deepRagHandler: (data: DeepRagSearchProps) => Promise<SearchDatasetDataResponse>;
var authOpenApiHandler: (data: AuthOpenApiLimitProps) => Promise<any>;
var createUsageHandler: (data: CreateUsageProps) => Promise<void>;
var concatUsageHandler: (data: ConcatUsageProps) => Promise<void>;
// API dataset
var getProApiDatasetFileList: (data: GetProApiDatasetFileListParams) => Promise<APIFileItem[]>;
var getProApiDatasetFileContent: (
data: GetProApiDatasetFileContentParams
) => Promise<ApiFileReadContentResponse>;
var getProApiDatasetFilePreviewUrl: (
data: GetProApiDatasetFilePreviewUrlParams
) => Promise<string>;
}


@@ -1,4 +1,3 @@
export const FastGPTProUrl = process.env.PRO_URL ? `${process.env.PRO_URL}/api` : '';
export const isFastGPTMainService = !!process.env.PRO_URL;
// @ts-ignore
export const isFastGPTProService = () => !!global.systemConfig;


@@ -1,7 +1,6 @@
import { POST } from './plusRequest';
export const postTextCensor = (data: { text: string }) =>
POST<{ code?: number; message: string }>('/common/censor/check', data)
global
.textCensorHandler(data)
.then((res) => {
if (res?.code === 5000) {
return Promise.reject(res);


@@ -0,0 +1,25 @@
import { ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import { FeishuServer, YuqueServer } from '@fastgpt/global/core/dataset/apiDataset';
export enum ProApiDatasetOperationTypeEnum {
LIST = 'list',
READ = 'read',
CONTENT = 'content'
}
export type ProApiDatasetCommonParams = {
feishuServer?: FeishuServer;
yuqueServer?: YuqueServer;
};
export type GetProApiDatasetFileListParams = ProApiDatasetCommonParams & {
parentId?: ParentIdType;
};
export type GetProApiDatasetFileContentParams = ProApiDatasetCommonParams & {
apiFileId: string;
};
export type GetProApiDatasetFilePreviewUrlParams = ProApiDatasetCommonParams & {
apiFileId: string;
};


@@ -9,7 +9,6 @@ import { readRawContentByFileBuffer } from '../../common/file/read/utils';
import { parseFileExtensionFromUrl } from '@fastgpt/global/common/string/tools';
import { APIFileServer, FeishuServer, YuqueServer } from '@fastgpt/global/core/dataset/apiDataset';
import { useApiDatasetRequest } from './apiDataset/api';
import { POST } from '../../common/api/plusRequest';
export const readFileRawTextByUrl = async ({
teamId,
@@ -168,11 +167,7 @@ export const readApiServerFileContent = async ({
}
if (feishuServer || yuqueServer) {
return POST<{
title?: string;
rawText: string;
}>(`/core/dataset/systemApiDataset`, {
type: 'content',
return global.getProApiDatasetFileContent({
feishuServer,
yuqueServer,
apiFileId


@@ -24,7 +24,6 @@ import { MongoDatasetCollectionTags } from '../tag/schema';
import { readFromSecondary } from '../../../common/mongo/utils';
import { MongoDatasetDataText } from '../data/dataTextSchema';
import { ChatItemType } from '@fastgpt/global/core/chat/type';
import { POST } from '../../../common/api/plusRequest';
import { NodeInputKeyEnum } from '@fastgpt/global/core/workflow/constants';
import { datasetSearchQueryExtension } from './utils';
import type { RerankModelItemType } from '@fastgpt/global/core/ai/model.d';
@@ -850,5 +849,4 @@ export type DeepRagSearchProps = SearchDatasetDataProps & {
[NodeInputKeyEnum.datasetDeepSearchMaxTimes]?: number;
[NodeInputKeyEnum.datasetDeepSearchBg]?: string;
};
export const deepRagSearch = (data: DeepRagSearchProps) =>
POST<SearchDatasetDataResponse>('/core/dataset/deepRag', data);
export const deepRagSearch = (data: DeepRagSearchProps) => global.deepRagHandler(data);


@@ -29,9 +29,9 @@ import { InteractiveNodeResponseType } from '@fastgpt/global/core/workflow/templ
import { getFileContentFromLinks, getHistoryFileLinks } from '../../tools/readFiles';
import { parseUrlToFileType } from '@fastgpt/global/common/file/tools';
import { FlowNodeTypeEnum } from '@fastgpt/global/core/workflow/node/constant';
import { postTextCensor } from '../../../../../common/api/requestPlusApi';
import { ModelTypeEnum } from '@fastgpt/global/core/ai/model';
import { getDocumentQuotePrompt } from '@fastgpt/global/core/ai/prompt/AIChat';
import { postTextCensor } from '../../../../chat/postTextCensor';
type Response = DispatchNodeResultType<{
[NodeOutputKeyEnum.answerText]: string;


@@ -13,7 +13,6 @@ import type {
} from '@fastgpt/global/core/ai/type.d';
import { formatModelChars2Points } from '../../../../support/wallet/usage/utils';
import type { LLMModelItemType } from '@fastgpt/global/core/ai/model.d';
import { postTextCensor } from '../../../../common/api/requestPlusApi';
import { ChatCompletionRequestMessageRoleEnum } from '@fastgpt/global/core/ai/constants';
import type {
ChatDispatchProps,
@@ -51,6 +50,7 @@ import { getFileContentFromLinks, getHistoryFileLinks } from '../tools/readFiles
import { parseUrlToFileType } from '@fastgpt/global/common/file/tools';
import { i18nT } from '../../../../../web/i18n/utils';
import { ModelTypeEnum } from '@fastgpt/global/core/ai/model';
import { postTextCensor } from '../../../chat/postTextCensor';
export type ChatProps = ModuleDispatchProps<
AIChatNodeProps & {


@@ -1,7 +1,6 @@
import { ERROR_ENUM } from '@fastgpt/global/common/error/errorCode';
import { updateApiKeyUsedTime } from './tools';
import { MongoOpenApi } from './schema';
import { POST } from '../../common/api/plusRequest';
import type { OpenApiSchema } from '@fastgpt/global/support/openapi/type';
export type AuthOpenApiLimitProps = { openApi: OpenApiSchema };
@@ -17,11 +16,10 @@ export async function authOpenApiKey({ apikey }: { apikey: string }) {
}
// auth limit
// @ts-ignore
if (global.feConfigs?.isPlus) {
await POST('/support/openapi/authLimit', {
await global.authOpenApiHandler({
openApi
} as AuthOpenApiLimitProps);
});
}
updateApiKeyUsedTime(openApi._id);


@@ -1,67 +1,21 @@
import { UsageSourceEnum } from '@fastgpt/global/support/wallet/usage/constants';
import { MongoUsage } from './schema';
import { ClientSession, Types } from '../../../common/mongo';
import { ClientSession } from '../../../common/mongo';
import { addLog } from '../../../common/system/log';
import { ChatNodeUsageType } from '@fastgpt/global/support/wallet/bill/type';
import { ConcatUsageProps, CreateUsageProps } from '@fastgpt/global/support/wallet/usage/api';
import { i18nT } from '../../../../web/i18n/utils';
import { pushConcatBillTask, pushReduceTeamAiPointsTask } from './utils';
import { POST } from '../../../common/api/plusRequest';
import { isFastGPTMainService } from '../../../common/system/constants';
export async function createUsage(data: CreateUsageProps) {
try {
// In FastGPT server
if (isFastGPTMainService) {
await POST('/support/wallet/usage/createUsage', data);
} else if (global.reduceAiPointsQueue) {
// In FastGPT pro server
await MongoUsage.create(data);
pushReduceTeamAiPointsTask({ teamId: data.teamId, totalPoints: data.totalPoints });
if (data.totalPoints === 0) {
addLog.info('0 totalPoints', data);
}
}
await global.createUsageHandler(data);
} catch (error) {
addLog.error('createUsage error', error);
}
}
export async function concatUsage(data: ConcatUsageProps) {
try {
// In FastGPT server
if (isFastGPTMainService) {
await POST('/support/wallet/usage/concatUsage', data);
} else if (global.reduceAiPointsQueue) {
const {
teamId,
billId,
totalPoints = 0,
listIndex,
inputTokens = 0,
outputTokens = 0
} = data;
// billId is required and valid
if (!billId || !Types.ObjectId.isValid(billId)) return;
// In FastGPT pro server
pushConcatBillTask([
{
billId,
listIndex,
inputTokens,
outputTokens,
totalPoints
}
]);
pushReduceTeamAiPointsTask({ teamId, totalPoints });
if (data.totalPoints === 0) {
addLog.info('0 totalPoints', data);
}
}
await global.concatUsageHandler(data);
} catch (error) {
addLog.error('concatUsage error', error);
}


@@ -1,6 +1,5 @@
import { findAIModel } from '../../../core/ai/model';
import { ModelTypeEnum } from '@fastgpt/global/core/ai/model';
import { ConcatBillQueueItemType } from './type';
export const formatModelChars2Points = ({
model,
@@ -35,20 +34,3 @@ export const formatModelChars2Points = ({
totalPoints
};
};
export const pushReduceTeamAiPointsTask = ({
teamId,
totalPoints
}: {
teamId: string;
totalPoints: number;
}) => {
global.reduceAiPointsQueue.push({
teamId: String(teamId),
totalPoints
});
};
export const pushConcatBillTask = (data: ConcatBillQueueItemType[]) => {
global.concatBillQueue.push(...data);
};


@@ -97,7 +97,7 @@
"permission.des.read": "View knowledge base content",
"permission.des.write": "Ability to add and change knowledge base content",
"preview_chunk": "Preview chunks",
"preview_chunk_empty": "Unable to read the contents of the file",
"preview_chunk_empty": "File content is empty",
"preview_chunk_intro": "A total of {{total}} blocks, up to 10",
"preview_chunk_not_selected": "Click on the file on the left to preview",
"process.Auto_Index": "Automatic index generation",


@@ -97,7 +97,7 @@
"permission.des.read": "可查看知识库内容",
"permission.des.write": "可增加和变更知识库内容",
"preview_chunk": "分块预览",
"preview_chunk_empty": "无法读取该文件内容",
"preview_chunk_empty": "文件内容为空",
"preview_chunk_intro": "共 {{total}} 个分块,最多展示 10 个",
"preview_chunk_not_selected": "点击左侧文件后进行预览",
"process.Auto_Index": "自动索引生成",


@@ -97,7 +97,7 @@
"permission.des.read": "可檢視資料集內容",
"permission.des.write": "可新增和變更資料集內容",
"preview_chunk": "分塊預覽",
"preview_chunk_empty": "無法讀取該文件內容",
"preview_chunk_empty": "文件內容為空",
"preview_chunk_intro": "共 {{total}} 個分塊,最多展示 10 個",
"preview_chunk_not_selected": "點擊左側文件後進行預覽",
"process.Auto_Index": "自動索引生成",


@@ -1,8 +1,6 @@
import { getFeishuAndYuqueDatasetFileList } from '@/service/core/dataset/apiDataset/controller';
import { NextAPI } from '@/service/middleware/entry';
import { DatasetErrEnum } from '@fastgpt/global/common/error/code/dataset';
import { ParentIdType } from '@fastgpt/global/common/parentFolder/type';
import { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
import { ReadPermissionVal } from '@fastgpt/global/support/permission/constant';
import { useApiDatasetRequest } from '@fastgpt/service/core/dataset/apiDataset/api';
import { authDataset } from '@fastgpt/service/support/permission/dataset/auth';
@@ -14,8 +12,6 @@ export type GetApiDatasetFileListProps = {
datasetId: string;
};
export type GetApiDatasetFileListResponse = APIFileItem[];
async function handler(req: NextApiRequest) {
let { searchKey = '', parentId = null, datasetId } = req.body;
@@ -35,7 +31,7 @@ async function handler(req: NextApiRequest) {
return useApiDatasetRequest({ apiServer }).listFiles({ searchKey, parentId });
}
if (feishuServer || yuqueServer) {
return getFeishuAndYuqueDatasetFileList({
return global.getProApiDatasetFileList({
feishuServer,
yuqueServer,
parentId


@@ -10,7 +10,6 @@ import { DatasetErrEnum } from '@fastgpt/global/common/error/code/dataset';
import { authChatCrud, authCollectionInChat } from '@/service/support/permission/auth/chat';
import { getCollectionWithDataset } from '@fastgpt/service/core/dataset/controller';
import { useApiDatasetRequest } from '@fastgpt/service/core/dataset/apiDataset/api';
import { POST } from '@fastgpt/service/common/api/plusRequest';
export type readCollectionSourceQuery = {};
@@ -106,8 +105,7 @@ async function handler(
}
if (feishuServer || yuqueServer) {
return POST<string>(`/core/dataset/systemApiDataset`, {
type: 'read',
return global.getProApiDatasetFilePreviewUrl({
apiFileId: collection.apiFileId,
feishuServer,
yuqueServer


@@ -11,6 +11,18 @@ import { defaultGroup, defaultTemplateTypes } from '@fastgpt/web/core/workflow/c
import { MongoPluginGroups } from '@fastgpt/service/core/app/plugin/pluginGroupSchema';
import { MongoTemplateTypes } from '@fastgpt/service/core/app/templates/templateTypeSchema';
import { loadSystemModels } from '@fastgpt/service/core/ai/config/utils';
import { POST } from '@fastgpt/service/common/api/plusRequest';
import {
DeepRagSearchProps,
SearchDatasetDataResponse
} from '@fastgpt/service/core/dataset/search/controller';
import { AuthOpenApiLimitProps } from '@fastgpt/service/support/openapi/auth';
import { ConcatUsageProps, CreateUsageProps } from '@fastgpt/global/support/wallet/usage/api';
import {
getProApiDatasetFileContentRequest,
getProApiDatasetFileListRequest,
getProApiDatasetFilePreviewUrlRequest
} from '@/service/core/dataset/apiDataset/controller';
export const readConfigData = async (name: string) => {
const splitName = name.split('.');
@@ -36,12 +48,37 @@ export const readConfigData = async (name: string) => {
/* Init global variables */
export function initGlobalVariables() {
if (global.communityPlugins) return;
function initPlusRequest() {
global.textCensorHandler = function textCensorHandler({ text }: { text: string }) {
return POST<{ code: number; message?: string }>('/common/censor/check', { text });
};
global.deepRagHandler = function deepRagHandler(data: DeepRagSearchProps) {
return POST<SearchDatasetDataResponse>('/core/dataset/deepRag', data);
};
global.authOpenApiHandler = function authOpenApiHandler(data: AuthOpenApiLimitProps) {
return POST<AuthOpenApiLimitProps>('/support/openapi/authLimit', data);
};
global.createUsageHandler = function createUsageHandler(data: CreateUsageProps) {
return POST('/support/wallet/usage/createUsage', data);
};
global.concatUsageHandler = function concatUsageHandler(data: ConcatUsageProps) {
return POST('/support/wallet/usage/concatUsage', data);
};
global.getProApiDatasetFileList = getProApiDatasetFileListRequest;
global.getProApiDatasetFileContent = getProApiDatasetFileContentRequest;
global.getProApiDatasetFilePreviewUrl = getProApiDatasetFilePreviewUrlRequest;
}
global.communityPlugins = [];
global.qaQueueLen = global.qaQueueLen ?? 0;
global.vectorQueueLen = global.vectorQueueLen ?? 0;
initHttpAgent();
initPlusRequest();
}
/* Init system data(Need to connected db). It only needs to run once */


@@ -1,14 +1,35 @@
import { GetApiDatasetFileListResponse } from '@/pages/api/core/dataset/apiDataset/list';
import { FeishuServer, YuqueServer } from '@fastgpt/global/core/dataset/apiDataset';
import { APIFileItem, ApiFileReadContentResponse } from '@fastgpt/global/core/dataset/apiDataset';
import { POST } from '@fastgpt/service/common/api/plusRequest';
import {
GetProApiDatasetFileContentParams,
GetProApiDatasetFileListParams,
GetProApiDatasetFilePreviewUrlParams,
ProApiDatasetOperationTypeEnum
} from '@fastgpt/service/core/dataset/apiDataset/proApi';
export const getFeishuAndYuqueDatasetFileList = async (data: {
feishuServer?: FeishuServer;
yuqueServer?: YuqueServer;
parentId?: string;
}) => {
const res = await POST<GetApiDatasetFileListResponse>('/core/dataset/systemApiDataset', {
type: 'list',
export const getProApiDatasetFileListRequest = async (data: GetProApiDatasetFileListParams) => {
const res = await POST<APIFileItem[]>('/core/dataset/systemApiDataset', {
type: ProApiDatasetOperationTypeEnum.LIST,
...data
});
return res;
};
export const getProApiDatasetFileContentRequest = async (
data: GetProApiDatasetFileContentParams
) => {
const res = await POST<ApiFileReadContentResponse>('/core/dataset/systemApiDataset', {
type: ProApiDatasetOperationTypeEnum.CONTENT,
...data
});
return res;
};
export const getProApiDatasetFilePreviewUrlRequest = async (
data: GetProApiDatasetFilePreviewUrlParams
) => {
const res = await POST<string>('/core/dataset/systemApiDataset', {
type: ProApiDatasetOperationTypeEnum.READ,
...data
});
return res;


@@ -52,10 +52,7 @@ import type {
import type { UpdateDatasetDataProps } from '@fastgpt/global/core/dataset/controller';
import type { DatasetFolderCreateBody } from '@/pages/api/core/dataset/folder/create';
import type { PaginationProps, PaginationResponse } from '@fastgpt/web/common/fetch/type';
import type {
GetApiDatasetFileListProps,
GetApiDatasetFileListResponse
} from '@/pages/api/core/dataset/apiDataset/list';
import type { GetApiDatasetFileListProps } from '@/pages/api/core/dataset/apiDataset/list';
import type {
listExistIdQuery,
listExistIdResponse
@@ -74,6 +71,7 @@ import type {
getTrainingErrorBody,
getTrainingErrorResponse
} from '@/pages/api/core/dataset/training/getTrainingError';
import type { APIFileItem } from '@fastgpt/global/core/dataset/apiDataset';
/* ======================== dataset ======================= */
export const getDatasets = (data: GetDatasetListBody) =>
@@ -254,6 +252,6 @@ export const getCollectionSource = (data: readCollectionSourceBody) =>
/* ================== apiDataset ======================== */
export const getApiDatasetFileList = (data: GetApiDatasetFileListProps) =>
POST<GetApiDatasetFileListResponse>('/core/dataset/apiDataset/list', data);
POST<APIFileItem[]>('/core/dataset/apiDataset/list', data);
export const getApiDatasetFileListExistId = (data: listExistIdQuery) =>
GET<listExistIdResponse>('/core/dataset/apiDataset/listExistId', data);