FastGPT/test/mocks/common/s3.ts
Archer 7506a147e6 V4.14.x (#6751)
* batch node (#6732)

* batch node

* docs: add local code quality standards and style guides for automated review

* refactor: remove enforced minimum for parallel concurrency, simplify edge handling in task runtime context, and fix loop output mapping

* feat: auto-infer and sync valueType for parallel loop input and output based on referenced array source

* fix: refactor parallelRun output type synchronization and improve sub-workflow error handling in dispatch service

* feat: enforce parallel concurrency limits and validate against workflow loop constraints

* feat: implement retry mechanism for parallel workflow tasks with usage tracking per attempt

* fix review

* perf: use function

* refactor: abstract nested node logic into useNestedNode hook and update parallelRun icon/service logic

* fix: type import

* refactor: update ParallelRunStatusEnum and i18n labels for improved status clarity

* feat: parallel run details and input/output display to chat response modal and service dispatch

* fix: config limit error

* refactor: optimize parallel run task execution, fix point accumulation, and improve error handling for sub-workflows

* fix: include totalPoints in parallel task results

* refactor: centralize nested input injection and point safety utilities for workflow dispatchers

* test: add unit tests for safePoints utility function

* refactor: update parallel workflow runtime types and clean up docstring placement in dispatch utils

* fix: include all runtime nodes in parallel execution to ensure variable reference accessibility

* refactor: update pushSubWorkflowUsage signature to use object parameter for improved consistency

---------

Co-authored-by: DigHuang <114602213+DigHuang@users.noreply.github.com>

* feat(s3): add proxy transfer mode with tokenized upload/download (#6729)

* feat(s3): add proxy transfer mode with tokenized upload/download

* wip: switch to proxy mode for upload progress

* fix: office mime types

* fix(s3): upload MIME validation, multer whitelist, API error status

- Treat AVI/MPEG mime aliases (incl. video/mp1s vs video/mpeg) as matching
- Optional allowedExtensions on multer for dataset images and localFile
- Map S3/business errors to 4xx in jsonRes where appropriate
- Align presign max size with team plan; fix dataset import size UX
- Add upload validation tests

Made-with: Cursor

* fix: show clear message when upload frequency limit is exceeded

- Reject ERROR_ENUM.uploadFileIntervalLimit from authFrequencyLimit instead of Mongo doc
- Add i18n for upload_file_interval_limit (zh-CN/en/zh-Hant)

Made-with: Cursor

* fix file token validation and upload mime checks

* fix: test

* fix(s3): treat m4a audio/mp4 and audio/x-m4a as equivalent

- Add MIME equivalence group for AAC/M4A container mismatch (mime-types vs file-type)
- Add upload validation test for minimal ftyp/M4A buffer
- Test env: keep FILE_TOKEN_KEY in vitest test.env and test/setup.ts (drop loadTestEnv file)

Made-with: Cursor

* fix(chat): align debug-area file types with the edit state; fix accept not updating under WebKit

- ChatTest: merge canvas guide nodes with chatConfig via getAppChatConfig + getGuideModule
- useChatTest: depend on serialized fileSelectConfig and chatConfig so deep changes still trigger preview updates
- useSelectFile: replace useMemoizedFn with useCallback + input key so the input is rebuilt after accept changes

Made-with: Cursor

* fix: invalid request

* feat: prompt inject (#6757)

* feat: resume chat stream (#6722)

* fix: openapi schema issue while creating openapi json

* feat: resume chat stream

* wip: chat status and read status

* feat: sync chat side bar status

* fix: allow reassignment of variables in chatTest handler

Made-with: Cursor

* feat(chat): stream resume hardening, resume modules in @fastgpt/service, stale generating cron

- Move stream resume mirror + resumeStatus into packages/service; update API imports
- chatTest: ensurePendingChatRoundItems, default responseChatItemId; zod default import for client
- useChatTest + HomeChatWindow: enableAutoResume and sync init chatGenerateStatus
- ChatContext: safe no-op defaults without provider
- Cron: clean MongoChat stuck in generating >30min; timer lock cleanStaleGeneratingChat

Made-with: Cursor

* fix(chat): address stream-resume PR review (zod/mongoose enum, legacy status, upsert, UI race)

- Zod: use z.nativeEnum(ChatGenerateStatusEnum); mongoose chatGenerateStatus enum as [0,1,2] only
- Init APIs: default missing chatGenerateStatus to done before read/unread logic
- ensurePendingChatRoundItems: unique index + upsert; rename ChatGenerateStatusEnum
- ChatBox auto-resume: guard by chatId; sidebar sync via targetChatId
- Tests: chat history/feedback APIs pass with schema fixes

Made-with: Cursor

* fix(chat): expose resume at /api/v2/chat/resume; openapi + review tidy

- Move handler from v1/stream to v2/chat/resume (pairs with v2 completions + Redis mirror)
- Update fetch, OpenAPI AIPath, comments; remove slim projects/app global chat api
- getHistoryStatus default chatGenerateStatus; team init + chatTest notes; ChatItem tweak

Made-with: Cursor

* fix(chat): fix resume JSON parse catch shadowing; drop unused resumeChatStream

Made-with: Cursor

* docs(chat): comment closed+stream mirror write path in workflow dispatch

Made-with: Cursor

* refactor: unify resumable stream mirroring

* fix: keep v1 chat completions out of resume flow

* refactor: make prepared chat rounds transactional

* fix: handle resume stream terminal errors

* fix: rerank max token

* feat(workflow): extend variable update node with Number/Boolean/Array operations (#6752)

* feat(workflow): extend variable update node with Number/Boolean/Array ops

* feat: math operator icons and refactor variable update renderers for improved layout and consistency

* chore(workflow): clean up variable update types and restore icon cleanup

* feat: add test

* fix: md_ascii_bug (#6755)

* md_ascii_bug

* perf: test

---------

Co-authored-by: archer <545436317@qq.com>

* doc

* del dataset

* perf: date auto coerce

* doc

* add test

* perf: channel setting

* doc

* fix: chat resume stream (#6759)

* refactor(api): move stream resume to /api/core/chat/resume

Relocate resume handler from pages/api/v2 to pages/api/core, update
OpenAPI paths, frontend streamResumeFetch URL, tests, and comments.

Made-with: Cursor

* fix: remove stray conflict markers; use z.nativeEnum for chatGenerateStatus

Made-with: Cursor

* fix: use enum instead of nativeEnum

* fix(chat): address resume review suggestions

* fix(chat): require sse when resuming generating chats

* revert(chat): keep chatitem dataId index non-unique

* fix: ts

* fix doc

* fix(chat): gate stream resume mirror by header (#6760)

* feat(chat): gate stream resume mirror by header

* refactor(chat): decouple resume mirror header parsing

* perf: dataset queue

* fix: multipleselect

* perf: workflow bug

* doc

* doc

* perf: deploy yml;fix: child nodes watch

* adapt embedding model defaultconfig

* install shell

* add mcp zod check

* feat: http tool zod schema

* Feat/batch UI (#6763)

* feat: aggregate parallel run results into task-specific virtual nodes and update UI to support i18n arguments for module names

* style: update workflow node card padding and table styling for improved layout consistency

* feat: implement parallel run workflow node with documentation and i18n support

* style(modal): WholeResponseModal UI and layout styling

* chore: improve chat resume UX (#6764)

* feat: improve stream resume fallback

* feat: block duplicate chat generation

* feat: polish resume unavailable recovery

* test: stabilize resume stream timeout

* fix: harden resume wait flow

* fix: get mcp tool raw schema

* style: update UI styling and layout for LLM request detail and response modals

* perf: http tool

* fix: test

* fix: http raw schema

* fix: test

* deploy yml

* deploy yml

---------

Co-authored-by: DigHuang <114602213+DigHuang@users.noreply.github.com>
Co-authored-by: Ryo <whoeverimf5@gmail.com>
Co-authored-by: YeYuheng <57035043+YYH211@users.noreply.github.com>
2026-04-17 23:28:43 +08:00


import { vi } from 'vitest';

import { createVitestStorageMock } from '../../../sdk/storage/src/testing/vitestMock';

const mockStorageByBucket = new Map<string, ReturnType<typeof createVitestStorageMock>>();

const getMockStorage = (bucketName: string) => {
  const existing = mockStorageByBucket.get(bucketName);
  if (existing) return existing;

  const storage = createVitestStorageMock({
    vi,
    bucketName,
    baseUrl: 'http://localhost:9000'
  });
  mockStorageByBucket.set(bucketName, storage);
  return storage;
};
// Create mock S3 bucket object for global use
const createMockS3Bucket = (bucketName = 'mock-bucket') => {
  // Both clients resolve to the same memoized mock storage for this bucket
  const client = getMockStorage(bucketName);
  const externalClient = getMockStorage(bucketName);

  return {
    name: bucketName,
    client,
    externalClient,
    exist: vi.fn().mockResolvedValue(true),
    delete: vi.fn().mockResolvedValue(undefined),
    putObject: vi.fn(async (key: string, body: any) => {
      await client.uploadObject({ key, body });
    }),
    getFileStream: vi.fn(async (key: string) => {
      const res = await client.downloadObject({ key });
      return res.body;
    }),
    statObject: vi.fn(async (key: string) => {
      const meta = await client.getObjectMetadata({ key });
      return {
        size: meta.contentLength ?? 0,
        etag: meta.etag ?? 'mock-etag'
      };
    }),
    // move = copy within the bucket, then delete the source object
    move: vi.fn(async ({ from, to }: { from: string; to: string }) => {
      await client.copyObjectInSelfBucket({ sourceKey: from, targetKey: to });
      await client.deleteObject({ key: from });
    }),
    copy: vi.fn(async ({ from, to }: { from: string; to: string }) => {
      await client.copyObjectInSelfBucket({ sourceKey: from, targetKey: to });
    }),
    addDeleteJob: vi.fn().mockResolvedValue(undefined),
    createPostPresignedUrl: vi.fn().mockResolvedValue({
      url: 'http://localhost:9000/mock-bucket',
      fields: { key: 'mock-key' },
      maxSize: 100 * 1024 * 1024
    }),
    createExternalUrl: vi.fn(async (key: string) => {
      const { url } = await externalClient.generatePresignedGetUrl({ key });
      return url;
    }),
    createGetPresignedUrl: vi.fn(async (key: string) => {
      const { url } = await client.generatePresignedGetUrl({ key });
      return url;
    }),
    createPublicUrl: vi.fn((key: string) => externalClient.generatePublicGetUrl({ key }).url)
  };
};
// Initialize global s3BucketMap early to prevent any real S3 connections
const mockBucket = createMockS3Bucket();
global.s3BucketMap = {
  'fastgpt-public': mockBucket,
  'fastgpt-private': mockBucket
} as any;
// Mock minio Client to prevent real connections
const createMockMinioClient = vi.hoisted(() => {
  return vi.fn().mockImplementation(() => ({
    bucketExists: vi.fn().mockResolvedValue(true),
    makeBucket: vi.fn().mockResolvedValue(undefined),
    setBucketPolicy: vi.fn().mockResolvedValue(undefined),
    copyObject: vi.fn().mockResolvedValue(undefined),
    removeObject: vi.fn().mockResolvedValue(undefined),
    putObject: vi.fn().mockResolvedValue({ etag: 'mock-etag' }),
    getFileStream: vi.fn().mockResolvedValue(null),
    statObject: vi.fn().mockResolvedValue({ size: 0, etag: 'mock-etag' }),
    presignedGetObject: vi.fn().mockResolvedValue('http://localhost:9000/mock-bucket/mock-object'),
    presignedPostPolicy: vi.fn().mockResolvedValue({
      postURL: 'http://localhost:9000/mock-bucket',
      formData: { key: 'mock-key' }
    }),
    newPostPolicy: vi.fn(() => ({
      setKey: vi.fn().mockReturnThis(),
      setBucket: vi.fn().mockReturnThis(),
      setContentType: vi.fn().mockReturnThis(),
      setContentLengthRange: vi.fn().mockReturnThis(),
      setExpires: vi.fn().mockReturnThis(),
      setUserMetaData: vi.fn().mockReturnThis()
    }))
  }));
});

vi.mock('minio', () => ({
  Client: createMockMinioClient(),
  S3Error: class S3Error extends Error {},
  CopyConditions: vi.fn()
}));
// Simplified S3 bucket class mock
const createMockBucketClass = (defaultName: string) => {
  return class MockS3Bucket {
    public name: string;
    public options: any;
    public client = getMockStorage(defaultName);
    public externalClient = getMockStorage(defaultName);

    constructor(bucket?: string, options?: any) {
      this.name = bucket || defaultName;
      this.options = options || {};
      // Re-resolve storage in case a non-default bucket name was provided
      this.client = getMockStorage(this.name);
      this.externalClient = getMockStorage(this.name);
    }

    get bucketName(): string {
      return this.name;
    }

    async exist() {
      return true;
    }

    async delete() {}

    async putObject(key: string, body: any) {
      await this.client.uploadObject({ key, body });
    }

    async getFileStream() {
      return null;
    }

    async statObject() {
      return { size: 0, etag: 'mock-etag' };
    }

    async move({ from, to }: { from: string; to: string }) {
      await this.client.copyObjectInSelfBucket({ sourceKey: from, targetKey: to });
      await this.client.deleteObject({ key: from });
    }

    async copy({ from, to }: { from: string; to: string }) {
      await this.client.copyObjectInSelfBucket({ sourceKey: from, targetKey: to });
    }

    async addDeleteJob() {}

    async createPostPresignedUrl(params: any, options?: any) {
      return {
        url: 'http://localhost:9000/mock-bucket',
        fields: { key: `mock/${params.teamId || 'test'}/${params.filename}` },
        maxSize: (options?.maxFileSize || 100) * 1024 * 1024
      };
    }

    async createExternalUrl(params: any) {
      const { url } = await this.externalClient.generatePresignedGetUrl({
        key: params.key,
        expiredSeconds: params.expires
      });
      return url;
    }

    async createGetPresignedUrl(params: any) {
      const { url } = await this.client.generatePresignedGetUrl({
        key: params.key,
        expiredSeconds: params.expires
      });
      return url;
    }

    createPublicUrl(objectKey: string) {
      return this.externalClient.generatePublicGetUrl({ key: objectKey }).url;
    }
  };
};
vi.mock('@fastgpt/service/common/s3/buckets/base', () => ({
  S3BaseBucket: createMockBucketClass('fastgpt-bucket')
}));
vi.mock('@fastgpt/service/common/s3/buckets/public', () => ({
  S3PublicBucket: createMockBucketClass('fastgpt-public')
}));
vi.mock('@fastgpt/service/common/s3/buckets/private', () => ({
  S3PrivateBucket: createMockBucketClass('fastgpt-private')
}));
// Mock S3 source modules
vi.mock('@fastgpt/service/common/s3/sources/avatar', () => ({
  getS3AvatarSource: vi.fn(() => ({
    prefix: '/avatar/',
    createUploadAvatarURL: vi.fn().mockResolvedValue({
      url: 'http://localhost:9000/mock-bucket',
      fields: { key: 'mock-key' },
      maxSize: 5 * 1024 * 1024
    }),
    createPublicUrl: vi.fn((key: string) => `http://localhost:9000/mock-bucket/${key}`),
    removeAvatarTTL: vi.fn().mockResolvedValue(undefined),
    deleteAvatar: vi.fn().mockResolvedValue(undefined),
    refreshAvatar: vi.fn().mockResolvedValue(undefined),
    copyAvatar: vi.fn().mockResolvedValue('http://localhost:9000/mock-bucket/mock-avatar')
  }))
}));

vi.mock('@fastgpt/service/common/s3/sources/dataset/index', () => ({
  getS3DatasetSource: vi.fn(() => ({
    createUploadDatasetFileURL: vi.fn().mockResolvedValue({
      url: 'http://localhost:9000/mock-bucket',
      fields: { key: 'mock-key' },
      maxSize: 500 * 1024 * 1024
    }),
    deleteDatasetFile: vi.fn().mockResolvedValue(undefined)
  })),
  S3DatasetSource: vi.fn()
}));

vi.mock('@fastgpt/service/common/s3/sources/chat/index', () => ({
  S3ChatSource: vi.fn(),
  getS3ChatSource: vi.fn(() => ({
    createUploadChatFileURL: vi.fn().mockResolvedValue({
      url: 'http://localhost:9000/mock-bucket',
      fields: { key: 'mock-key' },
      maxSize: 5 * 1024 * 1024
    }),
    deleteChatFilesByPrefix: vi.fn().mockResolvedValue(undefined),
    deleteChatFile: vi.fn().mockResolvedValue(undefined)
  }))
}));
// Mock S3 initialization
vi.mock('@fastgpt/service/common/s3', () => ({
  initS3Buckets: vi.fn(() => {
    const mockBucket = createMockS3Bucket();
    global.s3BucketMap = {
      'fastgpt-public': mockBucket,
      'fastgpt-private': mockBucket
    } as any;
  }),
  initS3MQWorker: vi.fn().mockResolvedValue(undefined)
}));

// Mock S3 MQ (Message Queue) operations
vi.mock('@fastgpt/service/common/s3/queue/delete', () => ({
  prefixDel: vi.fn().mockResolvedValue(undefined),
  addDeleteJob: vi.fn().mockResolvedValue(undefined)
}));
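
The per-bucket memoization that `getMockStorage` relies on can be sketched in isolation. This is a minimal, self-contained illustration of the pattern, not FastGPT code; the names `Storage` and `getStorage` are invented for the sketch:

```typescript
// Minimal sketch of the per-bucket memoization pattern used by getMockStorage:
// the first call for a bucket creates storage, later calls return the cached one.
type Storage = { bucketName: string; objects: Map<string, unknown> };

const storageByBucket = new Map<string, Storage>();

const getStorage = (bucketName: string): Storage => {
  const existing = storageByBucket.get(bucketName);
  if (existing) return existing;

  const storage: Storage = { bucketName, objects: new Map() };
  storageByBucket.set(bucketName, storage);
  return storage;
};

// The same bucket name always resolves to the same instance, so state written
// through `client` is visible through `externalClient` (both call getStorage).
console.log(getStorage('a') === getStorage('a')); // true
console.log(getStorage('a') === getStorage('b')); // false
```

This is why the mock bucket's `client` and `externalClient` can be created by two separate calls yet still share uploaded objects: both lookups hit the same cached entry.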