12 Commits

Author  SHA1  Message  Date
Vinlic  d14d062078  Release 0.0.26  2024-04-13 02:14:48 +08:00
Vinlic  1a3327cc8d  Fix: web search could not be re-triggered in multi-turn conversations  2024-04-13 02:14:28 +08:00
Vinlic科技  cfec318bd0  Merge pull request #56 from MichaelYuhe/master (docs: add deploy to Zeabur guide)  2024-04-12 15:24:08 +08:00
Yuhang  1d18ac3f6b  add deploy to Zeabur in Readme_en  2024-04-12 15:11:31 +08:00
Yuhang  b52e84bda0  add deploy to Zeabur in Readme  2024-04-12 15:10:35 +08:00
Vinlic  ee7cb9fdff  Merge branch 'master' of https://github.com/Vinlic/kimi-free-api  2024-04-12 13:17:46 +08:00
Vinlic  a12a967202  update README  2024-04-12 13:17:23 +08:00
Vinlic科技  bff5623f73  update README  2024-04-11 18:53:03 +08:00
Vinlic  2d2454b65b  update README  2024-04-11 15:03:04 +08:00
Vinlic  4642939835  update README  2024-04-11 14:28:32 +08:00
Vinlic  87593a270a  Add Render deployment  2024-04-11 14:28:16 +08:00
Vinlic  ce89c29b05  Add Render deployment  2024-04-11 14:27:27 +08:00
4 changed files with 48 additions and 8 deletions

View File

@@ -15,7 +15,7 @@
Fully compatible with the ChatGPT API.
We also maintain the following free-api projects; feel free to check them out:
StepFun (StepChat) API to API [step-free-api](https://github.com/LLM-Red-Team/step-free-api)
@@ -23,6 +23,8 @@
ZhipuAI (Zhipu Qingyan) API to API [glm-free-api](https://github.com/LLM-Red-Team/glm-free-api)
Metaso AI (metaso) API to API [metaso-free-api](https://github.com/LLM-Red-Team/metaso-free-api)
Lingxin Intelligence (Emohaa) API to API [emohaa-free-api](https://github.com/LLM-Red-Team/emohaa-free-api)
## Table of Contents
@@ -34,6 +36,7 @@ ZhipuAI (Zhipu Qingyan) API to API [glm-free-api](https://github.com/LLM-Red-Te
* [Multi-Account Access](#多账号接入)
* [Docker Deployment](#Docker部署)
* [Docker-compose Deployment](#Docker-compose部署)
* [Render Deployment](#Render部署)
* [Vercel Deployment](#Vercel部署)
* [Native Deployment](#原生部署)
* [Interface List](#接口列表)
@@ -60,6 +63,12 @@ ZhipuAI (Zhipu Qingyan) API to API [glm-free-api](https://github.com/LLM-Red-Te
https://udify.app/chat/Po0F6BMJ15q5vu2P
## Test Endpoint
This instance is deployed on [Render](#Render部署), so container recycling may make responses slow; it is for testing only, and deploying your own instance is recommended.
https://kimi-free-api-nut5.onrender.com
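To confirm the test instance responds, a minimal request sketch follows. The request body mirrors the OpenAI chat format; the `model` value and the use of a Kimi refresh_token as the Bearer token follow this project's interface conventions rather than anything shown in this diff, and AUTH_TOKEN is a placeholder you supply yourself.
```typescript
// Minimal sketch of a request against the public test instance (Node 18+).
// AUTH_TOKEN stands in for your own Kimi refresh_token.
async function pingTestInstance(): Promise<void> {
  const res = await fetch("https://kimi-free-api-nut5.onrender.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.AUTH_TOKEN}`,
    },
    body: JSON.stringify({
      model: "kimi",
      messages: [{ role: "user", content: "ping" }],
    }),
  });
  console.log(res.status, await res.json());
}

pingTestInstance();
```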
## Examples
### Identity Verification Demo
@@ -150,13 +159,38 @@ services:
- TZ=Asia/Shanghai
```
### Render Deployment
**Note: some deployment regions may be unable to reach Kimi. If the container logs show request timeouts or connection failures (Singapore has been tested and does not work), switch to another deployment region.**
**Note: container instances on free accounts stop automatically after a period of inactivity, which leads to a delay of 50 seconds or more on the next request. See [Render container keep-alive](https://github.com/LLM-Red-Team/free-api-hub/#Render%E5%AE%B9%E5%99%A8%E4%BF%9D%E6%B4%BB); a minimal ping sketch follows the steps below.**
1. Fork this project to your GitHub account.
2. Visit [Render](https://dashboard.render.com/) and sign in with your GitHub account.
3. Build your Web Service (New+ -> Build and deploy from a Git repository -> connect your forked project -> choose a deployment region -> select the Free instance type -> Create Web Service).
4. Once the build completes, copy the assigned domain and append the API path to start using the service.
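The linked keep-alive guide is not reproduced here; as a minimal sketch of the idea, a small script that pings your own deployment every few minutes is usually enough to keep the free instance from being suspended. RENDER_URL is a placeholder for the domain Render assigns you.
```typescript
// Minimal keep-alive sketch: ping your own Render deployment periodically so
// the free instance is not suspended for inactivity. RENDER_URL is a
// placeholder; point it at the domain Render assigned to your service.
const RENDER_URL = process.env.RENDER_URL ?? "https://your-service.onrender.com";

setInterval(async () => {
  try {
    const res = await fetch(RENDER_URL);
    console.log(`[keep-alive] ${new Date().toISOString()} -> HTTP ${res.status}`);
  } catch (err) {
    console.error("[keep-alive] ping failed:", err);
  }
}, 5 * 60 * 1000); // every 5 minutes
```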
### Vercel Deployment
**Note: the request timeout for Vercel free accounts is 10 seconds, while this interface usually takes longer to respond, so you may hit a 504 timeout error returned by Vercel.**
Click the button below for one-click deployment:
[![Deploy with Vercel](https://vercel.com/button)](https://vercel.com/import/project?template=https://github.com/LLM-Red-Team/kimi-free-api)
Alternatively, deploy from the command line; make sure Node.js is installed first:
```shell
# install the Vercel CLI (using the npmmirror registry)
npm i -g vercel --registry http://registry.npmmirror.com
# log in, fetch the source and deploy to production
vercel login
git clone https://github.com/LLM-Red-Team/kimi-free-api
cd kimi-free-api
vercel --prod
```
### Zeabur Deployment
**Note: container instances on free accounts may not run stably.**
[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/templates/GRFYBP)
## Native Deployment
@@ -440,4 +474,4 @@ keepalive_timeout 120;
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=LLM-Red-Team/kimi-free-api&type=Date)](https://star-history.com/#LLM-Red-Team/kimi-free-api&Date)

View File

@@ -17,6 +17,8 @@ Ali Tongyi (Qwen) API to API [qwen-free-api](https://github.com/LLM-Red-Team/qwe
ZhipuAI (Zhipu Qingyan) API to API [glm-free-api](https://github.com/LLM-Red-Team/glm-free-api)
Metaso AI (metaso) API to API [metaso-free-api](https://github.com/LLM-Red-Team/metaso-free-api)
Lingxin Intelligence (Emohaa) API to API [emohaa-free-api](https://github.com/LLM-Red-Team/emohaa-free-api)
## Table of Contents
@@ -191,6 +193,10 @@ Out of service
pm2 stop kimi-free-api
```
## Zeabur Deployment
[![Deploy on Zeabur](https://zeabur.com/button.svg)](https://zeabur.com/templates/GRFYBP)
## Interface List
Currently the OpenAI-compatible `/v1/chat/completions` interface is supported. You can use any OpenAI-compatible client, or an online service such as [dify](https://dify.ai/), to access it.
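Since the interface is OpenAI-compatible, a standard OpenAI client can be pointed at a kimi-free-api deployment. The sketch below uses the official `openai` npm package; the base URL, port, `model` string, and the use of the Kimi refresh_token as the API key are assumptions drawn from this project's usual setup, so check the interface documentation for the exact values.
```typescript
// Minimal sketch: call a kimi-free-api deployment through the official openai
// Node SDK. baseURL and KIMI_REFRESH_TOKEN are placeholders for your own setup.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "http://localhost:8000/v1",           // your kimi-free-api instance
  apiKey: process.env.KIMI_REFRESH_TOKEN ?? "",  // refresh_token sent as the Bearer token
});

async function main(): Promise<void> {
  const completion = await client.chat.completions.create({
    model: "kimi",
    messages: [{ role: "user", content: "Who are you?" }],
  });
  console.log(completion.choices[0].message.content);
}

main();
```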
@@ -425,4 +431,4 @@ Since the inference side is not in kimi-free-api, the token cannot be counted an
## Star History
[![Star History Chart](https://api.star-history.com/svg?repos=LLM-Red-Team/kimi-free-api&type=Date)](https://star-history.com/ #LLM-Red-Team/kimi-free-api&Date)

View File

@@ -1,6 +1,6 @@
{
"name": "kimi-free-api",
- "version": "0.0.25",
+ "version": "0.0.26",
"description": "Kimi Free API Server",
"type": "module",
"main": "dist/index.js",

View File

@@ -392,7 +392,7 @@ function messagesPrepare(messages: any[]) {
return _content + `${message.role || "user"}:${v["text"] || ""}\n`;
}, content);
}
- return content += `${message.role || 'user'}:${wrapUrlsToTags(message.content)}\n`;
+ return content += `${message.role || 'user'}:${message.role == 'user' ? wrapUrlsToTags(message.content) : message.content}\n`;
}, '');
logger.info("\n对话合并\n" + content);
return [
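The `-`/`+` pair above is the change behind commit 1a3327cc8d (web search could not be re-triggered in multi-turn conversations): URL wrapping is now applied only to user messages. Below is a minimal sketch of the resulting merge behavior, limited to the string-content branch shown in this hunk; the `Message` type and the standalone function signature are illustrative, and `wrapUrlsToTags` is assumed to rewrite bare URLs in user input into the tag format Kimi expects.
```typescript
// Minimal sketch of the message-merge step after the fix (string content only).
type Message = { role?: string; content: string };

function mergeMessages(
  messages: Message[],
  wrapUrlsToTags: (text: string) => string
): string {
  return messages.reduce((content, message) => {
    // Only user input gets URL wrapping; assistant output passes through
    // unchanged, so earlier replies no longer block a fresh web search.
    const body =
      message.role === "user" ? wrapUrlsToTags(message.content) : message.content;
    return content + `${message.role || "user"}:${body}\n`;
  }, "");
}
```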