feat: support aws bedrockruntime claude3 (#1328)

* feat: support aws bedrockruntime claude3

closes #622, closes #749, closes #1300

* fix: convert to aws claude model id

* fix: Update AWS adapter to handle stream completions and calculate usage metrics

- Add handling for stream completion events from AWS in relay/adaptor/aws/main.go
- Marshal the AWS response to OpenAI format and calculate usage metrics
- Implement a custom render function for streaming events (a sketch follows below)
- Improve error handling for JSON unmarshalling and marshalling errors
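
For illustration only, here is a minimal sketch of the streaming conversion described above, assuming already-decoded events: each Claude text delta is re-marshalled into an OpenAI-style chunk and written to the client through gin's `c.Stream` render callback. The `claudeDelta`, `delta`, `choice`, and `chunk` types are simplified stand-ins, not the actual types used in relay/adaptor/aws/main.go.

```go
package aws

import (
	"encoding/json"
	"fmt"
	"io"

	"github.com/gin-gonic/gin"
)

// claudeDelta is a simplified stand-in for a decoded Claude stream event.
type claudeDelta struct {
	Text string
}

// delta, choice, and chunk loosely mirror the OpenAI chat.completion.chunk shape.
type delta struct {
	Content string `json:"content"`
}

type choice struct {
	Index int   `json:"index"`
	Delta delta `json:"delta"`
}

type chunk struct {
	Object  string   `json:"object"`
	Model   string   `json:"model"`
	Choices []choice `json:"choices"`
}

// streamToClient renders decoded AWS events as OpenAI-style SSE lines.
func streamToClient(c *gin.Context, model string, events <-chan claudeDelta) {
	c.Writer.Header().Set("Content-Type", "text/event-stream")
	c.Stream(func(w io.Writer) bool {
		ev, ok := <-events
		if !ok {
			// Upstream finished: send the terminator and stop streaming.
			fmt.Fprint(w, "data: [DONE]\n\n")
			return false
		}
		out := chunk{
			Object:  "chat.completion.chunk",
			Model:   model,
			Choices: []choice{{Index: 0, Delta: delta{Content: ev.Text}}},
		}
		data, err := json.Marshal(out)
		if err != nil {
			// A marshalling failure aborts the stream; real code would log it.
			return false
		}
		fmt.Fprintf(w, "data: %s\n\n", data)
		return true // keep the connection open for the next event
	})
}
```

Returning false from the render callback is what ends the stream, so both the [DONE] terminator and marshalling failures are handled through the same path.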

* fix: Implement AWS handler with usage tracking and error handling

- Implemented streaming response handling for AWS handler
- Set response content type to text/event-stream
- Added error handling for failed marshaling/unmarshaling
- Updated return values to include `relaymodel.ErrorWithStatusCode` and `relaymodel.Usage`
- Improved error handling and response formatting for the AWS adaptor (a minimal sketch of the handler's return contract follows)
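
As a rough, non-authoritative sketch of that return contract: the handler yields either an error-with-status or a usage record, never both. The `usage` and `errorWithStatusCode` types below are simplified stand-ins for `relaymodel.Usage` and `relaymodel.ErrorWithStatusCode`, and `invokeClaude` is a hypothetical placeholder for the actual Bedrock call.

```go
package aws

import (
	"net/http"

	"github.com/gin-gonic/gin"
)

// usage is a simplified stand-in for relaymodel.Usage.
type usage struct {
	PromptTokens     int `json:"prompt_tokens"`
	CompletionTokens int `json:"completion_tokens"`
	TotalTokens      int `json:"total_tokens"`
}

// errorWithStatusCode is a simplified stand-in for relaymodel.ErrorWithStatusCode.
type errorWithStatusCode struct {
	Message    string `json:"message"`
	StatusCode int    `json:"status_code"`
}

// wrapErr packages a failure (e.g. a marshal/unmarshal error) for the caller.
func wrapErr(err error, statusCode int) *errorWithStatusCode {
	return &errorWithStatusCode{Message: err.Error(), StatusCode: statusCode}
}

// invokeClaude is a hypothetical placeholder for the actual Bedrock invocation;
// it would send the converted request and report the completion token count.
func invokeClaude(c *gin.Context) (completionTokens int, err error) {
	return 0, nil
}

// handler sketches the contract: exactly one of the two results is non-nil,
// so the caller can either bill the recorded usage or surface the error.
func handler(c *gin.Context, promptTokens int) (*errorWithStatusCode, *usage) {
	completionTokens, err := invokeClaude(c)
	if err != nil {
		return wrapErr(err, http.StatusInternalServerError), nil
	}
	return nil, &usage{
		PromptTokens:     promptTokens,
		CompletionTokens: completionTokens,
		TotalTokens:      promptTokens + completionTokens,
	}
}
```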

* fix: Refactor AWS Adapter for Improved Model Mapping and Error Handling

* Refactor AWS adapter to improve model management
  - Replace hardcoded model list in `adapter.go` with a function to get models from `awsModelIDMap`
  - Update `GetModelList` function to return model list directly
  - Add `GetChannelName` function to get channel name from `Adaptor` object
* Improve error handling and code organization in main.go
  - Replace the switch statement with a lookup map between AWS model IDs and OpenAI model IDs (see the sketch after this list)
  - Return an error if the model is not found in the map
  - Use a single return statement instead of wrapping multiple return statements in the `awsModelID` function
  - Add a new error message for when the model is not found in the map in the `Handler` function
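
The model-mapping refactor might look roughly like the sketch below. The map direction shown (requested model name to Bedrock model ID) and the example entries are assumptions for illustration; the authoritative map and model list live in the repository's aws adaptor.

```go
package aws

import "fmt"

// awsModelIDMap maps a requested model name to its Bedrock model ID.
// The entries here are illustrative; the real adaptor defines the full list.
var awsModelIDMap = map[string]string{
	"claude-instant-1.2":       "anthropic.claude-instant-v1",
	"claude-2.1":               "anthropic.claude-v2:1",
	"claude-3-sonnet-20240229": "anthropic.claude-3-sonnet-20240229-v1:0",
}

// awsModelID resolves the requested model, returning an error instead of
// silently falling through when the model is unknown.
func awsModelID(requestModel string) (string, error) {
	if id, ok := awsModelIDMap[requestModel]; ok {
		return id, nil
	}
	return "", fmt.Errorf("model %q not found in awsModelIDMap", requestModel)
}

// GetModelList derives the supported models from the map keys, so the
// advertised list and the mapping cannot drift apart.
func GetModelList() []string {
	models := make([]string, 0, len(awsModelIDMap))
	for model := range awsModelIDMap {
		models = append(models, model)
	}
	return models
}
```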

* fix: bug fix

* chore: change variable name & package

* chore: change variable name

* perf: update config related code

---------

Co-authored-by: JustSong <songquanpeng@foxmail.com>
Author: Laisky.Cai
Date: 2024-04-20 00:40:47 +08:00
Committed by: GitHub
Parent: 1a0b039bcf
Commit: fc9a784950
22 changed files with 566 additions and 198 deletions

@@ -106,9 +106,10 @@ func RelayImageHelper(c *gin.Context, relayMode int) *relaymodel.ErrorWithStatus
     }
     defer func(ctx context.Context) {
-        if resp.StatusCode != http.StatusOK {
+        if resp != nil && resp.StatusCode != http.StatusOK {
             return
         }
         err := model.PostConsumeTokenQuota(meta.TokenId, quota)
         if err != nil {
             logger.SysError("error consuming token remain quota: " + err.Error())

@@ -4,6 +4,9 @@ import (
     "bytes"
     "encoding/json"
     "fmt"
+    "io"
+    "net/http"
     "github.com/gin-gonic/gin"
     "github.com/songquanpeng/one-api/common/logger"
     "github.com/songquanpeng/one-api/relay"
@@ -14,9 +17,6 @@ import (
     "github.com/songquanpeng/one-api/relay/channeltype"
     "github.com/songquanpeng/one-api/relay/meta"
     "github.com/songquanpeng/one-api/relay/model"
-    "io"
-    "net/http"
     "strings"
 )
 func RelayTextHelper(c *gin.Context) *model.ErrorWithStatusCode {
@@ -86,12 +86,13 @@ func RelayTextHelper(c *gin.Context) *model.ErrorWithStatusCode {
         logger.Errorf(ctx, "DoRequest failed: %s", err.Error())
         return openai.ErrorWrapper(err, "do_request_failed", http.StatusInternalServerError)
     }
-    errorHappened := (resp.StatusCode != http.StatusOK) || (meta.IsStream && resp.Header.Get("Content-Type") == "application/json")
-    if errorHappened {
-        billing.ReturnPreConsumedQuota(ctx, preConsumedQuota, meta.TokenId)
-        return RelayErrorHandler(resp)
+    if resp != nil {
+        errorHappened := (resp.StatusCode != http.StatusOK) || (meta.IsStream && resp.Header.Get("Content-Type") == "application/json")
+        if errorHappened {
+            billing.ReturnPreConsumedQuota(ctx, preConsumedQuota, meta.TokenId)
+            return RelayErrorHandler(resp)
+        }
     }
     meta.IsStream = meta.IsStream || strings.HasPrefix(resp.Header.Get("Content-Type"), "text/event-stream")
     // do response
     usage, respErr := adaptor.DoResponse(c, resp, meta)