hncboy / ai-beehive
AI Beehive: built with Java on Spring Boot 3 and JDK 17; supported features include ChatGPT, OpenAI Image, Midjourney, NewBing, ERNIE Bot (文心一言), and more
Home Page: https://front.aibeehive.icu/
License: Apache License 2.0
My issue is the same as #6 from @heisexiong; here is a screenshot:
My configuration is as follows:
server {
    listen 80;
    server_name 127.0.0.1;
    index index.html;
    proxy_buffering off;
    root /opt/software/front/www/dist;

    location /api/ {
        proxy_pass http://localhost:3002/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        add_header X-Cache $upstream_cache_status;
    }

    location ~ ^/(\.user.ini|\.htaccess|\.git|\.env|\.svn|\.project|LICENSE|README.md) {
        return 404;
    }
}

server {
    listen 8080;
    server_name 127.0.0.1;
    index index.html;
    root /opt/software/front/admin/dist;

    location /api/ {
        proxy_pass http://localhost:3002/admin/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        add_header X-Cache $upstream_cache_status;
    }

    location ~ ^/(\.user.ini|\.htaccess|\.git|\.env|\.svn|\.project|LICENSE|README.md) {
        return 404;
    }
}
As the title says: is there some configuration I am missing, or what else could be causing this?
# Maximum number of requests in the global window
maxRequest: 60
# Global window length (seconds)
maxRequestSecond: 1800
# Maximum number of requests per IP in the window
ipMaxRequest: 5
# Per-IP window length (seconds)
ipMaxRequestSecond: 3600
# Limit on the number of context messages in a conversation
limitQuestionContextCount: 3
For the rate-limit rules above, what is the relationship between the global limit, the per-IP limit, and the context-message limit? Thanks.
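As I read the settings above, the global and per-IP limits look like two independent fixed-window counters that a request must pass at the same time, while limitQuestionContextCount only bounds how many earlier messages are attached as context and is unrelated to the request counters. A minimal in-memory sketch of that interpretation (class and method names are my own, not the project's):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

public class RateLimitSketch {

    // Fixed-window counter: allows at most `max` hits per window.
    static final class Window {
        final int max;
        final long windowMillis;
        long windowStart;
        int count;

        Window(int max, long windowSeconds) {
            this.max = max;
            this.windowMillis = windowSeconds * 1000L;
        }

        synchronized boolean tryAcquire(long now) {
            if (now - windowStart >= windowMillis) { // window expired: start a new one
                windowStart = now;
                count = 0;
            }
            return ++count <= max;
        }
    }

    private final Window global = new Window(60, 1800);          // maxRequest / maxRequestSecond
    private final Map<String, Window> perIp = new ConcurrentHashMap<>();

    // A request passes only if BOTH the global and the per-IP window permit it.
    // Non-short-circuit `&` so both counters are charged on every request.
    public boolean allow(String ip, long now) {
        Window ipWindow = perIp.computeIfAbsent(ip, k -> new Window(5, 3600)); // ipMaxRequest / ipMaxRequestSecond
        return global.tryAcquire(now) & ipWindow.tryAcquire(now);
    }
}
```

With these numbers, a single IP is cut off after 5 requests per hour even though the global budget of 60 per 30 minutes is far from exhausted.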
As the title says: when will Azure OpenAI be supported?
Is the Swagger API documentation currently unusable? When will it be updated?
Dear developers,
Are there plans to develop
User side: text-to-image / image restoration (via APIs such as Replicate and Stable Diffusion)
Admin side: configuration management
and similar features?
The api_reverse_proxy parameter is set to https://chat.openai.com/backend-api/conversation, and the accessToken was taken from the browser on the https://chat.openai.com/ page. Is this approach no longer usable? When I tried, the API calls did not go through; I see some other projects also require configuring a cookie.
> Are you sure that is how you configured it? It looks to me like your nginx is misconfigured and is not proxying /api/. You started nginx locally and accessed the local address, but nginx is not meant to be used only locally.

Right, that block was proxying the front-end address; putting the configuration into the nginx server block for the back-end address fixed it. Thanks!
Originally posted by @lovewe in #68 (comment)
As the title says: why does this happen?
"error": {
"message": "This model's maximum context length is 8192 tokens. However, you requested 8192 tokens (58 in the messages, 8134 in the completion). Please reduce the length of the messages or completion.",
"type": "invalid_request_error",
"param": "messages",
"code": "context_length_exceeded"
}
This appears after deploying with Docker; what is the problem?
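For context, the numbers in the error are additive: the prompt tokens plus the requested completion budget (max_tokens) may not exceed the model's context window, and here 58 + 8134 = 8192 already fills the whole window. A client-side sketch of clamping max_tokens before sending the request (the names and the headroom value are my own, not the project's code):

```java
public class TokenBudget {

    // Clamp the completion budget so prompt + completion stays within the model's
    // context window, keeping a small safety margin.
    public static int completionBudget(int contextLength, int promptTokens, int requestedMaxTokens) {
        int headroom = 16; // safety margin; an arbitrary choice for this sketch
        int available = contextLength - promptTokens - headroom;
        return Math.max(0, Math.min(requestedMaxTokens, available));
    }
}
```

For the request in the error above, `completionBudget(8192, 58, 8134)` would shrink the completion budget enough to keep the request inside the window.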
So far I have only obtained the back-end code and run it locally.
I then wanted to debug with Swagger, but opening Swagger returns: {"code":401,"data":null,"message":"未提供 token"} ("no token provided").
It should be http://127.0.0.1:3002/swagger-ui/index.html, right?
I happen to be the author of the rate-limiting feature in the original branch; I would like to try the Java version too.
Compared with the Node service, the context-conversation ability is much worse. Is it a configuration issue or an issue with the API request logic?
For reference:
https://github.com/Chanzhaoyu/chatgpt-web
java.net.ConnectException: Failed to connect to api.openai.com/[2a03:2880:f126:83:face:b00c:0:25de]:443
    at okhttp3.internal.connection.RealConnection.connectSocket(RealConnection.kt:297) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.RealConnection.connect(RealConnection.kt:207) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.ExchangeFinder.findConnection(ExchangeFinder.kt:226) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.ExchangeFinder.findHealthyConnection(ExchangeFinder.kt:106) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.ExchangeFinder.find(ExchangeFinder.kt:74) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.RealCall.initExchange$okhttp(RealCall.kt:255) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.kt:32) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.kt:95) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.kt:83) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:76) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
    at com.unfbx.chatgpt.interceptor.HeaderAuthorizationInterceptor.intercept(HeaderAuthorizationInterceptor.java:51) ~[chatgpt-java-1.0.11.jar:na]
    at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201) ~[okhttp-4.10.0.jar:na]
    at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517) ~[okhttp-4.10.0.jar:na]
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[na:na]
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[na:na]
    at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
    Suppressed: java.net.ConnectException: Failed to connect to api.openai.com/174.37.154.236:443
        ... 21 common frames omitted
Caused by: java.net.ConnectException: Connection timed out: no further information
    at java.base/sun.nio.ch.Net.pollConnect(Native Method)
    at java.base/sun.nio.ch.Net.pollConnectNow(Net.java:672)
    at java.base/sun.nio.ch.NioSocketImpl.timedFinishConnect(NioSocketImpl.java:549)
    at java.base/sun.nio.ch.NioSocketImpl.connect(NioSocketImpl.java:597)
    at java.base/java.net.SocksSocketImpl.connect(SocksSocketImpl.java:327)
    at java.base/java.net.Socket.connect(Socket.java:633)
    at okhttp3.internal.platform.Platform.connectSocket(Platform.kt:128)
    at okhttp3.internal.connection.RealConnection.connectSocket(RealConnection.kt:295)
    ... 20 common frames omitted
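A connect timeout to api.openai.com usually means the server has no direct network route to OpenAI, in which case traffic has to go through a proxy. The project itself uses OkHttp; purely to illustrate the idea with the JDK's own HTTP client (the proxy host and port below are placeholders I made up):

```java
import java.net.InetSocketAddress;
import java.net.ProxySelector;
import java.net.http.HttpClient;
import java.time.Duration;

public class ProxyClientSketch {

    // Build an HTTP client that routes all requests through a forward proxy,
    // so outbound calls to api.openai.com can leave a restricted network.
    public static HttpClient withProxy(String host, int port) {
        return HttpClient.newBuilder()
                .proxy(ProxySelector.of(new InetSocketAddress(host, port)))
                .connectTimeout(Duration.ofSeconds(10))
                .build();
    }
}
```

With OkHttp the equivalent would be `OkHttpClient.Builder.proxy(...)`; the key point is the same, routing the client through a reachable proxy instead of connecting directly.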
@Service("FrontChatRoomServiceImpl")
public class ChatRoomServiceImpl extends ServiceImpl<ChatRoomMapper, ChatRoomDO> implements ChatRoomService {

    @Override
    public ChatRoomDO createChatRoom(ChatMessageDO chatMessageDO) {
        ChatRoomDO chatRoom = new ChatRoomDO();
        chatRoom.setId(IdWorker.getId());
        chatRoom.setApiType(chatMessageDO.getApiType());
        chatRoom.setIp(WebUtil.getIp());
        chatRoom.setFirstChatMessageId(chatMessageDO.getId());
        chatRoom.setFirstMessageId(chatMessageDO.getMessageId());
        // Use part of the content as the title
        chatRoom.setTitle(StrUtil.sub(chatMessageDO.getContent(), 0, 50));
        chatRoom.setCreateTime(new Date());
        chatRoom.setUpdateTime(new Date());
        save(chatRoom);
        return chatRoom;
    }
}
But your test environment does not report this error, and I do not know why.
Limiting by user email would be effective, and the registration verification code may need to be strengthened later.
Will a user system be added later, for per-user cloud sync of conversations and conversation isolation?
When nginx reverse-proxies the Java API, the front end gets a 404 error when calling /api/session. The error does not occur when the project is started with Node, only with the packaged build.
[Problem description]
docker-compose up -d
After starting with this command, startup fails with an error saying it cannot connect to the database. Should the IP address be localhost, or some other address?
2023-06-16T16:44:24.851+08:00 WARN 87648 --- [ncentcs.com/...] c.h.c.f.a.l.ParsedEventSourceListener : Message send failed; messages received so far: ; response body: {
    "error": {
        "message": "",
        "type": "invalid_request_error",
        "param": null,
        "code": "invalid_api_key"
    }
}
, exception stack:
The way sensitive_word_base64.txt is read should be changed: when deployed with Docker the file sits inside the jar and cannot be read, or else the file would have to be extracted from the jar during docker build.
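Reading the file through the classpath instead of as a java.io.File works both when it sits in a directory on disk and when it is packaged inside the jar. A minimal sketch of the pattern (the helper name is my own; the resource path comes from the issue):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class ClasspathResourceSketch {

    // Read a text resource from the classpath; works whether the file is an
    // unpacked file on disk or an entry inside the packaged jar.
    // Returns null if the resource is absent or unreadable.
    public static String readText(String path) {
        try (InputStream in = ClasspathResourceSketch.class.getClassLoader().getResourceAsStream(path)) {
            if (in == null) {
                return null; // resource not on the classpath
            }
            return new String(in.readAllBytes(), StandardCharsets.UTF_8);
        } catch (IOException e) {
            return null;
        }
    }
}
```

Calling `readText("sensitive_word_base64.txt")` would then succeed under Docker without extracting the file from the jar, assuming the file is kept under src/main/resources.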
java.lang.IllegalStateException: ResponseBodyEmitter has already completed
at org.springframework.util.Assert.state(Assert.java:97)
at org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitter.send(ResponseBodyEmitter.java:196)
at org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitter.send(ResponseBodyEmitter.java:184)
at com.hncboy.chatgpt.api.listener.ResponseBodyEmitterStreamListener.onError(ResponseBodyEmitterStreamListener.java:49)
at com.hncboy.chatgpt.api.listener.ParsedEventSourceListener.onFailure(ParsedEventSourceListener.java:185)
at okhttp3.internal.sse.RealEventSource.processResponse(RealEventSource.kt:78)
at okhttp3.internal.sse.RealEventSource.onResponse(RealEventSource.kt:46)
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
org.apache.catalina.connector.ClientAbortException: java.io.IOException: Broken pipe
at org.apache.catalina.connector.OutputBuffer.doFlush(OutputBuffer.java:309)
at org.apache.catalina.connector.OutputBuffer.flush(OutputBuffer.java:271)
at org.apache.catalina.connector.CoyoteOutputStream.flush(CoyoteOutputStream.java:118)
at sun.nio.cs.StreamEncoder.implFlush(StreamEncoder.java:297)
at sun.nio.cs.StreamEncoder.flush(StreamEncoder.java:141)
at java.io.OutputStreamWriter.flush(OutputStreamWriter.java:229)
at org.springframework.util.StreamUtils.copy(StreamUtils.java:148)
at org.springframework.http.converter.StringHttpMessageConverter.writeInternal(StringHttpMessageConverter.java:126)
at org.springframework.http.converter.StringHttpMessageConverter.writeInternal(StringHttpMessageConverter.java:44)
at org.springframework.http.converter.AbstractHttpMessageConverter.write(AbstractHttpMessageConverter.java:227)
at org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitterReturnValueHandler$HttpMessageConvertingHandler.sendInternal(ResponseBodyEmitterReturnValueHandler.java:212)
at org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitterReturnValueHandler$HttpMessageConvertingHandler.send(ResponseBodyEmitterReturnValueHandler.java:205)
at org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitter.sendInternal(ResponseBodyEmitter.java:204)
at org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitter.send(ResponseBodyEmitter.java:198)
at org.springframework.web.servlet.mvc.method.annotation.ResponseBodyEmitter.send(ResponseBodyEmitter.java:184)
at com.hncboy.chatgpt.api.listener.ResponseBodyEmitterStreamListener.onMessage(ResponseBodyEmitterStreamListener.java:33)
at com.hncboy.chatgpt.api.listener.ParsedEventSourceListener.onEvent(ParsedEventSourceListener.java:156)
at okhttp3.internal.sse.RealEventSource.onEvent(RealEventSource.kt:101)
at okhttp3.internal.sse.ServerSentEventReader.completeEvent(ServerSentEventReader.kt:108)
at okhttp3.internal.sse.ServerSentEventReader.processNextEvent(ServerSentEventReader.kt:52)
at okhttp3.internal.sse.RealEventSource.processResponse(RealEventSource.kt:75)
at okhttp3.internal.sse.RealEventSource.onResponse(RealEventSource.kt:46)
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
server {
    listen 80;
    server_name 127.0.0.1;
    index index.html;
    proxy_buffering off;
    root /opt/software/front/www/dist;

    location /api/ {
        proxy_pass http://localhost:3002/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        add_header X-Cache $upstream_cache_status;
    }

    location ~ ^/(\.user.ini|\.htaccess|\.git|\.env|\.svn|\.project|LICENSE|README.md) {
        return 404;
    }
}

server {
    listen 8080;
    server_name 127.0.0.1;
    index index.html;
    root /opt/software/front/admin/dist;

    location /api/ {
        proxy_pass http://localhost:3002/admin/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
        add_header X-Cache $upstream_cache_status;
    }

    location ~ ^/(\.user.ini|\.htaccess|\.git|\.env|\.svn|\.project|LICENSE|README.md) {
        return 404;
    }
}
How do I run and package the project?
Messages currently do not support continuous (multi-turn) conversation.
Are you using gpt-4? Your demo site feels different from my gpt-3.5 endpoint.
After running it locally, it does not feel as capable as front2.stargpt.top. Which model does front2.stargpt.top use, and how do I switch to the same one?
Streaming output can be achieved by adjusting the nginx configuration.
The Java back end already implements streaming output, but the data is buffered once it reaches nginx, so nginx's proxy buffering has to be turned off.
Like this:
http {
    server {
        xxxx xxx;
        proxy_buffering off;
        xxx xxxx;
    }
}
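For reference, a fuller sketch of what such a block can look like for server-sent-event streaming; the port and path here are illustrative, and every directive shown is a standard nginx proxy-module directive:

```nginx
http {
    server {
        listen 80;

        location /api/ {
            proxy_pass http://localhost:3002/;
            proxy_buffering off;        # do not buffer the streamed response
            proxy_cache off;            # streamed responses must not be cached
            proxy_read_timeout 300s;    # keep long-lived streams open
            proxy_http_version 1.1;     # needed for keep-alive to the upstream
            proxy_set_header Connection "";
        }
    }
}
```

Alternatively, the back end can send an `X-Accel-Buffering: no` response header to disable buffering per response without touching the nginx config.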
There are no indexes, so it becomes very slow once the data volume grows.
java.lang.RuntimeException: java.io.IOException: unexpected end of stream on https://api.openai.com/...
at io.reactivex.internal.util.ExceptionHelper.wrapOrThrow(ExceptionHelper.java:46) ~[rxjava-2.2.21.jar:na]
at io.reactivex.internal.observers.BlockingMultiObserver.blockingGet(BlockingMultiObserver.java:93) ~[rxjava-2.2.21.jar:na]
at io.reactivex.Single.blockingGet(Single.java:2870) ~[rxjava-2.2.21.jar:na]
at com.unfbx.chatgpt.OpenAiClient.creditGrants(OpenAiClient.java:845) ~[chatgpt-java-1.0.7.jar:na]
at com.hncboy.chatgpt.front.service.impl.ChatServiceImpl.getChatConfig(ChatServiceImpl.java:37) ~[classes/:na]
at com.hncboy.chatgpt.front.controller.ChatController.chatConfig(ChatController.java:38) ~[classes/:na]
at com.hncboy.chatgpt.front.controller.ChatController$$FastClassBySpringCGLIB$$f1911854.invoke() ~[classes/:na]
at org.springframework.cglib.proxy.MethodProxy.invoke(MethodProxy.java:218) ~[spring-core-5.3.26.jar:5.3.26]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.invokeJoinpoint(CglibAopProxy.java:793) ~[spring-aop-5.3.26.jar:5.3.26]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:163) ~[spring-aop-5.3.26.jar:5.3.26]
at org.springframework.aop.framework.CglibAopProxy$CglibMethodInvocation.proceed(CglibAopProxy.java:763) ~[spring-aop-5.3.26.jar:5.3.26]
at org.springframework.aop.aspectj.MethodInvocationProceedingJoinPoint.proceed(MethodInvocationProceedingJoinPoint.java:89) ~[spring-aop-5.3.26.jar:5.3.26]
at com.hncboy.chatgpt.base.handler.aspect.PreAuthAspect.checkAuth(PreAuthAspect.java:46) ~[classes/:na]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:na]
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[na:na]
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:na]
at java.base/java.lang.reflect.Method.invoke(Method.java:568) ~[na:na]
After I start it locally, it stops working properly as soon as two users access it. Does each user need their own apikey?
Thanks in advance for any help.
The original Dockerfile uses openjdk, but that image apparently does not include fontconfig, so the fonts are missing and the captcha cannot be rendered.
I later fixed it by adding the following to the Dockerfile:
RUN apt-get update && apt-get install -y fontconfig
It works fine on Windows, but after deploying to CentOS 8.5.2111 the login endpoint fails.
In the end I worked around it by calling SecureUtil.disableBouncyCastle() at project startup to disable BouncyCastle. Has anyone else run into this, and is there a better fix?
Java version:
java version "17.0.7" 2023-04-18 LTS
Java(TM) SE Runtime Environment (build 17.0.7+8-LTS-224)
Java HotSpot(TM) 64-Bit Server VM (build 17.0.7+8-LTS-224, mixed mode, sharing)
Error message:
cn.hutool.crypto.CryptoException: SecurityException: JCE cannot authenticate the provider BC
at cn.hutool.crypto.digest.mac.DefaultHMacEngine.init(DefaultHMacEngine.java:105) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.crypto.digest.mac.DefaultHMacEngine.<init>(DefaultHMacEngine.java:56) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.crypto.digest.mac.MacEngineFactory.createEngine(MacEngineFactory.java:42) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.crypto.digest.HMac.<init>(HMac.java:86) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.crypto.digest.HMac.<init>(HMac.java:74) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.crypto.digest.HMac.<init>(HMac.java:63) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.jwt.signers.HMacJWTSigner.<init>(HMacJWTSigner.java:28) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.jwt.signers.JWTSignerUtil.createSigner(JWTSignerUtil.java:239) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.jwt.signers.JWTSignerUtil.hs256(JWTSignerUtil.java:36) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.hutool.jwt.JWT.setKey(JWT.java:126) ~[hutool-all-5.8.18.jar!/:5.8.18]
at cn.dev33.satoken.jwt.SaJwtTemplate.generateToken(SaJwtTemplate.java:119) ~[sa-token-jwt-1.34.0.jar!/:?]
at cn.dev33.satoken.jwt.SaJwtTemplate.createToken(SaJwtTemplate.java:109) ~[sa-token-jwt-1.34.0.jar!/:?]
at cn.dev33.satoken.jwt.SaJwtUtil.createToken(SaJwtUtil.java:100) ~[sa-token-jwt-1.34.0.jar!/:?]
at cn.dev33.satoken.jwt.StpLogicJwtForStateless.createTokenValue(StpLogicJwtForStateless.java:60) ~[sa-token-jwt-1.34.0.jar!/:?]
at cn.dev33.satoken.jwt.StpLogicJwtForStateless.createLoginSession(StpLogicJwtForStateless.java:99) ~[sa-token-jwt-1.34.0.jar!/:?]
at cn.dev33.satoken.stp.StpLogic.login(StpLogic.java:331) ~[sa-token-core-1.34.0.jar!/:?]
at cn.dev33.satoken.stp.StpUtil.login(StpUtil.java:174) ~[sa-token-core-1.34.0.jar!/:?]
at com.hncboy.chatgpt.front.service.strategy.user.EmailAbstractRegisterStrategy.login(EmailAbstractRegisterStrategy.java:177) ~[classes!/:0.0.1-SNAPSHOT]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
2023-06-16T17:36:37.285+08:00 WARN 16708 --- [.openai.com/...] c.h.c.f.a.l.ParsedEventSourceListener : Message send failed; messages received so far: ; response body: null, exception stack:
java.io.IOException: Unexpected response code for CONNECT: 501
at okhttp3.internal.connection.RealConnection.createTunnel(RealConnection.kt:483) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.RealConnection.connectTunnel(RealConnection.kt:262) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.RealConnection.connect(RealConnection.kt:201) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.ExchangeFinder.findConnection(ExchangeFinder.kt:226) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.ExchangeFinder.findHealthyConnection(ExchangeFinder.kt:106) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.ExchangeFinder.find(ExchangeFinder.kt:74) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.RealCall.initExchange$okhttp(RealCall.kt:255) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.ConnectInterceptor.intercept(ConnectInterceptor.kt:32) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.cache.CacheInterceptor.intercept(CacheInterceptor.kt:95) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.http.BridgeInterceptor.intercept(BridgeInterceptor.kt:83) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.http.RetryAndFollowUpInterceptor.intercept(RetryAndFollowUpInterceptor.kt:76) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
at com.unfbx.chatgpt.interceptor.HeaderAuthorizationInterceptor.intercept(HeaderAuthorizationInterceptor.java:51) ~[chatgpt-java-1.0.11.jar:na]
at okhttp3.internal.http.RealInterceptorChain.proceed(RealInterceptorChain.kt:109) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.RealCall.getResponseWithInterceptorChain$okhttp(RealCall.kt:201) ~[okhttp-4.10.0.jar:na]
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:517) ~[okhttp-4.10.0.jar:na]
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[na:na]
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) ~[na:na]
at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
[Problem description] After lifting the rate limits, sending 20-odd messages in a row through the apikey produces the following error:
2023-04-04 19:43:01.886 WARN 23814 --- [.openai.com/...] c.h.c.f.a.l.ParsedEventSourceListener : Message send failed; messages received so far: ; response body: {
    "error": {
        "message": "This model's maximum context length is 4097 tokens. However, you requested 4283 tokens (3283 in the messages, 1000 in the completion). Please reduce the length of the messages or completion.",
        "type": "invalid_request_error",
        "param": "messages",
        "code": "context_length_exceeded"
    }
}
, exception stack:
Exception in thread "OkHttp Dispatcher" java.lang.IllegalArgumentException: argument "content" is null
at com.fasterxml.jackson.databind.ObjectMapper._assertNotNull(ObjectMapper.java:4829)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3596)
at com.hncboy.chatgpt.base.util.ObjectMapperUtil.fromJson(ObjectMapperUtil.java:44)
at com.hncboy.chatgpt.front.api.parser.ChatCompletionResponseParser.parseSuccess(ChatCompletionResponseParser.java:23)
at com.hncboy.chatgpt.front.api.parser.ChatCompletionResponseParser.parseSuccess(ChatCompletionResponseParser.java:18)
at com.hncboy.chatgpt.front.api.storage.ApiKeyDatabaseDataStorage.populateMessageUsageToken(ApiKeyDatabaseDataStorage.java:44)
at com.hncboy.chatgpt.front.api.storage.ApiKeyDatabaseDataStorage.onErrorMessage(ApiKeyDatabaseDataStorage.java:34)
at com.hncboy.chatgpt.front.api.storage.AbstractDatabaseDataStorage.onError(AbstractDatabaseDataStorage.java:121)
at com.hncboy.chatgpt.front.api.listener.ParsedEventSourceListener.onFailure(ParsedEventSourceListener.java:167)
at okhttp3.internal.sse.RealEventSource.processResponse(RealEventSource.kt:52)
at okhttp3.internal.sse.RealEventSource.onResponse(RealEventSource.kt:46)
at okhttp3.internal.connection.RealCall$AsyncCall.run(RealCall.kt:519)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:833)
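The IllegalArgumentException in this trace comes from handing a null string to ObjectMapper.readValue: on this failure path the response body apparently arrives as null, so the parse step needs a null guard. A stdlib-only sketch of the guard pattern (the real code uses Jackson; the names here are mine):

```java
import java.util.Optional;

public class SafeParseSketch {

    // Guard a parser against null/blank input instead of letting it throw
    // IllegalArgumentException deep inside an error-handling path.
    public static Optional<Integer> parseIntSafely(String raw) {
        if (raw == null || raw.isBlank()) {
            return Optional.empty();   // nothing to parse: report "no value", do not throw
        }
        try {
            return Optional.of(Integer.parseInt(raw.trim()));
        } catch (NumberFormatException e) {
            return Optional.empty();   // malformed input is also "no value"
        }
    }
}
```

Applied to the code above, the same check before `ObjectMapperUtil.fromJson(...)` would let `onErrorMessage` skip token accounting when no response body was received, instead of crashing the OkHttp dispatcher thread.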
2023-04-04 19:47:49.232 ERROR 23814 --- [nio-8699-exec-6] c.h.c.f.service.impl.ChatServiceImpl : Request parameters: {"prompt":"你是谁?","options":{"conversationId":"e41f46e7-b05e-4dc4-a47e-25160817d265","parentMessageId":"9bfed09b-db52-4c71-9100-ed815139b153"},"systemMessage":null},Back-end closed the emitter connection.
2023-04-04 19:47:49.244 ERROR 23814 --- [nio-8699-exec-6] c.h.c.b.e.RestExceptionTranslator : Server error
org.springframework.web.context.request.async.AsyncRequestTimeoutException: null
at org.springframework.web.context.request.async.TimeoutDeferredResultProcessingInterceptor.handleTimeout(TimeoutDeferredResultProcessingInterceptor.java:42) ~[spring-web-5.3.26.jar!/:5.3.26]
at org.springframework.web.context.request.async.DeferredResultInterceptorChain.triggerAfterTimeout(DeferredResultInterceptorChain.java:79) ~[spring-web-5.3.26.jar!/:5.3.26]
at org.springframework.web.context.request.async.WebAsyncManager.lambda$startDeferredResultProcessing$5(WebAsyncManager.java:438) ~[spring-web-5.3.26.jar!/:5.3.26]
at java.base/java.util.ArrayList.forEach(ArrayList.java:1511) ~[na:na]
at org.springframework.web.context.request.async.StandardServletAsyncWebRequest.onTimeout(StandardServletAsyncWebRequest.java:151) ~[spring-web-5.3.26.jar!/:5.3.26]
at org.apache.catalina.core.AsyncListenerWrapper.fireOnTimeout(AsyncListenerWrapper.java:44) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.catalina.core.AsyncContextImpl.timeout(AsyncContextImpl.java:133) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.catalina.connector.CoyoteAdapter.asyncDispatch(CoyoteAdapter.java:137) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.coyote.AbstractProcessor.dispatch(AbstractProcessor.java:240) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.coyote.AbstractProcessorLight.process(AbstractProcessorLight.java:57) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.coyote.AbstractProtocol$ConnectionHandler.process(AbstractProtocol.java:926) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1791) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.tomcat.util.net.SocketProcessorBase.run(SocketProcessorBase.java:49) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.tomcat.util.threads.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1191) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.tomcat.util.threads.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:659) ~[tomcat-embed-core-9.0.73.jar!/:na]
at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) ~[tomcat-embed-core-9.0.73.jar!/:na]
at java.base/java.lang.Thread.run(Thread.java:833) ~[na:na]
- CHAT_OPENAI_API_KEY=xxx
- CHAT_OPENAI_ACCESS_TOKEN=xxx
Are both of these required? I filled in both, but after running it the app says there is a problem with the TOKEN. Could someone explain how to configure this?