OkHttp3: getting an "unexpected end of stream" exception while reading a large HTTP response
I have a Java client that makes POST calls to the v1/graphql endpoint of a Hasura server (v1.3.3), using Square's okhttp3 library (v4.9.1) for the HTTP calls. The data transfer happens over HTTP/1.1 with chunked transfer encoding. The client fails with the following error:
Caused by: java.net.ProtocolException: unexpected end of stream
at okhttp3.internal.http1.Http1ExchangeCodec$ChunkedSource.read(Http1ExchangeCodec.kt:415) ~[okhttp-4.9.1.jar:?]
at okhttp3.internal.connection.Exchange$ResponseBodySource.read(Exchange.kt:276) ~[okhttp-4.9.1.jar:?]
at okio.RealBufferedSource.read(RealBufferedSource.kt:189) ~[okio-jvm-2.8.0.jar:?]
at okio.RealBufferedSource.exhausted(RealBufferedSource.kt:197) ~[okio-jvm-2.8.0.jar:?]
at okio.InflaterSource.refill(InflaterSource.kt:112) ~[okio-jvm-2.8.0.jar:?]
at okio.InflaterSource.readOrInflate(InflaterSource.kt:76) ~[okio-jvm-2.8.0.jar:?]
at okio.InflaterSource.read(InflaterSource.kt:49) ~[okio-jvm-2.8.0.jar:?]
at okio.GzipSource.read(GzipSource.kt:69) ~[okio-jvm-2.8.0.jar:?]
at okio.Buffer.writeAll(Buffer.kt:1642) ~[okio-jvm-2.8.0.jar:?]
at okio.RealBufferedSource.readString(RealBufferedSource.kt:95) ~[okio-jvm-2.8.0.jar:?]
at okhttp3.ResponseBody.string(ResponseBody.kt:187) ~[okhttp-4.9.1.jar:?]
Request headers:
INFO: Content-Type: application/json; charset=utf-8
INFO: Content-Length: 1928
INFO: Host: localhost:10191
INFO: Connection: Keep-Alive
INFO: Accept-Encoding: gzip
INFO: User-Agent: okhttp/4.9.1
Response headers:
INFO: Transfer-Encoding: chunked
INFO: Date: Tue, 27 Apr 2021 12:06:39 GMT
INFO: Server: Warp/3.3.10
INFO: x-request-id: d019408e-e2e3-4583-bcd6-050d4a496b11
INFO: Content-Type: application/json; charset=utf-8
INFO: Content-Encoding: gzip
Here is the client code used to make the POST call:
private static final MediaType MEDIA_TYPE_JSON = MediaType.parse("application/json; charset=utf-8");

private static OkHttpClient okHttpClient = new OkHttpClient.Builder()
        .connectTimeout(30, TimeUnit.SECONDS)
        .writeTimeout(5, TimeUnit.MINUTES)
        .readTimeout(5, TimeUnit.MINUTES)
        .addNetworkInterceptor(loggingInterceptor)
        .build();

public GenericHttpResponse httpPost(String url, String textBody, GenericHttpMediaType genericMediaType) throws HttpClientException {
    // okHttpMediaType is derived from genericMediaType (conversion elided here)
    RequestBody body = RequestBody.create(okHttpMediaType, textBody);
    Request postRequest = new Request.Builder().url(url).post(body).build();
    Call postCall = okHttpClient.newCall(postRequest);
    Response postResponse = postCall.execute();
    return GenericHttpResponse
            .builder()
            .body(postResponse.body().string())
            .headers(postResponse.headers().toMultimap())
            .code(postResponse.code())
            .build();
}
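As a side note, calling body().string() materializes the entire decoded response in memory. This does not by itself fix the "unexpected end of stream" error, but for ~50 MB payloads a streaming read is usually preferable; here is a rough sketch (the class and method names are made up for illustration, assuming the same OkHttpClient setup):

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.Response;

// Illustrative sketch: stream a large response body to disk instead of
// buffering the whole thing on the heap with body().string().
public class StreamingDownload {
    public static void streamToFile(OkHttpClient client, Request request, Path target)
            throws IOException {
        try (Response response = client.newCall(request).execute()) {
            if (!response.isSuccessful()) {
                throw new IOException("Unexpected HTTP status: " + response.code());
            }
            // byteStream() yields the transparently gunzipped body; copy it
            // to disk in chunks instead of decoding it all into one String.
            try (InputStream in = response.body().byteStream()) {
                Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
            }
        }
    }
}
```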
This failure only happens for large responses. According to the server logs, the response size (after gzip encoding) is about 52 MB, and the call still fails. The same code works fine for responses of around 10-15 MB.
I tried to reproduce the problem with a plain cURL call, but that ran successfully:
curl -v -s --request POST 'http://<hasura_endpoint>/v1/graphql' \
--header 'Content-Type: application/json' \
--header 'Accept-Encoding: gzip, deflate, br' \
--data-raw '...'
* Trying ::1...
* TCP_NODELAY set
* Connected to <host> (::1) port <port> (#0)
> POST /v1/graphql HTTP/1.1
> Host: <host>:<port>
> User-Agent: curl/7.64.1
> Accept: */*
> Content-Type: application/json
> Accept-Encoding: gzip, deflate, br
> Content-Length: 1840
> Expect: 100-continue
>
< HTTP/1.1 100 Continue
} [1840 bytes data]
* We are completely uploaded and fine
< HTTP/1.1 200 OK
< Transfer-Encoding: chunked
< Date: Tue, 27 Apr 2021 11:59:24 GMT
< Server: Warp/3.3.10
< x-request-id: 27e3ff3f-8b95-4328-a1bc-a5492e68f995
< Content-Type: application/json; charset=utf-8
< Content-Encoding: gzip
<
{ [6 bytes data]
* Connection #0 to host <host> left intact
* Closing connection 0
So I assume this error is specific to the Java client.
Based on suggestions from similar posts, I also tried the following, without success:
- a Connection: close request header
- a Transfer-Encoding: gzip request header
- setting retryOnConnectionFailure to true
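For reference, those attempted workarounds look roughly like this with the OkHttp 4.x builder API (a sketch only; url and textBody are placeholders, and none of these resolved the issue in my case):

```java
import okhttp3.MediaType;
import okhttp3.OkHttpClient;
import okhttp3.Request;
import okhttp3.RequestBody;

// Sketch of the attempted workarounds: retrying on connection failure,
// closing the connection after each request, and a Transfer-Encoding header.
public class WorkaroundSketch {
    static final MediaType JSON = MediaType.parse("application/json; charset=utf-8");

    public static OkHttpClient buildClient() {
        return new OkHttpClient.Builder()
                .retryOnConnectionFailure(true)       // retry some connectivity failures
                .build();
    }

    public static Request buildRequest(String url, String textBody) {
        RequestBody body = RequestBody.create(JSON, textBody);
        return new Request.Builder()
                .url(url)
                .header("Connection", "close")        // do not reuse the connection
                .header("Transfer-Encoding", "gzip")  // as tried above
                .post(body)
                .build();
    }
}
```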
Any insight into this would be much appreciated. Thanks.

Are you sure the query itself is fine? Can you run it in the Hasura UI to make sure? A query that times out can sometimes look like incorrectly encoded JSON output: something like a block opened with
{
but never closed. I have seen this several times; once the query was corrected or the indexes were optimized, the problem disappeared. You wrote "large output", so this may be the case here. I would also double-check whether the curl query really succeeded: redirect its output to a file and try to decode that JSON. I expect it will be missing its closing brace, in which case it is not the Java client's fault at all. We use Apollo on the frontend and different clients on the backend, with Jupyter notebooks as documentation and python/gql for talking to Hasura; I can recommend that setup for debugging purposes.
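Acting on that suggestion, a dependency-free first pass is to save the curl output to a file and check whether its braces and brackets balance. A rough sketch (BraceBalance and isBalanced are illustrative names; it tracks string literals and bracket pairs but not the full JSON grammar):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Checks whether braces/brackets in a JSON-like string are balanced.
// A truncated (timed-out) response typically fails this check because
// its final closing '}' is missing.
public class BraceBalance {
    public static boolean isBalanced(String json) {
        Deque<Character> expected = new ArrayDeque<>();
        boolean inString = false;
        for (int i = 0; i < json.length(); i++) {
            char c = json.charAt(i);
            if (inString) {
                if (c == '\\') { i++; }              // skip the escaped character
                else if (c == '"') { inString = false; }
                continue;
            }
            switch (c) {
                case '"': inString = true; break;
                case '{': expected.push('}'); break;
                case '[': expected.push(']'); break;
                case '}':
                case ']':
                    // Closer must match the most recently opened bracket.
                    if (expected.isEmpty() || expected.pop() != c) return false;
                    break;
                default: break;
            }
        }
        return expected.isEmpty();
    }
}
```

For a truncated body the stack is left non-empty and isBalanced returns false; a real JSON parser (jq, Jackson, etc.) gives a more definitive answer.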