Java: ConnectionPoolTimeoutException when the maximum pool size is reached while inserting documents

Tags: java, apache-httpclient-4.x, azure-cosmosdb

I have been playing with Azure DocumentDB for a few days and ran into some strange behavior when inserting data. (I am using version 1.0 of the Java SDK via Maven.)

To insert the data, I loop over my POJOs and try to insert each one like this:

for (Order order : jsonImporter.getOrderList()) {
    Document doc = new Document(gson.toJson(order, Order.class));
    doc.setId(order.uuid);
    doc.set(TYPE, TYPE_ORDER);
    try {
        client.createDocument(getCollection().getSelfLink(), doc, null, true);
    } catch (DocumentClientException e) {
        System.err.printf("AzureDocumentDB - insertOrder request failed (order uuid %s)", order.uuid);
        System.err.println(e.getMessage());
    }
}
The problem is that as soon as I reach the hundredth element (i.e. the maximum connection pool size), I get an exception saying that my connection pool cannot give me another connection. I also tried tweaking the connection pool settings, increasing the number of connections and lowering the idle connection timeout so that connections are released earlier, but without success.

For example, when I increase the pool size to 500, the exception occurs after the 500th element:

ConnectionPolicy connectionPolicy = new ConnectionPolicy();
connectionPolicy.setMaxPoolSize(500);
connectionPolicy.setIdleConnectionTimeout(10);
client = new DocumentClient(END_POINT, MASTER_KEY, connectionPolicy, ConsistencyLevel.Session);
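Tuning the pool only moves the wall: if every request leases a connection and never returns it, a bounded pool blocks (or times out) exactly at its capacity. The effect can be reproduced in miniature without the SDK. A minimal sketch, using a `java.util.concurrent.Semaphore` as a stand-in for the connection pool (`PoolExhaustionDemo` and its numbers are illustrative, not SDK types):

```java
import java.util.concurrent.Semaphore;
import java.util.concurrent.TimeUnit;

// Stand-in for an HTTP connection pool: a fixed number of permits,
// and "requests" that lease a connection but never release it.
public class PoolExhaustionDemo {
    static final int MAX_POOL_SIZE = 5; // stands in for setMaxPoolSize(500)

    // Returns how many leases succeed before the pool times out.
    static int leaseUntilTimeout() throws InterruptedException {
        Semaphore pool = new Semaphore(MAX_POOL_SIZE);
        int succeeded = 0;
        for (int i = 0; i < MAX_POOL_SIZE + 1; i++) {
            // The 100 ms wait mimics the pool's connection-request timeout.
            if (pool.tryAcquire(100, TimeUnit.MILLISECONDS)) {
                succeeded++; // connection leased, but never released: the leak
            } else {
                break; // analogous to ConnectionPoolTimeoutException
            }
        }
        return succeeded;
    }

    public static void main(String[] args) throws InterruptedException {
        // prints "Leases before timeout: 5" -- exactly the pool size
        System.out.println("Leases before timeout: " + leaseUntilTimeout());
    }
}
```

Raising `MAX_POOL_SIZE` only changes where the loop stops, which matches the observed behavior when the pool size was raised from 100 to 500.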
Does anyone see an API misuse in my code? Has anyone run into the same behavior when inserting data? Or is there a known bug in the SDK or the Apache HTTP client that causes connections not to be released? I would greatly appreciate any help.

The exception:

java.lang.IllegalStateException: Http client execution failed.
    at com.microsoft.azure.documentdb.GatewayProxy.performPostRequest(GatewayProxy.java:350)
    at com.microsoft.azure.documentdb.GatewayProxy.doCreate(GatewayProxy.java:90)
    at com.microsoft.azure.documentdb.DocumentClient.doCreate(DocumentClient.java:1968)
    at com.microsoft.azure.documentdb.DocumentClient.createDocument(DocumentClient.java:456)
    ...
Caused by: org.apache.http.conn.ConnectionPoolTimeoutException: Timeout waiting for connection from pool
    at org.apache.http.impl.conn.PoolingClientConnectionManager.leaseConnection(PoolingClientConnectionManager.java:226)
    at org.apache.http.impl.conn.PoolingClientConnectionManager$1.getConnection(PoolingClientConnectionManager.java:195)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:423)
    at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
    at com.microsoft.azure.documentdb.GatewayProxy.performPostRequest(GatewayProxy.java:347)

You are seeing the ConnectionPoolTimeoutException because DocumentClient does not automatically close the response stream when you call createXXXXXXXX(). This behavior is by design, to support streaming blob attachments via createAttachment().

To release the connection, either call close() on the response to close the stream, or call .getResource(), which returns the resource and closes the stream for you.

In other words, replace this line:

client.createDocument(getCollection().getSelfLink(), doc, null, true);

with:

client.createDocument(getCollection().getSelfLink(), doc, null, true).close();

or:

doc = client.createDocument(getCollection().getSelfLink(), doc, null, true).getResource();

Thanks, next time I will look at the API in more detail. I saw that the samples on GitHub always use getResource(), but I assumed that was just to fetch the inserted document; adding a few words in the comments saying that it also takes care of closing the connection would help. What happens if an error occurs while running doc = client.createDocument(getCollection().getSelfLink(), doc, null, true)? How can I force the connection to close with a block? It seems this is what is causing my "Timeout waiting for connection from pool".
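Regarding the follow-up question: a try-with-resources (or try/finally) block guarantees the response is closed even when an exception is thrown mid-request. A minimal self-contained sketch of that discipline, using a mock `FakeResponse` rather than the SDK's actual response type (all names here are hypothetical):

```java
// Demonstrates that try-with-resources runs close() on every exit path,
// including the one where processing the response throws.
public class CloseOnErrorDemo {
    static class FakeResponse implements AutoCloseable {
        boolean closed = false;

        void process(boolean fail) {
            if (fail) throw new RuntimeException("processing failed");
        }

        @Override
        public void close() {
            closed = true; // stands in for releasing the pooled connection
        }
    }

    // Leases a "response" and processes it; close() is guaranteed to run
    // whether or not process() throws.
    static FakeResponse insertOne(boolean fail) {
        FakeResponse response = new FakeResponse();
        try (FakeResponse r = response) {
            r.process(fail);
        } catch (RuntimeException e) {
            // log and continue, as in the question's insert loop
        }
        return response;
    }

    public static void main(String[] args) {
        System.out.println("closed on success: " + insertOne(false).closed);
        System.out.println("closed on failure: " + insertOne(true).closed);
    }
}
```

With the real SDK the same shape applies if the response type implements AutoCloseable; otherwise, wrapping the close() call in a finally block gives the same guarantee.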