
Java: limiting BufferedReader read time


I wrote this very classic piece of code to read the contents of a URL:

// Prepares the connection
URL url = new URL(urlString);
URLConnection uc = url.openConnection();

// Reads the data
StringBuilder data;
String inputLine;
try (InputStreamReader isr = new InputStreamReader(uc.getInputStream(), "utf-8");
     BufferedReader in = new BufferedReader(isr)) {
    data = new StringBuilder();
    while ((inputLine = in.readLine()) != null) {
        data.append(inputLine);
    }
}

// Returns the read data
return data.toString();
But sometimes the URL I am reading contains too much data, or the client connection is too slow, or whatever... so the read takes far too long.

Is there a way to specify a "maximum read time" on the BufferedReader (or the InputStreamReader, or the URLConnection)? Ideally it would throw a TimeoutException once the "maximum read time" is reached.


I did some research, but all I could find were limits on the amount of data received, not on the execution time.

Just call URLConnection.setReadTimeout() before you start reading. If the timeout expires, a SocketTimeoutException is thrown.

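A minimal, self-contained sketch of that suggestion. The silent local server exists only to force a read that never completes; the class and method names are illustrative, not part of any library:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.ServerSocket;
import java.net.SocketTimeoutException;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;

public class ReadTimeoutDemo {

    // Connects to a local server that accepts the TCP handshake but never
    // sends a byte, so the first read blocks until the timeout fires.
    public static boolean timesOutAgainstSilentServer() throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {
            URLConnection uc = new URL("http://localhost:"
                    + server.getLocalPort() + "/").openConnection();
            uc.setConnectTimeout(2000); // limit on establishing the connection
            uc.setReadTimeout(500);     // each blocking read may wait at most 500 ms
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(uc.getInputStream(), StandardCharsets.UTF_8))) {
                in.readLine(); // the server sends nothing, so this cannot succeed
                return false;
            } catch (SocketTimeoutException e) {
                return true; // the read blocked longer than 500 ms
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println("timed out: " + timesOutAgainstSilentServer());
    }
}
```

Note that for an HTTP URL the exception may already surface from getInputStream(), since that call reads the response headers.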

Out of curiosity, do you know how this is implemented? – @SotiriosDelimanolis

With Socket.setSoTimeout(), of course, in the case of network URLs. It does nothing for other kinds of URL.

@SotiriosDelimanolis Sure: depending on the platform, via SO_RCVTIMEO or select().

I can't believe I missed that. Anyway, thanks!
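One caveat related to the Socket.setSoTimeout() discussion above: setReadTimeout() bounds each individual blocking read, not the transfer as a whole, so a server that trickles data slowly can still exceed any wall-clock budget. Below is a sketch of enforcing a total deadline on top of the per-read timeout; readWithDeadline is an illustrative name, not a library API:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.SocketTimeoutException;
import java.net.URL;
import java.net.URLConnection;
import java.nio.charset.StandardCharsets;

public class DeadlineReader {

    // Reads the URL's content, throwing SocketTimeoutException if the whole
    // transfer takes longer than maxMillis. The per-read timeout catches a
    // fully stalled server; the elapsed-time check in the loop catches a
    // server that keeps trickling data.
    public static String readWithDeadline(String urlString, int maxMillis) throws Exception {
        long deadline = System.currentTimeMillis() + maxMillis;
        URLConnection uc = new URL(urlString).openConnection();
        uc.setConnectTimeout(maxMillis);
        uc.setReadTimeout(maxMillis); // per-read bound, first line of defense
        StringBuilder data = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(uc.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = in.readLine()) != null) {
                data.append(line);
                if (System.currentTimeMillis() > deadline) {
                    throw new SocketTimeoutException(
                            "total read time exceeded " + maxMillis + " ms");
                }
            }
        }
        return data.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(readWithDeadline(args[0], 5000));
    }
}
```

Like the original code, this drops line separators, since readLine() strips them; keep that in mind if the content's line structure matters.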