
Java for and while loops run very slowly when processing larger data sets from the database


I have the following Java code:

import com.google.gson.Gson;    
import java.sql.ResultSet;
import java.sql.ResultSetMetaData;
import java.sql.SQLException;
import java.util.ArrayList;

public class DBresult {
    int countColumns;
    public ArrayList<String> columnLabels;
    public ArrayList<String[]> rows;
    public ArrayList<String> primaryKeys;
    public ArrayList<String> tablenameByColum;

    public static DBresult build(ResultSet set) throws SQLException {
        DBresult result = new DBresult();
        result.countColumns = set.getMetaData().getColumnCount();
        result.rows = new ArrayList<String[]>();
        result.columnLabels = new ArrayList<String>();
        result.tablenameByColum = new ArrayList<String>();
        String[] row;
        while (set.next()) {
            row = new String[result.countColumns];
            for (int i = 1; i <= result.countColumns; i++) {
                row[i-1] = set.getString(i);
            }
            result.rows.add(row);
        }
        ResultSetMetaData meta = set.getMetaData();
        for (int i = 1; i <= result.countColumns; i++) {
            result.columnLabels.add(meta.getColumnLabel(i));
            result.tablenameByColum.add(meta.getTableName(i));
        }

        return result;
    }

    @Override
    public String toString() {
        try {
            return new Gson().toJson(this);
        } catch (Exception e) {
            return null;
        }
    }

    public static DBresult fromString(String json){
        try {
            return new Gson().fromJson(json, DBresult.class);
        } catch (Exception e) {
            return null;
        }
    }
}
The problem appears when I need to process larger data sets (thousands of entries): it becomes very slow and takes minutes or more to finish. Do you know how I could shorten the processing time?

Similar code in PHP:

$result = dibi::query(' 
    SELECT `NAME`,
    `ID`,
    `IDSECOND`,
    `TIME`
    FROM `NAMES`;');
$value = $result->fetchAll();
$data = '';
foreach ($value as $item)
{
    $data = $data . '["'.$item->NAME.'","'.$item->ID.'","'.$item->IDSECOND.'","'.$item->TIME.'"],';
} 
$data = substr($data, 0, -1);
$data='['.$data.']'; 
In PHP I can simplify it like this:

foreach ($value as $item)
{
    $array[] = '["'.$item->NAME.'","'.$item->ID.'","'.$item->IDSECOND.'","'.$item->TIME.'"]';
} 
$array = implode(",", $array);
$data='['.$array.']';

But now I want to improve the Java code so it runs more efficiently. I don't know Java, so any help for a beginner is greatly appreciated. Many thanks :)

Which part is slow? Perhaps each set.next() could be handled in its own thread and the results merged once they are all done.

I bet the bottleneck is the serialization to JSON. Have you measured how long each step takes? (step 1 = the query, step 2 = filling the DBResult, step 3 = serializing to JSON)
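
Following up on the commenter's suggestion to time each step, here is a minimal sketch (not from the original thread) that wraps the question's DBresult.build() and toString() with System.nanoTime() calls. The Connection conn and the SQL string sql are assumed to exist in your application; DBresult is the class from the question.

import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class DBresultTiming {

    // Runs the query, builds the DBresult, serializes it to JSON,
    // and prints how long each of the three steps takes.
    public static void measure(Connection conn, String sql) throws SQLException {
        long t0 = System.nanoTime();
        try (Statement stmt = conn.createStatement();
             ResultSet set = stmt.executeQuery(sql)) {   // step 1: query
            long t1 = System.nanoTime();

            DBresult result = DBresult.build(set);       // step 2: fill DBresult
            long t2 = System.nanoTime();

            String json = result.toString();             // step 3: serialize to JSON
            long t3 = System.nanoTime();

            System.out.printf("query: %d ms, build: %d ms, toJson: %d ms (json length %d)%n",
                    (t1 - t0) / 1_000_000,
                    (t2 - t1) / 1_000_000,
                    (t3 - t2) / 1_000_000,
                    json == null ? 0 : json.length());
        }
    }
}

Note that many JDBC drivers fetch rows lazily, so part of the query cost can show up while build() iterates the ResultSet; the split still makes it clear whether the JSON serialization dominates, which is what the comment suspects.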