Which is more efficient: multiple inserts or a single insert with unions?

Tags: performance, postgresql, insert

I have a large table in PostgreSQL (~6M rows, 41 columns) that looks like this:

id | answer1 | answer2 | answer3 | ... | answer40
1  | xxx     | yyy     | null    | ... | null
2  | xxx     | null    | null    | ... | null
3  | xxx     | null    | zzz     | ... | aaa
Note that every row contains many null columns; I only want the values that actually contain data.

I want to normalize it to get the following result:

id | answers
1  | xxx
1  | yyy
2  | xxx
3  | xxx
3  | zzz
...
3  | aaa
The question: which is more efficient/faster, several inserts or a single insert with multiple unions?

Option 1

create table new_table as 
select id, answer1 from my_table where answer1 is not null
union 
select id, answer2 from my_table where answer2 is not null
union 
select id, answer3 from my_table where answer3 is not null
union ... 
Option 2

create table new_table as select id, answer1 from my_table where answer1 is not null;
insert into new_table select id, answer2 from my_table where answer2 is not null;
insert into new_table select id, answer3 from my_table where answer3 is not null;
...

Option 3: is there a better way?

Option 2 should be faster, since union has to deduplicate the combined result (an extra sort or hash step), while plain insert ... select statements do not.
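
If duplicate (id, answer) pairs are acceptable or impossible in your data, a union all variant of Option 1 would also skip that deduplication step. A rough sketch, reusing the table and column names from the question and aliasing the first column to answers to match the desired output:

create table new_table as
select id, answer1 as answers from my_table where answer1 is not null
union all
select id, answer2 from my_table where answer2 is not null
union all
select id, answer3 from my_table where answer3 is not null;
-- ... and so on through answer40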

Wrap all of the statements in a single begin ... commit block to save the time spent on individual commits.
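
A rough sketch of Option 2 wrapped in one transaction (table and column names taken from the question; the first column is aliased to answers to match the desired output, and the same insert pattern repeats through answer40):

begin;

-- build the target table from the first answer column
create table new_table as
select id, answer1 as answers from my_table where answer1 is not null;

-- append the remaining answer columns
insert into new_table
select id, answer2 from my_table where answer2 is not null;

insert into new_table
select id, answer3 from my_table where answer3 is not null;

-- ... repeat for answer4 through answer40 ...

commit;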

For faster selects, make sure the columns you filter on (e.g. answer1 is not null) have an index.
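
If you do add indexes, a partial index matches the is not null filter directly. A sketch for one column (the index name is made up for illustration, and whether the planner actually uses it depends on how many rows are non-null):

-- index only the rows where answer1 is not null
create index my_table_answer1_idx on my_table (answer1) where answer1 is not null;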