How do I use send_query from the PG::Connection class in the Ruby PG gem?


How do I use the public instance method send_query of the PG::Connection class in the Ruby PG gem?

Can it help speed up the execution time of a program like this one?

a = [1,2,3,4,5,6,...,100000]  # elements in no specific order
conn = PG.connect(OMITTED)
conn.transaction do |conn|
  a.each do |i|
    conn.exec("INSERT INTO numbers VALUES ($1)",[i])
  end
end
Hypothesis: if I don't wait for the results, I can keep sending queries to the PostgreSQL server and finish sooner.

Experiment: the rewritten version:

a = [1,2,3,4]
conn = PG.connect(OMITTED)
conn.setnonblocking(true)
conn.transaction do |conn|
  a.each do |i|
    conn.send_query("INSERT INTO numbers VALUES ($1)",[i])
  end
end
Result: no difference in running time.

Some other questions:

  • Is my hypothesis correct?
  • Is this necessarily a job for threads?
  • Should I be handling the output of exec somehow?
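One thing worth knowing about the pg gem's async API: send_query only queues a single query, and the results must be drained with get_result (until it returns nil) before the same connection will accept the next query. So on one connection it does not batch anything by itself, which would explain the unchanged running time. A minimal sketch of the send/drain pairing, assuming the same `numbers` table; `insert_async` is a hypothetical helper name, not part of the gem:

```ruby
# send_query_params queues one query without blocking; get_result must
# then be called until it returns nil before the connection can accept
# another query, so a single connection still runs one query at a time.
def insert_async(conn, value)
  conn.send_query_params("INSERT INTO numbers VALUES ($1)", [value])
  # Drain every result for this query; get_result returns nil when done.
  while (result = conn.get_result)
    result.check  # raises if the server reported an error
  end
end
```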
Have you tried concatenating all the SQL inserts into one string? In my experiment that was faster:

require "rubygems"
require "pg"

a = (0...10000).sort_by { rand }  # elements in no specific order
conn = PG.connect(:dbname => "numbers")
t = Time.now
conn.transaction do |conn|
  a.each do |i|
    conn.exec("INSERT INTO numbers VALUES ($1)",[i])
  end
end
p Time.now - t

t = Time.now
conn.transaction do |conn|
  conn.exec(
    a.map { |i|
      "INSERT INTO numbers VALUES (%d);" % i
    }.join("")
  )
end
p Time.now - t

#=> 2.658903, 0.572997
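The same idea can be pushed one step further with PostgreSQL's multi-row VALUES syntax, which sends a single statement instead of many concatenated ones. A sketch: the helper name is mine, and escaping is skipped only because the values here are integers (arbitrary strings would need conn.escape_literal):

```ruby
# Build one multi-row INSERT such as
# "INSERT INTO numbers VALUES (1), (2), (3);".
# Integer-only values, so no quoting/escaping is needed here.
def multi_row_insert_sql(values)
  "INSERT INTO numbers VALUES " +
    values.map { |i| "(%d)" % i }.join(", ") + ";"
end
```

Inside the transaction, `conn.exec(multi_row_insert_sql(a))` would then replace the map/join above.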


I like this suggestion. Unfortunately, I don't actually store the data to be inserted as an array; if I did, I might run out of memory. I have millions of inserts.

As suggested, just split your data into chunks of N records each.
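The chunking suggested in the last comment fits Enumerable#each_slice, so only N values are ever held (and turned into SQL) at a time. A sketch, where `source` stands in for however the millions of values are actually produced (it could be a lazy enumerator reading from a file), and `insert_in_chunks` is a hypothetical helper:

```ruby
# Insert values in chunks of chunk_size, one concatenated statement per
# chunk, so memory use is bounded by the chunk, not the full data set.
# `source` can be any Enumerable, including a lazy one.
def insert_in_chunks(conn, source, chunk_size = 1000)
  source.each_slice(chunk_size) do |chunk|
    sql = chunk.map { |i| "INSERT INTO numbers VALUES (%d);" % i }.join
    conn.exec(sql)
  end
end
```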