
SOLR DIH multiple-query problem


SOLR DIH accumulates the queries from every iteration. For example, on the third iteration it produces the following output:

"entity:us-patent-grant-xslt",
  [
    "document#3",
    [
      "query",
      "/var/www/data1/US07985001-20110726.XML",
      "query",
      "/var/www/data1/US07985001-20110726.XML",
      "query",
      "/var/www/data1/US07985001-20110726.XML",
      "time-taken",
      "0:0:0.0",
      "time-taken",
      "0:0:0.0",
      "time-taken",
      "0:0:0.0",
      null,
      "----------- row #1-------------",
      "id",
      "US7985001",
      "pub_date",
      "2011-07-26 00:00:00",
      null,
      "---------------------------------------------"
    ],
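The verbose-debug output above is a flat list of alternating key/value entries. A small sketch, using the entries copied from document #3 above, makes the accumulation concrete:

```python
# Sketch: pair up the flat key/value entries from the DIH verbose-debug
# output for document #3 shown above.
debug_entries = [
    "query", "/var/www/data1/US07985001-20110726.XML",
    "query", "/var/www/data1/US07985001-20110726.XML",
    "query", "/var/www/data1/US07985001-20110726.XML",
    "time-taken", "0:0:0.0",
    "time-taken", "0:0:0.0",
    "time-taken", "0:0:0.0",
]
pairs = list(zip(debug_entries[::2], debug_entries[1::2]))

# By the third document, the identical query line appears three times --
# once per iteration so far -- which is the accumulation being described.
repeats = sum(1 for key, _ in pairs if key == "query")
print(repeats)  # prints 3
```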
Data config file:

<entity name="pickupdir"
        processor="FileListEntityProcessor"
        rootEntity="false"
        dataSource="null"
        fileName="^[\w\d-]+\.XML$"
        baseDir="/var/www/data1/"
        recursive="true"
        onError="skip">

    <entity name="us-patent-grant-xslt"
            url="${pickupdir.fileAbsolutePath}"
            xsl="data.xsl"
            processor="XPathEntityProcessor"
            useSolrAddSchema="true"
            rootEntity="true"
            onError="skip">

        <field column="id" />
        <field column="pub_date" />
    </entity>
</entity>
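For reference, verbose debug output like the block above is what DIH returns when an import is run with debugging enabled. A minimal sketch of building that request, assuming Solr on localhost:8983 and a hypothetical core name "patents" (neither is stated in the question):

```python
# Sketch: build the DIH request that yields verbose per-row debug output
# like the block above. Host and core name "patents" are assumptions.
from urllib.parse import urlencode

params = {
    "command": "full-import",  # run the import defined in data-config.xml
    "debug": "true",           # run in debug mode
    "verbose": "true",         # echo each query and row, as shown above
    "commit": "true",
}
url = "http://localhost:8983/solr/patents/dataimport?" + urlencode(params)
print(url)
# Against a running Solr you would fetch this URL, e.g. with
# urllib.request.urlopen(url), and inspect the returned debug document.
```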

So as I bulk-upload data, the queries accumulate with each iteration and performance lags; at the moment my server processes only 2 documents per second.

I am not using a SQL entity, so I cannot use CachedSqlEntityProcessor.