Apache Nutch 1.12 and Apache Solr 6.2.1 give an error


I am using Apache Nutch 1.12 and Apache Solr 6.2.1 to crawl data on the internet and index it, and this combination produces an error: java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down

I followed these steps from the Nutch tutorial:

  • Copied Nutch's schema.xml and placed it in Solr's config folder
  • Put a seed URL (a newspaper company's site) in Nutch's urls/seed.txt
  • Changed the http.content.limit value in nutch-site.xml to "-1". Because the seed URL is a newspaper company's site, I had to remove the limit on the HTTP content download size (see the sample property entry right after this list).
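
For reference, a minimal sketch of what that property entry could look like in conf/nutch-site.xml (the description wording is mine, not copied from Nutch's defaults):

    <property>
      <name>http.content.limit</name>
      <value>-1</value>
      <description>Maximum length, in bytes, of downloaded content; -1 removes the limit.</description>
    </property>
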
When I run the following command, I get the error:

bin/crawl -i -D solr.server.url=http://localhost:8983/solr/TSolr urls/ TestCrawl/ 2
Above, TSolr is just the name of the Solr core, as you may have guessed.

I am pasting the error log from hadoop.log below:

2016-10-28 16:21:20,982 INFO  indexer.IndexerMapReduce - IndexerMapReduce: crawldb: TestCrawl/crawldb
2016-10-28 16:21:20,982 INFO  indexer.IndexerMapReduce - IndexerMapReduce: linkdb: TestCrawl/linkdb
2016-10-28 16:21:20,982 INFO  indexer.IndexerMapReduce - IndexerMapReduces: adding segment: TestCrawl/segments/20161028161642
2016-10-28 16:21:46,353 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1281422650/.staging/job_local1281422650_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:21:46,355 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1281422650/.staging/job_local1281422650_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:21:46,415 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1281422650_0001/job_local1281422650_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:21:46,416 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1281422650_0001/job_local1281422650_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:21:46,565 INFO  anchor.AnchorIndexingFilter - Anchor deduplication is: off
2016-10-28 16:21:52,308 INFO  indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: content dest: content
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: title dest: title
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: host dest: host
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: segment dest: segment
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: boost dest: boost
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: digest dest: digest
2016-10-28 16:21:52,383 INFO  solr.SolrMappingReader - source: tstamp dest: tstamp
2016-10-28 16:21:52,424 INFO  solr.SolrIndexWriter - Indexing 42/42 documents
2016-10-28 16:21:52,424 INFO  solr.SolrIndexWriter - Deleting 0 documents
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: content dest: content
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: title dest: title
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: host dest: host
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: segment dest: segment
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: boost dest: boost
2016-10-28 16:21:53,468 INFO  solr.SolrMappingReader - source: digest dest: digest
2016-10-28 16:21:53,469 INFO  solr.SolrMappingReader - source: tstamp dest: tstamp
2016-10-28 16:21:53,472 INFO  indexer.IndexingJob - Indexer: number of documents indexed, deleted, or skipped:
2016-10-28 16:21:53,476 INFO  indexer.IndexingJob - Indexer:     42  indexed (add/update)
2016-10-28 16:21:53,477 INFO  indexer.IndexingJob - Indexer: finished at 2016-10-28 16:21:53, elapsed: 00:00:32
2016-10-28 16:21:54,199 INFO  indexer.CleaningJob - CleaningJob: starting at 2016-10-28 16:21:54
2016-10-28 16:21:54,344 WARN  util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-10-28 16:22:19,739 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1653313730/.staging/job_local1653313730_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:22:19,741 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/staging/btaek1653313730/.staging/job_local1653313730_0001/job.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:22:19,797 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1653313730_0001/job_local1653313730_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.retry.interval;  Ignoring.
2016-10-28 16:22:19,799 WARN  conf.Configuration - file:/tmp/hadoop-btaek/mapred/local/localRunner/btaek/job_local1653313730_0001/job_local1653313730_0001.xml:an attempt to override final parameter: mapreduce.job.end-notification.max.attempts;  Ignoring.
2016-10-28 16:22:19,807 WARN  output.FileOutputCommitter - Output Path is null in setupJob()
2016-10-28 16:22:25,113 INFO  indexer.IndexWriters - Adding org.apache.nutch.indexwriter.solr.SolrIndexWriter
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: content dest: content
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: title dest: title
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: host dest: host
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: segment dest: segment
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: boost dest: boost
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: digest dest: digest
2016-10-28 16:22:25,188 INFO  solr.SolrMappingReader - source: tstamp dest: tstamp
2016-10-28 16:22:25,191 INFO  solr.SolrIndexWriter - SolrIndexer: deleting 6/6 documents
2016-10-28 16:22:25,300 WARN  output.FileOutputCommitter - Output Path is null in cleanupJob()
2016-10-28 16:22:25,301 WARN  mapred.LocalJobRunner - job_local1653313730_0001
java.lang.Exception: java.lang.IllegalStateException: Connection pool shut down
    at org.apache.hadoop.mapred.LocalJobRunner$Job.runTasks(LocalJobRunner.java:462)
    at org.apache.hadoop.mapred.LocalJobRunner$Job.run(LocalJobRunner.java:529)
Caused by: java.lang.IllegalStateException: Connection pool shut down
    at org.apache.http.util.Asserts.check(Asserts.java:34)
    at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:169)
    at org.apache.http.pool.AbstractConnPool.lease(AbstractConnPool.java:202)
    at org.apache.http.impl.conn.PoolingClientConnectionManager.requestConnection(PoolingClientConnectionManager.java:184)
    at org.apache.http.impl.client.DefaultRequestDirector.execute(DefaultRequestDirector.java:415)
    at org.apache.http.impl.client.AbstractHttpClient.doExecute(AbstractHttpClient.java:863)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:82)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:106)
    at org.apache.http.impl.client.CloseableHttpClient.execute(CloseableHttpClient.java:57)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.executeMethod(HttpSolrClient.java:480)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:241)
    at org.apache.solr.client.solrj.impl.HttpSolrClient.request(HttpSolrClient.java:230)
    at org.apache.solr.client.solrj.SolrRequest.process(SolrRequest.java:150)
    at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:483)
    at org.apache.solr.client.solrj.SolrClient.commit(SolrClient.java:464)
    at org.apache.nutch.indexwriter.solr.SolrIndexWriter.commit(SolrIndexWriter.java:190)
    at org.apache.nutch.indexwriter.solr.SolrIndexWriter.close(SolrIndexWriter.java:178)
    at org.apache.nutch.indexer.IndexWriters.close(IndexWriters.java:115)
    at org.apache.nutch.indexer.CleaningJob$DeleterReducer.close(CleaningJob.java:120)
    at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
    at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
    at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
    at org.apache.hadoop.mapred.LocalJobRunner$Job$ReduceTaskRunnable.run(LocalJobRunner.java:319)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
2016-10-28 16:22:25,841 ERROR indexer.CleaningJob - CleaningJob: java.io.IOException: Job failed!
    at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:836)
    at org.apache.nutch.indexer.CleaningJob.delete(CleaningJob.java:172)
    at org.apache.nutch.indexer.CleaningJob.run(CleaningJob.java:195)
    at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
    at org.apache.nutch.indexer.CleaningJob.main(CleaningJob.java:206)
As you can see in the bin/crawl command above, I had Nutch run 2 rounds of crawling. The problem is that the error above only occurs on the second round (one level deeper than the seed). So indexing runs successfully on the first round, but after the second round's crawl and parse, the error is thrown and the job stops.

To try something slightly different from the first run described above, I did the following on a second run:

  • Deleted the TestCrawl folder so the crawl and index would start fresh
  • Ran:
    bin/crawl -i -D solr.server.url=http://localhost:8983/solr/TSolr urls/ TestCrawl/ 1
    ==> Note that I changed the number of Nutch crawl rounds to "1". This crawl and index run completes successfully.
  • Then ran the same command again to crawl one level deeper on the second round:
    bin/crawl -i -D solr.server.url=http://localhost:8983/solr/TSolr urls/ TestCrawl/ 1
    ==> This gave me the same error as in the hadoop.log pasted above.
So my Solr cannot successfully index what Nutch crawls on the second round, i.e. one level deeper than the seed site.

Could the error be caused by the size of the parsed content from the seed site? The seed site is a newspaper company's website, so I am sure the second round (one level deeper) contains a huge amount of parsed data to index. If the problem is the parsed content size, how can I configure Solr to solve it?


If the error is caused by something else, can anyone help me identify what it is and how to fix it?

This error occurs because the connection to Solr has already been shut down when commit() is attempted. This was identified in the NUTCH-2269 ticket on Jira, and there is a PR in progress to fix it.

For those who run into what I ran into, I thought I would post the solution to the problem I encountered.

First, Apache Nutch 1.12 does not appear to support Apache Solr 6.X. If you check the Apache Nutch 1.12 release notes, support for Apache Solr 5.X was recently added to Nutch 1.12, but support for Solr 6.X is not included. So I decided to use Solr 5.5.3 instead of Solr 6.2.1, and installed Apache Solr 5.5.3 to work with Apache Nutch 1.12.

As Jorge Luis pointed out, Apache Nutch 1.12 has a bug that produces this error when it works with Apache Solr. The bug will be fixed at some point and released as Nutch 1.13, but I don't know when that will happen, so I decided to fix the bug myself.

The reason I got the error is that in Nutch's CleaningJob.java the close method is called first and the commit method afterwards, at which point the following exception is thrown: java.lang.IllegalStateException: Connection pool shut down

The fix is actually very simple. To find out about the solution, go here:

As you can see at the link above, you just need to relocate the "writers.close();" call.
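
To illustrate the ordering problem, here is a schematic sketch (my own simplification, not the actual Nutch source) of the call order described above and the relocation that fixes it:

    // Schematic only: closing the index writers first shuts down the
    // underlying HttpClient connection pool, so the commit that follows
    // fails with java.lang.IllegalStateException: Connection pool shut down
    writers.close();
    writers.commit();

    // Relocated: commit while the connection is still open, then close
    writers.commit();
    writers.close();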

By the way, to apply the fix you need the Nutch src package, not the binary package, because you will not be able to edit the CleaningJob.java file in the Nutch binary distribution. After making the change, run ant and you are all set.
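
In case it helps, a typical rebuild from the Nutch 1.x source tree looks roughly like this (assuming the default ant targets; the directory name is just an example):

    cd apache-nutch-1.12/
    ant runtime
    # the rebuilt scripts and jars end up under runtime/local/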

After the fix, I no longer get the error.


I hope this helps anyone facing the same problem I did.

Thanks, Jorge. That is actually what I found out over the weekend, as you pointed out.