
Can't apply the patch LUCENE-2899.patch to Solr on Windows


I am trying to apply the patch LUCENE-2899.patch to Solr.

Here is what I have done:

- Cloned Solr from the official repo; I am on the master branch.
- Downloaded and installed Ant and GNU patch, as I saw here.
- Put Ant and GNU patch on the PATH env var.

I got this...

```

```
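For reference, applying a JIRA patch on Windows with GNU patch from the checkout root typically looks something like the sketch below; the exact -p strip level depends on how the patch was generated, so treat this as an illustration rather than the exact command.

```
REM Sketch: apply the patch from the checkout root (path from this setup; -p level may differ)
cd /d D:\utils\solr_master\lucene-solr
REM Preview the changes first, then apply for real
patch -p1 --dry-run -i LUCENE-2899.patch
patch -p1 -i LUCENE-2899.patch
```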

Update 1

I am trying to compile, but the build fails:

D:\utils\solr_master\lucene-solr>ant compile
Buildfile: D:\utils\solr_master\lucene-solr\build.xml

BUILD FAILED
D:\utils\solr_master\lucene-solr\build.xml:21: The following error occurred while executing this line:
D:\utils\solr_master\lucene-solr\lucene\common-build.xml:623: java.lang.NullPointerException
        at java.util.Arrays.stream(Arrays.java:5004)
        at java.util.stream.Stream.of(Stream.java:1000)
        at java.util.stream.ReferencePipeline$7$1.accept(ReferencePipeline.java:267)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
        at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
        at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
        at org.apache.tools.ant.util.ChainedMapper.lambda$mapFileName$1(ChainedMapper.java:36)
        at java.util.stream.ReduceOps$1ReducingSink.accept(ReduceOps.java:80)
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.util.stream.ReferencePipeline.reduce(ReferencePipeline.java:484)
        at org.apache.tools.ant.util.ChainedMapper.mapFileName(ChainedMapper.java:35)
        at org.apache.tools.ant.util.CompositeMapper.lambda$mapFileName$0(CompositeMapper.java:32)
        at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
        at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
        at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1382)
        at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
        at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
        at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:545)
        at java.util.stream.AbstractPipeline.evaluateToArrayNode(AbstractPipeline.java:260)
        at java.util.stream.ReferencePipeline.toArray(ReferencePipeline.java:438)
        at org.apache.tools.ant.util.CompositeMapper.mapFileName(CompositeMapper.java:33)
        at org.apache.tools.ant.taskdefs.PathConvert.execute(PathConvert.java:363)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
        at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:346)
        at org.apache.tools.ant.Target.execute(Target.java:448)
        at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:172)
        at org.apache.tools.ant.taskdefs.ImportTask.importResource(ImportTask.java:221)
        at org.apache.tools.ant.taskdefs.ImportTask.execute(ImportTask.java:165)
        at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:292)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106)
        at org.apache.tools.ant.Task.perform(Task.java:346)
        at org.apache.tools.ant.Target.execute(Target.java:448)
        at org.apache.tools.ant.helper.ProjectHelper2.parse(ProjectHelper2.java:183)
        at org.apache.tools.ant.ProjectHelper.configureProject(ProjectHelper.java:93)
        at org.apache.tools.ant.Main.runBuild(Main.java:824)
        at org.apache.tools.ant.Main.startAnt(Main.java:228)
        at org.apache.tools.ant.launch.Launcher.run(Launcher.java:283)
        at org.apache.tools.ant.launch.Launcher.main(Launcher.java:101)

Total time: 0 seconds
Update 2

I have downloaded Solr from here.

But neither in the 7.3 release nor in the 8.0 master build do I see an opennlp directory inside the contrib directory. Where can I find it?
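For reference, one way to check whether the OpenNLP jars ship with a binary distribution at all is to search the extracted directory; a sketch, with the path being just an example of an unpacked Solr install:

```
REM Sketch: search an unpacked Solr distribution for OpenNLP-related jars
cd /d D:\utils\solr-7.3.0-7\solr-7.3.0-7
dir /s /b *opennlp*
```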

Update 3

I have run the master-branch build that I downloaded here, and I tried to run OpenNLP the same way as the gentleman in this post:

But I get the same error he does:

numberplate_shard1_replica_n1: org.apache.solr.common.SolrException: org.apache.solr.common.SolrException: Could not load conf for core numberplate_shard1_replica_n1: Can't load schema managed-schema: Plugin init failure for [schema.xml] fieldType "text_opennlp_nvf": Plugin init failure for [schema.xml] analyzer/tokenizer: Error instantiating class: 'org.apache.lucene.analysis.opennlp.OpenNLPTokenizerFactory'

Why do I get this error if the LUCENE-2899 patch has been merged into master?

Update 5

I restarted Solr and the error disappeared. But...

I have tried to add the fields to the managed-schema from the example:
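For context, a field type wired to the OpenNLP tokenizer looks roughly like the snippet below; this is a sketch based on the reference guide example, and the model file names under opennlp/ are placeholders for whichever OpenNLP binary models you add to the configset.

```
<!-- Sketch of an OpenNLP-based field type; the model paths are placeholders
     resolved against the collection's configset -->
<fieldType name="text_opennlp_nvf" class="solr.TextField" positionIncrementGap="100">
  <analyzer>
    <tokenizer class="solr.OpenNLPTokenizerFactory"
               sentenceModel="opennlp/en-sent.bin"
               tokenizerModel="opennlp/en-token.bin"/>
  </analyzer>
</fieldType>
```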

But when I try to run Solr in cloud mode, I get this:

D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>solr -e cloud

Welcome to the SolrCloud example!

This interactive session will help you launch a SolrCloud cluster on your local workstation.
To begin, how many Solr nodes would you like to run in your local cluster? (specify 1-4 nodes) [2]:
1
Ok, let's start up 1 Solr nodes for your example SolrCloud cluster.
Please enter the port for node1 [8983]:

Solr home directory D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\solr already exists.

Starting up Solr on port 8983 using command:
"D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin\solr.cmd" start -cloud -p 8983 -s "D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\solr"

Waiting up to 30 to see Solr running on port 8983
Started Solr server on port 8983. Happy searching!
INFO  - 2018-03-26 14:42:26.961; org.apache.solr.client.solrj.impl.ZkClientClusterStateProvider; Cluster at localhost:9983 ready

Now let's create a new collection for indexing documents in your 1-node cluster.
Please provide a name for your new collection: [gettingstarted]
numberplate

Collection 'numberplate' already exists!
Do you want to re-use the existing collection or create a new one? Enter 1 to reuse, 2 to create new [1]:
1

Enabling auto soft-commits with maxTime 3 secs using the Config API

POSTing request to Config API: http://localhost:8983/solr/numberplate/config
{"set-property":{"updateHandler.autoSoftCommit.maxTime":"3000"}}

ERROR: Error from server at http://localhost:8983/solr: Expected mime type application/octet-stream but got text/html. <html>
<head>
<meta http-equiv="Content-Type" content="text/html;charset=utf-8"/>
<title>Error 404 Not Found</title>
</head>
<body><h2>HTTP ERROR 404</h2>
<p>Problem accessing /solr/numberplate/config. Reason:
<pre>    Not Found</pre></p>
</body>
</html>




SolrCloud example running, please visit: http://localhost:8983/solr


D:\utils\solr-7.3.0-7\solr-7.3.0-7\bin>
Update 6

I have created a new collection and I get a more precise error:

test_collection_shard1_replica_n1: org.apache.solr.common.SolrException: org.apache.solr.common.SolrException: Could not load core test_collection_shard1_replica_n1: Can't load schema managed-schema: org.apache.solr.core.SolrResourceNotFoundException: Can't find resource 'opennlp/en-sent.bin' in classpath or '/configs/_default', cwd=D:\utils\solr-7.3.0-7\solr-7.3.0-7\server
Please check your logs for more information.

Maybe I need to copy the OpenNLP models.

But where should I put these models?
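Judging from the error above, the models are resolved against the collection's configset (here '/configs/_default' in ZooKeeper), so one plausible approach is to put the .bin files in an opennlp/ folder inside the configset and re-upload it. A sketch, with example paths and the standard Solr ZooKeeper tooling:

```
REM Sketch: add the OpenNLP models to a local copy of the configset and re-upload it
REM (paths and configset name are examples; adjust to your setup)
mkdir D:\myconfigset\conf\opennlp
copy en-sent.bin D:\myconfigset\conf\opennlp\
copy en-token.bin D:\myconfigset\conf\opennlp\
bin\solr.cmd zk upconfig -z localhost:9983 -n _default -d D:\myconfigset
```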

Can you help me? What am I doing wrong?

As you can see on the JIRA issue, the patch has already been applied to 8.0 (master) and to 7.3.

You can find pre-built nightlies here and here.

The opennlp libraries are bundled inside the artifact:

solr-8.0.0-3304 find . -name '*nlp*'
[...]
./contrib/langid/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lib/opennlp-maxent-3.0.3.jar
./contrib/analysis-extras/lib/opennlp-tools-1.8.3.jar
./contrib/analysis-extras/lucene-libs/lucene-analyzers-opennlp-8.0.0-3304.jar

Then you have to tell Solr to load these jars, which is very important:

<lib dir="../../../contrib/analysis-extras/lib/" regex="opennlp-.*\.jar" />
<lib dir="../../../contrib/analysis-extras/lucene-libs/" regex="lucene-analyzers-opennlp-.*\.jar" />


Confirm in Solr's log file that the JARs got loaded as expected.
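One quick way to check on Windows is to search the node's log for the jar names once Solr has started; a sketch assuming the cloud example's default log location:

```
REM Sketch: confirm the OpenNLP jars were picked up (log path is the cloud example default; adjust to your node)
findstr /i "opennlp" D:\utils\solr-7.3.0-7\solr-7.3.0-7\example\cloud\node1\logs\solr.log
```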

Thanks, but are the 7.3 and 8.0 versions only available by compiling from source? Can't I download binaries anywhere?

You said you are on master and plan to apply the patch manually, which implies you intend to compile it yourself. You can find pre-built nightlies here and here.

Thanks anyway for pointing me to the pre-built nightlies, that will help me a lot. But one more question: following the instructions here, I can't find an opennlp folder inside the contrib folder. Do I need to download it separately? In other words, I don't have the path cd solr/contrib/opennlp.