
IndexFormatTooOldException when using Mahout 0.6 with Solr 4.2


I am using the Mahout 0.6 distribution with Solr 4.2, and I want to generate Mahout vectors from a Solr index. However, the command below fails with a compatibility error. Why am I getting this error, and how can I fix it?

~/mahout$ bin/mahout lucene.vector --dir /home/newscontext/solr/solr-4.2.0/example/solr/collection1/data/index --output tmp/part-out.vec --field 0 --dictOut /tmp/dict.out --norm 2
MAHOUT_LOCAL is not set; adding HADOOP_CONF_DIR to classpath.
no HADOOP_HOME set, running locally
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/newscontext/mahout/examples/target/mahout-examples-0.6-job.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/newscontext/mahout/examples/target/dependency/slf4j-jcl-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/newscontext/mahout/examples/target/dependency/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
Exception in thread "main" org.apache.lucene.index.IndexFormatTooOldException: Format version is not supported in file 'segments_2': 1071082519 (needs to be between -1 and -11). This version of Lucene only supports indexes created with release 3.0 and later.
    at org.apache.lucene.index.SegmentInfos.read(SegmentInfos.java:275)
    at org.apache.lucene.index.DirectoryReader$1.doBody(DirectoryReader.java:79)
    at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:754)
    at org.apache.lucene.index.DirectoryReader.open(DirectoryReader.java:75)
    at org.apache.lucene.index.IndexReader.open(IndexReader.java:421)
    at org.apache.lucene.index.IndexReader.open(IndexReader.java:281)
    at org.apache.mahout.utils.vectors.lucene.Driver.dumpVectors(Driver.java:84)
    at org.apache.mahout.utils.vectors.lucene.Driver.main(Driver.java:250)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:616)
    at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:68)
    at org.apache.hadoop.util.ProgramDriver.driver(ProgramDriver.java:139)
    at org.apache.mahout.driver.MahoutDriver.main(MahoutDriver.java:188)

0.6 is old; even the current 0.7 release is fairly stale. I am certain it was not built against Lucene 4.2, which is very new. In fact, since 0.6 came out more than two years ago, it was probably built on Lucene 2.x. Use the latest version from SVN rather than 0.6.
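The odd number in the exception message is consistent with this diagnosis. A Lucene 3.x-era reader interprets the first four bytes of `segments_N` as a format-version integer, but Lucene 4.x index files begin with a codec-header magic value, `CodecUtil.CODEC_MAGIC = 0x3FD76C17`, which is exactly the `1071082519` the old reader rejected. So the index is not "too old"; the bundled Lucene is too old to parse a 4.x header. A minimal sanity check of the arithmetic:

```python
# The "format version" the old Lucene reader rejected, copied from the exception:
reported_version = 1071082519

# Lucene 4.x writes a codec header at the start of each index file;
# its magic constant (CodecUtil.CODEC_MAGIC) is 0x3FD76C17.
CODEC_MAGIC = 0x3FD76C17

# They are the same number: the pre-4.x reader is misparsing a 4.x codec
# header as a format version, not reading a genuinely ancient index.
print(hex(reported_version))  # -> 0x3fd76c17
assert reported_version == CODEC_MAGIC
```

This is only a way to interpret the error message; the fix is still to use a Mahout build whose bundled Lucene matches the version Solr used to write the index.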

Thanks. Yes, I can use an older Lucene version instead.