Java Hibernate Search integration with Apache Solr fails to index data


In my current application I use Hibernate Search to index and search data, and it works well. However, when running a cluster of server instances, I don't want to use master/slave clustering via JMS or JGroups.

So I tried to integrate Hibernate Search with Apache Solr. I followed this and made some minor changes to stay compatible with the newer apache.lucene.core version:

import java.io.IOException;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import java.util.Properties;
import java.util.concurrent.locks.Lock;
import java.util.concurrent.locks.ReentrantReadWriteLock;

import org.apache.lucene.document.Document;
import org.apache.lucene.index.IndexableField;
import org.apache.solr.client.solrj.SolrServerException;
import org.apache.solr.client.solrj.impl.ConcurrentUpdateSolrClient;
import org.apache.solr.client.solrj.request.UpdateRequest;
import org.apache.solr.common.SolrInputDocument;
import org.hibernate.search.backend.AddLuceneWork;
import org.hibernate.search.backend.DeleteLuceneWork;
import org.hibernate.search.backend.IndexingMonitor;
import org.hibernate.search.backend.LuceneWork;
import org.hibernate.search.backend.UpdateLuceneWork;
import org.hibernate.search.backend.spi.BackendQueueProcessor;
import org.hibernate.search.indexes.spi.DirectoryBasedIndexManager;
import org.hibernate.search.spi.WorkerBuildContext;

public class HibernateSearchSolrWorkerBackend implements BackendQueueProcessor {

    private static final String ID_FIELD_NAME = "id";

    // Hibernate Search may apply work from several threads; expose one shared write lock.
    private static final ReentrantReadWriteLock readWriteLock = new ReentrantReadWriteLock();
    private static final ReentrantReadWriteLock.WriteLock writeLock = readWriteLock.writeLock();

    private ConcurrentUpdateSolrClient solrServer;

    @Override
    public void initialize(Properties properties, WorkerBuildContext workerBuildContext,
                           DirectoryBasedIndexManager directoryBasedIndexManager) {
        // Queue size 20, 4 sender threads; the core URL is hard-coded for now
        solrServer = new ConcurrentUpdateSolrClient("http://localhost:8983/solr/test", 20, 4);
    }

    @Override
    public void close() {
    }

    @Override
    public void applyWork(List<LuceneWork> luceneWorks, IndexingMonitor indexingMonitor) {
        List<SolrInputDocument> solrWorks = new ArrayList<>(luceneWorks.size());
        List<String> documentsForDeletion = new ArrayList<>();

        for (LuceneWork work : luceneWorks) {
            if (work instanceof AddLuceneWork) {
                SolrInputDocument solrWork = new SolrInputDocument();
                handleAddLuceneWork((AddLuceneWork) work, solrWork);
                solrWorks.add(solrWork);
            } else if (work instanceof UpdateLuceneWork) {
                SolrInputDocument solrWork = new SolrInputDocument();
                handleUpdateLuceneWork((UpdateLuceneWork) work, solrWork);
                solrWorks.add(solrWork);
            } else if (work instanceof DeleteLuceneWork) {
                // Deletions are sent as a query; don't add an empty document for them
                documentsForDeletion.add(((DeleteLuceneWork) work).getIdInString());
            } else {
                throw new RuntimeException("Encountered unsupported lucene work " + work);
            }
        }
        try {
            deleteDocs(documentsForDeletion);
            if (!solrWorks.isEmpty()) {
                solrServer.add(solrWorks);
            }
            softCommit();
        } catch (SolrServerException | IOException e) {
            throw new RuntimeException("Failed to update solr", e);
        }
    }

    @Override
    public void applyStreamWork(LuceneWork luceneWork, IndexingMonitor indexingMonitor) {
        throw new RuntimeException("HibernateSearchSolrWorkerBackend.applyStreamWork isn't implemented");
    }

    @Override
    public Lock getExclusiveWriteLock() {
        return writeLock;
    }

    @Override
    public void indexMappingChanged() {
    }

    private void deleteDocs(Collection<String> collection) throws IOException, SolrServerException {
        if (!collection.isEmpty()) {
            // Builds a delete query of the form id:(id1 id2 ...); terms inside the
            // parentheses are whitespace-separated, since the query parser does not
            // treat commas as separators
            StringBuilder stringBuilder = new StringBuilder(collection.size() * 10);
            stringBuilder.append(ID_FIELD_NAME).append(":(");
            boolean first = true;
            for (String id : collection) {
                if (!first) {
                    stringBuilder.append(' ');
                } else {
                    first = false;
                }
                stringBuilder.append(id);
            }
            stringBuilder.append(')');
            solrServer.deleteByQuery(stringBuilder.toString());
        }
    }

    private void copyFields(Document document, SolrInputDocument solrInputDocument) {
        boolean addedId = false;
        for (IndexableField fieldable : document.getFields()) {
            // The Lucene document may carry the id field twice; copy it only once
            if (fieldable.name().equals(ID_FIELD_NAME)) {
                if (addedId) {
                    continue;
                }
                addedId = true;
            }
            solrInputDocument.addField(fieldable.name(), fieldable.stringValue());
        }
    }

    private void handleAddLuceneWork(AddLuceneWork luceneWork, SolrInputDocument solrWork) {
        copyFields(luceneWork.getDocument(), solrWork);
    }

    private void handleUpdateLuceneWork(UpdateLuceneWork luceneWork, SolrInputDocument solrWork) {
        copyFields(luceneWork.getDocument(), solrWork);
    }

    private void softCommit() throws IOException, SolrServerException {
        // Solr's request parameter is "softCommit"; "soft-commit" is silently ignored
        UpdateRequest updateRequest = new UpdateRequest();
        updateRequest.setParam("softCommit", "true");
        updateRequest.setAction(UpdateRequest.ACTION.COMMIT, false, false);
        updateRequest.process(solrServer);
    }
}
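Note that ConcurrentUpdateSolrClient sends updates from background sender threads and, by default, its handleError(Throwable) only logs failures, so a bad request never propagates back to applyWork. A minimal sketch of surfacing those failures (the anonymous subclass is illustrative, not part of the original setup):

ConcurrentUpdateSolrClient solrServer =
        new ConcurrentUpdateSolrClient("http://localhost:8983/solr/test", 20, 4) {
            @Override
            public void handleError(Throwable ex) {
                // The default implementation only logs; rethrow so the failure is visible
                throw new RuntimeException("Solr update failed", ex);
            }
        };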
This saves the records to the database. If I remove

<property name="hibernate.search.default.worker.backend" value="search.adapter.HibernateSearchSolrWorkerBackend"/>


and instead specify an index location for Hibernate Search in the configuration file, the index is created correctly and searches run successfully. But when I add the custom Apache Solr worker backend, it doesn't create any index in the Apache Solr core's data folder.
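A quick way to check whether any documents reach the core at all is to query it directly with SolrJ. A minimal sketch, assuming the same core URL as in the backend above:

import org.apache.solr.client.solrj.SolrQuery;
import org.apache.solr.client.solrj.impl.HttpSolrClient;
import org.apache.solr.client.solrj.response.QueryResponse;

public class SolrCoreCheck {
    public static void main(String[] args) throws Exception {
        // Same core URL as the worker backend uses
        HttpSolrClient client = new HttpSolrClient("http://localhost:8983/solr/test");
        QueryResponse response = client.query(new SolrQuery("*:*"));
        // numFound stays 0 if no document ever reached the core
        System.out.println("Documents in core: " + response.getResults().getNumFound());
        client.close();
    }
}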

Hi, could you log the exceptions instead of wrapping them in a RuntimeException? I suspect they may be swallowed by some layer of the framework stack, since some indexing operations happen on background threads. Or use an ErrorHandler. – Hey, thanks. I'll try that and let you know.
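For reference, a minimal ErrorHandler along the lines suggested in the comment could look like the sketch below (Hibernate Search 5 API; the class name and package are placeholders):

import org.hibernate.search.exception.ErrorContext;
import org.hibernate.search.exception.ErrorHandler;

public class LoggingErrorHandler implements ErrorHandler {

    @Override
    public void handle(ErrorContext context) {
        // Receives the failed LuceneWork list and root cause instead of
        // losing them on a background thread
        System.err.println("Indexing failed for: " + context.getFailingOperations());
        context.getThrowable().printStackTrace();
    }

    @Override
    public void handleException(String errorMsg, Throwable exception) {
        System.err.println(errorMsg);
        exception.printStackTrace();
    }
}

It would be registered with <property name="hibernate.search.error_handler" value="search.adapter.LoggingErrorHandler"/> (the property name is from Hibernate Search 5; the class and package are whatever you choose).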
This is the test that saves the books:

@Test
@Transactional(propagation = Propagation.REQUIRES_NEW)
@Rollback(false)
public void saveBooks() {
    // Persist two books; index work should be queued and applied on commit
    Book bk1 = new Book(1L, "book1", "book1 description", 100.0);
    Book bk2 = new Book(2L, "book2", "book2 description", 100.0);
    bookRepository.save(bk1);
    bookRepository.save(bk2);
}
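If commit timing is in doubt, the pending index work can also be flushed explicitly inside the transaction. A sketch, assuming a JPA EntityManager is available in the test:

import org.hibernate.search.jpa.FullTextEntityManager;
import org.hibernate.search.jpa.Search;

FullTextEntityManager fullTextEntityManager = Search.getFullTextEntityManager(entityManager);
// Pushes queued index work to the backend immediately instead of waiting for commit
fullTextEntityManager.flushToIndexes();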