
I've integrated Hibernate Search into my backend for full-text search.

The application works fine and I added some data to my database (see the screenshots), but when I search for a word I sometimes get only one result instead of several (e.g. the word "1"), and for another word (e.g. "maroc") I get nothing at all. I think it's an indexing problem, not a query problem.

(screenshots of the database contents)

This is my Entity code:

import org.apache.lucene.analysis.standard.StandardAnalyzer;
import org.hibernate.search.annotations.*;

import javax.persistence.*;
import java.util.Set;


@Entity
@Indexed
@Table(name="client")
public class Client {

    @Id
    @Column(name="id")
    @GeneratedValue(strategy=GenerationType.AUTO)
    private long id;

    @Column(name="fullname")
    @Field(termVector = TermVector.YES,analyzer = @Analyzer(impl = StandardAnalyzer.class))
    String fullName;

    @Column(name="adress")
    @Field
    String adress;

This is my SearchService code:

import org.apache.lucene.search.Query;
import org.hibernate.search.jpa.FullTextEntityManager;
import org.hibernate.search.jpa.Search;
import org.hibernate.search.query.dsl.QueryBuilder;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import org.springframework.transaction.annotation.Transactional;
import javax.persistence.EntityManager;
import javax.persistence.EntityManagerFactory;
import javax.persistence.NoResultException;
import java.util.List;

@Component
public class SearchService {


    @Autowired
    private EntityManager entityManager;


    @Autowired
    public SearchService( EntityManagerFactory entityManagerFactory) {
        super();
        this.entityManager = entityManagerFactory.createEntityManager();
    }

    public void initializeHibernateSearch() {
        try {
            FullTextEntityManager fullTextEntityManager = Search.getFullTextEntityManager(entityManager);
            fullTextEntityManager.createIndexer().startAndWait();
        } catch (InterruptedException e) {
            e.printStackTrace();
        }
    }

    @Transactional
    public List<Client> clientSearch(String searchTerm) {

        FullTextEntityManager fullTextEntityManager = Search.getFullTextEntityManager(entityManager);
        QueryBuilder qb = fullTextEntityManager.getSearchFactory().buildQueryBuilder().forEntity(Client.class).get();
        Query luceneQuery = qb.keyword().fuzzy().withEditDistanceUpTo(1).withPrefixLength(1).onFields("id","adress","fullName")
                .matching(searchTerm).createQuery();

        javax.persistence.Query jpaQuery = fullTextEntityManager.createFullTextQuery(luceneQuery, Client.class);

        // execute search

        List<Client> clientList = null;
        try {
            clientList = jpaQuery.getResultList();
        } catch (NoResultException nre) {
        }

        return clientList;
    }
}

This is my HibernateSearchConfiguration code:

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.autoconfigure.EnableAutoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import javax.persistence.EntityManagerFactory;

@Configuration
@EnableAutoConfiguration

public class HibernateSearchConfiguration {

    private EntityManagerFactory entityManagerFactory;

    @Autowired
    public HibernateSearchConfiguration(EntityManagerFactory entityManagerFactory) {
        this.entityManagerFactory = entityManagerFactory;
    }

    @Bean
    SearchService hibernateSearchService() {
        SearchService hibernateSearchService = new SearchService(this.entityManagerFactory);
        hibernateSearchService.initializeHibernateSearch();
        return hibernateSearchService;
    }
}

This is the error in the terminal:

2020-05-05 13:21:11,084 ERROR org.hibernate.search.exception.impl.LogErrorHandler : HSEARCH000058: HSEARCH000117: IOException on the IndexWriter
org.apache.lucene.store.LockObtainFailedException: Lock held by another program: D:\ggg\jpa\indexpath\com.skylark.training.jpa.model.Client\write.lock
    at org.apache.lucene.store.NativeFSLockFactory.obtainFSLock(NativeFSLockFactory.java:118)
    at org.apache.lucene.store.FSLockFactory.obtainLock(FSLockFactory.java:41)
    at org.apache.lucene.store.BaseDirectory.obtainLock(BaseDirectory.java:45)
    at org.apache.lucene.index.IndexWriter.<init>(IndexWriter.java:776)
    at org.hibernate.search.backend.impl.lucene.IndexWriterHolder.createNewIndexWriter(IndexWriterHolder.java:127)
    at org.hibernate.search.backend.impl.lucene.IndexWriterHolder.getIndexWriter(IndexWriterHolder.java:93)
    at org.hibernate.search.backend.impl.lucene.IndexWriterHolder.getIndexWriter(IndexWriterHolder.java:112)
    at org.hibernate.search.backend.impl.lucene.AbstractWorkspaceImpl.getIndexWriter(AbstractWorkspaceImpl.java:114)
    at org.hibernate.search.backend.impl.lucene.AbstractWorkspaceImpl.getIndexWriterDelegate(AbstractWorkspaceImpl.java:215)
    at org.hibernate.search.backend.impl.lucene.LuceneBackendTaskStreamer.doWork(LuceneBackendTaskStreamer.java:45)
    at org.hibernate.search.backend.impl.lucene.WorkspaceHolder.applyStreamWork(WorkspaceHolder.java:75)
    at org.hibernate.search.indexes.spi.DirectoryBasedIndexManager.performStreamOperation(DirectoryBasedIndexManager.java:110)
    at org.hibernate.search.backend.impl.StreamingOperationExecutorSelector$AddSelectionExecutor.performStreamOperation(StreamingOperationExecutorSelector.java:109)
    at org.hibernate.search.backend.impl.StreamingOperationDispatcher.executeWork(StreamingOperationDispatcher.java:55)
    at org.hibernate.search.backend.impl.StreamingOperationDispatcher.dispatch(StreamingOperationDispatcher.java:39)
    at org.hibernate.search.backend.impl.batch.DefaultBatchBackend.enqueueAsyncWork(DefaultBatchBackend.java:52)
    at org.hibernate.search.batchindexing.impl.IdentifierConsumerDocumentProducer.index(IdentifierConsumerDocumentProducer.java:296)
    at org.hibernate.search.batchindexing.impl.IdentifierConsumerDocumentProducer.indexAllQueue(IdentifierConsumerDocumentProducer.java:222)
    at org.hibernate.search.batchindexing.impl.IdentifierConsumerDocumentProducer.loadList(IdentifierConsumerDocumentProducer.java:176)
    at org.hibernate.search.batchindexing.impl.IdentifierConsumerDocumentProducer.loadAllFromQueue(IdentifierConsumerDocumentProducer.java:140)
    at org.hibernate.search.batchindexing.impl.IdentifierConsumerDocumentProducer.run(IdentifierConsumerDocumentProducer.java:120)
    at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
    at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
    at java.base/java.lang.Thread.run(Thread.java:832)
2020-05-05 13:21:11,084 ERROR org.hibernate.search.backend.impl.lucene.LuceneBackendTaskStreamer : HSEARCH000072: Couldn't open the IndexWriter because of previous error: operation skipped, index ouf of sync!
... the same LockObtainFailedException stack trace and HSEARCH000072 error repeat several more times.

3 Answers


Alright, the stacktrace changes everything. The problem is not in your mapping.

See this:

2020-05-05 13:21:11,084 ERROR org.hibernate.search.exception.impl.LogErrorHandler : HSEARCH000058: HSEARCH000117: IOException on the IndexWriter
org.apache.lucene.store.LockObtainFailedException: Lock held by another program: D:\ggg\jpa\indexpath\com.skylark.training.jpa.model.Client\write.lock

Hibernate Search failed to acquire the lock on the index writer. This can happen in two cases:

  1. Your application was stopped (very) brutally and didn't release the lock. In this case, the easiest solution is to delete everything and reindex (see the sketch after this list), because it's likely you lost some data when the application was stopped brutally.
  2. You're attempting to use the same index from two separate instances of your application. If you really need to do this, you should consider relying on Elasticsearch instead of a local Lucene index; see here.
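
For reference, the reindexing is typically done with the Hibernate Search 5 MassIndexer. A minimal sketch, reusing the FullTextEntityManager from the SearchService above (run it once while no other process holds the index lock; the surrounding method is assumed to declare throws InterruptedException):

    FullTextEntityManager fullTextEntityManager = Search.getFullTextEntityManager(entityManager);
    fullTextEntityManager.createIndexer(Client.class)   // rebuild only the Client index
            .purgeAllOnStart(true)                       // drop any stale documents first (also the default)
            .startAndWait();                             // block until every Client row has been read and indexed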

9 Comments

Thank you for your help. Can you please tell me what you mean by deleting everything and reindexing, and how to reindex?
Also, when I restart my computer the writer problem disappears, but the search still returns an empty list. With the debugger (in SearchService), evaluating the "qb" query builder variable throws: Method threw 'java.lang.StackOverflowError' exception. Cannot evaluate org.hibernate.search.engine.metadata.impl.PropertyMetadata.toString()
I've restarted the whole thing and it's working now, so I was in the first case. Thank you so much.
@yrodiere Let's say in Case 2, that I have an Application 1 that only needs to read the index, and that another Application 2 writes and reads that same index. Why does the reading Application 1 need to bother about the lock? And was that any different in Hibernate 5 vs. Hibernate 7?
@f1v3 Sure. As I mentioned, if you were just using Hibernate Search with one writing node and one or more read-only nodes relying on a network share, that would probably only require adding a read-only configuration flag to Hibernate Search. And, come to think of it, you might even be able to handle multiple nodes writing to the database, as long as you configure outbox-polling and disable event processors on all nodes but one.

You really should use the same analyzer in all text fields. If you don't specify an analyzer, the default one will be used, which may cause the problem you're experiencing.

    @Id
    @Column(name="id")
    @GeneratedValue(strategy=GenerationType.AUTO)
    private long id;

    @Column(name="fullname")
    @Field(termVector = TermVector.YES,analyzer = @Analyzer(impl = StandardAnalyzer.class))
    String fullName;

    @Column(name="adress")
    @Field(analyzer = @Analyzer(impl = StandardAnalyzer.class))
    String adress;

1 Comment

Same problem! When I turned off my computer and started it again, it worked without any changes, and today it's not working again. I don't know where the problem is.

As yrodiere mentioned, sharing the same local Lucene index between several application instances is not possible. A better solution would be to use Elasticsearch.

In Hibernate Search versions prior to 6, you can also establish a master/slave relationship between different applications that share the same indexes (JMS master/slave).
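
The master/slave setup is driven entirely by configuration properties. A rough sketch of the index-replication side (property names as documented for Hibernate Search 5, with placeholder paths; double-check them against your exact version):

    # master node: owns the index and periodically copies it to a shared directory
    hibernate.search.default.directory_provider = filesystem-master
    hibernate.search.default.indexBase = /var/lucene/local-index
    hibernate.search.default.sourceBase = /mnt/shared/lucene-index
    hibernate.search.default.refresh = 1800

    # slave nodes: periodically pull the shared copy and send their writes to the master (e.g. over JMS)
    hibernate.search.default.directory_provider = filesystem-slave
    hibernate.search.default.indexBase = /var/lucene/local-index
    hibernate.search.default.sourceBase = /mnt/shared/lucene-index
    hibernate.search.default.refresh = 1800
    hibernate.search.default.worker.backend = jms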

Alternatively, as a workaround, you could consider deleting the write.lock file inside the index folder. Please note that this is a more forceful approach and may have unintended consequences.
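
If you do take that forceful route, here is a minimal sketch of removing the stale lock with plain NIO (the path is copied from the stack trace above and is only an example; run this, or delete the file by hand, strictly while every application using the index is stopped):

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;

    public class RemoveStaleLock {
        public static void main(String[] args) throws IOException {
            // path taken from the LockObtainFailedException message; adjust to your index location
            Path lock = Paths.get("D:\\ggg\\jpa\\indexpath\\com.skylark.training.jpa.model.Client\\write.lock");
            // deleteIfExists is a no-op if the lock file is already gone
            System.out.println("Deleted: " + Files.deleteIfExists(lock));
        }
    }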

I hope this information is helpful to you.

