
This code can fail with an OutOfMemoryError when the recNos array is large (> 20,000); the error is thrown when I close the session. I'm only reading information, one object at a time (the idea was to reduce memory load by reading one at a time), and I only use each object within that loop iteration, but I'm not explicitly discarding it. Am I doing something wrong? Can I explicitly release the memory?

        try
        {
            session = com.jthink.songlayer.hibernate.HibernateUtil.getSession();
            for (Integer next : recNos)
            {
                Song song = SongCache.loadSongFromDatabase(session, next);
                folderToSongIds.put(new File(song.getFilename()).getParent(),song.getRecNo());
            }
        }
        finally
        {
            HibernateUtil.closeSession(session);
        }

This is the stacktrace

java.lang.OutOfMemoryError: GC overhead limit exceeded
at org.hibernate.internal.util.collections.IdentityMap.entryArray(IdentityMap.java:165)
at org.hibernate.internal.util.collections.IdentityMap.concurrentEntries(IdentityMap.java:76)
at org.hibernate.engine.internal.StatefulPersistenceContext.clear(StatefulPersistenceContext.java:237)
at org.hibernate.internal.SessionImpl.cleanup(SessionImpl.java:651)
at org.hibernate.internal.SessionImpl.close(SessionImpl.java:363)
at com.jthink.songlayer.hibernate.HibernateUtil.closeSession(HibernateUtil.java:94)
  • Could this be due to a new instance of File being created every time? Too many files getting loaded? Commented Aug 2, 2014 at 13:37

2 Answers


You can detach the entity from the session when you are done with it by calling evict:

    for (Integer next : recNos)
    {
        Song song = SongCache.loadSongFromDatabase(session, next);
        folderToSongIds.put(new File(song.getFilename()).getParent(), song.getRecNo());
        session.evict(song); // detach so the persistence context doesn't keep growing
    }



According to this article, this exception happens when the GC is unable to reclaim more than 2% of the heap, which I think can indicate a memory leak.

It seems that you are working with a lot of files, which can be a good source of memory leakage. I am not sure whether these files are closed after they have been created.

BTW, I suggest you use a memory profiler to see what is occupying the heap.

** "20%" was a typo that I have updated to 2%.

2 Comments

Don't you mean 2%? It's a File object rather than a FileWriter, so there's nothing to close, but perhaps I could use Paths instead of having to construct File objects in the first place.
Anyway, my answer was to use a memory profiler. As I mentioned, I didn't know how you were creating your files.
