I have a big Spring Boot application composed of multiple microservices using Gradle buildSrc. It contains a lot of tests separated into separate tasks (unit, integration, acceptance, ...).
These tests run on Jenkins pipelines, each in a dedicated pod with 64 GB of memory and 8 cores, using the Gradle cache and 8 workers.
When running the integration tests without the cache and with -PforceRunTests=true, we run into an OutOfMemoryError (Java heap space) in about 50% of the runs.
I tried adding -XX:+HeapDumpOnOutOfMemoryError to the tasks, but no dump is generated. Also, reproducing locally and monitoring with VisualVM does not show any JVM using more than its allocated memory.
Any idea why HeapDumpOnOutOfMemoryError wouldn't generate a dump? Does that indicate something?
If not, how can we pinpoint the issue? Is it possible that the integration tests and their contexts need to be optimized, and if so, how can we prove that the tests are the issue?
I tried changing maxHeapSize (-Xmx) and minHeapSize (-Xms) and settled on setting both to 7 GB, which is more than enough.
Adding the JVM args -XX:+HeapDumpOnOutOfMemoryError and -XX:HeapDumpPath still does not generate a heap dump on error.
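One thing worth checking: in Gradle, jvmArgs must be set on the Test task itself so they reach the forked test JVM, and -XX:HeapDumpPath must point at a directory that already exists and is writable inside the pod; the JVM does not create the directory and fails silently if it is missing. A minimal sketch (Groovy DSL; the dump directory location is an assumption, adjust to your buildSrc conventions):

```groovy
// build.gradle -- sketch only
tasks.withType(Test).configureEach {
    minHeapSize = '7g'
    maxHeapSize = '7g'
    // These args apply to the forked test JVM, not the Gradle daemon.
    jvmArgs '-XX:+HeapDumpOnOutOfMemoryError',
            "-XX:HeapDumpPath=${buildDir}/heap-dumps"
    doFirst {
        // The JVM will not create this directory itself.
        file("${buildDir}/heap-dumps").mkdirs()
    }
}
```

If the dump still never appears, the OutOfMemoryError may be thrown in the Gradle daemon or a worker process rather than in a test JVM; those processes are configured via org.gradle.jvmargs in gradle.properties, not via the Test task.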
Are you using @SpringBootTest with a lot of different configurations, properties, and/or combinations of @MockBean and friends? Each distinct combination produces a separate application context that stays cached. Don't use @SpringBootTest for all situations; don't use it for things that should be a simple unit test. If you really must use it, try to reduce the differences between test configurations, and also reduce the number of contexts being cached by setting the spring.test.context.cache.maxSize system property (default is 32). Note that @MockBean has been superseded by @MockitoBean.
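To test whether context caching is the culprit, you can cap the cache and enable the TestContext framework's DEBUG logging, which prints cache statistics (size, hits, misses). A sketch (Groovy DSL; the maxSize value of 4 is just an illustration):

```groovy
// build.gradle -- sketch only
tasks.withType(Test).configureEach {
    // Cap the Spring TestContext cache (default is 32 contexts).
    systemProperty 'spring.test.context.cache.maxSize', '4'
    // Spring logs context cache statistics at DEBUG level.
    systemProperty 'logging.level.org.springframework.test.context.cache', 'DEBUG'
}
```

If memory stabilizes with a small cache, the number of distinct application contexts is the problem, and consolidating test configurations (shared base classes or a common annotation set) is the fix.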