I have detected a severe memory leak in a piece of code that compiles and runs Java code at runtime. I have created heap dumps, and it seems com.sun.tools.javac.util.SharedNameTable$NameImpl is the culprit.
What I'd like to know is how I can prevent SharedNameTable from taking up so much space. Is there a way to forcibly free the SharedNameTable?
Compiler code:
public static Object compile(String code) throws IOException, InvocationTargetException, IllegalAccessException, ClassNotFoundException, NoSuchMethodException, java.lang.InstantiationException {
    String uuid = UUID.randomUUID().toString();
    File theDir = new File(uuid);
    if (!theDir.exists()) {
        System.out.println("creating directory: " + uuid);
        boolean result = false;
        try {
            theDir.mkdir();
            result = true;
        } catch (SecurityException se) {
            System.out.println(se);
        }
        if (result) {
            System.out.println("DIR created");
        }
    }

    // Compile
    JavaCompiler jc = ToolProvider.getSystemJavaCompiler();
    StandardJavaFileManager sjfm = jc.getStandardFileManager(null, null, null);
    JavaFileObject javaObjectFromString = getJavaFileContentsAsString(new StringBuilder(code));
    System.out.println(javaObjectFromString.getName());
    Iterable<? extends JavaFileObject> fileObjects = Arrays.asList(javaObjectFromString);
    String[] options = new String[]{"-d", "/src/" + uuid};
    DiagnosticCollector<JavaFileObject> diagnostics = new DiagnosticCollector<JavaFileObject>();
    if (jc.getTask(null, null, diagnostics, Arrays.asList(options), null, fileObjects).call()) {
        sjfm.close();
        System.out.println("Class has been successfully compiled");
        // Load
        URL[] urls = new URL[]{new URL("file:///src/" + uuid + "/")};
        URLClassLoader ucl = new URLClassLoader(urls);
        Class<?> cl = ucl.loadClass("TestClass");
        System.out.println("Class has been successfully loaded");
        // Run
        final Method method = cl.getDeclaredMethod("testMethod");
        final Object object = cl.newInstance();
        ExecutorService executor = Executors.newFixedThreadPool(1);
        Callable<Object> task = new Callable<Object>() {
            public Object call() throws IllegalAccessException, InvocationTargetException {
                return method.invoke(object);
            }
        };
        Future<Object> future = executor.submit(task);
        try {
            Object result = future.get(20, TimeUnit.SECONDS);
            return result;
        } catch (TimeoutException ex) {
            return "Method timed out (20 seconds). Please review your code.";
        } catch (InterruptedException e) {
            return "Method interrupted.";
        } catch (ExecutionException e) {
            e.printStackTrace();
            return String.format("ExecutionException thrown! %s", e.getMessage());
        } finally {
            future.cancel(true);
            executor.shutdown();
        }
    }
    sjfm.close();
    System.out.println("Class compilation failed!");
    // Create diagnostics and return them
    List<String> errors = new ArrayList<>();
    for (Diagnostic<? extends JavaFileObject> diagnostic : diagnostics.getDiagnostics()) {
        String s = String.format("[Code:%s] %s on line %d / column %d in %s",
                diagnostic.getCode(),
                diagnostic.getKind().toString(),
                diagnostic.getLineNumber(),
                diagnostic.getColumnNumber(),
                diagnostic.getSource());
        errors.add(s);
        errors.add(diagnostic.toString());
    }
    return errors.toArray();
}
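Side note on the code above: the URLClassLoader is never closed, which can also pin memory (each compile creates a fresh loader that stays reachable). If that matters, the load step could be wrapped in try-with-resources, since URLClassLoader implements Closeable as of Java 7. A sketch (the class and method names here are mine, not part of the code above):

```java
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderSketch {
    // Sketch: load a class from a directory URL and close the loader
    // afterwards. Closing releases the loader's open file/JAR handles;
    // classes already loaded remain usable, but the loader cannot define
    // further classes after close(), so close only once you're done with it.
    public static Class<?> loadOnce(URL dir, String className) throws Exception {
        try (URLClassLoader ucl = new URLClassLoader(new URL[]{dir})) {
            return ucl.loadClass(className);
        }
    }
}
```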
Edit:
I have found a similar question where SoftReferences are pointed out as the cause of the problem. Forcing an OutOfMemoryError should force these to be cleared. However, I'm running this code within a node.js app using the 'node-java' npm package. When I try to force an OOM, my node.js app is killed before the error is thrown.
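For reference, the force-OOM approach looks roughly like this (a sketch; the class and method names are mine). The JVM guarantees that all softly reachable SoftReferences are cleared before an OutOfMemoryError is thrown, which is why this should flush the SharedNameTable cache:

```java
import java.util.ArrayList;
import java.util.List;

public class OomForcer {
    // Sketch of the force-OOM workaround: allocate until the heap is
    // exhausted, then swallow the OutOfMemoryError. Per the SoftReference
    // contract, all softly reachable references (including javac's cached
    // SharedNameTable) are cleared before the error is thrown.
    public static void forceSoftReferenceClear() {
        List<long[]> hog = new ArrayList<>();
        try {
            while (true) {
                hog.add(new long[1_000_000]); // ~8 MB per chunk
            }
        } catch (OutOfMemoryError expected) {
            hog.clear(); // drop the ballast so the heap recovers
        }
    }
}
```

In my setup this is exactly what kills the process, because node-java runs the JVM inside the node process.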

Edit 2:
I have also tried moving jc.getTask(null, null, diagnostics, Arrays.asList(options), null, fileObjects).call() outside the if statement, and it had no effect on memory consumption. I also found another question describing SharedNameTable and ZipFileIndex increasing in size when using javax.tools.JavaCompiler (CompilationTask.call).
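One workaround I'm looking at: reading com.sun.tools.javac.util.Names, javac appears to check a hidden useUnsharedTable option; when it is set, names go into a per-compilation UnsharedNameTable instead of the statically cached, SoftReference-held SharedNameTable. A sketch of passing it through the compiler options (the helper class here is mine, standing in for the compile() method above):

```java
import javax.tools.*;
import java.net.URI;
import java.util.Arrays;

public class UnsharedTableDemo {
    // In-memory source file; stands in for getJavaFileContentsAsString
    // from my compile() method (that helper's internals are assumed).
    static class StringSource extends SimpleJavaFileObject {
        private final String code;
        StringSource(String name, String code) {
            super(URI.create("string:///" + name + ".java"), Kind.SOURCE);
            this.code = code;
        }
        @Override
        public CharSequence getCharContent(boolean ignoreEncodingErrors) {
            return code;
        }
    }

    // Compile one class with javac's hidden useUnsharedTable flag set.
    // -XD passes a hidden key through to the javac internals; unrecognised
    // -XD keys are ignored, so adding the flag should be harmless where
    // unsupported.
    public static boolean compileWithUnsharedTable(String className, String code) throws Exception {
        JavaCompiler jc = ToolProvider.getSystemJavaCompiler();
        DiagnosticCollector<JavaFileObject> diagnostics = new DiagnosticCollector<>();
        try (StandardJavaFileManager fm = jc.getStandardFileManager(diagnostics, null, null)) {
            Iterable<String> options = Arrays.asList(
                    "-d", System.getProperty("java.io.tmpdir"),
                    "-XDuseUnsharedTable=true");
            return jc.getTask(null, fm, diagnostics, options, null,
                    Arrays.asList(new StringSource(className, code))).call();
        }
    }
}
```

I have not yet verified how much this reduces the NameImpl retention in my node-java setup.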