
I'm building a Java application with the Spark framework (with embedded Jetty) and the Handlebars template engine. But when I get a 500 Internal Server Error, the console says nothing. I have added to my pom.xml the dependencies listed here: http://sparkjava.com/documentation.html#add-a-logger but it does not print all exceptions/errors (such as 500 errors).

Here are my pom.xml dependencies:

<dependencies>

    <!-- FRAMEWORK:     Spark -->
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-core</artifactId>
        <version>2.5</version>
    </dependency>

    <!-- TEMPLATES:     Handlebars -->
    <dependency>
        <groupId>com.sparkjava</groupId>
        <artifactId>spark-template-handlebars</artifactId>
        <version>2.3</version>
    </dependency>

    <!-- DB-MAPPING:    sql2o -->
    <dependency>
        <groupId>org.sql2o</groupId>
        <artifactId>sql2o</artifactId>
        <version>1.5.4</version>
    </dependency>

    <!-- DRIVERS: sqlite-->
    <dependency>
        <groupId>org.xerial</groupId>
        <artifactId>sqlite-jdbc</artifactId>
        <version>3.8.11.2</version>
    </dependency>

    <!-- LOGGER:        slf4j -->
    <dependency>
        <groupId>org.slf4j</groupId>
        <artifactId>slf4j-simple</artifactId>
        <version>1.7.21</version>
    </dependency>

</dependencies>

How can I enable all logging for Spark?

  • You can configure log4j to capture logs normally.

4 Answers


To enable logging, just add the following dependency to your project:

<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>slf4j-simple</artifactId>
  <version>1.7.21</version>
</dependency>

You can also register a catch-all Spark exception handler to log uncaught exceptions:

Spark.exception(Exception.class, (exception, request, response) -> {
    exception.printStackTrace();
});
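
If you prefer to route the error through slf4j and return a useful body instead of a blank page, a minimal sketch could look like this (the class name and route are placeholders, not part of the original question):

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import spark.Spark;

public class App {
    private static final Logger LOG = LoggerFactory.getLogger(App.class);

    public static void main(String[] args) {
        // Example route that always fails, to exercise the handler
        Spark.get("/boom", (req, res) -> {
            throw new RuntimeException("boom");
        });

        // Catch-all handler: log through slf4j, then answer with a 500 body
        Spark.exception(Exception.class, (exception, request, response) -> {
            LOG.error("Request to {} failed", request.url(), exception);
            response.status(500);
            response.body("Internal error: " + exception.getMessage());
        });
    }
}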


Use log4j to provide a logging implementation. Without one, you have no way of seeing why you are getting an internal server error.

http://logging.apache.org/log4j/2.x/
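
For example, since the Spark web framework logs through the slf4j API, you could drop slf4j-simple and add the log4j 2 binding instead. A sketch for the pom.xml (the versions are merely ones current around the time of this question):

<!-- Routes slf4j calls (which Spark makes) to log4j 2 -->
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-slf4j-impl</artifactId>
    <version>2.6.2</version>
</dependency>
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-core</artifactId>
    <version>2.6.2</version>
</dependency>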



Not sure if this meant disabling Spark's or Hadoop's built-in logging, but if that's the case, setting the log level on the SparkContext helped me.

sc.setLogLevel("ERROR");

Possible options are ALL, DEBUG, ERROR, FATAL, INFO, OFF, TRACE, and WARN.

https://spark.apache.org/docs/2.2.0/api/java/org/apache/spark/SparkContext.html#setLogLevel-java.lang.String-
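
Note that setLogLevel belongs to Apache Spark rather than the Spark web framework the question is about. For completeness, in Apache Spark's Java API it would look roughly like this (the class, app name, and master are placeholders):

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class LogLevelExample {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf().setAppName("example").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Suppress everything below ERROR in Spark's own log output
        sc.setLogLevel("ERROR");

        // ... job code ...

        sc.stop();
    }
}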



Have you added a log4j properties file? Have a look at this documentation.

Configuring Logging: Spark uses log4j for logging. You can configure it by adding a log4j.properties file in the conf directory. One way to start is to copy the existing log4j.properties.template located there.
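
For reference, a minimal conf/log4j.properties in the spirit of the bundled template might look like this (a sketch; adjust the level and pattern to taste):

# Log everything at INFO and above to the console
log4j.rootCategory=INFO, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n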

3 Comments

I'm using version 2.5, and the website says to add slf4j: sparkjava.com/documentation.html#add-a-logger
The documentation you're referring to is for Spark the data processing framework, not Spark the web framework. Yes, it's an annoying namespace conflict. ;)
This does not seem to answer the question. This is for Spark Java, not Apache Spark :) sparkjava.com/documentation.html#how-do-i-enable-logging
