Spark swallows exceptions #585
Comments
You can register a catch-all Spark exception handler to log uncaught exceptions:

```java
Spark.exception(Exception.class, (exception, request, response) -> {
    exception.printStackTrace();
});
```

IMHO, this should be the default behavior, though.
@lewtds thank you, that would print the exception thrown by the faulty code above. However, when no exceptions are printed at all, it's a maddening process to track down the faulty code yourself, especially in a large codebase. So I also think this should be the default.
I did that, but when doing so the status code in the response is 404 instead of the expected 500. Does this happen to you as well?
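For what it's worth, the 404 can usually be avoided by setting the status explicitly inside the handler, since registering an exception handler makes you responsible for the response. A minimal sketch, assuming the standard Spark 2.x API (the route path and messages are made up for illustration, and this needs the `spark-core` dependency on the classpath):

```java
import static spark.Spark.exception;
import static spark.Spark.get;

public class ErrorLoggingApp {
    public static void main(String[] args) {
        // A route that simulates faulty code throwing an unchecked exception.
        get("/hello", (request, response) -> {
            throw new NullPointerException("boom");
        });

        // Catch-all handler: log the exception AND set the status/body
        // explicitly, otherwise the client may see a 404 instead of a 500.
        exception(Exception.class, (e, request, response) -> {
            e.printStackTrace();
            response.status(500);
            response.body("internal server error");
        });
    }
}
```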
This is the same as #486 |
lewtds's solution works, but I still don't get the expected behavior on 2.5.2. |
Whenever I write code that causes, e.g., a NullPointerException, such as calling methods on an object that is null, Eclipse does not print any exceptions at all. See the following code:
I didn't realize get() would return null if the key wasn't in the HashMap. Running this code gave me a 500 server-side error, yet Eclipse did not print a single exception. I had to find and fix the error myself, like so:
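The original snippets aren't shown here, but the pattern described (a method call on the null returned by `HashMap.get`) can be reproduced with a minimal stand-in; all names below are hypothetical, not the reporter's actual code:

```java
import java.util.HashMap;
import java.util.Map;

public class NullGetDemo {
    // Buggy version: assumes the key is present, so get() never returns null.
    static String describe(Map<String, String> m, String key) {
        return m.get(key).toUpperCase(); // NPE when the key is absent
    }

    // Fixed version: check for null before dereferencing.
    static String describeSafe(Map<String, String> m, String key) {
        String value = m.get(key);
        return value == null ? "(missing)" : value.toUpperCase();
    }

    public static void main(String[] args) {
        Map<String, String> m = new HashMap<>();
        m.put("a", "hello");

        System.out.println(describeSafe(m, "a")); // HELLO
        System.out.println(describeSafe(m, "b")); // (missing)

        try {
            describe(m, "b");
        } catch (NullPointerException e) {
            // Inside a Spark route, this exception would be swallowed
            // unless an exception handler is registered.
            System.out.println("NPE thrown as described");
        }
    }
}
```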
Please note: I have slf4j in my pom.xml and build path: