Job submission followed by quick get job details fails with 404 #516
Note: even if I wait a while (tested with 30 seconds) before fetching the job details, I still get a 404. My config is as follows:
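The reporter's actual config did not survive in this transcript. For reference, a minimal sketch of the spark-jobserver settings discussed in this thread (the values and the exact nesting are assumptions, not the reporter's configuration) might look like:

```
# Illustrative fragment of a spark-jobserver config (HOCON).
spark {
  jobserver {
    # The mode the reporter tested (one shared JVM for all contexts).
    context-per-jvm = false

    # The file DAO directory that was suggested for clearing; the
    # path shown is the default mentioned in the thread, not verified.
    filedao {
      rootdir = /tmp/spark-jobserver/filedao/data
    }
  }
}
```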
Have you tried clearing the /tmp/spark-jobserver/filedao/data directory? That has worked for people in the past; the directory may be corrupted.
Yes, I deleted /tmp/spark-jobserver before starting the jobserver.
@bjoernlohrmann Which version of SJS are you using?
I was using the current master at the time.
@bjoernlohrmann A fix (485cb6a) related to this went in. Can you see whether this is reproducible now?
I tested a build created from today's master (d5190aa), which worked (jobserver in local mode, context-per-jvm=false).
Closing this.
From the gitter channel:
Björn Lohrmann @bjoernlohrmann 03:06
My apologies for complaining all the time. I am currently doing a smoke test of the jobserver and have run into an issue. I am getting the following behavior on a freshly started jobserver (vm-per-context is false):
Jobserver log contains the following suspicious looking line:
[2016-06-22 17:29:48,914] INFO r$RemoteDeadLetterActorRef [] [akka://JobServer/deadLetters] - Message [spark.jobserver.JobInfoActor$JobConfigStored$] from Actor[akka://JobServer/user/job-info#1380245094] to Actor[akka://JobServer/deadLetters] was not delivered. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'akka.log-dead-letters' and 'akka.log-dead-letters-during-shutdown'.
@noorul is also facing similar issues with jvm-per-context set to true.
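Until the server-side race is fixed, a client can tolerate the transient 404 by polling job details with a short retry loop. This is a generic workaround sketch, not part of spark-jobserver's API: `fetch` is a hypothetical callable standing in for whatever HTTP call performs the get-job-details request, and is assumed to raise `LookupError` on a 404 response.

```python
import time

def get_job_info_with_retry(fetch, job_id, retries=5, delay=0.5):
    """Poll for job details, retrying while the server answers 404.

    `fetch(job_id)` is a hypothetical callable that returns the job-info
    dict, or raises LookupError when the job is not yet visible (404).
    """
    for attempt in range(retries):
        try:
            return fetch(job_id)
        except LookupError:
            if attempt == retries - 1:
                raise  # still 404 after all retries; give up
            time.sleep(delay)

# Simulated server: the first two polls 404, then the job info appears,
# mimicking the freshly-submitted-job behavior described in this issue.
calls = {"n": 0}

def fake_fetch(job_id):
    calls["n"] += 1
    if calls["n"] < 3:
        raise LookupError("404: no such job")
    return {"jobId": job_id, "status": "RUNNING"}

info = get_job_info_with_retry(fake_fetch, "abc123", delay=0.01)
print(info["status"])
```

A small fixed delay is enough here; an exponential backoff would be the natural refinement if the window between submission and visibility is longer.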