
Conversation


Tethik commented Mar 13, 2015

Added a new function to the wrapper to fetch specific job logs. Since the logs endpoint isn't one of the normal JSON API functions, I had to work around the client a bit as well.
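For context, here is a minimal standalone sketch of the kind of helper this adds. The function name and signature are illustrative rather than the PR's exact code, and it assumes Scrapyd's default behaviour of serving plain-text logs at `/logs/<project>/<spider>/<job_id>.log` outside its JSON API (which is why the response can't be parsed as JSON):

```python
import requests


def fetch_job_log(base_url, project, spider, job_id):
    """Fetch the plain-text log for a single Scrapyd job.

    Scrapyd serves job logs over plain HTTP at
    /logs/<project>/<spider>/<job_id>.log, not via its JSON API,
    so the response is returned as raw text instead of parsed JSON.
    """
    url = "{0}/logs/{1}/{2}/{3}.log".format(
        base_url.rstrip("/"), project, spider, job_id)
    response = requests.get(url)
    response.raise_for_status()
    return response.text


# Example usage (hypothetical project/spider/job values):
# log_text = fetch_job_log("http://localhost:6800", "myproject",
#                          "myspider", "ce54b67080280be62f2e")
```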

Joakim Uddholm added 3 commits March 13, 2015 02:52
…ossible string format. Client accepts non_json responses when explicitly told to.
@coveralls

Coverage Status

Coverage remained the same at 100.0% when pulling 53779d8 on Tethik:get-job-log into 665e55b on djm:master.

djm (Owner) commented Mar 23, 2015

Thanks for the effort, this is just a quick message to say I've seen it - unfortunately I'm abroad at the moment and don't have decent enough internet to review it properly yet.

From a very quick look though, it looks like something we could support; the use case is certainly there even if the logs endpoint isn't part of the actual Scrapyd API as such.

I will get back to this ASAP.

Tethik (Author) commented Apr 3, 2015

Cool, no rush. Good job on the testing/documentation btw; it's a great introduction to best practices for someone new to Python development.

My own use case is that I'm using the lib for a backend admin website where we are, among other things, mining content using Scrapy. Being able to see the logs helps us tell if something went wrong. Having to write a separate HTTP request alongside the lib usage felt ugly since they're both calling the same endpoint.

djm (Owner) commented Apr 23, 2015

Hi @Tethik,

First off, sorry for the long wait - I finally got some time to take a good look at this PR and what it would mean for this project.

I've come to the conclusion that this is out of scope for this specific project. The logs endpoint is not part of the official Scrapyd API, so we would effectively be supporting an internal API of Scrapyd itself, which is probably inadvisable both from our perspective and from theirs if they ever wanted to change it.

I'm going to close the PR for now but thank you very much for your work on it despite it not going in :) It might be worth talking to the scrapyd project to see if they like the idea of surfacing the logs as part of their official API; if and when that happens, I am more than happy to support it as part of this project.

Cheers!

djm closed this Apr 23, 2015
Tethik (Author) commented Apr 24, 2015

No problem, I kind of expected as much. Might look at scrapyd in the future then.
