
jobs.run(spider, job_args = {arg1: val1}) can't have val1 as a list. #142

Closed
jdvala opened this issue Jan 15, 2020 · 2 comments

jdvala commented Jan 15, 2020

I tried to pass a list as the val1 argument, but I was unable to run my spider for the multiple links that I passed as a value in job_args.

The way I got around this was to use repr to convert my list to a string, and then, on the spider side, evaluate that string with ast.literal_eval before running the spider.
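A minimal sketch of that round trip, assuming a hypothetical "links" argument name and the jobs.run call shape from the issue title (not runnable against Scrapy Cloud as written):

```python
import ast

# Hypothetical spider argument: a list of links to crawl.
links = ["https://example.com/a", "https://example.com/b"]

# Client side: job_args values must be plain strings, so serialize the list.
serialized = repr(links)
# jobs.run(spider, job_args={"links": serialized})  # assumed call shape

# Spider side: safely evaluate the string back into a Python list.
# ast.literal_eval only accepts Python literals, so it is safe for this use.
restored = ast.literal_eval(serialized)
assert restored == links
```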

Can somebody help me with a more "Scrapy" solution?

hermit-crab (Contributor) commented

As far as I know, when spider arguments are passed externally (command line, Scrapy Cloud API, this client), they are always cast to plain strings and seen as such within the spider. So there is no simple way to avoid explicitly serializing/deserializing the way you already do. JSON, CSV, or ast.literal_eval would all be valid approaches. Perhaps look into https://github.com/ejulio/spider-feeder, but depending on your use case it could be overkill.
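The JSON variant suggested above can be sketched like this; the "links" argument name and the spider-side signature are assumptions for illustration:

```python
import json

# Client side: serialize the list as JSON before passing it as a job argument,
# since job_args values arrive in the spider as plain strings.
links = ["https://example.com/a", "https://example.com/b"]
job_args = {"links": json.dumps(links)}
# jobs.run(spider, job_args=job_args)  # assumed call shape

# Spider side: a spider's __init__ receives the argument as a string, e.g.
#   def __init__(self, links=None, **kwargs):
#       self.start_urls = json.loads(links)
start_urls = json.loads(job_args["links"])
assert start_urls == links
```

JSON has the advantage over repr/ast.literal_eval that the value stays readable to non-Python consumers of the job metadata.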

jdvala (Author) commented Feb 4, 2020

Thanks, I got my answer. I will close this issue.

@jdvala jdvala closed this as completed Feb 4, 2020