I tried to pass a list in the `val1` argument, but I was unable to run my spider for the multiple links I passed as a value in `job_args`.
The way I got around this was to use `repr` to convert my list to a string, then on the spider side evaluate the string with `ast.literal_eval` before running the spider.
Can somebody help me with a more "Scrapy" solution?
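For reference, a minimal sketch of the workaround described above (the `urls` value and `val1` contents are illustrative):

```python
import ast

# On the calling side, the list is serialized to a string with repr()
# so it survives being passed as a plain-string spider argument.
urls = ["https://example.com/a", "https://example.com/b"]
val1 = repr(urls)  # "['https://example.com/a', 'https://example.com/b']"

# On the spider side, the string argument is parsed back into a list
# with ast.literal_eval (which only accepts Python literals, so it is
# safe to use on untrusted strings, unlike eval()).
parsed = ast.literal_eval(val1)
assert parsed == urls
```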
As far as I know, spider arguments passed externally (command line, Scrapy Cloud API, this client) are always cast to plain strings and seen as such within the spider. So there is no simple way to avoid serializing/deserializing explicitly, the way you already do. JSON, CSV, or ast.literal_eval would all be valid approaches. Perhaps look into https://github.com/ejulio/spider-feeder, but depending on your use case it could be overkill.
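A sketch of the JSON variant, assuming the caller controls how the argument string is built (the argument name `val1` follows the question; everything else is illustrative):

```python
import json

# Caller side: JSON-encode the list into the plain-string argument that
# gets passed to the spider (e.g. via -a val1=... or the job_args dict).
urls = ["https://example.com/a", "https://example.com/b"]
val1 = json.dumps(urls)  # '["https://example.com/a", "https://example.com/b"]'

# Spider side: decode the string back into a list, e.g. in __init__ or
# start_requests, before using it as start URLs.
start_urls = json.loads(val1)
assert start_urls == urls
```

JSON has the advantage of being language-neutral, so the same argument string can be produced by a shell script or another service, not only by Python code calling repr().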