Update scrapy command line doc with additional scrapy parse options #613

Merged

2 participants

@breno

Added several scrapy command line (parse command) options to the docs that weren't previously reflected there.

Some observations:

  • I've modified the description of the --spider argument from the one available in the scrapy built-in help ("use this spider without looking for one"). I think my description is more accurate, but I can change it back if you think it isn't. If my description really is better, I can also apply it to the built-in help of the scrapy command.

  • I've included placeholder arguments for the added options that support/require arguments. I can add them to the other options as well, or remove them from the ones I've added. I think including them better reflects actual usage.
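For illustration, a minimal sketch of how the newly documented options combine on the command line (the spider name, callback, and URL below are hypothetical):

    # Force a specific spider, choose the callback explicitly, run item pipelines,
    # and follow requests two levels deep (options as documented in this change)
    scrapy parse --spider=myspider -c parse_item --pipelines -d 2 http://example.com/page.html

Without --spider, the parse command falls back to spider autodetection based on the given URL.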

@pablohoffman pablohoffman merged commit 0bc2cba into scrapy:master
Showing with 8 additions and 0 deletions.
  1. +8 −0 docs/topics/commands.rst
docs/topics/commands.rst
@@ -367,9 +367,15 @@ method passed with the ``--callback`` option, or ``parse`` if not given.
Supported options:
+* ``--spider=SPIDER``: bypass spider autodetection and force use of specific spider
+
+* ``--a NAME=VALUE``: set spider argument (may be repeated)
+
* ``--callback`` or ``-c``: spider method to use as callback for parsing the
response
+* ``--pipelines``: process items through pipelines
+
* ``--rules`` or ``-r``: use :class:`~scrapy.contrib.spiders.CrawlSpider`
rules to discover the callback (ie. spider method) to use for parsing the
response
@@ -378,6 +384,8 @@ Supported options:
* ``--nolinks``: don't show extracted links
+* ``--nocolour``: avoid using pygments to colorize the output
+
* ``--depth`` or ``-d``: depth level for which the requests should be followed
recursively (default: 1)
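As a further sketch covering the remaining additions (spider name and URL again hypothetical), the rules-based and output-control flags might be combined like this:

    # Let CrawlSpider rules choose the callback, hide extracted links,
    # and disable pygments colorization of the output
    scrapy parse --spider=myspider -r --nolinks --nocolour http://example.com/category/1.html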