Merge pull request #5681 from mattkohl-flex/master
Documentation: typo fixes
kmike committed Oct 18, 2022
2 parents 20b79a0 + c49764f commit 82f25bc
Showing 5 changed files with 5 additions and 5 deletions.
2 changes: 1 addition & 1 deletion docs/intro/install.rst
@@ -187,7 +187,7 @@ solutions:
* Install `homebrew`_ following the instructions in https://brew.sh/

* Update your ``PATH`` variable to state that homebrew packages should be
- used before system packages (Change ``.bashrc`` to ``.zshrc`` accordantly
+ used before system packages (Change ``.bashrc`` to ``.zshrc`` accordingly
if you're using `zsh`_ as default shell)::

echo "export PATH=/usr/local/bin:/usr/local/sbin:$PATH" >> ~/.bashrc
2 changes: 1 addition & 1 deletion docs/topics/contracts.rst
@@ -102,7 +102,7 @@ override three methods:
.. method:: Contract.post_process(output)

This allows processing the output of the callback. Iterators are
- converted listified before being passed to this hook.
+ converted to lists before being passed to this hook.

Raise :class:`~scrapy.exceptions.ContractFail` from
:class:`~scrapy.contracts.Contract.pre_process` or
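
The ``post_process`` hook touched in this hunk belongs to Scrapy's spider contracts API. A minimal sketch of a custom contract built on it, assuming the documented ``Contract``/``ContractFail`` API; the contract name, its argument, and the failure message are illustrative, not part of the commit:

    from scrapy.contracts import Contract
    from scrapy.exceptions import ContractFail


    class MinItemsContract(Contract):
        """Hypothetical contract: fail unless the callback yields at least N items."""

        name = "min_items"  # referenced in a callback docstring as: @min_items 3

        def post_process(self, output):
            # By the time this hook runs, iterators returned by the callback
            # have already been converted to lists.
            expected = int(self.args[0])
            if len(output) < expected:
                raise ContractFail(
                    f"expected at least {expected} items, got {len(output)}"
                )
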
2 changes: 1 addition & 1 deletion docs/topics/extensions.rst
@@ -17,7 +17,7 @@ settings, just like any other Scrapy code.

It is customary for extensions to prefix their settings with their own name, to
avoid collision with existing (and future) extensions. For example, a
- hypothetic extension to handle `Google Sitemaps`_ would use settings like
+ hypothetical extension to handle `Google Sitemaps`_ would use settings like
``GOOGLESITEMAP_ENABLED``, ``GOOGLESITEMAP_DEPTH``, and so on.

.. _Google Sitemaps: https://en.wikipedia.org/wiki/Sitemaps
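
The settings-prefix convention described in this hunk is typically applied in an extension's ``from_crawler`` hook. A minimal sketch, assuming the standard extension API; the extension class and the default depth value are hypothetical, while the ``GOOGLESITEMAP_*`` setting names come from the docs text above:

    from scrapy.exceptions import NotConfigured


    class GoogleSitemapExtension:
        """Hypothetical extension illustrating prefixed settings."""

        def __init__(self, depth):
            self.depth = depth

        @classmethod
        def from_crawler(cls, crawler):
            # All of this extension's settings share the GOOGLESITEMAP_ prefix
            # to avoid colliding with settings of other extensions.
            if not crawler.settings.getbool("GOOGLESITEMAP_ENABLED"):
                raise NotConfigured
            depth = crawler.settings.getint("GOOGLESITEMAP_DEPTH", 3)  # default is assumed
            return cls(depth)
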
2 changes: 1 addition & 1 deletion docs/topics/leaks.rst
@@ -154,7 +154,7 @@ Too many spiders?
If your project has too many spiders executed in parallel,
the output of :func:`prefs()` can be difficult to read.
For this reason, that function has a ``ignore`` argument which can be used to
- ignore a particular class (and all its subclases). For
+ ignore a particular class (and all its subclasses). For
example, this won't show any live references to spiders:

>>> from scrapy.spiders import Spider
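
For context, a sketch of how the ``ignore`` argument discussed in this hunk is used; it assumes the telnet-console ``prefs()`` shortcut corresponds to ``scrapy.utils.trackref.print_live_refs``, which is what the alias import below stands in for:

    from scrapy.spiders import Spider
    from scrapy.utils.trackref import print_live_refs as prefs

    # Suppress live references to Spider and all of its subclasses in the report.
    prefs(ignore=Spider)
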
2 changes: 1 addition & 1 deletion docs/topics/request-response.rst
@@ -446,7 +446,7 @@ class).
Scenarios where changing the request fingerprinting algorithm may cause
undesired results include, for example, using the HTTP cache middleware (see
:class:`~scrapy.downloadermiddlewares.httpcache.HttpCacheMiddleware`).
- Changing the request fingerprinting algorithm would invalidade the current
+ Changing the request fingerprinting algorithm would invalidate the current
cache, requiring you to redownload all requests again.

Otherwise, set :setting:`REQUEST_FINGERPRINTER_IMPLEMENTATION` to ``'2.7'`` in
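
The ``REQUEST_FINGERPRINTER_IMPLEMENTATION`` setting mentioned in this hunk is a one-line project setting; a sketch of what the docs text above describes, with the value ``'2.7'`` taken from that text:

    # settings.py -- opt into the newer request fingerprinting implementation.
    # Per the docs text above, switching implementations invalidates an existing
    # HTTP cache, so only do this if redownloading requests is acceptable.
    REQUEST_FINGERPRINTER_IMPLEMENTATION = "2.7"
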
