docs: use __init__ method instead of constructor #4088

Merged
6 commits merged on Nov 12, 2019
Changes from 2 commits
2 changes: 1 addition & 1 deletion docs/conf.py
Expand Up @@ -237,7 +237,7 @@
r'\bContractsManager\b$',

# For default contracts we only want to document their general purpose in
# their constructor, the methods they reimplement to achieve that purpose
# their __init__ method, the methods they reimplement to achieve that purpose
# should be irrelevant to developers using those contracts.
r'\w+Contract\.(adjust_request_args|(pre|post)_process)$',

10 changes: 5 additions & 5 deletions docs/news.rst
Expand Up @@ -84,12 +84,12 @@ New features
convenient way to build JSON requests (:issue:`3504`, :issue:`3505`)

* A ``process_request`` callback passed to the :class:`~scrapy.spiders.Rule`
constructor now receives the :class:`~scrapy.http.Response` object that
``__init__`` method now receives the :class:`~scrapy.http.Response` object that
originated the request as its second argument (:issue:`3682`)

* A new ``restrict_text`` parameter for the
:attr:`LinkExtractor <scrapy.linkextractors.lxmlhtml.LxmlLinkExtractor>`
constructor allows filtering links by linking text (:issue:`3622`,
``__init__`` method allows filtering links by linking text (:issue:`3622`,
:issue:`3635`)

* A new :setting:`FEED_STORAGE_S3_ACL` setting allows defining a custom ACL
Expand Down Expand Up @@ -255,7 +255,7 @@ The following deprecated APIs have been removed (:issue:`3578`):

* From :class:`~scrapy.selector.Selector`:

* ``_root`` (both the constructor argument and the object property, use
* ``_root`` (both the ``__init__`` method argument and the object property, use
``root``)

* ``extract_unquoted`` (use ``getall``)
Expand Down Expand Up @@ -2479,7 +2479,7 @@ Scrapy changes:
- removed ``ENCODING_ALIASES`` setting, as encoding auto-detection has been moved to the `w3lib`_ library
- promoted :ref:`topics-djangoitem` to main contrib
- LogFormatter methods now return dicts (instead of strings) to support lazy formatting (:issue:`164`, :commit:`dcef7b0`)
- downloader handlers (:setting:`DOWNLOAD_HANDLERS` setting) now receive settings as the first argument of the constructor
- downloader handlers (:setting:`DOWNLOAD_HANDLERS` setting) now receive settings as the first argument of the ``__init__`` method
- replaced memory usage accounting with (more portable) `resource`_ module, removed ``scrapy.utils.memory`` module
- removed signal: ``scrapy.mail.mail_sent``
- removed ``TRACK_REFS`` setting, now :ref:`trackrefs <topics-leaks-trackrefs>` is always enabled
Expand Down Expand Up @@ -2693,7 +2693,7 @@ API changes
- ``Request.copy()`` and ``Request.replace()`` now also copies their ``callback`` and ``errback`` attributes (#231)
- Removed ``UrlFilterMiddleware`` from ``scrapy.contrib`` (already disabled by default)
- Offsite middleware doesn't filter out any request coming from a spider that doesn't have an allowed_domains attribute (#225)
- Removed Spider Manager ``load()`` method. Now spiders are loaded in the constructor itself.
- Removed Spider Manager ``load()`` method. Now spiders are loaded in the ``__init__`` method itself.
- Changes to Scrapy Manager (now called "Crawler"):
- ``scrapy.core.manager.ScrapyManager`` class renamed to ``scrapy.crawler.Crawler``
- ``scrapy.core.manager.scrapymanager`` singleton moved to ``scrapy.project.crawler``
4 changes: 2 additions & 2 deletions docs/topics/email.rst
Expand Up @@ -21,7 +21,7 @@ Quick example
=============

There are two ways to instantiate the mail sender. You can instantiate it using
the standard constructor::
the standard ``__init__`` method::

from scrapy.mail import MailSender
mailer = MailSender()
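The second way is to build it from a Scrapy settings object. A minimal sketch, assuming the documented ``MailSender.from_settings`` class method and standard project settings:

from scrapy.mail import MailSender
from scrapy.utils.project import get_project_settings

# Build the mail sender from the project settings (MAIL_HOST, MAIL_FROM, etc.)
mailer = MailSender.from_settings(get_project_settings())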
Expand Down Expand Up @@ -111,7 +111,7 @@ uses `Twisted non-blocking IO`_, like the rest of the framework.
Mail settings
=============

These settings define the default constructor values of the :class:`MailSender`
These settings define the default ``__init__`` method values of the :class:`MailSender`
class, and can be used to configure e-mail notifications in your project without
writing any code (for those extensions and code that uses :class:`MailSender`).

40 changes: 20 additions & 20 deletions docs/topics/exporters.rst
Expand Up @@ -87,8 +87,8 @@ described next.
1. Declaring a serializer in the field
--------------------------------------

If you use :class:`~.Item` you can declare a serializer in the
:ref:`field metadata <topics-items-fields>`. The serializer must be
a callable which receives a value and returns its serialized form.

Example::
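# A hedged sketch of such a field-level serializer; the Product item and
# serialize_price function are illustrative names only.
import scrapy

def serialize_price(value):
    return '$ %s' % str(value)

class Product(scrapy.Item):
    name = scrapy.Field()
    price = scrapy.Field(serializer=serialize_price)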
Expand Down Expand Up @@ -144,7 +144,7 @@ BaseItemExporter
defining what fields to export, whether to export empty fields, or which
encoding to use.

These features can be configured through the constructor arguments which
These features can be configured through the ``__init__`` method arguments which
populate their respective instance attributes: :attr:`fields_to_export`,
:attr:`export_empty_fields`, :attr:`encoding`, :attr:`indent`.
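For instance, a minimal sketch of setting those attributes through keyword arguments (the field names and output file are assumed for illustration):

from scrapy.exporters import CsvItemExporter

f = open('items.csv', 'wb')
exporter = CsvItemExporter(f, fields_to_export=['name', 'price'], encoding='utf-8')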

Expand Down Expand Up @@ -246,8 +246,8 @@ XmlItemExporter
:param item_element: The name of each item element in the exported XML.
:type item_element: str

The additional keyword arguments of this constructor are passed to the
:class:`BaseItemExporter` constructor.
The additional keyword arguments of this ``__init__`` method are passed to the
:class:`BaseItemExporter` ``__init__`` method.

A typical output of this exporter would be::

Expand Down Expand Up @@ -306,9 +306,9 @@ CsvItemExporter
multi-valued fields, if found.
:type include_headers_line: str

The additional keyword arguments of this constructor are passed to the
:class:`BaseItemExporter` constructor, and the leftover arguments to the
`csv.writer`_ constructor, so you can use any ``csv.writer`` constructor
The additional keyword arguments of this ``__init__`` method are passed to the
:class:`BaseItemExporter` ``__init__`` method, and the leftover arguments to the
`csv.writer`_ ``__init__`` method, so you can use any ``csv.writer`` ``__init__`` method
argument to customize this exporter.
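A hedged example of that pass-through, handing ``csv.writer`` its ``delimiter`` argument (file name assumed for illustration):

from scrapy.exporters import CsvItemExporter

exporter = CsvItemExporter(open('items.csv', 'wb'), include_headers_line=True, delimiter=';')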

A typical output of this exporter would be::
Expand All @@ -334,8 +334,8 @@ PickleItemExporter

For more information, refer to the `pickle module documentation`_.

The additional keyword arguments of this constructor are passed to the
:class:`BaseItemExporter` constructor.
The additional keyword arguments of this ``__init__`` method are passed to the
:class:`BaseItemExporter` ``__init__`` method.

Pickle isn't a human readable format, so no output examples are provided.

Expand All @@ -351,8 +351,8 @@ PprintItemExporter
:param file: the file-like object to use for exporting the data. Its ``write`` method should
accept ``bytes`` (a disk file opened in binary mode, a ``io.BytesIO`` object, etc)

The additional keyword arguments of this constructor are passed to the
:class:`BaseItemExporter` constructor.
The additional keyword arguments of this ``__init__`` method are passed to the
:class:`BaseItemExporter` ``__init__`` method.

A typical output of this exporter would be::

Expand All @@ -367,10 +367,10 @@ JsonItemExporter
.. class:: JsonItemExporter(file, \**kwargs)

Exports Items in JSON format to the specified file-like object, writing all
objects as a list of objects. The additional constructor arguments are
passed to the :class:`BaseItemExporter` constructor, and the leftover
arguments to the `JSONEncoder`_ constructor, so you can use any
`JSONEncoder`_ constructor argument to customize this exporter.
objects as a list of objects. The additional ``__init__`` method arguments are
passed to the :class:`BaseItemExporter` ``__init__`` method, and the leftover
arguments to the `JSONEncoder`_ ``__init__`` method, so you can use any
`JSONEncoder`_ ``__init__`` method argument to customize this exporter.
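A hedged example of customizing the output through ``JSONEncoder`` arguments such as ``indent`` and ``sort_keys`` (file name assumed for illustration):

from scrapy.exporters import JsonItemExporter

exporter = JsonItemExporter(open('items.json', 'wb'), indent=4, sort_keys=True)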

:param file: the file-like object to use for exporting the data. Its ``write`` method should
accept ``bytes`` (a disk file opened in binary mode, a ``io.BytesIO`` object, etc)
Expand Down Expand Up @@ -398,10 +398,10 @@ JsonLinesItemExporter
.. class:: JsonLinesItemExporter(file, \**kwargs)

Exports Items in JSON format to the specified file-like object, writing one
JSON-encoded item per line. The additional constructor arguments are passed
to the :class:`BaseItemExporter` constructor, and the leftover arguments to
the `JSONEncoder`_ constructor, so you can use any `JSONEncoder`_
constructor argument to customize this exporter.
JSON-encoded item per line. The additional ``__init__`` method arguments are passed
to the :class:`BaseItemExporter` ``__init__`` method, and the leftover arguments to
the `JSONEncoder`_ ``__init__`` method, so you can use any `JSONEncoder`_
``__init__`` method argument to customize this exporter.

:param file: the file-like object to use for exporting the data. Its ``write`` method should
accept ``bytes`` (a disk file opened in binary mode, a ``io.BytesIO`` object, etc)
2 changes: 1 addition & 1 deletion docs/topics/extensions.rst
Expand Up @@ -28,7 +28,7 @@ Loading & activating extensions

Extensions are loaded and activated at startup by instantiating a single
instance of the extension class. Therefore, all the extension initialization
code must be performed in the class constructor (``__init__`` method).
code must be performed in the class ``__init__`` method.
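A minimal sketch of an extension following this pattern (the class name and logged message are assumptions; ``from_crawler`` and the signals API are the standard hooks):

from scrapy import signals

class SpiderOpenedLogger(object):

    def __init__(self, stats):
        # all initialization happens here, when the single instance is created
        self.stats = stats

    @classmethod
    def from_crawler(cls, crawler):
        ext = cls(crawler.stats)
        crawler.signals.connect(ext.spider_opened, signal=signals.spider_opened)
        return ext

    def spider_opened(self, spider):
        spider.logger.info('Spider opened: %s', spider.name)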

To make an extension available, add it to the :setting:`EXTENSIONS` setting in
your Scrapy settings. In :setting:`EXTENSIONS`, each extension is represented
8 changes: 4 additions & 4 deletions docs/topics/items.rst
Expand Up @@ -16,12 +16,12 @@ especially in a larger project with many spiders.
To define common output data format Scrapy provides the :class:`Item` class.
:class:`Item` objects are simple containers used to collect the scraped data.
They provide a `dictionary-like`_ API with a convenient syntax for declaring
their available fields.

Various Scrapy components use extra information provided by Items:
exporters look at declared fields to figure out columns to export,
serialization can be customized using Item fields metadata, :mod:`trackref`
tracks Item instances to help find memory leaks
(see :ref:`topics-leaks-trackrefs`), etc.
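As a hedged illustration of declaring an Item and using its dictionary-like API (the ``Product`` item and its fields are invented for the example):

import scrapy

class Product(scrapy.Item):
    name = scrapy.Field()
    price = scrapy.Field()

product = Product(name='Desktop PC', price=1000)  # dict-style __init__
print(product['name'])                            # dict-style access
product['price'] = 1200                           # dict-style assignment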

.. _dictionary-like: https://docs.python.org/2/library/stdtypes.html#dict
Expand Down Expand Up @@ -237,7 +237,7 @@ Item objects

Return a new Item optionally initialized from the given argument.

Items replicate the standard `dict API`_, including its constructor, and
Items replicate the standard `dict API`_, including its ``__init__`` method and
also provide the following additional API members:

.. automethod:: copy
26 changes: 13 additions & 13 deletions docs/topics/loaders.rst
Expand Up @@ -26,7 +26,7 @@ Using Item Loaders to populate items

To use an Item Loader, you must first instantiate it. You can either
instantiate it with a dict-like object (e.g. Item or dict) or without one, in
which case an Item is automatically instantiated in the Item Loader constructor
which case an Item is automatically instantiated in the Item Loader ``__init__`` method
using the Item class specified in the :attr:`ItemLoader.default_item_class`
attribute.
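A hedged sketch of both ways of instantiating a loader (the URL and body are placeholders):

from scrapy.http import TextResponse
from scrapy.loader import ItemLoader

response = TextResponse(url='http://example.com', body=b'<p>x</p>', encoding='utf-8')

l1 = ItemLoader(item={}, response=response)  # with an explicit dict-like object
l2 = ItemLoader(response=response)           # item created via default_item_class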

Expand Down Expand Up @@ -265,7 +265,7 @@ There are several ways to modify Item Loader context values:
loader.context['unit'] = 'cm'

2. On Item Loader instantiation (the keyword arguments of Item Loader
constructor are stored in the Item Loader context)::
``__init__`` method are stored in the Item Loader context)::
Member:

Suggested change
``__init__`` methodare stored in the Item Loader context)::
``__init__`` method are stored in the Item Loader context)::

I’ve seen this issue in a few more replacements below, where the character right after ‘method’ (a space, a dot, a comma) was accidentally removed. Could you please take a look?

Contributor Author:

🤦‍♂ of course

Contributor Author:

@Gallaecio thanks for your patience, change pushed.


loader = ItemLoader(product, unit='cm')

Expand Down Expand Up @@ -494,7 +494,7 @@ ItemLoader objects
.. attribute:: default_item_class

An Item class (or factory), used to instantiate items when not given in
the constructor.
the ``__init__`` method.

.. attribute:: default_input_processor

Expand All @@ -509,15 +509,15 @@ ItemLoader objects
.. attribute:: default_selector_class

The class used to construct the :attr:`selector` of this
:class:`ItemLoader`, if only a response is given in the constructor.
If a selector is given in the constructor this attribute is ignored.
:class:`ItemLoader`, if only a response is given in the ``__init__`` method.
If a selector is given in the ``__init__`` method this attribute is ignored.
This attribute is sometimes overridden in subclasses.

.. attribute:: selector

The :class:`~scrapy.selector.Selector` object to extract data from.
It's either the selector given in the constructor or one created from
the response given in the constructor using the
It's either the selector given in the ``__init__`` method or one created from
the response given in the ``__init__`` method using the
:attr:`default_selector_class`. This attribute is meant to be
read-only.

Expand Down Expand Up @@ -642,7 +642,7 @@ Here is a list of all built-in processors:
.. class:: Identity

The simplest processor, which doesn't do anything. It returns the original
values unchanged. It doesn't receive any constructor arguments, nor does it
values unchanged. It doesn't receive any ``__init__`` method arguments, nor does it
accept Loader contexts.

Example::
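>>> from scrapy.loader.processors import Identity  # hedged sketch: values pass through unchanged
>>> proc = Identity()
>>> proc(['one', 'two', 'three'])
['one', 'two', 'three']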
Expand All @@ -656,7 +656,7 @@ Here is a list of all built-in processors:

Returns the first non-null/non-empty value from the values received,
so it's typically used as an output processor to single-valued fields.
It doesn't receive any constructor arguments, nor does it accept Loader contexts.
It doesn't receive any ``__init__`` method arguments, nor does it accept Loader contexts.

Example::

Expand All @@ -667,7 +667,7 @@ Here is a list of all built-in processors:

.. class:: Join(separator=u' ')

Returns the values joined with the separator given in the constructor, which
Returns the values joined with the separator given in the ``__init__`` method, which
defaults to ``u' '``. It doesn't accept Loader contexts.

When using the default separator, this processor is equivalent to the
Expand Down Expand Up @@ -705,7 +705,7 @@ Here is a list of all built-in processors:
those which do, this processor will pass the currently active :ref:`Loader
context <topics-loaders-context>` through that parameter.

The keyword arguments passed in the constructor are used as the default
The keyword arguments passed in the ``__init__`` method are used as the default
Loader context values passed to each function call. However, the final
Loader context values passed to functions are overridden with the currently
active Loader context accessible through the :meth:`ItemLoader.context`
Expand Down Expand Up @@ -749,12 +749,12 @@ Here is a list of all built-in processors:
['HELLO', 'THIS', 'IS', 'SCRAPY']

As with the Compose processor, functions can receive Loader contexts, and
constructor keyword arguments are used as default context values. See
``__init__`` method keyword arguments are used as default context values. See
:class:`Compose` processor for more info.

.. class:: SelectJmes(json_path)

Queries the value using the json path provided to the constructor and returns the output.
Queries the value using the json path provided to the ``__init__`` method and returns the output.
Requires jmespath (https://github.com/jmespath/jmespath.py) to run.
This processor takes only one input at a time.
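A hedged usage sketch (requires the ``jmespath`` package to be installed):

>>> from scrapy.loader.processors import SelectJmes
>>> proc = SelectJmes('foo')
>>> proc({'foo': 'bar'})
'bar'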

14 changes: 7 additions & 7 deletions docs/topics/request-response.rst
Expand Up @@ -137,7 +137,7 @@ Request objects

A string containing the URL of this request. Keep in mind that this
attribute contains the escaped URL, so it can differ from the URL passed in
the constructor.
the ``__init__`` method.

This attribute is read-only. To change the URL of a Request use
:meth:`replace`.
Expand Down Expand Up @@ -400,7 +400,7 @@ fields with form data from :class:`Response` objects.

.. class:: FormRequest(url, [formdata, ...])

The :class:`FormRequest` class adds a new keyword parameter to the constructor. The
The :class:`FormRequest` class adds a new keyword parameter to the ``__init__`` method. The
remaining arguments are the same as for the :class:`Request` class and are
not documented here.

Expand Down Expand Up @@ -473,7 +473,7 @@ fields with form data from :class:`Response` objects.
:type dont_click: boolean

The other parameters of this class method are passed directly to the
:class:`FormRequest` constructor.
:class:`FormRequest` ``__init__`` method.
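For context, a hedged sketch of a typical login spider using ``from_response`` (the URL, form fields and callback are assumptions for the example):

import scrapy

class LoginSpider(scrapy.Spider):
    name = 'example.com'
    start_urls = ['http://www.example.com/users/login.php']

    def parse(self, response):
        return scrapy.FormRequest.from_response(
            response,
            formdata={'username': 'john', 'password': 'secret'},
            callback=self.after_login,
        )

    def after_login(self, response):
        # check the login succeeded before going on with further requests
        if b'authentication failed' in response.body:
            self.logger.error('Login failed')
            return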

.. versionadded:: 0.10.3
The ``formname`` parameter.
Expand Down Expand Up @@ -547,7 +547,7 @@ dealing with JSON requests.

.. class:: JsonRequest(url, [... data, dumps_kwargs])

The :class:`JsonRequest` class adds two new keyword parameters to the constructor. The
The :class:`JsonRequest` class adds two new keyword parameters to the ``__init__`` method. The
remaining arguments are the same as for the :class:`Request` class and are
not documented here.

Expand All @@ -556,7 +556,7 @@ dealing with JSON requests.

:param data: is any JSON serializable object that needs to be JSON encoded and assigned to body.
if :attr:`Request.body` argument is provided this parameter will be ignored.
if :attr:`Request.body` argument is not provided and data argument is provided :attr:`Request.method` will be
set to ``'POST'`` automatically.
:type data: JSON serializable object
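A hedged usage sketch (the URL and payload are placeholders):

from scrapy.http import JsonRequest

data = {'name1': 'value1', 'name2': 'value2'}
# ``data`` is JSON-encoded into the body and the method defaults to 'POST'
request = JsonRequest(url='http://www.example.com/post/action', data=data)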

Expand Down Expand Up @@ -721,7 +721,7 @@ TextResponse objects
:class:`Response` class, which is meant to be used only for binary data,
such as images, sounds or any media file.

:class:`TextResponse` objects support a new constructor argument, in
:class:`TextResponse` objects support a new ``__init__`` method argument, in
addition to the base :class:`Response` objects. The remaining functionality
is the same as for the :class:`Response` class and is not documented here.

Expand Down Expand Up @@ -755,7 +755,7 @@ TextResponse objects
A string with the encoding of this response. The encoding is resolved by
trying the following mechanisms, in order:

1. the encoding passed in the constructor ``encoding`` argument
1. the encoding passed in the ``__init__`` method ``encoding`` argument

2. the encoding declared in the Content-Type HTTP header. If this
encoding is not valid (ie. unknown), it is ignored and the next
2 changes: 1 addition & 1 deletion scrapy/exporters.py
Expand Up @@ -31,7 +31,7 @@ def __init__(self, **kwargs):
def _configure(self, options, dont_fail=False):
"""Configure the exporter by poping options from the ``options`` dict.
If dont_fail is set, it won't raise an exception on unexpected options
(useful for using with keyword arguments in subclasses constructors)
(useful for using with keyword arguments in subclasses' ``__init__`` methods)
"""
self.encoding = options.pop('encoding', None)
self.fields_to_export = options.pop('fields_to_export', None)
2 changes: 1 addition & 1 deletion scrapy/extensions/feedexport.py
Expand Up @@ -106,7 +106,7 @@ def __init__(self, uri, access_key=None, secret_key=None, acl=None):
warnings.warn(
"Initialising `scrapy.extensions.feedexport.S3FeedStorage` "
"without AWS keys is deprecated. Please supply credentials or "
"use the `from_crawler()` constructor.",
"use the `from_crawler()` ``__init__`` method.",
Member:

it should be kept as "constructor" here

Contributor Author:

when I thought I got them all... bf5c1a3

category=ScrapyDeprecationWarning,
stacklevel=2
)
2 changes: 1 addition & 1 deletion scrapy/utils/datatypes.py
Expand Up @@ -246,7 +246,7 @@ def pop(self, key, *args):
class MergeDict(object):
"""
A simple class for creating new "virtual" dictionaries that actually look
up values in more than one dictionary, passed in the constructor.
up values in more than one dictionary, passed in the ``__init__`` method.

If a key appears in more than one of the given dictionaries, only the
first occurrence will be used.
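A hedged usage sketch of that lookup order:

>>> from scrapy.utils.datatypes import MergeDict
>>> md = MergeDict({'a': 1}, {'a': 10, 'b': 2})
>>> md['a']
1
>>> md['b']
2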