Disable closespider_itemcount by default
This setting is mainly used for demo purposes, and it seems to be
highly confusing for a wide audience. We decided to disable it by
default, so Crawly will not stop crawls unexpectedly.
oltarasenko committed Mar 27, 2020
1 parent 1e30858 commit 953b711
Showing 4 changed files with 7 additions and 6 deletions.
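To illustrate the effect of this change on end users, here is a minimal configuration sketch (not part of the commit) showing how a project can opt back into the old item-count limit. The setting names come from the diffs below; the values are examples only.

# Illustrative config/config.exs snippet, not part of this commit.
# After this change, omitting closespider_itemcount means the spider is
# never stopped based on the number of scraped items. To restore a limit,
# set the option explicitly:
import Config

config :crawly,
  closespider_timeout: 10,
  concurrent_requests_per_domain: 8,
  # Example value; any positive integer re-enables the item-count stop condition.
  closespider_itemcount: 1000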
1 change: 0 additions & 1 deletion README.md
@@ -69,7 +69,6 @@ historical archival.
 config :crawly,
   closespider_timeout: 10,
   concurrent_requests_per_domain: 8,
-  closespider_itemcount: 1000,
   middlewares: [
     Crawly.Middlewares.DomainFilter,
     Crawly.Middlewares.UniqueRequest,
6 changes: 3 additions & 3 deletions documentation/configuration.md
@@ -93,11 +93,11 @@ config :crawly,
 
 Defines a list of middlewares responsible for pre-processing requests. If any of the requests from the `Crawly.Spider` is not passing the middleware, it's dropped.
 
-### closespider_itemcount :: pos_integer()
+### closespider_itemcount :: pos_integer() | :disabled
 
-default: 5000
+default: :disabled
 
-An integer which specifies a number of items. If the spider scrapes more than that amount and those items are passed by the item pipeline, the spider will be closed. If set to nil the spider will not be stopped.
+An integer which specifies a number of items. If the spider scrapes more than that amount and those items are passed by the item pipeline, the spider will be closed. If set to :disabled the spider will not be stopped.
 
 ### closespider_timeout :: pos_integer()
 
1 change: 0 additions & 1 deletion documentation/quickstart.md
@@ -58,7 +58,6 @@ Goals:
 config :crawly,
   closespider_timeout: 10,
   concurrent_requests_per_domain: 8,
-  closespider_itemcount: 1000,
   middlewares: [
     Crawly.Middlewares.DomainFilter,
     {Crawly.Middlewares.RequestOptions, [timeout: 30_000]},
5 changes: 4 additions & 1 deletion lib/crawly/manager.ex
@@ -85,7 +85,10 @@ defmodule Crawly.Manager do
     delta = items_count - state.prev_scraped_cnt
     Logger.info("Current crawl speed is: #{delta} items/min")
 
-    case Application.get_env(:crawly, :closespider_itemcount, 1000) do
+    case Application.get_env(:crawly, :closespider_itemcount, :disabled) do
+      :disabled ->
+        :ignored
+
       cnt when cnt < items_count ->
         Logger.info(
           "Stopping #{inspect(state.name)}, closespider_itemcount achieved"
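For context, the branch added above can be read as follows. The sketch below is a hypothetical, self-contained module (the module and function names are illustrative, not the actual Crawly.Manager internals); only the case expression mirrors the diff.

# Hypothetical sketch of the new item-count check; module and function
# names are illustrative only.
defmodule ItemcountCheckSketch do
  require Logger

  # Returns :stop when a configured limit has been exceeded, :ignored otherwise.
  def check(spider_name, items_count) do
    case Application.get_env(:crawly, :closespider_itemcount, :disabled) do
      # New default: no limit configured, so the crawl keeps running.
      :disabled ->
        :ignored

      # A configured pos_integer() limit that has already been exceeded.
      cnt when cnt < items_count ->
        Logger.info("Stopping #{inspect(spider_name)}, closespider_itemcount achieved")
        :stop

      # A configured limit that has not been reached yet.
      _ ->
        :ignored
    end
  end
end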
