[Solved] JOBDIR=None for when Scheduler initializes disk queue even if JOBDIR is empty string (#6124)

Co-authored-by: John Doe <johndoe@email.com>
nihilisticneuralnet and Faten848 committed Oct 31, 2023
1 parent 7e6da37 commit 04024f1
Showing 4 changed files with 8 additions and 4 deletions.
2 changes: 1 addition & 1 deletion docs/topics/settings.rst
@@ -1120,7 +1120,7 @@ modify this setting in your project, modify :setting:`ITEM_PIPELINES` instead.
 JOBDIR
 ------
 
-Default: ``''``
+Default: ``None``
 
 A string indicating the directory for storing the state of a crawl when
 :ref:`pausing and resuming crawls <topics-jobs>`.
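
For context, JOBDIR is the setting that enables crawl-state persistence, and it is normally supplied per crawl. A minimal sketch of the documented usage (the spider name and directory are illustrative):

    # settings.py -- persist crawl state under this directory
    JOBDIR = "crawls/somespider-1"

    # or equivalently, per run on the command line:
    #   scrapy crawl somespider -s JOBDIR=crawls/somespider-1

With this patch, JOBDIR = None (the new default) and JOBDIR = "" both mean "no persistence" rather than accidentally enabling a disk queue.
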
2 changes: 1 addition & 1 deletion scrapy/core/scheduler.py
@@ -352,7 +352,7 @@ def _dq(self):
 
     def _dqdir(self, jobdir: Optional[str]) -> Optional[str]:
         """Return a folder name to keep disk queue state at"""
-        if jobdir is not None:
+        if jobdir:
             dqdir = Path(jobdir, "requests.queue")
             if not dqdir.exists():
                 dqdir.mkdir(parents=True)
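
Why the truthiness check matters: under the old "if jobdir is not None" test an empty string passed through, and Path("", "requests.queue") collapses to the relative path "requests.queue", so the scheduler created a disk queue in the current working directory. A standalone sketch of the before/after logic (directory creation omitted; the function names here are illustrative, not Scrapy API):

    from pathlib import Path
    from typing import Optional

    def dqdir_old(jobdir: Optional[str]) -> Optional[str]:
        # Pre-patch: only None is rejected, so "" slips through.
        if jobdir is not None:
            return str(Path(jobdir, "requests.queue"))
        return None

    def dqdir_new(jobdir: Optional[str]) -> Optional[str]:
        # Post-patch: the truthiness check rejects both None and "".
        if jobdir:
            return str(Path(jobdir, "requests.queue"))
        return None

    print(dqdir_old(""))  # "requests.queue" -> stray queue dir in cwd
    print(dqdir_new(""))  # None -> disk queue disabled
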
2 changes: 2 additions & 0 deletions scrapy/settings/default_settings.py
@@ -206,6 +206,8 @@
 ITEM_PIPELINES = {}
 ITEM_PIPELINES_BASE = {}
 
+JOBDIR = None
+
 LOG_ENABLED = True
 LOG_ENCODING = "utf-8"
 LOG_FORMATTER = "scrapy.logformatter.LogFormatter"
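
Declaring JOBDIR in default_settings.py makes the documented default of None explicit instead of relying on a missing key. Assuming the stock Settings constructor, which populates itself from this module, the default is observable directly:

    from scrapy.settings import Settings

    settings = Settings()  # loads scrapy/settings/default_settings.py
    assert settings["JOBDIR"] is None
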
6 changes: 4 additions & 2 deletions scrapy/utils/job.py
@@ -5,7 +5,9 @@
 
 
 def job_dir(settings: BaseSettings) -> Optional[str]:
-    path: str = settings["JOBDIR"]
-    if path and not Path(path).exists():
+    path: Optional[str] = settings["JOBDIR"]
+    if not path:
+        return None
+    if not Path(path).exists():
         Path(path).mkdir(parents=True)
     return path
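
A quick sketch of the patched job_dir() behavior, assuming the Settings(values=...) constructor; "crawls/demo" is an illustrative path that the third call will actually create:

    from pathlib import Path
    from scrapy.settings import Settings
    from scrapy.utils.job import job_dir

    # Falsy JOBDIR values now short-circuit to None: no directory is created.
    assert job_dir(Settings(values={"JOBDIR": ""})) is None
    assert job_dir(Settings(values={"JOBDIR": None})) is None

    # A real path is created on demand and returned unchanged.
    assert job_dir(Settings(values={"JOBDIR": "crawls/demo"})) == "crawls/demo"
    assert Path("crawls/demo").is_dir()
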
