FileFeedStorage creates empty file when no items are scraped #872
Comments
@gbirke I have run into the same problem: there are so many empty files when saving data to files, and your proposal sounds like a good idea to me.
michael-yin added a commit to michael-yin/scrapy that referenced this issue on Oct 8, 2014
I'm having a similar issue, where the …
gbirke added a commit to gbirke/scrapy that referenced this issue on Jan 10, 2015
FileFeedStorage left empty files when no items were scraped. This patch adds a cleanup method to the IFeedStorage interface that will be called by FeedExporter when no items were scraped. Fixes scrapy#872
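For illustration, here is a minimal, self-contained sketch of the storage-side idea described in that commit message. It is not the actual patch and not Scrapy's real `FileFeedStorage`; the class is a simplified stand-in, and the cleanup logic (deleting the file if it is still empty) is an assumption about one way the method could behave.

```python
import os


class FileFeedStorage:
    """Simplified stand-in for a file-based feed storage backend."""

    def __init__(self, path):
        self.path = path

    def open(self, spider):
        # The output file is created here, during open_spider, even if
        # no item is ever exported -- the behaviour this issue is about.
        return open(self.path, "ab")

    def store(self, file):
        # Normal shutdown path: just close the file.
        file.close()

    def cleanup(self):
        # Proposed addition: if the file was created but nothing was
        # written to it, remove it again.
        if os.path.exists(self.path) and os.path.getsize(self.path) == 0:
            os.remove(self.path)
```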
redapple added a commit to redapple/scrapy that referenced this issue on Sep 16, 2016
When no items are scraped, the corresponding file is created nonetheless, because it is created by the `storage.open` call in `FeedExporter.open_spider`. This behavior ignores the `FEED_STORE_EMPTY` setting when using file export.

My proposal would be to add a `cleanup` method to the `IFeedStorage` interface. `FeedExporter.close_spider` could then call that method before returning when `slot.itemcount` is zero and `self.store_empty` is `False`. `cleanup` could also be called internally from the `store` methods of the `IFeedStorage` implementations.
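A minimal sketch of how the exporter side of this proposal could look. The names `FeedExporter`, `slot.itemcount`, `store_empty` and `cleanup` come from the description above, but the class below is a simplified stand-in rather than Scrapy's real extension, and it pairs with the hypothetical storage sketch shown earlier.

```python
from types import SimpleNamespace


class FeedExporter:
    """Simplified stand-in for the feed export extension."""

    def __init__(self, storage, store_empty=False):
        self.storage = storage
        self.store_empty = store_empty
        self.slot = None

    def open_spider(self, spider):
        # storage.open() creates the output file up front, which is why
        # an empty file is left behind when nothing gets scraped.
        file = self.storage.open(spider)
        self.slot = SimpleNamespace(file=file, itemcount=0)

    def item_scraped(self, item, spider):
        self.slot.itemcount += 1
        # ... serialize the item and write it to self.slot.file ...

    def close_spider(self, spider):
        if not self.slot.itemcount and not self.store_empty:
            # The proposed change: ask the storage to clean up the
            # (still empty) file instead of keeping it.
            self.slot.file.close()
            self.storage.cleanup()
            return
        self.storage.store(self.slot.file)
```

With the two sketches combined, closing the spider after zero scraped items would close and delete the output file instead of leaving an empty one behind, while the normal `store` path is unchanged when items were exported or `store_empty` is enabled.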