Merge pull request #198 from Anderson-Liu/dev
remove unused code.
Madison Bahmer committed Aug 24, 2018
2 parents 2c2075a + 0d536ef commit effd4df
Showing 2 changed files with 2 additions and 1 deletion.
1 change: 0 additions & 1 deletion crawler/crawling/pipelines.py
@@ -149,7 +149,6 @@ def _clean_item(self, item):
         del item_copy['status_msg']
         item_copy['action'] = 'ack'
         item_copy['logger'] = self.logger.name
-        item_copy
 
         return item_copy

2 changes: 2 additions & 0 deletions docs/topics/introduction/quickstart.rst
@@ -139,6 +139,8 @@ This will pull the latest stable images from Docker hub and build your scraping
 
 At time of writing, there is no Docker container to interface and run all of the tests within your compose-based cluster. Instead, if you wish to run the unit and integration tests please see the following steps.
 
+.. note:: If you want to switch to Python 3, just modify ``docker-compose.yml`` to change the kafka_monitor, redis_monitor, crawler, and rest images to a Python 3 tag such as ``kafka-monitor-dev-py3``. You can find all available tags in `DockerHub Tags <https://hub.docker.com/r/istresearch/scrapy-cluster/tags/>`_.
+
 4) To run the integration tests, get into the bash shell on any of the containers.
 
 Kafka monitor
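The note added above describes switching the cluster images over to their Python 3 tags. As a rough sketch of what that edit could look like, here is a hypothetical docker-compose.yml excerpt; the service names and every image tag other than kafka-monitor-dev-py3 are assumptions inferred from the note and the linked DockerHub repository, so verify them against your own compose file and the tag list.

    # Hypothetical docker-compose.yml excerpt after switching to the Python 3 tags.
    # Service and image names are assumptions; check your own compose file and the
    # DockerHub tag list referenced in the note for the exact values.
    kafka_monitor:
      image: istresearch/scrapy-cluster:kafka-monitor-dev-py3
    redis_monitor:
      image: istresearch/scrapy-cluster:redis-monitor-dev-py3
    crawler:
      image: istresearch/scrapy-cluster:crawler-dev-py3
    rest:
      image: istresearch/scrapy-cluster:rest-dev-py3

After changing the tags, running docker-compose pull and then docker-compose up -d recreates the containers from the new images.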

