Add Docker Support #72
Merged
Conversation
Adds a Dockerfile to set up and run the Kafka monitor. Note that some settings and configuration within it tie it to the docker-compose file, but they can be fairly easily overridden, and I may switch to using the raw `settings.py` file instead of `localsettings.py`. This setup uses standalone containers for Kafka, Zookeeper, and Redis, and each of the smaller Scrapy Cluster components will run in its own container as well. Everything runs under supervisord, so you can scale the number of processes both within a container and across containers. There is much more work to be done, for example reading environment variables to override values in the settings file, configuring Docker Hub, etc.
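The Dockerfile described above might look roughly like this sketch. The base image, file paths, and supervisord config name are assumptions for illustration, not the project's actual files:

```dockerfile
# Hypothetical Kafka monitor Dockerfile sketch; names are illustrative.
FROM python:2.7

WORKDIR /usr/src/app

# Install project dependencies plus supervisord for process management.
COPY requirements.txt .
RUN pip install -r requirements.txt && pip install supervisor

COPY . .

# supervisord manages N kafka-monitor processes inside the container,
# so the process count can be scaled both within and across containers.
CMD ["supervisord", "-n", "-c", "supervisord.conf"]
```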
This is an initial cut at making the Redis monitor Docker compatible. There are still many environment variables to specify, but this gets us pretty far along. In testing this I also found a dormant issue with ensuring the Zookeeper file path existed; it is easy enough to fix and will be merged in when this branch is decently complete.
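The Zookeeper file-path fix mentioned above could be sketched like this. The helper name and the idea of creating the parent directory before writing are my assumptions about the fix, not the project's actual code:

```python
import os


def ensure_local_path(file_path):
    """Hypothetical fix sketch: make sure the local directory that backs a
    Zookeeper-synced settings file exists before anything writes to it."""
    directory = os.path.dirname(file_path)
    if directory and not os.path.exists(directory):
        os.makedirs(directory)
    return file_path
```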
This commit adds the link spider to be compatible with Docker. This completes the initial cut of dockerizing the three core components; still to do:

- Define environment variable overrides for commonly altered configurations.
- Add documentation under Advanced Topics as well as the Quickstart to supply yet another alternative for working with and provisioning Scrapy Cluster.
- Update the changelog.
- Merge the branch back into dev.
- Set up Docker Hub builds, or at least a stable set of images on the hub for people to pull down.
- ???

Very excited to get this going!
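The environment variable overrides in the todo list above could be sketched like this in the settings file. The variable names and defaults are illustrative assumptions, not the project's actual settings:

```python
import os


def env_override(name, default):
    """Hypothetical helper: prefer an environment variable when set,
    otherwise fall back to the default from settings.py."""
    return os.getenv(name, default)


# Illustrative settings that a container could override at run time.
REDIS_HOST = env_override("REDIS_HOST", "redis")
KAFKA_HOSTS = env_override("KAFKA_HOSTS", "kafka:9092")
```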
This commit hopefully enables us to run the integration tests for our Docker images within Travis, letting us continuously test both the Ansible provisioning and the Docker images. Added a new script to run tests within the container, updated the Quickstart and advanced Docker documentation, modified the Dockerfiles to include the new test script, and fixed `.gitignore` to ignore more files.
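The in-container test script could be sketched along these lines. The suite names and the commented-out invocation are illustrative assumptions, not the project's real test files:

```shell
#!/bin/bash
# Hedged sketch of a per-container test runner; names are illustrative.
set -e

run_suite() {
    echo "Running $1"
    # python "$2" -v   # e.g. tests/tests_offline.py (illustrative path)
}

# Offline unit tests first, then online tests that need live
# Kafka/Redis/Zookeeper connections from the compose network.
run_suite "offline unit tests" "tests/tests_offline.py"
run_suite "online integration tests" "tests/tests_online.py"
echo "All tests passed"
```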
Trying to make life easier by separating things into shell scripts instead of a massive `script` section in the Travis config.
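Split that way, the Travis config might look roughly like this fragment; the script paths are assumptions, not the repository's actual layout:

```yaml
# Hypothetical .travis.yml fragment; script names are illustrative.
script:
  - ./travis/unit_tests.sh     # fast offline tests
  - ./travis/docker_tests.sh   # docker-compose backed integration tests
```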
Still need to figure out why Kafka doesn't start immediately.
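A common workaround for slow Kafka startup is to poll the broker's port before starting dependent services. A bash-only sketch (it relies on bash's `/dev/tcp` redirection; the host, port, and retry count are assumptions):

```shell
#!/bin/bash
# Hedged sketch: block until a TCP port accepts connections, then give up
# after a number of retries. Parameters are illustrative.
wait_for_port() {
    local host=$1 port=$2 retries=${3:-30}
    local i
    for ((i = 0; i < retries; i++)); do
        # Bash opens a TCP connection for /dev/tcp/<host>/<port> redirects.
        if (echo > "/dev/tcp/$host/$port") 2>/dev/null; then
            return 0
        fi
        sleep 1
    done
    return 1
}

# Example: wait_for_port kafka 9092 && supervisord -n
```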
Closes #48
This PR completes the initial cut of Docker support for Scrapy Cluster. The three core containers are all tagged on Docker Hub, and once this is merged in it will be easy to push new images for the latest dev release that passes all tests.
New documentation is in the Quickstart Guide as well as in the Advanced section. Added support for better branching logic within Travis to test everything properly. Docker Compose is used for orchestration, and a sample compose file shows how to spin up a Scrapy Cluster project via Docker.
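In the spirit of the sample compose file described above, a minimal sketch might look like this. The image names, tags, and service names are illustrative assumptions, not the project's published images:

```yaml
# Hedged docker-compose sketch; images and tags are illustrative.
version: '2'
services:
  redis:
    image: redis
  zookeeper:
    image: zookeeper
  kafka:
    image: wurstmeister/kafka        # illustrative Kafka image
    depends_on:
      - zookeeper
  kafka_monitor:
    image: scrapy-cluster/kafka-monitor:dev   # illustrative tag
    depends_on:
      - kafka
      - redis
```

`depends_on` only orders container startup; it does not wait for Kafka to be ready, which is why a wait-for-port style check inside the dependent containers can still be useful.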