This repository has been archived by the owner on Feb 28, 2018. It is now read-only.

WIP: Add Dockerfile #23

Closed
wants to merge 7 commits into from

Conversation

pedrommone
Contributor

@pedrommone pedrommone commented Oct 4, 2016

This is a partial PR of #12.

Dockerfile

This is just a simple image built on the official Python 3.5 base, with all dependencies installed.

docker-compose.yml

I've split the whole application into four containers, one of which is the official postgres 9.6 image. This is just a base to work from.

.travis.yml

I've added some cool new stuff to the CI pipeline. Now we build the Docker images too, yay! I hope to improve the pipeline even further toward full CI/CD.

@coveralls

Coverage Status

Coverage remained the same at 69.454% when pulling d3a43fa on pedrommone:add-docker-image into 76b6271 on datasciencebr:master.

@pedrommone
Contributor Author

@cuducos I need your help here! :)
In order to generate secure tokens for Docker Hub, I need you to sign up the org, generate the credentials for me, and then encrypt them with this Travis feature

@coveralls

Coverage Status

Coverage remained the same at 69.454% when pulling 879c4f9 on pedrommone:add-docker-image into 76b6271 on datasciencebr:master.

@cuducos
Collaborator

cuducos commented Oct 5, 2016

@pedrommone I added you as a contributor to hub.docker.com/u/datasciencebr

Were you able to generate the keys?

Collaborator

@cuducos cuducos left a comment

Many many thanks, @pedrommone!

I left some inline comments. Please don't take it the wrong way, I'm not a dick (I guess).

I was just trying to learn from your files. I'm a newbie to Docker, and most of the lines were pretty straightforward, so I only asked about the bits I had doubts about, just as a matter of curiosity ; )

Finally, one general question: what's the advantage of having separate containers for migration and seeding? Why not just run python manage.py migrate (at every deploy) and python manage.py loaddatasets etc. (on provision) within the main Jarbas container?


env:
global:
- secure: "" # DOCKER_EMAIL
Collaborator

How does that secure field work? Should I set these variables in Travis CI? Drop a line at telegram.me/cuducos so we can sort that out.

Contributor Author

I just discovered this feature too: Travis has a way to encrypt sensitive data! You can learn more here.
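For reference, a sketch of how the feature is used (the value below is made up): running something like `travis encrypt DOCKER_USER=jarbasbot --add env.global` with the Travis CLI gem appends an encrypted entry that only Travis CI can decrypt, using the repository's private key:

```yaml
# .travis.yml fragment produced by `travis encrypt VAR=value --add env.global`
# ("jarbasbot" and the blob are hypothetical placeholders)
env:
  global:
    - secure: "kBGwczn1...base64-encrypted-blob..."  # DOCKER_USER
```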

Collaborator

That's awesome, though I'm not sure this encryption is really safe hahaha… I still tend to prefer leaving credentials in the Settings panel ; )

Contributor Author

Of course, that's a lot better. We can change that.

- secure: "" # DOCKER_USER
- secure: "" # DOCKER_PASS
- REPOSITORY: "datasciencebr/jarbas"
- COMMIT=${TRAVIS_COMMIT::8}
Collaborator

Just out of curiosity: what does this line do?

Contributor Author

Just to keep it DRY (avoid repeating the value everywhere).

Collaborator

But what is this exactly? The name/hash of the commit?

Contributor Author

Yep, it just puts some useful information (the short commit hash) into the environment.
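Concretely, `${TRAVIS_COMMIT::8}` is plain Bash substring expansion: keep the first eight characters of the full commit SHA. A minimal sketch with a made-up SHA:

```shell
# TRAVIS_COMMIT holds the full 40-character commit SHA
# (this value is made up for illustration).
TRAVIS_COMMIT=0123456789abcdef0123456789abcdef01234567

# ${VAR::8} keeps characters 0 through 7 of the value.
COMMIT=${TRAVIS_COMMIT::8}

echo "$COMMIT"  # prints 01234567
```

The short hash then works as a convenient Docker image tag, e.g. `$REPOSITORY:$COMMIT`.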

@@ -28,3 +39,9 @@ script:

after_success:
- coveralls
- docker login -e $DOCKER_EMAIL -u $DOCKER_USER -p $DOCKER_PASS
Collaborator

Sorry, n00b question again: what's the point of building and pushing the image (?) after the tests?

Contributor Author

Actually this is something I want to improve: build an image every time and run the tests on top of it.

Collaborator

Got it. So maybe it should be in the before_script section, so that when you get to the script (tests) section, the Docker image is already built…

If these commands take a while to exit, take a look at travis_wait.
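The suggested order could look something like this as a .travis.yml fragment (the test command inside the container is an assumption, not from this PR):

```yaml
before_script:
  - docker build -t $REPOSITORY:$COMMIT .   # image is ready before the tests run

script:
  - docker run $REPOSITORY:$COMMIT python manage.py test  # hypothetical test invocation

after_success:
  - coveralls
  - docker login -e $DOCKER_EMAIL -u $DOCKER_USER -p $DOCKER_PASS
  - docker push $REPOSITORY:$COMMIT         # push only the image the tests ran against
```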

Contributor Author

It's just a login procedure; we need to call it before pushing images. Anyway, I'll move it, since we'll need that in the future.

@@ -0,0 +1,17 @@
FROM python:3.5

MAINTAINER Pedro Maia <pedro@pedromm.com>
Collaborator

Yay! Thanks for that ; )

Contributor Author

You're welcome.

postgres:
image: postgres:9.6
container_name: jarbas-postgres
environment:
Collaborator

What is a good practice when it comes to sensitive info like this? Does Docker read it from the server's env vars or something?

Contributor Author

Actually we need to improve that too; I'll spend more time on it when we get an infrastructure to work on.

populate:
image: jarbas
container_name: jarbas-populate
command: python manage.py loaddatasets
Collaborator

Actually there are two extra commands to populate the database, making a total of three:

  • python manage.py loaddatasets
  • python manage.py loadsuppliers
  • python manage.py ceapdatasets

Are any of these syntaxes valid?

  command:
    - python manage.py loaddatasets
    - python manage.py loadsuppliers
    - python manage.py ceapdatasets

Or (worse, since if one command fails the following ones will not be executed):

  command: python manage.py loaddatasets && python manage.py loadsuppliers && python manage.py ceapdatasets

Contributor Author

Actually, chaining them with && doesn't work (something related to Django). I'll improve it.
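A common workaround (a sketch, not tested against this repo) is to hand the chain to a shell explicitly; a plain `command:` string is passed to the image's entrypoint rather than interpreted by a shell, which is often why `&&` appears not to work:

```yaml
populate:
  image: jarbas
  container_name: jarbas-populate
  # sh -c runs the whole string in a shell, so && chaining works;
  # the container exits non-zero as soon as one command fails.
  command: >
    sh -c "python manage.py loaddatasets
    && python manage.py loadsuppliers
    && python manage.py ceapdatasets"
```

The YAML folded scalar (`>`) joins the lines with spaces, so the shell sees a single command line.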

@pedrommone
Contributor Author

@cuducos I saw your request, I'll try to code more today.

@pedrommone
Contributor Author

Just some feedback. I've faced some issues:

  • In order to make it functional, I need to build a container with both Node.js and Python, and this is not cool.
  • For full CI integration, we'll need to move to another CI SaaS that provides pipelines.

How to run what I've done?

Clone this branch, run docker build -t jarbas . and then docker-compose up. It will provision all the containers and fail at the Node.js requirement step.

@cuducos
Collaborator

cuducos commented Oct 11, 2016

In order to make it functional, I need to build a container with both Node.js and Python, and this is not cool.

Actually this is a good thing. We can create another container to run Node.js, and this is the first step to address Issue #18 (splitting front end and back end, cc @leomeloxp).

What I can do is edit the repo in a separate branch and send a PR to this branch. I'd make the Django and Elm commands independent of each other, i.e. the elm-make command would be run outside Django (e.g. npm run build instead of python manage.py assets build).

The only link between both worlds will be that the output file gets saved directly into staticfiles/ (where the Django HTML template looks for app.js). Surely later we can use nginx and have Node.js creating the HTML too, but I'd say this is mostly a matter of Issue #18 (we're going to address it later anyway).

Does that help?

For full CI integration, we'll need to move to another CI SaaS that provides pipelines.

I have no idea about this limitation, but I don't mind leaving Travis CI either. Any ideas? cc @luiz-simples @vitallan @gwmoura @ayr-ton

Clone this branch, run docker build -t jarbas . and then docker-compose up. It will provision all the containers and fail at the Node.js requirement step.

Does the separation I mentioned in my first point help? IMHO it's worth having a working local version before depending on a CI tool. So if we fix the Node.js thing, the only pending issue will be the CI, am I right?

@pedrommone
Contributor Author

Does that help?

Yep, it helps a lot.

Does the separation I mentioned in my first point help? IMHO it's worth having a working local version before depending on a CI tool. So if we fix the Node.js thing, the only pending issue will be the CI, am I right?

Yep, we'll have some work to be done here. :)

@cuducos
Collaborator

cuducos commented Oct 11, 2016

Done in extract-nodejs. No Django command depends on NodeJS anymore.

  • app.js is generated by npm run assets (not by python manage.py assets build)
  • ceapdatasets.html loads the CSS from a CDN (not from node_modules/)

Is there anything I might have left behind?

I felt that if I isolated it more I'd be splitting the repo, and I'd like to do that with Docker linking the front end and the back end. Otherwise I don't know how to locally have a static front end (generated by Node.js) at the root (e.g. http://localhost:8000/) and a Django app at /api/ (e.g. http://localhost:8000/api/) serving it.
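One way that layout is commonly solved (a sketch with assumed paths and an assumed upstream container name, not something from this PR) is an nginx server that serves the static build at the root and proxies /api/ to the Django container:

```nginx
server {
    listen 8000;

    # static front end built by Node.js (path is an assumption)
    location / {
        root /var/www/jarbas/staticfiles;
        try_files $uri /index.html;
    }

    # Django app behind /api/; "jarbas-django" is a hypothetical
    # compose service name resolved via Docker's internal DNS
    location /api/ {
        proxy_pass http://jarbas-django:8000/;
    }
}
```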

@gwmoura

gwmoura commented Oct 17, 2016

For full CI integration, we'll need to move to another CI SaaS that provides pipelines.

@cuducos we can create a simple pipeline on Travis CI like: test > build > push to Docker Hub > deploy the container on the server. One script for each step; the big problem is the deploy to the server.
Today I am using tsuru.io, but since we only need to deploy one app, I think it's better to study Docker Swarm (a Docker orchestration tool) or use some container deployment service.
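The one-script-per-step idea could be sketched in .travis.yml like this (the ci/ script names are hypothetical, and deploy.sh is the hard part mentioned above):

```yaml
script:
  - ./ci/test.sh     # run the test suite
after_success:
  - ./ci/build.sh    # e.g. docker build -t $REPOSITORY:$COMMIT .
  - ./ci/push.sh     # docker login + docker push to Docker Hub
  - ./ci/deploy.sh   # e.g. trigger a tsuru or Swarm deploy on the server
```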

@gwmoura

gwmoura commented Oct 17, 2016

a static front end (generated by Node.js) at root (e.g. http://localhost:8000/) and a Django app at /api/ (e.g. http://localhost:8000/api/) serving it

@cuducos about this, we can create two folders in the project and use docker-compose to manage these environments:

web/
   |__ Dockerfile
   |__ front-end app files
api/
   |__ Dockerfile
   |__ back-end app files
docker-compose.yml

# docker-compose.yml content, something like this
version: '2'
services:
  db:
    image: postgres  # `image`, not `images`; the official image is named `postgres`
  web:
    build: ./web
    ...
  api:
    build: ./api
    ...

@pedrommone
Contributor Author

@gwmoura the main problem with this approach is Travis itself: it's made for continuous integration, not continuous delivery. We can't build a pipeline with Travis.

I'm sorry for holding this PR for so long; I've been busy with work and college (I need to finish my thesis).

@cuducos
Collaborator

cuducos commented Nov 26, 2016

Closing this PR since we got a working docker-compose.yml on the master branch 🎉

@cuducos cuducos closed this Nov 26, 2016