This repository has been archived by the owner on Nov 14, 2022. It is now read-only.

🎉 First public commit
tiangolo committed Dec 1, 2017
1 parent cfdda75 commit 598f433
Showing 105 changed files with 12,894 additions and 0 deletions.
1 change: 1 addition & 0 deletions .gitignore
@@ -0,0 +1 @@
.vscode
57 changes: 57 additions & 0 deletions README.md
@@ -0,0 +1,57 @@
# Base Project

Generate a basic back end and front end stack.

## Features

* Full Docker integration (Docker based)
* Docker Swarm Mode deployment
* Docker Compose integration and optimization for local development
* Production ready Python web server using Nginx and uWSGI
* Python Flask back end with:
* Flask-apispec: Swagger live documentation generation
* Marshmallow: model and data serialization (convert model objects to JSON)
* Webargs: parse, validate and document inputs to the endpoint / route
* Secure password hashing by default
* JWT token authentication
* SQLAlchemy models (independent of Flask extensions, so they can be used with Celery workers directly)
* Basic starting models for users and groups (modify and remove as you need)
* Alembic migrations
* CORS (Cross Origin Resource Sharing)
* Celery worker that can import and use models and code from the rest of the back end selectively (you don't have to install the complete app in each worker)
* REST back end tests based on Pytest, integrated with Docker, so you can test the full API interaction independently of the database. As it runs in Docker, it can build a new data store from scratch each time (so you can use ElasticSearch, MongoDB, CouchDB, or whatever you want, and just test that the API works)
* Easy Python integration with Jupyter Kernels for remote or in-Docker development with extensions like Atom Hydrogen or Visual Studio Code Jupyter
* Angular front end with:
* Docker server based on Nginx
* Docker multi-stage building, so you don't need to save or commit compiled code
* Tests integrated into the Docker build, run with Chrome Headless
* PGAdmin for the PostgreSQL database; you can easily modify it to use PHPMyAdmin and MySQL instead
* Swagger-UI for live interactive documentation
* Flower for Celery jobs monitoring
* Load balancing between front end and back end with Traefik, so you can have both under the same domain, separated by path, but served by different containers
* Traefik integration, including automatic generation of Let's Encrypt HTTPS certificates
* GitLab CI (continuous integration), including front end and back end testing
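
The SQLAlchemy models mentioned above are plain declarative models, independent of any Flask extension, so the Celery workers can import them directly. As a rough, hypothetical sketch of such a model with secure password hashing (class, column, and helper names are illustrative, not the generated project's actual code):

```python
# Hypothetical sketch of a plain SQLAlchemy model (no Flask extensions),
# so it can also be imported by Celery workers. Names are illustrative.
from sqlalchemy import Boolean, Column, Integer, String
from sqlalchemy.ext.declarative import declarative_base
from passlib.context import CryptContext

Base = declarative_base()
pwd_context = CryptContext(schemes=["bcrypt"], deprecated="auto")


class User(Base):
    __tablename__ = "user"

    id = Column(Integer, primary_key=True)
    email = Column(String, unique=True, index=True, nullable=False)
    password_hash = Column(String, nullable=False)
    is_superuser = Column(Boolean, default=False)

    def set_password(self, password: str) -> None:
        # Store only the bcrypt hash, never the plain password
        self.password_hash = pwd_context.hash(password)

    def verify_password(self, password: str) -> bool:
        return pwd_context.verify(password, self.password_hash)
```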

## How to use it

Go to the directory where you want to create your project and run:

```bash
pip install cookiecutter
cookiecutter https://github.com/senseta-os/base-project
```

### Generate passwords

You will be asked to provide passwords and secret keys for several components. Open another terminal and run:

```bash
< /dev/urandom tr -dc A-Za-z0-9 | head -c${1:-32};echo;
```

Copy the output and use it as the password / secret key. Run the command again each time you need another secure key.


## License

This project is licensed under the terms of the MIT license.
40 changes: 40 additions & 0 deletions cookiecutter.json
@@ -0,0 +1,40 @@
{
"project_name": "Base Project",
"project_slug": "{{ cookiecutter.project_name|lower|replace(' ', '-') }}",
"domain_main": "{{cookiecutter.project_slug}}.com",
"domain_staging": "stag.{{cookiecutter.domain_main}}",
"domain_branch": "branch.{{cookiecutter.domain_main}}",
"domain_dev": "dev.{{cookiecutter.domain_main}}",

"docker_swarm_stack_name_main": "{{domain_main|replace('.', '-')}}",
"docker_swarm_stack_name_staging": "{{domain_staging|replace('.', '-')}}",
"docker_swarm_stack_name_branch": "{{domain_branch|replace('.', '-')}}",

"secret_key": "changethis",
"first_superuser": "admin@{{cookiecutter.domain_main}}",
"first_superuser_password": "changethis",


"postgres_password": "changethis",
"pgadmin_default_user": "admin@{{cookiecutter.domain_main}}",
"pgadmin_default_user_password": "changethis",

"traefik_constraint_tag": "{{cookiecutter.domain_main}}",
"traefik_constraint_tag_staging": "{{cookiecutter.domain_staging}}",
"traefik_constraint_tag_branch": "{{cookiecutter.domain_branch}}",
"traefik_public_network": "traefik-public",
"traefik_public_constraint_tag": "traefik-public",

"flower_auth": "root:changethis",

"sentry_dsn": "",

"docker_image_backend": "backend",
"docker_image_celeryworker": "celeryworker",
"docker_image_frontend": "frontend",

"_copy_without_render": [
"frontend/src/**/*.html",
"frontend/node_modules/*"
]
}
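
For example, with the default `project_name` of `Base Project`, the derived defaults render as `project_slug: base-project`, `domain_main: base-project.com`, `domain_staging: stag.base-project.com`, and `docker_swarm_stack_name_main: base-project-com`; every `changethis` value should be replaced with a generated secret as described in the README above.
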
1 change: 1 addition & 0 deletions {{cookiecutter.project_slug}}/.gitignore
@@ -0,0 +1 @@
.vscode
95 changes: 95 additions & 0 deletions {{cookiecutter.project_slug}}/.gitlab-ci.yml
@@ -0,0 +1,95 @@
image: docker:latest

before_script:
- apk add --no-cache py-pip
- pip install docker-compose
- docker login -u gitlab-ci-token -p $CI_JOB_TOKEN $CI_REGISTRY

stages:
- test
- build
- deploy

rest-tests:
stage: test
script:
- docker-compose -f docker-compose.test.yml build
- docker-compose -f docker-compose.test.yml up -d
- docker-compose -f docker-compose.test.yml exec -T backend-rest-tests pytest
- docker-compose -f docker-compose.test.yml down -v
tags:
- build
- test

build-branch:
stage: build
script:
- docker-compose -f docker-compose.branch.build.yml build
- docker-compose -f docker-compose.branch.build.yml push
except:
- master
- production
- tags
tags:
- build
- test

build-stag:
stage: build
script:
- docker-compose -f docker-compose.stag.build.yml build
- docker-compose -f docker-compose.stag.build.yml push
only:
- master
tags:
- build
- test

build-prod:
stage: build
script:
- docker-compose -f docker-compose.prod.build.yml build
- docker-compose -f docker-compose.prod.build.yml push
only:
- production
tags:
- build
- test

deploy-branch:
stage: deploy
script: docker stack deploy -c docker-compose.branch.yml --with-registry-auth {{cookiecutter.docker_swarm_stack_name_branch}}
environment:
name: staging
url: https://{{cookiecutter.domain_branch}}
except:
- master
- production
- tags
tags:
- swarm
- branch

deploy-stag:
stage: deploy
script: docker stack deploy -c docker-compose.stag.yml --with-registry-auth {{cookiecutter.docker_swarm_stack_name_staging}}
environment:
name: staging
url: https://{{cookiecutter.domain_staging}}
only:
- master
tags:
- swarm
- stag

deploy-prod:
stage: deploy
script: docker stack deploy -c docker-compose.prod.yml --with-registry-auth {{cookiecutter.docker_swarm_stack_name_main}}
environment:
name: production
url: https://{{cookiecutter.domain_main}}
only:
- production
tags:
- swarm
- prod
114 changes: 114 additions & 0 deletions {{cookiecutter.project_slug}}/README.md
@@ -0,0 +1,114 @@
# {{cookiecutter.project_name}}

## Back end local development

* Update your local `hosts` file so that `{{cookiecutter.domain_dev}}` points to `127.0.0.1` (your `localhost`). The `docker-compose.override.yml` file sets the environment variable `SERVER_NAME` to that host; without the `hosts` entry you would receive 404 errors.

* Modify your `hosts` file (probably `/etc/hosts`) to include:

```
0.0.0.0 {{cookiecutter.domain_dev}}
```

...that will make your browser talk to your locally running server.

* Start the stack with Docker Compose:

```bash
docker-compose up -d
```

* Start an interactive session in the server container that is running an infinite loop doing nothing:

```bash
docker-compose exec server bash
```

* Run the local debugging Flask server; the full command is stored in the `RUN` environment variable:

```bash
$RUN
```

* Your OS will handle redirecting `{{cookiecutter.domain_dev}}` to your local stack. So, in your browser, go to: http://{{cookiecutter.domain_dev}}.

Add and modify SQLAlchemy models in `./backend/app/app/models/`, Marshmallow schemas in `./backend/app/app/schemas`, and API endpoints in `./backend/app/app/api/`.
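
As a rough, hypothetical sketch of how an endpoint can tie those pieces together with flask-apispec and Marshmallow (the blueprint, schema, and placeholder data below are illustrative, not the generated project's actual code):

```python
# Hypothetical sketch; the real project layout and names may differ.
from flask import Blueprint
from flask_apispec import marshal_with
from marshmallow import Schema, fields

api = Blueprint("api", __name__)

# Placeholder instead of a real database query (illustrative only)
FAKE_DB = {1: {"id": 1, "email": "admin@example.com"}}


class UserSchema(Schema):
    id = fields.Int(dump_only=True)
    email = fields.Email(required=True)


@api.route("/api/v1/users/<int:user_id>", methods=["GET"])
@marshal_with(UserSchema())
def get_user(user_id):
    # In the real project this would query the SQLAlchemy model;
    # Marshmallow then serializes the returned object to JSON.
    return FAKE_DB.get(user_id)
```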

Add and modify Celery worker tasks in `./backend/app/app/worker.py`.
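
A task in that module can look roughly like this sketch (the broker URL and app name are assumptions for illustration):

```python
# Hypothetical sketch of a task in ./backend/app/app/worker.py;
# the broker URL and names are illustrative.
from celery import Celery

celery_app = Celery("worker", broker="amqp://guest@queue//")


@celery_app.task(name="app.worker.test_celery")
def test_celery(word: str) -> str:
    return f"test task returned {word}"
```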

If you need to install any additional package for the worker, add it to the file `./backend/app/Dockerfile-celery-worker`.


### Back end tests

To test the back end run:

```bash
# Build the testing stack
docker-compose -f docker-compose.test.yml build
# Start the testing stack
docker-compose -f docker-compose.test.yml up -d
# Run the REST tests
docker-compose -f docker-compose.test.yml exec -T backend-rest-tests pytest
# Stop and eliminate the testing stack
docker-compose -f docker-compose.test.yml down -v
```

The tests run with Pytest; modify and add tests in `./backend/app/app/rest_tests/`.
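
Since the REST tests talk to the API over HTTP with `requests`, a test can be as simple as this sketch (the base URL, environment variable, and endpoint path are assumptions):

```python
# Hypothetical sketch of a test in ./backend/app/app/rest_tests/;
# the base URL, environment variable, and endpoint are illustrative.
import os

import requests

BASE_URL = os.getenv("API_BASE_URL", "http://backend")


def test_api_responds():
    response = requests.get(f"{BASE_URL}/api/v1/users/")
    # The endpoint may require authentication; the point is that the API answers.
    assert response.status_code in (200, 401, 403)
```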

If you need to install any additional package for the REST tests, add it to the file `./backend/app/Dockerfile-rest-tests`.

If you use GitLab CI, the tests will run automatically.


### Migrations

* Start an interactive session in the server container that is running an infinite loop doing nothing:

```bash
docker-compose exec server bash
```

* After changing a model (for example, adding a column), inside the container, create a revision, e.g.:

```bash
alembic revision -m "Add column last_name to User model"
```

* Commit the files generated in the `alembic` directory to the git repository.

* After creating the revision, run the migration in the database (this is what will actually change the database):

```bash
alembic upgrade head
```
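
For reference, the revision file for the example above ends up roughly like this sketch (Alembic fills in the real revision identifiers; the operations are written by hand or autogenerated):

```python
# Hypothetical sketch of an Alembic revision; identifiers and names are illustrative.
import sqlalchemy as sa
from alembic import op

revision = "abc123def456"       # filled in by Alembic
down_revision = "0123456789ab"  # filled in by Alembic


def upgrade():
    op.add_column("user", sa.Column("last_name", sa.String(), nullable=True))


def downgrade():
    op.drop_column("user", "last_name")
```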

## Front end development

* Enter the `frontend` directory, install the NPM packages, and start it with the `npm` scripts:

```bash
cd frontend
npm install
npm run start
```

Check the file `package.json` to see other available options.

## Deployment

To deploy the stack to a Docker Swarm, run, e.g.:

```bash
docker stack deploy -c docker-compose.prod.yml --with-registry-auth {{cookiecutter.docker_swarm_stack_name_main}}
```

Use the corresponding Docker Compose file for the environment you are deploying to.

If you use GitLab CI, it will deploy the stack automatically.

GitLab CI is configured assuming 3 environments following GitLab flow:

* `prod` (production) from the `production` branch.
* `stag` (staging) from the `master` branch.
* `branch`, from any other branch (a feature in development).
1 change: 1 addition & 0 deletions {{cookiecutter.project_slug}}/backend/.gitignore
@@ -0,0 +1 @@
__pycache__
21 changes: 21 additions & 0 deletions {{cookiecutter.project_slug}}/backend/Dockerfile
@@ -0,0 +1,21 @@
FROM tiangolo/uwsgi-nginx-flask:python3.6

RUN pip install --upgrade pip
RUN pip install flask flask-cors psycopg2 raven[flask] celery==4.1.0 passlib[bcrypt] SQLAlchemy==1.1.13 flask-apispec flask-jwt-extended alembic

# For development, Jupyter remote kernel, Hydrogen
# To use it inside the container, run:
# jupyter notebook --ip=0.0.0.0 --allow-root
ARG env=prod
RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
EXPOSE 8888

COPY ./app /app
WORKDIR /app/

ENV STATIC_PATH /app/app/static
ENV STATIC_INDEX 1

ENV PYTHONPATH=/app

EXPOSE 80
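
Note that Jupyter is only installed when the image is built with the `env` build argument set to `dev`, for example by passing `--build-arg env=dev` to `docker build` (or the equivalent `args` entry in a Compose file); production builds keep the default `env=prod` and skip it.
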
19 changes: 19 additions & 0 deletions {{cookiecutter.project_slug}}/backend/Dockerfile-celery-worker
@@ -0,0 +1,19 @@
FROM python:3.6

RUN pip install psycopg2 raven pyyaml celery==4.1.0 SQLAlchemy==1.1.13 passlib[bcrypt]

# For development, Jupyter remote kernel, Hydrogen
# To use it inside the container, run:
# jupyter notebook --ip=0.0.0.0 --allow-root
ARG env=prod
RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
EXPOSE 8888

ENV C_FORCE_ROOT=1

COPY ./app /app
WORKDIR /app

ENV PYTHONPATH=/app

CMD celery worker -A app.worker -l info -Q main-queue -c 1
17 changes: 17 additions & 0 deletions {{cookiecutter.project_slug}}/backend/Dockerfile-rest-tests
@@ -0,0 +1,17 @@
FROM python:3.6

RUN pip install requests

# For development, Jupyter remote kernel, Hydrogen
# To use it inside the container, run:
# jupyter notebook --ip=0.0.0.0 --allow-root
RUN pip install jupyter
EXPOSE 8888

RUN pip install faker==0.8.4 pytest

COPY ./app /app

ENV PYTHONPATH=/app

WORKDIR /app/app/rest_tests