Merge pull request #274 from sbrunner/test
Fix the master, upgrade...
sbrunner committed Jan 26, 2017
2 parents aa156cf + 902ef00 commit 142c4c4
Showing 41 changed files with 282 additions and 1,239 deletions.
16 changes: 6 additions & 10 deletions .travis.yml
@@ -26,19 +26,18 @@ before_install:
- chmod 644 ~/.bashrc
- echo "PATH=${PATH}" > ~/.bashrc

- sudo rm -f /etc/apt/sources.list.d/pgdg-source.list
- sudo rm -rf /usr/local
- sudo add-apt-repository --yes ppa:stephane-brunner/trusty-gis
- sudo apt-get update
- sudo apt-get install -y --force-yes libstdc++-4.8-dev libc6 libmapnik-dev apache2 libapache2-mod-fcgid cgi-mapserver deploy libdb-dev optipng postgresql-9.3-postgis-2.1
- sudo apt-get install -y --force-yes libstdc++-4.8-dev libc6 libmapnik-dev apache2 libapache2-mod-fcgid cgi-mapserver libdb-dev optipng docker-engine

- sudo service postgresql stop
- docker build --tag=test-db docker/test-db
- docker run --publish=5432:5432 --detach test-db

- sudo -u postgres tilecloud_chain/tests/create_test_data.sh
- sudo a2enmod fcgid
- sudo cp tilecloud_chain/tests/apache.conf /etc/apache2/sites-enabled/apache.conf
- sudo apache2ctl graceful

- sudo tilecloud_chain/tests/create-deploy.sh

- ssh localhost echo add to know host

- sudo mkdir /tmp/tests
@@ -59,12 +58,9 @@ script:
- vvv-validate-rst CHANGES.rst
- python setup.py --long-description > PYPI.rst
- vvv-validate-rst PYPI.rst
- python setup.py nosetests --nocapture --nologcapture --stop --attr '!'nopy`echo $TRAVIS_PYTHON_VERSION | awk -F . '{{print $1}}'`
- python setup.py nosetests --attr '!'nopy`echo $TRAVIS_PYTHON_VERSION | awk -F . '{{print $1}}'`
- git --no-pager diff --check `git log --oneline | tail -1 | cut --fields=1 --delimiter=' '`

after_failure:
- python setup.py nosetests --logging-filter=tilecloud,tilecloud_chain --attr '!'nopy`echo $TRAVIS_PYTHON_VERSION | awk -F . '{{print $1}}'`

after_success:
# Report coverage results to coveralls.io
- pip install coveralls
64 changes: 14 additions & 50 deletions README.rst
@@ -29,7 +29,7 @@ Features:
- Delete empty tiles.
- Copy files between caches.
- Be able to use an SQS queue to dispatch the generation.
- Post processon the generated tiles.
- Post processing the generated tiles.
- ...


@@ -166,7 +166,7 @@ To start the common attributes are:

``bbox`` used to limit the tiles generation.

``px_buffer`` a buffer in px arround the object area (geoms or extent).
``px_buffer`` a buffer in px around the object area (geoms or extent).
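
As an illustration, the two attributes above could appear in a layer entry like this minimal sketch (the layer name and the numeric values are invented for the example, not taken from this diff):

.. code:: yaml

    layers:
        my_layer:
            # limit the generation to this extent [minx, miny, maxx, maxy]
            bbox: [420000, 30000, 900000, 350000]
            # buffer in px around the object area (geoms or extent)
            px_buffer: 10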


WMTS layout
@@ -277,7 +277,7 @@ Legends
~~~~~~~

To be able to generate legends with ``generate_controller --generate-legend-images``
you should have ``legend_mime`` and ``legend_extention`` in the layer config.
you should have ``legend_mime`` and ``legend_extention`` in the layer configuration.

for example:

@@ -288,11 +288,11 @@ for example:
Then it will create a legend image per layer and per zoom level named
``.../1.0.0/{{layer}}/{{wmts_style}}/legend{{zoom}}.{{legend_extention}}``
only if she is deferent than the previous zoom level. If we have only one legend image
only if it is different from the previous zoom level. If we have only one legend image
it is still stored in the file named ``legend0.{{legend_extention}}``.
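
A layer carrying the two legend keys mentioned above might look like this minimal sketch (the layer name and the values are assumptions, not taken from this diff):

.. code:: yaml

    layers:
        my_layer:
            # MIME type requested for the legend graphics
            legend_mime: image/png
            # extension used for the generated legend files
            legend_extention: png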

When we run ``generate_controller --generate-wmts-capabilities`` we will first
parse the legend images to generate a layer config like this:
parse the legend images to generate a layer configuration like this:

.. code:: yaml
@@ -421,7 +421,7 @@ The Apache configuration look like this (default values):
headers:
Cache-Control: max-age=864000, public
If we use a proxy to access to the tiles we can specify a deferent URL to access
If we use a proxy to access the tiles, we can specify a different URL to reach
the tiles by adding the ``tiles_url`` parameter to the cache.
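
A rough sketch of a cache entry using ``tiles_url`` (the cache name, folder and URL are placeholders):

.. code:: yaml

    caches:
        local:
            type: filesystem
            folder: /var/sig/tiles
            # URL advertised to clients that reach the tiles through a proxy
            tiles_url: https://proxy.example.com/tiles/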

Configure MapCache
@@ -454,7 +454,7 @@ To generate the MapCache configuration we use the command::
Tiles error file
----------------

If we set a file path in config file:
If we set a file path in configuration file:

.. code:: yaml
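
    # a sketch of the idea; the exact key names are an assumption,
    # they are not visible in this excerpt of the diff
    generation:
        error_file: error.list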
@@ -538,7 +538,7 @@ The cache configuration is like this:
# the used folder in the bucket [default to '']
folder: ''
# for GetCapabilities
http_url: https://%(host)s/%(bucket)s/%(folder)s
http_url: https://%(host)s/%(bucket)s/%(folder)s/
hosts:
- wmts0.<host>
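
Putting the visible pieces together, a complete S3 cache entry might look like the sketch below (the bucket and host names are placeholders); the ``%(host)s`` placeholder in ``http_url`` is apparently filled from the ``hosts`` list when the GetCapabilities document is generated:

.. code:: yaml

    caches:
        s3:
            type: s3
            bucket: my-tiles-bucket
            folder: ''
            # for GetCapabilities
            http_url: https://%(host)s/%(bucket)s/%(folder)s/
            hosts:
            - wmts0.example.com
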
@@ -582,31 +582,6 @@ The configuration is like this:
The topic should already exist.
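
A minimal sketch of such an SNS block (the ARN and region are placeholders, and the key names are an assumption rather than something shown in this excerpt):

.. code:: yaml

    sns:
        topic: arn:aws:sns:eu-west-1:123456789012:tilecloud
        region: eu-west-1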

Configure and explain EC2
-------------------------

The generation can be deported on an external host.

This will deploy the code the database and the geodata to an external host,
configure or build the application, configure apache, and run the tile generation.

This work only with S3 and needs SQS.

In a future version it will start the new EC2 host, join an ESB, run the tile generation,
and do snapshot on the ESB.

The configuration is like this:

.. code:: yaml
ec2:
geodata_folder: /var/sig
deploy_config: tilegeneration/deploy.cfg
deploy_user: deploy
code_folder: /var/www/vhost/project/private/project
apache_config: /var/www/vhost/project/conf/tilegeneration.conf
apache_content: Include /var/www/vhost/project/private/project/apache/\*.conf
Amazon tool
-----------

@@ -692,7 +667,7 @@ The server can be configure as it:
# allowed extension in the static path (default value), not used for s3.
static_allow_extension: [jpeg, png, xml, js, html, css]
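
For instance, a server block including the option shown above might look like this sketch (the ``static_path`` key and its value are assumptions, not taken from this diff):

.. code:: yaml

    server:
        # sub path used to serve the static files (assumed key)
        static_path: static
        # allowed extension in the static path, not used for s3
        static_allow_extension: [jpeg, png, xml, js, html, css]
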
The minimal config is to enable it:
The minimal configuration is to enable it:

.. code:: yaml
@@ -705,7 +680,7 @@ You should also configure the ``http_url`` of the used `cache`, to something lik
Pyramid view
------------

To use the pyramid view use the following config:
To use the pyramid view use the following configuration:

.. code:: python
@@ -725,7 +700,7 @@ in ``production.ini``::
use = egg:tilecloud_chain#server
configfile = %(here)s/tilegeneration/config.yaml

with the apache configuration::
with the Apache configuration::

WSGIDaemonProcess tiles:${instanceid} display-name=%{GROUP} user=${modwsgi_user}
WSGIScriptAlias /${instanceid}/tiles ${directory}/apache/wmts.wsgi
@@ -742,12 +717,11 @@ Commands
Available commands
------------------

* ``generate_controller`` generate the annexe files like capabilities, legend, OpenLayers test page, MapCacke config, Apache config.
* ``generate_controller`` generates the auxiliary files: capabilities, legend images, the OpenLayers test page, the MapCache configuration and the Apache configuration.
* ``generate_tiles`` generates the tiles.
* ``generate_copy`` copies the tiles from one cache to another.
* ``generate_process`` prosses the tiles using a configured prosess.
* ``generate_process`` processes the tiles using a configured process.
* ``generate_cost`` estimates the cost.
* ``generate_amazon`` generate the tiles using EC2.
* ``import_expiretiles`` imports the osm2pgsql expire-tiles file as geometries in the database.

Each command has a ``--help`` option that prints a full description of its arguments.
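
For example, the flags that appear elsewhere in this README can be chained into a small run::

    generate_controller --generate-legend-images
    generate_controller --generate-wmts-capabilities
    generate_tiles
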
@@ -788,7 +762,7 @@ Generate a tiles near a tile coordinate (useful for test)::

generate_tiles --near <X> <Y>

Generate a tiles in a deferent cache than the default one::
Generate tiles in a different cache than the default one::

generate_tiles --cache <a_cache>

@@ -816,17 +790,9 @@ Configuration (default values):
cost:
# [nb/month]
request_per_layers: 10000000
# GeoData size [Go]
esb_size: 100
cloudfront:
download: 0.12,
get: 0.009
ec2:
usage: 0.17
esb:
io: 260.0,
storage: 0.11
esb_size: 100
request_per_layers: 10000000
s3:
download: 0.12,
@@ -851,8 +817,6 @@ The following commands can be used to know the time and cost to do generation::

generate_controller --cost

This suppose that you use a separate EC2 host to generate the tiles.

Useful options
--------------

18 changes: 18 additions & 0 deletions docker/test-db/20_init.sql
@@ -0,0 +1,18 @@
CREATE SCHEMA tests;

CREATE TABLE tests.point (gid serial Primary KEY, name varchar(10));
SELECT AddGeometryColumn('tests', 'point','the_geom',21781,'POINT',2);

CREATE TABLE tests.line (gid serial Primary KEY, name varchar(10));
SELECT AddGeometryColumn('tests', 'line','the_geom',21781,'LINESTRING',2);

CREATE TABLE tests.polygon (gid serial Primary KEY, name varchar(10));
SELECT AddGeometryColumn('tests', 'polygon','the_geom',21781,'POLYGON',2);


INSERT INTO tests.point VALUES (0, 'point1', ST_GeomFromText('POINT (600000 200000)', 21781));
INSERT INTO tests.point VALUES (1, 'point2', ST_GeomFromText('POINT (530000 150000)', 21781));

INSERT INTO tests.line VALUES (0, 'line1', ST_GeomFromText('LINESTRING (600000 200000,530000 150000)', 21781));

INSERT INTO tests.polygon VALUES (0, 'polygon1', ST_GeomFromText('POLYGON ((600000 200000,600000 150000,530000 150000, 530000 200000, 600000 200000))', 21781));
6 changes: 6 additions & 0 deletions docker/test-db/Dockerfile
@@ -0,0 +1,6 @@
FROM camptocamp/postgresql:latest

ENV POSTGRES_USER www-data
ENV POSTGRES_DB tests

ADD *.sql /docker-entrypoint-initdb.d
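
For a local run, the test database image can be built and started exactly the way the Travis configuration above does it::

    docker build --tag=test-db docker/test-db
    docker run --publish=5432:5432 --detach test-db
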
1 change: 0 additions & 1 deletion setup.cfg
@@ -1,7 +1,6 @@
[nosetests]
match = ^test
where = tilecloud_chain/tests
nocapture = 1
cover-package = tilecloud_chain
with-coverage = 1
cover-erase = 1
2 changes: 0 additions & 2 deletions setup.py
@@ -49,15 +49,13 @@
'console_scripts': [
'generate_tiles = tilecloud_chain.generate:main',
'generate_controller = tilecloud_chain.controller:main',
'generate_amazon = tilecloud_chain.amazon:main',
'generate_cost = tilecloud_chain.cost:main',
'generate_copy = tilecloud_chain.copy_:main',
'generate_process = tilecloud_chain.copy_:process',
'import_expiretiles = tilecloud_chain.expiretiles:main',
],
'pyramid.scaffold': [
'tilecloud_chain = tilecloud_chain.scaffolds:Create',
'tilecloud_chain_ec2 = tilecloud_chain.scaffolds:Ec2',
],
'paste.app_factory': [
'server = tilecloud_chain.server:app_factory',
21 changes: 10 additions & 11 deletions tilecloud_chain/__init__.py
@@ -168,20 +168,19 @@ def __init__(self, config_file, options=None, layer_name=None, base_config={}):
self.config['cost']['s3'] = {}
if 'cloudfront' not in self.config['cost']:
self.config['cost']['cloudfront'] = {}
if 'ec2' not in self.config['cost']:
self.config['cost']['ec2'] = {}
if 'esb' not in self.config['cost']:
self.config['cost']['esb'] = {}
if 'sqs' not in self.config['cost']:
self.config['cost']['sqs'] = {}
if 'generation' not in self.config:
self.config['generation'] = {}
for gname, grid in sorted(self.config.get('grids', {}).items()):
grid["name"] = gname
if grid is not None:
grid["name"] = gname
for cname, cache in sorted(self.config.get('caches', {}).items()):
cache["name"] = cname
if cache is not None:
cache["name"] = cname
for lname, layer in sorted(self.config.get('layers', {}).items()):
layer["name"] = lname
if layer is not None:
layer["name"] = lname

c = Core(
source_data=self.config,
@@ -222,7 +221,7 @@ def __init__(self, config_file, options=None, layer_name=None, base_config={}):
]))
))
exit(1)
except NotSequenceError as e:
except NotSequenceError as e: # pragma: no cover
logger.error("The config file '{}' in invalid.\n - {}".format(
config_file, e.msg
))
@@ -287,7 +286,7 @@ def __init__(self, config_file, options=None, layer_name=None, base_config={}):
layer['dimensions'] = []
if layer['type'] == 'mapnik' and \
layer['output_format'] == 'grid' and \
layer.get('meta', False):
layer.get('meta', False): # pragma: no cover
logger.error("The layer '{}' is of type Mapnik/Grid, that can't support matatiles.".format(
lname
))
@@ -512,7 +511,7 @@ def set_layer(self, layer, options):

if options.near is not None or (
options.time is not None and 'bbox' in self.layer and options.zoom is not None
):
): # pragma: no cover
if options.zoom is None or len(options.zoom) != 1: # pragma: no cover
exit('Option --near needs the option --zoom with one value.')
if not (options.time is not None or options.test is not None): # pragma: no cover
@@ -618,7 +617,7 @@ def get_geoms(self, layer, extent=None):

def add_local_process_filter(self): # pragma: no cover
self.ifilter(LocalProcessFilter(
self.config["ec2"]["number_process"],
self.config["generation"]["number_process"],
self.options.local_process_number
))
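
Since the filter above now takes its process count from the ``generation`` section, the configuration would presumably carry an entry like this sketch (the value is invented):

.. code:: yaml

    generation:
        number_process: 4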

