Geolinkedata REST API

Geolinkedata REST API is a Django app that provides a RESTful API for the Geolinkedata project.


It requires GeoGig. Build and install it with Maven:

git clone

cd geogig/src/parent

mvn clean install

It is recommended to isolate development in a virtual environment. For example, you can use pyenv with the pyenv-virtualenv plugin.


Create a virtual environment with the required Python version:

pyenv install 2.7.11
pyenv virtualenv 2.7.11 geolinkedata

Enter the virtual environment:

eval "$(pyenv init -)"
pyenv shell geolinkedata
pyenv activate geolinkedata

First, install the Django version supported by GeoNode for compatibility:

pip install django==1.8.7
pip freeze > requirements.txt

Then install these Python packages:

pip install djangorestframework
pip install djangorestframework-xml
pip install django-oauth-toolkit
pip install django-rest-swagger
pip install geogig-py


  • Start a new Django project in the same virtual enviroment:
mkdir api_tutorial && cd api_tutorial
django-admin startproject api_tutorial .
  • Install the api application from the repository:
pip install -e <LOCAL_PATH>/geolod-api
  • Append the required apps to the INSTALLED_APPS variable in your settings.py:
INSTALLED_APPS += (
    # app labels inferred from the packages installed above
    'rest_framework',
    'rest_framework_swagger',
    'oauth2_provider',
    'api',
)
  • Add these configurations in the same file:
STATIC_ROOT = os.path.join(BASE_DIR, "static")

# dirs for uploading and storing files
UPLOAD_SHAPE = '/tmp/shapes'
UPLOAD_TRIPLE_STORE = '/tmp/triple-stores'

# rest_framework config
REST_FRAMEWORK = {
    'DEFAULT_THROTTLE_RATES': {
        'default': '10/minute',
        'download': '50/minute',
        'utility': '5/minute',
    },
}

# rest swagger config
SWAGGER_SETTINGS = {
    "exclude_namespaces": [],
    "api_version": '1.0',
    "api_path": "/",
    "enabled_methods": [  # django-rest-swagger defaults
        'get',
        'post',
        'put',
        'patch',
        'delete',
    ],
    "api_key": '',
    "is_authenticated": False,
    "is_superuser": False,
}
  • Create the api db tables:
python manage.py syncdb
  • Add the api urls to the urls.py of the api_tutorial project:
from django.conf import settings
from django.conf.urls import include, url
from django.conf.urls.static import static
from django.contrib import admin

urlpatterns = [
  url(r'^admin/', include(admin.site.urls)),
  # api
  url(r'^', include('api.urls')),
  # api swaggerized
  url(r'^docs/', include('rest_framework_swagger.urls')),
] + static(settings.STATIC_URL, document_root=settings.STATIC_ROOT)
  • Start geogig with:
  • Collect the static files to be served:
python manage.py collectstatic
  • Start the local server at the default port 8000 with gunicorn:
gunicorn api_tutorial.wsgi

Usage of the api_tutorial application with Docker

Set up the shell with your docker machine:

eval $(docker-machine env default)

Rebuild the services with this command:

docker-compose build

Run the application on the container by executing:

docker-compose up
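The commands above assume a docker-compose.yml at the project root. A minimal sketch, assuming a web service built from a local Dockerfile plus the db service referenced later (service names, command, and port mapping are assumptions, not the project's actual file):

```yaml
# minimal sketch (compose v1 format); names and ports are assumptions
db:
  image: postgres
web:
  build: .
  command: gunicorn api_tutorial.wsgi -b 0.0.0.0:8000
  ports:
    - "8000:8000"
  links:
    - db
```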

Add the first superuser for the application:

docker-compose run web python manage.py createsuperuser

Update the database settings to PostgreSQL

Modify the DATABASES setting in settings.py to change the database configuration:

DATABASES = {
    # 'default': {
    #     'ENGINE': 'django.db.backends.sqlite3',
    #     'NAME': os.path.join(BASE_DIR, 'db.sqlite3'),
    # }
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': 'postgres',
        'USER': 'postgres',
        'HOST': 'db',
        'PORT': 5432,
    }
}
where HOST points to the database service defined in docker-compose.yml:

db:
  image: postgres

Also add the psycopg2 package as a dependency:

pip install psycopg2
pip freeze > requirements.txt


Migrations can be troublesome with older Django versions. If you encounter an error such as 'django.db.utils.ProgrammingError: relation "auth_user" does not exist', follow the steps below.

To rebuild the migrations for the api reusable app, execute the commands below. First delete the old compiled .pyc files:

rm -rf api/*.pyc
rm -rf api/migrations/*.pyc

Build new migrations for the api app:

python manage.py makemigrations api

After this the migrations will be generated again.

Update the container

Once the project is ready, rebuild and run the container from scratch:

docker-compose down
docker-compose build
docker-compose up

Then execute the migrate command in the api_tutorial django project inside the container:

docker-compose run web python manage.py migrate

Finally create a new superuser with the command:

docker-compose run web python manage.py createsuperuser

Test the api_tutorial application

You can test the API URLs with the superuser just created. First, it is useful to set some basic variables for the Docker host IP address:

DOCKER_HOST_IP=$(docker-machine ip)

To get the cookies required for calls to the Django site, make the following request with curl or a similar tool:

curl -Ic - -XGET http://$DOCKER_HOST_IP:8000/admin/login/\?next\=/admin/

Each response carries a cookie named csrftoken that we want to capture and reuse as a variable in the following requests:

CSRFTOKEN=$(curl -c - -XGET "http://${DOCKER_HOST_IP}:8000/admin/login/?next=/admin/" | grep csrftoken | cut -f 7)


Alternatively you can use the commands below to extract the cookie:

curl -I -XGET http://$DOCKER_HOST_IP:8000/admin/login/?next=/admin/ -o /dev/null -c cookies.txt -s
grep csrftoken cookies.txt | cut -f 7
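The grep/cut pipeline above relies on curl's Netscape cookie-jar format, where each entry is seven tab-separated fields and the value is the last one. A small Python helper (illustrative only, not part of the project) makes that parsing explicit:

```python
def cookie_value(cookie_jar_text, name):
    """Return the value of cookie `name` from curl's Netscape cookie-jar output.

    Each non-comment line holds seven tab-separated fields:
    domain, include-subdomains flag, path, secure flag, expiry, name, value.
    `cut -f 7` selects the final field, the cookie value.
    """
    for line in cookie_jar_text.splitlines():
        # curl prefixes HttpOnly cookies with "#HttpOnly_"; unwrap them
        if line.startswith("#HttpOnly_"):
            line = line[len("#HttpOnly_"):]
        elif line.startswith("#") or not line.strip():
            continue  # skip comments and blank lines
        fields = line.split("\t")
        if len(fields) == 7 and fields[5] == name:
            return fields[6]
    return None


jar = ("# Netscape HTTP Cookie File\n"
       "192.168.99.100\tFALSE\t/\tFALSE\t0\tcsrftoken\tabc123\n")
print(cookie_value(jar, "csrftoken"))  # -> abc123
```

This mirrors what the shell pipeline does, but returns None instead of empty output when the cookie is absent.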

Once we have all the elements to accomplish the login request then run the HTTP POST with the following command:

curl -H "Cookie: csrftoken=$CSRFTOKEN" -d "username=admin&password=admin1234&csrfmiddlewaretoken=$CSRFTOKEN&next=/admin/" -XPOST http://$DOCKER_HOST_IP:8000/admin/login/ -v -c -

The response sets two new cookies (csrftoken, sessionid) required for all authenticated calls to the web application URLs. Embed the command above in bash variables to store the cookie values automatically and reuse them:

# csrftoken cookie
CSRFTOKEN_RESP=$(curl -H "Cookie: csrftoken=$CSRFTOKEN" -d "username=admin&password=admin1234&csrfmiddlewaretoken=$CSRFTOKEN&next=/admin/" -XPOST "http://${DOCKER_HOST_IP}:8000/admin/login/" -c - | grep csrftoken | cut -f 7)
# sessionid cookie
SESSIONID=$(curl -H "Cookie: csrftoken=$CSRFTOKEN" -d "username=admin&password=admin1234&csrfmiddlewaretoken=$CSRFTOKEN&next=/admin/" -XPOST "http://${DOCKER_HOST_IP}:8000/admin/login/" -c - | grep sessionid | cut -f 7)

At this point we can make authenticated calls to the APIs. For example, as an administrator you can query all the users available in the Django system:

curl -H "Cookie: csrftoken=$CSRFTOKEN_RESP; sessionid=$SESSIONID" -XGET '' -v

e2e tests


Fetch the API model

Install the utility fetch-swagger-schema

npm install -g fetch-swagger-schema

Fetch and save schema as a json file:

fetch-swagger-schema api.json

How to document your API

Currently the fetched schema is based on the 1.2 specs, since django-rest-swagger doesn't support the newer version 2.0. You can also edit your API specification with the newer version using the Swagger Editor GUI. Follow the commands below:

npm install -g http-server
http-server swagger-editor

Then open the API console at the local URL printed by http-server.


RESTful API for geographic linked open data



