Merge 6b70ee3 into bddc209
viklund committed Apr 5, 2018
2 parents bddc209 + 6b70ee3 commit 8ee21be
Showing 36 changed files with 579 additions and 70 deletions.
5 changes: 5 additions & 0 deletions .coveragerc
Original file line number Diff line number Diff line change
@@ -0,0 +1,5 @@
[run]
omit = /usr/local/*,/home/travis/virtualenv/*

[report]
omit = /usr/local/*,/home/travis/virtualenv/*
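
The new `.coveragerc` above excludes system and virtualenv files from coverage measurement using glob patterns. A rough Python model of how such `omit` globs apply (the real matching lives inside coverage.py; the `omitted` helper is a name invented here for illustration):

```python
import fnmatch

# The omit patterns from the .coveragerc above.
OMIT = ["/usr/local/*", "/home/travis/virtualenv/*"]

def omitted(path):
    # A file is excluded from the coverage report when any omit glob
    # matches its full path; fnmatch-style '*' also crosses '/' boundaries.
    return any(fnmatch.fnmatch(path, pattern) for pattern in OMIT)
```

This keeps installed dependencies out of the report so coverage numbers reflect only the project's own code.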
3 changes: 2 additions & 1 deletion .pylintrc
@@ -138,7 +138,8 @@ disable=print-statement,
keyword-arg-before-vararg,
arguments-differ,
line-too-long,
import-error
import-error,
no-self-use

# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
1 change: 1 addition & 0 deletions .travis.yml
@@ -7,5 +7,6 @@ before_install:
- test/travis_before_install.sh
install:
- pip install -r backend/requirements.txt
- pip install coverage coveralls
script:
- test/travis_script.sh
13 changes: 13 additions & 0 deletions Dockerfile-backend
@@ -0,0 +1,13 @@
FROM ubuntu:16.04

RUN apt-get update && apt-get install -y \
python3 \
python3-pip \
libmysqlclient-dev

ADD . /code
WORKDIR /code

RUN pip3 install -r backend/requirements.txt

CMD ["python3", "backend/route.py", "--develop"]
24 changes: 24 additions & 0 deletions Dockerfile-frontend-rebuilder
@@ -0,0 +1,24 @@
FROM ubuntu:16.04

RUN apt-get update && \
apt-get install -y \
curl \
rsync \
python3 \
python3-pip \
python3-pyinotify \
inotify-tools \
libmysqlclient-dev && \
update-alternatives --install /usr/bin/python python /usr/bin/python3 5

RUN curl -sL https://deb.nodesource.com/setup_6.x | bash - && \
apt-get install -y nodejs

ADD . /code
WORKDIR /code

RUN pip3 install --upgrade pip && \
pip3 install -r backend/requirements.txt && \
pip3 install inotify

CMD ["python", "scripts/watch_frontend.py"]
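
This image installs `python3-pyinotify` and `inotify-tools` so that `scripts/watch_frontend.py` can rebuild the frontend when source files change. A dependency-free sketch of the underlying change-detection idea (the real script presumably reacts to inotify events rather than polling; `mtime_changed` is a name invented here):

```python
import os

def mtime_changed(path, last_mtime):
    # True when the file's modification time differs from the one we
    # recorded earlier -- the polling analogue of an inotify modify event.
    return os.stat(path).st_mtime != last_mtime
```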
4 changes: 2 additions & 2 deletions Makefile
@@ -42,6 +42,6 @@ static/js/app.js: $(JAVASCRIPT_FILES)
mkdir -p $$( dirname $@ )
cat $^ >$@

static/templates/%.html: frontend/templates/%.html
static/templates/%.html: frontend/templates/%.html frontend/templates/ng-templates/dataset-base.html
mkdir -p $$( dirname $@ ) 2>/dev/null || true
python scripts/compile_template.py ${COMPILE_TEMPLATE_OPTS} -b frontend/templates -s $< >$@
python3 scripts/compile_template.py ${COMPILE_TEMPLATE_OPTS} -b frontend/templates -s $< >$@
25 changes: 16 additions & 9 deletions README.md
@@ -1,5 +1,8 @@
SweFreq - Swedish Frequency database
====================================
[![Travis Status][travis-badge]][travis-link]
[![Coverage Status][coveralls-badge]][coveralls-link]


Running on a production system
------------------------------
@@ -55,19 +58,23 @@ The application has only been tested with python 3.5.2. It will most likely work
Quick development mode
----------------------

1. Install docker
2. Look at `test/travis_before_install.sh` to initiate the mysql docker image.
1. Install docker (and docker-compose in case it's not included in the installation)
2. Create test database
2.1. Initiate a mysql data volume by running `./scripts/download_and_create_docker_db_volume.sh`
2.2. Load mysql dummy data by running `./scripts/load_mysql_dummy_data.sh`
3. Copy `settings_sample.json` into `settings.json` and
- Change mysqlSchema into `swefreq_test`.
- Change mysqlPort to 3366
- Update the credentials for elixir and google oauth.
- Elixir/redirectUri: http://localhost:4000/elixir/login
- redirectUri: http://localhost:4000/login
4. Run "Test 2. Load the swefreq schema" from `test/travis_script.sh`.
5. Run `make` in the root directory of the project.
6. Make a symbolic link from `backend/static` to `static`.
7. Run the server:
- Set `mysqlHost` to `db`
4. Make a symbolic link from `backend/static` to `static`.
5. Run the server:
```bash
$ cd backend
$ python route.py --develop
$ docker-compose up
```

[travis-badge]: https://travis-ci.org/NBISweden/swefreq.svg?branch=develop
[travis-link]: https://travis-ci.org/NBISweden/swefreq
[coveralls-badge]: https://coveralls.io/repos/github/NBISweden/swefreq/badge.svg?branch=develop
[coveralls-link]: https://coveralls.io/github/NBISweden/swefreq?branch=develop
160 changes: 160 additions & 0 deletions backend/application.py
@@ -3,11 +3,17 @@
from os import path
import logging
from datetime import datetime, timedelta
from peewee import fn
import peewee
import smtplib
import socket
import tornado.web
import tornado
import random
import string
import uuid
import math
import re

import db
import handlers
@@ -36,6 +42,88 @@ def build_dataset_structure(dataset_version, user=None, dataset=None):
return r


class QuitHandler(handlers.UnsafeHandler):
def get(self):
ioloop = tornado.ioloop.IOLoop.instance()
ioloop.stop()


class GetSchema(handlers.UnsafeHandler):
"""
Returns the schema.org, and bioschemas.org, annotation for a given
url.
This function behaves quite differently from the rest of the application as
the structured data testing tool had trouble catching the schema inject
when it went through AngularJS. The solution for now has been to make this
very general function that "re-parses" the 'url' request parameter to
figure out what information to return.
"""
def get(self):

dataset = None
version = None
try:
url = self.get_argument('url')
match = re.match(".*/dataset/([^/]+)(/version/([^/]+))?", url)
if match:
dataset = match.group(1)
version = match.group(3)
except tornado.web.MissingArgumentError:
pass

base = {"@context": "http://schema.org/",
"@type": "DataCatalog",
"name": "SweFreq",
"alternateName": [ "The Swedish Frequency resource for genomics" ],
"description": "The Swedish Frequency resource for genomics (SweFreq) is a website developed to make genomic datasets more findable and accessible in order to promote collaboration, new research and increase public benefit.",
"url": "https://swefreq.nbis.se/",
"provider": {
"@type": "Organization",
"name": "National Bioinformatics Infrastructure Sweden",
"alternateName": [ "NBIS",
"ELIXIR Sweden" ],
"logo": "http://nbis.se/assets/img/logos/nbislogo-green.svg",
"url": "https://nbis.se/"
},
"datePublished": "2016-12-23",
"dateModified": "2017-02-01",
"license": {
"@type": "CreativeWork",
"name": "GNU General Public License v3.0",
"url": "https://www.gnu.org/licenses/gpl-3.0.en.html"
}
}

if dataset:
dataset_schema = {'@type':"Dataset"}

try:
dataset_version = db.get_dataset_version(dataset, version)

if dataset_version.available_from > datetime.now():
# If it's not available yet, only return if user is admin.
if not (self.current_user and self.current_user.is_admin(version.dataset)):
self.send_error(status_code=403)

base_url = "%s://%s" % (self.request.protocol, self.request.host)
dataset_schema['url'] = base_url + "/dataset/" + dataset_version.dataset.short_name
dataset_schema['@id'] = dataset_schema['url']
dataset_schema['name'] = dataset_version.dataset.short_name
dataset_schema['description'] = dataset_version.description
dataset_schema['identifier'] = dataset_schema['name']
dataset_schema['citation'] = dataset_version.ref_doi

base["dataset"] = dataset_schema

except db.DatasetVersion.DoesNotExist as e:
logging.error("Dataset version does not exist: {}".format(e))
except db.DatasetVersionCurrent.DoesNotExist as e:
logging.error("Dataset does not exist: {}".format(e))

self.finish(base)
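
The URL re-parsing that `GetSchema.get` performs can be shown in isolation. A sketch using the same regular expression as the handler (the helper name and the example URLs are invented for illustration):

```python
import re

def parse_dataset_url(url):
    # Same pattern as GetSchema.get(): group 1 is the dataset short name,
    # group 3 the optional version; the leading '.*' skips the scheme/host.
    match = re.match(r".*/dataset/([^/]+)(/version/([^/]+))?", url)
    if not match:
        return None, None
    return match.group(1), match.group(3)
```

For example, `parse_dataset_url("https://swefreq.nbis.se/dataset/SweGen")` yields `("SweGen", None)`, while a `/version/...` suffix fills in the second element.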


class ListDatasets(handlers.UnsafeHandler):
def get(self):
# List all datasets available to the current user, earliear than now OR
@@ -362,6 +450,8 @@ def post(self, dataset, email):
server.sendmail(msg['from'], [msg['to']], msg.as_string())
except smtplib.SMTPException as e:
logging.error("Email error: {}".format(e))
except socket.gaierror as e:
logging.error("Email error: {}".format(e))

self.finish()
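
The extra `except socket.gaierror` clause added here matters because DNS resolution failures are not SMTP protocol errors: `socket.gaierror` descends from `OSError`, not from `smtplib.SMTPException`, so an `SMTPException`-only handler would let an unresolvable mail host crash the request. A sketch of the same two-clause structure (`send_with_logging` is a name invented here):

```python
import smtplib
import socket

def send_with_logging(send):
    # Mirrors the handler's error handling: SMTP protocol failures and
    # DNS/socket failures live in separate exception hierarchies, so
    # both except clauses are needed to log rather than crash.
    try:
        send()
    except smtplib.SMTPException as exc:
        return "smtp error: {}".format(exc)
    except socket.gaierror as exc:
        return "dns error: {}".format(exc)
    return "sent"
```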

@@ -476,3 +566,73 @@ def get(self, dataset):
self.set_header("Content-Type", logo_entry.mimetype)
self.write(logo_entry.data)
self.finish()


class SFTPAccess(handlers.SafeHandler):
"""
Creates, or re-enables, sFTP users in the database.
"""
def get(self):
"""
Returns sFTP credentials for the current user.
"""
if db.get_admin_datasets(self.current_user).count() <= 0:
self.finish({'user':None, 'expires':None, 'password':None})

password = None
username = None
expires = None
# Check if an sFTP user exists for the current user
try:
data = self.current_user.sftp_user.get()
username = data.user_name
expires = data.account_expires.strftime("%Y-%m-%d %H:%M")
except db.SFTPUser.DoesNotExist:
# Otherwise return empty values
pass

self.finish({'user':username,
'expires':expires,
'password':password})

def post(self):
"""
Handles generation of new credentials. This function either creates a
new set of sftp credentials for a user, or updates the old ones with a
new password and expiry date.
"""
if db.get_admin_datasets(self.current_user).count() <= 0:
self.finish({'user':None, 'expires':None, 'password':None})

# Create a new password
username = "_".join(self.current_user.name.split()) + "_sftp"
password = self.generate_password()
expires = datetime.today() + timedelta(days=30)

# Check if an sFTP user exists for the current user when the database is ready
try:
self.current_user.sftp_user.get()
# if we have a user, update it
db.SFTPUser.update(password_hash = fn.SHA2(password, 256),
account_expires = expires
).where(db.SFTPUser.user == self.current_user).execute()
except db.SFTPUser.DoesNotExist:
# if there is no user, insert the user in the database
db.SFTPUser.insert(user = self.current_user,
user_uid = db.get_next_free_uid(),
user_name = username,
password_hash = fn.SHA2(password, 256),
account_expires = expires
).execute()

self.finish({'user':username,
'expires':expires.strftime("%Y-%m-%d %H:%M"),
'password':password})

def generate_password(self, size = 12):
"""
Generates a password of length 'size', comprised of random lowercase and
uppercase letters, and numbers.
"""
chars = string.ascii_letters + string.digits
return ''.join(random.SystemRandom().choice(chars) for _ in range(size))
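
The sFTP credential handling above pairs a CSPRNG-generated password with MySQL's `SHA2` function, applied server-side via peewee's `fn.SHA2(password, 256)`. A self-contained equivalent (`sha2_256_hex` is a name invented here; `hashlib`'s hex digest produces the same string as MySQL's `SHA2(x, 256)`):

```python
import hashlib
import random
import string

def generate_password(size=12):
    # Same approach as SFTPAccess.generate_password(): characters drawn
    # with random.SystemRandom (OS entropy) from letters and digits.
    chars = string.ascii_letters + string.digits
    return ''.join(random.SystemRandom().choice(chars) for _ in range(size))

def sha2_256_hex(password):
    # Hex-encoded SHA-256 digest, matching what fn.SHA2(password, 256)
    # stores in the password_hash column.
    return hashlib.sha256(password.encode()).hexdigest()
```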
52 changes: 44 additions & 8 deletions backend/db.py
@@ -10,6 +10,7 @@
MySQLDatabase,
PrimaryKeyField,
TextField,
fn,
)
import settings

@@ -118,14 +119,16 @@ class Meta:


class DatasetVersion(BaseModel):
dataset_version = PrimaryKeyField(db_column='dataset_version_pk')
dataset = ForeignKeyField(db_column='dataset_pk', rel_model=Dataset, to_field='dataset', related_name='versions')
version = CharField()
description = TextField()
terms = TextField()
var_call_ref = CharField(null=True)
available_from = DateTimeField()
ref_doi = CharField(null=True)
dataset_version = PrimaryKeyField(db_column='dataset_version_pk')
dataset = ForeignKeyField(db_column='dataset_pk', rel_model=Dataset, to_field='dataset', related_name='versions')
version = CharField()
description = TextField()
terms = TextField()
var_call_ref = CharField(null=True)
available_from = DateTimeField()
ref_doi = CharField(null=True)
data_contact_name = CharField(null=True)
data_contact_link = CharField(null=True)

class Meta:
db_table = 'dataset_version'
@@ -252,6 +255,39 @@ class Meta:
db_table = 'dataset_version_current'


class SFTPUser(BaseModel):
sftp_user = PrimaryKeyField(db_column='sftp_user_pk')
user = ForeignKeyField(db_column='user_pk', rel_model=User, to_field='user', related_name='sftp_user')
user_uid = IntegerField(unique=True)
user_name = CharField(null=False)
password_hash = CharField(null=False)
account_expires = DateTimeField(null=False)

class Meta:
db_table = 'sftp_user'


def get_next_free_uid():
"""
Returns the next free uid >= 10000, and higher than the current uid's
from the sftp_user table in the database.
"""
default = 10000
next_uid = default
try:
current_max_uid = SFTPUser.select(fn.MAX(SFTPUser.user_uid)).get().user_uid
if current_max_uid:
next_uid = current_max_uid+1
except SFTPUser.DoesNotExist:
pass

return next_uid
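
The uid allocation in `get_next_free_uid` can be modeled without a database. A pure-Python sketch of the same logic (the real function takes `MAX(user_uid)` from the `sftp_user` table; `next_free_uid` and its list argument are inventions for illustration):

```python
DEFAULT_UID = 10000

def next_free_uid(existing_uids):
    # Empty table -> the default starting uid of 10000; otherwise one
    # above the current maximum, as in db.get_next_free_uid().
    current_max = max(existing_uids) if existing_uids else None
    if current_max:
        return current_max + 1
    return DEFAULT_UID
```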


def get_admin_datasets(user):
return DatasetAccess.select().where( DatasetAccess.user == user, DatasetAccess.is_admin)


def get_dataset(dataset):
dataset = Dataset.select().where( Dataset.short_name == dataset).get()
return dataset
