
Installation Guide

There are two ways to set up your own development version of CERN Analysis Preservation: a Bare Installation using python virtualenvwrapper, or a Docker Installation.

Bare Installation

This is a step-by-step guide for installing CERN Analysis Preservation on your machine.

Prerequisites

CERN Analysis Preservation is based on Invenio v3.0 alpha, which requires some additional software packages: Elasticsearch, PostgreSQL, RabbitMQ and Redis.

For example, on Debian GNU/Linux, you can install them as follows:

sudo apt-get install elasticsearch postgresql rabbitmq-server redis-server

Now, add the following lines to your elasticsearch.yml (on Debian GNU/Linux the full path is /etc/elasticsearch/elasticsearch.yml):

# CAP CONFIGURATION
cluster.name: cap
discovery.zen.ping.multicast.enabled: false
http.port: 9200
http.publish_port: 9200

In order to use PostgreSQL you need to start the database server. This is very operating-system-specific, so you should check how it is done on yours. When the server is running, switch to the default PostgreSQL user and create a user who is allowed to create databases:

createuser -d $Username
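For example, on Debian GNU/Linux this might look as follows (the service name and the default postgres account are assumptions that may differ on your system):

# Start the PostgreSQL server (Debian GNU/Linux)
sudo service postgresql start

# Switch to the default "postgres" user and create a database user
# that is allowed to create databases
sudo -u postgres createuser -d $USER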

Finally, do a system-wide install of the Sass preprocessor (see below for how to do a local install enclosed inside your virtual environment instead) by following the Sass web guide and running:

sudo npm install -g node-sass@3.8.0 clean-css@3.4.12 uglify-js requirejs
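To check that the global install succeeded, you can verify that the binaries are on your PATH, for example:

# All four should resolve to an executable
which node-sass cleancss uglifyjs r.js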

Installation

Let's start by cloning the repository:

git clone https://github.com/cernanalysispreservation/analysispreservation.cern.ch.git cap

Everything else will be installed inside a Python virtualenv for easy maintenance and encapsulation of the required libraries. From inside your cap folder you can switch to any of your existing virtual environments at any time (just type workon <virtualenv_name>) or create a new one.

To do the latter, create a new virtual environment to hold our CAP instance from inside the repository folder:

cd cap
mkvirtualenv cap
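mkvirtualenv both creates and activates the environment. Later on you can re-activate or leave it with the usual virtualenvwrapper commands:

# Re-activate the cap environment in a new shell
workon cap

# Leave the environment when you are done
deactivate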

Install the CAP package from inside your cap repository folder and run npm to install the necessary JavaScript assets the Invenio modules depend on:

pip install -r requirements.txt
cap npm
cdvirtualenv var/cap-instance/static
npm install bower
npm install

Build the assets from your repository folder:

cd -
cap collect -v
cap assets build
python ./scripts/schemas.py

Start Elasticsearch in the background:

elasticsearch &
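To verify that Elasticsearch is up and has picked up the configuration from above, you can query it directly:

# Should return a JSON document with "cluster_name" : "cap"
curl http://localhost:9200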

Note: Instead of the following steps you may want to run ./scripts/init.sh.

Create a database to hold persistent data:

cap db init
cap db create

Create test user accounts and roles with which you can log in later:

cap users create info@inveniosoftware.org -a --password infoinfo
cap users create alice@inveniosoftware.org -a --password alicealice
cap users create atlas@inveniosoftware.org -a --password atlasatlas
cap users create cms@inveniosoftware.org -a --password cmscms
cap users create lhcb@inveniosoftware.org -a --password lhcblhcb

cap roles create analysis-preservation-support@cern.ch
cap roles create alice-member@cern.ch
cap roles create atlas-active-members-all@cern.ch
cap roles create cms-members@cern.ch
cap roles create lhcb-general@cern.ch

cap roles add info@inveniosoftware.org analysis-preservation-support@cern.ch
cap roles add alice@inveniosoftware.org alice-member@cern.ch
cap roles add atlas@inveniosoftware.org atlas-active-members-all@cern.ch
cap roles add cms@inveniosoftware.org cms-members@cern.ch
cap roles add lhcb@inveniosoftware.org lhcb-general@cern.ch

info is a superuser, alice is an ALICE user, atlas is an ATLAS user, cms is a CMS user and lhcb is an LHCb user.

Create some basic collections for Elasticsearch:

cap collections create CERNAnalysisPreservation
cap collections create CMS -p CERNAnalysisPreservation
cap collections create CMSQuestionnaire -p CMS -q '_type:cmsquestionnaire'
cap collections create CMSAnalysis -p CMS -q '_type:cmsanalysis'
cap collections create LHCb -p CERNAnalysisPreservation
cap collections create LHCbAnalysis -p LHCb -q '_type:lhcbanalysis'
cap collections create ATLAS -p CERNAnalysisPreservation
cap collections create ATLASWorkflows -p ATLAS -q '_type:atlasworkflows'
cap collections create ATLASAnalysis -p ATLAS -q '_type:atlasanalysis'
cap collections create ALICE -p CERNAnalysisPreservation

Create the index in Elasticsearch using the mappings:

cap index init

Create a location for files:

cap files location local var/data --default

Now you are ready to run the server.

Populating the Database with Example Records

If you want to populate the database with example records simply run:

# For creating demo records with schema validation
cap fixtures records

# For creating demo records without validation ( --force )
cap fixtures records -f

Prerequisites for Running the Server

To run an https server you will have to create a certificate. This needs to be done only once from inside your repository folder:

openssl genrsa 4096 > ssl.key
openssl req -key ssl.key -new -x509 -days 365 -sha256 -batch > ssl.crt

The certificate will be valid for 365 days.
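Should you want to check the certificate later, e.g. its expiry date, you can inspect it with openssl:

# Print the validity period of the generated certificate
openssl x509 -in ssl.crt -noout -dates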

Running the Server

Start a redis server in the background:

redis-server &

Start the web application locally in debug mode:

gunicorn -b 127.0.0.1:5000 --certfile=ssl.crt --keyfile=ssl.key cap.wsgi:application --workers 9 --log-level debug

Now you can log in locally in your browser by going to https://localhost:5000/app/login and entering one of the user credentials created above, e.g. user info@inveniosoftware.org with password infoinfo.
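If the page does not load, you can first check the server from the command line; the -k flag tells curl to accept the self-signed certificate:

curl -k https://localhost:5000/app/login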

General Recommendations

Specify Python Version

You can specify the python version for the virtual environment on creation as follows (e.g. to use python 2.7):

mkvirtualenv -p /usr/bin/python2.7 cap

Local Installation of npms and gems

You do not need to install Sass and all the npm dependencies globally on your system. You can install them inside your virtual environment so that they are only accessible from within it. Simply add:

export GEM_HOME="$VIRTUAL_ENV/gems"
export GEM_PATH=""
export PATH="$GEM_HOME/bin:$PATH"
export npm_config_prefix=$VIRTUAL_ENV

to the postactivate script in your .virtualenvs folder and run

cdvirtualenv
gem install sass
npm -g install node-sass@3.8.0 clean-css@3.4.12 uglify-js requirejs

after creating your virtual environment.
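You can confirm that the local installation takes precedence by checking that the binaries resolve to paths below $VIRTUAL_ENV rather than to system-wide locations:

# Both should print paths inside your virtual environment
which sass
which node-sass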

Troubleshooting

Missing Requirements

If you have trouble with the setup, check if you are missing one of the following requirements, e.g. on Debian GNU/Linux:

sudo apt-get install npm ruby gcc python-virtualenvwrapper

The version of Python 2 given by python --version or python2 --version should be greater than 2.7.10.

Non-matching Requirements

If you encounter a problem with requirements that do not match, it may be because the Python eggs in your virtualenv are outdated, in which case you will have to update them by running:

pip install -r requirements.txt

Database Indexing Problems

If you have trouble indexing the database try:

cap db destroy
cap db init

and if that does not work try:

curl -XDELETE 'http://localhost:9200/_all'
cap db init
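Note that after deleting all Elasticsearch indices you will most likely have to recreate the index and mappings again, as during installation:

cap index init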

Docker Installation

First, install docker-engine and docker-compose on your machine.

Second, build the CERN Analysis Preservation images, using the development configuration:

docker-compose -f docker-compose-dev.yml build

Third, start the CERN Analysis Preservation application:

docker-compose -f docker-compose-dev.yml up -d

Fourth, create the database and initialise the default collections and users:

docker exec -i -t analysispreservationcernch_web_1 /code/scripts/init.sh

Fifth, populate the database with some example records (optional):

docker exec -i -t analysispreservationcernch_web_1 cap fixtures records -f

Finally, see the site in action:

firefox http://localhost/
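If the site does not come up, you can check the state and the logs of the containers (the web service name is taken from the container name used above):

docker-compose -f docker-compose-dev.yml ps
docker-compose -f docker-compose-dev.yml logs web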