- Live example: https://github.com/VEuPathDB/dataset-handler-biom
- Script Expectations: https://github.com/VEuPathDB/util-user-dataset-handler-server#script-expectations
- Config.yml Variables: https://veupathdb.github.io/util-user-dataset-handler-server
The script that handles incoming datasets may be written in any language, but it must be executable via a CLI call. Python 2 is included by default; any other desired language will have to be installed via Alpine Linux's `apk` command. A full listing of the packages available for the Alpine version used in our images can be found here. If the language you need is not included in the package listing linked above, the UI-Infra team can update the base image's Alpine version.
The `config.yml` file is used to tell the HTTP server running in the container how to execute your script via a command line call. A full reference for the syntax of this file can be found here. The config file allows variables to be injected into the CLI call, customizing the call based on the HTTP request the server receives. A full, working example can be found here.
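Purely as an illustration of the idea, such a configuration might take a shape like the following. The key names and the placeholder syntax here are assumptions, not the real schema; consult the syntax reference linked above for the actual format and the variables available for injection.

```yaml
# Illustrative sketch only: these keys and the {{...}} placeholder syntax
# are assumptions; see the linked config.yml reference for the real schema.
exec-path: /opt/handler/bin/exportBiomToEuPathDB
args:
  - "{{input-file}}"   # injected from the incoming HTTP request
  - "{{output-dir}}"   # where the script should write its results
```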
The project Jenkinsfile tells the Jenkins build workflow how to build your dataset handler. Generally, you will only need to edit one line, which is the build container name.
```groovy
#!groovy

@Library('pipelib')
import org.veupathdb.lib.Builder

node('centos8') {
  sh "env"

  def builder = new Builder(this)

  builder.gitClone()
  builder.buildContainers([
    [ name: 'user-dataset-handler-biom' ] // (1)
  ])
}
```
- (1) This line configures the name of the image when it is built. The name should follow the template `user-dataset-handler-{my-handler-name}`, as shown in the example.
Once your handler script/tool is ready to be deployed or tested in Docker, the included `Dockerfile` must be edited to include your scripts and any necessary libraries. There is a section at the bottom of the included `Dockerfile` for adding file copies or other setup steps. An example of this setup can be found in the dataset-handler-biom project:
```dockerfile
# # # # # # # # # # # # # # # #
#                             #
#   Handler Specific Config   #
#                             #
# # # # # # # # # # # # # # # #

COPY lib /opt/handler/lib
COPY bin /opt/handler/bin

RUN pip install git+https://github.com/VEuPathDB/dataset-handler-python-base \
    && chmod +x /opt/handler/bin/exportBiomToEuPathDB
```
In the above example:
- The project-local `lib` directory is copied into the Docker image under the root path `/opt/handler/lib`.
- The project-local `bin` directory is copied into the Docker image under the root path `/opt/handler/bin`.
- The VEuPathDB Python project `dataset-handler-python-base` is installed.
- The executable Python script included in the project is marked as executable.
Deploying a new handler to the service-user-dataset-import stack.
Publishing a handler image is the process of getting a Docker image for the import handler built and pushed to the VEuPathDB group on DockerHub.
Once published, it is available for deployment to VEuPathDB servers by our automated in-house processes which consume built images from DockerHub.
To get the handler image built and published, you need to perform the following tasks:
If the build has not yet been added to Jenkins, the Systems team must be alerted to create it. This can be done by creating a RedMine issue that includes a link to the repository of the image to build.
As part of registering the build with Jenkins, the new image must also be registered with DockerHub. If the image is not registered with DockerHub, the Jenkins build will be unable to publish the built image.
Registering the new image with DockerHub can only be done by the Systems team. They can be alerted by creating a RedMine issue for the registration of the image in DockerHub; the ticket should include the image name as defined in the `Jenkinsfile`.
Handler images that are already being built and published to DockerHub may be added to the service-user-dataset-import stack by performing the following steps:
- Edit the service-user-dataset-import `config.json` file to add the new import handler configuration. This file is what registers an import handler for use with the service.
- Edit the service-user-dataset-import `docker-compose.yml` file to add the new import handler image to the Docker Compose stack.
- Create a new Git tag on the service-user-dataset-import repo, bumping the feature segment of the version number.
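For example, the new `docker-compose.yml` service entry might look roughly like the following. The service name, image name, and tag are assumptions here; mirror an existing handler entry in the real file, including any extra keys (networks, environment, etc.) those entries carry.

```yaml
# Hypothetical service entry; the service name, image, and any additional
# keys should mirror the existing handler entries in the real file.
services:
  handler-biom:
    image: veupathdb/user-dataset-handler-biom:latest
```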
Assuming you have followed the above steps, and that the builds are working in Jenkins, the final deployment of the new handler can be performed by:
- Add a new item to the tagger `versions.yml` file for the new import handler image.
- Update the entries for the user-dataset-import-service in the tagger `versions.yml` file with the new Git tag version.
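Sketching the tagger `versions.yml` change, with the caveat that the key names and layout shown here are assumptions; the actual format must be taken from the existing entries in that file.

```yaml
# Hypothetical entries; copy the key naming used by the existing items
# in the tagger versions.yml file.
user-dataset-handler-biom: 1.0.0     # new import handler image
user-dataset-import-service: 2.3.0   # bumped to the new Git tag version
```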