
RFC basic user API #3

Closed
tiborsimko opened this issue May 23, 2017 · 2 comments

Comments

@tiborsimko
Member

The purpose of this issue is to discuss the basic REANA API from the end-user
perspective. We'll concentrate on the analysis API; more will follow later,
such as a resource API. So here we are touching only on how to start an
analysis, submit input data, run and monitor the workflow, and obtain the
results back.

Here are tentative sequence diagrams for illustration:

  1. Prepare new analysis
CLIENT ----> SERVER --------------------------> CLOUD

client ----> POST /api/:tenant/analyses    ---> create new analysis of :uuid in DB
                                                create workspace on the shared filesystem
                                                (e.g. /reana/analyses/:uuid/workspace/)
       <---- 201 CREATED :uuid
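To make the call shape concrete, here is a minimal Python sketch of step 1. The base URL and tenant name are made-up placeholders, and stdlib `urllib` is used only to show the request shape, not to imply a real client:

```python
import urllib.request

BASE = "https://reana.example.org"  # hypothetical server URL
TENANT = "default"                  # hypothetical tenant name

def create_analysis_request(base=BASE, tenant=TENANT):
    """Build the POST that asks the server to create a new analysis.

    The server is expected to create the analysis record and its
    workspace, then reply 201 CREATED with the new :uuid.
    """
    return urllib.request.Request(f"{base}/api/{tenant}/analyses",
                                  method="POST")

req = create_analysis_request()
print(req.get_method(), req.full_url)
```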
  2. Seed analysis workspace
CLIENT ------------> SERVER -----------------------------------------> CLOUD

         push files
client ------------> POST /api/:tenant/analyses/:uuid/workspace   ---> add posted files to :workspace
       <----------- 201 CREATED list of input files
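Continuing the sketch, step 2 pushes input files into the workspace. The issue does not specify the upload encoding, so the header carrying the filename here is purely an assumption:

```python
import urllib.request

BASE = "https://reana.example.org"  # hypothetical server URL
TENANT = "default"                  # hypothetical tenant name

def upload_file_request(uuid, filename, payload,
                        base=BASE, tenant=TENANT):
    """Build the POST that adds one input file to the workspace.

    The server should store the file and reply 201 CREATED with the
    current list of input files.  Passing the name in a header is an
    assumption; the RFC leaves the upload encoding open.
    """
    url = f"{base}/api/{tenant}/analyses/{uuid}/workspace"
    return urllib.request.Request(
        url,
        data=payload,
        method="POST",
        headers={"Content-Type": "application/octet-stream",
                 "X-Filename": filename},  # hypothetical header name
    )
```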
  3. Enquire about analysis execution status
CLIENT -----------------> SERVER ------------------------------------> CLOUD

        get information
client -----------------> GET /api/:tenant/analyses/:uuid/status  ---> check status
       <---------------- 200 OK :status (waiting, running, stopped, paused, finished)
  4. Manage analysis execution
CLIENT -----------------> SERVER -----------------------------------> CLOUD

        start, stop, pause
client -----------------> PUT /api/:tenant/analyses/:uuid/status ---> run, stop, pause when you can (after a step)
       <---------------- 200 OK :status requested
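Steps 3 and 4 share one endpoint, so a single sketch can cover both. The JSON body of the PUT is an assumed shape; only the endpoint and verbs come from the diagrams above:

```python
import json
import urllib.request

BASE = "https://reana.example.org"  # hypothetical server URL
TENANT = "default"                  # hypothetical tenant name

def status_request(uuid, desired=None, base=BASE, tenant=TENANT):
    """Build a status request for one analysis.

    With no `desired` value this is the step-3 GET (the server replies
    200 OK with waiting/running/stopped/paused/finished).  With a
    desired value such as "running" it becomes the step-4 PUT, which
    the server honours when it can, i.e. after the current step.
    The JSON body shape is an assumption.
    """
    url = f"{base}/api/{tenant}/analyses/{uuid}/status"
    if desired is None:
        return urllib.request.Request(url, method="GET")
    body = json.dumps({"status": desired}).encode("utf-8")
    return urllib.request.Request(
        url, data=body, method="PUT",
        headers={"Content-Type": "application/json"})
```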
  5. List results
CLIENT -----------------> SERVER ---------------------------------------> CLOUD

        get information
client -----------------> GET /api/:tenant/analyses/:uuid/workspace  ---> list available output files in the workspace
       <---------------- 200 OK list of filenames

  6. Grab results
CLIENT -----------------> SERVER ---------------------------------------> CLOUD

        get file content
client -----------------> GET /api/:tenant/analyses/:uuid/workspace/:filename  ---> grab content of output filename
       <---------------- 200 OK content blob
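Steps 5 and 6 also share a base path, so one sketch covers listing the workspace and fetching a single file (the URL-quoting of the filename is an assumption, added so that nested workspace paths would survive):

```python
import urllib.parse
import urllib.request

BASE = "https://reana.example.org"  # hypothetical server URL
TENANT = "default"                  # hypothetical tenant name

def workspace_request(uuid, filename=None, base=BASE, tenant=TENANT):
    """Build the step-5 listing GET, or the step-6 file GET when a
    filename is given.  The filename is URL-quoted (slashes kept) so
    that files in workspace subdirectories can be addressed too.
    """
    url = f"{base}/api/{tenant}/analyses/{uuid}/workspace"
    if filename is not None:
        url += "/" + urllib.parse.quote(filename)
    return urllib.request.Request(url, method="GET")
```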
  7. Grab logs
CLIENT -----------------> SERVER ---------------------------------> CLOUD

client -----------------> GET /api/:tenant/analyses/:uuid/log  ---> grab content of stdout/stderr of analyses
       <---------------- 200 OK content                                (with pod ids, timestamps)
  8. Delete analysis
CLIENT -----------------> SERVER -------------------------------> CLOUD

            delete
client -----------------> DELETE /api/:tenant/analyses/:uuid ---> clean workspace
       <---------------- 204  No content
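Finally, steps 7 and 8 round off the sketch, again with the same hypothetical base URL and tenant:

```python
import urllib.request

BASE = "https://reana.example.org"  # hypothetical server URL
TENANT = "default"                  # hypothetical tenant name

def log_request(uuid, base=BASE, tenant=TENANT):
    """Build the step-7 GET for the combined stdout/stderr log
    (served with pod ids and timestamps)."""
    return urllib.request.Request(
        f"{BASE}/api/{tenant}/analyses/{uuid}/log", method="GET")

def delete_request(uuid, base=BASE, tenant=TENANT):
    """Build the step-8 DELETE that removes the analysis and cleans
    its workspace; the server replies 204 No Content."""
    return urllib.request.Request(
        f"{base}/api/{tenant}/analyses/{uuid}", method="DELETE")
```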

The above serves to start the discussion; its content may be updated as we go along.

@lukasheinrich
Member

For log grabbing, we should also add endpoints to get the logs of the individual jobs of the workflow.

@diegodelemos
Member

The implementation has diverged (as can be seen in the REANA server API docs) since there was no agreement. This RFC can be reopened in the future if needed; closing for now.
