another comment for clarity; descriptions of each page
soniamitchell committed Jun 23, 2021
1 parent 369c869 commit 05f71c4
Showing 3 changed files with 10 additions and 2 deletions.
1 change: 1 addition & 0 deletions content/docs/interface/example0/_index.md
@@ -7,6 +7,7 @@ title: "Simple working examples"

# Simple working examples (with Data Pipeline API functionality)

This page gives simple examples of the **user written** *config.yaml* file alongside the working config file generated by `FAIR run`. Note that **the Data Pipeline API will take the working config file as an input**.
## Empty code run

### User written *config.yaml*
7 changes: 5 additions & 2 deletions content/docs/interface/example1/_index.md
@@ -7,6 +7,8 @@ title: "Full working example (with descriptions of Data Pipeline API functionality)"

# Full working example (with Data Pipeline API functionality)

This page gives a full working example of the **user written** *config.yaml* file alongside the working config file generated by `FAIR run`. Note that **the Data Pipeline API will take the working config file as an input**.

The following example downloads some data from outside the pipeline, does some processing in R (for example), and records the original file and the resultant data product into the pipeline.

In this simple example, the user should run the following from the terminal:
@@ -234,7 +236,8 @@ Alternatively, the submission script may be written in Python.
from data_pipeline_api.standard_api import StandardAPI

with StandardAPI.from_config("config.yaml") as api:
-   matrix = read(api.link_read("records/SARS-CoV-2/scotland/cases-and-management"))
+   data = read(api.link_read("records/SARS-CoV-2/scotland/cases-and-management"))
+   data
    api.write_array("records/SARS-CoV-2/scotland/cases-and-management/mortality", "mortality_data", matrix)
    api.issue_with_component("records/SARS-CoV-2/scotland/cases-and-management/mortality", "mortality_data", "this data is bad", "7")
    api.finalise()
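
In the new listing, `matrix` is renamed to `data`, but the later `write_array` call still passes `matrix`, and the bare `data` line only echoes the object in an interactive session. Below is a consolidated sketch with the rename carried through; the `read` helper is a hypothetical stand-in (it is not part of the Standard API) assumed here to parse a CSV into a NumPy array:

```python
import numpy as np

from data_pipeline_api.standard_api import StandardAPI


def read(path):
    """Hypothetical loader: parse the CSV at `path` into a NumPy array."""
    return np.genfromtxt(path, delimiter=",", skip_header=1)


with StandardAPI.from_config("config.yaml") as api:
    # link_read resolves the data product to a local file path
    data = read(api.link_read("records/SARS-CoV-2/scotland/cases-and-management"))

    # record the processed result as a component of a new data product,
    # passing `data` (formerly `matrix`) to write_array
    api.write_array("records/SARS-CoV-2/scotland/cases-and-management/mortality",
                    "mortality_data", data)

    # attach an issue (severity 7) to the component just written
    api.issue_with_component("records/SARS-CoV-2/scotland/cases-and-management/mortality",
                             "mortality_data", "this data is bad", "7")

    api.finalise()
```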
@@ -255,7 +258,7 @@ input_path = link_read(handle)

# Process raw data and write data product
data = read_csv(input_path)
-array = some_processing(data)
+array = some_processing(data) # e.g. data wrangling, running a model, etc.
index = write_estimate(array,
                       handle,
                       data_product = "records/SARS-CoV-2/scotland/cases-and-management/mortality",
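
The `some_processing` step above is deliberately abstract. Purely as an illustration of the kind of wrangling the new comment alludes to, a hypothetical Python version (column names invented) might look like:

```python
import pandas as pd


def some_processing(data: pd.DataFrame) -> pd.DataFrame:
    """Hypothetical wrangling step: aggregate reported deaths into a
    health-board x date table ready to be written back to the pipeline."""
    return (
        data.pivot_table(index="health_board", columns="date",
                         values="deaths", aggfunc="sum")
            .fillna(0)
    )
```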
4 changes: 4 additions & 0 deletions content/docs/interface/example2/_index.md
@@ -7,6 +7,10 @@ title: "Additional working examples"

# Additional working examples

This page gives additional examples of the **user written** *config.yaml* file alongside the working config file generated by `FAIR run`. Note that **the Data Pipeline API will take the working config file as an input**.

These examples may include variables not mentioned on previous pages, or cases that are only rarely encountered, such as using aliases.

## Read data product, process, and write data product (with aliases)

### User written *config.yaml*
