diff --git a/docs/assets/fractal_tasks_model.png b/docs/assets/fractal_tasks_model.png new file mode 100644 index 0000000..f72e70b Binary files /dev/null and b/docs/assets/fractal_tasks_model.png differ diff --git a/docs/build_your_own_fractal_task.md b/docs/build_your_own_fractal_task.md index 33cc552..5c3eae9 100644 --- a/docs/build_your_own_fractal_task.md +++ b/docs/build_your_own_fractal_task.md @@ -1,4 +1,4 @@ -# Build a Fractal task +# Create a Fractal task Fractal tasks are the core processing units used to build your workflows. Each Fractal task loads the data from one (or many) OME-Zarr(s) and applies processing to them. Fractal tasks are Linux command line executables. For the purpose of this demo, we will look at the Python implementation. You can think of a Fractal task as a Python function that knows how to process an OME-Zarr image and save the results back into that OME-Zarr image. With a bit of syntactic sugar, this becomes a Fractal task you can then run from the web interface. To understand the types of tasks, their API & how they provide information to Fractal server, check out the [V2 Tasks page](https://fractal-analytics-platform.github.io/version_2/tasks/). diff --git a/docs/version_2/image_list.md b/docs/image_list.md similarity index 77% rename from docs/version_2/image_list.md rename to docs/image_list.md index 3cbce0d..120cca8 100644 --- a/docs/version_2/image_list.md +++ b/docs/image_list.md @@ -2,7 +2,9 @@ layout: default --- -While applying a processing workflow to a given dataset, Fractal keeps a list of all the OME-Zarr images it is processing. In this page we describe the concepts of [images](#images) and [filters](#filters) - see also the [examples section](#examples). +# Image List +- +While applying a processing workflow to a given dataset, Fractal keeps a list of all the OME-Zarr images it is processing and their metadata.
In this page we describe the concepts of [images](#images) and [filters](#filters) - see also the [examples section](#examples). ## Images @@ -10,9 +12,9 @@ Each entry in the image list is defined by a unique `zarr_url` property (the ful ### Image types -Image types are boolean properties that allow to split the image list into different sub-lists (e.g. the `is_3D` type for 3D/2D images, or the `illumination_corrected` type for raw/corrected images when illumination correction was not run in-place). Types can be set both by the task manifest (e.g. after an MIP task, the resulting images always have the type `is_3D` set to `False` - see [task-manifest section](#dataset-filters)) as well as from within an individual task (see [task-API/output section](./tasks.md#output-api)). +Image types are boolean properties that make it possible to split the image list into different sub-lists (e.g. the `is_3D` type for 3D/2D images, or the `illumination_corrected` type for raw/corrected images when illumination correction was not run in-place). Types can be set both by the task manifest (e.g. after an MIP task, the resulting images always have the type `is_3D` set to `False` - see [task-manifest section](#dataset-filters)) as well as from within an individual task (see [task-API/output section](./tasks_spec.md#output-api)). -Note: whenever applying filters to the image list, the absence of a type corresponds to false by default. +Note: when applying filters to the image list, the absence of a type corresponds to false by default. ### Image attributes @@ -23,7 +25,7 @@ Fractal server uses the image list combined with filters (see [below](#dataset-f ## Filters -Before running a given task, Fractal prepares an appropriate image list by extracting the images that match with a given set of filters (that is, a set of specific values assigned to image types and/or image attributes). Filters can be defined for a dataset and/or for a workflow task.
If a specific filter is set both for the dataset and for the workflow task, the workflow-task one takes priority. +Before running a given task, Fractal prepares an appropriate image list by extracting the images that match with a given set of filters (that is, a set of specific values assigned to image types and/or image attributes). Filters can be defined for a dataset and/or for a workflow task. If a specific filter is set both for the dataset and for the workflow task, the workflow-task filter takes priority. ### Dataset filters @@ -33,9 +35,9 @@ There are multiple ways a dataset may have a given filter set: 1. I manually set it, by modifying the dataset `filters` property. 2. While writing the Fractal manifest for a task package, I include the `output_types` attribute for a given task. These types are automatically included in the dataset filters after the task is run. Examples: - * An MIP task would have `output_types = {"is_3D": False}`: from this task onwards, the 2D images are processed (not the raw 3D images). - * An illumination-correction task would have `output_types = {"illumination_corrected": True}`: from this task onwards, the registered images are processed (not the raw images). -4. When writing the code for a specific task, the task output, I can include a `filters` property, for either image attributes and/or types - see the [section on task outputs](./tasks.md#output-api). + * An MIP task would set `output_types = {"is_3D": False}` in its output arguments: from this task onwards, the 2D images are processed (not the raw 3D images). + * An illumination-correction task would set `output_types = {"illumination_corrected": True}`: from this task onwards, the illumination-corrected images are processed (not the raw images). +3. When writing the code for a specific task, the task output can include a `filters` property for image attributes and/or types - see the [section on task outputs](./tasks_spec.md#output-api).
Examples: @@ -62,22 +64,20 @@ Examples: * The Apply Registration to Image task has `input_types={"registered": False}`, which means it cannot run on images with type `registered=True`. -> Note: as part of an [upcoming `fractal-web` update](https://github.com/fractal-analytics-platform/fractal-web/issues/442), it may become possible to see/edit the current filters upon job submission. - ## Examples After running a converter task, I may have an OME-Zarr HCS plate with 2 wells that contain one image each. In this case, the image list has 2 entries and each image has attributes for plate and well, as well as a true/false `is_3D` type. -![Image List 1](../assets/image_list_x_1_two_wells_two_images.png) +![Image List 1](assets/image_list_x_1_two_wells_two_images.png) If I then run an illumination-correction task that does not overwrite its input images, the image list includes the two original images (without the `illumination_corrected` type) and two new ones (with `illumination_corrected` type set to true). Note that this task also sets the dataset type filters to `{"illumination_corrected": True}`. -![Image List 2](../assets/image_list_x_2_two_wells_four_images.png) +![Image List 2](assets/image_list_x_2_two_wells_four_images.png) If I then run an MIP task, this will act on the two images with `illumination_corrected` set to true, due to the dataset filters. After the task has run, two new images are added to the list (with type `is_3D` set to false). -![Image list 3](../assets/image_list_x_3_two_wells_six_images.png) +![Image list 3](assets/image_list_x_3_two_wells_six_images.png) Another example is that if I have an OME-Zarr HCS plate with 3 wells and each well has 3 multiplexing acquisitions, then the image list includes 9 OME-Zarr images (and those entries should have the acquisition attribute set).
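The filter matching described above (attributes must match exactly; a missing type counts as false) can be sketched in a few lines of Python. This is an illustrative sketch with hypothetical image entries and paths, not `fractal-server`'s actual implementation:

```python
# Illustrative sketch of Fractal's image-list filtering semantics.
# The image entries and paths below are hypothetical.

def image_matches(image: dict, filters: dict) -> bool:
    """Return True if an image entry satisfies all attribute/type filters."""
    for key, expected in filters.get("attributes", {}).items():
        if image.get("attributes", {}).get(key) != expected:
            return False
    for key, expected in filters.get("types", {}).items():
        # The absence of a type corresponds to False by default.
        if image.get("types", {}).get(key, False) != expected:
            return False
    return True

image_list = [
    {
        "zarr_url": "/data/plate.zarr/B/03/0",
        "attributes": {"plate": "plate.zarr", "well": "B03"},
        "types": {"is_3D": True},
    },
    {
        "zarr_url": "/data/plate.zarr/B/03/0_corr",
        "attributes": {"plate": "plate.zarr", "well": "B03"},
        "types": {"is_3D": True, "illumination_corrected": True},
    },
]

# Select only illumination-corrected images: the first image has no
# `illumination_corrected` type, so it counts as False and is excluded.
selected = [
    img["zarr_url"]
    for img in image_list
    if image_matches(img, {"types": {"illumination_corrected": True}})
]
print(selected)  # ['/data/plate.zarr/B/03/0_corr']
```

This also shows why the raw image still matches a `{"illumination_corrected": False}` filter even though it never declared that type.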
diff --git a/docs/run_fractal.md index 0e4107a..440d682 100644 --- a/docs/run_fractal.md +++ b/docs/run_fractal.md @@ -1,21 +1,14 @@ -# Deploy Fractal Server & Fractal Web +# Deploy Fractal -Fractal runs locally on a laptop (tested both Linux, macOS and Windows with subsystem Linux) or on a Linux server that submits jobs to a SLURM cluster. The [`fractal-server` documentation](https://fractal-analytics-platform.github.io/fractal-server/) describes the preconditions and the different configurations that can be changed. +Fractal is meant to be deployed to manage workflows on large clusters and currently supports different modes of running on SLURM clusters. It is deployed on Linux servers and also runs on macOS or Windows (via the Windows Subsystem for Linux). +You can run a fully containerized local example that is useful for demos and testing purposes by following the instructions in the [fractal containers repository](https://github.com/fractal-analytics-platform/fractal-containers/tree/main/examples/full-stack) or by following this walkthrough: -Fractal can be used via a command line client, as well as via a web client. To get started with Fractal, you can follow the setup in the [fractal-demos repository](https://github.com/fractal-analytics-platform/fractal-demos). - -Here is a video walk-through for how to set up a local Fractal server: - - +
-Once you have a Fractal server running, you can also access it via Fractal web. To do so, set up a Fractal web server as shown here: +More detailed documentation about the configuration of the different Fractal components can be found in the [`fractal-server` documentation](https://fractal-analytics-platform.github.io/fractal-server/) and the [`fractal-web` documentation](https://fractal-analytics-platform.github.io/fractal-web/). - +----- -
- -If you prefer to follow a written guide, follow the instructions in the [server folder](https://github.com/fractal-analytics-platform/fractal-demos/tree/main/examples/server) to set up Fractal server. Once you have successfully installed and started the Fractal server, you can [install a fractal-client environment](https://github.com/fractal-analytics-platform/fractal-demos/tree/main/examples/00_user_setup) and interact with the Fractal server from there. To do so, follow the [instructions for the 01_cardio_tiny_dataset example](https://github.com/fractal-analytics-platform/fractal-demos/tree/main/examples/01_cardio_tiny_dataset). This also includes a link to a tiny dataset and instructions on how to run a full Fractal workflow on this dataset (which should run in under a minute). -To set up Fractal web, follow the instructions in the [Fractal web README](https://github.com/fractal-analytics-platform/fractal-web). -
+Fractal can also be deployed by manually setting up the server in a Python environment, configuring your own postgres database & setting up Fractal web from source. You can find some helpful material for this in the [fractal-demos repository](https://github.com/fractal-analytics-platform/fractal-demos) (especially the examples/server section). We also have older video walkthroughs on manual setups available for both the [fractal-server](https://www.youtube.com/watch?v=mEDHh9Kkdmk) as well as [fractal-web](https://www.youtube.com/watch?v=f_HaiOVH-ig). diff --git a/docs/tasks_spec.md b/docs/tasks_spec.md new file mode 100644 index 0000000..95b7634 --- /dev/null +++ b/docs/tasks_spec.md @@ -0,0 +1,160 @@ +--- +layout: default +--- + +# Tasks + +Fractal tasks are modular and interoperable processing units that handle data in OME-Zarr containers. Each task is an executable that runs on a single OME-Zarr image or a collection of OME-Zarr images. In Fractal, we specify the OME-Zarrs to be processed by giving the tasks the `zarr_url`(s): the paths to the OME-Zarr images on disk or in the cloud. All tasks load data from an OME-Zarr on disk and store their processing results in an OME-Zarr (the same or a new one) on disk again. The parameters and metadata of tasks are described in a [Fractal manifest in JSON form](#task-list-and-manifest). This page contains an overview of the Fractal task specification, the [types of Fractal tasks](#task-types), the [manifest](#task-list-and-manifest) that specifies task metadata as well as their [input](#input-api) & [output](#output-api) API. + +![Fractal task model](assets/fractal_tasks_model.png) + +## Task Types + +There are three types of tasks in Fractal V2: parallel tasks, non-parallel tasks & compound tasks. + +1. A **parallel task** is written to process a single OME-Zarr image and meant to be run in parallel across many OME-Zarr images.
+ - Parallel tasks are the typical scenario for compute tasks that don't need special input handling or subset parallelization. + - Parallel tasks can typically be run on any collection of OME-Zarrs. +2. A **non-parallel task** processes a list of images, and it only runs as a single job. + - Non-parallel tasks are useful to aggregate information across many OME-Zarrs or to create image-list updates (see [the Fractal image list](./image_list.md)). + - Non-parallel tasks can often be specific to given collection types like OME-Zarr HCS plates. +3. A **compound task** consists of an *initialization* (non-parallel) task and a (parallel) *compute* task. + - The initialization task runs in the same way as a non-parallel task and generates a custom parallelization list of zarr_urls & parameters to be used in the compute task. + - The compute tasks are run in parallel for each entry of the parallelization list and use the `init_args` dictionary as an extra input from the initialization task. + - Compound tasks can often be specific to given collection types like OME-Zarr HCS plates. Typical examples are multiplexing-related tasks that use `acquisition` metadata on the well level to decide which pairs of images need to be processed. + + +## Task list and manifest + +A package that provides Fractal tasks must contain a manifest (stored as a `__FRACTAL_MANIFEST__.json` file within the package) that describes the parameters, executables and metadata of the tasks. `fractal-tasks-core` and `fractal-tasks-template` offer a simplified way to generate this manifest, based on a task list written in Python.
+ +### Task list +If the task package `my-pkg` was created based on the template, the task list is in `src/my-pkg/dev/task_list.py` and includes entries like +```python +TASK_LIST = [ + NonParallelTask( + name="My non-parallel task", + executable="my_non_parallel_task.py", + meta={"cpus_per_task": 1, "mem": 4000}, + category="Conversion", + docs_info="file:task_info/task_description.md", + tags=["tag1", "Microscope name"] + ), + ParallelTask( + name="My parallel task", + executable="my_parallel_task.py", + meta={"cpus_per_task": 1, "mem": 4000}, + category="Segmentation", + ), + CompoundTask( + name="My compound task", + executable_init="my_task_init.py", + executable="my_actual_task.py", + meta_init={"cpus_per_task": 1, "mem": 4000}, + meta={"cpus_per_task": 2, "mem": 12000}, + category="Registration", + ), +] +``` +where the different task models refer to the [different task types](#task-types). Given such a task list, running the following command +```bash +python src/my-pkg/dev/create_manifest.py +``` +generates a JSON file with the up-to-date manifest. Note that advanced usage may require minor customizations of the create-manifest script. + +### Manifest metadata +The task manifest can contain additional metadata that makes it easier for people to browse tasks on the [Fractal tasks page](./fractal_tasks.md) and the tasks available on a given server. The [Fractal task template](https://github.com/fractal-analytics-platform/fractal-tasks-template) provides good defaults for how all this metadata can be set. This metadata is also used to make tasks searchable. + +#### Docs info +Tasks can provide a structured summary of their functionality. If the task-list entry does not contain a `docs_info` property for a given task, the docstring of the task function is used. A developer can provide a more structured markdown file by specifying the relative path to the markdown file with the task description (for example: `file:task_info/task_description.md`).
The convention for these task descriptions is to contain a section on the purpose of the task as well as its limitations in a bullet-point list. + +#### Categories +Tasks can belong to a single category, which allows users to filter for the kind of task they are looking for. The standard categories are: `Conversion`, `Image Processing`, `Segmentation`, `Registration`, `Measurement`. + +#### Modalities +Tasks can have a single modality set in their metadata. If a task works on all types of OME-Zarrs, no modality should be set. If a task is specifically designed to work on one modality (for example, a task that requires OME-Zarr HCS plates), the modality should be specified. The standard modalities are: `HCS`, `lightsheet`, `EM`. + +#### Tags +Tasks can have arbitrary lists of string tags that describe their functionality. These are particularly helpful to increase the findability of a task using search. + +#### Authors +Task packages can specify an authors list. This metadata is configured in the create_manifest.py script for the whole task package. + +### How to get your task package on the Fractal tasks page +If you have a task package that you would like to see listed on the [Fractal tasks page](./fractal_tasks.md), ping one of the Fractal maintainers about it or [make a PR to have your task included in the list of task sources here](https://github.com/fractal-analytics-platform/fractal-analytics-platform.github.io/blob/main/tasks_data_retrieval/sources.txt). For a task package to be listable on the Fractal tasks page, the package needs to contain a Fractal manifest and be available either via PyPI or via a wheel in GitHub releases. The [Fractal task template](https://github.com/fractal-analytics-platform/fractal-tasks-template) provides examples for how to do both. +Future work will add support for adding additional task configurations (likely a specification for how to provide packages that are installable via Pixi).
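To make this metadata concrete, here is a small sketch that parses a trimmed-down, hypothetical `__FRACTAL_MANIFEST__.json` and groups task names by category, the kind of lookup a task browser could do. The exact manifest schema is defined by `fractal-server`, so treat the field layout below as an assumption:

```python
import json

# Hypothetical, heavily trimmed manifest snippet; the real schema is
# defined by fractal-server and contains more fields per task entry.
manifest_json = """
{
  "manifest_version": "2",
  "task_list": [
    {"name": "My non-parallel task", "category": "Conversion",
     "tags": ["tag1", "Microscope name"]},
    {"name": "My parallel task", "category": "Segmentation", "tags": []}
  ]
}
"""

manifest = json.loads(manifest_json)

# Group task names by category, falling back to a default when unset.
by_category = {}
for task in manifest["task_list"]:
    by_category.setdefault(task.get("category", "Uncategorized"), []).append(task["name"])

print(by_category)
```

The same pattern works for filtering by `tags` or `modality`, since all of this metadata lives in the same `task_list` entries.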
+ + +## Input API + +### Parallel tasks + +The input arguments of a Fractal parallel task must include a `zarr_url` string argument. The `zarr_url` contains the full path to the zarr file to be processed. Only filesystem paths are currently supported, not S3 URLs. +`zarr_url` is a reserved keyword argument: when running tasks through Fractal server, the server takes care to pass the correct `zarr_url` argument to the parallel task (based on filtering the image list). +Tasks can also take an arbitrary list of additional arguments that are specific to the task function and that the user can set. + +### Non-parallel tasks + +The input arguments of a Fractal non-parallel task must include a `zarr_urls` argument (a list of strings) and a `zarr_dir` argument (a single string). `zarr_urls` contains the full paths to the OME-Zarr images to be processed. We currently only support paths on filesystems, not S3 URLs. `zarr_dir` is typically the base directory into which OME-Zarr files will be written by tasks and it is mostly used by converters. +Both `zarr_urls` and `zarr_dir` are reserved keyword arguments: when running tasks through Fractal server, the server takes care to pass the correct filtered list `zarr_urls` and the correct `zarr_dir` to the non-parallel task. +Tasks can also take an arbitrary list of additional arguments that are specific to the task function and that the user can set. + +### Compound tasks + +Compound tasks consist of an init part (similar to the non-parallel task) and a compute part (similar to the parallel task). +The init part has the same Input API as the non-parallel task (`zarr_urls` and `zarr_dir`), but it provides the parallelization list for the compute part as an output. +The compute part takes the `zarr_url` argument and an extra `init_args` dictionary argument (which comes from the `parallelization_list` provided by the init task).
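The reserved arguments above can be summarized as minimal Python signatures. The function names, extra parameters and paths are hypothetical; only the reserved arguments (`zarr_url`, `zarr_urls`, `zarr_dir`, `init_args`) follow the spec:

```python
# Minimal signature sketches for the three task types; bodies are placeholders.

def my_parallel_task(*, zarr_url: str, threshold: float = 0.5) -> dict:
    # Process a single OME-Zarr image at `zarr_url`; `threshold` stands in
    # for an arbitrary user-settable task parameter.
    return {}

def my_init_task(*, zarr_urls: list, zarr_dir: str) -> dict:
    # Init part of a compound task: build the parallelization list for the
    # compute part (here, hypothetically registering everything against the
    # first image).
    return {
        "parallelization_list": [
            {"zarr_url": url, "init_args": {"ref_zarr_url": zarr_urls[0]}}
            for url in zarr_urls
        ]
    }

def my_compute_task(*, zarr_url: str, init_args: dict) -> dict:
    # Compute part of a compound task; `init_args` comes from the entry of
    # the parallelization list produced by the init task.
    return {}

out = my_init_task(zarr_urls=["/data/a.zarr/0", "/data/b.zarr/0"], zarr_dir="/data")
print(len(out["parallelization_list"]))  # 2
```

When run through Fractal server, these functions would be wrapped as command-line executables and the reserved arguments filled in by the server.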
+ +## Output API + +Tasks can optionally return updates to the image list and/or [new dataset filters](./image_list.md#dataset-filters) (this is true for all tasks except the init phase of a compound task) or a parallelization list (just the init phase of a compound task). The output of a task is always a `task_output` dictionary. Note that this dictionary must be JSON-serializable, since it will be written to disk so that `fractal-server` can access it. + +For tasks that create new images or edit relevant image properties, `task_output` must include an `image_list_updates` property so the server can update its metadata about that image. + +> NOTE: if new filters are set, but both `image_list_updates` and `image_list_removals` are empty in the task output, then `fractal-server` includes the full filtered image list in `image_list_updates`, so that the images are updated with the appropriate `types` (see also [the image-list page](./image_list.md#image-types)). + +Task outputs with image list updates are returned as a dictionary that contains the `image_list_updates` key and a list containing the updates to individual images. The updates need to be for unique `zarr_url`s and each update needs to contain the `zarr_url` of the image it’s providing an update for. Additionally, they can provide an `origin` key, an `attributes` key and a `types` key. The `origin` key describes the `zarr_url` of another image already in the image list and will take the existing attributes and types from that image. Attributes and types can also be directly set by a task. + +Here's an example of `task_output`: +```python +{ + "image_list_updates": [ + { + "zarr_url": "/path/to/my_zarr.zarr/B/03/0_processed", + "origin": "/path/to/origin_zarr.zarr/B/03/0", + "attributes": { + "plate": "plate_name", + "well": "B03" + }, + "types": { + "is_3D": True + } + } + ] +} +``` + +Here is an example of a task that provides new filter updates without changing the image list.
This task sets the `is_3D` filter to True: +```python +{ + "filters": { + "types": { + "is_3D": True + } + } +} +``` + +The init part of a compound task must produce a parallelization list, with elements having the `zarr_url` property as well as additional arbitrary arguments as an `init_args` dictionary. +Parallelization lists are provided in the following structure: +```python +{ + "parallelization_list": [ + { + "zarr_url": "/path/to/my_zarr.zarr/B/03/0", + "init_args": {"some_arg": "some_value"}, + } + ] +} +``` + diff --git a/docs/version_2/index.md b/docs/version_2/index.md deleted file mode 100644 index 1acfe5f..0000000 --- a/docs/version_2/index.md +++ /dev/null @@ -1,10 +0,0 @@ ---- -layout: default ---- - -# Fractal V2 - -This section describes the main concepts introduced with the new Fractal version (version 2 of the `fractal-server` backend): - -1. The [new dataset image list](./image_list.md); -2. The [new definition of tasks](./tasks.md). diff --git a/docs/version_2/tasks.md b/docs/version_2/tasks.md deleted file mode 100644 index 0a63fdc..0000000 --- a/docs/version_2/tasks.md +++ /dev/null @@ -1,109 +0,0 @@ ---- -layout: default ---- - -# Tasks - -Fractal v2 brings a large refactor to the task architecture to make tasks more flexible and allow for building more complex workflows, while also simplifying the task API. This page gives an overview over the different types of Fractal tasks, their input and output API and the elements that go into the Fractal task list. - -## Task Types - -There are three types of tasks in Fractal V2: parallel tasks, non-parallel tasks & compound tasks. - -1. A **parallel task** is written to process a single OME-Zarr image and meant to be run in parallel across many OME-Zarr images. This is the typical scenario for compute tasks that don't need special input handling or subset parallelization -2. A **non-parallel task** processes a list of images, and it only runs as a single job.
It is useful to handle image-list updates and validation of Zarr collections (like Import OME-Zarr). -3. A **compound task** consists of an initialization (non-parallel) task, that provides a custom parallelization list to a subsequent (parallel) compute task. An example are registration tasks that need to run across multiple Zarr images, but parallelize over wells of a multi-well plate. The init task is like a non-parallel task, but it provides the parallelization list as output. The compute task is like a parallel task, but it takes an extra `init_args` dictionary as input from the init task. - -## Input API - -### Parallel tasks - -The input arguments of a Fractal parallel tasks must include a `zarr_url` string argument. The `zarr_url` contains the full path to the zarr file to be processed. Only filesystem paths are currently supported, not S3 urls. -`zarr_url` is a reserved keyword argument: when running tasks through Fractal server, the server takes care to pass the correct `zarr_url` argument to the parallel task (based on filtering the image list). -Tasks can also take an arbitrary list of additional arguments that are specific to the task function and that the user can set. - -### Non-parallel tasks - -The input arguments of a Fractal non-parallel task must include a `zarr_urls` arguments (a list of strings) and `zarr_dir` argument (a single string). `zarr_urls` contains the full paths to the OME-Zarr images to be processed. We currently just support paths on filesystems, not S3 urls. `zarr_dir` is typically the base directory into which OME-Zarr files will be written by tasks and it is mostly used by converters. -Both `zarr_urls` and `zarr_dir` are reserved keyword arguments: when running tasks through Fractal server, the server takes care to pass the correct filtered list `zarr_urls` and the correct `zarr_dir` to the non-parallel task. 
-Tasks can also take an arbitrary list of additional arguments that are specific to the task function and that the user can set. - -### Compound tasks - -Compound tasks consist of an init part (similar to the non-parallel task) and a compute part (similar to the parallel task). -The init part has the same Input API as the non-parallel task (`zarr_urls` and `zarr_dir`), but it provides the parallelization list for the compute part as an output. -The compute part takes the `zarr_url` argument and an extra `init_args` dictionary argument (which is coming from the `parallelization_list` provided by the init task). - -## Output API - -Tasks can optionally return updates to the image list and/or [new dataset filters](./image_list.md#dataset-filters) (this is true for all tasks except the init phase of a compound tasks) or a parallelization list (just the init phase of a compound task). The output of a task is always a `task_output` dictionary. Note that this dictionary must be JSON-serializable, since it will be written to disk so that `fractal-server` can access it. - -For tasks that create new images or edit relevant image properties, `task_output` must include an `image_list_updates` property so the server can update its metadata about that image. - -> NOTE: if both `image_list_updates` and `image_list_removals` are empty, in the task output, then `fractal-server` includes all the filtered image list in `image_list_updates`, so that they are updated with the appropriate `types` (see also [the image-list page](./image_list.md#image-types)). - -Task outputs with image list updates are returned as a dictionary that contains the `image_list_updates` key and a list containing the updates to individual images. The updates need to be for unique `zarr_url`s and each update needs to contain the `zarr_url` of the image it’s providing an update for. Additionally, they can provide an `origin` key, an `attributes` key and a `types` key. 
The `origin` key describes the `zarr_url` of another image already in the image list and will take the existing attributes and types from that image. Attributes and types can also be directly set by a task. - -Here's an example of `task_output`: -```python -{ - "image_list_updates" = [ - { - "zarr_url": "/path/to/my_zarr.zarr/B/03/0_processed", - "origin": "/path/to/origin_zarr.zarr/B/03/0", - "attributes": { - "plate": "plate_name", - "well": "B03" - }, - "types": { - "is_3D": True - } - } - ] -} -``` - -The init part of a compound task must produe a parallelization lists, with elements having the `zarr_url` property as well as additional arbitrary arguments as an `init_args` dictionary. -Parallelization lists are provided in the following structure: -```python -{ - "parallelization_list": [ - { - "zarr_url": "/path/to/my_zarr.zarr/B/03/0", - "init_args": {"some_arg": "some_value"}, - } - ] -} -``` - -## Task list and manifest - -A package that provides Fractal tasks must contain a manifest (stored as a `__FRACTAL_MANIFEST__.json` file within the package), that describes the metadata of these tasks. `fractal-tasks-core` and `fractal-tasks-template` offer a simplified way to generate this manifest, based on a task list written in Python. 
- -For instance if the task package `my-pkg` was created based on the template, the task list is in `src/my-pkg/dev/task_list.py` and includes entries like -```python -TASK_LIST = [ - NonParallelTask( - name="My non-parallel task", - executable="my_non_parallel_task.py", - meta={"cpus_per_task": 1, "mem": 4000}, - ), - ParallelTask( - name="My parallel task", - executable="my_parallel_task.py", - meta={"cpus_per_task": 1, "mem": 4000}, - ), - CompoundTask( - name="My compound task", - executable_init="my_task_init.py", - executable="my_actual_task.py", - meta_init={"cpus_per_task": 1, "mem": 4000}, - meta={"cpus_per_task": 2, "mem": 12000}, - ), -] -``` -where the different task models refer to the [different task types](#task-types). Given such task list, running the following command -```bash -python src/my-pkg/dev/create_manifest.py -``` -generates a JSON file with the up-to-date manifest. Note that advanced usage may require minor customizations of the create-manifest script. diff --git a/mkdocs.yml b/mkdocs.yml index 79cd484..4108e9c 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -43,10 +43,12 @@ plugins: nav: - Fractal: index.md - - Fractal V2 Changes: version_2/ - - Build Your Own Fractal Task: build_your_own_fractal_task.md - - Deploy Fractal Server & Web: run_fractal.md - Fractal tasks: fractal_tasks.md + - Fractal task spec: tasks_spec.md + - Create a Fractal Task: build_your_own_fractal_task.md + - Fractal Image List: image_list.md + - Deploy Fractal Server & Web: run_fractal.md + # Include some page, without including them in the ToC # not_in_nav: something.md