14 changes: 7 additions & 7 deletions docs/documentation/running.md
@@ -7,7 +7,7 @@ of available arguments can be acquired with `./mfc.sh run -h`.

## Interactive Execution

To run all stages of MFC, that is [pre_process](src/pre_process/), [simulation](src/simulation/), and [post_process](src/post_process/) on the sample case [2D_shockbubble](examples/2D_shockbubble/),
To run all stages of MFC, that is [pre_process](https://github.com/MFlowCode/MFC/tree/master/src/pre_process/), [simulation](https://github.com/MFlowCode/MFC/tree/master/src/simulation/), and [post_process](https://github.com/MFlowCode/MFC/tree/master/src/post_process/) on the sample case [2D_shockbubble](https://github.com/MFlowCode/MFC/tree/master/examples/2D_shockbubble/),

```console
$ ./mfc.sh run examples/2D_shockbubble/case.py
@@ -20,13 +20,13 @@ specified with the `-j` option.

For example,

- Running [pre_process](src/pre_process/) with 2 cores:
- Running [pre_process](https://github.com/MFlowCode/MFC/tree/master/src/pre_process/) with 2 cores:

```console
$ ./mfc.sh run examples/2D_shockbubble/case.py -t pre_process -n 2
```

- Running [simulation](src/simulation/) and [post_process](src/post_process/)
- Running [simulation](https://github.com/MFlowCode/MFC/tree/master/src/simulation/) and [post_process](https://github.com/MFlowCode/MFC/tree/master/src/post_process/)
using 4 cores:

```console
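# Illustrative sketch only -- the exact command is collapsed in this diff view.
# Following the -t/-n syntax shown above, it would be along the lines of:
$ ./mfc.sh run examples/2D_shockbubble/case.py -t simulation post_process -n 4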
@@ -67,16 +67,14 @@ provide support for a restricted subset of common configuration options. If MFC
to execute on your system, or if you wish to adjust how the program runs and how resources
are requested and allocated, you are invited to modify the template batch script for your queue system.
Upon execution of `./mfc.sh run`, MFC fills in the template with runtime parameters, to
generate the batch file it will submit. These files are located in the [templates](templates/)
generate the batch file it will submit. These files are located in the [templates](https://github.com/MFlowCode/MFC/tree/master/toolchain/templates/)
directory. To request GPUs, modification of the template will be required on most systems.

- Lines that begin with `#>` are ignored and will not appear in the final batch
script, not even as a comment.

- Statements of the form `${expression}` are string-replaced to provide
runtime parameters, most notably execution options. They reference the variables in the
same format as those under the "run" section of [defaults.yaml](defaults.yaml), replacing
`-` for `_`. You can perform therein any Python operation recognized by the built-in `expr()` function.
runtime parameters, most notably execution options. You can perform therein any Python operation recognized by the built-in `expr()` function.

As an example, one might request GPUs on a SLURM system using the following:
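The actual template is collapsed in this diff view, so the snippet below is only a minimal sketch of the idea: the `#SBATCH` directives and the `${...}` variable names (`${name}`, `${nodes}`, `${tasks_per_node}`, `${walltime}`, `${binary}`) are illustrative assumptions, not the contents of the shipped template.

```bash
#!/usr/bin/env bash
#SBATCH --job-name="${name}"                 # ${...} placeholders are string-replaced at run time
#SBATCH --nodes=${nodes}
#SBATCH --ntasks-per-node=${tasks_per_node}
#SBATCH --time=${walltime}
#SBATCH --gpus-per-node=${tasks_per_node}    # request one GPU per rank (hypothetical policy)
#> Lines beginning with #> (like this one) are dropped from the generated batch file.

srun "${binary}"                             # illustrative launch line
```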

@@ -145,10 +143,12 @@ in which $t_i$ is the starting time, $t_f$ is the final time, and $SF$ is the sa
- Run the post_process
- There are several ways to do this. Keep in mind that, regardless of the .py file used, the post_process command will generate output files in the [`t_step_start`, `t_step_stop`] range, with `t_step_save` as the spacing between files.
- One way is to set `t_step_stop` to the restarting point $t_s$ in `case.py`. Then, run:

```console
$ ./mfc.sh run case.py -t post_process
$ ./mfc.sh run restart_case.py -t post_process
```

- The first command will run on timesteps $[t_i, t_s]$. The second command will run on $[t_s, t_{f2}]$. Therefore, the whole range $[t_i, t_{f2}]$ will be post processed.

We have provided an example `case.py` and `restart_case.py` in `/examples/1D_vacuum_restart/`. This simulation is a duplicate of the `1D_vacuum` case. It demonstrates stopping at timestep 7000, adding a new patch, and restarting the simulation. To test this code, run:
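The exact commands are collapsed in this diff view; following the run syntax shown earlier, the test would look something like the sketch below (the target lists passed to `-t` are an assumption):

```console
$ ./mfc.sh run examples/1D_vacuum_restart/case.py -t pre_process simulation
$ ./mfc.sh run examples/1D_vacuum_restart/restart_case.py -t pre_process simulation
$ ./mfc.sh run examples/1D_vacuum_restart/case.py -t post_process
$ ./mfc.sh run examples/1D_vacuum_restart/restart_case.py -t post_process
```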
4 changes: 2 additions & 2 deletions docs/documentation/testing.md
@@ -24,7 +24,7 @@ $ ./mfc.sh test -g -j 8

When generating golden files for new test cases, it is recommended that you specify a range, as described in the previous section, to avoid regenerating the golden files of existing test cases.

Adding a new test case can be done by modifying [cases.py](toolchain/mfc/tests/cases.py). The function `generate_cases` is responsible for generating the list of test cases. Loops and conditionals are used to vary parameters, whose defaults can be found in the `BASE_CFG` case object within [case.py](toolchain/mfc/tests/case.py). The function operates on two variables:
Adding a new test case can be done by modifying [cases.py](https://github.com/MFlowCode/MFC/tree/master/toolchain/mfc/test/cases.py). The function `generate_cases` is responsible for generating the list of test cases. Loops and conditionals are used to vary parameters, whose defaults can be found in the `BASE_CFG` case object within [case.py](https://github.com/MFlowCode/MFC/tree/master/toolchain/mfc/test/case.py). The function operates on two variables:

- `stack`: A stack that holds the variations to the default case parameters. By pushing and popping the stack inside loops and conditionals, it is easier to nest test case descriptions, as it holds the variations that are common to all future test cases within the same indentation level (in most scenarios).

@@ -81,5 +81,5 @@ To test updated post process code, append the `-a` or `--test-all` option:
$ ./mfc.sh test -a -j 8
```

This argument will re-run the test stack with `parallel_io=True`, which generates silo_hdf5 files. It will also turn most write parameters (`*_wrt`) on. Then, it uses the [h5py](https://www.h5py.org/) library to ensure that there are no NaNs or Infinity's written to the silo files. Although adding this option does not guarantee that accurate silo files are generated, it does ensure that post process does not fail or produce malformed data.
This argument will re-run the test stack with `parallel_io=True`, which generates silo_hdf5 files. It will also turn most write parameters (`*_wrt`) on. Then, it searches through the silo files using `h5dump` to ensure that there are no NaNs or Infinities. Although adding this option does not guarantee that accurate silo files are generated, it does ensure that post process does not fail or produce malformed data.
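For reference, an equivalent manual spot check can be sketched with `h5dump` piped through `grep`; the file path below is purely illustrative, as the actual silo file names and layout depend on the case:

```console
$ h5dump silo_hdf5/p0/0.silo | grep -Ei "nan|inf"
```

Any match warrants a closer look, keeping in mind that a simple text search like this can also hit dataset or attribute names that happen to contain those letters.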