Merge pull request #34 from flindersuni/develop
Software Suite & Technical Specification Update to Public
The-Scott-Flinders committed Feb 9, 2022
2 parents 931a73f + 1895ea1 commit 5c7d892
Showing 10 changed files with 297 additions and 138 deletions.
12 changes: 5 additions & 7 deletions docs/source/index.rst
@@ -3,12 +3,6 @@ Welcome to the DeepThought Documentation

The new Flinders HPC is called DeepThought. This new HPC comprises AMD EPYC-based hardware and next-generation management software, allowing for a dynamic and agile HPC service.

.. _Upgrade Migration Information: migration/upgrademigration.html

.. warning::
DeepThought has recently undergone a series of upgrades that require some user intervention when utilising the upgraded cluster.
Please see `Upgrade Migration Information`_ for actions required.

.. attention::
This documentation is under active development, meaning that it can
change over time as we improve it. Please email deepthought@flinders.edu.au if
@@ -70,9 +64,13 @@ Table of Contents

software/softwaresuitesoverview.rst
software/ansys.rst
software/delft3d.rst
software/gromacs.rst
software/jupyter.rst
software/lammps.rst
software/matlab.rst
software/singularity.rst
software/vasp.rst


.. toctree::
47 changes: 0 additions & 47 deletions docs/source/migration/upgrademigration.rst

This file was deleted.

132 changes: 67 additions & 65 deletions docs/source/software/ansys.rst
@@ -4,103 +4,105 @@ ANSYS Engineering Suite
=======
Status
=======
ANSYS 2021R2 is the current version of the ANSYS Suite installed on the HPC. Both Single-Node (-smp) and Multi-Node (-dis) execution are supported.

Tighter Graphical User Interface integration is currently under investigation.

.. _ANSYS: https://www.ansys.com/

==========
Overview
==========
The ANSYS Engineering Suite is a comprehensive software suite for engineering simulation. The HPC ANSYS integration aims to bring the Remote Solver Manager to DeepThought, allowing
users to leverage the HPC from their desktops and the familiar ANSYS interface. More information can be found on the `ANSYS`_ website.


=================
Known Issues
=================
The Intel MPI Library shipped with ANSYS is not 100% compatible with DeepThought. Running with the -smp flag will disable MPI and allow you to run ANSYS as normal on a single node.


================================
Quickstart Command Line Guide
================================

To run a job with ANSYS on the HPC you will need the following:
- An ANSYS Script file
- Any reference file(s) (e.g. a .db file)

Ensure that any paths referenced in the script file point to locations on the HPC, not on your local machine.

You can then invoke ANSYS (after loading the ANSYS module) in 'batch' mode for a single script as follows; a sample SLURM submission script is sketched after these examples:

1. Shared-Memory Mode (Single-Node)


- ansys212 -smp -np <CPUS> -db <DB File Memory Allocation> -m <SLURM Memory Allocation> -b -s < PATH_TO_SCRIPT_FILE
2. Distributed Mode (Multi-Node)


- ansys212 -dis -np <CPUS> -db <DB File Memory Allocation> -m <SLURM Memory Allocation> -b -s < PATH_TO_SCRIPT_FILE > PATH_TO_OUTPUT_FILE
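
For reference, a minimal SLURM submission sketch for the distributed case is shown below. The module name (``ansys/2021R2``), resource requests, and file names are illustrative assumptions only; adjust them to match your project and the modules actually available on DeepThought.

.. code-block:: bash

    #!/bin/bash
    #SBATCH --job-name=ansys-example
    #SBATCH --nodes=2
    #SBATCH --ntasks=32
    #SBATCH --mem=64G
    #SBATCH --time=04:00:00

    # Load the ANSYS module (name assumed; check 'module avail ansys').
    module load ansys/2021R2

    # Distributed batch run across the allocated tasks. Add the -db/-m memory
    # options as needed (see the CLI quick list below); file names are placeholders.
    ansys212 -dis -np $SLURM_NTASKS -b -s < my_script.inp > my_script.out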

+++++++++++++++++++++++
ANSYS CLI Quick List
+++++++++++++++++++++++
The following is a quick list of some of the common CLI options.


+-------------------+--------------------------------------------+
| CLI Option        | Description                                |
+===================+============================================+
| -acc nvidia       | Enable GPU Compute Acceleration            |
+-------------------+--------------------------------------------+
| -dis              | Run ANSYS in Distributed (Multi-Node) Mode |
+-------------------+--------------------------------------------+
| \-np value        | Specify the Number of CPUs, e.g. -np 12    |
+-------------------+--------------------------------------------+
| -smp              | Run ANSYS in Single-Node Mode              |
+-------------------+--------------------------------------------+
| -g                | Start the Graphical User Interface         |
+-------------------+--------------------------------------------+
| -b                | Enable ANSYS Batch Mode. Needs -s.         |
+-------------------+--------------------------------------------+
| -i                | Full Input File Path                       |
+-------------------+--------------------------------------------+
| -o                | Full Output File Path                      |
+-------------------+--------------------------------------------+
| -s                | Read the ANSYS Startup Script              |
+-------------------+--------------------------------------------+
| -dir /path        | The Working Directory of ANSYS             |
+-------------------+--------------------------------------------+
| -db               | Initial Memory Allocation for the .DB File |
+-------------------+--------------------------------------------+
| \-m value         | RAM Allocation for ANSYS, e.g. -m 40000    |
+-------------------+--------------------------------------------+
| \< /path/         | Script File Path for Batch Mode            |
+-------------------+--------------------------------------------+
| \> /path/file.out | Path to store ANSYS Output to a file       |
+-------------------+--------------------------------------------+


+++++++++++++++++++++++++
ANSYS Program Quick List
+++++++++++++++++++++++++
The following table lists the ANSYS programs and their associated CLI commands.


+-----------------+-----------------------+
| Program         | Name                  |
+=================+=======================+
| Mechanical APDL | ansys212, ansys2021R2 |
+-----------------+-----------------------+
| Workbench       | runwb2                |
+-----------------+-----------------------+
| CFX             | cfx5                  |
+-----------------+-----------------------+
| FLUENT          | fluent                |
+-----------------+-----------------------+
| ICEM CFD        | icemcfd               |
+-----------------+-----------------------+
| POLYFLOW        | polyman               |
+-----------------+-----------------------+
| CFD-Post        | cfdpost               |
+-----------------+-----------------------+
| Icepak          | icepak                |
+-----------------+-----------------------+
| TurboGrid       | cfxtg                 |
+-----------------+-----------------------+
| AUTODYN         | autodyn212            |
+-----------------+-----------------------+


40 changes: 40 additions & 0 deletions docs/source/software/delft3d.rst
@@ -0,0 +1,40 @@
-------------------------
Delft3D
-------------------------
=======
Status
=======
Delft3D 4, Revision 65936 is installed and available for use on the HPC.

.. _Delft3D: https://oss.deltares.nl/web/delft3d
==========
Overview
==========
From `Delft3D`_:

Delft3D is Open Source Software and facilitates the hydrodynamic (Delft3D-FLOW module), morphodynamic (Delft3D-MOR module), waves (Delft3D-WAVE module), water quality (Delft3D-WAQ module including the DELWAQ kernel) and particle (Delft3D-PART module) modelling


================================
Known Issues
================================

Delft3D does **not** currently support Multi-Node execution. The binary swan_mpi.exe will **not** work and will immediately crash with errors.


++++++++++++++++++++++++++
Delft3D Program Quick List
++++++++++++++++++++++++++

Below are the two main binaries used as part of the Delft3D Suite.

+--------------+----------------------------------------------+
| Program      | Description                                  |
+==============+==============================================+
| wave         | The WAVE module                              |
+--------------+----------------------------------------------+
| swan_omp.exe | The SWAN Module with Single-Node parallelism |
+--------------+----------------------------------------------+

Ignore the .exe ending; it is a valid Linux binary. Due to a transition state between CMake and Make for the Delft3D source code,
the tools and scripts rely on the binary name ending in .exe.
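
As a hedged starting point, a single-node submission for the SWAN OpenMP binary might look like the sketch below. The module name, resource requests, and input file name are assumptions for illustration only; consult the Delft3D documentation for the exact arguments your model requires.

.. code-block:: bash

    #!/bin/bash
    #SBATCH --job-name=delft3d-swan
    #SBATCH --nodes=1
    #SBATCH --cpus-per-task=16
    #SBATCH --mem=32G
    #SBATCH --time=12:00:00

    # Load the Delft3D module (name assumed; check 'module avail delft3d').
    module load delft3d

    # Single-node parallelism only: match the OpenMP thread count to the SLURM allocation.
    export OMP_NUM_THREADS=$SLURM_CPUS_PER_TASK

    # Run the SWAN OpenMP binary; 'example.swn' is a placeholder input file.
    swan_omp.exe example.swn
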
44 changes: 44 additions & 0 deletions docs/source/software/gromacs.rst
@@ -0,0 +1,44 @@
--------
GROMACS
--------
=======
Status
=======
GROMACS version 2021.5 is installed and available for use on the HPC.

.. _GROMACS: https://www.gromacs.org/
==========
Overview
==========
From `GROMACS`_:

GROMACS is a versatile package to perform molecular dynamics, i.e. simulate the Newtonian equations of motion for systems with hundreds to millions of particles.

It is primarily designed for biochemical molecules like proteins, lipids and nucleic acids that have a lot of complicated bonded interactions, but since GROMACS is extremely fast at calculating the nonbonded interactions (that usually dominate simulations) many groups are also using it for research on non-biological systems, e.g. polymers.

GROMACS supports all the usual algorithms you expect from a modern molecular dynamics implementation.


================================
Quickstart Command Line Guide
================================

GROMACS uses UCX and will require a custom mpirun invocation. The module system will warn you of this when you load the module. The following is a known good starting point:


``mpirun -mca pml ucx --mca btl ^vader,tcp,uct -x UCX_NET_DEVICES=bond0 <program> <options>``
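
For reference, a minimal SLURM submission sketch wrapping this invocation around ``gmx_mpi`` is shown below. The module name, resource requests, and the run file name are assumptions for illustration only.

.. code-block:: bash

    #!/bin/bash
    #SBATCH --job-name=gromacs-md
    #SBATCH --nodes=2
    #SBATCH --ntasks-per-node=16
    #SBATCH --mem=64G
    #SBATCH --time=24:00:00

    # Load the GROMACS module (name assumed; check 'module avail gromacs').
    module load gromacs/2021.5

    # UCX-aware mpirun invocation from above; 'benchmark.tpr' is a placeholder
    # portable run input previously produced with 'gmx grompp'.
    mpirun -mca pml ucx --mca btl ^vader,tcp,uct -x UCX_NET_DEVICES=bond0 \
        gmx_mpi mdrun -deffnm benchmark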


++++++++++++++++++++++++++
GROMACS Program Quick List
++++++++++++++++++++++++++

Below is a quick reference list of the different programs that make up the GROMACS suite.

+------------+-------------------------------------+
| Program    | Description                         |
+============+=====================================+
| gmx        | A Wrapper that calls gmx_mpi        |
+------------+-------------------------------------+
| gmx_mpi    | The main MPI Enabled GROMACS binary |
+------------+-------------------------------------+
55 changes: 55 additions & 0 deletions docs/source/software/lammps.rst
@@ -0,0 +1,55 @@
------------------
LAMMPS
------------------
=======
Status
=======
LAMMPS was installed from the Development Branch on 7th Jan, 2022.

There are two versions of LAMMPS installed on DeepThought, each with its own module:

1. A CPU-only version, called lmp
2. A GPU-only version, called lmp_gpu

*You cannot run the GPU-enabled version without access to a GPU, as it will cause errors.*

.. _LAMMPS: https://lammps.org


=================
Overview
=================
From LAMMPS_:

LAMMPS is a classical molecular dynamics code with a focus on materials modeling. It's an acronym for Large-scale Atomic/Molecular Massively Parallel Simulator. LAMMPS has
potentials for solid-state materials (metals, semiconductors) and soft matter (biomolecules, polymers) and coarse-grained or mesoscopic systems. It can be used to model
atoms or, more generically, as a parallel particle simulator at the atomic, meso, or continuum scale.

LAMMPS runs on single processors or in parallel using message-passing techniques and a spatial-decomposition of the simulation domain. Many of its models have versions that
provide accelerated performance on CPUs, GPUs, and Intel Xeon Phis. The code is designed to be easy to modify or extend with new functionality. LAMMPS is distributed as an
open source code under the terms of the GPLv2. The current version can be downloaded here. Links are also included to older versions. All LAMMPS development is done via
GitHub, so all versions can also be accessed there. Periodic releases are also posted to SourceForge.

==========================
LAMMPS Installed Packages
==========================

The following is an extract from the ``lmp -h`` output, showing the enabled packages and capabilities of the LAMMPS installation.

ASPHERE ATC AWPMD BOCS BODY BROWNIAN CG-DNA CG-SDK CLASS2 COLLOID COLVARS
COMPRESS CORESHELL DIELECTRIC DIFFRACTION DIPOLE DPD-BASIC DPD-MESO DPD-REACT
DPD-SMOOTH DRUDE EFF EXTRA-COMPUTE EXTRA-DUMP EXTRA-FIX EXTRA-MOLECULE
EXTRA-PAIR FEP GPU GRANULAR H5MD INTERLAYER KIM KSPACE LATBOLTZ LATTE MACHDYN
MANIFOLD MANYBODY MC MDI MEAM MESONT MESSAGE MGPT MISC ML-HDNNP ML-IAP ML-PACE
ML-QUIP ML-RANN ML-SNAP MOFFF MOLECULE MOLFILE MPIIO MSCG NETCDF OPENMP OPT
ORIENT PERI PHONON PLUGIN PLUMED POEMS PTM PYTHON QEQ QMMM QTB REACTION REAXFF
REPLICA RIGID SCAFACOS SHOCK SMTBQ SPH SPIN SRD TALLY UEF VORONOI VTK YAFF

======================================
LAMMPS Quickstart Command Line Guide
======================================

LAMMPS uses UCX and will require a custom mpirun invocation. The module system will warn you of this when you load the module. The following is a known good starting point:


``mpirun -mca pml ucx --mca btl ^vader,tcp,uct -x UCX_NET_DEVICES=bond0 <program> <options>``
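
For reference, a minimal SLURM submission sketch for the CPU-only build (``lmp``) is shown below. The module name, resource requests, and input script name are assumptions for illustration only; the GPU build (``lmp_gpu``) additionally requires a GPU allocation.

.. code-block:: bash

    #!/bin/bash
    #SBATCH --job-name=lammps-cpu
    #SBATCH --nodes=2
    #SBATCH --ntasks-per-node=16
    #SBATCH --mem=64G
    #SBATCH --time=08:00:00

    # Load the CPU-only LAMMPS module (name assumed; check 'module avail lammps').
    module load lammps

    # UCX-aware mpirun invocation from above; 'in.example' is a placeholder LAMMPS input script.
    mpirun -mca pml ucx --mca btl ^vader,tcp,uct -x UCX_NET_DEVICES=bond0 \
        lmp -in in.example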
