Merge pull request #32 from flindersuni/feature/lammps
Software Suite Updates
The-Scott-Flinders committed Feb 9, 2022
2 parents 8e29295 + 7df2d91 commit 58f6021
Showing 8 changed files with 233 additions and 82 deletions.
132 changes: 67 additions & 65 deletions docs/source/software/ansys.rst
Original file line number Diff line number Diff line change
@@ -4,103 +4,105 @@ ANSYS Engineering Suite
=======
Status
=======
ANSYS 2021R2 is the current version of the ANSYS Suite installed on the HPC. Both Single-Node (-smp) and Multi-Node (-dis) execution are supported.

Tighter Graphical User Interface integration is currently under investigation.

.. _ANSYS: https://www.ansys.com/

==========
Overview
==========
The ANSYS Engineering Suite is a comprehensive software suite for engineering simulation.
The new HPC ANSYS integration aims to bring the Remote Solver Manager to DeepThought - allowing
users to leverage the HPC from their desktops and familiar ANSYS interface.


=================
Known Issues
=================
The Intel MPI Library shipped with ANSYS is not 100% compatible with DeepThought. Running with the -smp flag will disable MPI and allow you to run ANSYS as normal on a single node.
More information can be found on the `ANSYS`_ website.


================================
Quickstart Command Line Guide
================================

To run a job with ANSYS on the HPC you will need the following:

- An ANSYS script file
- Any reference file(s) (e.g., a .db file)

Ensure that the paths to anything in the script file reflect where it lives on the HPC, not your local machine.

You can then invoke ANSYS (after loading the ANSYS Module) in 'batch' mode for a single script like so:

1. Single-Node Mode (-smp)

- ``ansys212 -smp -np <CPUS> -db <DB File Memory Allocation> -m <SLURM Memory Allocation> -b -s < PATH_TO_SCRIPT_FILE``

2. Distributed Mode (-dis, Multi-Node)

- ``ansys212 -dis -np <CPUS> -db <DB File Memory Allocation> -m <SLURM Memory Allocation> -b -s < PATH_TO_SCRIPT_FILE > PATH_TO_OUTPUT_FILE``
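As an illustration only, the sketch below derives the -np, -db and -m values for a Single-Node run and echoes the resulting command rather than executing it. The fixed values, the 4:1 memory split, and the input.mac/output.out file names are assumptions for illustration, not site defaults.

```shell
#!/bin/bash
# Illustrative sketch: derive the ANSYS -db/-m split from a job's allocation.
# In a real job script these would come from SLURM, e.g. $SLURM_CPUS_PER_TASK
# and $SLURM_MEM_PER_NODE; the values below are assumptions for illustration.
CPUS=4
MEM_MB=16000
DB_MB=$(( MEM_MB / 4 ))          # portion reserved for the .db file
SOLVER_MB=$(( MEM_MB - DB_MB ))  # remainder for the solver workspace
# Echo the command so the memory split can be sanity-checked before a real run.
echo "ansys212 -smp -np ${CPUS} -db ${DB_MB} -m ${SOLVER_MB} -b -s < input.mac > output.out"
```

With these assumed values the sketch prints a -db/-m split of 4000/12000 MB; in a real job script the echo would be replaced by the command itself.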

+++++++++++++++++++++++
ANSYS CLI Quick List
+++++++++++++++++++++++
The following is a quick list of some of the common CLI options.


+-------------------+--------------------------------------------+
| CLI Option | Description |
+===================+============================================+
| -acc nvidia | Enable GPU Compute Acceleration |
+-------------------+--------------------------------------------+
| -dis | Run ANSYS in Distributed (Multi-Node) Mode |
+-------------------+--------------------------------------------+
| \-np value        | Specify the Number of CPUs: -np 12         |
+-------------------+--------------------------------------------+
| -smp | Run ANSYS in Single-Node Mode |
+-------------------+--------------------------------------------+
| -g | Start the Graphical User interface |
+-------------------+--------------------------------------------+
| -b | Enable ANSYS Batch mode. Needs -s. |
+-------------------+--------------------------------------------+
| -i | Full Input file path |
+-------------------+--------------------------------------------+
| -o | Full Output File Path |
+-------------------+--------------------------------------------+
| -s | Read the Ansys Startup Script |
+-------------------+--------------------------------------------+
| -dir /path        | The Working Directory of ANSYS             |
+-------------------+--------------------------------------------+
| -db | Initial Allocation for the .DB File |
+-------------------+--------------------------------------------+
| \-m value | RAM Allocation for ANSYS. -m 40000 |
+-------------------+--------------------------------------------+
| \< /path/ | Script file Path for batch Mode |
+-------------------+--------------------------------------------+
| \> /path/file.out | Path to store ANSYS Output to file |
+-------------------+--------------------------------------------+


+++++++++++++++++++++++++
ANSYS Program Quick List
+++++++++++++++++++++++++
The following table lists the ANSYS programs and their associated CLI command.


+-----------------+-----------------------+
| Program | Name |
+=================+=======================+
| Mechanical ADPL | ansys212, ansys2021R2 |
+-----------------+-----------------------+
| Workbench | runwb2 |
+-----------------+-----------------------+
| CFX | cfx5 |
+-----------------+-----------------------+
| FLUENT | fluent |
+-----------------+-----------------------+
| ICEM CFD | icemcfd |
+-----------------+-----------------------+
| POLYFLOW | polyman |
+-----------------+-----------------------+
| CFD-Post | cfdpost |
+-----------------+-----------------------+
| Icepak | icepak |
+-----------------+-----------------------+
| TurboGrid | cfxtg |
+-----------------+-----------------------+
| AUTODYN | autodyn212 |
+-----------------+-----------------------+


40 changes: 40 additions & 0 deletions docs/source/software/delft3d.rst
@@ -0,0 +1,40 @@
-------------------------
Delft3D Suite
-------------------------
=======
Status
=======
Delft3D 4, Revision 65936 is installed and available for use on the HPC.

.. _Delft3D: https://oss.deltares.nl/web/delft3d

==========
Overview
==========
From `Delft3D`_:

Delft3D is Open Source Software and facilitates the hydrodynamic (Delft3D-FLOW module), morphodynamic (Delft3D-MOR module), waves (Delft3D-WAVE module), water quality (Delft3D-WAQ module including the DELWAQ kernel) and particle (Delft3D-PART module) modelling.


================================
Known Issues
================================

Delft3D does **not** currently support Multi-Node Execution. The binary swan_mpi.exe will **not** work and will immediately crash with errors.


++++++++++++++++++++++++++
Delft3D Program Quick List
++++++++++++++++++++++++++

Below are the two main binaries used as part of the Delft3D Suite:

+--------------+----------------------------------------------+
| Program | Description |
+==============+==============================================+
| wave | The WAVE module |
+--------------+----------------------------------------------+
| swan_omp.exe | The SWAN Module with Single-Node parallelism |
+--------------+----------------------------------------------+

Ignore the .exe ending; it is a valid Linux binary. Due to a transition state between CMake and Make for the Delft3D source code,
the tools and scripts rely on the binary name ending in .exe.
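Assuming the _omp suffix indicates a standard OpenMP build, single-node parallelism for swan_omp.exe would be controlled through the OMP_NUM_THREADS environment variable. The sketch below only echoes what it would run; the thread count is an assumption.

```shell
#!/bin/bash
# Illustrative sketch: size the swan_omp.exe thread count for one node.
# Assumes the _omp suffix indicates a standard OpenMP build; the thread
# count below is an assumption (a job script would use $SLURM_CPUS_PER_TASK).
export OMP_NUM_THREADS=4
# Echo rather than execute, so the setting can be checked first.
echo "Would run swan_omp.exe with ${OMP_NUM_THREADS} OpenMP threads"
```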
44 changes: 44 additions & 0 deletions docs/source/software/gromacs.rst
@@ -0,0 +1,44 @@
--------------------------------
GROMACS Molecular Dynamics Suite
--------------------------------
=======
Status
=======
GROMACS version 2021.5 is installed and available for use on the HPC.

.. _GROMACS: https://www.gromacs.org/

==========
Overview
==========
From `GROMACS`_:

GROMACS is a versatile package to perform molecular dynamics, i.e. simulate the Newtonian equations of motion for systems with hundreds to millions of particles.

It is primarily designed for biochemical molecules like proteins, lipids and nucleic acids that have a lot of complicated bonded interactions, but since GROMACS is extremely fast at calculating the nonbonded interactions (that usually dominate simulations) many groups are also using it for research on non-biological systems, e.g. polymers.

GROMACS supports all the usual algorithms you expect from a modern molecular dynamics implementation.


================================
Quickstart Command Line Guide
================================

GROMACS uses UCX and requires a custom mpirun invocation. The module system will warn you of this when you load the module. The following is a known-good starting point:


``mpirun -mca pml ucx --mca btl ^vader,tcp,uct -x UCX_NET_DEVICES=bond0 <program> <options>``
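To make that invocation concrete, the hedged sketch below prefixes a gmx_mpi mdrun call with those flags and echoes the result instead of running it; the rank count and the 'topol' input deck name are illustrative assumptions, not site defaults.

```shell
#!/bin/bash
# Illustrative sketch: combine the known-good mpirun flags with gmx_mpi mdrun.
# The rank count and the 'topol' input deck name are assumptions; in a job
# script the rank count would normally come from $SLURM_NTASKS.
RANKS=4
MPIFLAGS="-mca pml ucx --mca btl ^vader,tcp,uct -x UCX_NET_DEVICES=bond0"
# Echo rather than execute, so the full line can be inspected first.
echo "mpirun -np ${RANKS} ${MPIFLAGS} gmx_mpi mdrun -deffnm topol"
```

In a real job script the echo would be dropped and the deck name replaced with your own .tpr file.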


++++++++++++++++++++++++++
GROMACS Program Quick List
++++++++++++++++++++++++++

Below is a quick reference list of the different programs that make up the GROMACS suite.

+------------+-------------------------------------+
| Program    | Description                         |
+============+=====================================+
| gmx | A Wrapper that calls gmx_mpi |
+------------+-------------------------------------+
| gmx_mpi | The main MPI Enabled GROMACS binary |
+------------+-------------------------------------+
2 changes: 1 addition & 1 deletion docs/source/software/lammps.rst
@@ -6,7 +6,7 @@ Status
=======
LAMMPS was installed from the Development Branch on 7th Jan, 2022.

There are two versions of LAMMPS installed on DeepThought, each with its own module:

1. A CPU-only version, called lmp
2. A GPU-only version, called lmp_gpu
2 changes: 1 addition & 1 deletion docs/source/software/singularity.rst
@@ -4,7 +4,7 @@ Singularity Containers
=======
Status
=======
The Singularity Container Engine is available for use on the HPC.

==========
Overview
16 changes: 11 additions & 5 deletions docs/source/software/softwaresuitesoverview.rst
@@ -16,10 +16,16 @@ List of Enterprise Software Suites
.. _Singularity Containers: singularity.html
.. _MATLAB: matlab.html
.. _LAMMPS: lammps.html
.. _GROMACS: gromacs.html
.. _VASP: vasp.html
.. _Delft 3D: delft3d.html

1. `ANSYS`_
2. `Delft 3D`_
3. `GROMACS`_
4. `Jupyter Hub`_
5. `LAMMPS`_
6. `MATLAB`_
7. `Singularity Containers`_
8. `VASP`_

49 changes: 49 additions & 0 deletions docs/source/software/vasp.rst
@@ -0,0 +1,49 @@
-------------------------
VASP
-------------------------
=======
Status
=======
VASP (the Vienna Ab initio Simulation Package) version 6.2.0 is operational with OpenACC GPU support on the HPC for the Standard, Gamma Point and Non-Collinear versions of the program.

.. _VASP: https://www.vasp.at/

==========
Overview
==========
From `VASP`_:

The Vienna Ab initio Simulation Package (VASP) is a computer program for atomic scale materials modelling, e.g. electronic structure calculations and quantum-mechanical molecular dynamics, from first principles.

VASP computes an approximate solution to the many-body Schrödinger equation, either within density functional theory (DFT), solving the Kohn-Sham equations, or within the Hartree-Fock (HF) approximation, solving the Roothaan equations. Hybrid functionals that mix the Hartree-Fock approach with density functional theory are implemented as well. Furthermore, Green’s functions methods (GW quasiparticles, and ACFDT-RPA) and many-body perturbation theory (2nd-order Møller-Plesset) are available in VASP.


================================
Quickstart Command Line Guide
================================
VASP must be started via the MPI wrapper script. If your SLURM script has requested a GPU, VASP will autodetect and use the GPU for all supported operations. Substitute the VASP binary that matches your requirements.

``mpirun <MPI OPTIONS> vasp_std <OPTIONS>``
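As a hedged sketch of that launch pattern, the lines below pick a binary and assemble the mpirun line, echoing it rather than executing it; the rank count and the choice of vasp_std are assumptions for illustration.

```shell
#!/bin/bash
# Illustrative sketch: assemble (but only echo) a VASP launch line.
# The rank count and the choice of vasp_std are assumptions; a job script
# would normally take the rank count from $SLURM_NTASKS.
RANKS=2
BINARY="vasp_std"   # swap for vasp_gam or vasp_ncl as required
echo "mpirun -np ${RANKS} ${BINARY}"
```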

+++++++++++++++++++++++++
VASP Program Quick List
+++++++++++++++++++++++++

Below is a quick reference list of the different programs that make up the VASP suite.

+----------------+---------------------------------------------------------------------------------------------+
| Program        | Description                                                                                 |
+================+=============================================================================================+
| vasp_std       | Standard version of VASP                                                                    |
+----------------+---------------------------------------------------------------------------------------------+
| vasp_std_debug | Debug (slower) version of the standard binary; unless specifically asked, use vasp_std      |
+----------------+---------------------------------------------------------------------------------------------+
| vasp_gam       | Gamma Point version of VASP                                                                 |
+----------------+---------------------------------------------------------------------------------------------+
| vasp_gam_debug | Debug (slower) version of the Gamma Point binary; unless specifically asked, use vasp_gam   |
+----------------+---------------------------------------------------------------------------------------------+
| vasp_ncl       | Non-Collinear version of VASP                                                               |
+----------------+---------------------------------------------------------------------------------------------+
| vasp_ncl_debug | Debug (slower) version of the Non-Collinear binary; unless specifically asked, use vasp_ncl |
+----------------+---------------------------------------------------------------------------------------------+
