Module for Ab Initio Structure Evolution (MAISE) features
* neural network-based description of interatomic interactions
* evolutionary optimization
* structure analysis
1. General info
2. Download and Installation
3. Setup input tag description
MAISE has been developed by
Current version 2.2 works on Linux platforms and combines three modules for modeling, optimizing, and analyzing atomic structures.
1. The neural network (NN) module builds, tests, and uses NN models to describe interatomic interactions with near-ab initio accuracy at a fraction of the computational cost of density functional theory calculations.
With the primary goal of using NN models to accelerate structure searches, the main function of the module is to relax given structures. To simplify NN application and comparison, we closely matched the input and output file formats with those used in the VASP software. Previously parameterized NN models, available in the 'models/' directory, have been generated and extensively tested for crystalline and/or nanostructured materials. The first practical applications of NNs include the prediction of new synthesizable Mg-Ca alloys and the identification of more stable Cu-Pd-Ag nanoparticles.
Users can create their own NN models with MAISE; these are typically trained on density functional theory (DFT) total energy and atomic force data for relatively small structures. The generation of relevant and diverse configurations is done separately with an 'evolutionary sampling' protocol detailed in our published work. The code introduces a unique feature, 'stratified training', for building robust NNs for chemical systems with several elements. NN models are developed in a hierarchical fashion, first for elements, then for binaries, and so on, which enables the generation of reusable libraries for extended blocks of the periodic table.
2. The implemented evolutionary algorithm (EA) enables efficient identification of ground-state configurations at a given chemical composition. Our studies have shown that the EA is particularly advantageous in dealing with large structures when no experimental structural input is available [3,4].
The searches can be performed for 3D bulk crystals, 2D films, and 0D nanoparticles. The population of structures can be generated either randomly or predefined based on prior information. The essential operations are 'crossover', in which a new configuration is created from two parent structures of the previous generation, and 'mutation', in which a parent structure is randomly distorted. For 0D nanoparticles, we have introduced a multitribe evolutionary algorithm that allows efficient simultaneous optimization of clusters in a specified size range.
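As a toy illustration of these two operations (a hypothetical sketch, not MAISE's implementation: atoms are represented by fractional coordinates only, and the composition repair a real EA performs is omitted), crossover and mutation might look like:

```python
import random

def crossover(parent_a, parent_b, cut=0.5):
    """Toy planar-cut crossover: take atoms below the cut plane (along the
    third fractional axis) from one parent and atoms above it from the other."""
    return ([atom for atom in parent_a if atom[2] < cut] +
            [atom for atom in parent_b if atom[2] >= cut])

def mutate(structure, amplitude=0.05):
    """Toy mutation: randomly displace every atom by up to `amplitude`
    along each fractional direction."""
    return [tuple(x + random.uniform(-amplitude, amplitude) for x in atom)
            for atom in structure]

parents = [(0.1, 0.2, 0.3), (0.4, 0.5, 0.7)]
child = mutate(crossover(parents, parents))
```

In an actual search, the child structure would then be relaxed (e.g., with an NN model) and admitted to the next generation based on its energy.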
3. The analysis functions include the comparison of structures based on the radial distribution function (RDF), the determination of the space group and the Wyckoff positions with the external SPGLIB package, and more. In particular, the RDF-based structure dot product is essential for eliminating duplicate structures in EA searches and for selecting distinct configurations from the pool of low-energy structures found.
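The idea behind an RDF-based dot product can be sketched in Python (an illustrative fingerprint using Gaussian-smeared pair distances; MAISE's actual descriptor and normalization differ):

```python
import numpy as np

def rdf_vector(distances, r_max=6.0, n_bins=120, sigma=0.08):
    """Gaussian-smeared RDF fingerprint built from a list of pair distances."""
    r = np.linspace(0.0, r_max, n_bins)
    g = np.zeros(n_bins)
    for d in distances:
        if d < r_max:
            g += np.exp(-((r - d) ** 2) / (2.0 * sigma ** 2))
    return g

def dot_product(g1, g2):
    """Normalized overlap of two fingerprints: 1 for identical structures,
    approaching 0 for very dissimilar ones."""
    n = np.linalg.norm(g1) * np.linalg.norm(g2)
    return float(g1 @ g2 / n) if n > 0 else 0.0

a = [1.00, 1.00, 1.60, 2.30]   # reference distance set
b = [1.00, 1.02, 1.60, 2.30]   # nearly identical -> dot product close to 1
c = [0.80, 1.30, 1.90, 2.90]   # strongly distorted -> noticeably lower
print(dot_product(rdf_vector(a), rdf_vector(b)))
print(dot_product(rdf_vector(a), rdf_vector(c)))
```

Thresholding such a similarity measure is what allows duplicate structures to be discarded during a search.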
The source code for MAISE can be obtained from the command line in one of the following ways:

git clone https://github.com/maise-guide/maise.git

git clone git://github.com/maise-guide/maise.git

wget -O master.zip https://github.com/maise-guide/maise/archive/master.zip
unzip master.zip
The code has been extensively tested on Linux platforms. We would appreciate users' feedback on the installation and performance of the package on other platforms.
2. The MAISE installation will check whether the required libraries are already present. If not, they will be automatically downloaded to ./ext-dep and installed in ./lib.
3. If the GSL or SPGLIB library installation does not complete automatically, please install them manually and copy (i) libgsl.a, libgslcblas.a, and libsymspg.a into the ./lib subdirectory; (ii) the spglib.h header into the ./lib/include subdirectory; and (iii) all GSL headers into the ./lib/include/gsl subdirectory.
4. By default, the code will be compiled for parallel execution with OpenMP. If you do not wish to compile the parallel version, set 'SERIAL ?= 1' in maise/makefile.
5. Use 'make --jobs' for a full compilation, 'make clean' to clean the most relevant objects, and 'make clean-all' to clean all objects.
A 'check' script is available in the examples/ directory and can be run after compiling maise to ensure the proper functionality of the code. This script automatically checks the performance of the code in parsing data, training a neural network, and evaluating a crystal structure. If the compilation is sound, the 'check' script will report success; otherwise, error logs will be provided with further information about the issue.
Main input files that define a simulation are 'setup' with job settings, 'model' with NN parameters, 'basis' with the symmetry functions converting a structure into the NN input, and 'table' with typical chemical element sizes. The atomic structure is read from the 'POSCAR' file that follows the VASP format.
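For reference, a minimal POSCAR in the VASP format looks like this (a conventional fcc Cu cell, shown purely as a format illustration): a comment line, the lattice scaling factor, three lattice vectors, element symbols, atom counts per element, the coordinate mode, and the atomic positions:

```
fcc Cu
3.615
1.0 0.0 0.0
0.0 1.0 0.0
0.0 0.0 1.0
Cu
4
Direct
0.0 0.0 0.0
0.0 0.5 0.5
0.5 0.0 0.5
0.5 0.5 0.0
```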
* For stratified training, one needs to provide individual models.
* 'basis' stored in the parsed directory is appended to 'model' at the end of the training.
The structure examination and manipulation functions are run by calling maise with a flag:
| flag | description |
| --- | --- |
| rdf | compute and plot the RDF for POSCAR |
| cxc | compute the dot product for POSCAR0 and POSCAR1 using the RDF |
| cmp | compare the RDF, space group, and volume of POSCAR0 and POSCAR1 |
| spg | convert POSCAR into str.cif, CONV, PRIM |
| cif | convert str.cif into CONV and PRIM |
| rot | rotate a nanoparticle along the eigenvectors of its moments of inertia |
| dim | find whether POSCAR is periodic (3) or non-periodic (0) |
| box | reset the box size for nanoparticles |
| sup | make a supercell specified by na x nb x nc |
| vol | compute the volume per atom for crystal or nano structures |
Directory 'examples/' has samples of maise jobs for parsing data, training neural networks, and simulating structures with neural network models. Each example has a README file, a setup file with only the tags relevant to the particular job, and reference output files for comparison.
Input tags by type
Main job type selector
Neural Network model
Neural Network training