- Installation
- Setting Up Environment
- Basic Usage
- Advanced Usage: Socket Interface
- Troubleshooting
- Advanced Topics
- Support and Contribution
SPARC-X-API is an ASE-compatible Python API for the density functional theory (DFT) code SPARC. It offers:
- ASE-compatible I/O format for SPARC files
- A JSON Schema interfacing with SPARC's C-code for parameter validation and conversion
- A comprehensive calculator interface for SPARC with socket-communication support.
Fig. 1 provides an overview of the components of SPARC-X-API and their relation to the SPARC C-code.
The Python API may be installed via either of the following approaches:
Set up a conda environment and install the Python API, which includes the pseudopotential files:
```shell
# Change 'sparc-env' to your desired name if needed
conda create -n sparc-env
conda activate sparc-env
conda install -c conda-forge sparc-x-api
```
On Linux platforms (x86_64, aarch64), you can also install the
precompiled sparc
DFT binaries alongside the API:
```shell
conda install -c conda-forge sparc-x
```
```shell
conda activate sparc-env  # Re-activate to have the env variables effective
python -m pip install git+https://github.com/SPARC-X/SPARC-X-API
```
Optionally, you can download the latest SPMS pseudopotentials, which unpacks the pseudopotential files into `<python-lib-root>/site-packages/sparc/psp`:
```shell
python -m sparc.download_data
```
To use the API to drive SPARC calculations, please follow the SPARC manual for compilation and installation of the SPARC DFT code itself.
We recommend running a simple test after installation and setup:
```shell
python -m sparc.quicktest
```
A proper setup will display the following sections at the output's conclusion:
To use the API to parse SPARC input and output files, it's essential that the "Import" and "JSON API" tests succeed. To run SPARC calculations, all tests must pass.
Please refer to Setting Up the Environment for guidance on correctly configuring the environment variables. If you run into further problems, consult our Troubleshooting guide.
SPARC-X-API is designed to automate the discovery of pseudopotential files, the JSON API, and the SPARC binary. However, you can exert fine-grained control over their setup:
Pseudopotential files (in Abinit psp8 format) are loaded in the following order:
1. Via the `psp_dir` argument passed to the `sparc.SPARC` calculator.
2. Through the environment variables `$SPARC_PSP_PATH` or `$SPARC_PP_PATH` (this is the method employed by the `conda` installation).
3. By using the `psp8` files bundled with the SPARC-X-API installation (see the manual installation).
To specify a custom path for your psp8 files, set the `$SPARC_PSP_PATH` or `$SPARC_PP_PATH` variable as follows:
```shell
export SPARC_PSP_PATH="/path/to/your/psp8/directory"
```
To determine the default location of psp8 files (as per option 3), run the following code:
```shell
python -c "from sparc.common import psp_dir; print(psp_dir)"
```
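The three-step lookup order described above can be sketched in plain Python. `resolve_psp_dir` below is a hypothetical helper for illustration only, not part of the SPARC-X-API public interface:

```python
import os

def resolve_psp_dir(psp_dir=None, bundled_default="<site-packages>/sparc/psp"):
    """Sketch of the pseudopotential lookup order described above."""
    if psp_dir is not None:
        # 1. An explicit `psp_dir` argument to the calculator wins
        return psp_dir
    for var in ("SPARC_PSP_PATH", "SPARC_PP_PATH"):
        # 2. Fall back to the environment variables (used by conda installs)
        if os.environ.get(var):
            return os.environ[var]
    # 3. Finally, use the psp8 files bundled with the installation
    return bundled_default

# The environment variable takes effect only when no argument is given
os.environ["SPARC_PSP_PATH"] = "/opt/psp8"
print(resolve_psp_dir())            # /opt/psp8
print(resolve_psp_dir("/my/psp8"))  # /my/psp8
```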
SPARC-X-API is engineered for compatibility with the SPARC C-code. It achieves this by loading a JSON schema for parameter validation and unit conversion. You can review the default schema used by the API at `sparc.sparc_json_api.default_json_api`:
```json
"FD_GRID": {
    "symbol": "FD_GRID",
    "label": "FD_GRID",
    "type": "integer array",
    "default": null,
    "unit": "No unit",
    "example": "FD_GRID: 26 26 30",
    "description": "#<Some description...>",
    "allow_bool_input": false,
    "category": "system"
},
```
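As a sketch of how such an entry can drive validation, the snippet below checks a value against the declared `type` field. The `validate` helper is hypothetical and far simpler than the real `sparc.api.SparcAPI` machinery:

```python
import json

# A trimmed schema entry mirroring the FD_GRID example above
schema = json.loads("""
{
  "FD_GRID": {
    "type": "integer array",
    "default": null,
    "allow_bool_input": false
  }
}
""")

def validate(name, value, schema):
    """Return True if `value` matches the declared type of parameter `name`."""
    declared = schema[name]["type"]
    if declared == "integer array":
        return (isinstance(value, (list, tuple))
                and all(isinstance(v, int) for v in value))
    # Other declared types would be handled here
    return True

print(validate("FD_GRID", [26, 26, 30], schema))  # True
print(validate("FD_GRID", "26 26 30", schema))    # False
```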
The schema file is generated from SPARC's LaTeX documentation. In upcoming releases of SPARC-X-API, we're aiming to give users the flexibility to use their own custom schema files, which would be particularly useful for those testing a development branch of SPARC. By default, the JSON schema is packaged under the `sparc/sparc_json_api` directory. If you have another version of the SPARC source code, you can set the environment variable `$SPARC_DOC_PATH` to the directory containing the LaTeX source of the documentation, such as `<SPARC-source-code-root>/doc/.LaTeX`. If you obtain `sparc-x` from the conda method mentioned above, `$SPARC_DOC_PATH` is automatically set to `<conda-env-root>/share/doc/sparc/.LaTeX`. Setting the environment variable `$SPARC_DOC_PATH` helps load the correct JSON schema that is compatible with your SPARC binary.
The command to execute SPARC calculations is determined based on the following priority:
1. The `command` argument provided directly to the `sparc.SPARC` calculator.
2. The environment variable `$ASE_SPARC_COMMAND`.
3. If neither of the above is defined, SPARC-X-API looks for the SPARC binary under the current `$PATH` and combines it with a suitable `mpi` command prefix.
Examples:

- Using `mpirun` (e.g. on a single test machine):
```shell
export ASE_SPARC_COMMAND="mpirun -n 8 -mca orte_abort_on_non_zero_status 1 /path/to/sparc -name PREFIX"
```
- Using `srun` (e.g. in an HPC Slurm job system):
```shell
export ASE_SPARC_COMMAND="srun -n 8 --kill-on-bad-exit /path/to/sparc -name PREFIX"
```
Notes:

- The `-name PREFIX` part can be omitted if the `label` property of the `sparc.SPARC` calculator is set (which is the default behavior). Any extra features of the SPARC code (e.g. GPU acceleration) should be specified in the command.
- We recommend adding kill switches to your MPI commands, as in the examples above, when running `sparc` to avoid unexpected behaviors with exit signals.
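The three-level priority above can be summarized in a short sketch; `resolve_sparc_command` is an illustrative helper, not the API's actual implementation:

```python
import os
import shutil

def resolve_sparc_command(command=None, mpi_prefix="mpirun"):
    """Sketch of the three-level lookup for the SPARC execution command."""
    if command is not None:
        # 1. An explicit `command` argument to the calculator wins
        return command
    if os.environ.get("ASE_SPARC_COMMAND"):
        # 2. Fall back to the environment variable
        return os.environ["ASE_SPARC_COMMAND"]
    # 3. Search $PATH for the binary and prepend an mpi launcher
    binary = shutil.which("sparc")
    if binary is None:
        raise FileNotFoundError("No SPARC binary found on $PATH")
    return f"{mpi_prefix} {binary}"

os.environ["ASE_SPARC_COMMAND"] = "srun -n 8 /path/to/sparc -name PREFIX"
print(resolve_sparc_command())  # srun -n 8 /path/to/sparc -name PREFIX
```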
In contrast to many other DFT codes, where the ASE I/O formats refer to a single file, SPARC-X-API operates on the whole calculation directory, also known as a "SPARC bundle". The API integrates seamlessly with ASE, allowing automatic detection of the SPARC file format:
- Reading from a SPARC bundle
```python
import sparc
from ase.io import read, write

atoms = read("test.sparc", index=-1)
```
Note: To read multiple output files from the same directory (e.g., SPARC.aimd, SPARC.aimd_01), pass the keyword argument `include_all_files=True` to `read()`.
- Writing a minimal SPARC bundle from atoms
```python
import sparc
from ase.build import bulk

atoms = bulk("Al") * [4, 4, 4]
atoms.write("test.sparc")
```
For a deeper dive into the bundle I/O format, see Advanced Topics.
A recurring challenge for Python interfaces to DFT codes is the inconsistency between the low-level codes (Fortran/C/C++) and outdated upper-level APIs regarding parameter sets and default values. To address this issue, SPARC-X-API handles DFT parameters through a JSON schema translated from SPARC's LaTeX documentation. Each release of SPARC-X-API is linked with a specific version of the SPARC source code, ensuring compatibility and consistency with the default parameter set. The main driver of this feature is the `sparc.api.SparcAPI` class.
If you've obtained the full SPARC source code, you can generate a copy of the schema with the following command:
```shell
python -m sparc.docparser <sparc-source-code-root>/doc/.LaTeX
```
which produces a `parameters.json` file.
To learn more about the JSON schema design, please refer to Advanced Topics.
SPARC-X-API offers a calculator interface based on file I/O that aligns with many other ASE calculators. If you've worked with ASE modules like `Vasp`, `QuantumEspresso`, or `GPAW`, you'll find this package intuitive, as shown in the following examples:
- Single point calculation
```python
from sparc.calculator import SPARC
from ase.build import molecule

atoms = molecule("H2", cell=(10, 10, 10), pbc=True)
atoms.calc = SPARC(h=0.25, directory="run_sp")
atoms.get_potential_energy()
atoms.get_forces()
```
This example sets up a calculation for an H2 molecule in a 10 Å × 10 Å × 10 Å periodic cell with default parameters (PBE exchange-correlation functional and a grid spacing (`h`) of 0.25 Å). Note that by calling `atoms.get_forces()`, the calculator automatically sets the flags for printing the forces.
- Geometric optimization (using SPARC's internal routines)
```python
from sparc.calculator import SPARC
from ase.build import bulk

atoms = bulk("Al", cubic=True)
atoms.rattle(0.05)
atoms.calc = SPARC(h=0.25, kpts=(3, 3, 3), relax_flag=True, directory="run_opt")
atoms.get_potential_energy()
atoms.get_forces()
```
This example sets up a calculation for a rattled aluminum cubic unit cell with the PBE functional, a grid spacing of 0.25 Å, and a 3 × 3 × 3 k-point grid. Optimization of ionic positions is handled by SPARC's default LBFGS routine.
- AIMD in SPARC
```python
from sparc.calculator import SPARC
from ase.build import bulk

md_params = dict(md_flag=True, ion_temp=800, md_method="NVE", md_timestep=0.6, md_nstep=5)

atoms = bulk("Al") * (3, 3, 3)
atoms.rattle()
atoms.calc = SPARC(h=0.25, kpts=(1, 1, 1), directory="run_aimd", **md_params)
atoms.get_potential_energy()
```
This example runs a short NVE MD simulation (5 steps) at 800 K for 27 Al atoms.
If you want to extract more information about the MD simulation steps, take a look at `SPARC.raw_results`.
- Geometric optimization using ASE's optimizers
The power of SPARC-X-API is the ability to combine single-point SPARC calculations with advanced ASE optimizers, such as BFGS, FIRE, or GPMin. Example 2 can be rewritten as:
```python
from sparc.calculator import SPARC
from ase.build import bulk
from ase.optimize import LBFGS

atoms = bulk("Al", cubic=True)
atoms.rattle(0.05)
atoms.calc = SPARC(h=0.25, kpts=(3, 3, 3), directory="run_opt_ase")
opt = LBFGS(atoms, alpha=90)
opt.run(fmax=0.02)
```
A simple command-line wrapper `sparc-ase` is provided to add support for SPARC file formats to the `ase` CLI tools. Simply replace `ase [subcommand] [args]` with `sparc-ase [subcommand] [args]` to access your SPARC bundle files as you would other file formats. As an example, use `sparc-ase gui path/to/your/bundle.sparc` to visualize atomistic structures. Depending on the bundle's contents, this could display individual atoms or multiple images.
Fig. 2 is a screenshot showing the usage of `sparc-ase gui` to visualize a short MD trajectory.
In the SPARC DFT code, all input parameters conventionally employ atomic units, such as Hartree and Bohr. Conversely, ASE objects (like `Atoms.positions`, `Atoms.cell`, `Atoms.get_potential_energy()`) use eV/Angstrom units.
When you set up a calculator as below:
```python
atoms.calc = SPARC(h=0.25, REFERENCE_CUTOFF=0.5, EXX_RANGE_PBE=0.16, **params)
```
inputs following ASE's convention (e.g., `h`) adopt eV/Angstrom units (thus the same setting can be applied to other DFT calculators), while all SPARC-specific parameters, often recognizable by their capitalized format (like `REFERENCE_CUTOFF`, `EXX_RANGE_PBE`), retain the original values consistent with their representation in the `.inpt` files.
The reasoning and details about unit conversion can be found in the Rules for Input Parameters in Advanced Topics.
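A quick back-of-the-envelope check of the conversion: ASE-style `h` is given in Å, while the `.inpt` grid spacing is in Bohr. The conversion factor below is assumed to be the CODATA Bohr radius:

```python
BOHR_IN_ANGSTROM = 0.529177210903  # CODATA value, Å per Bohr

def angstrom_to_bohr(length_angstrom):
    """Convert a length from Å (ASE convention) to Bohr (SPARC .inpt convention)."""
    return length_angstrom / BOHR_IN_ANGSTROM

# h=0.2 Å corresponds to roughly 0.38 Bohr in the .inpt file
print(round(angstrom_to_bohr(0.2), 2))  # 0.38
```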
To keep SPARC-X-API compatible with other ASE-based DFT calculators, a list of special parameters follows the ASE convention and uses the Å / eV / GPa / fs unit system:
| parameter name | meaning | example | equivalent SPARC input |
|---|---|---|---|
| `xc` | Exchange-correlation functional | `xc=pbe` | `EXCHANGE_CORRELATION: GGA_PBE` |
| | | `xc=lda` | `EXCHANGE_CORRELATION: LDA_PZ` |
| | | `xc=rpbe` | `EXCHANGE_CORRELATION: GGA_RPBE` |
| | | `xc=pbesol` | `EXCHANGE_CORRELATION: GGA_PBEsol` |
| | | `xc=pbe0` | `EXCHANGE_CORRELATION: PBE0` |
| | | `xc=hf` | `EXCHANGE_CORRELATION: HF` |
| | | `xc=hse` or `xc=hse03` | `EXCHANGE_CORRELATION: HSE` |
| | | `xc=vdwdf1` or `xc=vdw-df` | `EXCHANGE_CORRELATION: vdWDF1` |
| | | `xc=vdwdf2` or `xc=vdw-df2` | `EXCHANGE_CORRELATION: vdWDF2` |
| | | `xc=scan` | `EXCHANGE_CORRELATION: SCAN` |
| `h` | Real-space grid spacing (Å) | `h=0.2` | `MESH_SPACING: 0.38` (in Bohr) |
| `gpts` | Explicit grid points | `gpts=[10, 10, 10]` | `FD_GRID: 10 10 10` |
| `kpts` | K-point mesh | `kpts=[3, 3, 3]` | `KPOINT_GRID: 3 3 3` |
| `convergence` | Dict of convergence criteria (see below) | | |
| | energy (eV/atom) | `convergence={"energy": 1e-4}` | `TOL_SCF: 3e-6` |
| | relax (forces, eV/Å) | `convergence={"relax": 1e-2}` | `TOL_RELAX: 2e-4` |
| | density (e/atom) | `convergence={"density": 1e-6}` | `TOL_PSEUDOCHARGE: 1e-6` |
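To see where the `TOL_SCF` value in the table comes from, convert the energy criterion from eV/atom (ASE convention) to Hartree/atom (SPARC convention); the conversion factor below is assumed to be the CODATA Hartree energy:

```python
HARTREE_IN_EV = 27.211386245988  # eV per Hartree

# convergence={"energy": 1e-4} is given in eV/atom; SPARC's TOL_SCF is in Ha/atom
tol_scf = 1e-4 / HARTREE_IN_EV
print(f"{tol_scf:.1e}")  # 3.7e-06, i.e. on the order of the 3e-6 shown in the table
```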
Users coming from other DFT codes can easily port their ASE scripts to SPARC-X-API using the special parameters with minimal modification:
Example 1: VASP vs SPARC
```python
# Using VASP
from ase.calculators.vasp import Vasp
calc = Vasp(xc="rpbe", kpts=(9, 9, 9), directory="vasp-calc")
```
vs
```python
# Using SPARC
from sparc.calculator import SPARC
calc = SPARC(xc="rpbe", kpts=(9, 9, 9), directory="sparc-calc.sparc")
```
Example 2: GPAW (another real-space DFT code) vs SPARC
```python
# Using GPAW
from gpaw import GPAW
calc = GPAW(xc="PBE", kpts=(9, 9, 9), h=0.25, directory="gpaw-calc", convergence={"energy": 1.e-4})
```
vs
```python
# Using SPARC
from sparc.calculator import SPARC
calc = SPARC(xc="PBE", kpts=(9, 9, 9), h=0.25, directory="sparc-calc.sparc", convergence={"energy": 1.e-4})
```
Disclaimer: The socket communication feature in SPARC and SPARC-X-API is experimental and subject to change until the release of SPARC-X-API v2.0.
Experienced users can harness the power of SPARC and SPARC-X-API's socket communication layer to build efficient and flexible computational workflows. By integrating a socket communication interface directly into SPARC, users can significantly reduce the overhead typically associated with file I/O during calculation restarts. This feature is particularly beneficial for tasks involving repetitive operations like structural optimization and saddle point searches, where traditional file-based communication can become a bottleneck. The underlying software architecture is shown in Fig. 3:
Requirements: the SPARC binary must be manually compiled from source with socket support enabled via the `USE_SOCKET=1` flag (see the installation instructions).
The socket communication layer in SPARC and SPARC-X-API is designed for:
- Efficiency: Eliminates the need for intermediate file I/O, directly streaming data between processes.
- Speed: Enhances the performance of iterative calculations, crucial for large-scale simulations.
- Flexibility: Allows dynamic modification of calculation parameters without the need to restart the process.
The communication protocol implemented in SPARC and SPARC-X-API adheres to the i-PI protocol standard. Specifically, we implement the original i-PI protocol within the SPARC C source code, while the Python SPARC-X-API uses a backward-compatible protocol based on i-PI. This dual-mode design is aimed at both low-level and high-level interfacing of the DFT codes, providing the features shown in Fig. 4:
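For reference, i-PI message headers are fixed-width 12-byte ASCII strings padded with spaces, per the i-PI protocol specification. A minimal packing sketch (not SPARC-X-API's own code):

```python
HDRLEN = 12  # i-PI header length in bytes

def pack_header(msg: str) -> bytes:
    """Pad an i-PI command string (e.g. POSDATA, GETFORCE) to the fixed header width."""
    raw = msg.encode("ascii")
    assert len(raw) <= HDRLEN, "i-PI headers cannot exceed 12 bytes"
    return raw.ljust(HDRLEN)  # bytes.ljust pads with b' ' by default

print(pack_header("POSDATA"))    # b'POSDATA     '
print(len(pack_header("INIT")))  # 12
```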
Based on the scenarios, the socket communication layer can be accessed via the following approaches as shown in Fig. 5:
- SPARC binary only (Fig. 5 a)

  A SPARC binary with socket support can be readily coupled with any i-PI-compatible socket server, such as `ase.calculators.socketio.SocketIOCalculator`, for example:

  ```python
  from ase.calculators.socketio import SocketIOCalculator
  from subprocess import Popen

  calc = SocketIOCalculator(port=31415)
  with calc:
      # Start sparc as a background process
      process = Popen("mpirun -n 8 sparc -name SPARC -socket localhost:31415", shell=True)
      # Single point calculations
  process.kill()
  ```

  The end user is responsible for generating the input files and making sure the same atomic structures are used by `SocketIOCalculator` and the SPARC binary. This mode is also limited to running on a single machine.

- Local-only Mode (Fig. 5 b)

  Ideal for standalone calculations, this mode simulates a conventional calculator while benefiting from socket-based efficiency:

  ```python
  with SPARC(use_socket=True, **normal_parameters) as calc:
      # Execute single-point calculations
      ...
  ```

  For most users we recommend this mode when performing calculations on a single HPC node.
- Client (Relay) Mode (Fig. 5 c)

  In this mode, the `sparc.SPARC` calculator serves as a passive client that listens to a remote i-PI-compatible server. When messages are received on the client side, it relays them to a local SPARC binary and sends the results back through the socket pipe. The server side can be either a normal i-PI-compatible server (such as `SocketIOCalculator`) or a server-mode `sparc.SPARC` (see 4).

  Start the client with:

  ```python
  client = SPARC(use_socket=True,
                 socket_params=dict(host="host.address.com", port=31415))
  with client:
      client.run()
  ```

  Or via the command line:

  ```shell
  python -m sparc.client -s host:port
  ```

  Note: when running SPARC-X-API as a socket client, the atoms object can be omitted (if the server also runs the SPARC protocol). When new atomic positions and parameters arrive, the client will automatically determine whether it is necessary to restart the SPARC subprocess.
- Server Mode (Fig. 5 d)

  Paired with the client mode in (3), SPARC-X-API can run as a socket server, isolated from the node that performs the computation. This can be useful for highly distributed computational workflows. On the server node, run:

  ```python
  server_calc = SPARC(use_socket=True,
                      socket_params=dict(port=31415, server_only=True),
                      **normal_parameters)
  with server_calc:
      # Execute single point calculations for atoms_1
      # Execute single point calculations for atoms_2
      ...
  ```

  In this case, the server opens `0.0.0.0:31415` for connections. Make sure your server is directly accessible from the clients and that the port is not occupied. The socket server is capable of receiving `raw_results` directly from the clients, making it possible to access `server_calc.raw_results` without access to the file systems on the client side.
As shown in Fig. 4, the SPARC socket protocol design allows bidirectional control of internal SPARC routines. Local- or server-mode `sparc.SPARC` calculators can communicate with the SPARC binary via functions like `set_params`. This can be useful for applications like on-the-fly force field learning, electron density fitting, setting up boundary conditions, etc. Applications will be updated in both the SPARC and SPARC-X-API repositories.
Please refer to the troubleshooting guidelines.
A detailed description of how the API works can be found here.
The API changes compared to the older release (v0.1) are summarized here.
Please refer to our guidelines for contributors.