# [ICLR 2026] NextHAM: Advancing Universal Deep Learning for Electronic-Structure Hamiltonian Prediction of Materials

This is the official implementation of the paper "Advancing Universal Deep Learning for Electronic-Structure Hamiltonian Prediction of Materials", accepted at ICLR 2026.
## Requirements

Both training and inference were conducted on high-performance hardware:

- GPU: 4x NVIDIA A800 (80 GiB VRAM each).
- Note: Due to the high dimensionality of Hamiltonian matrices, large GPU memory is recommended for optimal performance.
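To confirm that your GPUs have enough free memory before launching a job, you can query them with `nvidia-smi` (a generic check, not specific to this repository):

```bash
# List each GPU with its total and currently used memory (MiB)
nvidia-smi --query-gpu=index,name,memory.total,memory.used --format=csv
```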
## Installation

The software dependencies are managed via Conda. All required packages are specified in the `./environment.yml` file.
To set up the environment, run:

```bash
# Create the environment from the file
conda env create -f environment.yml

# Activate the environment
conda activate nextham
```
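After activation, a quick sanity check helps catch CUDA/driver mismatches early. The snippet below assumes PyTorch is among the dependencies pinned in `environment.yml` (adjust if the environment uses a different framework):

```bash
# Verify that the deep-learning backend sees the GPUs (assumes PyTorch is installed)
python -c "import torch; print(torch.__version__, torch.cuda.is_available(), torch.cuda.device_count())"
```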
## Dataset

The Materials-SOC-HAM dataset proposed in this paper is publicly available.

- Download: You can access the dataset via the link below:
  - Link: Materials-SOC-HAM Dataset
  - Extraction Code: `DGEo`
- Configuration: Once the dataset is downloaded and extracted to your local directory, you must update the file roots (see the sketch after this list):
  - Locate the following files: `datasets/train.txt`, `datasets/val.txt`, and `datasets/test.txt`.
  - Replace the placeholder `/your_path/` in each file with the absolute path to your local dataset directory.
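If your dataset lives at, say, `/data/Materials-SOC-HAM` (a hypothetical location; use your own), the substitution can be done in one pass with GNU `sed`:

```bash
# Replace the /your_path/ placeholder with your dataset root in all three split files
DATASET_ROOT=/data/Materials-SOC-HAM   # hypothetical path; adjust to your setup
sed -i "s|/your_path/|${DATASET_ROOT}/|g" datasets/train.txt datasets/val.txt datasets/test.txt
```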
## Pre-trained Models (Optional)

If you wish to train on new data, we highly recommend fine-tuning from our provided pre-trained models to achieve faster convergence.

- Download Link: NextHAM Pre-trained Weights
- Password: `QoYA`
- Setup: Please download the weights and place them in the `./pretrained_models` directory (a minimal sketch follows this list).
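A minimal sketch of the setup step, assuming the downloaded archive is named `nextham_pretrained.zip` (the actual filename may differ):

```bash
# Create the expected directory and unpack the downloaded weights into it
mkdir -p pretrained_models
unzip ~/Downloads/nextham_pretrained.zip -d pretrained_models/   # hypothetical archive name
```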
## Training

To start the model training and validation process, execute the following script:

```bash
sh scripts/train/train_val.sh
```
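For long runs, it is often convenient to keep the job alive after logout and capture its output; a generic shell pattern (not specific to this repository) is:

```bash
# Run training in the background and mirror stdout/stderr to a log file
nohup sh scripts/train/train_val.sh > train_val.log 2>&1 &

# Follow progress
tail -f train_val.log
```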
## Testing

To test the model after training, execute the following script:

```bash
sh scripts/test/test.sh
```
## Citation

```bibtex
@inproceedings{yin2026nextham,
  title={Advancing Universal Deep Learning for Electronic-Structure Hamiltonian Prediction of Materials},
  author={Yin, Shi and Dai, Zujian and Pan, Xinyang and He, Lixin},
  booktitle={International Conference on Learning Representations (ICLR)},
  year={2026}
}
```