
MolSpectra: Pre-training 3D Molecular Representation with Multi-modal Energy Spectra

[Figure: model]

This repository contains the official implementation of the ICLR 2025 paper "MolSpectra: Pre-training 3D Molecular Representation with Multi-modal Energy Spectra".

Usage

Download dataset: Download the processed QM9S dataset ("qm9sp.zip") from this link and select "Files" to access it. Unzip the archive into a target folder, then update the dataset_root entry in examples/ET-QM9-QM9SP-PT.yaml and examples/ET-MD17-QM9SP-PT.yaml so that it points to that folder (see the sketch below).
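A minimal sketch of the dataset setup, assuming qm9sp.zip has already been downloaded to the working directory; the target path /data/qm9sp and the assumption that dataset_root is a top-level key in the YAML files are illustrative only:

    # Unzip the processed QM9S dataset into a target folder (example path)
    unzip qm9sp.zip -d /data/qm9sp

    # Point dataset_root in both pre-training configs at that folder
    # (assumes dataset_root is a top-level key; otherwise edit the files by hand)
    sed -i 's|^dataset_root:.*|dataset_root: /data/qm9sp|' \
        examples/ET-QM9-QM9SP-PT.yaml examples/ET-MD17-QM9SP-PT.yaml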

Pre-train and fine-tune: Refer to the scripts in the scripts/ directory for pre-training and fine-tuning; a hedged example invocation is sketched below.
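The script names below are hypothetical placeholders (the actual file names live in scripts/ and are not listed in this README); they only illustrate the intended pre-train-then-fine-tune workflow:

    # Hypothetical script names; check scripts/ for the actual ones
    bash scripts/pretrain.sh     # pre-train with examples/ET-QM9-QM9SP-PT.yaml
    bash scripts/finetune.sh     # fine-tune the pre-trained checkpoint on a downstream task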

Requirements

  • Python >= 3.10
  • torch>=2.3.1
  • torch_cluster>=1.6.3
  • torch_geometric>=2.6.1
  • torch_scatter>=2.1.2
  • ase>=3.23.0
  • h5py>=3.11.0
  • matplotlib>=3.10.0
  • numpy>=1.26.3
  • pytorch_lightning>=1.3.8
  • PyYAML>=5.4.1
  • tqdm>=4.66.5
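The Python packages above can be installed with pip, for example as follows; note that torch_scatter and torch_cluster typically need wheels built against the installed torch/CUDA version, so this one-liner is only a sketch:

    pip install "torch>=2.3.1" "torch_geometric>=2.6.1" "torch_scatter>=2.1.2" \
        "torch_cluster>=1.6.3" "ase>=3.23.0" "h5py>=3.11.0" "matplotlib>=3.10.0" \
        "numpy>=1.26.3" "pytorch_lightning>=1.3.8" "PyYAML>=5.4.1" "tqdm>=4.66.5"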

Citation

Please cite our paper if you use the code:

@inproceedings{wang2025molspectra,
  author       = {Liang Wang and Shaozhen Liu and Yu Rong and Deli Zhao and Qiang Liu and Shu Wu and Liang Wang},
  title        = {MolSpectra: Pre-training 3D Molecular Representation with Multi-modal Energy Spectra},
  booktitle    = {ICLR},
  year         = {2025}
}
