Code implementation of the paper Accurate Interpolation for Scattered Data through Hierarchical Residual Refinement.
The implementation of this work is built upon the foundations of three existing projects: NIERT, NeuralSymbolicRegressionThatScales and TFR-HSS-Benchmark.
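The core idea of hierarchical residual refinement can be illustrated with a minimal, stdlib-only sketch. This is not the paper's model: a 1D nearest-neighbor interpolant stands in for the neural stages, and the coarse-to-fine subset schedule is a hypothetical simplification. Each level fits the residual left by the sum of the earlier levels.

```python
def nearest_neighbor(pts, vals, x):
    # Simple 1D scattered-data interpolant used as the per-level learner
    # (a stand-in for the neural stages in the paper).
    i = min(range(len(pts)), key=lambda j: abs(pts[j] - x))
    return vals[i]

def fit_residual_hierarchy(train_pts, train_vals, num_levels=3):
    # Each level fits the residual left by the sum of the earlier levels,
    # using a progressively denser subset of the observed points
    # (coarse-to-fine refinement).
    levels, residuals = [], list(train_vals)
    for level in range(num_levels):
        k = max(2, len(train_pts) * (level + 1) // num_levels)
        step = max(1, len(train_pts) // k)
        idx = list(range(0, len(train_pts), step))
        pts = [train_pts[i] for i in idx]
        vals = [residuals[i] for i in idx]
        levels.append((pts, vals))
        residuals = [residuals[i] - nearest_neighbor(pts, vals, train_pts[i])
                     for i in range(len(train_pts))]
    return levels

def predict(levels, x):
    # The final estimate is the coarse prediction plus all residual
    # corrections summed across levels.
    return sum(nearest_neighbor(pts, vals, x) for pts, vals in levels)
```

Because later levels see denser point subsets, each stage only has to correct what the coarser stages missed; at the observed points the summed corrections recover the data exactly.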
- We recommend using the conda package manager to create and manage the project's environment:
conda create -n hint python=3.7
conda activate hint
- Install the required third-party libraries by executing the following command:
pip install torch==1.8.1+cu111 torchvision==0.9.1+cu111 -f https://download.pytorch.org/whl/torch_stable.html
pip install -r requirements.txt
The Mathit-2D dataset construction process builds upon the work of NeuralSymbolicRegressionThatScales and NIERT.
Follow the steps below to create the dataset:
# generate training equations set
python3 -m src.data.mathit.run_dataset_creation --number_of_equations 1000000 --no-debug
# generate testing equations set
python3 -m src.data.mathit.run_dataset_creation --number_of_equations 150 --no-debug
mkdir -p mathit_data/test_set
# convert the newly created validation dataset to CSV format
python3 -m src.data.mathit.run_dataload_format_to_csv raw_test_path=mathit_data/data/raw_datasets/150
# remove the validation equations from the training set
python3 -m src.data.mathit.run_filter_from_already_existing --data_path mathit_data/data/raw_datasets/1000000 --csv_path mathit_data/test_set/test_nc.csv
python3 -m src.data.mathit.run_apply_filtering --data_path mathit_data/data/raw_datasets/1000000
By following these steps, you will generate the Mathit-2D dataset.
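Conceptually, each Mathit-2D sample pairs scattered observed points with target points drawn from the same sampled equation. The sketch below is a hypothetical illustration of that structure (the function name, point counts, and sampling range are assumptions, not the repository's actual pipeline).

```python
import math
import random

def make_interpolation_task(fn, num_observed=64, num_target=32, seed=0):
    # Hypothetical sketch of one Mathit-2D-style sample: observed and
    # target points are both scattered evaluations of the same function.
    rng = random.Random(seed)
    def sample(n):
        pts = [(rng.uniform(-1, 1), rng.uniform(-1, 1)) for _ in range(n)]
        return [(x, y, fn(x, y)) for x, y in pts]
    return {"observed": sample(num_observed), "target": sample(num_target)}

# Example equation instance (in the real pipeline, constants come from
# instantiating a sampled equation skeleton).
task = make_interpolation_task(lambda x, y: math.sin(3 * x) + 0.5 * y * y)
```

The model is trained to predict the target values given only the observed points.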
The PTV dataset can be obtained from here. It provides particle-velocity measurements for interpolating particle velocities and reconstructing velocity fields.
The TFRD datasets can be obtained from here. They are designed for reconstructing temperature fields from measurements taken by scattered temperature sensors.
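A TFRD-style task can be pictured as follows: start from a dense temperature field and keep only the values at a few scattered sensor cells, which the model must then use to reconstruct the full grid. This is a stdlib-only sketch with hypothetical names and sizes, not the benchmark's actual data loader.

```python
import random

def sensor_observations(field, num_sensors=16, seed=0):
    # Keep only the temperatures at randomly chosen sensor cells;
    # the reconstruction model sees these (i, j, value) triples.
    rng = random.Random(seed)
    h, w = len(field), len(field[0])
    cells = rng.sample([(i, j) for i in range(h) for j in range(w)],
                       num_sensors)
    return [(i, j, field[i][j]) for i, j in cells]

# A toy 8x8 "temperature field"; a reconstruction model would recover
# the full grid from the 16 sensor readings.
field = [[float(i + j) for j in range(8)] for i in range(8)]
obs = sensor_observations(field)
```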
Follow the steps below to train HINT:
# Training on Mathit-2D dataset
CUDA_VISIBLE_DEVICES="0,1" python main.py --config_path ./config/config_Mathit.yml
# Training on Perlin dataset
CUDA_VISIBLE_DEVICES="0" python main.py --config_path ./config/config_Perlin.yml
# Training on TFRD-ADlet dataset
CUDA_VISIBLE_DEVICES="0,1" python main.py --config_path ./config/config_TFR_adlet.yml
# Training on PTV dataset
CUDA_VISIBLE_DEVICES="0,1" python main.py --config_path ./config/config_PTV.yml
For the Mathit dataset, we first need to build a fixed interpolation-task test set from the equation-skeleton test set:
python main.py -m save_Mathit_testdataset_as_file
Then we can evaluate HINT on this fixed test set:
CUDA_VISIBLE_DEVICES="0,1" python main.py -m test_Mathit --resume_from_checkpoint path_of_hint_checkpoint
For evaluation on other datasets, just run:
CUDA_VISIBLE_DEVICES="0,1" python main.py -m test_<dataset_name> --resume_from_checkpoint path_of_hint_checkpoint
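Evaluation on interpolation tasks typically reports an error between predicted and ground-truth values at the target points; mean squared error is the common choice, sketched here (the exact metrics reported by this repository are defined in its config files).

```python
def mse(pred, target):
    # Mean squared error over target-point predictions, the kind of
    # metric typically reported for scattered-data interpolation.
    assert len(pred) == len(target)
    return sum((p - t) ** 2 for p, t in zip(pred, target)) / len(pred)

# e.g. mse([1.0, 2.0], [1.0, 4.0]) == 2.0
```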