bin123apple/OpenMP-Fortran-CPP-Translation
Fortran-CPP-HPC-code-translation-dataset

Our paper is available at http://arxiv.org/abs/2307.07686.

This repository contains the training and testing datasets and a simple test script.

We collected data from three different sources:

Polybench

NAS Parallel Benchmarks

dataracebench

You can also download the dataset from my Hugging Face page (Bin12345/HPC_Fortran_CPP).

Here is one data pair example:
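The example pair appears as an image in the original page. As an illustrative stand-in (this pair is written for this description and is not copied verbatim from the dataset; the field names `fortran` and `cpp` are assumptions about the schema), a pair looks roughly like:

```python
# Illustrative OpenMP Fortran/C++ pair, stored as Python strings.
# The field names "fortran" and "cpp" are assumed, not taken from the dataset.
pair = {
    "fortran": (
        "!$OMP PARALLEL DO\n"
        "do i = 1, n\n"
        "   c(i) = a(i) + b(i)\n"
        "end do\n"
        "!$OMP END PARALLEL DO\n"
    ),
    "cpp": (
        "#pragma omp parallel for\n"
        "for (int i = 0; i < n; i++) {\n"
        "    c[i] = a[i] + b[i];\n"
        "}\n"
    ),
}

# Print both sides of the pair for inspection.
print(pair["fortran"])
print(pair["cpp"])
```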

We will add more data pairs in the future, as well as a new "natural language" column for the code generation task.

Reproduce our results

The process mainly consists of two steps; for a detailed walkthrough, please check this Google Colab:

https://colab.research.google.com/drive/1QqkGskaPPUKvjzwn_dmaV9z3yB9z2Vyu

A brief description of each step is shown below.

  1. Fine-tune the model using DeepSpeed:
deepspeed --master_port 12345 main.py \
   --data_path Bin12345/HPC_Fortran_CPP \
   --model_name_or_path path/to/starcoder_model \
   --per_device_train_batch_size 1 \
   --per_device_eval_batch_size 1 \
   --max_seq_len 128 \
   --learning_rate 9.65e-6 \
   --weight_decay 0.1 \
   --num_train_epochs 3 \
   --gradient_accumulation_steps 2 \
   --lr_scheduler_type cosine \
   --num_warmup_steps 0 \
   --seed 1234 \
   --zero_stage $ZERO_STAGE \
   --deepspeed \
   --output_dir $OUTPUT \
   &> $OUTPUT/training.log
  2. Use the fine-tuned model to generate output from the prompts. You can try our simple test scripts; note that different models may require slightly different handling.
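The inference step can be sketched as follows. This is a hedged sketch rather than the repository's actual test script: the prompt template and the helper name `build_prompt` are invented for illustration, and the commented generation lines assume the `transformers` library and a local fine-tuned checkpoint path.

```python
# Hypothetical prompt wrapper for the translation task; the exact template
# used by the repository's test scripts may differ.
def build_prompt(fortran_code: str) -> str:
    """Wrap a Fortran snippet into a Fortran-to-C++ translation prompt."""
    return (
        "Translate the following OpenMP Fortran code to C++:\n\n"
        + fortran_code.strip()
        + "\n\nC++ translation:\n"
    )

prompt = build_prompt(
    "!$OMP PARALLEL DO\ndo i = 1, n\n   c(i) = a(i) + b(i)\nend do"
)
print(prompt)

# With a fine-tuned checkpoint, generation would then look roughly like
# (assumes transformers is installed and "path/to/finetuned_model" exists):
#
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained("path/to/finetuned_model")
#   model = AutoModelForCausalLM.from_pretrained("path/to/finetuned_model")
#   out = model.generate(**tok(prompt, return_tensors="pt"), max_new_tokens=128)
#   print(tok.decode(out[0], skip_special_tokens=True))
```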


This work is licensed under a Creative Commons Attribution 4.0 International License.


About

This repo contains the dataset for the paper "Creating a Dataset Supporting Translation Between OpenMP Fortran and C++ Code".
