BEP_How_To_Train_Your_Dragon

Welcome,

This code is part of the Research Project 21/22 Q4 at TU Delft by M.R. Tromp, Q.B. Hofstede (add your name here). The topics of the projects are:

| Author | Project Title |
| ------ | ------------- |
| M.R. Tromp | Alternatives to the components of a Genetic Algorithm |
| Q.B. Hofstede | How to train your dragon: on the application of the Metropolis-Hastings algorithm for Program Synthesis |

Add your name and your project here :)

The results and data processing methods from these papers can be found in the Published-Results-Q4-2022 folder.

This codebase is an extension of the codebase provided for this research, which is called BEP_project_synthesis.

BEP_project_synthesis

This code is part of the BEP projects of F. Azimzade, B. Jenneboer, N. Matulewicz, S. Rasing and V. van Wieringen.

Source of training/test data

Robot and Pixel test/training data was generated by A. Cropper and S. Dumančić for their paper "Learning large logic programs by going beyond entailment." arXiv preprint arXiv:2004.09855 (2020).

The String test/training data was received from S. Dumančić who took them from the paper by Lin, Dianhuan, et al. "Bias reformulation for one-shot function induction." (2014).


How to run:

Option 1: main.py
You can run the main.py file to run experiments. It contains comments detailing how to configure an experiment, e.g. which domain(s) and which search algorithm(s) to use.

For the Metropolis-Hastings algorithm you can also specify the domain and the experiment parameters by calling main.py from the command line. For example:

python main.py "robot" "{'add_if_statement_random': 10,'add_loop_random': 10,'add_token_random': 10,'alpha': 1,'remove_token_random': 20,'start_over': 1,'type': 'metropolis_hastings'}"

You can of course store these parameters in a text file and generate them automatically with a script, for example through...
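As an illustration of the idea, here is a minimal sketch of such a script (not part of the repository); the domain name and parameter values are just the examples from the command above.

```python
# Hypothetical helper script (not part of this repository): builds a
# parameter dictionary and runs main.py with it, mirroring the command above.
import subprocess

params = {
    'add_if_statement_random': 10,
    'add_loop_random': 10,
    'add_token_random': 10,
    'alpha': 1,
    'remove_token_random': 20,
    'start_over': 1,
    'type': 'metropolis_hastings',
}

domain = "robot"  # example domain taken from the command above

# str(params) produces a dict literal similar to the quoted string in the example.
subprocess.run(["python", "main.py", domain, str(params)], check=True)
```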

Option 2: DelftBlue
If you have access to DelftBlue, the supercomputer run by the DHPC, you can create a series of SLURM jobs through the create_jobs.py file. That file also contains details on exactly how to customize the various experiments.
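As a rough sketch of the idea only (this is not the repository's create_jobs.py), a job-generating script could write one SLURM batch file per experiment, each of which calls main.py as in Option 1. The domains, parameters, and resource limits below are placeholders.

```python
# Rough sketch: generate SLURM batch scripts that each call main.py.
# NOT the repository's create_jobs.py; domains, parameters, and resource
# limits are placeholders and will differ on DelftBlue.
from pathlib import Path

DOMAINS = ["robot", "pixel", "string"]  # example domains
PARAMS = "{'alpha': 1,'start_over': 1,'type': 'metropolis_hastings'}"

for domain in DOMAINS:
    script = f"""#!/bin/bash
#SBATCH --job-name=mh_{domain}
#SBATCH --time=01:00:00
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=1

python main.py "{domain}" "{PARAMS}"
"""
    Path(f"job_{domain}.sh").write_text(script)
    # Submit each generated script with: sbatch job_<domain>.sh
```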

Results

The results will be output to the /results/ folder. This can be changed in the search/batch_run.py file's init_store_system method.

For example, at the time of writing this method includes the commands to save the results to your scratch drive on DelftBlue.


Genetic Algorithms

The code for this project is located in search/gen_prog/vanilla_GP_alternatives. The original code for VanillaGP is called vanilla_GP.py.

To run the alternatives code, make sure that main.py contains [VanillaGPReworked, "gpr"] in the list of search algorithms. If you want to change which alternative is used, change _type to the desired alternative. All other settings were kept the same as in the original VanillaGP, but can be changed.
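For orientation, a schematic (and only schematic) view of what that list in main.py might look like; the import paths below are assumptions and may not match the actual module layout.

```python
# Schematic only: the exact structure and import paths in main.py may differ.
from search.gen_prog.vanilla_GP_alternatives.vanilla_GP_reworked import VanillaGPReworked  # assumed path
from search.metropolis_hastings.metropolis_hastings import MetropolisHastings              # assumed path

# Each entry pairs a search-algorithm class with a short label.
search_algos = [
    [VanillaGPReworked, "gpr"],     # Genetic Algorithm alternatives (selected via _type)
    [MetropolisHastings, "metro"],  # Metropolis-Hastings
]
```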

Metropolis-Hastings Algorithm

The code for this project is located in search/metropolis_hastings. The original Metropolis-Hastings code is still present, but has been abstracted into a specific run configuration for the algorithm; see "How to run" for details.

Original parameters used in the paper by V. van Wieringen:

{
    'add_if_statement_end': 10,
    'add_loop_end': 10,
    'add_token_end': 10,
    'alpha': 1.2,
    'remove_token_random': 20,
    'start_over': 2,
    'type': 'metropolis'
}
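For example, for the robot domain (the domain here is just an example) these parameters correspond to a command-line call of the form shown in "How to run":

python main.py "robot" "{'add_if_statement_end': 10,'add_loop_end': 10,'add_token_end': 10,'alpha': 1.2,'remove_token_random': 20,'start_over': 2,'type': 'metropolis'}"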

To run the alternatives code:
First, make sure that main.py contains [MetropolisHastings, "metro"] in the list of search algorithms. Then, you can either:

  1. adjust the code in main.py
  2. call main.py with a domain and parameter argument. See "How to run" for an example.

Here is a list of possible parameters:

| Parameter | Values | Default | Description |
| --------- | ------ | ------- | ----------- |
| type | "metropolis", "metropolis_hastings", "metropolis_hastings_2" | "metropolis" | Decides which acceptance function to use; read the paper/code for details. |
| alpha | float >= 0 | 1 | Sets the normalization factor alpha. |
| add_token_* | natural number | 0 | Sets the weight of the add_token mutation. |
| remove_token_* | natural number | 0 | Sets the weight of the remove_token mutation. |
| add_loop_* | natural number | 0 | Sets the weight of the add_loop mutation. |
| add_if_statement_* | natural number | 0 | Sets the weight of the add_if_statement mutation. |
| start_over | natural number | 0 | Sets the weight of the start_over mutation. |

Note: Replace the '*' with "end" or "random" to choose the specific locality option
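The weights are presumably used to pick mutations in proportion to their values; this is an assumption based on the parameter names, not a description of the repository's actual implementation. A minimal sketch of that idea:

```python
# Illustration of weighted mutation selection (an assumption about what the
# weights mean, not the repository's actual code).
import random

weights = {
    'add_token_random': 10,
    'remove_token_random': 20,
    'add_loop_random': 10,
    'add_if_statement_random': 10,
    'start_over': 1,
}

# A mutation with weight 20 is chosen twice as often as one with weight 10.
mutation = random.choices(list(weights), weights=list(weights.values()), k=1)[0]
print(mutation)
```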
