
./img/latplanlogo-simple.svg.png

LatPlan : A domain-independent, image-based classical planner

Use tagged versions for reliable reproduction of the results.

  • NEWS Initial release: Published after AAAI18.
  • NEWS Updates on version 2: Mainly refactoring. The AAAI18 experiments still work.
  • NEWS Updates on version 2.1: Version 2 was not really working and I finally had time to fix it. Added ZSAE experiments.
  • NEWS Updates on version 2.2: Backported more minor improvements from the private repository.
  • NEWS Updates on version 3.0: Updates for the DSAMA system. More refactoring was done on the learner code. We newly introduced ama3-planner, which takes a PDDL domain file.
  • NEWS Updates on version 4.0: Updates for Cube-Space AE in IJCAI2020 paper.
    • This is a version that can exploit the full potential of search heuristics (lower bounds in Branch-and-Bound), such as LM-cut or Bisimulation Merge-and-Shrink in Fast Downward.
    • We added the tuned hyperparameters to the repository so that reproducing the experiments will be easy(ish).
    • We included a script for 15-puzzle instances randomly sampled from the states 14 or 21 steps away from the goal.
    • We also included a script for Korf's 100 instances of the 15-puzzle, but the accuracy was not sufficient on those problems, where the shortest path lengths are typically around 50. Problems that require deeper searches also require more model accuracy because errors accumulate in each state transition.
  • NEWS Updates on version 4.1:
    • Improved the installation procedure. I now recommend a conda-based installation, which specifies the correct Keras + TF versions.
    • The repository is now reorganized so that the library code goes into the latplan directory and all other scripts remain in the root directory. During this migration I used the git-filter-repo utility, which rewrites the history. This may have broken the older tags; I will inspect the breakage and fix them soon.
  • NEWS Updates on version 4.1.1, 4.1.2: The project is loadable with Anaconda and is pip-installable (somewhat).
  • NEWS Updates on version 4.1.3: Minor refactoring. We also released the trained weights. See Releases.
    • Future plan: I am currently thinking about porting Latplan to PyTorch, as well as allowing integration with Gym environments, especially PDDLGym.

This repository contains the source code of LatPlan.

  • Asai, M.; Muise, C.: 2020. Learning Neural-Symbolic Descriptive Planning Models via Cube-Space Priors: The Voyage Home (to STRIPS).
  • Asai, M.: 2019. Neural-Symbolic Descriptive Action Model from Images: The Search for STRIPS.
  • Asai, M.: 2019. Unsupervised Grounding of Plannable First-Order Logic Representation from Images (code available from https://github.com/guicho271828/latplan-fosae).
  • Asai, M.; Kajino, H.: 2019. Towards Stable Symbol Grounding with Zero-Suppressed State AutoEncoder.
  • Asai, M.; Fukunaga, A.: 2018. Classical Planning in Deep Latent Space: Breaking the Subsymbolic-Symbolic Boundary.
  • Asai, M.; Fukunaga, A.: 2017. Classical Planning in Deep Latent Space: From Unlabeled Images to PDDL (and back).
    • In Knowledge Engineering for Planning and Scheduling (KEPS) Workshop (ICAPS2017).
    • In the Cognitum Workshop at IJCAI-2017.
    • In Neural-Symbolic Workshop 2017.

Setup

On Ubuntu, prerequisites can be installed by launching ./install.sh (it requires sudo several times). OSX users should be able to find the equivalents in Homebrew. The requirements are listed below.

General Requirements

Python 3.5 or later is required. The remaining system packages are listed below; a manual installation command collecting them is sketched after the list.

  • mercurial g++ cmake make python flex bison g++-multilib — these are required for compiling Fast Downward.
  • git build-essential automake libcurl4-openssl-dev — these are required for compiling Roswell (http://roswell.github.io/). OSX users should use brew install roswell.
  • gnuplot — for plotting.
  • parallel — for running some scripts.
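
For reference, a manual equivalent of what ./install.sh does, assuming an Ubuntu system (the package names are taken from the list above and may differ across Ubuntu releases; ./install.sh remains the supported route):

# hypothetical one-shot installation of the system packages listed above
sudo apt-get install mercurial g++ cmake make python flex bison g++-multilib \
                     git build-essential automake libcurl4-openssl-dev \
                     gnuplot parallel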

Python Dependency Installation with Anaconda / Miniconda (recommended)

Anaconda / Miniconda (https://docs.conda.io/en/latest/miniconda.html) is a Python version, dependency, and environment management system. We recommend Miniconda, as it is smaller.

Run conda env create -f environment.yml, then conda activate latplan.

Also run ./setup.py install, which installs latplan.
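
Collected in one place, the conda-based installation looks like this, assuming you run it from the repository root:

# create the pinned environment and activate it
conda env create -f environment.yml
conda activate latplan
# install the latplan package into the environment
./setup.py install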

Python Dependency Installation without Anaconda / Miniconda on Ubuntu

You should install python3-pip and python3-pil from the APT repository. Afterwards, run ./setup.py install, which installs latplan.
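
A minimal sketch of this route, assuming an Ubuntu system with sudo access:

# install the prerequisites from the APT repository
sudo apt-get install python3-pip python3-pil
# install the latplan package
./setup.py install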

Running

Next, customize the following files for your job scheduler before running. The job submission command is stored in a variable $common, which by default has a value like jbsub -mem 32g -cores 1+1 -queue x86_24h. You also need to uncomment the commands you want to run; by default, everything is commented out and nothing runs. A hypothetical example of this customization follows the command list below.

# You first need to set up a dataset.
./setup-dataset.sh

# This script launches the training for Cube-Space AEs, as well as SAEs used for AMA2.
./train_all.sh

# This script extracts PDDL files from the Cube-Space AE training results.
./train_others.sh

# This script launches the training for AAE, AD and SD for AMA2.
# The number of actions in AAE is tuned by the hyperparameter tuner.
./train_aae.sh

# This script trains AAEs with a fixed number of actions without tuning.
# It was used in the SAE + Cube-AAE experiments.
./train_aae-fixedactions.sh

# When the training has finished, generate the problem instances.
# This script samples the initial states from the frontier of a Dijkstra search.
(cd problem-instances; ./example-dijkstra.sh)
# This script generates 15-puzzle instances.
(cd problem-instances-16; ./example-dijkstra.sh)
# This script generates Korf's 100 instances for 15-puzzle.
(cd problem-instances-16-korf; ./example-korf.sh)


# Modify these scripts to adjust the job submission commands for your job scheduler.
./run_ama2_all.sh 
./run_ama3_all.sh 
./run_ama3_all-16.sh
./run_ama3_all-16-korf.sh
./run_ama3_all-cube-aae.sh

# After the experiments, run this script to generate the tables and figures.
# For details, read the source code.
./generate-all-csv.sh
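
As a hypothetical illustration of the scheduler customization mentioned above (the exact structure of each run_*.sh script may differ, so treat this as a sketch rather than the scripts' actual contents):

# default: submit each experiment through an LSF-style scheduler
common="jbsub -mem 32g -cores 1+1 -queue x86_24h"
# for a local machine, an empty prefix could be used instead
# common=""
# finally, uncomment the experiment commands further down in the script;
# everything is commented out by default and nothing runs until you do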

File structure

  • Library code
    latplan/model.py
    network definitions.
    latplan/util/
    contains general-purpose utility functions for python code.
    latplan/puzzles/
    code for domain generators/validators.
    puzzles/*.py
    each file represents a domain.
    puzzles/model/*.py
    the core model (successor rules, etc.) of the domain. This is disentangled from the images.
  • Scripts
    config.py, config_cpu.py
    Keras/TensorFlow configuration.
    strips.py
    (Bad name!) The program for training an SAE; it writes the propositional encoding of states/transitions to a CSV file.
    state_discriminator3.py
    The program for training an SD.
    action_autoencoder.py
    The program for training an AAE.
    action_discriminator.py
    The program for training an AD.
    ama1-planner.py
    Latplan using AMA1.
    ama2-planner.py
    Latplan using AMA2.
    ama3-planner.py
    Latplan using the visual input (init, goal) and a PDDL domain file.
    run_ama{1,2,3}_all.sh
    Run all experiments.
    various sh files
    supporting scripts.
    helper/
    helper scripts for AMA1.
  • tests/
    test files, mostly the unit tests for the domain generators/validators.
  • samples/
    where the learned results should go. Each SAE training result is stored in a subdirectory.
  • problem-instances/
    where the input problem instances / experimental results should go.
  • planner-scripts/ (git submodule)
    My personal scripts for invoking domain-independent planners, not just Fast Downward.

Gallery

./img/hanoi_4_3_36_81_conv_blind_path_0.png ./img/lightsout_digital_4_36_20000_conv_Astar_path_0.png ./img/lightsout_twisted_4_36_20000_conv_Astar_path_0.png ./img/puzzle_mandrill_3_3_36_20000_conv_blind_path_0.png ./img/puzzle_mnist_3_3_36_20000_conv_blind_path_0.png ./img/puzzle_spider_3_3_36_20000_conv_blind_path_0.png
