
PDDL samples demonstrating features of the VS Code PDDL Extension

Planning Domain Definition Language (PDDL) samples demonstrating features of the VS Code PDDL extension.

Getting started

  1. Install Visual Studio Code, the free and lightweight editor for developers
  2. Install the PDDL Extension
  3. Clone this repository and open the folder in Visual Studio Code
  4. Open the Test Explorer (View > Open View ... > Test)
  5. Unfold the folders in the Test Explorer and run the samples by right-clicking on them...



Start from empty files by using the domain and problem snippets.
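For orientation, the snippets scaffold a domain/problem pair along these lines (the names below are illustrative, not taken from any of the samples):

```pddl
(define (domain sample-domain)
    (:requirements :strips :typing)
    (:types location robot)
    (:predicates (at ?r - robot ?l - location))
    (:action move
        :parameters (?r - robot ?from ?to - location)
        :precondition (at ?r ?from)
        :effect (and (not (at ?r ?from)) (at ?r ?to))
    )
)
```

```pddl
(define (problem sample-problem)
    (:domain sample-domain)
    (:objects r1 - robot a b - location)
    (:init (at r1 a))
    (:goal (at r1 b))
)
```

Fill in your own types, predicates and actions from there.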


Airport

This is a sample showing airport ground operations planning. It has an intentional bug. Can you find it?

Run the 1plane problem. The test case shows as failed. Export the plan to 1plane.pddl and save the file to disk. Select both 1plane.pddl and 1plane-expected.pddl in the File Explorer and choose Compare selected. See the difference? The aircraft is not getting re-fueled.

Now go back to the plan visualization and click on the action that is missing a pre-condition. If you select the correct one, you will see a hint.

Fix the bug and re-run the test case - it should pass now.
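The shape of such a fix is a single extra condition on the offending action. For example (an illustrative sketch only; the actual Airport domain uses different action and predicate names, and finding the real bug is left to you):

```pddl
(:action takeoff
    :parameters (?a - airplane ?r - runway)
    :precondition (and (at ?a ?r)
                       (refueled ?a)) ; the kind of condition a buggy domain might omit
    :effect (and (not (at ?a ?r)) (airborne ?a)))
```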

Export the plan to a .plan file, keeping the proposed default name. Run Tasks > Run task... > validate (with report) and observe that a .tex file gets created. Install a LaTeX Preview extension to open the preview.


This domain is interesting because it generates a small search space, so the planner does not run out of memory even when you ask it to search for a better plan (e.g. by using the -n flag in popf).

Open the PDDL files of one of the test cases and press Alt + P. When prompted, select specific options... and then type -n as the command-line option (or the equivalent for your planner).

The plan visualization shows multiple plans; you can select the plan you want (the bar visualizes each plan's metric) and compare them easily.


Blocksworld

This classic planning benchmark domain is implemented here using the problem templating approach. See the .ptest.json file for the definition of the generated test cases. Each test case is generated from a .json file defining the initial and goal states. This demonstrates a regression test suite.
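A case file in this style might look like the following (an illustrative sketch; the exact keys depend on the template and this is not one of the sample's actual files):

```json
{
    "blocks": ["a", "b", "c"],
    "init": { "on-table": ["a", "b"], "on": [["c", "a"]] },
    "goal": { "on": [["a", "b"], ["b", "c"]] }
}
```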

Driver log

This domain demonstrates problem file templating and the programmatic generation of a scalability test suite.

Run the script (requires Python 3.5+ to be installed) and refresh the Test Explorer. You will see a list of scalability tests for your planner. Run the suite to see how your planner struggles (or does not :-]). See how concisely the .json files capture the test case definitions, and check the logic in the problem.pddl template.
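The generator pattern itself is straightforward. A minimal sketch of such a script follows; the key names (`drivers`, `trucks`, `packages`), the output directory, and the file naming are assumptions for illustration, not the actual DriverLog script:

```python
import json
import os

def generate_cases(sizes, out_dir="cases"):
    """Write one .json test-case definition per problem size and return the paths."""
    os.makedirs(out_dir, exist_ok=True)
    paths = []
    for size in sizes:
        case = {
            "drivers": size,
            "trucks": size,
            "packages": 2 * size,  # scale the payload with the fleet size
        }
        path = os.path.join(out_dir, "case_{}.json".format(size))
        with open(path, "w") as f:
            json.dump(case, f, indent=2)
        paths.append(path)
    return paths
```

Each generated .json file becomes one test case in the .ptest.json suite, so growing the suite is just a matter of extending the list of sizes.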

Scripted Templating

For any more advanced data transformation during problem file generation, refer to the ScriptedTemplating sample. It shows how the same problem.pddl template may be populated by a static .json file or, by contrast, by data dynamically queried/transformed by a custom Python script. The script takes the arguments supplied via the .ptest.json test case manifest (i.e. the numbers 1, 2 and 3) and sums them up before outputting the result into the PDDL problem.

The script (or, for that matter, any custom program you may write) receives the templated problem on its standard input stream and writes the rendered template to standard output. VS Code just orchestrates the data flow during PDDL domain authoring and testing. When such a solution is deployed (e.g. as a planning service), it is easy to wrap the transform function with a Python Flask service.
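A minimal version of such a transform could look like this (the placeholder token `$(total)` is an assumption for illustration; the actual sample's template syntax may differ):

```python
def transform(template_text, args):
    """Sum the numeric manifest arguments and substitute the total into the template."""
    total = sum(int(a) for a in args)
    return template_text.replace("$(total)", str(total))

# When wired up via the .ptest.json manifest, VS Code pipes the templated
# problem into the script's stdin and reads the rendered problem from stdout:
#
#   import sys
#   sys.stdout.write(transform(sys.stdin.read(), sys.argv[1:]))
```

Because the transform is an ordinary function over text, the same code can later be exposed behind a Flask route without changes.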

This is an important pattern: it externalizes calculations or decision making from the planning domain when they are more efficient (or less complex) to perform outside the planning problem, which helps make the planning process faster.
