
Neural-Processes

This repository contains PyTorch implementations of the following Neural Process variants:

  • Neural Processes (NPs)
  • Attentive Neural Processes (ANPs)
  • Recurrent Attentive Neural Process (ANPRNN)

The model architectures follow the ones proposed in the papers.
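As background (this is not the repository's code): all three variants share the same NP blueprint — encode each context pair (x_i, y_i), aggregate the encodings with a permutation-invariant operation (a mean for NPs, attention for ANPs), and decode target inputs conditioned on the aggregate. A minimal NumPy sketch of the mean-aggregation step, with toy encoder weights:

```python
import numpy as np

def encode(x, y, w):
    # Toy per-point encoder: one linear layer + tanh applied to the
    # concatenated (x_i, y_i) pair. `w` is a (2, d) weight matrix.
    h = np.concatenate([x, y], axis=-1) @ w  # (n_context, d)
    return np.tanh(h)

def aggregate(h):
    # Mean over context points: order-invariant, so the representation
    # does not depend on how the context set is indexed.
    return h.mean(axis=0)  # (d,)

rng = np.random.default_rng(0)
x = rng.normal(size=(5, 1))   # 5 context inputs
y = np.sin(x)                 # matching outputs
w = rng.normal(size=(2, 8))   # toy encoder weights

r = aggregate(encode(x, y, w))
r_shuffled = aggregate(encode(x[::-1], y[::-1], w))
assert np.allclose(r, r_shuffled)  # permutation invariance holds
```

The attention variants replace the plain mean with a learned, query-dependent weighting of the context encodings, but the set-encoding idea is the same.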

This repository is the authors' implementation of a NeurIPS 2019 workshop paper.

Installation & Requirements

Python 3 is recommended, ideally inside a virtual environment. The following packages are required:

  • PyTorch
  • NumPy
  • Matplotlib

Descriptions

  • The Neural Process models are under the /neural_process_models folder.
  • /neural_process_models/modules contains the building blocks for NP networks, including MLPs, attention modules, and so on.
  • The data generation functions are under /misc.
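The exact generators in /misc may differ, but 1D function-regression data for NPs is typically a batch of random curves, each split into context and target points. A hedged NumPy sketch (the function and parameter names here are illustrative, not the repository's):

```python
import numpy as np

def sample_sine_task(rng, n_context=10, n_target=40):
    """Draw one random sine curve and split it into context/target sets.

    Illustrative only; the actual generators under /misc may differ.
    """
    amplitude = rng.uniform(0.5, 2.0)
    phase = rng.uniform(0.0, np.pi)
    x = rng.uniform(-3.0, 3.0, size=(n_context + n_target, 1))
    y = amplitude * np.sin(x + phase)
    # The first n_context points condition the model; the rest are predicted.
    return (x[:n_context], y[:n_context]), (x[n_context:], y[n_context:])

rng = np.random.default_rng(42)
(xc, yc), (xt, yt) = sample_sine_task(rng)
```

Each call yields a fresh task, so the model learns a distribution over functions rather than a single curve.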

Results

1D function regression (10 context points)

Usage

Simple example of training an Attentive Neural Process for 1D function regression:

python3 main_ANP_1d_regression.py
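Under the hood, scripts like this typically train by maximizing the predictive Gaussian log-likelihood of the target outputs (plus a KL term for the latent path in NPs/ANPs). A minimal NumPy sketch of the log-likelihood term alone (illustrative, not the script's actual loss code):

```python
import numpy as np

def gaussian_log_likelihood(y, mu, sigma):
    # Per-point log N(y | mu, sigma^2), summed over target points.
    # The decoder predicts mu and sigma for every target input.
    return np.sum(
        -0.5 * np.log(2 * np.pi * sigma**2) - (y - mu) ** 2 / (2 * sigma**2)
    )

y = np.array([0.1, -0.2, 0.5])       # target outputs (toy values)
mu = np.array([0.0, -0.1, 0.4])      # decoder means (toy values)
sigma = np.array([0.2, 0.2, 0.2])    # decoder std devs (toy values)

nll = -gaussian_log_likelihood(y, mu, sigma)  # minimized during training
```

Because the decoder predicts a variance as well as a mean, the model is rewarded for being confident only where the context actually pins the function down.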

For digit inpainting trained on MNIST data:

python3 main_ANP_mnist.py

See the Neural Process models in /neural_process_models for detailed examples of constructing NP models for specific tasks.

Acknowledgements

For any questions, please refer to

License

MIT
