Paired Variational Autoencoders - BERT

Language Model-Based Paired Variational Autoencoders for Robotic Language Learning

Last updated: 7 May 2024.

This code is partially adapted from work by Tatsuro Yamada (Copyright (c) 2018, Tatsuro Yamada).

Copyright (c) 2022, Ozan Özdemir <ozan.oezdemir@uni-hamburg.de>

Requirements

  • Python 3
  • PyTorch
  • NumPy
  • TensorBoard
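
The dependencies can be installed with pip, for example with the command below; the package names (torch, numpy, tensorboard) are assumed and no specific versions are pinned here:

$ pip install torch numpy tensorboard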

Implementation

PVAE & PVAE-BERT - PyTorch Implementation

Training Example

$ cd src
$ python main_pvae.py
  • main_pvae.py: trains the PVAE model
  • pvae.py: defines the PVAE and PVAE-BERT architectures (see the sketch after this list)
  • prae.py: defines the PRAE architecture
  • channel_separated_cae: defines the channel-separated CAE
  • standard_cae: defines the standard CAE
  • config.py: training and network configurations
  • data_util.py: reads and prepares the data
  • generation.py: translates instructions into actions
  • recognition.py: translates actions into descriptions
  • extraction.py: extracts shared representations
  • reproduction.py: reproduces the actions
  • lang2lang.py: reproduces the descriptions
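
For orientation, the following is a minimal, self-contained PyTorch sketch of the paired-VAE bottleneck that pvae.py builds on: two modality-specific encoders produce Gaussian latents via the reparameterization trick, and an alignment term pulls the paired latents together. All names, dimensions, and the loss weighting are illustrative assumptions, not the repository's actual code.

# Illustrative sketch of a paired-VAE bottleneck (not the repository's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianHead(nn.Module):
    """Projects a feature vector to a mean and log-variance and samples z."""
    def __init__(self, feat_dim: int, latent_dim: int):
        super().__init__()
        self.mu = nn.Linear(feat_dim, latent_dim)
        self.logvar = nn.Linear(feat_dim, latent_dim)

    def forward(self, h):
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return z, mu, logvar

def kl_divergence(mu, logvar):
    # KL(q(z|x) || N(0, I)), averaged over the batch
    return (-0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1)).mean()

# Hypothetical feature/latent sizes, for illustration only.
lang_head = GaussianHead(feat_dim=512, latent_dim=64)
act_head = GaussianHead(feat_dim=512, latent_dim=64)

lang_feat = torch.randn(8, 512)   # e.g. features from a language encoder (or BERT)
act_feat = torch.randn(8, 512)    # e.g. features from an action encoder

z_l, mu_l, logvar_l = lang_head(lang_feat)
z_a, mu_a, logvar_a = act_head(act_feat)

# A simple alignment term pulls the paired latent means together so either
# modality's latent can drive the other decoder; the 1.0 weight is an assumption.
loss = kl_divergence(mu_l, logvar_l) + kl_divergence(mu_a, logvar_a) \
       + 1.0 * F.mse_loss(mu_l, mu_a)

In the full model the sampled latents feed language and action decoders, which is what the generation, recognition, reproduction and lang2lang scripts above exercise.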

Trained PVAE-BERT Model

Available here

Citation

PVAE-BERT

@ARTICLE{OKWLW22,
  author={Özdemir, Ozan and Kerzel, Matthias and Weber, Cornelius and Hee Lee, Jae and Wermter, Stefan},
  journal={IEEE Transactions on Cognitive and Developmental Systems}, 
  title={Language-Model-Based Paired Variational Autoencoders for Robotic Language Learning}, 
  year={2023},
  volume={15},
  number={4},
  pages={1812-1824},
  doi={10.1109/TCDS.2022.3204452}}
