rycolab/rnn-turing-completeness

Code accompanying the EMNLP 2023 publication "On the Representational Capacity of Recurrent Neural Language Models".

This repository contains a library for proving the Turing completeness of recurrent neural network language models. Specifically, it includes implementations of weighted single-stack and two-stack pushdown automata encoded in recurrent neural nets (RNNs). Since two-stack pushdown automata are equivalent to Turing machines, this shows by construction that RNNs are Turing complete and can even simulate certain kinds of probabilistic Turing machines.

Implemented by Ryan Cotterell and Anej Svete. The original inspiration for the "proof by code" is Siegelmann and Sontag (1995).
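
To give a flavor of the Siegelmann and Sontag (1995) style of construction, the sketch below (plain Python, not part of this repository's API; all names here are illustrative only) shows how a binary stack can be encoded as a single rational number and manipulated with affine maps plus a saturated activation, the kinds of operations an RNN update can realize.

```python
from fractions import Fraction

def sigma(x):
    """Saturated linear activation: 0 below 0, identity on [0, 1], 1 above 1."""
    return min(max(x, Fraction(0)), Fraction(1))

def encode(bits):
    """Encode a binary stack (top of stack first) as a rational in [0, 1):
    bit b contributes the base-4 digit 2*b + 1, i.e., 0 -> 1 and 1 -> 3."""
    c = Fraction(0)
    for b in reversed(bits):
        c = (c + 2 * b + 1) / 4
    return c

def push(c, b):
    """Push bit b: an affine map of the current encoding."""
    return (c + 2 * b + 1) / 4

def top(c):
    """Read the top bit with one saturation: 4c - 2 lands in [1, 2) for a 1,
    in [-1, 0) for a 0, and at -2 for the empty stack."""
    return sigma(4 * c - 2)

def pop(c):
    """Pop the top bit: again an affine map, using the saturated readout."""
    return 4 * c - 2 * top(c) - 1

# Example: push 1, then 0, then read and pop twice.
c = encode([])      # empty stack, encoded as 0
c = push(c, 1)      # stack: [1]
c = push(c, 0)      # stack: [0, 1] (top first)
assert top(c) == 0
c = pop(c)          # stack: [1]
assert top(c) == 1
```

Two such stacks driven by a finite-state controller suffice to simulate a Turing machine, which is why encoding two-stack pushdown automata in an RNN yields the Turing-completeness result.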

Getting started with the code

Clone the repository:

$ git clone https://github.com/rycolab/rnn-turing-completeness.git
$ cd rnn-turing-completeness

At this point it may be beneficial to create a new Python virtual environment. There are multiple tools for this step, including Miniconda. We target Python 3.10 or later.
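
For example, with Miniconda (the environment name below is arbitrary):

$ conda create -n rnn-turing-completeness python=3.10
$ conda activate rnn-turing-completeness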

Then, install the package in editable mode:

$ pip install -e .

We use black and flake8 to lint the code, pytype to check that the types agree, and pytest to unit-test the code.
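
For example, these checks can be run locally as follows (exact flags and configuration may differ):

$ black --check .
$ flake8 .
$ pytype .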

To unit-test the code, run:

$ pytest .
