Rename package to keras_lmu #24

Merged
merged 1 commit on Nov 6, 2020
2 changes: 1 addition & 1 deletion .gitignore
@@ -1,4 +1,4 @@
-lmu.egg-info
+keras_lmu.egg-info
*.swo
*.swp
*.ipynb_checkpoints
19 changes: 9 additions & 10 deletions .nengobones.yml
@@ -1,7 +1,7 @@
-project_name: NengoLMU
-pkg_name: lmu
-repo_name: nengo/lmu
-description: Legendre Memory Units
+project_name: KerasLMU
+pkg_name: keras_lmu
+repo_name: nengo/keras-lmu
+description: Keras implementation of Legendre Memory Units

copyright_start: 2019

@@ -16,7 +16,7 @@ manifest_in: {}
setup_py:
python_requires: ">=3.6"
install_req:
-  - scipy
+  - scipy>=1.0.0
- tensorflow>=2.1.0
tests_req:
- pytest>=6.1.0
@@ -31,7 +31,6 @@ setup_py:
- numpydoc>=0.6
classifiers:
- "Development Status :: 3 - Alpha"
-  - "Framework :: Nengo"
- "Intended Audience :: Science/Research"
- "License :: Free for non-commercial use"
- "Operating System :: OS Independent"
@@ -51,10 +50,10 @@ docs_conf_py:
html_redirects:
getting_started.html: getting-started.html
autoautosummary_change_modules:
-  lmu:
-    - lmu.layers.LMUCell
-    - lmu.layers.LMU
-    - lmu.layers.LMUFFT
+  keras_lmu:
+    - keras_lmu.layers.LMUCell
+    - keras_lmu.layers.LMU
+    - keras_lmu.layers.LMUFFT
extensions:
- nengo_sphinx_theme.ext.autoautosummary
doctest_setup:
15 changes: 13 additions & 2 deletions CHANGES.rst
@@ -19,9 +19,20 @@ Release history
- Removed
- Fixed

-0.2.1 (unreleased)
+0.3.0 (unreleased)
==================

+**Changed**
+
+- Renamed module from ``lmu`` to ``keras_lmu`` (so it will now be imported via
+  ``import keras_lmu``), renamed package from ``lmu`` to
+  ``keras-lmu`` (so it will now be installed via ``pip install keras-lmu``), and
+  changed any references to "NengoLMU" to "KerasLMU" (since this implementation is
+  based on the Keras framework rather than Nengo). In the future the ``lmu``
+  namespace will be used as a meta-package to encapsulate LMU implementations in
+  different frameworks. (`#24`_)
+
+.. _#24: https://github.com/abr/lmu/pull/24
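As a hedged illustration of the renames described in this changelog entry, a small helper like the following (hypothetical, not part of the package, and independent of whether ``keras_lmu`` is installed) could rewrite old references in docs or scripts:

```python
# Hypothetical migration helper illustrating the renames above.
# The old/new name pairs come from the changelog entry; nothing here
# imports or depends on the actual keras_lmu package.
RENAMES = {
    "import lmu": "import keras_lmu",            # module rename
    "pip install lmu": "pip install keras-lmu",  # package rename
    "NengoLMU": "KerasLMU",                      # project rename
}


def migrate(line: str) -> str:
    """Rewrite a single line of docs or code to use the new names."""
    for old, new in RENAMES.items():
        line = line.replace(old, new)
    return line


print(migrate("import lmu  # NengoLMU"))  # -> import keras_lmu  # KerasLMU
```

Note this naive string replacement is only a sketch; it would not catch usages such as ``from lmu import LMU``.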

0.2.0 (November 2, 2020)
========================
@@ -80,7 +91,7 @@ Release history
0.1.0 (June 22, 2020)
=====================

-Initial release of NengoLMU 0.1.0! Supports Python 3.5+.
+Initial release of KerasLMU 0.1.0! Supports Python 3.5+.

The API is considered unstable; parts are likely to change in the future.

10 changes: 5 additions & 5 deletions CONTRIBUTING.rst
@@ -1,19 +1,19 @@
.. Automatically generated by nengo-bones, do not edit this file directly

************************
-Contributing to NengoLMU
+Contributing to KerasLMU
************************

Issues and pull requests are always welcome!
-We appreciate help from the community to make NengoLMU better.
+We appreciate help from the community to make KerasLMU better.

Filing issues
=============

-If you find a bug in NengoLMU,
+If you find a bug in KerasLMU,
or think that a certain feature is missing,
please consider
-`filing an issue <https://github.com/nengo/lmu/issues>`_!
+`filing an issue <https://github.com/nengo/keras-lmu/issues>`_!
Please search the currently open issues first
to see if your bug or feature request already exists.
If so, feel free to add a comment to the issue
@@ -22,7 +22,7 @@ so that we know that multiple people are affected.
Making pull requests
====================

-If you want to fix a bug or add a feature to NengoLMU,
+If you want to fix a bug or add a feature to KerasLMU,
we welcome pull requests.
Ensure that you fill out all sections of the pull request template,
deleting the comments as you go.
6 changes: 3 additions & 3 deletions CONTRIBUTORS.rst
@@ -1,11 +1,11 @@
.. Automatically generated by nengo-bones, do not edit this file directly

*********************
-NengoLMU contributors
+KerasLMU contributors
*********************

-See https://github.com/nengo/lmu/graphs/contributors
-for a list of the people who have committed to NengoLMU.
+See https://github.com/nengo/keras-lmu/graphs/contributors
+for a list of the people who have committed to KerasLMU.
Thank you for your contributions!

For the full list of the many contributors to the Nengo ecosystem,
8 changes: 4 additions & 4 deletions LICENSE.rst
@@ -1,18 +1,18 @@
.. Automatically generated by nengo-bones, do not edit this file directly

****************
-NengoLMU license
+KerasLMU license
****************

Copyright (c) 2019-2020 Applied Brain Research

-NengoLMU is made available under a proprietary license
+KerasLMU is made available under a proprietary license
that permits using, copying, sharing, and making derivative works from
-NengoLMU and its source code for any non-commercial purpose,
+KerasLMU and its source code for any non-commercial purpose,
as long as the above copyright notice and this permission notice
are included in all copies or substantial portions of the software.

-If you would like to use NengoLMU commercially,
+If you would like to use KerasLMU commercially,
licenses can be purchased from Applied Brain Research.
Please contact info@appliedbrainresearch.com for more information.

6 changes: 3 additions & 3 deletions README.rst
@@ -1,9 +1,9 @@
-Legendre Memory Units: Continuous-Time Representation in Recurrent Neural Networks
-----------------------------------------------------------------------------------
+KerasLMU: Recurrent neural networks using Legendre Memory Units
+---------------------------------------------------------------

`Paper <https://papers.nips.cc/paper/9689-legendre-memory-units-continuous-time-representation-in-recurrent-neural-networks.pdf>`_

-This is a python software library containing various implementations of the
+This is a Keras-based implementation of the
Legendre Memory Unit (LMU). The LMU is a novel memory cell for recurrent neural
networks that dynamically maintains information across long windows of time using
relatively few resources. It has been shown to perform as well as standard LSTM or
4 changes: 2 additions & 2 deletions docs/api-reference.rst
@@ -9,7 +9,7 @@ API reference
LMU Layers
==========

-.. automodule:: lmu.layers
+.. automodule:: keras_lmu.layers

-.. autoautosummary:: lmu.layers
+.. autoautosummary:: keras_lmu.layers
:nosignatures:
10 changes: 5 additions & 5 deletions docs/basic-usage.rst
@@ -5,15 +5,15 @@ Basic usage
***********

The standard Legendre Memory Unit (LMU) layer
-implementation in NengoLMU is defined in the
-``lmu.LMU`` class. The following code creates
+implementation in KerasLMU is defined in the
+``keras_lmu.LMU`` class. The following code creates
a new LMU layer:

.. testcode::

-    import lmu
+    import keras_lmu

-    lmu_layer = lmu.LMU(
+    lmu_layer = keras_lmu.LMU(
memory_d=1,
order=256,
theta=784,
@@ -30,7 +30,7 @@ and ``units`` represents the dimensionality of the hidden component.
To learn more about these parameters, check out
the :ref:`LMU class API reference <api-reference-lc>`.
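As a rough, hedged sketch of what these parameters mean for layer sizes (the ``memory_d * order`` relationship is drawn from the LMU paper rather than the ``keras_lmu`` source, so treat it as an assumption), the arithmetic can be written out in plain Python:

```python
# Back-of-the-envelope sketch of the sizes implied by the parameters above.
# Assumption (from the LMU paper, not verified against the keras_lmu source):
# the memory component maintains memory_d * order state variables.
memory_d = 1   # dimensionality of the signal written into the memory
order = 256    # number of Legendre basis functions (memory dimensions)
theta = 784    # length of the sliding window, in timesteps

memory_state_size = memory_d * order
print(memory_state_size)  # -> 256
```

Under this assumption, increasing ``order`` grows the memory state linearly, while ``theta`` changes only the window of time the memory represents, not its size.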

-Creating NengoLMU layers
+Creating KerasLMU layers
------------------------

The ``LMU`` class functions as a standard
2 changes: 1 addition & 1 deletion docs/citation.rst
@@ -2,7 +2,7 @@
Citation
********

-If you would like to cite NengoLMU in your research, please cite `this
+If you would like to cite KerasLMU in your research, please cite `this
paper <http://compneuro.uwaterloo.ca/files/publications/voelker.2019.lmu.pdf>`_:

Aaron R. Voelker, Ivana Kajić, and Chris Eliasmith. Legendre Memory Units:
30 changes: 15 additions & 15 deletions docs/conf.py
@@ -4,7 +4,7 @@

import os

-import lmu
+import keras_lmu

extensions = [
"sphinx.ext.autodoc",
@@ -31,7 +31,7 @@

# -- sphinx.ext.doctest
doctest_global_setup = """
-import lmu
+import keras_lmu
import numpy as np
import tensorflow as tf
"""
@@ -52,23 +52,23 @@

# -- notfound.extension
notfound_template = "404.html"
-notfound_urls_prefix = "/lmu/"
+notfound_urls_prefix = "/keras-lmu/"

# -- numpydoc config
numpydoc_show_class_members = False

# -- nengo_sphinx_theme.ext.autoautosummary
autoautosummary_change_modules = {
-    "lmu": [
-        "lmu.layers.LMUCell",
-        "lmu.layers.LMU",
-        "lmu.layers.LMUFFT",
+    "keras_lmu": [
+        "keras_lmu.layers.LMUCell",
+        "keras_lmu.layers.LMU",
+        "keras_lmu.layers.LMUFFT",
],
}

# -- nengo_sphinx_theme.ext.sourcelinks
-sourcelinks_module = "lmu"
-sourcelinks_url = "https://github.com/nengo/lmu"
+sourcelinks_module = "keras_lmu"
+sourcelinks_url = "https://github.com/nengo/keras-lmu"

# -- sphinx
nitpicky = True
@@ -84,20 +84,20 @@
linkcheck_anchors = True
default_role = "py:obj"
pygments_style = "sphinx"
-user_agent = "lmu"
+user_agent = "keras_lmu"

-project = "NengoLMU"
+project = "KerasLMU"
authors = "Applied Brain Research"
copyright = "2019-2020 Applied Brain Research"
-version = ".".join(lmu.__version__.split(".")[:2])  # Short X.Y version
-release = lmu.__version__  # Full version, with tags
+version = ".".join(keras_lmu.__version__.split(".")[:2])  # Short X.Y version
+release = keras_lmu.__version__  # Full version, with tags

# -- HTML output
templates_path = ["_templates"]
html_static_path = ["_static"]
html_theme = "nengo_sphinx_theme"
-html_title = "NengoLMU {0} docs".format(release)
-htmlhelp_basename = "NengoLMU"
+html_title = "KerasLMU {0} docs".format(release)
+htmlhelp_basename = "KerasLMU"
html_last_updated_fmt = "" # Default output format (suppressed)
html_show_sphinx = False
html_favicon = os.path.join("_static", "favicon.ico")
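The short-version computation in the ``conf.py`` diff above can be sketched standalone; ``"0.3.0"`` below is a placeholder standing in for the real ``keras_lmu.__version__`` string, so the snippet runs without the package installed:

```python
# Standalone sketch of conf.py's version handling.
# "0.3.0" is a placeholder for the real keras_lmu.__version__.
release = "0.3.0"
version = ".".join(release.split(".")[:2])  # keep only the major.minor part
print(version)  # -> 0.3
```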
2 changes: 1 addition & 1 deletion docs/examples.rst
@@ -9,7 +9,7 @@ are Jupyter notebooks and if you would like to run them yourself, refer to
the `Jupyter documentation <https://jupyter-notebook.readthedocs.io/en/latest/>`_ for
instructions on how to install and run Jupyter.

-If you would like to see NengoLMU used to solve the Permuted Sequential MNIST task,
+If you would like to see KerasLMU used to solve the Permuted Sequential MNIST task,
which is the problem used in the
`original paper <https://papers.nips.cc/paper/9689-legendre-memory-units-continuous-time-representation-in-recurrent-neural-networks.pdf>`_,
you can look through the example below:
6 changes: 3 additions & 3 deletions docs/examples/psMNIST.ipynb
@@ -27,7 +27,7 @@
"to perform the task successfully, the network needs to process information across the\n",
"whole length of the input sequence.\n",
"\n",
-"The following notebook uses a single NengoLMU layer inside a simple TensorFlow model to\n",
+"The following notebook uses a single KerasLMU layer inside a simple TensorFlow model to\n",
"showcase the accuracy and efficiency of performing the psMNIST task using these novel\n",
"memory cells. Using the LMU currently produces state-of-the-art results on\n",
"this task ([see\n",
@@ -47,7 +47,7 @@
"from IPython.display import Image, display\n",
"import tensorflow as tf\n",
"\n",
-"import lmu"
+"import keras_lmu"
]
},
{
@@ -257,7 +257,7 @@
"n_pixels = X_train.shape[1]\n",
"\n",
"lmu_layer = tf.keras.layers.RNN(\n",
-"    lmu.LMUCell(\n",
+"    keras_lmu.LMUCell(\n",
" memory_d=1,\n",
" order=256,\n",
" theta=n_pixels,\n",