Move to larq organisation (#140)
* Move to larq organisation

* Update azure badges
lgeiger committed Jul 12, 2019
1 parent 5d1fd09 commit 7997a11
Showing 7 changed files with 12 additions and 12 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -1,8 +1,8 @@
# Larq

-[![Azure DevOps builds](https://img.shields.io/azure-devops/build/plumerai/larq/5.svg?logo=azure-devops)](https://plumerai.visualstudio.com/larq/_build/latest?definitionId=5&branchName=master) [![Azure DevOps coverage](https://img.shields.io/azure-devops/coverage/plumerai/larq/5.svg?logo=azure-devops)](https://plumerai.visualstudio.com/larq/_build/latest?definitionId=5&branchName=master) [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/larq.svg)](https://pypi.org/project/larq/) [![PyPI](https://img.shields.io/pypi/v/larq.svg)](https://pypi.org/project/larq/) [![PyPI - License](https://img.shields.io/pypi/l/larq.svg)](https://github.com/plumerai/larq/blob/master/LICENSE) [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)
+[![Azure DevOps builds](https://img.shields.io/azure-devops/build/plumerai/larq/14.svg?logo=azure-devops)](https://plumerai.visualstudio.com/larq/_build/latest?definitionId=14&branchName=master) [![Azure DevOps coverage](https://img.shields.io/azure-devops/coverage/plumerai/larq/14.svg?logo=azure-devops)](https://plumerai.visualstudio.com/larq/_build/latest?definitionId=14&branchName=master) [![PyPI - Python Version](https://img.shields.io/pypi/pyversions/larq.svg)](https://pypi.org/project/larq/) [![PyPI](https://img.shields.io/pypi/v/larq.svg)](https://pypi.org/project/larq/) [![PyPI - License](https://img.shields.io/pypi/l/larq.svg)](https://github.com/larq/larq/blob/master/LICENSE) [![Code style: black](https://img.shields.io/badge/code%20style-black-000000.svg)](https://github.com/ambv/black)

-[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/plumerai/larq/master?filepath=docs%2Fexamples) [![Join the community on Spectrum](https://withspectrum.github.io/badge/badge.svg)](https://spectrum.chat/larq)
+[![Binder](https://mybinder.org/badge_logo.svg)](https://mybinder.org/v2/gh/larq/larq/master?filepath=docs%2Fexamples) [![Join the community on Spectrum](https://withspectrum.github.io/badge/badge.svg)](https://spectrum.chat/larq)

Larq is an open-source deep learning library for training neural networks with extremely low precision weights and activations, such as Binarized Neural Networks (BNNs).

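The library this README introduces builds on quantized drop-in replacements for Keras layers. As a hedged sketch of the idea (the layer and argument names follow the Larq documentation of this period; they are not part of this commit):

```python
import larq as lq

# A binarized drop-in replacement for tf.keras.layers.Dense:
# "ste_sign" binarizes values in the forward pass and uses a
# straight-through estimator in the backward pass, while
# "weight_clip" keeps the latent full-precision weights in [-1, 1].
layer = lq.layers.QuantDense(
    512,
    input_quantizer="ste_sign",
    kernel_quantizer="ste_sign",
    kernel_constraint="weight_clip",
)
```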
4 changes: 2 additions & 2 deletions docs/examples/binarynet_advanced_cifar10.ipynb
@@ -5,7 +5,7 @@
"source": [
"# BinaryNet on CIFAR10 (Advanced)\n",
"\n",
"<a href=\"https://mybinder.org/v2/gh/plumerai/larq/master?filepath=docs%2Fexamples%2Fbinarynet_advanced_cifar10.ipynb\"><button data-md-color-primary=\"blue\">Run on Binder</button></a> <a href=\"https://github.com/plumerai/larq/blob/master/docs/examples/binarynet_advanced_cifar10.ipynb\"><button data-md-color-primary=\"blue\">View on GitHub</button></a>\n",
"<a href=\"https://mybinder.org/v2/gh/larq/larq/master?filepath=docs%2Fexamples%2Fbinarynet_advanced_cifar10.ipynb\"><button data-md-color-primary=\"blue\">Run on Binder</button></a> <a href=\"https://github.com/larq/larq/blob/master/docs/examples/binarynet_advanced_cifar10.ipynb\"><button data-md-color-primary=\"blue\">View on GitHub</button></a>\n",
"\n",
"In this example we demonstrate how to use Larq to build and train BinaryNet on the CIFAR10 dataset to achieve a validation accuracy of around 90% using a heavy lifting GPU like a Nvidia V100.\n",
"On a Nvidia V100 it takes approximately 250 minutes to train. Compared to the original papers, [BinaryConnect: Training Deep Neural Networks with binary weights during propagations](https://arxiv.org/abs/1511.00363), and [Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1](http://arxiv.org/abs/1602.02830), we do not implement image whitening, but we use image augmentation, and a stepped learning rate scheduler."
@@ -293,4 +293,4 @@
},
"nbformat": 4,
"nbformat_minor": 2
-}
+}
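The notebook above mentions a stepped learning rate scheduler. A minimal sketch of how such a schedule can be wired into Keras training, assuming the standard `LearningRateScheduler` callback; the starting rate and step size here are illustrative, not values from the notebook:

```python
import tensorflow as tf

def stepped_lr(epoch):
    # Illustrative schedule: start at 1e-2 and drop the learning
    # rate by a factor of 10 every 50 epochs.
    return 1e-2 * (0.1 ** (epoch // 50))

lr_callback = tf.keras.callbacks.LearningRateScheduler(stepped_lr)

# Passed to training alongside the usual arguments, e.g.:
# model.fit(train_images, train_labels, epochs=150, callbacks=[lr_callback])
```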
4 changes: 2 additions & 2 deletions docs/examples/binarynet_cifar10.ipynb
@@ -5,7 +5,7 @@
"source": [
"# BinaryNet on CIFAR10\n",
"\n",
"<a href=\"https://mybinder.org/v2/gh/plumerai/larq/master?filepath=docs%2Fexamples%2Fbinarynet_cifar10.ipynb\"><button data-md-color-primary=\"blue\">Run on Binder</button></a> <a href=\"https://github.com/plumerai/larq/blob/master/docs/examples/binarynet_cifar10.ipynb\"><button data-md-color-primary=\"blue\">View on GitHub</button></a>\n",
"<a href=\"https://mybinder.org/v2/gh/larq/larq/master?filepath=docs%2Fexamples%2Fbinarynet_cifar10.ipynb\"><button data-md-color-primary=\"blue\">Run on Binder</button></a> <a href=\"https://github.com/larq/larq/blob/master/docs/examples/binarynet_cifar10.ipynb\"><button data-md-color-primary=\"blue\">View on GitHub</button></a>\n",
"\n",
"In this example we demonstrate how to use Larq to build and train BinaryNet on the CIFAR10 dataset to achieve a validation accuracy approximately 83% on laptop hardware.\n",
"On a Nvidia GTX 1050 Ti Max-Q it takes approximately 200 minutes to train. For simplicity, compared to the original papers [BinaryConnect: Training Deep Neural Networks with binary weights during propagations](https://arxiv.org/abs/1511.00363), and [Binarized Neural Networks: Training Deep Neural Networks with Weights and Activations Constrained to +1 or -1](http://arxiv.org/abs/1602.02830), we do not impliment learning rate scaling, or image whitening."
@@ -542,4 +542,4 @@
},
"nbformat": 4,
"nbformat_minor": 2
-}
+}
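The BinaryNet model this notebook trains is assembled from binarized convolution blocks. A rough sketch of one such block, assuming the `QuantConv2D` arguments documented by Larq; the filter count and batch-normalization settings are illustrative:

```python
import larq as lq
import tensorflow as tf

# Shared arguments for binarized layers, per the Larq documentation.
kwargs = dict(
    input_quantizer="ste_sign",       # binarize incoming activations
    kernel_quantizer="ste_sign",      # binarize weights in the forward pass
    kernel_constraint="weight_clip",  # keep latent weights in [-1, 1]
    use_bias=False,
)

def binary_conv_block(x, filters):
    x = lq.layers.QuantConv2D(filters, (3, 3), padding="same", **kwargs)(x)
    return tf.keras.layers.BatchNormalization(momentum=0.999, scale=False)(x)

inputs = tf.keras.Input(shape=(32, 32, 3))  # CIFAR10 images
features = binary_conv_block(inputs, 128)
```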
4 changes: 2 additions & 2 deletions docs/examples/mnist.ipynb
@@ -5,7 +5,7 @@
"source": [
"# Introduction to BNNs with Larq\n",
"\n",
"<a href=\"https://mybinder.org/v2/gh/plumerai/larq/master?filepath=docs%2Fexamples%2Fmnist.ipynb\"><button data-md-color-primary=\"blue\">Run on Binder</button></a> <a href=\"https://github.com/plumerai/larq/blob/master/docs/examples/mnist.ipynb\"><button data-md-color-primary=\"blue\">View on GitHub</button></a>\n",
"<a href=\"https://mybinder.org/v2/gh/larq/larq/master?filepath=docs%2Fexamples%2Fmnist.ipynb\"><button data-md-color-primary=\"blue\">Run on Binder</button></a> <a href=\"https://github.com/larq/larq/blob/master/docs/examples/mnist.ipynb\"><button data-md-color-primary=\"blue\">View on GitHub</button></a>\n",
"\n",
"This tutorial demonstrates how to train a simple binarized Convolutional Neural Network (CNN) to classify MNIST digits. This simple network will achieve approximately 98% accuracy on the MNIST test set. This tutorial uses Larq and the [Keras Sequential API](https://www.tensorflow.org/guide/keras), so creating and training our model will require only a few lines of code."
],
@@ -320,4 +320,4 @@
},
"nbformat": 4,
"nbformat_minor": 0
-}
+}
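Since the tutorial relies on the Keras Sequential API, a compact sketch of what such a binarized CNN can look like; the layer sizes are illustrative, not copied from the notebook:

```python
import larq as lq
import tensorflow as tf

# Arguments shared by all binarized layers, per the Larq documentation.
kwargs = dict(
    kernel_quantizer="ste_sign",
    kernel_constraint="weight_clip",
    use_bias=False,
)

model = tf.keras.models.Sequential([
    # The first layer binarizes only its weights, leaving input pixels as-is.
    lq.layers.QuantConv2D(32, (3, 3), input_shape=(28, 28, 1), **kwargs),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.BatchNormalization(scale=False),
    # Deeper layers also binarize their incoming activations.
    lq.layers.QuantConv2D(64, (3, 3), input_quantizer="ste_sign", **kwargs),
    tf.keras.layers.MaxPooling2D((2, 2)),
    tf.keras.layers.BatchNormalization(scale=False),
    tf.keras.layers.Flatten(),
    lq.layers.QuantDense(10, input_quantizer="ste_sign", **kwargs),
    tf.keras.layers.Activation("softmax"),
])

model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
```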
2 changes: 1 addition & 1 deletion docs/papers.md
@@ -2,7 +2,7 @@

One of the main focuses of Larq is to accelerate research on neural networks with extremely low precision weights and activations.

-If you publish a paper using Larq, please let us know and [add it to the list below](https://github.com/plumerai/larq/edit/master/docs/papers.md). Feel free to also add the author names, abstract, and links to the paper and source code.
+If you publish a paper using Larq, please let us know and [add it to the list below](https://github.com/larq/larq/edit/master/docs/papers.md). Feel free to also add the author names, abstract, and links to the paper and source code.

<h2><a class="headerlink" style="float:right; opacity: 1;" href="https://github.com/plumerai/rethinking-bnn-optimization" title="Source code"><i class="md-icon">code</i></a> <a class="headerlink" style="float:right; opacity: 1;" href="https://arxiv.org/abs/1906.02107" title="arXiv paper"><i class="md-icon">library_books</i></a> Latent Weights Do Not Exist: Rethinking Binarized Neural Network Optimization</h2>

2 changes: 1 addition & 1 deletion generate_api_docs.py
@@ -15,7 +15,7 @@ def callable_to_source_link(obj, scope):
    path = scope.__file__.lstrip(".")
    source = inspect.getsourcelines(obj)
    line = source[-1] + 1 if source[0][0].startswith("@") else source[-1]
-    link = f"https://github.com/plumerai/larq/blob/master{path}#L{line}"
+    link = f"https://github.com/larq/larq/blob/master{path}#L{line}"
    return f'<a class="headerlink code-link" style="float:right;" href="{link}" title="Source code"></a>'


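For reference, the helper edited above reads as follows when made self-contained; the only additions are the `import inspect` the snippet implies and explanatory comments:

```python
import inspect

def callable_to_source_link(obj, scope):
    # Path of the module that defines `obj`, with the leading "."
    # stripped (e.g. "./larq/layers.py" becomes "/larq/layers.py").
    path = scope.__file__.lstrip(".")
    # getsourcelines returns (list_of_source_lines, starting_line_number).
    source = inspect.getsourcelines(obj)
    # If the object is decorated, link one line past the "@" line.
    line = source[-1] + 1 if source[0][0].startswith("@") else source[-1]
    link = f"https://github.com/larq/larq/blob/master{path}#L{line}"
    return f'<a class="headerlink code-link" style="float:right;" href="{link}" title="Source code"></a>'
```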
4 changes: 2 additions & 2 deletions mkdocs.yml
@@ -31,8 +31,8 @@ nav:
  - About: about.md
  - FAQ: faq.md

-repo_url: https://github.com/plumerai/larq
-repo_name: plumerai/larq
+repo_url: https://github.com/larq/larq
+repo_name: larq/larq
edit_uri: ""
theme:
  name: material
