Rename master to main in embedded links.
Tried to avoid changing external links to repos that have not yet renamed their master branch.
gnecula committed Jun 18, 2021 · 1 parent 4e58106 · commit 6a48c60
Showing 84 changed files with 186 additions and 190 deletions.
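The change itself is mechanical. As a rough illustration (not necessarily how this commit was produced), a rename restricted to links into the google/jax repo itself could be scripted along the following lines; the file-extension list and the regular expression here are assumptions for the sketch, and Colab links or `compare/...master` ranges would need similar, slightly different patterns.

```python
import pathlib
import re

# Hypothetical sketch: rewrite master -> main only in links that point into the
# google/jax repo itself, so external repos that still use master are left alone.
JAX_LINK = re.compile(
    r"((?:github\.com|raw\.githubusercontent\.com)/google/jax/(?:blob/|tree/)?)master\b"
)

for path in pathlib.Path(".").rglob("*"):
    if not path.is_file() or path.suffix not in {".md", ".rst", ".py", ".ipynb", ".yaml"}:
        continue
    text = path.read_text(encoding="utf-8")
    new_text = JAX_LINK.sub(r"\1main", text)
    if new_text != text:
        path.write_text(new_text, encoding="utf-8")
```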
8 changes: 3 additions & 5 deletions .github/workflows/ci-build.yaml
@@ -5,11 +5,9 @@ on:
# but only for the main branch
push:
branches:
- master
- main
pull_request:
branches:
- master
- main

jobs:
@@ -21,7 +19,7 @@ jobs:
uses: styfle/cancel-workflow-action@0.8.0
with:
access_token: ${{ github.token }}
if: ${{github.ref != 'refs/head/master' && github.ref != 'refs/head/main'}}
if: ${{github.ref != 'refs/head/main'}}
- uses: actions/checkout@v2
- name: Set up Python 3.8
uses: actions/setup-python@v2
@@ -67,7 +65,7 @@ jobs:
uses: styfle/cancel-workflow-action@0.7.0
with:
access_token: ${{ github.token }}
if: ${{github.ref != 'refs/head/master' && github.ref != 'refs/head/main'}}
if: ${{github.ref != 'refs/head/main'}}
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
@@ -114,7 +112,7 @@ jobs:
uses: styfle/cancel-workflow-action@0.7.0
with:
access_token: ${{ github.token }}
if: ${{github.ref != 'refs/head/master' && github.ref != 'refs/head/main'}}
if: ${{github.ref != 'refs/head/main'}}
- uses: actions/checkout@v2
- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
14 changes: 7 additions & 7 deletions CHANGELOG.md
@@ -5,11 +5,11 @@ Best viewed [here](https://jax.readthedocs.io/en/latest/changelog.html).
<!--
Remember to align the itemized text with the first line of an item within a list.
PLEASE REMEMBER TO CHANGE THE '..master' WITH AN ACTUAL TAG in GITHUB LINK.
PLEASE REMEMBER TO CHANGE THE '..main' WITH AN ACTUAL TAG in GITHUB LINK.
-->

## jax 0.2.15 (unreleased)
* [GitHub commits](https://github.com/google/jax/compare/jax-v0.2.14...master).
* [GitHub commits](https://github.com/google/jax/compare/jax-v0.2.14...main).
* New features:
* The {func}`jax2tf.convert` supports inequalities and min/max for booleans
({jax-issue}`#6956`).
@@ -137,7 +137,7 @@ PLEASE REMEMBER TO CHANGE THE '..master' WITH AN ACTUAL TAG in GITHUB LINK.
* `TraceContext` --> {func}`~jax.profiler.TraceAnnotation`
* `StepTraceContext` --> {func}`~jax.profiler.StepTraceAnnotation`
* `trace_function` --> {func}`~jax.profiler.annotate_function`
* Omnistaging can no longer be disabled. See [omnistaging](https://github.com/google/jax/blob/master/design_notes/omnistaging.md)
* Omnistaging can no longer be disabled. See [omnistaging](https://github.com/google/jax/blob/main/design_notes/omnistaging.md)
for more information.
* Python integers larger than the maximum `int64` value will now lead to an overflow
in all cases, rather than being silently converted to `uint64` in some cases ({jax-issue}`#6047`).
@@ -185,7 +185,7 @@ PLEASE REMEMBER TO CHANGE THE '..master' WITH AN ACTUAL TAG in GITHUB LINK.
* {func}`jax.scipy.stats.betabinom` is now available as a distribution with logpmf and pmf methods.
* Added {func}`jax.experimental.jax2tf.call_tf` to call TensorFlow functions
from JAX ({jax-issue}`#5627`)
and [README](https://github.com/google/jax/blob/master/jax/experimental/jax2tf/README.md#calling-tensorflow-functions-from-jax)).
and [README](https://github.com/google/jax/blob/main/jax/experimental/jax2tf/README.md#calling-tensorflow-functions-from-jax)).
* Extended the batching rule for `lax.pad` to support batching of the padding values.
* Bug fixes:
* {func}`jax.numpy.take` properly handles negative indices ({jax-issue}`#5768`)
@@ -300,7 +300,7 @@ PLEASE REMEMBER TO CHANGE THE '..master' WITH AN ACTUAL TAG in GITHUB LINK.
* [GitHub commits](https://github.com/google/jax/compare/jax-v0.2.5...jax-v0.2.6).
* New Features:
* Add support for shape-polymorphic tracing for the jax.experimental.jax2tf converter.
See [README.md](https://github.com/google/jax/blob/master/jax/experimental/jax2tf/README.md).
See [README.md](https://github.com/google/jax/blob/main/jax/experimental/jax2tf/README.md).
* Breaking change cleanup

* Raise an error on non-hashable static arguments for jax.jit and
@@ -360,7 +360,7 @@ PLEASE REMEMBER TO CHANGE THE '..master' WITH AN ACTUAL TAG in GITHUB LINK.
* Improvements:
* Ensure that `check_jaxpr` does not perform FLOPS. See {jax-issue}`#4650`.
* Expanded the set of JAX primitives converted by jax2tf.
See [primitives_with_limited_support.md](https://github.com/google/jax/blob/master/jax/experimental/jax2tf/primitives_with_limited_support.md).
See [primitives_with_limited_support.md](https://github.com/google/jax/blob/main/jax/experimental/jax2tf/primitives_with_limited_support.md).

## jax 0.2.4 (October 19 2020)

@@ -397,7 +397,7 @@ PLEASE REMEMBER TO CHANGE THE '..master' WITH AN ACTUAL TAG in GITHUB LINK.
* [GitHub commits](https://github.com/google/jax/compare/jax-v0.1.77...jax-v0.2.0).
* Improvements:
* Omnistaging on by default. See {jax-issue}`#3370` and
[omnistaging](https://github.com/google/jax/blob/master/design_notes/omnistaging.md)
[omnistaging](https://github.com/google/jax/blob/main/design_notes/omnistaging.md)

## jax (0.1.77) (September 15 2020)

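One of the entries quoted above, the extended batching rule for `lax.pad`, can be illustrated with a small sketch, assuming the usual `lax.pad(operand, padding_value, padding_config)` signature:

```python
import jax.numpy as jnp
from jax import lax, vmap

# Pad each example in a batch with its own padding value -- the case the
# extended lax.pad batching rule (quoted in the CHANGELOG hunk above) enables.
xs = jnp.arange(6.0).reshape(3, 2)       # batch of three length-2 vectors
pad_vals = jnp.array([0.0, -1.0, 9.0])   # one padding value per example

pad_one = lambda x, v: lax.pad(x, v, [(1, 1, 0)])  # pad 1 low, 1 high, no interior
print(vmap(pad_one)(xs, pad_vals))       # shape (3, 4)
```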
26 changes: 13 additions & 13 deletions README.md
@@ -1,5 +1,5 @@
<div align="center">
<img src="https://raw.githubusercontent.com/google/jax/master/images/jax_logo_250px.png" alt="logo"></img>
<img src="https://raw.githubusercontent.com/google/jax/main/images/jax_logo_250px.png" alt="logo"></img>
</div>

# JAX: Autograd and XLA
@@ -87,24 +87,24 @@ perex_grads = jit(vmap(grad_fun, in_axes=(None, 0, 0))) # fast per-example grad
Jump right in using a notebook in your browser, connected to a Google Cloud GPU.
Here are some starter notebooks:
- [The basics: NumPy on accelerators, `grad` for differentiation, `jit` for compilation, and `vmap` for vectorization](https://jax.readthedocs.io/en/latest/notebooks/quickstart.html)
- [Training a Simple Neural Network, with TensorFlow Dataset Data Loading](https://colab.research.google.com/github/google/jax/blob/master/docs/notebooks/neural_network_with_tfds_data.ipynb)
- [Training a Simple Neural Network, with TensorFlow Dataset Data Loading](https://colab.research.google.com/github/google/jax/blob/main/docs/notebooks/neural_network_with_tfds_data.ipynb)

**JAX now runs on Cloud TPUs.** To try out the preview, see the [Cloud TPU
Colabs](https://github.com/google/jax/tree/master/cloud_tpu_colabs).
Colabs](https://github.com/google/jax/tree/main/cloud_tpu_colabs).

For a deeper dive into JAX:
- [The Autodiff Cookbook, Part 1: easy and powerful automatic differentiation in JAX](https://jax.readthedocs.io/en/latest/notebooks/autodiff_cookbook.html)
- [Common gotchas and sharp edges](https://jax.readthedocs.io/en/latest/notebooks/Common_Gotchas_in_JAX.html)
- See the [full list of
notebooks](https://github.com/google/jax/tree/master/docs/notebooks).
notebooks](https://github.com/google/jax/tree/main/docs/notebooks).

You can also take a look at [the mini-libraries in
`jax.experimental`](https://github.com/google/jax/tree/master/jax/experimental/README.md),
`jax.experimental`](https://github.com/google/jax/tree/main/jax/experimental/README.md),
like [`stax` for building neural
networks](https://github.com/google/jax/tree/master/jax/experimental/README.md#neural-net-building-with-stax)
networks](https://github.com/google/jax/tree/main/jax/experimental/README.md#neural-net-building-with-stax)
and [`optimizers` for first-order stochastic
optimization](https://github.com/google/jax/tree/master/jax/experimental/README.md#first-order-optimization),
or the [examples](https://github.com/google/jax/tree/master/examples).
optimization](https://github.com/google/jax/tree/main/jax/experimental/README.md#first-order-optimization),
or the [examples](https://github.com/google/jax/tree/main/examples).

## Transformations

@@ -310,7 +310,7 @@ print(normalize(jnp.arange(4.)))
# prints [0. 0.16666667 0.33333334 0.5 ]
```

You can even [nest `pmap` functions](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/Pmap_Cookbook.ipynb#scrollTo=MdRscR5MONuN) for more
You can even [nest `pmap` functions](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/Pmap_Cookbook.ipynb#scrollTo=MdRscR5MONuN) for more
sophisticated communication patterns.

It all composes, so you're free to differentiate through parallel computations:
@@ -343,9 +343,9 @@ When reverse-mode differentiating a `pmap` function (e.g. with `grad`), the
backward pass of the computation is parallelized just like the forward pass.

See the [SPMD
Cookbook](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/Pmap_Cookbook.ipynb)
Cookbook](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/Pmap_Cookbook.ipynb)
and the [SPMD MNIST classifier from scratch
example](https://github.com/google/jax/blob/master/examples/spmd_mnist_classifier_fromscratch.py)
example](https://github.com/google/jax/blob/main/examples/spmd_mnist_classifier_fromscratch.py)
for more.

## Current gotchas
@@ -359,7 +359,7 @@ Some standouts:
1. [In-place mutating updates of
arrays](https://jax.readthedocs.io/en/latest/notebooks/Common_Gotchas_in_JAX.html#%F0%9F%94%AA-In-Place-Updates), like `x[i] += y`, aren't supported, but [there are functional alternatives](https://jax.readthedocs.io/en/latest/jax.ops.html). Under a `jit`, those functional alternatives will reuse buffers in-place automatically.
1. [Random numbers are
different](https://jax.readthedocs.io/en/latest/notebooks/Common_Gotchas_in_JAX.html#%F0%9F%94%AA-Random-Numbers), but for [good reasons](https://github.com/google/jax/blob/master/design_notes/prng.md).
different](https://jax.readthedocs.io/en/latest/notebooks/Common_Gotchas_in_JAX.html#%F0%9F%94%AA-Random-Numbers), but for [good reasons](https://github.com/google/jax/blob/main/design_notes/prng.md).
1. If you're looking for [convolution
operators](https://jax.readthedocs.io/en/latest/notebooks/Common_Gotchas_in_JAX.html#%F0%9F%94%AA-Convolutions),
they're in the `jax.lax` package.
@@ -490,7 +490,7 @@ To cite this repository:
```

In the above bibtex entry, names are in alphabetical order, the version number
is intended to be that from [jax/version.py](../master/jax/version.py), and
is intended to be that from [jax/version.py](../main/jax/version.py), and
the year corresponds to the project's open-source release.

A nascent version of JAX, supporting only automatic differentiation and
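The `pmap` material quoted from the README hunks above can be reproduced with a minimal, device-count-agnostic sketch; the collective mirrors the `normalize` example referenced in one of the hunk headers.

```python
import jax
import jax.numpy as jnp

# Run one copy of the function per local device; psum normalizes across them.
n = jax.local_device_count()
xs = jnp.arange(float(n * 3)).reshape(n, 3)

normalize = jax.pmap(lambda x: x / jax.lax.psum(jnp.sum(x), "i"), axis_name="i")
print(normalize(xs))  # entries sum to 1 across all devices combined

# As the README text notes, this composes with autodiff: the backward pass of a
# pmapped function is parallelized just like the forward pass.
```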
2 changes: 1 addition & 1 deletion build/LICENSE.txt
@@ -369,7 +369,7 @@ ISC license used for completely new code in BoringSSL:
The code in third_party/fiat carries the MIT license:

Copyright (c) 2015-2016 the fiat-crypto authors (see
https://github.com/mit-plv/fiat-crypto/blob/master/AUTHORS).
https://github.com/mit-plv/fiat-crypto/blob/main/AUTHORS).

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
2 changes: 1 addition & 1 deletion cloud_tpu_colabs/JAX_NeurIPS_2020_demo.ipynb
@@ -451,7 +451,7 @@
"id": "jC-KIMQ1q-lK"
},
"source": [
"For more, see the [`pmap` cookbook](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/Pmap_Cookbook.ipynb)."
"For more, see the [`pmap` cookbook](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/Pmap_Cookbook.ipynb)."
]
},
{
4 changes: 2 additions & 2 deletions cloud_tpu_colabs/JAX_demo.ipynb
@@ -861,7 +861,7 @@
"id": "f-FBsWeo1AXE"
},
"source": [
"<img src=\"https://raw.githubusercontent.com/google/jax/master/cloud_tpu_colabs/images/nested_pmap.png\" width=\"70%\"/>"
"<img src=\"https://raw.githubusercontent.com/google/jax/main/cloud_tpu_colabs/images/nested_pmap.png\" width=\"70%\"/>"
]
},
{
@@ -871,7 +871,7 @@
"id": "jC-KIMQ1q-lK"
},
"source": [
"For more, see the [`pmap` cookbook](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/Pmap_Cookbook.ipynb)."
"For more, see the [`pmap` cookbook](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/Pmap_Cookbook.ipynb)."
]
},
{
2 changes: 1 addition & 1 deletion cloud_tpu_colabs/Pmap_Cookbook.ipynb
@@ -59,7 +59,7 @@
"\n",
"To run this notebook with any parallelism, you'll need multiple XLA devices available, e.g. with a multi-GPU machine or a Cloud TPU.\n",
"\n",
"The code in this notebook is simple. For an example of how to use these tools to do data-parallel neural network training, check out [the SPMD MNIST example](https://github.com/google/jax/blob/master/examples/spmd_mnist_classifier_fromscratch.py) or the much more capable [Trax library](https://github.com/google/trax/)."
"The code in this notebook is simple. For an example of how to use these tools to do data-parallel neural network training, check out [the SPMD MNIST example](https://github.com/google/jax/blob/main/examples/spmd_mnist_classifier_fromscratch.py) or the much more capable [Trax library](https://github.com/google/trax/)."
]
},
{
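The cookbook text quoted above assumes multiple XLA devices; a trivial sketch for checking how much parallelism is actually available before running it:

```python
import jax

# How many devices pmap can spread work across on this machine or TPU host.
print(jax.device_count(), jax.local_device_count())
print(jax.devices())
```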
12 changes: 6 additions & 6 deletions cloud_tpu_colabs/README.md
@@ -23,25 +23,25 @@ only available with the new architecture, such as complex number support).

The following notebooks showcase how to use and what you can do with Cloud TPUs on Colab:

### [Pmap Cookbook](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/Pmap_Cookbook.ipynb)
### [Pmap Cookbook](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/Pmap_Cookbook.ipynb)
A guide to getting started with `pmap`, a transform for easily distributing SPMD
computations across devices.

### [Lorentz ODE Solver](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/Lorentz_ODE_Solver.ipynb)
### [Lorentz ODE Solver](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/Lorentz_ODE_Solver.ipynb)
Contributed by Alex Alemi (alexalemi@)

Solve and plot parallel ODE solutions with `pmap`.

<img src="https://raw.githubusercontent.com/google/jax/master/cloud_tpu_colabs/images/lorentz.png" width=65%></image>
<img src="https://raw.githubusercontent.com/google/jax/main/cloud_tpu_colabs/images/lorentz.png" width=65%></image>

### [Wave Equation](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/Wave_Equation.ipynb)
### [Wave Equation](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/Wave_Equation.ipynb)
Contributed by Stephan Hoyer (shoyer@)

Solve the wave equation with `pmap`, and make cool movies! The spatial domain is partitioned across the 8 cores of a Cloud TPU.

![](https://raw.githubusercontent.com/google/jax/master/cloud_tpu_colabs/images/wave_movie.gif)
![](https://raw.githubusercontent.com/google/jax/main/cloud_tpu_colabs/images/wave_movie.gif)

### [JAX Demo](https://colab.research.google.com/github/google/jax/blob/master/cloud_tpu_colabs/JAX_demo.ipynb)
### [JAX Demo](https://colab.research.google.com/github/google/jax/blob/main/cloud_tpu_colabs/JAX_demo.ipynb)
An overview of JAX presented at the [Program Transformations for ML workshop at NeurIPS 2019](https://program-transformations.github.io/) and the [Compilers for ML workshop at CGO 2020](https://www.c4ml.org/). Covers basic numpy usage, `grad`, `jit`, `vmap`, and `pmap`.

## Performance notes
2 changes: 1 addition & 1 deletion docs/autodidax.ipynb
@@ -29,7 +29,7 @@
},
"source": [
"[![Open in\n",
"Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/jax/blob/master/docs/autodidax.ipynb)"
"Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/jax/blob/main/docs/autodidax.ipynb)"
]
},
{
2 changes: 1 addition & 1 deletion docs/autodidax.md
@@ -33,7 +33,7 @@ limitations under the License.
```

[![Open in
Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/jax/blob/master/docs/autodidax.ipynb)
Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/jax/blob/main/docs/autodidax.ipynb)

+++

2 changes: 1 addition & 1 deletion docs/autodidax.py
@@ -27,7 +27,7 @@
# ---

# [![Open in
# Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/jax/blob/master/docs/autodidax.ipynb)
# Colab](https://colab.research.google.com/assets/colab-badge.svg)](https://colab.research.google.com/github/google/jax/blob/main/docs/autodidax.ipynb)


# # Autodidax: JAX core from scratch
2 changes: 1 addition & 1 deletion docs/conf.py
@@ -93,7 +93,7 @@
# outputs stored in ipynb but not in md, so we must convert the ipynb.
source_suffix = ['.rst', '.ipynb', '.md']

# The master toctree document.
# The main toctree document.
main_doc = 'index'

# The language for content autogenerated by Sphinx. Refer to documentation
2 changes: 1 addition & 1 deletion docs/contributing.md
@@ -88,7 +88,7 @@ Follow these steps to contribute code:

```bash
git fetch upstream
git rebase upstream/master
git rebase upstream/main
```

Finally, push your commit on your development branch and create a remote
4 changes: 2 additions & 2 deletions docs/developer.md
@@ -274,15 +274,15 @@ You have to add this metadata by hand in the `.ipynb` file. It will be preserved
re-saves the notebook.

We exclude some notebooks from the build, e.g., because they contain long computations.
See `exclude_patterns` in [conf.py](https://github.com/google/jax/blob/master/docs/conf.py).
See `exclude_patterns` in [conf.py](https://github.com/google/jax/blob/main/docs/conf.py).

## Documentation building on readthedocs.io

JAX's auto-generated documentations is at <https://jax.readthedocs.io/>.

The documentation building is controlled for the entire project by the
[readthedocs JAX settings](https://readthedocs.org/dashboard/jax). The current settings
trigger a documentation build as soon as code is pushed to the GitHub `master` branch.
trigger a documentation build as soon as code is pushed to the GitHub `main` branch.
For each code version, the building process is driven by the
`.readthedocs.yml` and the `docs/conf.py` configuration files.

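For the `exclude_patterns` mechanism mentioned above, a hedged sketch of what such an entry looks like in a Sphinx `conf.py`; the notebook name below is hypothetical, not taken from JAX's actual config:

```python
# docs/conf.py (Sphinx): skip notebooks that are too expensive to execute
# during the documentation build. The second pattern is a made-up example.
exclude_patterns = [
    "_build",
    "notebooks/expensive_training_demo.ipynb",  # hypothetical long-running notebook
]
```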
2 changes: 1 addition & 1 deletion docs/device_memory_profiling.md
@@ -139,4 +139,4 @@ pprof --web --diff_base memory1.prof memory9.prof
![Device memory profile at end of execution](_static/device_memory_profile_leak2.svg)

The visualization shows that the memory growth can be attributed to the call to
`normal` inside `anotherfunc`.
`normal` inside `anotherfunc`.
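The device-memory-profiling hunk above compares two `pprof` snapshots; a hedged sketch of producing such a snapshot from JAX, assuming the `jax.profiler.save_device_memory_profile` API of this era and that `pprof` is installed separately:

```python
import jax

# Allocate something on the device, then dump a pprof-compatible memory profile
# that can be inspected with `pprof --web memory.prof`.
x = jax.random.normal(jax.random.PRNGKey(0), (1000, 1000))
jax.profiler.save_device_memory_profile("memory.prof")
```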
2 changes: 1 addition & 1 deletion docs/faq.rst
@@ -140,7 +140,7 @@ and its use is not recommended.)

For a worked-out example, we recommend reading through
``test_computation_follows_data`` in
`multi_device_test.py <https://github.com/google/jax/blob/master/tests/multi_device_test.py>`_.
`multi_device_test.py <https://github.com/google/jax/blob/main/tests/multi_device_test.py>`_.

.. _faq-benchmark:

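The FAQ entry referenced above describes JAX's "computation follows data" device-placement rule; a minimal sketch of the idea using `jax.device_put` (nothing here is specific to the test file linked above):

```python
import jax
import jax.numpy as jnp

# Results are produced on the device that holds the inputs.
cpu = jax.devices("cpu")[0]
x = jax.device_put(jnp.arange(4.0), cpu)
y = x * 2  # computed on `cpu`, because that is where `x` was placed
```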
