Merged
21 changes: 9 additions & 12 deletions lectures/aiyagari_jax.md
@@ -13,6 +13,15 @@ kernelspec:

# The Aiyagari Model

```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```
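Whichever route is taken, it is easy to confirm what JAX ended up with. A minimal sketch (the `has_gpu` helper is illustrative, not part of the lecture code) that also degrades gracefully when JAX is not installed:

```python
def has_gpu() -> bool:
    """Return True if JAX can see at least one GPU device."""
    try:
        import jax
    except ImportError:
        return False  # JAX is not installed at all
    # jax.devices() lists every device; each has a .platform string
    return any(d.platform == "gpu" for d in jax.devices())

print("GPU available:", has_gpu())
```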

## Overview

@@ -55,18 +64,6 @@ import jax
import jax.numpy as jnp
```


Let’s check the backend used by JAX and the devices available.

```{code-cell} ipython3
# Check if JAX is using GPU
print(f"JAX backend: {jax.devices()[0].platform}")

# Check the devices available for JAX
print(jax.devices())
```


We will use 64-bit floats with JAX to increase precision.

```{code-cell} ipython3
10 changes: 9 additions & 1 deletion lectures/arellano.md
@@ -13,7 +13,15 @@ kernelspec:

# Default Risk and Income Fluctuations

+++
```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

In addition to what's in Anaconda, this lecture will need the following libraries:

10 changes: 10 additions & 0 deletions lectures/inventory_dynamics.md
@@ -23,6 +23,16 @@ kernelspec:

# Inventory Dynamics

```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

```{index} single: Markov process, inventory
```

12 changes: 7 additions & 5 deletions lectures/jax_intro.md
@@ -14,12 +14,14 @@ kernelspec:
# JAX


```{note}
This lecture is built using [hardware](status:machine-details) that
has access to a GPU. This means that
```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

1. the lecture might be significantly slower when running on your machine, and
2. the code is well-suited to execution with [Google colab](https://colab.research.google.com/github/QuantEcon/lecture-python-programming.notebooks/blob/master/jax_intro.ipynb)
Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

This lecture provides a short introduction to [Google JAX](https://github.com/google/jax).
21 changes: 10 additions & 11 deletions lectures/kesten_processes.md
@@ -29,6 +29,16 @@ kernelspec:
:depth: 2
```

```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

In addition to what's in Anaconda, this lecture will need the following libraries:

```{code-cell} ipython3
@@ -64,17 +74,6 @@ from jax import random
```


Let’s check the backend used by JAX and the devices available

```{code-cell} ipython3
# Check if JAX is using GPU
print(f"JAX backend: {jax.devices()[0].platform}")

# Check the devices available for JAX
print(jax.devices())
```


## Kesten processes

```{index} single: Kesten processes; heavy tails
20 changes: 10 additions & 10 deletions lectures/newtons_method.md
@@ -14,6 +14,16 @@ kernelspec:

# Newton’s Method via JAX

```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

## Overview

Continuing from the [Newton's Method lecture](https://python.quantecon.org/newton_method.html), we are going to solve the multidimensional problem with `JAX`.
@@ -28,16 +38,6 @@ import jax.numpy as jnp
from scipy.optimize import root
```

Let’s check the backend used by JAX and the devices available.

```{code-cell} ipython3
# Check if JAX is using GPU
print(f"JAX backend: {jax.devices()[0].platform}")

# Check the devices available for JAX
print(jax.devices())
```

## The Two Goods Market Equilibrium

Let's have a quick recap of this problem -- a more detailed explanation and derivation can be found at [A Two Goods Market Equilibrium](https://python.quantecon.org/newton_method.html#two-goods-market).
26 changes: 13 additions & 13 deletions lectures/opt_invest.md
@@ -14,6 +14,16 @@ kernelspec:

# Optimal Investment

```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

We require the following library to be installed.

```{code-cell} ipython3
@@ -26,7 +36,9 @@ We require the following library to be installed.
A monopolist faces inverse demand
curve

$$ P_t = a_0 - a_1 Y_t + Z_t, $$
$$
P_t = a_0 - a_1 Y_t + Z_t,
$$

where

@@ -61,18 +73,6 @@ import jax.numpy as jnp
import matplotlib.pyplot as plt
```


Let’s check the backend used by JAX and the devices available

```{code-cell} ipython3
# Check if JAX is using GPU
print(f"JAX backend: {jax.devices()[0].platform}")

# Check the devices available for JAX
print(jax.devices())
```


We will use 64-bit floats with JAX to increase precision.

```{code-cell} ipython3
10 changes: 10 additions & 0 deletions lectures/opt_savings.md
@@ -13,6 +13,16 @@ kernelspec:

# Optimal Savings

```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

In addition to what’s in Anaconda, this lecture will need the following libraries:

```{code-cell} ipython3
22 changes: 9 additions & 13 deletions lectures/short_path.md
@@ -15,6 +15,15 @@ kernelspec:

# Shortest Paths

```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

## Overview

@@ -28,19 +37,6 @@ import jax.numpy as jnp
import jax
```



Let’s check the backend used by JAX and the devices available.

```{code-cell} ipython3
# Check if JAX is using GPU
print(f"JAX backend: {jax.devices()[0].platform}")

# Check the devices available for JAX
print(jax.devices())
```


## Solving for Minimum Cost-to-Go

Let $J(v)$ denote the minimum cost-to-go from node $v$,
10 changes: 9 additions & 1 deletion lectures/status.md
@@ -20,4 +20,12 @@ This table contains the latest execution statistics.

These lectures are built on `linux` instances through `github actions` and `amazon web services (aws)` to
enable access to a `gpu`. They run on a [p3.2xlarge](https://aws.amazon.com/ec2/instance-types/p3/)
instance with `8 vCPUs`, an `NVIDIA Tesla V100 GPU`, and `61 GB` of memory.

You can check the backend used by JAX using:

```{code-cell} ipython3
import jax
# Check if JAX is using GPU
print(f"JAX backend: {jax.devices()[0].platform}")
```
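The new cell above reports only the first device. As a hedged extension (the `list_jax_devices` helper is illustrative, not part of the lecture), one could list everything JAX can see, falling back to an empty list when JAX is missing:

```python
def list_jax_devices():
    """Return (platform, id) pairs for every device JAX can see."""
    try:
        import jax
    except ImportError:
        return []  # JAX unavailable; nothing to report
    return [(d.platform, d.id) for d in jax.devices()]

for platform, dev_id in list_jax_devices():
    print(f"device {dev_id}: {platform}")
```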
22 changes: 15 additions & 7 deletions lectures/wealth_dynamics.md
@@ -14,7 +14,19 @@ kernelspec:

# Wealth Distribution Dynamics

This lecture is the extended JAX implementation of [this lecture](https://python.quantecon.org/wealth_dynamics.html). Please refer that lecture for all background and notation.
```{admonition} GPU
:class: warning

This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU, together with JAX for GPU programming.

Free GPUs are available on Google Colab. To use this option, click on the play icon at the top right, select Colab, and set the runtime environment to include a GPU.

Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX to run on the CPU only, you can use `pip install "jax[cpu]"`.
```

This lecture is the extended JAX implementation of [this lecture](https://python.quantecon.org/wealth_dynamics.html).

Please refer to that lecture for all background and notation.

We will use the following imports.

@@ -28,14 +40,10 @@ from collections import namedtuple
```


Let's check the backend used by JAX and the devices available
Let's check the hardware we are running on:

```{code-cell} ipython3
# Check if JAX is using GPU
print(f"JAX backend: {jax.devices()[0].platform}")

# Check the devices available for JAX
print(jax.devices())
!nvidia-smi
```
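The `!nvidia-smi` cell assumes an NVIDIA driver is present. A hedged Python equivalent (the `gpu_names` helper is illustrative, not part of the lecture) that does not error out on CPU-only machines:

```python
import subprocess

def gpu_names():
    """Return GPU names reported by `nvidia-smi -L`, or [] if unavailable."""
    try:
        out = subprocess.run(["nvidia-smi", "-L"],
                             capture_output=True, text=True, check=False)
    except FileNotFoundError:
        return []  # no NVIDIA driver/tools on this machine
    return [line for line in out.stdout.splitlines() if line.strip()]

print(gpu_names() or "no GPU detected")
```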


Expand Down