
Commit f559857

MAINT: Harmonise GPU admonition, remove backend checks + add nvidia-smi (#47)
* Remove backend check, add to status page, add GPU admonition
* add nvidia-smi to view hardware
1 parent 3bf3de4 commit f559857
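The commit message mentions adding `nvidia-smi` to view the hardware. In the lecture notebooks that would typically be a bare `!nvidia-smi` cell; as a portable sketch (hypothetical, not taken from this diff) the same check can be made from Python, degrading gracefully on machines without an NVIDIA driver:

```python
import shutil
import subprocess

# Capture GPU details when the NVIDIA driver tools are installed;
# fall back to a message on CPU-only machines.
if shutil.which("nvidia-smi"):
    gpu_info = subprocess.run(["nvidia-smi"], capture_output=True, text=True).stdout
else:
    gpu_info = "nvidia-smi not found: no NVIDIA driver on this machine"
print(gpu_info)
```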

File tree

11 files changed: +111 −73 lines


lectures/aiyagari_jax.md

Lines changed: 9 additions & 12 deletions

@@ -13,6 +13,15 @@ kernelspec:
 
 # The Aiyagari Model
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
 
 ## Overview
 
@@ -55,18 +64,6 @@ import jax
 import jax.numpy as jnp
 ```
 
-
-Let’s check the backend used by JAX and the devices available.
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
-
 We will use 64 bit floats with JAX in order to increase the precision.
 
 ```{code-cell} ipython3

lectures/arellano.md

Lines changed: 9 additions & 1 deletion

@@ -13,7 +13,15 @@ kernelspec:
 
 # Default Risk and Income Fluctuations
 
-+++
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
 
 In addition to what's in Anaconda, this lecture will need the following libraries:
 

lectures/inventory_dynamics.md

Lines changed: 10 additions & 0 deletions

@@ -23,6 +23,16 @@ kernelspec:
 
 # Inventory Dynamics
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
+
 ```{index} single: Markov process, inventory
 ```
 

lectures/jax_intro.md

Lines changed: 7 additions & 5 deletions

@@ -14,12 +14,14 @@ kernelspec:
 # JAX
 
 
-```{note}
-This lecture is built using [hardware](status:machine-details) that
-has access to a GPU. This means that
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
 
-1. the lecture might be significantly slower when running on your machine, and
-2. the code is well-suited to execution with [Google colab](https://colab.research.google.com/github/QuantEcon/lecture-python-programming.notebooks/blob/master/jax_intro.ipynb)
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
 ```
 
 This lecture provides a short introduction to [Google JAX](https://github.com/google/jax).

lectures/kesten_processes.md

Lines changed: 10 additions & 11 deletions

@@ -29,6 +29,16 @@ kernelspec:
 :depth: 2
 ```
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
+
 In addition to what's in Anaconda, this lecture will need the following libraries:
 
 ```{code-cell} ipython3
@@ -64,17 +74,6 @@ from jax import random
 ```
 
 
-Let’s check the backend used by JAX and the devices available
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
-
 ## Kesten processes
 
 ```{index} single: Kesten processes; heavy tails

lectures/newtons_method.md

Lines changed: 10 additions & 10 deletions

@@ -14,6 +14,16 @@ kernelspec:
 
 # Newton’s Method via JAX
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
+
 ## Overview
 
 Continuing from the [Newton's Method lecture](https://python.quantecon.org/newton_method.html), we are going to solve the multidimensional problem with `JAX`.
@@ -28,16 +38,6 @@ import jax.numpy as jnp
 from scipy.optimize import root
 ```
 
-Let’s check the backend used by JAX and the devices available.
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
 ## The Two Goods Market Equilibrium
 
 Let's have a quick recap of this problem -- a more detailed explanation and derivation can be found at [A Two Goods Market Equilibrium](https://python.quantecon.org/newton_method.html#two-goods-market).

lectures/opt_invest.md

Lines changed: 13 additions & 13 deletions

@@ -14,6 +14,16 @@ kernelspec:
 
 # Optimal Investment
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
+
 We require the following library to be installed.
 
 ```{code-cell} ipython3
@@ -26,7 +36,9 @@ We require the following library to be installed.
 A monopolist faces inverse demand
 curve
 
-$$ P_t = a_0 - a_1 Y_t + Z_t, $$
+$$
+P_t = a_0 - a_1 Y_t + Z_t,
+$$
 
 where
 
@@ -61,18 +73,6 @@ import jax.numpy as jnp
 import matplotlib.pyplot as plt
 ```
 
-
-Let’s check the backend used by JAX and the devices available
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
-
 We will use 64 bit floats with JAX in order to increase the precision.
 
 ```{code-cell} ipython3

lectures/opt_savings.md

Lines changed: 10 additions & 0 deletions

@@ -13,6 +13,16 @@ kernelspec:
 
 # Optimal Savings
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
+
 In addition to what’s in Anaconda, this lecture will need the following libraries:
 
 ```{code-cell} ipython3

lectures/short_path.md

Lines changed: 9 additions & 13 deletions

@@ -15,6 +15,15 @@ kernelspec:
 
 # Shortest Paths
 
+```{admonition} GPU
+:class: warning
+
+This lecture is accelerated via [hardware](status:machine-details) that has access to a GPU and JAX for GPU programming.
+
+Free GPUs are available on Google Colab. To use this option, please click on the play icon top right, select Colab, and set the runtime environment to include a GPU.
+
+Alternatively, if you have your own GPU, you can follow the [instructions](https://github.com/google/jax) for installing JAX with GPU support. If you would like to install JAX running on the `cpu` only you can use `pip install jax[cpu]`
+```
 
 ## Overview
 
@@ -28,19 +37,6 @@ import jax.numpy as jnp
 import jax
 ```
 
-
-
-Let’s check the backend used by JAX and the devices available.
-
-```{code-cell} ipython3
-# Check if JAX is using GPU
-print(f"JAX backend: {jax.devices()[0].platform}")
-
-# Check the devices available for JAX
-print(jax.devices())
-```
-
-
 ## Solving for Minimum Cost-to-Go
 
 Let $J(v)$ denote the minimum cost-to-go from node $v$,

lectures/status.md

Lines changed: 9 additions & 1 deletion

@@ -20,4 +20,12 @@ This table contains the latest execution statistics.
 
 These lectures are built on `linux` instances through `github actions` and `amazon web services (aws)` to
 enable access to a `gpu`. These lectures are built on a [p3.2xlarge](https://aws.amazon.com/ec2/instance-types/p3/)
-that has access to `8 vcpu's`, a `V100 NVIDIA Tesla GPU`, and `61 Gb` of memory.
+that has access to `8 vcpu's`, a `V100 NVIDIA Tesla GPU`, and `61 Gb` of memory.
+
+You can check the backend used by JAX using:
+
+```{code-cell} ipython3
+import jax
+# Check if JAX is using GPU
+print(f"JAX backend: {jax.devices()[0].platform}")
+```
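The status-page check added above can be extended to list every device JAX can see, and to switch on the 64-bit floats that several of these lectures enable. A sketch, assuming a standard JAX install (on CPU-only machines the reported platform is simply `cpu`):

```python
import jax
import jax.numpy as jnp

# JAX defaults to float32; the lectures enable 64-bit precision explicitly.
jax.config.update("jax_enable_x64", True)

# List every device JAX can see; the first one is the default backend.
for device in jax.devices():
    print(device)

platform = jax.devices()[0].platform  # "gpu" on the lecture hardware, "cpu" otherwise
print(f"JAX backend: {platform}")
print(jnp.ones(3).dtype)
```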

0 commit comments
